US20120179008A1 - Vigilance Monitoring System - Google Patents
- Publication number
- US20120179008A1
- Authority
- US
- United States
- Prior art keywords
- driver
- data
- sensors
- sensor
- vigilance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/384—Recording apparatus or displays specially adapted therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
- A61M21/02—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis for inducing sleep or relaxation, e.g. by direct nerve stimulation, hypnosis, analgesia
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L3/00—Electric devices on electrically-propelled vehicles for safety purposes; Monitoring operating variables, e.g. speed, deceleration or energy consumption
- B60L3/02—Dead-man's devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T7/00—Brake-action initiating means
- B60T7/12—Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T7/00—Brake-action initiating means
- B60T7/12—Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
- B60T7/14—Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger operated upon collapse of driver
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/06—Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/398—Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7232—Signal processing specially adapted for physiological signals or for diagnostic purposes involving compression of the physiological signal, e.g. to extend the signal recording period
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
- A61M2021/0005—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
- A61M2021/0044—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the sight sense
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
- A61M2021/0005—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
- A61M2021/0083—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus especially for waking up
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2230/00—Measuring parameters of the user
- A61M2230/08—Other bio-electrical signals
- A61M2230/10—Electroencephalographic signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2230/00—Measuring parameters of the user
- A61M2230/08—Other bio-electrical signals
- A61M2230/14—Electro-oculogram [EOG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2230/00—Measuring parameters of the user
- A61M2230/60—Muscle strain, i.e. measured on the user
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2200/00—Type of vehicles
- B60L2200/26—Rail vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2260/00—Operating Modes
- B60L2260/40—Control modes
- B60L2260/46—Control modes by self learning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0872—Driver physiology
Definitions
- the present invention relates to a vigilance monitoring system.
- the invention relates to a system for monitoring, recording and/or analysing the vigilance, alertness or wakefulness and/or stressed state of an operator of equipment or machinery in a variety of situations, including situations wherein the degree of vigilance of the operator has implications for the safety or well-being of the operator or other persons.
- a typical application may include monitoring the driver of a vehicle or pilot of an aircraft, although the invention also has applications in areas involving related occupations such as train drivers and operators of equipment such as cranes and industrial machinery in general, and where lack of operator vigilance can give rise to harmful social or economic consequences.
- the system of the present invention will be described herein with reference to monitoring a driver of a vehicle; nevertheless, it is not thereby limited to such applications.
- other applications may include monitoring routine, acute or sub-acute physiological parameters of a person or subject in a home, work, clinic or hospital environment.
- the monitored parameters may include cardiac, respiratory and movement parameters as well as parameters relating to apnea events, subject sleep states or sudden death syndrome on-set.
- the monitoring system is designed, inter alia, to provide non-invasive monitoring of a driver's physiological data including movement activity, heart activity, respiration and other physiological functions.
- the monitored physiological data may undergo specific analysis processing to assist in determining of the driver's state of vigilance.
- the system is designed to detect various states of the driver's activity and detect certain conditions of driver fatigue or relaxation state that could lead to an unsafe driving condition or conditions.
- the system of the present invention includes means for gathering movement data associated with the driver.
- the movement gathering means may include a plurality of sensors such as touch sensitive mats placed in locations of the vehicle that make contact with the driver, such as the seat, steering wheel, pedal(s), seat belt or the like. Each location may include several sensors or mats to more accurately monitor movements of the driver.
- Signals from the various sensors/mats may be processed and analysed by a processing means.
- the processing means may include a digital computer.
- the processing means may be programmed to recognize particular movement signatures or patterns of movement, driver posture or profile and to interpret these to indicate that vigilance has deteriorated or is below an acceptable threshold.
- the processing means may include one or more algorithms.
- the sensors or mats may include piezoelectric, electrostatic, piezo ceramic or strain gauge material.
- the latter may be manufactured by separating two conductive materials such as aluminium foil with an electrolyte material which is capable of passing AC but not DC current.
- the sensors or mats may include Capacitive Static Discharge (CSD) or Polyvinylidene fluoride (PVDF) material.
- the sensors/mats may be covered with a non-obtrusive, flexible surface which is capable of detecting pressure and/or monitoring electrophysiological activity.
- the pressure detecting capability may be used for detecting driver movement.
- the or each sensor may produce an output signal that represents the magnitude of the pressure or force that is applied to the sensor.
- the or each pressure signal may thus represent an absolute or quantitative measure of pressure applied to the sensor.
- the electrophysiological activity may include electrical signals generated by the body of the driver eg. electrical muscle activity and/or pulse activity.
- the sensors or mats may be located in various parts of a vehicle.
- the seat of the driver may be divided into several sections such as upper or back and lower or seat.
- the upper or back section may include sensors in the top edge, centre and base.
- the lower or seat section may include sensors in the front edge, centre and rear.
- the or each sensor may include CSD or PVDF material.
- the steering wheel may include a plurality of sensors.
- the steering wheel may be divided into eight zones such as upper, upper left, upper right, left, right, lower left, lower right and lower.
- At least one sensor may be associated with each zone.
- the or each sensor may include CSD or PVDF material.
- the floor covering such as carpet may include a plurality of sensors.
- the floor covering or carpet may be divided into a plurality of zones. At least one sensor may be associated with each zone.
- the or each sensor may include CSD or PVDF material.
- the accelerator, clutch and brake pedals may include a plurality of sensors. Each pedal may be divided into a plurality of zones such as upper, middle and lower. At least one sensor may be associated with each zone. The or each sensor may include CSD, PVDF or other movement-sensitive material.
- the seat belt may include one or a plurality of sensors.
- a sensor or sensors may be embedded in the fixed (i.e. non-retractable) section of the seat belt.
- the or each sensor may include CSD or PVDF material.
- a head tilt device incorporating a positional switch or the like may be associated with the driver's cap, glasses or goggles or may be arranged to clip over the driver's ear or glasses.
- the head tilt device may be adapted to provide a signal or data which alters in accordance with the position of the driver's head.
- a radio tracking device may determine and track a subject's head movements.
- a head band and/or chin band sensor may be used to monitor EEG, EMG and EOG signals.
- the head band sensor may include separate left and right frontal zones and left and right eye zones.
- the sensor may include CSD or PVDF material or other material sensitive to measuring patient skin electrical surface variations and/or impedance.
- Various sensors/techniques may be adapted for monitoring eye movement including those based on reflected light, electric skin potential, contact lenses, limbus tracking, video imaging and magnetic induction.
- the sensors/techniques may include EOG electrodes, infrared detection of eye movements and/or video tracking and processing of eye movements.
- the sensors/techniques may be adapted for monitoring the left eye only or the right eye only or both eyes.
- Raw data which is collected from the various sensors positioned around the vehicle may be filtered and amplified prior to processing and analysis.
- a significant purpose of the processing and analysis is to determine the driver's state of vigilance, alertness or wakefulness.
- the system may be adapted to effect remedial action, ie. the system may take steps to alert the driver or to actively intervene in the control of the vehicle, when it is deemed that such action is warranted or desirable.
- Processing of data may be performed in several stages, including primary, secondary and tertiary analysis.
- Primary analysis refers to processing of raw data from the various sensors. This raw data may be filtered and amplified prior to analog to digital conversion. Primary analysis may be adapted to determine valid body movements of the driver as distinct from spurious signals and artefacts due to environmental factors including noise.
- Valid body movements may be determined by applying a combination of processing techniques including:
- Threshold detection may facilitate distinguishing random and non-significant electrical noise (typically spikes of small duration) relative to signals representing valid or actual body movements. Threshold detection may apply to both amplitude and duration of the signals.
- the relevant threshold(s) may be determined from clinical trials and/or historical data. Where the detection is based on amplitude it may be determined in both negative and positive phases of the signal. Amplitude detection may be based on a measurement of the peak-to-peak signal. Alternatively, the positive and negative peak amplitudes can be measured separately.
- Threshold detection may be combined with a form of zero-base line detection so that electronic offsets do not adversely affect the accuracy of threshold detections. Each body movement which exceeds the predetermined amplitude and/or duration may be classified as an event for further processing.
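- as an illustration of the primary analysis described above, the following Python sketch detects movement events using combined amplitude and duration thresholds applied to a zero-baselined signal; the sampling rate, threshold values and synthetic test signal are assumptions for illustration, not values from the specification.
```python
import numpy as np

def detect_events(signal, fs, amp_threshold, min_duration_s):
    """Classify valid body-movement events in one sensor channel.

    A sample contributes to an event when its absolute amplitude, measured
    from a zero baseline, exceeds amp_threshold; a run of such samples is
    kept only if it lasts at least min_duration_s.
    """
    # Remove electronic offset so thresholds are applied to a zero baseline.
    signal = np.asarray(signal, dtype=float) - np.mean(signal)

    above = np.abs(signal) > amp_threshold      # amplitude criterion (both polarities)
    min_samples = int(min_duration_s * fs)      # duration criterion

    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_samples:
                events.append((start, i))       # (start_sample, end_sample)
            start = None
    if start is not None and len(above) - start >= min_samples:
        events.append((start, len(above)))
    return events

# Example: 1 s of synthetic noise at 100 Hz with one 0.2 s movement burst.
fs = 100
sig = 0.02 * np.random.randn(fs)
sig[30:50] += 0.5
print(detect_events(sig, fs, amp_threshold=0.1, min_duration_s=0.05))
```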
- Secondary analysis may be adapted to process the results of primary analysis. Secondary analysis may process data for the purpose of presentation and/or display. Data may be displayed or printed in a tabular, graphical or other format which facilitates interpretation of the data. One purpose of the representation and/or display is to represent a driver's state of vigilance and/or fatigue. In one form each event identified during primary analysis may be counted for a fixed period of time or epoch. The fixed period of time may be 30 seconds or 60 seconds, or other period which is adequate for determining body movement trends. The count value or number of valid body movements in a given period (eg. 30 seconds) may be represented in a display as, say, the length of a vertical bar.
- the average amplitude associated with each event may be indicated by the length of the vertical bar whilst the count value or number of valid body movements for each epoch may be represented by colouring the vertical bar.
- the colours green, blue, yellow, orange, red may indicate count values or movement numbers in ascending order ie. green indicating the lowest movement number for a particular epoch and red indicating the highest movement number for a particular epoch.
- data may be displayed on a 3 dimensional graph wherein for example the x dimension of the graph represents time or epochs, the y dimension represents the average amplitude, while the z dimension represents the number of events during a particular epoch.
- the above display techniques may facilitate interpretation of the number of valid body movements and the amplitude of those movements and association of this data with the driver's activity or state of vigilance, alertness or wakefulness.
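- a minimal sketch of the epoch counting and colour coding described above is given below; the epoch length and the colour band edges are illustrative assumptions.
```python
def summarise_epochs(event_times_s, epoch_s=30, total_s=300):
    """Count valid movement events per fixed epoch (e.g. 30 s)."""
    counts = [0] * (total_s // epoch_s)
    for t in event_times_s:
        idx = int(t // epoch_s)
        if 0 <= idx < len(counts):
            counts[idx] += 1
    return counts

def colour_for_count(count, band_edges=(2, 5, 10, 20)):
    """Map an epoch's movement count onto the green..red scale."""
    colours = ["green", "blue", "yellow", "orange", "red"]
    for colour, upper in zip(colours, band_edges):
        if count <= upper:
            return colour
    return colours[-1]

counts = summarise_epochs([3.1, 12.4, 15.0, 44.9, 61.2, 61.8, 62.0])
print([(c, colour_for_count(c)) for c in counts])
```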
- it may also be relevant to measure the period of each individual body movement, as this may provide an indication of the energy that is associated with the movement. For example, if a driver squeezes the steering wheel in a rapid response as distinct from gripping the wheel as part of a focussed steering manoeuvre, the pattern of signals in each case will be different.
- the rapid response may appear as a small cluster of movements/signals or as a single movement/signal with a relatively short duration or period of time.
- the steering manoeuvre may appear as a larger cluster of movements/signals over a relatively longer period of time or as a single movement/signal having a relatively long duration.
- the type of signal which may be expected will depend in part upon the type of sensor.
- piezo ceramic or PVDF sensors may emit fewer clusters of signals but may emit signals with larger time periods in relation to the actual period of the movement which is being monitored.
- a capacitive electrostatic sensor is more likely to emit clusters of “spikes” being relatively short period signals. It may be necessary to record the energy level of each movement as this energy level may fall below a certain threshold when the driver is in a fatigued state. If, for example the driver has relaxed, then the energy of a body movement in the actions of driving may be significantly more subdued than in the case where the driver is alert, and his muscle activity is significantly greater. Therefore it may be useful to measure and record each and every body movement.
- This data could be displayed on high-resolution graphs where, for example, the X-axis represents half-second periods and 960 lines make up each continuous section, or 480 seconds (8 minutes). The largest amplitude signal in each half-second period could then be displayed along the X-axis.
- the Y-axis, on the other hand, could be represented by a scale of amplitudes representing each body movement. This graph would be more precise in representing the actual signal level of each body movement and the subsequent muscle status for a driver.
- Detection of groups of movements may include user-configurable or preset values for:
- Periodic measurement analysis can detect, for example, absence of movements which can be associated with a driver's fatigue.
- Tertiary analysis may be adapted to process the results of secondary analysis.
- One purpose of tertiary analysis is to determine the status of a driver's state of vigilance, alertness or wakefulness.
- Tertiary analysis may process the results of secondary analysis to produce intermediate data and/or indicate trends in the data.
- the intermediate data and trends may be used to provide summary reports and further tabular and/or graphic representations of a driver's status or condition.
- the intermediate data may be processed by one or more vigilance algorithms to determine the status of a driver's vigilance, alertness or wakefulness.
- Intermediate data of various types may be derived and the vigilance algorithm(s) may make use of such data to determine the status of the driver.
- the intermediate data may include:
- the vigilance algorithm(s) may be adapted to correlate the intermediate data and/or apply combinational logic to the data to detect patterns of movement (or lack thereof) which, based on historical data or clinical trials, indicates that the driver is or may be excessively relaxed or is below an acceptable threshold of vigilance, alertness or wakefulness.
- the vigilance algorithm(s) may incorporate one or more look-up tables including reference movement data and default values associated with acceptable and unacceptable levels of driver fatigue. Histograms, including movement histograms of the kind described in AU Patent 632932 based on the work of Rechtschaffen and Kales (R & K), may be used as well as tables showing weighted values and actual movement data for each sensor.
- the vigilance algorithm(s) may determine a vigilance probability factor (0-100%) as a function of weighted movement data values.
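- the specification does not give the weighting formula, but a weighted vigilance probability factor of the kind described could be sketched as follows; the sensor names, weights and reference counts are hypothetical values introduced only for illustration.
```python
# Hypothetical weighted vigilance probability factor: recent movement counts
# per sensor are weighted and normalised against reference (alert-driver)
# values held in a look-up table. 100% = fully vigilant movement levels.
SENSOR_WEIGHTS = {"steering_wheel": 0.4, "seat_back": 0.2,
                  "seat_base": 0.2, "pedals": 0.2}        # assumed weights
REFERENCE_COUNTS = {"steering_wheel": 12, "seat_back": 6,
                    "seat_base": 6, "pedals": 8}          # assumed alert baseline

def vigilance_factor(movement_counts):
    score = 0.0
    for sensor, weight in SENSOR_WEIGHTS.items():
        ratio = min(movement_counts.get(sensor, 0) / REFERENCE_COUNTS[sensor], 1.0)
        score += weight * ratio
    return round(100 * score)

# Sparse movement across all sensors yields a low factor (possible fatigue).
print(vigilance_factor({"steering_wheel": 3, "seat_back": 1,
                        "seat_base": 2, "pedals": 0}))
```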
- the system may be arranged to intervene in the control of the vehicle or to alert the driver of the vehicle and/or other vehicles.
- Vehicle control intervention may include restriction of speed, controlled application of brakes, cutting-off fuel and/or disabling the accelerator pedal.
- Driver alerting intervention may include use of sprays designed to stimulate the driver, vibrating the steering wheel, seat belt or floor area in the vicinity of the driver, an audible alarm and/or use of bright cabin lights.
- the driver can also be alerted by winding down the driver window and/or other effective alerting methods as may be applicable to each individual driver.
- Drivers of other vehicles may also be alerted by means of flashing hazard lights and/or sounding of a siren.
- Vehicle control intervention may be integrated with and form part of a vehicle control system or it may be interfaced to an existing vehicle control system. Vehicle control intervention may be interfaced with GSM (Global System for Mobile communications) or other communication systems to provide early warning indication that a driver or operator of equipment is in a stressed, fatigued or other undesirable condition that may be detected.
- Calibration can set the system's detection parameters in accordance with varying driver movement and other driver signals. Calibration is beneficial because driver sensor and signal outputs will vary with different drivers. Background noise will also vary with different vehicles.
- the need for calibration may be proportional to the critical nature of the driving or dependent on the level of accuracy required for fatigue monitoring and detection.
- Artificial intelligence may be embodied in one or more automated systems including one or more mathematical algorithms. Artificial intelligence includes the system's ability to self-learn or teach itself conditions associated with the driver which constitute normal or alert driving, as distinct from conditions which constitute abnormal or fatigued driving.
- Artificial intelligence may allow the driver of a specific vehicle to select a mode of operation during which the driver's movements during normal or wakeful driving are monitored and diagnosed in order to determine typical thresholds and correlations between various sensors, for the purpose of determining true fatigue states of the driver as distinct from alert states of the driver. Artificial intelligence may also facilitate adaptation of the vigilance algorithm(s), to the specific vehicle's background noise characteristics.
- Artificial intelligence may include different response patterns for correlating movement data from the various sensors for distinguishing valid driver movements from environmental vibrations and noise. These may be classified and described by, for example, a look-up table that records expected patterns or combinations of signals for different cases of environmental noise as distinct from driver-generated signals. For example, if the driver moves his hand, signals from sensors in the steering wheel and arm sections of the seat may correlate according to a specific pattern. Alternatively, if the vehicle undergoes a severe or even subtle vibration due to road or engine effects, a broader range of sensors may be similarly affected and this may be manifested as amplitudes which follow predetermined correlation patterns. Signals from the sensors may increase in strength or amplitude according to the proximity of the source of the sound or vibrations.
- the source of the vibration may manifest itself as a pattern of similar waveforms across the various sensors which reduce progressively in amplitude as the sensors distance from the source increases.
- the floor sensors may register maximum amplitude whereas the steering wheel sensors which are furthest from the road noise may register minimum amplitude.
- the phase relationship of vibrations from various sources may also provide some guide as to the likely source of the vibrations. For example, if the vibrations emanate from the driver's movement then it is more likely that several signals with similar phase may be detected. On the other hand, if the signals have varying phase relationships, then it is more likely that the source of the vibrations giving rise to these signals is random as may be expected if the vibrations emanate from the vehicle environment.
- phase signals arising from driver movements may be distinguished from similar phase signals arising from artefacts or the vehicle environment by relating the environmental noise to sensors located near sources of expected noise in the vehicle, e.g. engine noise, wheel noise, and other vibrations and noise. This may be detected by carefully locating microphones and vibration sensors in the vehicle.
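- one plausible way to implement the amplitude-gradient and breadth-of-response reasoning described above is sketched below; the sensor names, distance ordering and amplitude threshold are assumptions used only for illustration.
```python
import numpy as np

def classify_vibration(channels, distance_from_source):
    """Crude attribution of a vibration burst. Broad environmental noise
    (road or engine) tends to appear on every channel with amplitude falling
    off with distance from the source, whereas a driver movement is
    concentrated on the few sensors the driver actually touches.
    """
    amplitudes = {name: float(np.max(np.abs(seg))) for name, seg in channels.items()}
    ordered = sorted(amplitudes, key=lambda name: distance_from_source[name])
    amps = [amplitudes[name] for name in ordered]
    decays_with_distance = all(a >= b for a, b in zip(amps, amps[1:]))
    active = [name for name, amp in amplitudes.items() if amp > 0.1]  # assumed threshold
    if decays_with_distance and len(active) == len(channels):
        return "environmental"
    return "driver movement"

# A 30 Hz burst that is strongest at the floor and weakest at the wheel.
t = np.linspace(0, 1, 200)
burst = np.sin(2 * np.pi * 30 * t)
segments = {"floor": 1.0 * burst, "seat": 0.6 * burst, "wheel": 0.3 * burst}
print(classify_vibration(segments, {"floor": 0, "seat": 1, "wheel": 2}))
```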
- Cancellation of environmental noise can be assisted by monitoring signals from the microphones and sensors with a view to applying the most effective signal cancellation techniques in order to reduce as much as possible the artefact or noise effects or unwanted signals within the vehicle environment.
- noise cancellation techniques include detection of the various road bumps and ignoring the effect of these bumps on the data being analysed from the various vehicle sensors of interest.
- noise cancellation techniques also include detection of various engine noises and application of a signal of opposite phase to the motor noise in order to cancel the artefact.
- a phase cancellation technique which may be adopted is disclosed in PCT application AU97/00275, the disclosure of which is incorporated herein by cross-reference.
- noise cancellation may also include filtering, wherein high-pass, low-pass and notch filters may be used to assist artefact removal.
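- the filtering step could be prototyped as in the following sketch using standard SciPy filters; the pass-band and the notch frequency (e.g. a known engine-order vibration) are assumed values, not figures from the specification.
```python
import numpy as np
from scipy import signal

def clean_sensor_signal(x, fs, band=(0.5, 15.0), notch_hz=12.5):
    """Band-limit a movement channel and notch out a known periodic artefact
    (e.g. an engine-order vibration). Cut-offs are assumed values; a real
    system would take them from calibration trials.
    """
    nyq = fs / 2.0
    # High-pass plus low-pass (band-pass) to keep the movement-related band.
    b, a = signal.butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    x = signal.filtfilt(b, a, x)
    # Notch filter centred on the periodic artefact inside the pass-band.
    bn, an = signal.iirnotch(notch_hz / nyq, Q=30.0)
    return signal.filtfilt(bn, an, x)

fs = 200
t = np.arange(2 * fs) / fs
raw = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 12.5 * t) + 0.1
print(clean_sensor_signal(raw, fs).shape)   # (400,) with the 12.5 Hz tone suppressed
```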
- Artificial intelligence may learn to ignore periodic signals from sensors in the vehicle as these are likely to arise from mechanical rotations within the vehicle, thus improving the separation of artefact signals from signals of interest, such as signals which indicate true driver movement.
- Points of calculation and analysis of sensor data for the purpose of comparison and correlation with previously monitored data may include:
- Artificial intelligence may be applied in conjunction with pressure sensors in vehicle seats and/or seat belts to control air bag deployment.
- air bag deployment may be restricted for children or validated with different crash conditions for children and adults. For example, if a child is not detected as being thrust forward by means of pressure data received from seat/seat belt sensors, deployment of air bags and possible air bag injury to the child may be avoided.
- Deployment of air bags may generally be validated more intelligently by analysing data relating to passenger or driver posture, movement, thrust, body movement, unique driver or passenger ‘yaw’ etc.
- the system may include means for testing a driver's response times. Such tests may, if carried out at regular intervals, pre-empt serious driving conditions as can be brought about by driver fatigue or a lapse in vigilance.
- the testing means may be adapted to provide a simple method for prompting the driver and for testing the driver's response time.
- the response test may, for example, request the driver to respond to a series of prompts. These prompts may include requesting the driver to squeeze left or right hand sections of the steering wheel or squeeze with both left and right hands at the same time in response to a prompt.
- the means for prompting the driver may include, for example, LEDs located on the dash of the vehicle or other position that the driver is visually aware of.
- a left LED blinking may for example, prompt the driver to squeeze the left hand on the steering wheel.
- a right LED blinking may prompt the driver to squeeze the right hand on the steering wheel.
- the centre LED blinking may prompt the driver to squeeze both hands on the steering wheel.
- two LEDs could be used in the above example, except that both LEDs blinking may prompt the driver to squeeze with both hands.
- the driver's response or level of alertness may be detected by measuring the response time of the driver, where the response time is measured as the time between illumination of an LED and a correct response with the hand or hands.
- the system can verify the results and alert the driver.
- the system may also determine the accuracy of the driver's responses to ascertain the status of the driver's vigilance.
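- a minimal sketch of the LED prompt and response-time measurement described above is given below; the hardware hooks (prompt_led, wait_for_squeeze) are hypothetical placeholders for the vehicle's LED driver and steering wheel sensor interface.
```python
import random
import time

def run_response_test(prompt_led, wait_for_squeeze, timeout_s=3.0):
    """One trial of the steering-wheel response test: light an LED, wait for
    the squeeze, and record the elapsed response time and its correctness.
    prompt_led and wait_for_squeeze are hypothetical hardware hooks.
    """
    side = random.choice(["left", "right", "both"])
    prompt_led(side)
    started = time.monotonic()
    response = wait_for_squeeze(timeout_s)      # e.g. "left", "right", "both" or None
    elapsed = time.monotonic() - started
    return {"prompt": side, "response": response,
            "correct": response == side,
            "response_time_s": round(elapsed, 3)}

# Simulated hooks for demonstration only.
def fake_led(side):
    print(f"LED prompt: {side}")

def fake_squeeze(timeout_s):
    time.sleep(0.4)        # simulated driver reaction time
    return "left"

print(run_response_test(fake_led, fake_squeeze))
```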
- a further example of means for testing the driver's response may include means for flashing random numbers on the windscreen.
- the driver may be prompted to respond by squeezing the steering wheel a number of times as determined by the number flashed.
- the numbers may be flashed at different locations on the screen relative to the steering wheel, with the driver responding using the hand on the side of the wheel corresponding to where the number was flashed. This type of test should be conducted only when the driver is not turning, changing gear, braking or performing other critical driving functions.
- ideally, driver response tests are not anticipated by the driver, so that the driver's state of vigilance can be detected more accurately. It is of course also important that the selected method of testing driver response does not in any way distract the driver or contribute to the driver's lapse in concentration.
- the system may be built into a vehicle sun visor as a visual touch screen display allowing a comprehensive visualisation of a driver's activity.
- the touch screen may include a color display for displaying movement/pressure outputs associated with each sensor.
- a display of the status of a plurality of sensors may provide a visual indication of a relaxed versus an active driver state.
- apparatus for determining a vigilance state of a subject such as a driver of a vehicle or the like, said apparatus including:
- a method for determining a vigilance state of a subject such as a driver of a vehicle or the like, said method including the steps of:
- FIG. 1 shows a block diagram of a vigilance monitoring system according to the present invention
- FIG. 2 shows a flow diagram of an algorithm for processing data from sensors associated with a vehicle and driver
- FIG. 3A shows a simplified block diagram of a system for cancelling environmental noise from driver interfaced sensors
- FIG. 3B shows waveforms associated with the system of FIG. 3A;
- FIG. 4A shows a flow diagram of a movement processing algorithm according to the present invention
- FIG. 4B shows examples of data reduction and syntactic signal processing associated with a sample signal waveform
- FIG. 5 shows sample outputs of data following secondary analysis by the system of FIG. 4A;
- FIG. 6 shows an embodiment of steering wheel sensors
- FIG. 7 shows a block diagram of a vigilance monitoring system utilizing video data
- FIG. 8 shows a flow diagram of an algorithm suitable for processing video data
- FIGS. 9 and 10 show examples of data produced by the system of FIGS. 7 and 8;
- FIG. 11 is a flow diagram of the main vigilance processing algorithm
- FIG. 12 is a block diagram of a vehicle monitoring system according to the present invention.
- FIG. 13 shows one form of transducer for monitoring posture of a driver or equipment operator
- FIG. 14 shows a block diagram of an embodiment of an anti-snooze device according to the present invention.
- FIG. 15 shows a calibrate mode algorithm
- FIG. 16 shows a main relax detection algorithm
- block 12 shows a plurality of sensors 1 to 11 associated with a vehicle and driver.
- the or each sensor may include piezoelectric or electrostatic material such as CSD or PVDF material.
- the material can be divided into a plurality of sections of the driver's seat, for example.
- the various sensors are summarized below.
- These sensors can be, for example, positional switch devices. The output from these positional devices is amplified, filtered and finally acquired and analysed. This sensor device is designed to output a signal or digital data which changes state in accordance with the tilt of the driver's head. By calibrating the system in accordance with normal driving conditions, this output can be used to distinguish the normal driving condition from the fatigued driver condition.
- the driver headband sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the driver's headband sensor. The output from the various sensors is amplified, filtered and finally acquired and analysed.
- the headband material can contain conductive sections designed to pick up the driver's electro-encephalograph (EEG) signals.
- the driver steering wheel or other steering device sensors can be, for example, a CSDM or PVD material that can be divided into the various sections (as listed below) of the driver's steering wheel or other steering device.
- the output from the various sensors is amplified, filtered, and finally acquired and analysed.
- An alternative form of steering wheel sensor is shown in FIG. 6.
- the driver carpet sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the driver's carpet area.
- the driver accelerator sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the accelerator pedal.
- the driver brake sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the brake pedal.
- Other sensors are referred to in block 13 , including steering wheel movement and direction sensors and sensors for detecting environmental noise and vibrations.
- Block 16 includes a central processing unit and one or more algorithms for processing the digital signals. Block 16 also makes use of the vigilance processing algorithm(s) in block 17 .
- the vigilance processing algorithm(s) in block 17 are adapted to determine the status of the driver's state of vigilance, alertness or wakefulness. This status may be expressed as a vigilance factor (0-100%).
- the central processing unit may alert the driver of the vehicle and/or other vehicles.
- the driver alert means may include:
- the central processing unit may intervene in the control of the vehicle.
- Vehicle intervention may enable the vehicle to be brought into a safe or safer status.
- Vehicle intervention may include speed restriction or reduction or complete removal of fuel supply.
- the accelerator pedal may need to be disabled, for example when a driver has his foot depressed on the accelerator pedal and is in an unsafe or fatigued state.
- the vehicle may have its horn or hazard flashing lights activated to warn other drivers, and/or have its fuel injection de-activated, and/or speed reduced by gentle and controlled safe braking.
- the vehicle may have its fuel supply reduced, and/or its speed reduced by gentle and controlled safe braking, to a safe cruising speed. The driver may then be prompted again before the vehicle undergoes further intervention.
- Another option for vehicle intervention is to provide a form of ignition override, as used in some alcohol based systems.
- the vehicle ignition or starting process may be inhibited by an inappropriate driver state which in the present case may be drowsiness or excessive fatigue.
- vehicle intervention options may be instigated by an onboard computer or electronic interface eg. by communication with the speed controller or fuel injection logic.
- the computer system may include intelligence to arbitrate the most appropriate intervention sequence or process to minimize risk to the vehicle driver or its passengers.
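- a simple way the intervention sequence described above could be arbitrated is sketched below; the vigilance-factor thresholds and the staged responses are illustrative assumptions rather than the patented control logic.
```python
def choose_intervention(vigilance_factor, prior_alerts):
    """Map the computed vigilance factor (0-100%) to a graduated response:
    first alert the driver, then ease the vehicle to a safer speed, and only
    then apply a controlled stop. Thresholds are illustrative assumptions.
    """
    if vigilance_factor >= 70:
        return "none"
    if vigilance_factor >= 50:
        return "alert_driver"            # alarm, seat/wheel vibration, bright lights
    if vigilance_factor >= 30 or prior_alerts == 0:
        return "alert_and_limit_speed"   # prompt again while reducing fuel/speed
    return "controlled_stop"             # gentle braking, hazard lights, horn

for factor, alerts in [(85, 0), (60, 0), (40, 1), (20, 2)]:
    print(factor, choose_intervention(factor, alerts))
```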
- FIG. 2 shows a flow diagram of an algorithm for processing data from sensors associated with the vehicle and driver.
- Block 20 shows a plurality of arrows on the left representing data inputs from various sensors associated with a vehicle, following conversion to digital data.
- the digital data is input to block 21 which determines whether the data conforms to valid amplitude thresholds stored in block 22 . Signals beyond the thresholds are classified as noise or artefact and are ignored.
- the data is then input to block 23 which detects whether the data conforms to valid time duration thresholds stored in block 24 . Signals beyond the thresholds are classified as invalid and are ignored.
- the thresholds stored in blocks 22 and 24 are, for the purpose of the present embodiment, determined empirically from experimental trials.
- the data is then input to block 25 for signal compression.
- the role of block 25 is to simplify further processing by presenting the data in a minimized form. This is done by syntactic processing, whereby only the main data points of the signals, such as peaks, troughs and zero crossings or central points defining peaks of the signals, are presented for further processing.
- the data is then input to block 26 where it is categorized and summarized in terms of amplitude or power range, number of movements per second or other epoch, and phase relationships between the signals.
- the data may be displayed in tabular or graphical form and/or may be subjected to further automated processing to determine vigilance status.
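- the syntactic compression performed by block 25 could be prototyped as in the following sketch, which reduces a sampled signal to its peaks, troughs and zero crossings; the implementation details are an assumption for illustration.
```python
import numpy as np

def syntactic_compress(x):
    """Reduce a sampled signal to its principal points (peaks, troughs and
    zero crossings), as in the data reduction attributed to block 25.
    Returns (sample_index, value, kind) tuples in time order.
    """
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    points = []
    for i in range(1, len(x) - 1):
        if d[i - 1] > 0 and d[i] <= 0:
            points.append((i, float(x[i]), "peak"))
        elif d[i - 1] < 0 and d[i] >= 0:
            points.append((i, float(x[i]), "trough"))
    crossings = np.where(np.diff(np.signbit(x)))[0]   # index just before a sign change
    points.extend((int(i), float(x[i]), "zero") for i in crossings)
    return sorted(points)

t = np.linspace(0, 1, 50)
print(syntactic_compress(np.sin(2 * np.pi * 2 * t))[:5])
```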
- FIG. 3A shows a block diagram of a system for removing environmental noise from driver interfaced sensors.
- Block 30 represents various sensors for monitoring driver movements and block 31 represents sensors for monitoring environmental vibration and noise and vehicle artefacts.
- Blocks 32 and 33 represent circuits for amplifying and filtering signals from blocks 30 and 31 respectively.
- Block 34 represents analogue to digital converters for converting the signals from blocks 32 and 33 into digital form for processing via the digital signal processor in block 35 .
- Block 35 includes an algorithm for performing signal cancellation as illustrated in FIG. 3B .
- waveform A represents a signal from a driver interfaced sensor or sensors (Block 30 of FIG. 3A ).
- Waveform B represents a signal from a sensor or sensors associated with the vehicle engine and road noise pickup locations (Block 31 of FIG. 3A ).
- Waveform C represents a signal after it is processed by Block 35 . It may be seen that the signal represented by waveform C is obtained by cancelling or subtracting the signal represented by waveform B from the signal represented by waveform A.
- the signal represented by waveform C is a true or valid movement signal which is not corrupted by environmental noise.
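- the cancellation of waveform B from waveform A could be sketched as a simple scaled subtraction, as below; the assumption that the channels are time-aligned and the fixed calibration gain are illustrative simplifications of what block 35 would need to do in practice.
```python
import numpy as np

def cancel_reference(driver_channel, noise_channel, gain=1.0):
    """Subtract a scaled environmental reference channel (waveform B) from a
    driver-interfaced channel (waveform A) to recover the movement signal
    (waveform C). Assumes the channels are time-aligned; in practice `gain`
    would be estimated during calibration.
    """
    a = np.asarray(driver_channel, dtype=float)
    b = np.asarray(noise_channel, dtype=float)
    return a - gain * b

t = np.linspace(0, 1, 400)
road_noise = 0.3 * np.sin(2 * np.pi * 25 * t)
movement = np.where((t > 0.4) & (t < 0.6), 0.8, 0.0)
cleaned = cancel_reference(movement + road_noise, road_noise)
print(round(float(np.max(cleaned)), 2))   # 0.8: the movement survives, the noise is gone
```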
- FIG. 4A shows a flow diagram of a movement processing algorithm according to the present invention.
- signals from sensors 1 to 11 shown in block 12 of FIG. 1 are filtered, then referenced to period and amplitude threshold values before being converted to syntactic data.
- the syntactic data is correlated for determination of certain combinations of sensor movement signals indicating that the driver is in a vigilant or wakeful state.
- if a sensor signal or any combination of sensor signals is analysed as being void of subject movement, this may be interpreted as an indication that the driver is suspected of being in a non-vigilant or fatigued state. Analysis of the fatigued state is determined by certain expected patterns from the various sensor signals. Such patterns include very little movement from the steering wheel and very little movement from the seat sensors, indicating that the driver may be excessively relaxed and subject to fatigue, or at risk of fatigue onset.
- the functions of blocks 40 to 61 are as follows:
- FIG. 1 shows how the analog signals from sensors 1 to 11 are: converted to a digital signal ( FIG. 1 , block 15 ); input to the central processing unit ( FIG. 1 , block 16 ); and processed by a vigilance processing algorithm ( FIG. 1 , block 17 ).
- the start of the algorithm in FIG. 4A represents the start of a process, which is repeated many times for each input sensor 1 to 11 ( FIG. 4A shows the process for sensors 1 , 2 , 3 ).
- This process analyses data from each input sensor for the purpose of final determination of the driver's vigilance state, and whether this state warrants an alarm alert in order to assist in preventing a potential accident.
- the analog signal from each sensor is amplified, filtered and then converted to a digital signal in preparation for signal processing.
- Variables A, C, E to U provide the processing algorithms with threshold amplitude and period values, allowing sensor signal data reduction and syntactic signal processing.
- the variables (A, C, E to U) are determined via controlled studies from experimental and research data.
- FIG. 4B shows examples of: (1) signal components which are ignored because they fall below a minimum amplitude threshold, (2) syntactic data in which the signal is represented by its troughs and peaks, and (3) high-frequency components which are ignored because they fall below a minimum period threshold. The latter step retains the relatively lower frequencies which are typically due to driver movements.
- Inputs from sensors 4 to 11 (subject to system configuration): input from each of the vehicle's sensors is amplified, filtered and then analog-to-digital converted, in preparation for signal processing. This is performed by blocks similar to blocks 41 to 46. Inputs from more than 11 sensors can be catered for if required.
- Variable data is supplied via a default table (as determined by clinical data and/or neuro-node self-learning and adjustment), resulting from customisation to a specific subject's driving characteristics and system adaptation.
- Variables B, D, F to V.
- Longer-term data storage is designed to log the driver's movement data from each of the sensors. This stored data can be accessed at a later stage in order to review the driver's performance history with regard to movement analysis and subsequent vigilance.
- Short term direct access storage used for storing parameters such as the past 10 minutes of syntactic data for each sensor channel, in order to correlate the various data from each sensor or channel and compare this data combination to pre-defined sets of rules designed to describe combinations of sensor outputs which are typical of driver fatigue conditions.
- Driver-specific profile and calibration data can be stored for later correlation reference. By correlating with various thresholds or reference conditions, the system is able to determine from the sensor interactions when a particular driver's condition is similar to pre-stored reference characteristics.
- This comparative data is stored as data in look up tables.
- The data can consist of frequency and/or amplitude characteristics for a range of driver states; alternatively, the data can consist of samples of data (with acceptable variations to the samples of data) that exist for a range of driver states.
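- As an illustration only, the look-up table comparison described above might be sketched as follows; the states, ranges and values are hypothetical examples rather than clinically derived data.

```python
# Illustrative look-up table of hypothetical frequency/amplitude ranges for a
# few driver states, and a matcher returning the state whose stored
# characteristics cover the current measurements.
STATE_TABLE = {
    # state: (movement_rate_per_min range, mean_amplitude range)
    "alert":    ((8.0, 30.0), (0.3, 1.0)),
    "relaxed":  ((3.0, 8.0),  (0.1, 0.4)),
    "fatigued": ((0.0, 3.0),  (0.0, 0.15)),
}

def classify_state(movement_rate, mean_amplitude):
    for state, ((rate_lo, rate_hi), (amp_lo, amp_hi)) in STATE_TABLE.items():
        if rate_lo <= movement_rate <= rate_hi and amp_lo <= mean_amplitude <= amp_hi:
            return state
    return "unclassified"

print(classify_state(movement_rate=1.5, mean_amplitude=0.05))  # -> "fatigued"
```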
- Vehicle output signals: These include steering wheel movements, direction of steering wheel movements, speed of vehicle, change of speed of vehicle, engine vibration and noise, and road vibration and noise.
- Driver steering wheel adjustments: By processing driver steering wheel adjustments, comparing these adjustments with the various sensor signals and correlating the various sensor signals, it is possible to determine the probability that the driver is in a state of fatigue and the degree of driver fatigue.
- The vehicle signals are also analysed in order to assist in noise cancellation (ie vehicle noise as opposed to driver movement) and more accurate identification of valid driver movements.
- FIG. 5 shows typical samples of processed data following secondary analysis for sensor signals 1 to 4 .
- The data shows in graphical form the number of valid movements detected for each of sensors 1 to 4 during successive time intervals n, n+1, n+2 . . . . Tertiary analysis may be performed on this data, which would allow straightforward visual correlation between the various sensors.
- the samples shown in FIG. 5 demonstrate an example (dotted line) where the various sensors all experience obvious movement detection.
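- A minimal sketch of the per-interval tally behind FIG. 5 is given below; the sensor names, event times and interval length are invented for illustration. It counts valid movement events per sensor in successive intervals and flags intervals in which every sensor registers movement.

```python
# Count valid movement events per sensor in successive time intervals and
# flag intervals where all sensors show activity (illustrative data only).
def counts_per_interval(event_times, interval, n_intervals):
    counts = [0] * n_intervals
    for t in event_times:
        idx = int(t // interval)
        if 0 <= idx < n_intervals:
            counts[idx] += 1
    return counts

interval, n_intervals = 30.0, 4            # four 30-second intervals
sensor_events = {
    "seat":      [5.0, 12.0, 40.0, 95.0],  # hypothetical event times (s)
    "wheel":     [7.0, 44.0, 70.0],
    "seat_belt": [3.0, 33.0, 61.0, 99.0],
    "pedal":     [9.0, 41.0],
}
table = {name: counts_per_interval(t, interval, n_intervals)
         for name, t in sensor_events.items()}
for n in range(n_intervals):
    all_active = all(table[name][n] > 0 for name in table)
    print(f"interval {n}: "
          + ", ".join(f"{k}={v[n]}" for k, v in table.items())
          + ("  <- movement on all sensors" if all_active else ""))
```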
- The steering wheel sensors shown in FIG. 6 are divided into eight sections as follows:
- Top 62, top left 63, top right 64, left 65, right 66, bottom left 67, bottom right 68 and bottom 69.
- Sensors 62 - 69 are linked via eight cables to output pins 1 to 8 respectively.
- a common connection to each sensor is linked by cables to output pin 9 .
- Alternative configurations are possible with more or fewer sensors, and with the option of sensor arrays on both the upper and lower surfaces of the steering wheel grip surface.
- the outputs represented by pins 1 to 9 are connected to analogue signal conditioning circuits and via analogue to digital convertors to digital signal processing circuits as described above.
- the pressure may be compared to previous values and/or calibrated values to determine whether a pattern of increased or decreased pressure reflects driver fatigue onset.
- the system may calculate and deduce an appropriate point at which the driver should be alerted.
- the appropriate point may be determined from a combination of pre-calibrated data for a specific driver and/or pre-programmed patterns, states or trends in the data including relative and absolute pressure values obtained from a set or subset of vehicle sensors.
- FIG. 7 shows a block diagram of a vigilance monitoring system utilizing video data.
- Block 70 represents a video CCD (charge coupled device) camera which may be located on the driver's visor, dash-board or other suitable location to enable video monitoring of the driver's eyes. An infra-red lens may be utilized to facilitate reliable night video monitoring capability.
- the output of the video camera is passed to block 71 .
- Block 71 is an analog to digital converter for digitizing the video signal prior to processing via block 72 .
- Block 72 is a central processing unit and includes a video processing algorithm.
- The video processing algorithm has eye recognition software designed to identify the eyes in contrast to other parts of the driver's face, allowing the driver's eyes to be isolated and analysed.
- This analysis includes determining the area of the eye's opening and correlating the eye's opening area to previous similar measurements. In this way eye processing can determine whether a driver's eyes are remaining open as would be expected in an alert state or whether the current eye opening of the driver is relatively less (when compared to earlier eye opening measurements). Rates or degrees of eye closure are able to be detected and continually monitored in this manner.
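- A minimal sketch of this relative eye-opening analysis is given below, assuming the eye-recognition stage supplies an eye-opening area per video frame; the baseline length, closure ratio and area values are illustrative assumptions.

```python
from collections import deque

# Sketch of relative eye-closure estimation: compare the current eye-opening
# area against a rolling baseline of recent "open" values.
class EyeClosureMonitor:
    def __init__(self, baseline_len=100, closure_ratio=0.3):
        self.baseline = deque(maxlen=baseline_len)   # recent eye-opening areas
        self.closure_ratio = closure_ratio           # fraction regarded as "reduced opening"

    def update(self, eye_area):
        """Return degree of closure (0 = fully open vs baseline, 1 = fully closed)."""
        if self.baseline:
            ref = max(self.baseline)                 # recent fully-open reference
            closure = max(0.0, 1.0 - eye_area / ref)
        else:
            closure = 0.0
        self.baseline.append(eye_area)
        return closure

monitor = EyeClosureMonitor()
for area in [410, 420, 415, 300, 150, 90]:           # hypothetical areas in pixels
    closure = monitor.update(area)
    flag = " <- reduced eye opening" if closure >= monitor.closure_ratio else ""
    print(round(closure, 2), flag)
```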
- Block 73 represents outputs of block 72 including
- FIG. 8 shows a flow diagram of an algorithm suitable for processing video data.
- the functions of blocks 80 to 94 are as follows:
- CAPTURE EYE VIDEO DATA: Capture current video frame. Digitise video frame of subject's eyes. Eye data can be captured via one or more of the following means: CCD video camera, electro-oculogram data capture via a subject-worn headband, direct electrode attachment, driver glasses, head-cap or movement sensors, or infrared or other light beam detection means.
- Patterns include:
- Fatigue threshold time period variable X set from default values, subject calibration or system self-learning/calculation.
- Fatigue threshold time period variable Y set from default values, subject calibration or system self-learning/calculation.
- Blink rate fatigue characteristics set from default values, subject calibration or system self-learning/calculation.
- FIGS. 9 and 10 show examples of eye opening and eye position data produced by the system of FIGS. 7 and 8 .
- FIG. 11 is a flow chart of the main vigilance processing algorithm. The functions of blocks 95 to 99 are as follows:
- FIG. 12 is a block diagram of a vehicle monitoring system according to the present invention.
- FIG. 12 is an overview of a system which utilizes many of the features discussed herein.
- the functions of blocks 100 to 118 are as follows:
- Driver EEG sensors: direct attach electrode, headband, wireless electrode, driver cap and other EEG signal pick-up means.
- Driver eye movement detection: via electrode, driver glasses/goggles, infrared or other light beam means of tracking detection, or other means.
- Vehicle status interface: speed, direction, accelerator position, brake position, indicators, lights, amongst other vehicle status data.
- In-phase signal detection and processing: Applies processing which determines patterns of in-phase signal occurrence and associates these with the driver or background noise as the originating source.
- Anti-phase signal detection and processing: Applies processing which determines patterns of anti-phase signal occurrence and associates these with the driver or background noise as the originating source.
- Vehicle background and environmental noise sensors: to allow noise cancellation, filtering and reduction.
- These sensors include microphone and vibration sensors located at strategic positions in order to pick up background vehicle noise such as road noise and engine noise.
- Fourier transform and frequency analysis of background noise assists in selection of digital filtering characteristics to most effectively minimise vehicle environmental noise and assist in distinguishing driver related fatigue monitoring signals.
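- As a sketch of how such frequency analysis might feed filter selection (an assumption for illustration, not a procedure taken from the specification), the following Python fragment locates the dominant band in a background-noise reference channel so that a band-stop or notch filter can be centred on it.

```python
import numpy as np

# Locate the strongest noise band in a background-noise reference channel.
def dominant_noise_band(noise_samples, sample_rate, band_width=5.0):
    spectrum = np.abs(np.fft.rfft(noise_samples))
    freqs = np.fft.rfftfreq(len(noise_samples), d=1.0 / sample_rate)
    peak = freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
    return (peak - band_width / 2, peak + band_width / 2)

# Example: engine vibration dominated by a 60 Hz component (simulated)
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
noise = 0.8 * np.sin(2 * np.pi * 60 * t) + 0.1 * np.random.randn(t.size)
print("suggested band-stop:", dominant_noise_band(noise, fs))
```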
- The system will continually “self-learn” various vehicle background and threshold noise levels, frequencies and other characteristics in order to track changing vehicle noise conditions, enabling subsequent noise cancellation or the capability to ignore unwanted vehicle noise while processing “real” driver movement and physiological signals and the resulting fatigue status.
- Artificial intelligence Signal characteristics as generated by a range of varying road conditions can be programmed into the system. The input data relating to various road conditions thereby provides a means to further distinguish wanted driver related signals from unwanted background noise signals.
- Driver Fatigue Processing Algorithm: Correlation with previous driver fatigue conditions together with comparison of outputs for each of the above listed fatigue algorithms (Driver EEG, motion, eye, vehicle status).
- Vehicle fatigue display systems for displaying to the driver the current fatigue status or early warning indicators of fatigue status.
- System communication, storage and printing peripheral interface: Data storage, reporting processing, reporting print interface, wireless and wire connected interfaces, for real-time or post communication of fatigue data and fatigue status information.
- System can include GSM, cellular phone, satellite or other means of moving vehicle tracking and data exchange in real-time or at any required later stage. This information transfer can be an effective means for trucks and other vehicles to have their driver status processed and reviewed, as appropriate and as required.
- FIG. 13 shows one form of transducer for monitoring posture of a driver or equipment operator.
- FIG. 13 shows a webbed structure comprising strips or elements of flexible PVDF or Piezo material separated by flexible insulation material terminated at A, B, C, D, E, F, G and H. Output signals from the respective strips are buffered, amplified, filtered and then analog to digital converted to data. This data may be processed to determine an actual position of pressure applied to the above structure. By analysing the two main co-ordinates and the amplitudes of signals associated with those co-ordinates, the exact position of pressure applied by the vehicle driver or equipment operator may be determined.
- The position where greatest pressure is applied is defined by the intersection of web strip pairs (eg. B and F) which produce the greatest signal amplitude.
- the position may be described by coordinates reflecting the web strip pairs (eg. B,F) which produce the greatest signal amplitude.
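- A minimal sketch of this coordinate determination is shown below; the strip labels follow FIG. 13, while the signal amplitudes are invented for illustration.

```python
# Locate the point of greatest applied pressure on the webbed transducer:
# the strip from each direction with the largest rectified signal amplitude
# defines the intersection coordinate.
def pressure_position(horizontal_strips, vertical_strips):
    """Each argument maps a strip label to its rectified signal amplitude."""
    row = max(horizontal_strips, key=horizontal_strips.get)
    col = max(vertical_strips, key=vertical_strips.get)
    return row, col

horizontal = {"A": 0.05, "B": 0.62, "C": 0.10, "D": 0.08}   # e.g. strips A-D
vertical   = {"E": 0.07, "F": 0.55, "G": 0.12, "H": 0.04}   # e.g. strips E-H
print(pressure_position(horizontal, vertical))               # -> ('B', 'F')
```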
- the above transducer may be used in conjunction with the movement sensors described herein to provide a further layer of positional information relating to applied pressure for each sensor. This information may be important in circumstances where a driver's pressure to the steering wheel or the driver's pattern of hand placement (with respective applied pressure) varies in accordance with alertness and drowsiness.
- the posture of the driver or equipment operator may be monitored, stored, correlated with various threshold states and/or displayed in meaningful graphic or numerical form.
- the threshold states may be derived by way of calibration for each specific driver's posture profile under various states of fatigue and/or stress states and conditions.
- the anti-snooze device shown in FIG. 14 includes sensors (block 120 ) connected to an acquisition and processing means (block 121 ).
- Block 122 includes monitoring means designed to amplify, filter and analog to digital convert driver sensor signals in preparation for digital signal processing.
- the digital signal processing means (block 121 ) includes a calibration algorithm as shown in FIG. 15 and a main relax detection algorithm as shown in FIG. 16 .
- The driver can select the relax calibration function, then take on the driving posture that most closely represents a relaxed or possibly fatigued driving state; the system will then monitor and store the minimum threshold of driver activity over a period of approximately, but not limited to, 10 seconds, as a relaxed driver reference level.
- The driver can select an active calibration function, then take on the driving posture that most closely represents a normal driving state; the system will then monitor and store the minimum threshold of driver activity over a period of approximately, but not limited to, 10 seconds, as an active driver reference level.
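- The calibration step described in the preceding two paragraphs might be sketched as follows; the ten-second window is taken from the text, while the sampling period and the simulated sensor read-out are assumptions for illustration.

```python
import random
import time

# Record sensor activity for roughly the calibration window and keep the
# minimum observed activity level as the reference for that mode.
def calibrate_reference(read_activity, duration_s=10.0, sample_period_s=0.1):
    """Return the minimum activity level observed over the calibration window."""
    end = time.monotonic() + duration_s
    minimum = float("inf")
    while time.monotonic() < end:
        minimum = min(minimum, read_activity())
        time.sleep(sample_period_s)
    return minimum

# Hypothetical stand-in for combined driver-sensor activity while relaxed
def simulated_relaxed_activity():
    return random.uniform(0.02, 0.10)

# Short window used here only so the example runs quickly
relaxed_reference = calibrate_reference(simulated_relaxed_activity, duration_s=1.0)
print("relaxed reference level:", round(relaxed_reference, 3))
```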
- the relaxed and active driver reference levels stored in the system may be displayed on the visual touch screen display for various sensors.
- The system may perform a validation function by replaying the driver's relaxed and active reference levels on the touch screen. This allows easy comparison to be made with actual sensor levels when the driver adopts postures representing normal/fatigued states, and serves to validate the correctness of the stored reference levels.
- the driver can also select a sensitivity function which may determine how close to the relaxed level the driver needs to be before the anti-snooze system alerts the driver.
- The anti-snooze device has the ability to act as a self-warning aid by simply alerting the driver when his posture or driving vigilance is deteriorating. If, for example, a driver's steering wheel grip weakens or shows signs of fatigue, the anti-snooze system can be calibrated to detect this condition and alert the driver.
- The driver can have calibration data determined by an off-road simulator that more accurately defines the characteristics of each specific driver's activity variations and physiological variations during dangerously relaxed or fatigued driving conditions.
- the calibration data can be up-loaded to the anti-snooze device to provide more accurate relaxed and active reference levels.
- The calibration data may also provide a more accurate means of determining the relative effect that each individual sensor has during a driver's transition from active and alert to drowsy and fatigued. The effects of each sensor may be recorded, and this data may assist in more accurate anti-snooze detection.
- The system may detect the driver's hand pressures via the steering wheel sensors, the driver's respiration and ECG via the seatbelt sensors, and the driver's posture and movement via the seat sensors.
- the anti-snooze system may continually monitor and average the signal amplitudes of all sensors, while comparing the current levels of sensor amplitude with the calibrated levels.
- the system may also compare current movement sensor patterns to reference data.
- This reference data can represent certain threshold levels calibrated to each individual driver or general reference conditions.
- the various sensors may be weighted in accordance with their respective importance in determining whether a driver's current state of activity is below the threshold or appropriately close to the relaxed mode calibrated reference level to warrant that the driver be alerted.
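- A minimal sketch of such a weighted comparison is given below; the sensor weights, reference levels and sensitivity setting are illustrative assumptions, not values from the specification.

```python
# Express each sensor's current averaged amplitude relative to its calibrated
# relaxed/active levels, weight by assumed per-sensor importance, and alert
# when the weighted score falls close to the relaxed reference.
SENSOR_WEIGHTS = {"steering_wheel": 0.4, "seat": 0.3, "seat_belt": 0.3}   # assumed

def vigilance_score(current, relaxed_ref, active_ref):
    """Return 0.0 (at relaxed reference) .. 1.0 (at active reference), weighted."""
    score = 0.0
    for name, weight in SENSOR_WEIGHTS.items():
        span = max(active_ref[name] - relaxed_ref[name], 1e-9)
        level = (current[name] - relaxed_ref[name]) / span
        score += weight * min(max(level, 0.0), 1.0)
    return score

relaxed = {"steering_wheel": 0.05, "seat": 0.10, "seat_belt": 0.04}
active  = {"steering_wheel": 0.60, "seat": 0.50, "seat_belt": 0.30}
now     = {"steering_wheel": 0.12, "seat": 0.14, "seat_belt": 0.06}

score = vigilance_score(now, relaxed, active)
sensitivity = 0.25                       # driver-selected sensitivity setting
print("score:", round(score, 2), "ALERT" if score < sensitivity else "ok")
```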
- The anti-snooze device can restrict the speed of the vehicle or slowly bring the vehicle to a standstill in order to reduce the likelihood of an accident. This ability to restrict the vehicle's speed could be overridden by the driver, as is possible in “auto-cruise” devices currently available on many vehicles.
- the techniques and methodologies may include relatively complex neurological waveform analysis techniques, video tracking of driver eye motions, sophisticated noise cancellation and simpler driver interactive processes such as sensitizing the steering wheel, seat-belt, gear-stick and other driver cabin regions.
- One application for the present invention may include a truck driver vigilance monitoring (TDVM) system.
- This system may be designed around the “dead-man” handle concept as applied successfully in trains.
- a variation of this system may provide visual cues and driver vigilance response testing.
- the TDVM system may include pre-programmed Light Emitting Diode (LED) displays to be activated in various sequences and at various frequencies and durations.
- The truck driver can be visually prompted by way of these LEDs to press the steering wheel according to whether the left or right or both LEDs are flashed.
- the response time and accuracy of the driver's response to the prompts may be measured and relayed back to a remote monitoring control station.
- Various drivers will have calibrated “vigilant response times and accuracy levels” which can be compared to actual current response times. Where appropriate, an alarm can be activated, if the response times indicate fatigue on-set or a potentially dangerous state.
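- As an illustration of this comparison (with invented timings and an assumed slow-down factor), the following sketch flags a response as suggesting fatigue on-set when it is markedly slower than the driver's calibrated vigilant response time or when the response is incorrect.

```python
# Compare measured prompt-response times against a driver's calibrated
# vigilant baseline; values below are illustrative only.
def evaluate_response(measured_s, correct, calibrated_s, slowdown_factor=1.5):
    """Return True if the response suggests fatigue on-set."""
    too_slow = measured_s > calibrated_s * slowdown_factor
    return too_slow or not correct

calibrated_response_s = 0.45             # this driver's vigilant baseline (s)
trials = [(0.50, True), (0.55, True), (0.90, True), (1.10, False)]
for measured, correct in trials:
    flagged = evaluate_response(measured, correct, calibrated_response_s)
    print(measured, correct, "-> fatigue suspected" if flagged else "-> ok")
```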
- sequences and durations can be validated in accordance with clinical trials to provide an effective method of vigilance detection.
- Sequences and patterns of visual truck cabin prompts can be established to minimize driver conditioning.
- Frequency of vigilance test prompts can be determined in accordance with requirements as determined via field studies.
- Safety considerations to avoid driver distraction by the proposed monitoring system may be implemented. Techniques may include utilization of “busy” response prompts, specially designed within the system, to alert the monitoring control unit that the driver is vigilant but unable to respond at the time due to driving demands.
- the TDVM system may include the following components:
- This software may include a processing algorithm(s) designed to evaluate various driver prompts and response times. Evaluation of these response times may produce a probability factor associated with driver vigilance for each specific driver. Analysis capability of driver response times may be an important element of the system. Accuracy of vigilance probability outcome, clinical analysis and scientific validation associated with this process may determine effectiveness of the monitoring system.
- This device may adapt to the truck steering wheel and provide output signals subject to a particular zone of the steering wheel, which has been activated by applying various degrees of pressure to the steering wheel.
- Controller Unit & Monitoring Device (CU&MD).
- This device may provide a communication link and data management for interfacing the truck's CU&MD to a remotely located monitoring station.
- This device may also provide the transducer interface and transducer signal recording and detection capabilities.
- This device may also output control to the driver indicator LEDs and record and transmit vigilance response times to the remote monitoring station.
- This device may be interfaced to the CU&MD unit and may provide visual response prompt to the truck driver.
- This system may provide a remote operator with visual alarms when vigilance response times are outside acceptable thresholds.
- This system may also provide communication links to the truck.
- This system may also provide analysis and system reporting to allow real-time tracking of vigilance performance and vigilance alarm status.
Abstract
The present invention pertains to a system and method of monitoring the alertness or wakefulness of a driver. The monitored parameters include cardiac, respiratory and movement parameters. Sensors are located in various locations of the driver side section to detect the vigilance of a driver. These sensors include pressure sensors embedded in the seat and pedals, and a head band for monitoring EEG, EMG and EOG signals.
Description
- The present invention relates to a vigilance monitoring system. In particular the invention relates to a system for monitoring, recording and/or analysing vigilance, alertness or wakefulness and/or a stressed state of an operator of equipment or machinery in a variety of situations including situations wherein the degree of vigilance of the operator has implications for the safety or well being of the operator or other persons. A typical application may include monitoring the driver of a vehicle or pilot of an aircraft, although the invention also has applications in areas involving related occupations such as train drivers and operators of equipment such as cranes and industrial machinery in general, and where lack of operator vigilance can give rise to harmful social or economic consequences.
- The system of the present invention will be described herein with reference to monitoring a driver of a vehicle nevertheless it is not thereby limited to such applications. For example, other applications may include monitoring routine, acute or sub-acute physiological parameters of a person or subject in a home, work, clinic or hospital environment. The monitored parameters may include cardiac, respiratory and movement parameters as well as parameters relating to apnea events, subject sleep states or sudden death syndrome on-set.
- The monitoring system is designed, inter alia, to provide non-invasive monitoring of a driver's physiological data including movement activity, heart activity, respiration and other physiological functions. The monitored physiological data may undergo specific analysis processing to assist in determining of the driver's state of vigilance. The system is designed to detect various states of the driver's activity and detect certain conditions of driver fatigue or relaxation state that could lead to an unsafe driving condition or conditions.
- The system of the present invention includes means for gathering movement data associated with the driver. The movement gathering means may include a plurality of sensors such as touch sensitive mats placed in locations of the vehicle that make contact with the driver, such as the seat, steering wheel, pedal(s), seat belt or the like. Each location may include several sensors or mats to more accurately monitor movements of the driver.
- Signals from the various sensors/mats may be processed and analysed by a processing means. The processing means may include a digital computer. The processing means may be programmed to recognize particular movement signatures or patterns of movement, driver posture or profile and to interpret these to indicate that vigilance has deteriorated or is below an acceptable threshold. The processing means may include one or more algorithms.
- The sensors or mats may include piezoelectric, electrostatic, piezo ceramic or strain gauge material. The latter may be manufactured by separating two conductive materials such as aluminium foil with an electrolyte material which is capable of passing AC but not DC current. In one form the sensors or mats may include Capacitive Static Discharge (CSD) or Polyvinylidene fluoride (PVDF) material. The sensors/mats may be covered with a non-obtrusive, flexible surface which is capable of detecting pressure and/or monitoring electrophysiological activity.
- The pressure detecting capability may be used for detecting driver movement. The or each sensor may produce an output signal that represents the magnitude of the pressure or force that is applied to the sensor. The or each pressure signal may thus represent an absolute or quantitative measure of pressure applied to the sensor. The electrophysiological activity may include electrical signals generated by the body of the driver eg. electrical muscle activity and/or pulse activity.
- The sensors or mats may be located in various parts of a vehicle. The seat of the driver may be divided into several sections such as upper or back and lower or seat. The upper or back section may include sensors in the top edge, centre and base. The lower or seat section may include sensors in the front edge, centre and rear. The or each sensor may include CSD or PVDF material.
- The steering wheel may include a plurality of sensors. The steering wheel may be divided into eight zones such as upper, upper left, upper right, left, right, lower left, lower right and lower. At least one sensor may be associated with each zone. The or each sensor may include CSD or PVDF material.
- The floor covering such as carpet may include a plurality of sensors. The floor covering or carpet may be divided into a plurality of zones. At least one sensor may be associated with each zone. The or each sensor may include CSD or PVDF material.
- The accelerator, clutch and brake pedals may include a plurality of sensors. Each pedal may be divided into a plurality of zones such as upper, middle and lower. At least one sensor may be associated with each zone. The or each sensor may include CSD, PVDF or other movement sensitive material.
- The seat belt may include one or a plurality of sensors. In one form a sensor or sensors may be embedded in the fixed (i.e. non-retractable) section of the seat belt. The or each sensor may include CSD or PVDF material.
- In some embodiments a head tilt device incorporating a positional switch or the like may be associated with the driver's cap, glasses or goggles or may be arranged to clip over the driver's ear or glasses. The head tilt device may be adapted to provide a signal or data which alters in accordance with the position of the driver's head. Alternatively a radio tracking device may determine and track a subject's head movements.
- In critical applications of vigilance monitoring including applications involving pilots of aircraft, persons responsible for navigating/controlling shipping and drivers of road or rail transport it may be desirable to utilize more comprehensive methods of vigilance monitoring. The latter may include techniques used in conventional sleep monitoring. A head band and/or chin band sensor may be used to monitor EEG, EMG and EOG signals. The head band sensor may include separate left and right frontal zones and left and right eye zones. The sensor may include CSD or PVDF material or other material sensitive to measuring patient skin electrical surface variations and/or impedance.
- Various sensors/techniques may be adapted for monitoring eye movement including those based on reflected light, electric skin potential, contact lenses, limbus tracking, video imaging and magnetic induction. The sensors/techniques may include EOG electrodes, infrared detection of eye movements and/or video tracking and processing of eye movements. The sensors/techniques may be adapted for monitoring the left eye only or the right eye only or both eyes.
- Raw data which is collected from the various sensors positioned around the vehicle may be filtered and amplified prior to processing and analysis. A significant purpose of the processing and analysis is to determine the driver's state of vigilance, alertness or wakefulness. In some embodiments, the system may be adapted to effect remedial action, ie. the system may take steps to alert the driver or to actively intervene in the control of the vehicle, when it is deemed that such action is warranted or desirable.
- Processing of data may be performed in several stages, including primary, secondary and tertiary analysis.
- Primary analysis refers to processing of raw data from the various sensors. This raw data may be filtered and amplified prior to analog to digital conversion. Primary analysis may be adapted to determine valid body movements of the driver as distinct from spurious signals and artefacts due to environmental factors including noise.
- Valid body movements may be determined by applying a combination of processing techniques including:
-
- 1. signal threshold detection whereby signals below or above a pre-determined threshold are ignored and/or classified as noise or artefact,
- 2. frequency filtering whereby high-pass, low-pass and notch filters are adapted to remove noise and artefact signals,
- 3. signal compression whereby data is minimized by presenting main data points such as signal peaks, troughs, averages and zero crossings.
- 4. half period, amplitude analysis of signals, including analysis as disclosed in AU Patent 632932 entitled “Analysis System for Physiological Variables”, assigned to the present applicant, the disclosure of which is incorporated herein by cross reference.
- Threshold detection may facilitate distinguishing random and non-significant electrical noise (typically spikes of small duration) relative to signals representing valid or actual body movements. Threshold detection may apply to both amplitude and duration of the signals. The relevant threshold(s) may be determined from clinical trials and/or historical data. Where the detection is based on amplitude it may be determined in both negative and positive phases of the signal. Amplitude detection may be based on a measurement of the peak-to-peak signal. Alternatively, the positive and negative peak amplitudes can be measured separately. Threshold detection may be combined with a form of zero-base line detection so that electronic offsets do not adversely affect the accuracy of threshold detections. Each body movement which exceeds the predetermined amplitude and/or duration may be classified as an event for further processing.
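- A minimal sketch of this combined amplitude and duration threshold detection is given below; the baseline band, peak-to-peak threshold and minimum duration are illustrative assumptions rather than clinically derived values.

```python
# Primary-analysis event detection: a candidate movement starts when the
# signal leaves a zero-baseline band, and it is kept as a valid event only if
# its peak-to-peak amplitude and its duration both exceed thresholds.
def detect_events(samples, dt, baseline_band=0.02, min_p2p=0.1, min_duration=0.05):
    events = []
    start = None
    for i, s in enumerate(samples + [0.0]):          # trailing 0 closes any open event
        if abs(s) > baseline_band:
            if start is None:
                start = i
        elif start is not None:
            segment = samples[start:i]
            p2p = max(segment) - min(segment)        # peak-to-peak amplitude
            duration = (i - start) * dt
            if p2p >= min_p2p and duration >= min_duration:
                events.append({"t": start * dt, "duration": duration, "p2p": p2p})
            start = None
    return events

dt = 0.01
sig = ([0.0] * 10 + [0.3, 0.6, 0.4, -0.2, -0.5, -0.1]   # valid movement
       + [0.0] * 10 + [0.03, 0.04] + [0.0] * 5)         # small spike: rejected
print(detect_events(sig, dt))
```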
- Secondary analysis may be adapted to process the results of primary analysis. Secondary analysis may process data for the purpose of presentation and/or display. Data may be displayed or printed in a tabular, graphical or other format which facilitates interpretation of the data. One purpose of the representation and/or display is to represent a driver's state of vigilance and/or fatigue. In one form each event identified during primary analysis may be counted for a fixed period of time or epoch. The fixed period of time may be 30 seconds or 60 seconds, or other period which is adequate for determining body movement trends. The count value or number of valid body movements in a given period (eg. 30 seconds) may be represented in a display as, say, the length of a vertical bar.
- Where it is desired to display the energy or power associated with valid body movements in a particular epoch or time period, the average amplitude associated with each event may be indicated by the length of the vertical bar whilst the count value or number of valid body movements for each epoch may be represented by colouring the vertical bar. For example the colours green, blue, yellow, orange, red may indicate count values or movement numbers in ascending order ie. green indicating the lowest movement number for a particular epoch and red indicating the highest movement number for a particular epoch. Alternatively, data may be displayed on a 3 dimensional graph wherein for example the x dimension of the graph represents time or epochs, the y dimension represents the average amplitude, while the z dimension represents the number of events during a particular epoch. The above display techniques, may facilitate interpretation of the number of valid body movements and the amplitude of those movements and association of this data with the driver's activity or state of vigilance, alertness or wakefulness.
- It may also be relevant to measure the period of each individual body movement as this may provide an indication of the energy that is associated with the movement. For example, if a driver squeezes the steering wheel in a rapid response as distinct from gripping the wheel as part of a focussed steering manoeuvre, the pattern of signals in each case will be different. The rapid response may appear as a small cluster of movements/signals or as a single movement/signal with a relatively short duration or period of time. In contrast, the steering manoeuvre may appear as a larger cluster of movements/signals over a relatively longer period of time or as a single movement/signal having a relatively long duration.
- The type of signal which may be expected (cluster or single movement/signal) will depend in part upon the type of sensor. For example, piezo ceramic or PVDF sensors may emit fewer clusters of signals but may emit signals with larger time periods in relation to the actual period of the movement which is being monitored. A capacitive electrostatic sensor is more likely to emit clusters of “spikes” being relatively short period signals. It may be necessary to record the energy level of each movement as this energy level may fall below a certain threshold when the driver is in a fatigued state. If, for example the driver has relaxed, then the energy of a body movement in the actions of driving may be significantly more subdued than in the case where the driver is alert, and his muscle activity is significantly greater. Therefore it may be useful to measure and record each and every body movement. This data could be displayed on high-resolution graphs where for example the X-axis represents ½ second periods and 960 lines make up each continuous section—or 480 seconds (8 minutes). The largest amplitude signal in each ½ second period could then be displayed on the X-Axis. The Y-Axis on the other hand could be represented by a scale of amplitudes representing each body movement. This graph would be more precise in representing the actual signal level of each body-movement and the subsequent muscle status for a driver.
- It may also be useful to detect events that are represented by groups of movements, where, for example, the groups of movements may be indicative of a driver activity of interest. Detection of groups of movements may include user configurable or preset values for;
-
- the maximum time between consecutive body-movements in order to qualify as being counted as part of a periodic body-movement.
- the number of consecutive body-movements that are required to qualify for a periodic movement.
- the time period during which this number of body-movements must exist in order to qualify as a periodic body-movement.
- Periodic measurement analysis can detect, for example, absence of movements which can be associated with a driver's fatigue.
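- The group-movement test described above might be sketched as follows, with assumed values for the maximum gap between movements, the required number of consecutive movements and the overall time window.

```python
# Given the times of individual body movements, report whether a qualifying
# periodic (group) movement exists. Parameter values are assumptions.
def has_periodic_movement(times, max_gap_s=2.0, min_count=4, window_s=10.0):
    times = sorted(times)
    run = [times[0]] if times else []
    for prev, cur in zip(times, times[1:]):
        if cur - prev <= max_gap_s:
            run.append(cur)
        else:
            run = [cur]                  # gap too long: start a new run
        if len(run) >= min_count and run[-1] - run[0] <= window_s:
            return True
    return False

print(has_periodic_movement([1.0, 2.2, 3.5, 4.1, 5.0]))   # True
print(has_periodic_movement([1.0, 6.0, 14.0, 30.0]))      # False: gaps too long
```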
- Tertiary analysis may be adapted to process the results of secondary analysis. One purpose of tertiary analysis is to determine the status of a drivers state of vigilance, alertness or wakefulness. Tertiary analysis may process the results of secondary analysis to produce intermediate data and/or indicate trends in the data. The intermediate data and trends may be used to provide summary reports and further tabular and/or graphic representations of a drivers status or condition. The intermediate data may be processed by one or more vigilance algorithms to determine the status of a driver's vigilance, alertness or wakefulness. Intermediate data of various types may be derived and the vigilance algorithm(s) may make use of such data to determine the status of the driver. The intermediate data may include:
-
- Rate of change of body movement detections
- Rate of change of body movement amplitudes
- Area under curve of time versus body movement, for various sequential epochs to detect trends of subject movement changes (amplitude or number of movements)
- Correlation of sensor data for patterns of amplitude, energy and body movement changes that can be associated with driver fatigue
- Change in frequency of body movement signals
- Change in amplitude periods of body movement signals
- Change in phase relationships of body movement signals
- Relative phase relationship between each section and other types of sensor sections.
- Following tertiary analysis the vigilance algorithm(s) may be adapted to correlate the intermediate data and/or apply combinational logic to the data to detect patterns of movement (or lack thereof) which, based on historical data or clinical trials, indicates that the driver is or may be excessively relaxed or is below an acceptable threshold of vigilance, alertness or wakefulness.
- The vigilance algorithm(s) may incorporate one or more look up tables including reference movement data and default values associated with acceptable and unacceptable levels of driver fatigue. Histograms including movement histograms of the kind described in AU Patent 632932 based on the work of Rechtschaffen and Kales (R & K) may be used as well as tables showing weighted values and actual movement data for each sensor.
- The vigilance algorithm(s) may determine a vigilance probability factor (0-100%) as a function of weighted movement data values.
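- As an illustration only, a weighted vigilance probability factor might be computed as sketched below; the sensor weights and expected per-epoch movement counts are invented for illustration and would in practice come from calibration or clinical data.

```python
# Weighted vigilance probability factor (0-100%): each sensor's recent
# movement count is normalised against an expected "alert" count and weighted.
EXPECTED_ALERT_COUNTS = {"steering_wheel": 12, "seat": 8, "pedals": 6}   # per epoch
WEIGHTS = {"steering_wheel": 0.5, "seat": 0.3, "pedals": 0.2}

def vigilance_probability(movement_counts):
    factor = 0.0
    for sensor, weight in WEIGHTS.items():
        ratio = movement_counts.get(sensor, 0) / EXPECTED_ALERT_COUNTS[sensor]
        factor += weight * min(ratio, 1.0)
    return round(100.0 * factor, 1)

print(vigilance_probability({"steering_wheel": 11, "seat": 7, "pedals": 5}))  # high
print(vigilance_probability({"steering_wheel": 2, "seat": 1, "pedals": 0}))   # low
```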
- Upon detecting a vigilance probability factor which is below an acceptable threshold, the system may be arranged to intervene in the control of the vehicle or to alert the driver of the vehicle and/or other vehicles. Vehicle control intervention may include restriction of speed, controlled application of brakes, cutting-off fuel and/or disabling the accelerator pedal. Driver alerting intervention may include use of sprays designed to stimulate the driver, vibrating the steering wheel, seat belt or floor area in the vicinity of the driver, an audible alarm and/or use of bright cabin lights. The driver can also be alerted by winding down the driver window and/or other effective alerting methods as may be applicable to each individual driver. Drivers of other vehicles may also be alerted by means of flashing hazard lights and/or sounding of a siren. Vehicle control intervention may be integrated with and form part of a vehicle control system or it may be interfaced to an existing vehicle control system. Vehicle control intervention may be interfaced with GSM or other communication systems to provide early warning indication that a driver or operator of equipment is in a stressed, fatigued or other undesirable condition that may be detected.
- To assist differentiating normal and acceptable driver vigilance from fatigued or inappropriate driver conditions, calibration of the various sensor and transducer outputs is possible. Calibration can set the system's detection parameters in accordance with varying driver movement and other driver signals. Calibration is beneficial because driver sensor and signal outputs will vary with different drivers. Background noise will also vary with different vehicles. The need for calibration may be proportional to the critical nature of the driving or dependent on the level of accuracy required for fatigue monitoring and detection.
- The need for calibration may to some extent be removed by utilizing artificial intelligence to distinguish baseline conditions for a drivers normal wakeful state to facilitate subsequent analysis and determining when a driver's state indicates fatigue or lapse of vigilance. Artificial intelligence may be embodied in one or more automated systems including one or more mathematical algorithms. Artificial intelligence includes the systems ability to self-learn or teach itself conditions associated with the driver which constitute normal or alert driving as distinct from conditions which constitute abnormal or fatigued driving.
- Artificial intelligence may allow the driver of a specific vehicle to select a mode of operation during which the driver's movements during normal or wakeful driving are monitored and diagnosed in order to determine typical thresholds and correlations between various sensors, for the purpose of determining true fatigue states of the driver as distinct from alert states of the driver. Artificial intelligence may also facilitate adaptation of the vigilance algorithm(s), to the specific vehicle's background noise characteristics.
- Artificial intelligence may include different response patterns for correlating movement data from the various sensors for distinguishing valid driver movements from environmental vibrations and noise. These may be classified and described by, for example, a look up table that records expected patterns or combinations of signals for different cases of environmental noise as distinct from driver generated signals. For example, if the driver moves his hand, signals from sensors in the steering wheel and arm sections of the seat may correlate according to a specific pattern. Alternatively, if the vehicle undergoes a severe or even subtle vibration due to road or engine effects, a broader range of sensors may be similarly affected and this may be manifested as amplitudes which follow predetermined correlation patterns. Signals from the sensors may increase in strength or amplitude according to the proximity of the source of the sound or vibrations. Where the source of the vibration is localized, this may manifest itself as a pattern of similar waveforms across the various sensors which reduce progressively in amplitude as the sensors' distance from the source increases. For example, if the source of the vibration is road noise, the floor sensors may register maximum amplitude whereas the steering wheel sensors which are furthest from the road noise may register minimum amplitude.
- The phase relationship of vibrations from various sources may also provide some guide as to the likely source of the vibrations. For example, if the vibrations emanate from the driver's movement then it is more likely that several signals with similar phase may be detected. On the other hand, if the signals have varying phase relationships, then it is more likely that the source of the vibrations giving rise to these signals is random as may be expected if the vibrations emanate from the vehicle environment.
- Similar phase signals arising from driver movements may be distinguished from similar phase signals arising from artefacts or the vehicle environment by relating the environmental noise to sensors located near sources of expected noise in the vehicles, eg. engine noise, wheel noise, and other vibrations and noise. This may be detected by carefully locating microphones and vibration sensors in the vehicle.
- Cancellation of environmental noise can be assisted by monitoring signals from the microphones and sensors with a view to applying the most effective signal cancellation techniques in order to reduce as much as possible the artefact or noise effects or unwanted signals within the vehicle environment.
- One example of the application of noise cancellation techniques includes detection of the various road bumps and ignoring the effect of these bumps on the data being analysed from the various vehicle sensors of interest.
- Another example of the application of noise cancellation techniques includes detection of various engine noises and application of a signal of opposite phase to the motor noise in order to cancel the artefact. One example of phase cancellation techniques which may be adopted is disclosed in PCT application AU97/00275, the disclosure of which is incorporated herein by cross-reference.
- Other examples of noise cancellation include filtering wherein highpass, lowpass and notch filters may be used to assist artefact removal.
- Artificial intelligence may learn to ignore periodic signals from sensors in the vehicle as these are likely to arise from mechanical rotations within the vehicle, thus improving the separation of artefact signals from signals of interest, such as signals which indicate true driver movement.
- Artificial intelligence may also learn to recognize changes in the driver's state which reflect changes in driver vigilance or wakefulness. Points of calculation and analysis of sensor data for the purpose of comparison and correlation with previously monitored data may include:
-
- spectral analysis of signals with a range of consecutive time periods; ½ period time amplitude analysis of signals and other techniques used in conventional sleep analysis as disclosed in AU Patent 632932;
- calculation of the number of movements per consecutive periods of time,
- wherein the consecutive periods of time may typically be, 1 second or ½ second;
- calculation of average signal levels during periods of, say, 20 or 30 seconds;
- calculation of total “area under the curve” or integration of sensor signals for a period of, say, 20 or 30 seconds;
- correlation and relationship between various combinations of input sensor channels;
- ECG heart rate and respiration signals, the latter signals providing an indication of the driver's wakeful state, as heart-rate and respiration signals during the sleep state are well documented in a number of medical journals.
- Artificial intelligence may be applied in conjunction with pressure sensors in vehicle seats and/or seat belts to control air bag deployment. In this way air bag deployment may be restricted for children or validated with different crash conditions for children and adults. For example, if a child is not detected as being thrust forward by means of pressure data received from seat/seat belt sensors, deployment of air bags and possible air bag injury to the child may be avoided.
- Deployment of air bags may generally be validated more intelligently by analysing data relating to passenger or driver posture, movement, thrust, body movement, unique driver or passenger ‘yaw’ etc.
- The system may include means for testing a driver's response times. Such tests may, if carried out at regular intervals, pre-empt serious driving conditions as can be brought about by driver fatigue or a lapse in vigilance. The testing means may be adapted to provide a simple method for prompting the driver and for testing the driver's response time. The response test may, for example, request the driver to respond to a series of prompts. These prompts may include requesting the driver to squeeze left or right hand sections of the steering wheel or squeeze with both left and right hands at the same time in response to a prompt. The means for prompting the driver may include, for example, LEDs located on the dash of the vehicle or other position that the driver is visually aware of. A left LED blinking may for example, prompt the driver to squeeze the left hand on the steering wheel. A right LED blinking may prompt the driver to squeeze the right hand on the steering wheel. The centre LED blinking may prompt the driver to squeeze both hands on the steering wheel. Alternatively two LEDs could be used in the above example, except that both LEDs blinking may prompt the driver to squeeze with both hands.
- The driver's response or level of alertness may be detected by measuring the response time of the driver, where the response time is measured as the time between illumination of an LED and a correct response with the hand or hands. In a case where an inappropriate response time is detected (potentially signalling driver fatigue or onset of driver fatigue) the system can verify the results and alert the driver. The system may also determine the accuracy of the driver's responses to ascertain the status of the driver's vigilance.
- A further example of means for testing the driver's response may include means for flashing random numbers on the windscreen. The driver may be prompted to respond by squeezing the steering wheel a number of times as determined by the number flashed. The numbers may be flashed on the screen at different locations relative to the steering wheel with the position of the hands on the wheel responding to the side of the screen where the flashes were detected. This type of test should be conducted only when the driver is not turning, changing gear, braking or performing other critical driving functions.
- It is desirable to ensure that the driver response tests are not anticipated by the driver to more accurately detect the driver's state of vigilance. It is of course also important that the selected method of testing driver response, does not in any way distract the driver or contribute to the driver's lapse in concentration.
- The system may be built into a vehicle sun visor as a visual touch screen display allowing a comprehensive visualisation of a driver's activity. The touch screen may include a color display for displaying movement/pressure outputs associated with each sensor. A display of the status of a plurality of sensors may provide a visual indication of a relaxed versus an active driver state.
- According to one aspect of the present invention, there is provided apparatus for determining a vigilance state of a subject such as a driver of a vehicle or the like, said apparatus including:
- means for monitoring one or more physiological variables associated with said subject;
- means for deriving from said one or more variables data representing physiological states of said subject corresponding to the or each variable; and
- means for determining from said data when the vigilance state of said subject is below a predetermined threshold.
- According to a further aspect of the present invention, there is provided a method for determining a vigilance state of a subject such as a driver of a vehicle or the like, said method including the steps of:
- monitoring one or more physiological variables associated with said subject;
- deriving from said one or more physiological variables data representing physiological states of said subject corresponding to the or each variable; and
- determining from said data when the vigilance state of said subject is below a predetermined threshold.
- Preferred embodiments of the present invention will now be described with reference to the accompanying drawings wherein:—
-
FIG. 1 shows a block diagram of a vigilance monitoring system according to the present invention; -
FIG. 2 shows a flow diagram of an algorithm for processing data from sensors associated with a vehicle and driver; -
FIG. 3A shows a simplified block diagram of a system for cancelling environmental noise from driver interfaced sensors; -
FIG. 3B shows waveforms associated with the system ofFIG. 3A ; -
FIG. 4A shows a flow diagram of a movement processing algorithm according to the present invention; -
FIG. 4B shows examples of data reduction and syntactic signal processing associated with a sample signal waveform; -
FIG. 5 shows sample outputs of data following secondary analysis by the system ofFIG. 4A ; -
FIG. 6 shows an embodiment of steering wheels sensors; -
FIG. 7 shows a block diagram of a vigilance monitoring system utilizing video data; -
FIG. 8 shows a flow diagram of an algorithm suitable for processing video data; -
FIGS. 9 and 10 show examples of data produced by the system ofFIGS. 7 and 8 ; -
FIG. 11 is a flow diagram of the main vigilance processing algorithm; -
FIG. 12 is a block diagram of a vehicle monitoring system according to the present invention; -
FIG. 13 shows one form of transducer for monitoring posture of a driver or equipment operator; -
FIG. 14 shows a block diagram of an embodiment of an anti snooze device according to the present invention; -
FIG. 15 shows a calibrate mode algorithm; and -
FIG. 16 shows a main relax detection algorithm. - Referring to
FIG. 1, block 12 shows a plurality of sensors 1 to 11 associated with a vehicle and driver. The or each sensor may include piezoelectric or electrostatic material such as CSD or PVDF material. The material can be divided into plural sections of the driver's seat, for example. The various sensors are summarized below. -
-
- Drivers seat top edge of upper section
- Drivers seat centre of upper section
- Drivers seat base of upper section
-
-
- Drivers seat front edge of lower section
- Drivers seat centre of lower section
- Drivers seat rear of lower section
-
-
- Driver's seat-belt upper section
- Driver's seat-belt lower section
- The driver's head tilt per driver cap or a device to clip over drivers ear or as part of driving goggles or glasses. These sensors can be, for example, positional switch devices. The output from these positional devices is amplified, filtered and finally data acquisitioned and analysed. This sensor device is designed to output a signal or digital data which changes state in accordance with the tilt of the driver's head. By calibration of the system in accordance with normal driving conditions this output can correlate the normal driving condition with the fatigued driver condition.
- The driver headband sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the driver's headband sensor. The output from the various sensors is amplified, filtered and finally data acquisitioned and analysed. The headband material can contain conductive sections designed to pick-up the patient's electro-encephalograph (EEG) signals.
-
-
- Driver headband left frontal
- Driver headband right frontal
- Driver headband left eye
- Driver headband right eye
EEG, EMG and EOG parameters monitored in critical driving conditions.
In some critical applications of vigilance monitoring, such as pilots of aircraft, personnel responsible for navigating and controlling ships, drivers of road or rail transport or passenger vehicles, it can be appropriate to apply more comprehensive methods of vigilance monitoring. These more comprehensive monitoring techniques can include techniques for analysing the frequency composition of a subjects EEG physiological data. Half Period Amplitude analysis (AU patent 632932) or spectral analysis can be applied in order to determine if the subject is entering a trance or non-vigilant state or if the subject is becoming drowsy. This type of sleep staging can be derived in real time to facilitate determination of the subject's state of vigilance. If the subject is detected as being in a risk category the present system will alert the driver in order to prevent a potential vehicle accident due to the driver's lapse in concentration.
One method of electrode attachment, but not limited to, could be the application of a headband by the driver where this head-band and/or chin-band could connect the EEG, EMG and EOG signals to the monitoring device for purpose of analysing the signals for determination of the subjects state of wakefulness.
- Various techniques can be applied for the purpose of eye movement monitoring including;
-
- Techniques based on reflected light.
- Techniques based on electric skin potential.
- Techniques based on Contact lenses
- Techniques based on Limbus tracking
- Techniques based on video imaging
- Techniques based on Magnetic Induction
- Driving goggles or glasses with infra-red detection capability for monitoring driver's eye movements, or EOG signal pick up via electrodes.
-
-
- Driver's eyes left
- Driver's eyes right
- sources of eye movements can include EOG electrodes, Infrared detection of eye movements, or video tracking and processing of eye movements.
- The driver steering wheel or other steering device sensors can be, for example, a CSDM or PVD material that can be divided into the various sections (as listed below) of the driver's steering wheel or other steering device. The output from the various sensors is amplified, filtered, and finally data acquisitioned and analysed.
-
-
- Drivers steering wheel top left section
- Drivers steering wheel top right section
- Drivers steering wheel bottom left section
- Drivers steering wheel bottom right section
- An alternative form of steering wheel sensor is shown in
FIG. 6 . - The driver carpet sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the driver's carpet area. The output from the various sensors is amplified, filtered and finally data acquisitioned and analysed.
- The driver accelerator sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the accelerator pedal. The output from the various sensors is amplified, filtered and finally data acquisitioned and analysed.
-
-
- Drivers accelerator pedal top section
- Drivers accelerator pedal center section
- Drivers accelerator pedal bottom section
10. Driver Clutch Pedal (where Applicable) Sensor
The driver clutch sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the driver's clutch pedal (where applicable). The output from the various sensors is amplified, filtered and finally data acquisitioned and analysed.
-
-
- Drivers clutch pedal (if applicable) top section;
- Drivers clutch pedal (if applicable) center section
- Drivers clutch pedal (if applicable) bottom section
- The driver brake sensors can be, for example, a Capacitive Static Discharge Material (CSDM) or PVD Material (PVDM) that can be divided into the various sections (as listed below) of the brake pedal. The output from the various sensors is amplified, filtered and finally data acquisitioned and analysed.
- Driver's brake pedal, top section
- Driver's brake pedal, center section
- Driver's brake pedal, bottom section
- Other sensors are referred to in block 13, including steering wheel movement and direction sensors and sensors for detecting environmental noise and vibrations.
- The outputs from the various sensors are amplified and filtered in block 14 in preparation for analog to digital conversion in block 15. The sensor signals are input in digital form to block 16. Block 16 includes a central processing unit and one or more algorithms for processing the digital signals. Block 16 also makes use of the vigilance processing algorithm(s) in block 17. The vigilance processing algorithm(s) in block 17 are adapted to determine the status of the driver's state of vigilance, alertness or wakefulness. This status may be expressed as a vigilance factor (0-100%). Upon detecting a vigilance factor which is below an acceptable threshold, the central processing unit may alert the driver of the vehicle and/or other vehicles. The driver alert means may include:
- Flashing hazard lights
- Sounding of siren
- Scent sprays which are designed to activate the driver's vigilance state
- Vibration modulation for the driver, which can include vibration of the steering wheel or floor area to alert the driver
- Vibration modulation for the driver's seat-belt
- Vibration modulation for the driver's steering wheel
- Audible alarm system at frequencies and durations, or sequences of durations, tested to be most effective in alerting the driver
- Cabin bright lights designed to avoid driving hazard but tested for improving driver vigilance
- Upon detecting a vigilance factor which is below an acceptable threshold, the central processing unit may intervene in the control of the vehicle. Vehicle intervention may enable the vehicle to be brought into a safe or safer status. Vehicle intervention may include speed restriction or reduction or complete removal of fuel supply. In some circumstances the accelerator pedal may need to be disabled, for example when a driver has his foot depressed on the accelerator pedal and is in an unsafe or fatigued state.
- Where a driver is detected as ignoring or failing to respond to response requests, or failing to provide appropriate acknowledgement that he or she is in a vigilant state, the vehicle may have its horn or hazard flashing lights activated to warn other drivers, and/or have its fuel injection de-activated, and/or its speed reduced by gentle and controlled safe braking.
Where a driver is detected as suffering from fatigue and is not responding to response tests, the vehicle may have its fuel supply reduced, and/or its speed reduced by gentle and controlled safe braking, to a safe cruising speed. The driver may then be prompted again before the vehicle undergoes further intervention.
- Another option for vehicle intervention is to provide a form of ignition override, as used in some alcohol-based systems. In this type of system the vehicle ignition or starting process may be inhibited by an inappropriate driver state, which in the present case may be drowsiness or excessive fatigue.
- In many modern vehicles, intervention options may be instigated by an onboard computer or electronic interface, e.g. by communication with the speed controller or fuel injection logic. The computer system may include intelligence to arbitrate the most appropriate intervention sequence or process to minimize risk to the vehicle driver or its passengers.
FIG. 2 shows a flow diagram of an algorithm for processing data from sensors associated with the vehicle and driver. Block 20 shows a plurality of arrows on the left representing data inputs from various sensors associated with a vehicle, following conversion to digital data. The digital data is input to block 21, which determines whether the data conforms to valid amplitude thresholds stored in block 22. Signals beyond the thresholds are classified as noise or artefact and are ignored. The data is then input to block 23, which detects whether the data conforms to valid time duration thresholds stored in block 24. Signals beyond the thresholds are classified as invalid and are ignored. The thresholds stored in blocks 22 and 24 are used to validate the incoming data. The purpose of block 25 is to simplify further processing by presenting the data in a minimized form. This is done by syntactic processing, whereby only the main data points of the signals, such as the various peaks, troughs and zero crossings or central points defining peaks of the signals, are presented for further processing. The data is then input to block 26 where it is categorized and summarized in terms of amplitude or power range, number of movements per second or other epoch, and phase relationships between the signals. The data may be displayed in tabular or graphical form and/or may be subjected to further automated processing to determine vigilance status.
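Purely as a non-limiting illustration (not part of the original disclosure), the amplitude validation, duration validation and per-epoch summarisation of FIG. 2 could be rendered in software along the following lines; all names and threshold values are hypothetical.

```python
# Illustrative sketch only: one possible software rendering of the FIG. 2 flow --
# amplitude validation (blocks 21/22), duration validation (blocks 23/24) and
# per-epoch summarisation (block 26). Names and thresholds are hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class Excursion:
    t: float          # time reference (s)
    amplitude: float  # peak-to-peak amplitude of the excursion
    duration: float   # duration of the excursion (s)

def validate(excursions: List[Excursion],
             min_amp: float, max_amp: float,
             min_dur: float, max_dur: float) -> List[Excursion]:
    """Keep only excursions inside the valid amplitude and duration windows;
    anything outside is treated as noise or artefact and ignored."""
    return [e for e in excursions
            if min_amp <= e.amplitude <= max_amp
            and min_dur <= e.duration <= max_dur]

def summarise_per_epoch(excursions: List[Excursion], epoch_s: float = 1.0) -> dict:
    """Count valid movements per epoch (block 26 style categorisation)."""
    counts: dict = {}
    for e in excursions:
        epoch = int(e.t // epoch_s)
        counts[epoch] = counts.get(epoch, 0) + 1
    return counts

if __name__ == "__main__":
    raw = [Excursion(0.2, 0.8, 0.15), Excursion(0.9, 0.05, 0.02),  # 2nd is noise
           Excursion(1.4, 1.2, 0.30), Excursion(2.1, 0.9, 0.25)]
    valid = validate(raw, min_amp=0.1, max_amp=5.0, min_dur=0.05, max_dur=2.0)
    print(summarise_per_epoch(valid))   # {0: 1, 1: 1, 2: 1}
```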
FIG. 3A shows a block diagram of a system for removing environmental noise from driver-interfaced sensors. Block 30 represents various sensors for monitoring driver movements, and block 31 represents sensors for monitoring environmental vibration and noise and vehicle artefacts.
The signals from blocks 30 and 31 are amplified and filtered, then converted to digital form by the analogue to digital converters represented by block 34, prior to input to block 35. Block 35 includes an algorithm for performing signal cancellation as illustrated in FIG. 3B.
In FIG. 3B, waveform A represents a signal from a driver-interfaced sensor or sensors (block 30 of FIG. 3A). Waveform B represents a signal from a sensor or sensors associated with the vehicle engine and road noise pick-up locations (block 31 of FIG. 3A). Waveform C represents a signal after it is processed by block 35. It may be seen that the signal represented by waveform C is obtained by cancelling or subtracting the signal represented by waveform B from the signal represented by waveform A. The signal represented by waveform C is a true or valid movement signal which is not corrupted by environmental noise.
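As a non-limiting illustration only (not part of the original disclosure), the cancellation of waveform B from waveform A to obtain waveform C could be sketched as follows, assuming both channels are sampled synchronously; the gain term and all names are hypothetical.

```python
# Illustrative sketch only: waveform C of FIG. 3B is described as waveform A
# (driver sensor) minus waveform B (engine/road noise reference). A gain term is
# included because a noise reference rarely matches the contaminating noise
# amplitude exactly; its value here is hypothetical.

import numpy as np

def cancel_noise(driver_signal: np.ndarray,
                 noise_reference: np.ndarray,
                 gain: float = 1.0) -> np.ndarray:
    """Return the driver movement signal with the scaled noise reference removed."""
    return driver_signal - gain * noise_reference

if __name__ == "__main__":
    t = np.linspace(0, 1, 1000)
    noise = 0.3 * np.sin(2 * np.pi * 50 * t)        # engine/road vibration pick-up
    movement = np.exp(-((t - 0.5) ** 2) / 0.005)    # a single driver movement
    waveform_a = movement + noise                   # driver sensor (corrupted)
    waveform_b = noise                              # noise reference sensor
    waveform_c = cancel_noise(waveform_a, waveform_b)
    print(np.allclose(waveform_c, movement))        # True: movement recovered
```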
FIG. 4A shows a flow diagram of a movement processing algorithm according to the present invention. Referring to FIG. 4A, signals from sensors 1 to 11 shown in block 12 of FIG. 1 are filtered, then referenced to period and amplitude threshold values before being converted to syntactic data. The syntactic data is correlated for determination of certain combinations of sensor movement signals indicating that the driver is in a vigilant or wakeful state. When a sensor signal or any combination of sensor signals is analysed as being void of subject movement, this may be interpreted as an indication that the driver is suspected of being in a non-vigilant or fatigued state. Analysis of the fatigued state is determined by certain expected patterns from the various sensor signals. Such patterns include very little movement from the steering wheel and very little movement from the seat sensors, indicating that the driver may be excessively relaxed and subject to fatigue, or at risk of fatigue on-set. The functions of blocks 40 to 61 are as follows:
FIG. 1 shows how the analog signals from sensors 1 to 11 are: converted to a digital signal (FIG. 1, block 15); input to the central processing unit (FIG. 1, block 16); and processed by a vigilance processing algorithm (FIG. 1, block 17). The start of the algorithm in FIG. 4A represents the start of a process which is repeated many times for each input sensor 1 to 11 (FIG. 4A shows the process for sensors 1 to 3).
- Signal A/D Data Output. The analog signal from each sensor is amplified, filtered and then converted to a digital signal in preparation for signal processing.
- Variables A, C, E to U provide threshold amplitude and period values to the processing algorithms, to allow sensor signal data reduction and syntactic signal processing. The variables (A, C, E to U) are determined via controlled studies from experimental and research data.
FIG. 4B shows examples of: (1) signal components which are ignored due to being below a minimum amplitude threshold, (2) syntactic data where the signal is represented by the troughs and peaks of the signal, and (3) high frequency components being ignored due to being below a minimum period threshold. The latter recognizes the relatively lower frequencies which are typically due to driver movements.
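Purely as a non-limiting illustration (not part of the original disclosure), the reduction of a digitised sensor channel to syntactic peak and trough data, ignoring excursions below a minimum amplitude threshold, could be implemented along the following lines; the threshold value stands in for the variables A to U and is hypothetical.

```python
# Illustrative sketch only: reduce a digitised sensor channel to "syntactic" data,
# i.e. its peaks and troughs with time references, ignoring turning points whose
# magnitude is below a minimum amplitude threshold.

from typing import List, Tuple

def syntactic_reduce(samples: List[float],
                     dt: float,
                     min_amplitude: float) -> List[Tuple[float, float, str]]:
    """Return (time, value, 'peak'|'trough') for each significant turning point."""
    points = []
    for i in range(1, len(samples) - 1):
        prev, cur, nxt = samples[i - 1], samples[i], samples[i + 1]
        if prev < cur > nxt and abs(cur) >= min_amplitude:
            points.append((i * dt, cur, "peak"))
        elif prev > cur < nxt and abs(cur) >= min_amplitude:
            points.append((i * dt, cur, "trough"))
    return points

if __name__ == "__main__":
    signal = [0.0, 0.02, 0.5, 0.1, -0.4, 0.03, 0.01, 0.02]  # small tail is ignored
    print(syntactic_reduce(signal, dt=0.01, min_amplitude=0.1))
    # [(0.02, 0.5, 'peak'), (0.04, -0.4, 'trough')]
```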
Inputs from Sensors 4 to 11, Subject to System Configuration
Input from each of the vehicle's sensors is amplified, filtered and then analog to digital converted, in preparation for signal processing. This is performed by blocks similar to blocks 41 to 46. Inputs from more than 11 sensors can be catered for if required.
- Variable data via default table (as determined by clinical data and/or neuro node self-learning and adjustment), resulting from customisation to a specific subject's driving characteristics and system adaptation.
- Variables: B, D, F to V. By comparing the sensor data to various amplitude thresholds and pulse periods, it is possible to ignore data that is likely to be noise or artefact and include data that is distinguishable as movement data from a driver. The movement data is distinguished by measuring the amplitude and period characteristics of the sensor signal. Movement data is also distinguished by comparing signal patterns and characteristics of sensors to patterns and characteristics of typical driver movements (as determined by comparative data used for correlating against current data, this data being derived from system self-learning and/or calibration processes).
- Is peak-to-peak amplitude of sensor output greater than threshold variable A? Retain time reference and value of each signal excursion of input sensor exceeding amplitude reference A.
- Is peak-to-peak amplitude of sensor output greater than threshold variable C? Retain time reference and value of each signal excursion of input sensor exceeding amplitude reference C.
- Is peak-to-peak amplitude of sensor output greater than threshold variable E? Retain time reference and value of each signal excursion of input sensor exceeding amplitude reference E.
- Is peak-to-peak amplitude of sensor output greater than threshold variable B? Retain time reference and value of each signal excursion of input sensor exceeding amplitude reference B.
- Is peak-to-peak amplitude of sensor output greater than threshold variable D? Retain time reference and value of each signal excursion of input sensor exceeding amplitude reference D.
- Is peak-to-peak amplitude of sensor output greater than threshold variable F? Retain time reference and value of each signal excursion of input sensor exceeding amplitude reference F.
Storage of Input Sensors Period and Amplitude with Time Reference
The syntactic data from the full range of sensors is stored in random access memory for the purpose of processing and determination of a subject's vigilant state. The syntactic data is also archived to allow post-analysis reporting and validation or review of driver fatigue and performance. This can be particularly useful where truck drivers and other critical transport or passenger drivers are required to be checked for performance and vigilance compliance.
- Longer-term data storage is designed to log the driver's movement data from each of the sensors. This stored data can be accessed at a later stage in order to review the driver's performance history with regard to movement analysis and subsequent vigilance.
- Short-term direct access storage is used for storing parameters such as the past 10 minutes of syntactic data for each sensor channel, in order to correlate the various data from each sensor or channel and compare this data combination to pre-defined sets of rules designed to describe combinations of sensor outputs which are typical of driver fatigue conditions.
- Store syntactic representation of sensor signal exceeding thresholds A and B, with time reference, amplitude and pulse width.
- Store syntactic representation of sensor signal exceeding thresholds C and D, with time reference, amplitude and pulse width.
- Store syntactic representation of sensor signal exceeding thresholds E and F, with time reference, amplitude and pulse width.
- Driver-specific profile and calibration data can be stored for later correlation reference. By correlating with various thresholds or reference conditions, the system is able to determine, from the driver's interaction with the sensors, when a particular driver's condition is similar to pre-stored reference characteristics. This comparative data is stored in look-up tables. The data can consist of frequency and/or amplitude characteristics for a range of driver states or, alternatively, the data can consist of samples of data (with acceptable variations to those samples) that exist for a range of driver states.
- Vehicle output signals. These include steering wheel movements, direction of steering wheel movements, speed of vehicle, change of speed of vehicle, engine vibration and noise, road vibration and noise.
By processing driver steering wheel adjustments, comparing these adjustments with the various sensor signals, and correlating the various sensor signals, it is possible to determine the probability that the driver is in a state of fatigue and the degree of driver fatigue. The vehicle signals are also analysed in order to assist in noise cancellation (i.e. vehicle noise as opposed to driver movement) and more accurate identification of valid driver movements.
- Correlate all channels of sensor activity and determine whether driver fatigue is a probability and what level of driver fatigue is detected. A look-up table of specific driver calibration values and reference states is used to determine the actual driver state and level of fatigue, along with the probability of data accuracy. Standard reference data tables and default values are also used for determination of driver fatigue. See sample R&K style histograms, movement histograms and tables showing the weighted value of each sensor and the actual movement detection from each sensor to determine fatigue probability as a function of movement detection with appropriate weighting.
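As a non-limiting illustration only (not part of the original disclosure), the weighted combination of per-sensor movement detections into a fatigue probability, in the spirit of the weighted sensor tables referred to above, might be sketched as follows; the weights and expected movement counts are hypothetical.

```python
# Illustrative sketch only: combine per-sensor movement detections into a weighted
# fatigue probability. Weights and expected counts are hypothetical placeholders.

from typing import Dict

# Hypothetical relative importance of each sensor channel for fatigue detection.
SENSOR_WEIGHTS: Dict[str, float] = {
    "steering_wheel": 0.4,
    "seat": 0.3,
    "seat_belt": 0.2,
    "pedals": 0.1,
}

def fatigue_probability(movements_per_epoch: Dict[str, int],
                        expected_per_epoch: Dict[str, int]) -> float:
    """Return 0.0 (fully active) .. 1.0 (no expected movement on any channel)."""
    score = 0.0
    for sensor, weight in SENSOR_WEIGHTS.items():
        observed = movements_per_epoch.get(sensor, 0)
        expected = max(expected_per_epoch.get(sensor, 1), 1)
        deficit = max(0.0, 1.0 - observed / expected)   # 1.0 means no movement at all
        score += weight * deficit
    return score

if __name__ == "__main__":
    expected = {"steering_wheel": 5, "seat": 3, "seat_belt": 2, "pedals": 2}
    observed = {"steering_wheel": 1, "seat": 0, "seat_belt": 2, "pedals": 1}
    print(round(fatigue_probability(observed, expected), 2))  # 0.67
```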
FIG. 5 shows typical samples of processed data following secondary analysis for sensor signals 1 to 4. The data shows in graphical form the number of valid movements detected for each of sensors 1 to 4 during successive time intervals n, n+1, n+2, . . . . Tertiary analysis may be performed on this data, which would allow simple-to-view correlation between the various sensors. The samples shown in FIG. 5 demonstrate an example (dotted line) where the various sensors all experience obvious movement detection.
- The steering wheel sensors shown in FIG. 6 are divided into eight sections as follows: top 62, top left 63, top right 64, left 65, right 66, bottom left 67, bottom right 68 and bottom 69.
- Sensors 62-69 are linked via eight cables to output pins 1 to 8 respectively. A common connection to each sensor is linked by cables to output pin 9. Alternative configurations are possible with more or fewer sensors, and with the option of sensor arrays on both the upper and lower surfaces of the steering wheel grip surface. The outputs represented by pins 1 to 9 are connected to analogue signal conditioning circuits and, via analogue to digital convertors, to digital signal processing circuits as described above.
- It is desirable to measure pressure of a driver's hand or hands on the steering wheel at all times. The pressure may be compared to previous values and/or calibrated values to determine whether a pattern of increased or decreased pressure reflects driver fatigue onset.
- If the driver's state of consciousness or concentration changes due to fatigue onset or the like, the system may calculate and deduce an appropriate point at which the driver should be alerted. The appropriate point may be determined from a combination of pre-calibrated data for a specific driver and/or pre-programmed patterns, states or trends in the data including relative and absolute pressure values obtained from a set or subset of vehicle sensors.
FIG. 7 shows a block diagram of a vigilance monitoring system utilizing video data. Block 70 represents a video CCD (charge coupled device) camera which may be located on the driver's visor, dash-board or other suitable location to enable video monitoring of the driver's eyes. An infra-red lens may be utilized to facilitate reliable night video monitoring capability. The output of the video camera is passed to block 71. Block 71 is an analog to digital converter for digitizing the video signal prior to processing via block 72. Block 72 is a central processing unit and includes a video processing algorithm. The video processing algorithm has eye recognition software designed to identify eyes in contrast to other parts of the driver's face. Eyes are detected using special processing software that allows the driver's eyes to be analysed. This analysis includes determining the area of the eye's opening and correlating the eye's opening area to previous similar measurements. In this way eye processing can determine whether a driver's eyes are remaining open as would be expected in an alert state, or whether the current eye opening of the driver is relatively less (when compared to earlier eye opening measurements). Rates or degrees of eye closure are able to be detected and continually monitored in this manner.
- The video processing algorithm also detects blink rate and possibly eye movements to determine whether the driver's eyes appear to be alert or possibly fixed in a dangerous "trance state" as may be apparent during lapses of driver vigilance.
Block 73 represents outputs of block 72, including:
- eye blink rate;
- eye closure, calculated as a percentage ratio of the current eye-open area to the previously calculated maximal eye-open area;
- eye focus factor, determined by measuring the number of eye movements per second and the extent of eye movements (i.e. small eye movements or larger eye movement deflections);
- the nature of eye movements, which can reflect appropriate patterns of movement of a driver's eyes (such as focus on sections of the road for an appropriate time) as well as inappropriate patterns of movement associated with fatigue or lack of vigilance;
- type of eye movements, i.e. vertical, horizontal or stare.
The above measures may be gauged against actual trials in order to determine relevant indices that correlate to a driver's fatigued state.
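Purely as a non-limiting illustration (not part of the original disclosure), the eye closure percentage and blink rate outputs listed above could be computed from a stream of per-frame eye-opening areas along the following lines; the frame rate and threshold values are hypothetical.

```python
# Illustrative sketch only: eye-closure percentage and blink rate derived from a
# stream of per-frame eye-opening areas. Frame rate and thresholds are hypothetical.

from typing import List

def eye_closure_percent(current_area: float, max_open_area: float) -> float:
    """Eye closure as a percentage of the previously calculated maximal open area."""
    if max_open_area <= 0:
        return 100.0
    return 100.0 * (1.0 - min(current_area / max_open_area, 1.0))

def blink_rate_per_minute(areas: List[float], fps: float,
                          closed_fraction: float = 0.2) -> float:
    """Count transitions into a 'closed' state (area below a fraction of max) per minute."""
    max_area = max(areas) if areas else 0.0
    blinks, closed = 0, False
    for a in areas:
        is_closed = max_area > 0 and a < closed_fraction * max_area
        if is_closed and not closed:
            blinks += 1
        closed = is_closed
    duration_min = len(areas) / fps / 60.0
    return blinks / duration_min if duration_min > 0 else 0.0

if __name__ == "__main__":
    areas = [100, 98, 95, 10, 5, 90, 97, 100, 8, 4, 96, 99] * 25  # 300 frames
    print(eye_closure_percent(current_area=60, max_open_area=100))  # 40.0
    print(round(blink_rate_per_minute(areas, fps=30.0), 1))          # blinks per minute
```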
FIG. 8 shows a flow diagram of an algorithm suitable for processing video data. The functions of blocks 80 to 94 are as follows:
- CAPTURE EYE VIDEO DATA—Capture the current video frame and digitise the video frame of the subject's eyes. Eye data can be captured via one or more of the following means:
CCD video camera, electro-oculogram data capture means via a subject-worn headband, direct electrode attachment, driver glasses, head-cap or movement sensors, or infra-red or other light beam detection means.
- Apply edge detection, signal contrast variation and shape recognition, amongst other processing techniques, to determine the border of the subject's eyelids.
Determine the area of each of the subject's eye openings, the height of each eye opening, blink events for each eye, blink rate and the time reference associated with each blink event.
- Correlate current eye position data with previous eye position data. Review eye position trend data and determine trends and patterns of eye movements that indicate the on-set of, or an existing, driver fatigue state. Patterns include:
- states of staring or trance-like states indicating loss of road concentration;
- slowly rolling eye movements (typical of sleep onset);
- eye focus directions and the association of these directions with driver fatigue.
Process digitised video frame and detect subject's left and right eye movement patterns and activity of eyes and association of this activity with driver fatigue.
Compare current blink rates, past blink rates and look-up table blink rate characteristics, thresholds for various fatigue on-set and fatigue blink rates and blink characteristics associated with various driver states.
Compare current eye opening area with thresholds for fatigue and fatigue on-set conditions to determine vigilant driver eye opening status versus fatigued driver eye opening status.
- Look-Up Table with Characteristic Patterns of:
- eye movements and threshold data for fatigued versus vigilant subjects;
- blink rate typical thresholds and characteristics;
- eye opening typical and default thresholds;
- eye movement typical and default characteristics for driver fatigue on-set.
- Store subject's left & right eye opening area, eye opening height, blink rates, eye position and eye movements together with time reference.
- Calibration data derived from subject and vehicle calibration procedures.
- Determination of fatigue on-set blink rate thresholds.
- Determination of eye opening fatigue on-set thresholds.
- Determination of eye position, movement characteristics and activity characteristics for fatigue on-set thresholds.
- EOG patterns for wake, drive activity, fatigue on-set and fatigue.
- Trance and hypnotic EOG eye characteristics.
- Fatigue threshold time period variable X set from default values, subject calibration or system self-learning/calculation.
- Is the mean eye opening area below "fatigue mean eye opening threshold X"?
- Fatigue threshold time period variable Y set from default values, subject calibration or system self-learning/calculation.
- Is the time duration below mean eye opening fatigue threshold (X) greater than Y?
- Blink rate fatigue characteristics set from default values, subject calibration or system self-learning/calculation.
- Do the blink rate and its characteristics comply with fatigue blink rate characteristics?
- Apply eye data processing and determine left & right opening area and blink events.
Correlate current and past video captured eye movement data.
Detection of fatigue eye opening on-set and detection of fatigue blink rate on-set.
- Eye movement fatigue determination diagram.
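As a non-limiting illustration only (not part of the original disclosure), the eye-based fatigue decision described above, using a mean eye opening threshold X, a duration threshold Y and blink-rate characteristics, could be sketched as follows; the numerical values are hypothetical placeholders for calibrated or default data.

```python
# Illustrative sketch only: flag fatigue when mean eye opening stays below
# threshold X for longer than Y seconds, or when blink characteristics match a
# fatigue profile. X, Y and the blink limits are hypothetical.

from dataclasses import dataclass

@dataclass
class EyeFatigueThresholds:
    mean_opening_x: float        # "fatigue mean eye opening threshold X" (fraction of max)
    duration_y_s: float          # time below X required before fatigue is flagged (s)
    blink_rate_min: float        # blinks/min at or above which fatigue blink profile matches
    blink_duration_min_s: float  # mean blink duration at or above which fatigue matches

def eye_fatigue_detected(mean_opening: float,
                         seconds_below_threshold: float,
                         blink_rate: float,
                         mean_blink_duration_s: float,
                         th: EyeFatigueThresholds) -> bool:
    opening_fatigue = (mean_opening < th.mean_opening_x
                       and seconds_below_threshold > th.duration_y_s)
    blink_fatigue = (blink_rate >= th.blink_rate_min
                     and mean_blink_duration_s >= th.blink_duration_min_s)
    return opening_fatigue or blink_fatigue

if __name__ == "__main__":
    th = EyeFatigueThresholds(mean_opening_x=0.4, duration_y_s=3.0,
                              blink_rate_min=25.0, blink_duration_min_s=0.4)
    print(eye_fatigue_detected(0.3, 4.5, 12.0, 0.2, th))  # True (opening criterion)
    print(eye_fatigue_detected(0.8, 0.0, 30.0, 0.5, th))  # True (blink criterion)
```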
FIGS. 9 and 10 show examples of eye opening and eye position data produced by the system of FIGS. 7 and 8.
FIG. 11 is a flow chart of the main vigilance processing algorithm. The functions of blocks 95 to 99 are as follows:
- Vigilance Movement Processing Algorithm (see FIG. 4A).
- Vigilance Eye Status Processing Algorithm.
- Probability of Driver Fatigue and Degree of Vigilance Determination Algorithm (correlates subject Movement Status and Eye Processing Status).
- LED indicator display panel.
- Eye Status Vigilance factor 0-100%.
- 0-100%—displayed as bar graph, meter or other means.
- Vigilance probability Factor 0-100%.
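Purely as a non-limiting illustration (not part of the original disclosure), the combination of the movement-based and eye-based vigilance estimates into a single vigilance probability factor, in the spirit of FIG. 11, might be expressed as follows; the weighting between the two algorithms is hypothetical.

```python
# Illustrative sketch only: weighted combination of the movement processing and
# eye status processing outputs into one vigilance probability factor (0-100%).

def vigilance_probability(movement_vigilance_pct: float,
                          eye_vigilance_pct: float,
                          movement_weight: float = 0.5) -> float:
    """Weighted combination of the two vigilance estimates, clamped to 0-100%."""
    movement_weight = min(max(movement_weight, 0.0), 1.0)
    combined = (movement_weight * movement_vigilance_pct
                + (1.0 - movement_weight) * eye_vigilance_pct)
    return min(max(combined, 0.0), 100.0)

if __name__ == "__main__":
    # e.g. movement analysis says 40% vigilant, eye analysis says 60% vigilant.
    print(vigilance_probability(40.0, 60.0))        # 50.0
    print(vigilance_probability(40.0, 60.0, 0.7))   # 46.0
```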
FIG. 12 is a block diagram of a vehicle monitoring system according to the present invention. FIG. 12 is an overview of a system which utilizes many of the features discussed herein. The functions of blocks 100 to 118 are as follows:
- Driver EEG sensors—direct attach electrode, headband, wireless electrode, driver cap and other EEG signal pick-up means.
- Driver EEG sensors—direct attach electrode, headband, wireless electrode, driver cap and other EEG signal pickup means.
- Driver Motion, Movement and Physiological Parameter sensors.
- Driver Eye Movement Detection via electrode, driver glasses/goggles, infra-red or other light beam means of tracking detection, or other means.
- Vehicle status interface: speed, direction, accelerator position, brake position, indicators, lights, amongst other vehicle status data.
- In phase signal detection and processing. Applies processing which determines patterns of in-phase signal occurrence and associates these with driver or background noise as originating source.
- Anti-phase signal detection and processing. Applies processing which determines patterns of anti-phase signal occurrence and associates these with driver or background noise as originating source.
- Vehicle background and Environmental Noise Sensors to allow noise cancellation, filtering and reduction.
These sensors include microphone and vibration sensors located at strategic positions in order to pick up background vehicle noise such as road noise and engine noise. Fourier transform and frequency analysis of background noise assists in the selection of digital filtering characteristics to most effectively minimise vehicle environmental noise and assist in distinguishing driver-related fatigue monitoring signals (an illustrative sketch of this frequency analysis follows this list). The system will continually "self-learn" various vehicle background and threshold noise levels, frequencies and other characteristics in order to determine changing vehicle noise conditions and the subsequent noise cancellation, or capability to ignore unwanted vehicle noise, while processing "real" driver movement and physiological signals and subsequent fatigue status.
- Artificial intelligence. Signal characteristics as generated by a range of varying road conditions can be programmed into the system. The input data relating to various road conditions thereby provides a means to further distinguish wanted driver-related signals from unwanted background noise signals.
- Driver EEG sensors—direct attach electrode, algorithm
- Driver Motion, Movement, Physiology algorithm
- Vehicle status interface Algorithm
- Driver Fatigue Processing Algorithm. Correlation with previous driver fatigue conditions together with comparison of outputs for each of above listed fatigue algorithms (Driver EEG, motion, eye, vehicle status).
- Driver vigilance interactive response testing.
- Driver alert and alarm systems for re-instatement of vigilance.
- Driver vehicle intervention to reduce or limit speed, and other means of increasing vehicle safety and reducing vulnerability to driver fatigue status.
- Vehicle fatigue display systems for displaying to the driver the current fatigue status or early warning indicators of fatigue status.
- System communication storage and printing peripheral interface. Data storage, reporting processing, reporting print interface, wireless and wire connected interfaces, for real-time or post communication of fatigue data and fatigue status information. System can include GSM, cellular phone, satellite or other means of moving vehicle tracking and data exchange in real-time or at any required later stage. This information transfer can be an effective means for trucks and other vehicles to have their driver status processed and reviewed, as appropriate and as required.
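The Fourier-based characterisation of background noise mentioned in the environmental noise sensor item above could, purely as a non-limiting illustration (not part of the original disclosure), look like the following sketch; the sample rate and signal content are hypothetical.

```python
# Illustrative sketch only: characterise background vehicle noise with a Fourier
# transform so the strongest noise component can be identified and later filtered
# from the driver-interfaced channels.

import numpy as np

def dominant_noise_frequency(noise: np.ndarray, fs: float) -> float:
    """Return the frequency (Hz) of the strongest spectral component of the noise."""
    windowed = noise * np.hanning(len(noise))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(noise), d=1.0 / fs)
    return float(freqs[np.argmax(spectrum)])

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    # Hypothetical engine vibration at 30 Hz plus broadband road noise.
    noise = np.sin(2 * np.pi * 30 * t) + 0.05 * np.random.randn(t.size)
    print(dominant_noise_frequency(noise, fs))  # approximately 30.0
```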
FIG. 13 shows one form of transducer for monitoring posture of a driver or equipment operator. FIG. 13 shows a webbed structure comprising strips or elements of flexible PVDF or Piezo material separated by flexible insulation material, terminated at A, B, C, D, E, F, G and H. Output signals from the respective strips are buffered, amplified, filtered and then analog to digital converted to data. This data may be processed to determine an actual position of pressure applied to the above structure. By analysing the two main co-ordinates and the amplitudes of signals associated with those co-ordinates, the exact position of pressure applied by the vehicle driver or equipment operator may be determined.
- The position where greatest pressure is applied is defined by the intersection of the web strip pair (e.g. B and F) which produces the greatest signal amplitude. The position may be described by coordinates reflecting the web strip pairs (e.g. B, F) which produce the greatest signal amplitude. The above transducer may be used in conjunction with the movement sensors described herein to provide a further layer of positional information relating to applied pressure for each sensor. This information may be important in circumstances where a driver's pressure on the steering wheel or the driver's pattern of hand placement (with respective applied pressure) varies in accordance with alertness and drowsiness.
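As a non-limiting illustration only (not part of the original disclosure), the pressure-localisation principle described for the FIG. 13 webbed transducer could be sketched as follows; the grouping of strips A-D and E-H into two crossing directions is an assumption made for the example.

```python
# Illustrative sketch only: locate the point of greatest applied pressure on the
# webbed transducer from the strip amplitudes. Strips are assumed to form two
# crossing groups (A-D one way, E-H the other); that grouping is hypothetical.

from typing import Dict, Tuple

def pressure_position(strip_amplitudes: Dict[str, float]) -> Tuple[str, str]:
    """Return the pair of crossing strips (one per group) with the greatest signals."""
    group_1 = {k: v for k, v in strip_amplitudes.items() if k in "ABCD"}
    group_2 = {k: v for k, v in strip_amplitudes.items() if k in "EFGH"}
    strongest_1 = max(group_1, key=group_1.get)
    strongest_2 = max(group_2, key=group_2.get)
    return strongest_1, strongest_2

if __name__ == "__main__":
    amplitudes = {"A": 0.1, "B": 0.9, "C": 0.2, "D": 0.1,
                  "E": 0.2, "F": 0.8, "G": 0.3, "H": 0.1}
    print(pressure_position(amplitudes))  # ('B', 'F') -- pressure near the B/F crossing
```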
- The posture of the driver or equipment operator may be monitored, stored, correlated with various threshold states and/or displayed in meaningful graphic or numerical form. The threshold states may be derived by way of calibration for each specific driver's posture profile under various states of fatigue and/or stress states and conditions.
- The anti-snooze device shown in FIG. 14 includes sensors (block 120) connected to an acquisition and processing means (block 121). Block 122 includes monitoring means designed to amplify, filter and analog to digital convert driver sensor signals in preparation for digital signal processing. The digital signal processing means (block 121) includes a calibration algorithm as shown in FIG. 15 and a main relax detection algorithm as shown in FIG. 16.
- The driver can select the relax calibration function, then take on the driving posture that most closely represents a relaxed or possibly fatigued driving state; the system will then monitor and store the minimum threshold of driver activity over a period of approximately, but not limited to, 10 seconds, as a relaxed driver reference level.
- The driver can select an active calibration function, then take on the driving posture that most closely represents a normal driving state; the system will then monitor and store the minimum threshold of driver activity over a period of approximately, but not limited to, 10 seconds, as an active driver reference level.
- The relaxed and active driver reference levels stored in the system may be displayed on the visual touch screen display for the various sensors. The system may perform a validation function by replaying the driver's relaxed and active reference levels on the touch screen. This allows easy comparison to be made with actual sensor levels when the driver adopts postures representing normal/fatigued states, and serves to validate the correctness of the stored reference levels.
- The driver can also select a sensitivity function which may determine how close to the relaxed level the driver needs to be before the anti-snooze system alerts the driver. By viewing the anti-snooze device screen, the driver can relax or adopt a normal vigilant driving posture and adjust the sensitivity control so that the anti-snooze device appears to track and detect the driver's relaxed state. The anti-snooze device has the ability to act as a self-warning aid by simply alerting the driver when his posture or driving vigilance is deteriorating. If, for example, a driver's steering wheel grip erodes or undergoes fatigue, the anti-snooze system can be calibrated to detect this condition and alert the driver.
- It is possible for the driver to have calibration data determined by an off-road simulator that more accurately defines the characteristics of each specific driver's activity variations and physiological variations during dangerously relaxed or fatigued driving conditions. The calibration data can be uploaded to the anti-snooze device to provide more accurate relaxed and active reference levels. The calibration data may also provide a more accurate means of determining the relative effect that each individual sensor has during a driver's transition from active and alert to drowsy and fatigued. The effects of each sensor may be recorded and this data may assist in more accurate anti-snooze detection.
- During calibration modes the system may detect the driver's hand pressures via the steering wheel sensors, the driver's respiration and ECG via the seatbelt sensors, and the driver's posture and movement via the seat sensors.
- The anti-snooze system may continually monitor and average the signal amplitudes of all sensors, while comparing the current levels of sensor amplitude with the calibrated levels. The system may also compare current movement sensor patterns to reference data. This reference data can represent certain threshold levels calibrated to each individual driver or general reference conditions. The various sensors may be weighted in accordance with their respective importance in determining whether a driver's current state of activity is below the threshold or appropriately close to the relaxed mode calibrated reference level to warrant that the driver be alerted.
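Purely as a non-limiting illustration (not part of the original disclosure), the comparison of current averaged sensor levels against the calibrated relaxed and active reference levels, with a driver-adjustable sensitivity, might be sketched as follows; all names, weights and values are hypothetical.

```python
# Illustrative sketch only: position the current averaged sensor levels between the
# stored "relaxed" and "active" calibration references, weight the channels, and
# alert when the driver drifts close to the relaxed reference.

from typing import Dict

def relaxation_index(current: Dict[str, float],
                     relaxed_ref: Dict[str, float],
                     active_ref: Dict[str, float],
                     weights: Dict[str, float]) -> float:
    """0.0 = at the active reference, 1.0 = at (or below) the relaxed reference."""
    total, weight_sum = 0.0, 0.0
    for sensor, w in weights.items():
        span = active_ref[sensor] - relaxed_ref[sensor]
        if span <= 0:
            continue
        # Position of the current level between the two calibrated references.
        pos = (active_ref[sensor] - current[sensor]) / span
        total += w * min(max(pos, 0.0), 1.0)
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0

def should_alert(index: float, sensitivity: float = 0.8) -> bool:
    """Alert when the driver is within 'sensitivity' of the relaxed reference."""
    return index >= sensitivity

if __name__ == "__main__":
    weights = {"steering_wheel": 0.5, "seat": 0.3, "seat_belt": 0.2}
    relaxed = {"steering_wheel": 0.1, "seat": 0.2, "seat_belt": 0.1}
    active = {"steering_wheel": 1.0, "seat": 0.8, "seat_belt": 0.6}
    current = {"steering_wheel": 0.2, "seat": 0.25, "seat_belt": 0.15}
    idx = relaxation_index(current, relaxed, active, weights)
    print(round(idx, 2), should_alert(idx))   # 0.9 True -> close to relaxed, alert
```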
- If the driver is detected as being within the range of sensor amplitudes and activity to warrant being alerted, the anti-snooze device can restrict the speed of the vehicle or slowly bring the vehicle to a stand still in order to reduce the likelihood of an accident. This ability to restrict the vehicle's speed could be overridden by the driver as is possible in “auto-cruise” devices currently available on many vehicles.
- The techniques and methodologies may include relatively complex neurological waveform analysis techniques, video tracking of driver eye motions, sophisticated noise cancellation and simpler driver interactive processes such as sensitizing the steering wheel, seat-belt, gear-stick and other driver cabin regions.
- One application for the present invention may include a truck driver vigilance monitoring (TDVM) system. This system may be designed around the “dead-man” handle concept as applied successfully in trains. A variation of this system may provide visual cues and driver vigilance response testing.
- The TDVM system may include pre-programmed Light Emitting Diode (LED) displays to be activated in various sequences and at various frequencies and durations. The truck driver can be visually prompted by way of these LEDS to press the steering wheel according to whether the left or right or both LEDS are flashed. The response time and accuracy of the driver's response to the prompts may be measured and relayed back to a remote monitoring control station.
- Various drivers will have calibrated "vigilant response times and accuracy levels" which can be compared to actual current response times. Where appropriate, an alarm can be activated if the response times indicate fatigue on-set or a potentially dangerous state.
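As a non-limiting illustration only (not part of the original disclosure), the comparison of current response times and accuracy against a driver's calibrated vigilant profile could be sketched as follows; the threshold multipliers are hypothetical.

```python
# Illustrative sketch only: raise an alarm when prompt responses are much slower
# or less accurate than the driver's calibrated vigilant profile.

from statistics import mean
from typing import List

def response_alarm(current_times_s: List[float],
                   current_correct: List[bool],
                   calibrated_time_s: float,
                   calibrated_accuracy: float,
                   time_factor: float = 1.5,
                   accuracy_margin: float = 0.15) -> bool:
    """True when responses indicate fatigue on-set relative to calibration."""
    if not current_times_s:
        return True  # no response at all is treated as a missed prompt
    too_slow = mean(current_times_s) > time_factor * calibrated_time_s
    accuracy = sum(current_correct) / len(current_correct)
    too_inaccurate = accuracy < calibrated_accuracy - accuracy_margin
    return too_slow or too_inaccurate

if __name__ == "__main__":
    # Calibrated vigilant profile: 0.8 s mean response, 95% accuracy.
    print(response_alarm([0.9, 1.0, 0.8], [True, True, True], 0.8, 0.95))   # False
    print(response_alarm([1.6, 1.9, 2.1], [True, False, True], 0.8, 0.95))  # True
```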
- The sequences and durations can be validated in accordance with clinical trials to provide an effective method of vigilance detection. Sequences and patterns of visual truck cabin prompts can be established to minimize driver conditioning. Frequency of vigilance test prompts can be determined in accordance with requirements as determined via field studies.
- Safety considerations to avoid driver distraction by the proposed monitoring system may be implemented. Techniques may include the utilization of "busy" response prompts, especially designed within the system to alert the monitoring control unit that the driver is vigilant but unable to respond at the time due to driving demands.
- The TDVM system may include the following components:
- 1. Analysis Software. This software may include a processing algorithm(s) designed to evaluate various driver prompts and response times. Evaluation of these response times may produce a probability factor associated with driver vigilance for each specific driver. Analysis capability of driver response times may be an important element of the system. Accuracy of vigilance probability outcome, clinical analysis and scientific validation associated with this process may determine the effectiveness of the monitoring system.
- This device may adapt to the truck steering wheel and provide output signals subject to a particular zone of the steering wheel, which has been activated by applying various degrees of pressure to the steering wheel.
- This device may provide a communication link and data management for interfacing the truck's CU&MD to a remotely located monitoring station.
This device may also provide the transducer interface and transducer signal recording and detection capabilities.
This device may also output control to the driver indicator LEDS and record and transmit vigilance response times to the remote monitoring station.
- This device may be interfaced to the CU&MD unit and may provide visual response prompts to the truck driver.
- This system may facilitate a remote operator's visual alarms when vigilance response times are outside acceptable thresholds.
This system may also provide communication links to the truck.
This system may also provide analysis and system reporting to allow real-time tracking of vigilance performance and vigilance alarm status.
- Finally, it is to be understood that various alterations, modifications and/or additions may be introduced into the constructions and arrangements of parts previously described without departing from the spirit or ambit of the invention.
Claims (2)
1-32. (canceled)
33. A driver vigilance monitor comprising:
a memory device containing driver-specific data;
a sensor integrated into a vehicle's seatbelt for detecting and outputting a value of a physiological parameter;
a sensor for detecting eye movement or driver head position; and
a processor for determining a driver's vigilance state by comparing the driver-specific data in the memory device with output received from the sensor integrated into the vehicle's seatbelt and the sensor for detecting eye movement or driver head position.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/351,857 US20120179008A1 (en) | 1999-01-27 | 2012-01-17 | Vigilance Monitoring System |
US14/551,971 US20150088397A1 (en) | 1999-01-27 | 2014-11-24 | Vigilance Monitoring System |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AUPP8325 | 1999-01-27 | ||
AUPP8325A AUPP832599A0 (en) | 1999-01-27 | 1999-01-27 | Vigilance monitoring system |
AUPQ3740A AUPQ374099A0 (en) | 1999-10-29 | 1999-10-29 | Monitoring system |
AUPQ3740 | 1999-10-29 | ||
PCT/AU1999/001166 WO2000044580A1 (en) | 1999-01-27 | 1999-12-24 | Vigilance monitoring system |
US09/890,324 US6575902B1 (en) | 1999-01-27 | 1999-12-24 | Vigilance monitoring system |
US10/417,247 US8096946B2 (en) | 1999-01-27 | 2003-04-15 | Vigilance monitoring system |
US13/351,857 US20120179008A1 (en) | 1999-01-27 | 2012-01-17 | Vigilance Monitoring System |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/417,247 Continuation US8096946B2 (en) | 1999-01-27 | 2003-04-15 | Vigilance monitoring system |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/551,971 Continuation US20150088397A1 (en) | 1999-01-27 | 2014-11-24 | Vigilance Monitoring System |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120179008A1 true US20120179008A1 (en) | 2012-07-12 |
Family
ID=25645979
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/890,324 Expired - Lifetime US6575902B1 (en) | 1999-01-27 | 1999-12-24 | Vigilance monitoring system |
US10/417,247 Expired - Fee Related US8096946B2 (en) | 1999-01-27 | 2003-04-15 | Vigilance monitoring system |
US13/351,857 Abandoned US20120179008A1 (en) | 1999-01-27 | 2012-01-17 | Vigilance Monitoring System |
US14/551,971 Abandoned US20150088397A1 (en) | 1999-01-27 | 2014-11-24 | Vigilance Monitoring System |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/890,324 Expired - Lifetime US6575902B1 (en) | 1999-01-27 | 1999-12-24 | Vigilance monitoring system |
US10/417,247 Expired - Fee Related US8096946B2 (en) | 1999-01-27 | 2003-04-15 | Vigilance monitoring system |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/551,971 Abandoned US20150088397A1 (en) | 1999-01-27 | 2014-11-24 | Vigilance Monitoring System |
Country Status (4)
Country | Link |
---|---|
US (4) | US6575902B1 (en) |
AU (1) | AU767533B2 (en) |
DE (1) | DE19983911B4 (en) |
WO (1) | WO2000044580A1 (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100214087A1 (en) * | 2007-01-24 | 2010-08-26 | Toyota Jidosha Kabushiki Kaisha | Anti-drowsing device and anti-drowsing method |
US20110082628A1 (en) * | 2009-10-06 | 2011-04-07 | Brendan J. Murnane | Cruise control modification |
US20120101690A1 (en) * | 2010-03-12 | 2012-04-26 | Tata Consultancy Services Limited | System for vehicle security, personalization and cardiac activity monitoring of a driver |
US20120296528A1 (en) * | 2011-05-20 | 2012-11-22 | Matthias Marcus Wellhoefer | Haptic steering wheel, steering-wheel system and driver assistance system for a motor vehicle |
US20130245894A1 (en) * | 2012-03-13 | 2013-09-19 | GM Global Technology Operations LLC | Driver assistance system |
US20130328673A1 (en) * | 2012-06-01 | 2013-12-12 | Denso Corporation | Driving ability reduction determining apparatus |
US20140155706A1 (en) * | 2011-06-17 | 2014-06-05 | Technische Universitaet Muenchen | Method and system for quantifying anaesthesia or a state of vigilance |
ITTV20130025A1 (en) * | 2013-02-27 | 2014-08-28 | Giorgio Marcon | ELECTRONIC SECURITY SYSTEM FOR MULTIPLE FUNCTIONS. |
WO2015048959A1 (en) * | 2013-10-01 | 2015-04-09 | Continental Teves Ag & Co. Ohg | Method and device for an automatic steering intervention |
US20160028824A1 (en) * | 2014-07-23 | 2016-01-28 | Here Global B.V. | Highly Assisted Driving Platform |
US20160157734A1 (en) * | 1999-09-14 | 2016-06-09 | Hoana Medical, Inc. | Passive physiological monitoring (p2m) system |
US9527508B1 (en) | 2015-08-13 | 2016-12-27 | Winbond Electronics Corp. | Mobile vehicle safety apparatus and safety monitoring method thereof |
US20170287307A1 (en) * | 2016-03-31 | 2017-10-05 | Robert Bosch Gmbh | Method for furnishing a warning signal, and method for generating a pre-microsleep pattern for detection of an impending microsleep event for a vehicle |
FR3051342A1 (en) * | 2016-05-18 | 2017-11-24 | Airbus Operations Sas | SYSTEM AND METHOD FOR EVALUATING ACTION CAPACITY OF AN INDIVIDUAL. |
US9917281B2 (en) | 2012-09-07 | 2018-03-13 | Nitto Denko Corporation | Top-emitting white organic light-emitting diodes having improved efficiency and stability |
EP3157426B1 (en) * | 2014-06-20 | 2018-06-20 | FRAUNHOFER-GESELLSCHAFT zur Förderung der angewandten Forschung e.V. | Device, method, and computer program for detecting micro- sleep |
EP3239958A4 (en) * | 2014-12-26 | 2018-08-15 | The Yokohama Rubber Co., Ltd. | Collision avoidance system and collision avoidance method |
US10136850B2 (en) | 2009-10-14 | 2018-11-27 | Delta Tooling Co., Ltd. | Biological state estimation device, biological state estimation system, and computer program |
CN109646024A (en) * | 2019-01-09 | 2019-04-19 | 浙江强脑科技有限公司 | Method for detecting fatigue driving, device and computer readable storage medium |
US10635101B2 (en) | 2017-08-21 | 2020-04-28 | Honda Motor Co., Ltd. | Methods and systems for preventing an autonomous vehicle from transitioning from an autonomous driving mode to a manual driving mode based on a risk model |
US10657398B2 (en) * | 2018-06-26 | 2020-05-19 | David Johnson | Sleepy driver alert system and method |
US10773750B2 (en) | 2017-03-07 | 2020-09-15 | Continental Automotive Gmbh | Device and method for detecting manual guidance of a steering wheel |
WO2020204809A1 (en) * | 2019-03-29 | 2020-10-08 | Agency For Science, Technology And Research | Classifying signals for movement control of an autonomous vehicle |
CN116211310A (en) * | 2023-05-09 | 2023-06-06 | 中国第一汽车股份有限公司 | Myoelectric sensor and detection method thereof |
Families Citing this family (418)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19952857C1 (en) * | 1999-11-03 | 2001-08-09 | Bosch Gmbh Robert | Device for driver status-dependent control |
BR0110596A (en) | 2000-05-05 | 2005-08-02 | Hill Rom Services Inc | Patient monitoring system, computer system, patient information monitoring system, patient care device, walker device, patient care device, and computer display |
JP2004512058A (en) | 2000-05-05 | 2004-04-22 | ヒル−ロム サービシーズ,インコーポレイティド | Hospital bed remote control |
DE10042367A1 (en) * | 2000-08-29 | 2002-05-02 | Bosch Gmbh Robert | Method and device for diagnosing a driver's ability to drive in a motor vehicle |
US7666151B2 (en) | 2002-11-20 | 2010-02-23 | Hoana Medical, Inc. | Devices and methods for passive patient monitoring |
US7629890B2 (en) | 2003-12-04 | 2009-12-08 | Hoana Medical, Inc. | System and methods for intelligent medical vigilance with bed exit detection |
JP3495982B2 (en) * | 2000-12-21 | 2004-02-09 | 株式会社エム・アイ・ラボ | Driver drowsiness alarm using closed air type sound sensor |
AU2002256048A1 (en) | 2001-03-30 | 2002-10-15 | Hill-Rom Services, Inc. | Hospital bed and network system |
US6849050B1 (en) * | 2001-05-07 | 2005-02-01 | The United States Of America As Represented By The Secretary Of The Army | System and method for determining visual alertness |
DE60229383D1 (en) | 2001-06-13 | 2008-11-27 | Compumedics Ltd | PROCESS FOR MONITORING AWARENESS |
JP4301537B2 (en) * | 2001-08-13 | 2009-07-22 | パイオニア株式会社 | Navigation system and method for moving body, and computer program |
DE10151014B4 (en) * | 2001-10-16 | 2015-10-08 | Volkswagen Ag | Method and device for attention control of a motor vehicle driver |
US8301108B2 (en) | 2002-11-04 | 2012-10-30 | Naboulsi Mouhamad A | Safety control system for vehicles |
GB2381586A (en) * | 2001-11-01 | 2003-05-07 | Oxford Biosignals Ltd | Electro-Oculographic Sleep Monitoring |
DE10154400C2 (en) | 2001-11-06 | 2003-10-30 | Fom Future Office Man Gmbh | Monitoring and control system for manned vehicles |
AUPR872301A0 (en) * | 2001-11-08 | 2001-11-29 | Sleep Diagnostics Pty Ltd | Alertness monitor |
GB0203035D0 (en) * | 2002-02-08 | 2002-03-27 | Univ Bristol | A method of and an apparatus for measuring a person's ability to perform a motor control task |
EP1764118B1 (en) | 2002-02-15 | 2010-08-25 | Gilead Palo Alto, Inc. | Polymer coating for medical devices |
JP2003276470A (en) * | 2002-03-22 | 2003-09-30 | Nissan Motor Co Ltd | Information presentation control device |
DE10217822C1 (en) * | 2002-04-17 | 2003-09-25 | Daimler Chrysler Ag | Viewing direction identification method for vehicle driver using evaluation of speech signals for determining speaking direction |
JP3846872B2 (en) * | 2002-06-27 | 2006-11-15 | パイオニア株式会社 | Driver mental state information provision system |
ITRM20020371A1 (en) * | 2002-07-10 | 2004-01-12 | Maurizio Catello Pennarola | OFF-ROAD AIRCRAFT NAVIGATION MANAGEMENT SYSTEM AND ALARM COMMUNICATIONS. |
JP4216546B2 (en) * | 2002-08-28 | 2009-01-28 | ダイハツ工業株式会社 | Vehicle occupant fatigue evaluation method |
NL1021496C2 (en) * | 2002-09-19 | 2004-03-22 | Joannes Hermanus Heiligers | Monitoring device for alertness of person, especially vehicle driver, generates warning signal if eye is detected as being closed too long |
ATE479343T1 (en) | 2002-10-01 | 2010-09-15 | Nellcor Puritan Bennett Inc | USE OF A HEADBAND FOR VOLTAGE DISPLAY AND SYSTEM OF OXYMETER AND HEADBAND |
US7698909B2 (en) | 2002-10-01 | 2010-04-20 | Nellcor Puritan Bennett Llc | Headband with tension indicator |
JP3732476B2 (en) * | 2002-10-22 | 2006-01-05 | 株式会社日立製作所 | Biological measuring device |
US8512221B2 (en) * | 2003-02-28 | 2013-08-20 | Consolidated Research Of Richmond, Inc. | Automated treatment system for sleep |
US7654948B2 (en) * | 2003-02-28 | 2010-02-02 | Consolidate Research of Richmond, Inc. | Automated insomnia treatment system |
US8066639B2 (en) * | 2003-06-10 | 2011-11-29 | Abbott Diabetes Care Inc. | Glucose measuring device for use in personal area network |
US8289172B2 (en) * | 2005-03-24 | 2012-10-16 | Matos Jeffrey A | Method and system of aircraft pilot assessment and for remedial action upon pilot impairment |
US8164464B2 (en) * | 2005-03-24 | 2012-04-24 | Matos Jeffrey A | Method and system of aircraft pilot assessment |
US7047056B2 (en) | 2003-06-25 | 2006-05-16 | Nellcor Puritan Bennett Incorporated | Hat-based oximeter sensor |
EP1648295A4 (en) * | 2003-06-26 | 2010-01-06 | Hoana Medical Inc | Radiation stress non-invasive blood pressure method |
JP2005018594A (en) * | 2003-06-27 | 2005-01-20 | Canon Inc | Fingerprint input device, its manufacturing method, and personal identification system |
JP2005038406A (en) * | 2003-06-27 | 2005-02-10 | Canon Inc | Fingerprint input apparatus and personal identification system using it |
US7399205B2 (en) | 2003-08-21 | 2008-07-15 | Hill-Rom Services, Inc. | Plug and receptacle having wired and wireless coupling |
US7049947B2 (en) * | 2003-09-29 | 2006-05-23 | Nattel Group, Inc. | System and method for monitoring the operational condition of a motor vehicle |
US8412297B2 (en) | 2003-10-01 | 2013-04-02 | Covidien Lp | Forehead sensor placement |
DE102004022581B4 (en) * | 2003-10-06 | 2017-08-03 | Volkswagen Ag | Driver assistance system |
EP1694202B1 (en) * | 2003-12-04 | 2014-12-31 | Hoana Medical, Inc. | Intelligent medical vigilance system |
US7075458B2 (en) | 2004-01-27 | 2006-07-11 | Paul Steven Dowdy | Collision avoidance method and system |
JP2007525266A (en) * | 2004-02-18 | 2007-09-06 | ホアナ メディカル、インコーポレイテッド | Method and system for incorporating a passive sensor array in a mattress to monitor a patient |
EP1715787A4 (en) * | 2004-02-18 | 2009-04-08 | Maquet Critical Care Ab | Method and device using myoelectrical activity for optimizing a patient's ventilatory assist |
US7369951B2 (en) * | 2004-02-27 | 2008-05-06 | Board Of Trustees Of Michigan State University | Digital, self-calibrating proximity switch |
US20050195079A1 (en) * | 2004-03-08 | 2005-09-08 | David Cohen | Emergency situation detector |
GB2412431B (en) * | 2004-03-25 | 2007-11-07 | Hewlett Packard Development Co | Self-calibration for an eye tracker |
DE102004046305A1 (en) * | 2004-03-30 | 2005-10-13 | Daimlerchrysler Ag | Restraining system for motor vehicle with restraining belt has belt lock holder with force sensor for measuring restraining belt tension, evaluation circuit for evaluating tension signal to determine breathing and/or heart activity |
US8048002B2 (en) * | 2004-04-27 | 2011-11-01 | Jamshid Ghajar | Method for improving cognition and motor timing |
US20060004298A1 (en) * | 2004-06-30 | 2006-01-05 | Kennedy Philip R | Software controlled electromyogram control systerm |
US20060019224A1 (en) * | 2004-07-23 | 2006-01-26 | Pics, Inc. | Insomnia assessment and treatment device and method |
FR2873646B1 (en) * | 2004-07-30 | 2006-09-15 | Koyo Steering Europ K S E Soc | METHOD FOR LIMITING LOSS OF ASSISTANCE ON A POWER-ASSISTED STEERING OF A MOTOR VEHICLE |
US7319386B2 (en) | 2004-08-02 | 2008-01-15 | Hill-Rom Services, Inc. | Configurable system for alerting caregivers |
US7639146B2 (en) * | 2004-09-29 | 2009-12-29 | Baura Gail D | Blink monitor for detecting blink occurrence in a living subject |
FR2880166B1 (en) * | 2004-12-24 | 2008-10-31 | Renault Sas | METHOD AND DEVICE FOR ASSISTING ALERT DRIVING IN THE EVENT OF AN EMERGENCY SITUATION FOR A MOTOR VEHICLE |
US8075484B2 (en) * | 2005-03-02 | 2011-12-13 | Martin Moore-Ede | Systems and methods for assessing equipment operator fatigue and using fatigue-risk-informed safety-performance-based systems and methods to replace or supplement prescriptive work-rest regulations |
WO2006096135A1 (en) * | 2005-03-08 | 2006-09-14 | National University Of Singapore | A system and method for monitoring mental fatigue |
US7746235B2 (en) | 2005-03-10 | 2010-06-29 | Delphi Technologies, Inc. | System and method of detecting eye closure based on line angles |
US7253738B2 (en) * | 2005-03-10 | 2007-08-07 | Delphi Technologies, Inc. | System and method of detecting eye closure based on edge lines |
US7253739B2 (en) | 2005-03-10 | 2007-08-07 | Delphi Technologies, Inc. | System and method for determining eye closure state |
WO2006125256A1 (en) * | 2005-05-23 | 2006-11-30 | Fairclough Corporation Pty Ltd | Monitoring system for mechanically self-guided vehicle |
DE102005026479B4 (en) * | 2005-06-09 | 2017-04-20 | Daimler Ag | Method for inattention recognition as a function of at least one driver-individual parameter |
US9451895B2 (en) * | 2005-07-25 | 2016-09-27 | Gal Markel | Mobile communication device and other devices with cardiovascular monitoring capability |
US20070096896A1 (en) * | 2005-10-28 | 2007-05-03 | Zingelewicz Virginia A | System and method for securing an infrastructure |
JP4497081B2 (en) * | 2005-10-31 | 2010-07-07 | トヨタ自動車株式会社 | Human condition detection device |
US7458609B2 (en) * | 2005-11-03 | 2008-12-02 | Ford Global Technologies, Llc | System and method for adjustably positioning a restraint system in a motor vehicle |
US10878646B2 (en) | 2005-12-08 | 2020-12-29 | Smartdrive Systems, Inc. | Vehicle event recorder systems |
US20070150138A1 (en) | 2005-12-08 | 2007-06-28 | James Plante | Memory management in event recording systems |
US7423540B2 (en) * | 2005-12-23 | 2008-09-09 | Delphi Technologies, Inc. | Method of detecting vehicle-operator state |
US20070194922A1 (en) * | 2006-02-17 | 2007-08-23 | Lear Corporation | Safe warn building system and method |
US8000887B2 (en) * | 2006-02-17 | 2011-08-16 | Lear Corporation | Method and system of directing vehicles traveling over roadway during emergency |
US7831379B2 (en) * | 2006-02-17 | 2010-11-09 | Lear Corporation | Roadside signage control from vehicle operating data |
BRPI0708414A2 (en) * | 2006-03-01 | 2011-05-31 | Optalert Pty Ltd | disability monitor |
US8050863B2 (en) * | 2006-03-16 | 2011-11-01 | Gray & Company, Inc. | Navigation and control system for autonomous vehicles |
US8996240B2 (en) | 2006-03-16 | 2015-03-31 | Smartdrive Systems, Inc. | Vehicle event recorders with integrated web server |
US9201842B2 (en) | 2006-03-16 | 2015-12-01 | Smartdrive Systems, Inc. | Vehicle event recorder systems and networks having integrated cellular wireless communications systems |
WO2007111223A1 (en) * | 2006-03-24 | 2007-10-04 | Pioneer Corporation | Device and method for detecting mental conditions of driver |
US8269617B2 (en) * | 2009-01-26 | 2012-09-18 | Drivecam, Inc. | Method and system for tuning the effect of vehicle characteristics on risk prediction |
US8508353B2 (en) * | 2009-01-26 | 2013-08-13 | Drivecam, Inc. | Driver risk assessment system and method having calibrating automatic event scoring |
US8849501B2 (en) * | 2009-01-26 | 2014-09-30 | Lytx, Inc. | Driver risk assessment system and method employing selectively automatic event scoring |
US20100056957A1 (en) * | 2006-06-09 | 2010-03-04 | Universite Joseph Fourier | Method and device for the rehabilitation and/or correction of postural symmetry in static or dynamic situations |
JP2008013004A (en) * | 2006-07-04 | 2008-01-24 | Fuji Heavy Ind Ltd | Driving support system by generating flavor |
WO2008013440A1 (en) * | 2006-07-24 | 2008-01-31 | Hua Hean Kee | Drowsiness prevention apparatus |
US7551068B2 (en) * | 2006-08-28 | 2009-06-23 | Lear Corporation | Vehicle seat alert system |
JP2008061931A (en) * | 2006-09-11 | 2008-03-21 | Toyota Motor Corp | Vehicle and body information gathering system |
EP2082383B1 (en) * | 2006-10-13 | 2012-06-13 | Toyota Jidosha Kabushiki Kaisha | On-board warning apparatus and warning method |
US8989959B2 (en) | 2006-11-07 | 2015-03-24 | Smartdrive Systems, Inc. | Vehicle operator performance history recording, scoring and reporting systems |
US8649933B2 (en) | 2006-11-07 | 2014-02-11 | Smartdrive Systems Inc. | Power management systems for automotive video event recorders |
US8868288B2 (en) | 2006-11-09 | 2014-10-21 | Smartdrive Systems, Inc. | Vehicle exception event management systems |
DE102006056094A1 (en) * | 2006-11-28 | 2008-05-29 | Robert Bosch Gmbh | Driver assistance system with presence monitoring |
US8078334B2 (en) | 2007-01-23 | 2011-12-13 | Alan Goodrich | Unobtrusive system and method for monitoring the physiological condition of a target user of a vehicle |
JP4748084B2 (en) * | 2007-03-06 | 2011-08-17 | トヨタ自動車株式会社 | Psychological state estimation device |
US7652583B2 (en) * | 2007-03-20 | 2010-01-26 | Deere & Company | Method and system for maintaining operator alertness |
JP5309126B2 (en) * | 2007-03-29 | 2013-10-09 | ニューロフォーカス・インコーポレーテッド | System, method, and apparatus for performing marketing and entertainment efficiency analysis |
US20080242951A1 (en) * | 2007-03-30 | 2008-10-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Effective low-profile health monitoring or the like |
US8117047B1 (en) | 2007-04-16 | 2012-02-14 | Insight Diagnostics Inc. | Healthcare provider organization |
US8184856B2 (en) * | 2007-04-30 | 2012-05-22 | Delphi Technologies, Inc. | Method and apparatus for assessing driver head pose with a headrest-mounted relative motion sensor |
US7970175B2 (en) * | 2007-04-30 | 2011-06-28 | Delphi Technologies, Inc. | Method and apparatus for assessing head pose of a vehicle driver |
US8386312B2 (en) | 2007-05-01 | 2013-02-26 | The Nielsen Company (Us), Llc | Neuro-informatics repository system |
WO2008137581A1 (en) | 2007-05-01 | 2008-11-13 | Neurofocus, Inc. | Neuro-feedback based stimulus compression device |
US8239092B2 (en) | 2007-05-08 | 2012-08-07 | Smartdrive Systems Inc. | Distributed vehicle event recorder systems having a portable memory data transfer system |
US20090024449A1 (en) * | 2007-05-16 | 2009-01-22 | Neurofocus Inc. | Habituation analyzer device utilizing central nervous system, autonomic nervous system and effector system measurements |
US8392253B2 (en) | 2007-05-16 | 2013-03-05 | The Nielsen Company (Us), Llc | Neuro-physiology and neuro-behavioral based stimulus targeting system |
CN101711124A (en) * | 2007-06-06 | 2010-05-19 | 神经焦点公司 | Multi-market program and commercial response monitoring system using neuro-response measurements |
US8494905B2 (en) | 2007-06-06 | 2013-07-23 | The Nielsen Company (Us), Llc | Audience response analysis using simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) |
US9020585B2 (en) * | 2007-06-18 | 2015-04-28 | New York University | Electronic identity card |
JP4974788B2 (en) * | 2007-06-29 | 2012-07-11 | Canon Inc. | Image processing apparatus, image processing method, program, and storage medium |
SE532317C2 (en) * | 2007-07-05 | 2009-12-15 | Svenska Utvecklings Entrepreno | Device for waking up drivers and operators |
KR20100038107A (en) | 2007-07-30 | 2010-04-12 | Neurofocus, Inc. | Neuro-response stimulus and stimulus attribute resonance estimator |
US20090054802A1 (en) * | 2007-08-22 | 2009-02-26 | National Yang-Ming University | Sunglass type sleep detecting and preventing device |
US8635105B2 (en) | 2007-08-28 | 2014-01-21 | The Nielsen Company (Us), Llc | Consumer experience portrayal effectiveness assessment system |
US8386313B2 (en) | 2007-08-28 | 2013-02-26 | The Nielsen Company (Us), Llc | Stimulus placement system using subject neuro-response measurements |
US8392254B2 (en) * | 2007-08-28 | 2013-03-05 | The Nielsen Company (Us), Llc | Consumer experience assessment system |
US8392255B2 (en) | 2007-08-29 | 2013-03-05 | The Nielsen Company (Us), Llc | Content based selection and meta tagging of advertisement breaks |
WO2009036327A1 (en) | 2007-09-14 | 2009-03-19 | Corventis, Inc. | Adherent device for respiratory monitoring and sleep disordered breathing |
EP2194858B1 (en) | 2007-09-14 | 2017-11-22 | Corventis, Inc. | Medical device automatic start-up upon contact to patient tissue |
EP3922171A1 (en) | 2007-09-14 | 2021-12-15 | Medtronic Monitoring, Inc. | Adherent cardiac monitor with advanced sensing capabilities |
WO2009036333A1 (en) | 2007-09-14 | 2009-03-19 | Corventis, Inc. | Dynamic pairing of patients to data collection gateways |
US8116841B2 (en) | 2007-09-14 | 2012-02-14 | Corventis, Inc. | Adherent device with multiple physiological sensors |
US8374688B2 (en) | 2007-09-14 | 2013-02-12 | Corventis, Inc. | System and methods for wireless body fluid monitoring |
WO2009036256A1 (en) | 2007-09-14 | 2009-03-19 | Corventis, Inc. | Injectable physiological monitoring system |
US20090076341A1 (en) * | 2007-09-14 | 2009-03-19 | Corventis, Inc. | Adherent Athletic Monitor |
US8494610B2 (en) | 2007-09-20 | 2013-07-23 | The Nielsen Company (Us), Llc | Analysis of marketing and entertainment effectiveness using magnetoencephalography |
US20090083129A1 (en) | 2007-09-20 | 2009-03-26 | Neurofocus, Inc. | Personalized content delivery using neuro-response priming data |
WO2009052490A1 (en) * | 2007-10-18 | 2009-04-23 | Carnett John B | Method and apparatus for soothing a baby |
US8082160B2 (en) | 2007-10-26 | 2011-12-20 | Hill-Rom Services, Inc. | System and method for collection and communication of data from multiple patient care devices |
US20100069775A1 (en) * | 2007-11-13 | 2010-03-18 | Michael Milgramm | EEG-Related Methods |
US7574254B2 (en) * | 2007-11-13 | 2009-08-11 | Wavesynch Technologies, Inc. | Method for monitoring attentiveness and productivity in a subject |
US20090292212A1 (en) * | 2008-05-20 | 2009-11-26 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US9672471B2 (en) * | 2007-12-18 | 2017-06-06 | Gearbox Llc | Systems, devices, and methods for detecting occlusions in a biological subject including spectral learning |
US20090287076A1 (en) * | 2007-12-18 | 2009-11-19 | Boyden Edward S | System, devices, and methods for detecting occlusions in a biological subject |
US20090292214A1 (en) * | 2008-05-22 | 2009-11-26 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090287110A1 (en) * | 2008-05-14 | 2009-11-19 | Searete Llc | Circulatory monitoring systems and methods |
US20090287109A1 (en) * | 2008-05-14 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090292222A1 (en) * | 2008-05-14 | 2009-11-26 | Searete Llc | Circulatory monitoring systems and methods |
US8636670B2 (en) * | 2008-05-13 | 2014-01-28 | The Invention Science Fund I, Llc | Circulatory monitoring systems and methods |
US9717896B2 (en) * | 2007-12-18 | 2017-08-01 | Gearbox, Llc | Treatment indications informed by a priori implant information |
US20090281413A1 (en) * | 2007-12-18 | 2009-11-12 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Systems, devices, and methods for detecting occlusions in a biological subject |
US20100036209A1 (en) * | 2008-08-07 | 2010-02-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090287191A1 (en) * | 2007-12-18 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20100036263A1 (en) * | 2008-08-07 | 2010-02-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090281412A1 (en) * | 2007-12-18 | 2009-11-12 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | System, devices, and methods for detecting occlusions in a biological subject |
US20100036268A1 (en) * | 2008-08-07 | 2010-02-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090287101A1 (en) * | 2008-05-13 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090163856A1 (en) * | 2007-12-19 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Treatment indications informed by a priori implant information |
US20100036269A1 (en) * | 2008-08-07 | 2010-02-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090287120A1 (en) * | 2007-12-18 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090292213A1 (en) * | 2008-05-21 | 2009-11-26 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090174560A1 (en) * | 2008-01-03 | 2009-07-09 | General Electric Company | Systems, Apparatuses And Methods For Monitoring Physical Conditions Of A Vehicle Driver |
GB0800615D0 (en) * | 2008-01-14 | 2008-02-20 | Hypo Safe As | Implantable electronic device |
US20110251468A1 (en) * | 2010-04-07 | 2011-10-13 | Ivan Osorio | Responsiveness testing of a patient having brain state changes |
US7798004B2 (en) * | 2008-01-28 | 2010-09-21 | Caterpillar Inc | Monitoring system for machine vibration |
US7978085B1 (en) | 2008-02-29 | 2011-07-12 | University Of South Florida | Human and physical asset movement pattern analyzer |
US8628478B2 (en) | 2009-02-25 | 2014-01-14 | Empire Technology Development Llc | Microphone for remote health sensing |
JP5405500B2 (en) | 2008-03-12 | 2014-02-05 | Corventis, Inc. | Predicting cardiac decompensation based on cardiac rhythm |
ES2316307B2 (en) * | 2008-04-02 | 2010-12-03 | Universidad Politecnica De Madrid | Safety device for driving against dead-man or sleeping-driver situations |
WO2009146214A1 (en) | 2008-04-18 | 2009-12-03 | Corventis, Inc. | Method and apparatus to measure bioelectric impedance of patient tissue |
CN101565036B (en) * | 2008-04-21 | 2011-04-27 | SAIC Motor Corporation Limited | Method for preventing fatigue driving |
CA2727942A1 (en) * | 2008-05-02 | 2009-11-05 | Peter Demmelbauer | Device for stimulating areas of the human brain |
US8552851B2 (en) * | 2008-05-09 | 2013-10-08 | Nec Corporation | Operation state judgement method and system |
JP5127576B2 (en) * | 2008-06-11 | 2013-01-23 | Yamaha Motor Co., Ltd. | Mental work load detection device and motorcycle equipped with the same |
JP5196252B2 (en) * | 2008-06-26 | 2013-05-15 | Advics Co., Ltd. | Vehicle control device |
IT1390617B1 (en) * | 2008-07-30 | 2011-09-09 | Iannotta | SAFETY SYSTEM FOR VEHICLES. |
CN101642375B (en) * | 2008-08-04 | 2012-05-09 | Nanjing University | Fatigue evaluation method and system |
EP2312551A4 (en) * | 2008-08-05 | 2014-10-15 | Panasonic Corp | Driver awareness degree judgment device, method, and program |
FR2935525A1 (en) * | 2008-08-26 | 2010-03-05 | Olivier Jean Pierre Allain | Drowsiness prevention device for a driver seated in e.g. a passenger car, having a liquid reservoir connected to a nozzle mounted on the seat backrest and headrest to spray atomized liquid toward the driver's neck |
US8364220B2 (en) | 2008-09-25 | 2013-01-29 | Covidien Lp | Medical sensor and technique for using the same |
US8257274B2 (en) | 2008-09-25 | 2012-09-04 | Nellcor Puritan Bennett Llc | Medical sensor and technique for using the same |
ES2317810A1 (en) * | 2008-10-30 | 2009-04-16 | Universidad Politecnica De Madrid | Method and system for controlling driver activity in road vehicles |
KR101173944B1 (en) * | 2008-12-01 | 2012-08-20 | Electronics and Telecommunications Research Institute | System and method for controlling sensibility of driver |
GB0822237D0 (en) * | 2008-12-05 | 2009-01-14 | Howell Steven | Remote health and security monitoring |
EP2375968B1 (en) | 2008-12-15 | 2018-11-14 | Medtronic Monitoring, Inc. | Patient monitoring systems and methods |
US8270814B2 (en) | 2009-01-21 | 2012-09-18 | The Nielsen Company (Us), Llc | Methods and apparatus for providing video with embedded media |
US8464288B2 (en) | 2009-01-21 | 2013-06-11 | The Nielsen Company (Us), Llc | Methods and apparatus for providing personalized media in video |
US9357240B2 (en) | 2009-01-21 | 2016-05-31 | The Nielsen Company (Us), Llc | Methods and apparatus for providing alternate media for video decoders |
US8854199B2 (en) * | 2009-01-26 | 2014-10-07 | Lytx, Inc. | Driver risk assessment system and method employing automated driver log |
US8866621B2 (en) * | 2009-02-25 | 2014-10-21 | Empire Technology Development Llc | Sudden infant death prevention clothing |
WO2010103361A1 (en) * | 2009-03-09 | 2010-09-16 | Abb Research Ltd | Method for determining operator condition, device therefrom and their use in alarm response system in a facility |
US8824666B2 (en) * | 2009-03-09 | 2014-09-02 | Empire Technology Development Llc | Noise cancellation for phone conversation |
US20100250325A1 (en) | 2009-03-24 | 2010-09-30 | Neurofocus, Inc. | Neurological profiles for market matching and stimulus presentation |
US9186075B2 (en) | 2009-03-24 | 2015-11-17 | Covidien Lp | Indicating the accuracy of a physiological parameter |
US8515515B2 (en) | 2009-03-25 | 2013-08-20 | Covidien Lp | Medical sensor with compressible light barrier and technique for using the same |
EP2237237B1 (en) | 2009-03-30 | 2013-03-20 | Tobii Technology AB | Eye closure detection using structured illumination |
US8781548B2 (en) | 2009-03-31 | 2014-07-15 | Covidien Lp | Medical sensor with flexible components and technique for using the same |
CN101859473A (en) * | 2009-04-07 | 2010-10-13 | Chen Gengtian | Fatigue driving early warning and automatic control device (machine) |
US8577570B2 (en) * | 2009-04-17 | 2013-11-05 | Honda Motor Co., Ltd. | Touch point calibration method for a motor vehicle |
US8193941B2 (en) | 2009-05-06 | 2012-06-05 | Empire Technology Development Llc | Snoring treatment |
US20100286545A1 (en) * | 2009-05-06 | 2010-11-11 | Andrew Wolfe | Accelerometer based health sensing |
WO2010143535A1 (en) * | 2009-06-08 | 2010-12-16 | Nagoya City University | Sleepiness assessment device |
WO2010151603A1 (en) | 2009-06-23 | 2010-12-29 | L&P Property Management Company | Drowsy driver detection system |
US8427326B2 (en) * | 2009-07-30 | 2013-04-23 | Meir Ben David | Method and system for detecting the physiological onset of operator fatigue, drowsiness, or performance decrement |
US20110046473A1 (en) * | 2009-08-20 | 2011-02-24 | Neurofocus, Inc. | Eeg triggered fmri signal acquisition |
US8655437B2 (en) | 2009-08-21 | 2014-02-18 | The Nielsen Company (Us), Llc | Analysis of the mirror neuron system for evaluation of stimulus |
US10987015B2 (en) | 2009-08-24 | 2021-04-27 | Nielsen Consumer Llc | Dry electrodes for electroencephalography |
US8756657B2 (en) * | 2009-09-29 | 2014-06-17 | Ebay Inc. | Mobile or user device authentication and tracking |
US9688286B2 (en) * | 2009-09-29 | 2017-06-27 | Omnitracs, Llc | System and method for integrating smartphone technology into a safety management platform to improve driver safety |
WO2011045936A1 (en) * | 2009-10-15 | 2011-04-21 | Panasonic Corporation | Driving attention amount determination device, method, and computer program |
WO2011050283A2 (en) | 2009-10-22 | 2011-04-28 | Corventis, Inc. | Remote detection and monitoring of functional chronotropic incompetence |
US20110106750A1 (en) | 2009-10-29 | 2011-05-05 | Neurofocus, Inc. | Generating ratings predictions using neuro-response data |
US9560984B2 (en) | 2009-10-29 | 2017-02-07 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US8209224B2 (en) | 2009-10-29 | 2012-06-26 | The Nielsen Company (Us), Llc | Intracluster content management using neuro-response priming data |
US8591412B2 (en) * | 2009-11-18 | 2013-11-26 | Nohands, Llc | Method and system for preventing virus-related obesity and obesity related diseases |
US8335716B2 (en) | 2009-11-19 | 2012-12-18 | The Nielsen Company (Us), Llc. | Multimedia advertisement exchange |
US8335715B2 (en) | 2009-11-19 | 2012-12-18 | The Nielsen Company (Us), Llc. | Advertisement exchange using neuro-response data |
CN102069710B (en) * | 2009-11-24 | 2014-03-26 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Device and method for monitoring driving |
US8482415B2 (en) | 2009-12-04 | 2013-07-09 | Covidien Lp | Interactive multilevel alarm |
US8564424B2 (en) * | 2009-12-07 | 2013-10-22 | Inventioneers Etc., Llc | Steering wheel hand position sensing device |
US9451897B2 (en) | 2009-12-14 | 2016-09-27 | Medtronic Monitoring, Inc. | Body adherent patch with electronics for physiologic monitoring |
US8965498B2 (en) | 2010-04-05 | 2015-02-24 | Corventis, Inc. | Method and apparatus for personalized physiologic parameters |
US8684742B2 (en) | 2010-04-19 | 2014-04-01 | Innerscope Research, Inc. | Short imagery task (SIT) research method |
US8655428B2 (en) | 2010-05-12 | 2014-02-18 | The Nielsen Company (Us), Llc | Neuro-response data synchronization |
DE102010049152B4 (en) * | 2010-05-21 | 2015-11-12 | Johnson Controls Gmbh | Vehicle seat with intelligent actuators |
EP2409874A1 (en) * | 2010-07-21 | 2012-01-25 | BIOTRONIK SE & Co. KG | Operating ability monitoring system |
US8392251B2 (en) | 2010-08-09 | 2013-03-05 | The Nielsen Company (Us), Llc | Location aware presentation of stimulus material |
US8392250B2 (en) | 2010-08-09 | 2013-03-05 | The Nielsen Company (Us), Llc | Neuro-response evaluated stimulus in virtual reality environments |
US8396744B2 (en) | 2010-08-25 | 2013-03-12 | The Nielsen Company (Us), Llc | Effective virtual reality environments for presentation of marketing materials |
CN101941425B (en) * | 2010-09-17 | 2012-08-22 | Shanghai Jiao Tong University | Intelligent recognition device and method for fatigue state of driver |
US20120075122A1 (en) | 2010-09-24 | 2012-03-29 | Honeywell International Inc. | Alert generation and related aircraft operating methods |
US9775545B2 (en) | 2010-09-28 | 2017-10-03 | Masimo Corporation | Magnetic electrical connector for patient monitors |
JP5710767B2 (en) | 2010-09-28 | 2015-04-30 | Masimo Corporation | Depth of consciousness monitor including oximeter |
US9330567B2 (en) * | 2011-11-16 | 2016-05-03 | Autoconnect Holdings Llc | Etiquette suggestion |
DE102010042547B4 (en) | 2010-10-15 | 2018-07-26 | Deutsche Telekom Ag | Method for haptic interaction in telecommunications and telecommunications terminal designed for this purpose |
US10292625B2 (en) | 2010-12-07 | 2019-05-21 | Earlysense Ltd. | Monitoring a sleeping subject |
ES2366219B1 (en) * | 2010-12-24 | 2012-09-27 | Fico Mirrors, S.A. | METHOD AND SYSTEM OF MEASUREMENT OF PHYSIOLOGICAL PARAMETERS. |
US9946334B2 (en) * | 2010-12-30 | 2018-04-17 | Denso International America, Inc. | Method to determine driver workload function and usage of driver workload function for human-machine interface performance assessment |
DE102011002920A1 (en) * | 2011-01-20 | 2012-07-26 | Robert Bosch Gmbh | Method for monitoring the posture of a motorcyclist |
US8482418B1 (en) | 2011-02-18 | 2013-07-09 | Pursuit Enterprises | Method and apparatus for monitoring and treatment of sleep-related conditions |
US9292471B2 (en) | 2011-02-18 | 2016-03-22 | Honda Motor Co., Ltd. | Coordinated vehicle response system and method for driver behavior |
US20120212345A1 (en) * | 2011-02-18 | 2012-08-23 | Polly Harman | Device for the treatment of sleep-related conditions |
US8698639B2 (en) | 2011-02-18 | 2014-04-15 | Honda Motor Co., Ltd. | System and method for responding to driver behavior |
US8731736B2 (en) | 2011-02-22 | 2014-05-20 | Honda Motor Co., Ltd. | System and method for reducing driving skill atrophy |
US20120221895A1 (en) * | 2011-02-26 | 2012-08-30 | Pulsar Informatics, Inc. | Systems and methods for competitive stimulus-response test scoring |
TWI434233B (en) | 2011-05-17 | 2014-04-11 | Ind Tech Res Inst | Predictive drowsiness alarm method |
WO2012170816A2 (en) * | 2011-06-09 | 2012-12-13 | Prinsell Jeffrey | Sleep onset detection system and method |
US9380978B2 (en) * | 2011-06-29 | 2016-07-05 | Bruce Reiner | Method and apparatus for real-time measurement and analysis of occupational stress and fatigue and performance outcome predictions |
EP2729058B1 (en) | 2011-07-05 | 2019-03-13 | Saudi Arabian Oil Company | Floor mat system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US10108783B2 (en) | 2011-07-05 | 2018-10-23 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring health of employees using mobile devices |
US9710788B2 (en) | 2011-07-05 | 2017-07-18 | Saudi Arabian Oil Company | Computer mouse system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9844344B2 (en) | 2011-07-05 | 2017-12-19 | Saudi Arabian Oil Company | Systems and method to monitor health of employee when positioned in association with a workstation |
US9526455B2 (en) | 2011-07-05 | 2016-12-27 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9492120B2 (en) | 2011-07-05 | 2016-11-15 | Saudi Arabian Oil Company | Workstation for monitoring and improving health and productivity of employees |
US8872640B2 (en) * | 2011-07-05 | 2014-10-28 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring health and ergonomic status of drivers of vehicles |
US9256711B2 (en) | 2011-07-05 | 2016-02-09 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display |
US9962083B2 (en) | 2011-07-05 | 2018-05-08 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring and improving biomechanical health of employees |
US8695692B2 (en) * | 2011-07-29 | 2014-04-15 | Baker Hughes Incorporated | Downhole condition alert system for a drill operator |
US8941499B2 (en) * | 2011-08-01 | 2015-01-27 | Honda Motor Co., Ltd. | Monitoring system for use with a vehicle and method of assembling same |
US8606492B1 (en) | 2011-08-31 | 2013-12-10 | Drivecam, Inc. | Driver log generation |
CN103974656B (en) * | 2011-09-05 | 2016-10-12 | Toyama Prefecture | Drowsiness detection method and equipment |
US8744642B2 (en) | 2011-09-16 | 2014-06-03 | Lytx, Inc. | Driver identification based on face data |
US8996234B1 (en) | 2011-10-11 | 2015-03-31 | Lytx, Inc. | Driver performance determination based on geolocation |
US9298575B2 (en) | 2011-10-12 | 2016-03-29 | Lytx, Inc. | Drive event capturing based on geolocation |
DE102011086740A1 (en) * | 2011-11-21 | 2013-05-23 | Zf Friedrichshafen Ag | Detection device for recording vital signs |
US8989914B1 (en) | 2011-12-19 | 2015-03-24 | Lytx, Inc. | Driver identification based on driving maneuver signature |
US20130204153A1 (en) * | 2012-02-06 | 2013-08-08 | Emily Ruth Buzhardt | Generating an alarm based on brain wave patterns of a user |
US9451303B2 (en) | 2012-02-27 | 2016-09-20 | The Nielsen Company (Us), Llc | Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing |
US9292858B2 (en) | 2012-02-27 | 2016-03-22 | The Nielsen Company (Us), Llc | Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments |
US9569986B2 (en) | 2012-02-27 | 2017-02-14 | The Nielsen Company (Us), Llc | System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications |
CN104246848B (en) * | 2012-04-02 | 2017-04-12 | 丰田自动车株式会社 | Driving assistance device |
US8676428B2 (en) | 2012-04-17 | 2014-03-18 | Lytx, Inc. | Server request for downloaded information from a vehicle-based monitor |
US9240079B2 (en) | 2012-04-17 | 2016-01-19 | Lytx, Inc. | Triggering a specialized data collection mode |
US10102773B2 (en) * | 2012-04-23 | 2018-10-16 | The Boeing Company | Methods for evaluating human performance in aviation |
US9004687B2 (en) | 2012-05-18 | 2015-04-14 | Sync-Think, Inc. | Eye tracking headset and system for neuropsychological testing including the detection of brain damage |
CN103505224B (en) * | 2012-06-27 | 2015-01-07 | Northeastern University | Fatigue driving remote monitoring and alarm system and method based on physiological information analysis |
DE102012014717A1 (en) * | 2012-07-25 | 2014-01-30 | Audi Ag | Method and driver assistance system for operating a vehicle in the event of a driver's health disorder |
US9728228B2 (en) | 2012-08-10 | 2017-08-08 | Smartdrive Systems, Inc. | Vehicle event playback apparatus and methods |
US8989835B2 (en) | 2012-08-17 | 2015-03-24 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
JP6215944B2 (en) * | 2012-08-20 | 2017-10-18 | Autoliv Development AB | Processing related to eyelid movement to detect drowsiness |
DE102012219508A1 (en) * | 2012-10-25 | 2014-04-30 | Robert Bosch Gmbh | Method and device for driver status detection |
JP5910755B2 (en) * | 2012-11-24 | 2016-04-27 | Toyota Motor Corporation | Vehicle state determination device, vehicle state determination method, and driving operation diagnosis device |
US12032817B2 (en) | 2012-11-27 | 2024-07-09 | Neonode Inc. | Vehicle user interface |
US9092093B2 (en) | 2012-11-27 | 2015-07-28 | Neonode Inc. | Steering wheel user interface |
US9344683B1 (en) | 2012-11-28 | 2016-05-17 | Lytx, Inc. | Capturing driving risk based on vehicle state and automatic detection of a state of a location |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
RU2499692C1 (en) * | 2012-12-12 | 2013-11-27 | Oleg Igorevich Antipov | Method for monitoring driver or dispatcher vigilance and device for preventing falling asleep |
JP6063775B2 (en) * | 2013-03-01 | 2017-01-18 | Toyobo Co., Ltd. | Dozing prevention method and dozing prevention device |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US9418617B1 (en) | 2013-03-13 | 2016-08-16 | Google Inc. | Methods and systems for receiving input controls |
US9320450B2 (en) | 2013-03-14 | 2016-04-26 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US9398875B2 (en) | 2013-11-07 | 2016-07-26 | Honda Motor Co., Ltd. | Method and system for biological signal analysis |
US9751534B2 (en) | 2013-03-15 | 2017-09-05 | Honda Motor Co., Ltd. | System and method for responding to driver state |
TWI498858B (en) * | 2013-03-29 | 2015-09-01 | Wistron Corp | Computing system and method for automatically detecting fatigue status of user |
US10499856B2 (en) | 2013-04-06 | 2019-12-10 | Honda Motor Co., Ltd. | System and method for biological signal processing with highly auto-correlated carrier sequences |
CA2909785C (en) | 2013-05-01 | 2023-09-26 | Musc Foundation For Research Development | Monitoring neurological functional status |
FI124068B (en) * | 2013-05-03 | 2014-02-28 | Jyvaeskylaen Yliopisto | A method to improve driving safety |
US20150051508A1 (en) | 2013-08-13 | 2015-02-19 | Sync-Think, Inc. | System and Method for Cognition and Oculomotor Impairment Diagnosis Using Binocular Coordination Analysis |
US11317861B2 (en) | 2013-08-13 | 2022-05-03 | Sync-Think, Inc. | Vestibular-ocular reflex test and training system |
US10115164B1 (en) | 2013-10-04 | 2018-10-30 | State Farm Mutual Automobile Insurance Company | Systems and methods to quantify and differentiate individual insurance risk based on actual driving behavior and driving environment |
NZ630770A (en) | 2013-10-09 | 2016-03-31 | Resmed Sensor Technologies Ltd | Fatigue monitoring and management system |
US9501878B2 (en) | 2013-10-16 | 2016-11-22 | Smartdrive Systems, Inc. | Vehicle event playback apparatus and methods |
JP6312193B2 (en) * | 2013-10-21 | 2018-04-18 | TS Tech Co., Ltd. | Awakening device and seat |
US9958939B2 (en) | 2013-10-31 | 2018-05-01 | Sync-Think, Inc. | System and method for dynamic content delivery based on gaze analytics |
US9610955B2 (en) | 2013-11-11 | 2017-04-04 | Smartdrive Systems, Inc. | Vehicle fuel consumption monitor and feedback systems |
DE102013224512B4 (en) * | 2013-11-29 | 2024-02-01 | Bayerische Motoren Werke Aktiengesellschaft | System and method for determining a touch threshold and vehicle |
DE102014100965B4 (en) * | 2014-01-28 | 2016-01-14 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Driver assistance system |
US20170042439A1 (en) * | 2014-02-14 | 2017-02-16 | National University Of Singapore | System, device and methods for brainwave-based technologies |
US8892310B1 (en) | 2014-02-21 | 2014-11-18 | Smartdrive Systems, Inc. | System and method to detect execution of driving maneuvers |
US9622702B2 (en) | 2014-04-03 | 2017-04-18 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
DE102014206626A1 (en) * | 2014-04-07 | 2015-10-08 | Bayerische Motoren Werke Aktiengesellschaft | Fatigue detection using data glasses (HMD) |
CN103989471B (en) * | 2014-05-08 | 2015-11-04 | Northeastern University | Method for detecting fatigue driving based on electroencephalogram recognition |
US9465981B2 (en) | 2014-05-09 | 2016-10-11 | Barron Associates, Inc. | System and method for communication |
US9476729B2 (en) * | 2014-05-29 | 2016-10-25 | GM Global Technology Operations LLC | Adaptive navigation and location-based services based on user behavior patterns |
KR102051142B1 (en) * | 2014-06-13 | 2019-12-02 | Hyundai Mobis Co., Ltd. | System for managing dangerous driving index for vehicle and method thereof |
DE102014215461B4 (en) * | 2014-08-05 | 2023-06-15 | Robert Bosch Gmbh | Method and device for operating a vehicle, in particular a railway vehicle |
JP2017530757A (en) | 2014-09-09 | 2017-10-19 | Torvec, Inc. | Method and apparatus for monitoring personal attention and providing notifications using a wearable device |
DE102014218744A1 (en) * | 2014-09-18 | 2016-03-24 | Bayerische Motoren Werke Aktiengesellschaft | A method and apparatus for improving the condition of an occupant of a vehicle |
DE102014224483A1 (en) * | 2014-12-01 | 2016-06-02 | Bayerische Motoren Werke Aktiengesellschaft | Support the breathing of a driver |
US10154815B2 (en) | 2014-10-07 | 2018-12-18 | Masimo Corporation | Modular physiological sensors |
ES2632494T3 (en) * | 2014-10-13 | 2017-09-13 | MY E.G. Services Berhad | Procedure and system to improve road safety |
US9904362B2 (en) * | 2014-10-24 | 2018-02-27 | GM Global Technology Operations LLC | Systems and methods for use at a vehicle including an eye tracking device |
US9663127B2 (en) | 2014-10-28 | 2017-05-30 | Smartdrive Systems, Inc. | Rail vehicle event detection and recording system |
DE102014222355A1 (en) * | 2014-11-03 | 2016-05-04 | Bayerische Motoren Werke Aktiengesellschaft | Fatigue detection with sensors of data glasses |
CN104305964B (en) * | 2014-11-11 | 2016-05-04 | Southeast University | Head-mounted fatigue detection device and method |
US11069257B2 (en) | 2014-11-13 | 2021-07-20 | Smartdrive Systems, Inc. | System and method for detecting a vehicle event and generating review criteria |
US20160144778A1 (en) | 2014-11-24 | 2016-05-26 | David M. Tucker | Enhanced communication system for vehicle hazard lights |
CN104382599B (en) * | 2014-12-05 | 2017-01-18 | BOE Technology Group Co., Ltd. | Method, equipment and wearable device for measuring activities of cervical vertebrae |
FR3030384B1 (en) * | 2014-12-17 | 2018-05-11 | Valeo Systemes Thermiques | METHOD FOR MAINTAINING THE VIGILANCE OF A DRIVER FOR A MOTOR VEHICLE |
CA3081166A1 (en) | 2015-01-06 | 2016-07-14 | David Burton | Mobile wearable monitoring systems |
JP6191633B2 (en) | 2015-02-20 | 2017-09-06 | トヨタ自動車株式会社 | Driving assistance device |
JP6154411B2 (en) | 2015-02-27 | 2017-06-28 | 本田技研工業株式会社 | Vehicle warning device |
US20180028088A1 (en) * | 2015-02-27 | 2018-02-01 | University Of Houston System | Systems and methods for medical procedure monitoring |
FR3033303B1 (en) * | 2015-03-03 | 2017-02-24 | Renault Sas | DEVICE AND METHOD FOR PREDICTING A LEVEL OF VIGILANCE IN A DRIVER OF A MOTOR VEHICLE. |
US10173687B2 (en) | 2015-03-16 | 2019-01-08 | Wellen Sham | Method for recognizing vehicle driver and determining whether driver can start vehicle |
US10703211B2 (en) | 2015-03-16 | 2020-07-07 | Thunder Power New Energy Vehicle Development Company Limited | Battery pack, battery charging station, and charging method |
US9954260B2 (en) | 2015-03-16 | 2018-04-24 | Thunder Power New Energy Vehicle Development Company Limited | Battery system with heat exchange device |
KR101648017B1 (en) * | 2015-03-23 | 2016-08-12 | Hyundai Motor Company | Display apparatus, vehicle and display method |
US9679420B2 (en) | 2015-04-01 | 2017-06-13 | Smartdrive Systems, Inc. | Vehicle event recording system and method |
CN204515387U (en) * | 2015-04-03 | 2015-07-29 | Leka Automobile Intelligent Technology (Beijing) Co., Ltd. | Electroencephalogram signal collection equipment and car-mounted terminal control system |
CN104840204B (en) * | 2015-04-10 | 2018-01-19 | BOE Technology Group Co., Ltd. | Fatigue driving monitoring method and equipment |
US9936250B2 (en) | 2015-05-19 | 2018-04-03 | The Nielsen Company (Us), Llc | Methods and apparatus to adjust content presented to an individual |
JP6692610B2 (en) * | 2015-06-09 | 2020-05-13 | 富士通株式会社 | Detecting program, detecting method and detecting device |
US20160362111A1 (en) * | 2015-06-12 | 2016-12-15 | Jaguar Land Rover Limited | Driver awareness sensing and indicator control |
CN104886789B (en) * | 2015-06-29 | 2016-08-17 | BOE Technology Group Co., Ltd. | Package structure |
WO2017003719A2 (en) | 2015-06-30 | 2017-01-05 | 3M Innovative Properties Company | Illuminator |
WO2017028895A1 (en) * | 2015-08-17 | 2017-02-23 | Polar Electro Oy | Enhancing vehicle system control |
WO2017031437A1 (en) * | 2015-08-20 | 2017-02-23 | Zansors Llc | Neuro-vigilance integrated contact eye lens and system |
US10319037B1 (en) * | 2015-09-01 | 2019-06-11 | State Farm Mutual Automobile Insurance Company | Systems and methods for assessing risk based on driver gesture behaviors |
CN105172599B (en) * | 2015-09-25 | 2018-03-06 | Continental Automotive Electronics (Wuhu) Co., Ltd. | Active automobile instrument system with integrated wearable device |
US9676395B2 (en) | 2015-10-30 | 2017-06-13 | Ford Global Technologies, Llc | Incapacitated driving detection and prevention |
DE102015223271A1 (en) * | 2015-11-25 | 2017-06-01 | Preh Car Connect Gmbh | A method of outputting a sequence of motion instructions |
US10642955B2 (en) | 2015-12-04 | 2020-05-05 | Saudi Arabian Oil Company | Devices, methods, and computer medium to provide real time 3D visualization bio-feedback |
US10475351B2 (en) | 2015-12-04 | 2019-11-12 | Saudi Arabian Oil Company | Systems, computer medium and methods for management training systems |
US9889311B2 (en) | 2015-12-04 | 2018-02-13 | Saudi Arabian Oil Company | Systems, protective casings for smartphones, and associated methods to enhance use of an automated external defibrillator (AED) device |
US10628770B2 (en) | 2015-12-14 | 2020-04-21 | Saudi Arabian Oil Company | Systems and methods for acquiring and employing resiliency data for leadership development |
US9712736B2 (en) * | 2015-12-15 | 2017-07-18 | Intel Corporation | Electroencephalography (EEG) camera control |
US9599986B1 (en) * | 2015-12-22 | 2017-03-21 | International Business Machines Corporation | Emergency automated vehicle control system to monitor emergency medical events through body area networks |
WO2017138940A1 (en) * | 2016-02-11 | 2017-08-17 | Ford Global Technologies, Llc | Spatial clustering of personal devices in vehicles |
US20170231545A1 (en) * | 2016-02-14 | 2017-08-17 | Earlysense Ltd. | Apparatus and methods for monitoring a subject |
US10238335B2 (en) | 2016-02-18 | 2019-03-26 | Curaegis Technologies, Inc. | Alertness prediction system and method |
WO2017182104A1 (en) * | 2016-04-20 | 2017-10-26 | Siemens Rail Automation S.A.U. | Dead man's control system and method for a vehicle |
US10360787B2 (en) | 2016-05-05 | 2019-07-23 | Hill-Rom Services, Inc. | Discriminating patient care communications system |
JP2017220097A (en) * | 2016-06-09 | 2017-12-14 | 株式会社デンソー | Drive support device |
US20170360360A1 (en) * | 2016-06-15 | 2017-12-21 | Yousef ALQURASHI | Sleep monitoring cap |
WO2018008666A1 (en) * | 2016-07-07 | 2018-01-11 | National Institute of Advanced Industrial Science and Technology | Physiological condition assessing device, physiological condition assessing method, program for physiological condition assessing device, and physiological condition assessing system |
CN107788967B (en) * | 2016-08-30 | 2021-07-06 | Winbond Electronics Corp. | Fatigue detection device and fatigue detection method |
US20180055354A1 (en) * | 2016-08-31 | 2018-03-01 | Alcohol Countermeasure Systems (International) Inc. | Novel non-intrusive approach to assess drowsiness based on eye movements and blinking |
DE112016007124T5 (en) * | 2016-09-08 | 2019-05-16 | Ford Motor Company | Methods and apparatus for monitoring a level of activity of a driver |
WO2018053511A1 (en) | 2016-09-19 | 2018-03-22 | Ntt Innovation Institute, Inc. | Threat scoring system and method |
CN106446831B (en) * | 2016-09-24 | 2021-06-25 | Jiangxi Oumaisi Microelectronics Co., Ltd. | Face recognition method and device |
WO2018098289A1 (en) * | 2016-11-23 | 2018-05-31 | Cognifisense, Inc. | Identifying and measuring bodily states and feedback systems background |
US11757857B2 (en) | 2017-01-23 | 2023-09-12 | Ntt Research, Inc. | Digital credential issuing system and method |
DE102017201405B4 (en) * | 2017-01-30 | 2018-12-13 | Audi Ag | Method for operating a motor vehicle |
US9904287B1 (en) * | 2017-05-04 | 2018-02-27 | Toyota Research Institute, Inc. | Systems and methods for mitigating vigilance decrement while maintaining readiness using augmented reality in a vehicle |
JP6874516B2 (en) * | 2017-05-11 | 2021-05-19 | Isuzu Motors Limited | Vehicle driving control system and vehicle driving control method |
US11117515B2 (en) * | 2017-05-19 | 2021-09-14 | Yazaki Corporation | Monitoring system |
JP2020115902A (en) * | 2017-05-25 | 2020-08-06 | Panasonic Intellectual Property Management Co., Ltd. | Awakening guide control device and awakening guide system |
JP6848702B2 (en) * | 2017-06-07 | 2021-03-24 | Toyota Motor Corporation | Awakening support device and awakening support method |
JP6638701B2 (en) | 2017-06-08 | 2020-01-29 | トヨタ自動車株式会社 | Driving awareness estimation device |
CN107248263A (en) * | 2017-08-08 | 2017-10-13 | BOE Technology Group Co., Ltd. | Fatigue driving detection system |
CN110998035A (en) * | 2017-08-08 | 2020-04-10 | Sumitomo Heavy Industries, Ltd. | Shovel, shovel support device, and shovel management device |
JP7155266B2 (en) * | 2017-08-23 | 2022-10-18 | NTT Research, Inc. | Systems and methods for racing data analysis using telemetry data and wearable sensor data |
JP6804416B2 (en) * | 2017-09-25 | 2020-12-23 | Yazaki Corporation | Monitoring system |
US10379535B2 (en) | 2017-10-24 | 2019-08-13 | Lear Corporation | Drowsiness sensing system |
WO2019096673A1 (en) * | 2017-11-15 | 2019-05-23 | Sony Corporation | Terminal device, infrastructure equipment and methods |
US10824132B2 (en) | 2017-12-07 | 2020-11-03 | Saudi Arabian Oil Company | Intelligent personal protective equipment |
CN107985199B (en) * | 2017-12-29 | 2023-04-07 | Jilin University | Passenger car driver working state detection and fatigue warning system and method |
KR20190088783A (en) * | 2018-01-19 | 2019-07-29 | Maestro Co., Ltd. | Apparatus and method for measuring fatigue of a user |
US10717443B2 (en) * | 2018-01-22 | 2020-07-21 | Rivian Ip Holdings, Llc | Occupant awareness monitoring for autonomous vehicles |
JP7141681B2 (en) * | 2018-01-29 | 2022-09-26 | Agama-X Co., Ltd. | Information processing device, information processing system and program |
CN112041910B (en) * | 2018-03-30 | 2023-08-18 | 索尼半导体解决方案公司 | Information processing apparatus, mobile device, method, and program |
US11372893B2 (en) | 2018-06-01 | 2022-06-28 | Ntt Security Holdings Corporation | Ensemble-based data curation pipeline for efficient label propagation |
CN109035960A (en) * | 2018-06-15 | 2018-12-18 | Jilin University | Driver's driving mode analysis system and analysis method based on simulation driving platform |
WO2020003179A1 (en) * | 2018-06-28 | 2020-01-02 | 3M Innovative Properties Company | Notification delivery for workers wearing personal protective equipment |
US10611384B1 (en) * | 2018-07-27 | 2020-04-07 | Uatc, Llc | Systems and methods for autonomous vehicle operator vigilance management |
CN108937923A (en) * | 2018-08-02 | 2018-12-07 | Yangzhou Zilu Information Technology Co., Ltd. | Real-time driving fatigue monitoring system based on EEG, electro-oculogram and electromyogram signals |
DE102018215674A1 (en) * | 2018-09-14 | 2020-03-19 | Continental Automotive Gmbh | Procedure for reducing the start-up time, vehicle, computer program and data carrier signal |
WO2020079990A1 (en) * | 2018-10-19 | 2020-04-23 | Denso Corporation | Obstacle degree calculating system, and driving guide system |
US11807227B2 (en) * | 2018-11-02 | 2023-11-07 | Intel Corporation | Methods and apparatus to generate vehicle warnings |
US11429230B2 (en) | 2018-11-28 | 2022-08-30 | Neonode Inc | Motorist user interface sensor |
WO2020123673A1 (en) | 2018-12-11 | 2020-06-18 | Ess-Help, Inc. | Enhanced operation of vehicle hazard and lighting communication systems |
US11590887B2 (en) | 2019-03-15 | 2023-02-28 | Ess-Help, Inc. | Control of high visibility vehicle light communication systems |
US11518298B2 (en) | 2019-03-15 | 2022-12-06 | ESS-Help, lnc. | High visibility lighting for autonomous vehicles |
WO2020190889A1 (en) * | 2019-03-15 | 2020-09-24 | Ess-Help, Inc. | Control of high visibility vehicle light communication systems |
KR102452526B1 (en) | 2019-03-28 | 2022-10-06 | Ess-Help, Inc. | Remote vehicle hazard and communication beacon |
US11252185B2 (en) | 2019-03-28 | 2022-02-15 | NTT Security Corporation | Graph stream mining pipeline for efficient subgraph detection |
EP3730331B1 (en) * | 2019-04-26 | 2023-03-08 | Zenuity AB | Method and device for controlling a driver assistance |
TWI749323B (en) * | 2019-04-30 | 2021-12-11 | 先進光電科技股份有限公司 | Mobile Vehicle Assist System |
EP3744253A1 (en) * | 2019-05-28 | 2020-12-02 | Kubatronik Leiterplatten GmbH | Testing apparatus for testing the condition of a user of a device, and transport device comprising such a testing apparatus |
US11571155B2 (en) * | 2019-06-06 | 2023-02-07 | Honda Motor Co., Ltd. | System and method for completing a measurement of anxiety |
KR102673306B1 (en) * | 2019-07-29 | 2024-06-10 | Hyundai Motor Company | Driver monitoring apparatus and method |
CA3150943A1 (en) | 2019-08-12 | 2021-02-18 | Ess-Help, Inc. | System for communication of hazardous vehicle and road conditions |
CN110525444B (en) * | 2019-08-20 | 2022-05-10 | Zhejiang Geely Automobile Research Institute Co., Ltd. | Method and device for processing abnormal body condition of driver |
DE102019212412A1 (en) * | 2019-08-20 | 2021-02-25 | Zf Friedrichshafen Ag | Monitoring device and method for monitoring a living being in the interior of a vehicle and computer program |
US11826146B2 (en) * | 2019-10-10 | 2023-11-28 | Waymo Llc | Psychomotor vigilance testing for persons tasked with monitoring autonomous vehicles |
CN110606214A (en) * | 2019-10-28 | 2019-12-24 | Hangyu Lifesaving Equipment Co., Ltd. | Intelligent pilot protection system architecture |
GB2589337A (en) * | 2019-11-27 | 2021-06-02 | Continental Automotive Gmbh | Method of determining fused sensor measurement and vehicle safety system using the fused sensor measurement |
US11200407B2 (en) * | 2019-12-02 | 2021-12-14 | Motorola Solutions, Inc. | Smart badge, and method, system and computer program product for badge detection and compliance |
JP7351253B2 (en) * | 2020-03-31 | 2023-09-27 | Isuzu Motors Limited | Approval/refusal decision device |
US11033214B1 (en) * | 2020-05-19 | 2021-06-15 | United Arab Emirates University | Wearable eye tracking system |
US11070454B1 (en) | 2020-06-22 | 2021-07-20 | Bank Of America Corporation | System for routing functionality packets based on monitoring real-time indicators |
DE102020210325B4 (en) | 2020-08-13 | 2024-02-08 | Volkswagen Aktiengesellschaft | Device for alleviating symptoms of Willis-Ekbom disease in a motor vehicle, motor vehicle and computer program product |
DE112021004410T5 (en) * | 2020-09-30 | 2023-07-13 | Joyson Safety Systems Acquisition Llc | STRENGTH MEASURING SEAT BELT ARRANGEMENT |
US11845441B2 (en) * | 2020-11-02 | 2023-12-19 | Imam Abdulrahman Bin Faisal University | Speed based hands-on alarm system for a steering wheel |
DE102021101208A1 (en) | 2021-01-21 | 2022-07-21 | Bayerische Motoren Werke Aktiengesellschaft | Method for operating a driver assistance system, driver assistance system and motor vehicle with a driver assistance system |
CN113616191A (en) * | 2021-06-30 | 2021-11-09 | Spreadtrum Semiconductor (Nanjing) Co., Ltd. | Vital sign monitoring method and system based on intelligent helmet, helmet and medium |
US11667303B2 (en) * | 2021-11-02 | 2023-06-06 | Robert Bosch Gmbh | Hands-off detection for autonomous and partially autonomous vehicles |
US20230271617A1 (en) * | 2022-02-25 | 2023-08-31 | Hong Kong Productivity Council | Risky driving prediction method and system based on brain-computer interface, and electronic device |
DE102022210376B3 (en) | 2022-09-30 | 2023-12-07 | Volkswagen Aktiengesellschaft | Motor vehicle and method for issuing a warning to a user of a motor vehicle wearing a contact lens on one eye |
FR3141404A1 (en) * | 2022-10-31 | 2024-05-03 | Alstom Holdings | Monitoring device for railway vehicle driver, with improved comfort |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3953831A (en) | 1973-07-09 | 1976-04-27 | Estrada Richard J | Alarm system for use with cigarette lighter receptacle of vehicle |
US3918176A (en) * | 1974-05-22 | 1975-11-11 | Us Transport | Visual divided attention alcohol safety interlock system |
US4259665A (en) | 1979-05-29 | 1981-03-31 | Rmr Systems, Inc. | Driver sleep or fatigue alarm |
DE3443644A1 (en) * | 1983-11-30 | 1985-06-05 | Aisin Seiki K.K., Kariya, Aichi | DEVICE FOR MONITORING THE DRIVER'S CONSTITUTION AND SAFETY IN A ROAD VEHICLE |
JPH0779803B2 (en) | 1987-12-09 | 1995-08-30 | Nippondenso Co., Ltd. | Doze detector |
AU632932B2 (en) | 1990-06-26 | 1993-01-14 | Compumedics Limited | Analysis system for physiological variables |
JP3369201B2 (en) | 1991-10-02 | 2003-01-20 | Mazda Motor Corporation | Arousal maintenance device |
US5289824A (en) * | 1991-12-26 | 1994-03-01 | Instromedix, Inc. | Wrist-worn ECG monitor |
US5465079A (en) | 1992-08-14 | 1995-11-07 | Vorad Safety Systems, Inc. | Method and apparatus for determining driver fitness in real time |
JP3269153B2 (en) * | 1993-01-06 | 2002-03-25 | Mitsubishi Motors Corporation | Arousal level determination device |
JPH06197888A (en) * | 1993-01-06 | 1994-07-19 | Mitsubishi Motors Corp | Doze warning device for vehicle |
JP2639619B2 (en) | 1993-09-13 | 1997-08-13 | Tsukihoshi Kasei Co., Ltd. | Anti-slip soles |
GB9323970D0 (en) * | 1993-11-22 | 1994-01-12 | Toad Innovations Ltd | Safety device |
SE508285C2 (en) | 1994-06-07 | 1998-09-21 | Biosys Ab | Method and apparatus for assessing wakefulness and drowsiness at various stages between wakefulness and sleep in a manner that does not disturb the monitored person |
US5595488A (en) * | 1994-08-04 | 1997-01-21 | Vigilant Ltd. | Apparatus and method for monitoring and improving the alertness of a subject |
US5691693A (en) * | 1995-09-28 | 1997-11-25 | Advanced Safety Concepts, Inc. | Impaired transportation vehicle operator system |
JP3512493B2 (en) | 1994-11-16 | 2004-03-29 | Pioneer Corporation | Driving mental state detection device |
US5585785A (en) | 1995-03-03 | 1996-12-17 | Gwin; Ronnie | Driver alarm |
US5689241A (en) * | 1995-04-24 | 1997-11-18 | Clarke, Sr.; James Russell | Sleep detection and driver alert apparatus |
US5682882A (en) * | 1995-04-28 | 1997-11-04 | Lieberman; Harris R. | Vigilance monitor system |
US6126595A (en) * | 1995-05-12 | 2000-10-03 | Seiko Epson Corporation | Device for diagnosing physiological state and device for controlling the same |
GB9522872D0 (en) * | 1995-11-08 | 1996-01-10 | Oxford Medical Ltd | Improvements relating to physiological monitoring |
RU2107460C1 (en) * | 1996-05-28 | 1998-03-27 | Closed Joint-Stock Company "Neurocom" | Method and device for recording galvanic skin responses |
US5813993A (en) | 1996-04-05 | 1998-09-29 | Consolidated Research Of Richmond, Inc. | Alertness and drowsiness detection and tracking system |
JP3183161B2 (en) * | 1996-04-12 | 2001-07-03 | Mitsubishi Motors Corporation | Arousal level estimation device |
IL118854A0 (en) * | 1996-07-15 | 1996-10-31 | Atlas Dan | Personal micro-monitoring and alerting device for sleepiness |
US6265978B1 (en) * | 1996-07-14 | 2001-07-24 | Atlas Researches, Ltd. | Method and apparatus for monitoring states of consciousness, drowsiness, distress, and performance |
US5942979A (en) * | 1997-04-07 | 1999-08-24 | Luppino; Richard | On guard vehicle safety warning system |
US6154123A (en) * | 1997-09-05 | 2000-11-28 | Breed Automotive Technology, Inc. | Driver alertness monitoring system |
DE19801009C1 (en) * | 1998-01-14 | 1999-04-22 | Daimler Chrysler Ag | Method of braking motor vehicle |
US6218947B1 (en) * | 2000-05-23 | 2001-04-17 | Ronald L. Sutherland | Driver sleep alarm |
1999
- 1999-12-24 AU AU22710/00A patent/AU767533B2/en not_active Ceased
- 1999-12-24 WO PCT/AU1999/001166 patent/WO2000044580A1/en active IP Right Grant
- 1999-12-24 DE DE19983911.5T patent/DE19983911B4/en not_active Expired - Lifetime
- 1999-12-24 US US09/890,324 patent/US6575902B1/en not_active Expired - Lifetime
2003
- 2003-04-15 US US10/417,247 patent/US8096946B2/en not_active Expired - Fee Related
2012
- 2012-01-17 US US13/351,857 patent/US20120179008A1/en not_active Abandoned
2014
- 2014-11-24 US US14/551,971 patent/US20150088397A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4569536A (en) * | 1983-07-04 | 1986-02-11 | Noboru Tsuge | Seat belt system |
US4660528A (en) * | 1986-03-17 | 1987-04-28 | Gene Buck | Apparatus for remote termination of the operation of a selected motor vehicle |
US5783997A (en) * | 1994-11-16 | 1998-07-21 | Pioneer Electronic Corporation | Cardiac rate measuring apparatus |
US5835868A (en) * | 1996-08-30 | 1998-11-10 | Mcelroy; Alejandro S. | Automated system for immobilizing a vehicle and method |
US6511424B1 (en) * | 1997-01-11 | 2003-01-28 | Circadian Technologies, Inc. | Method of and apparatus for evaluation and mitigation of microsleep events |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160157734A1 (en) * | 1999-09-14 | 2016-06-09 | Hoana Medical, Inc. | Passive physiological monitoring (p2m) system |
US8537000B2 (en) * | 2007-01-24 | 2013-09-17 | Toyota Jidosha Kabushiki Kaisha | Anti-drowsing device and anti-drowsing method |
US20100214087A1 (en) * | 2007-01-24 | 2010-08-26 | Toyota Jidosha Kabushiki Kaisha | Anti-drowsing device and anti-drowsing method |
US20110082628A1 (en) * | 2009-10-06 | 2011-04-07 | Brendan J. Murnane | Cruise control modification |
US10136850B2 (en) | 2009-10-14 | 2018-11-27 | Delta Tooling Co., Ltd. | Biological state estimation device, biological state estimation system, and computer program |
US9144389B2 (en) * | 2010-03-12 | 2015-09-29 | Tata Consultancy Services Limited | System for vehicle security, personalization and cardiac activity monitoring of a driver |
US20120101690A1 (en) * | 2010-03-12 | 2012-04-26 | Tata Consultancy Services Limited | System for vehicle security, personalization and cardiac activity monitoring of a driver |
US20120296528A1 (en) * | 2011-05-20 | 2012-11-22 | Matthias Marcus Wellhoefer | Haptic steering wheel, steering-wheel system and driver assistance system for a motor vehicle |
US20140155706A1 (en) * | 2011-06-17 | 2014-06-05 | Technische Universitaet Muenchen | Method and system for quantifying anaesthesia or a state of vigilance |
US9474452B2 (en) * | 2011-06-17 | 2016-10-25 | Technische Universitaet Muenchen | Method and system for quantifying anaesthesia or a state of vigilance |
US20130245894A1 (en) * | 2012-03-13 | 2013-09-19 | GM Global Technology Operations LLC | Driver assistance system |
US20130328673A1 (en) * | 2012-06-01 | 2013-12-12 | Denso Corporation | Driving ability reduction determining apparatus |
US9079526B2 (en) * | 2012-06-01 | 2015-07-14 | Denso Corporation | Driving ability reduction determining apparatus |
US9917281B2 (en) | 2012-09-07 | 2018-03-13 | Nitto Denko Corporation | Top-emitting white organic light-emitting diodes having improved efficiency and stability |
ITTV20130025A1 (en) * | 2013-02-27 | 2014-08-28 | Giorgio Marcon | ELECTRONIC SECURITY SYSTEM FOR MULTIPLE FUNCTIONS. |
WO2015048959A1 (en) * | 2013-10-01 | 2015-04-09 | Continental Teves Ag & Co. Ohg | Method and device for an automatic steering intervention |
US9889873B2 (en) * | 2013-10-01 | 2018-02-13 | Continental Teves Ag & Co. Ohg | Method and device for an automatic steering intervention |
US20160200348A1 (en) * | 2013-10-01 | 2016-07-14 | Continental Teves Ag & Co. Ohg | Method and Device for an Automatic Steering Intervention |
EP3157426B1 (en) * | 2014-06-20 | 2018-06-20 | FRAUNHOFER-GESELLSCHAFT zur Förderung der angewandten Forschung e.V. | Device, method, and computer program for detecting microsleep |
US9628565B2 (en) * | 2014-07-23 | 2017-04-18 | Here Global B.V. | Highly assisted driving platform |
US20160028824A1 (en) * | 2014-07-23 | 2016-01-28 | Here Global B.V. | Highly Assisted Driving Platform |
US11343316B2 (en) | 2014-07-23 | 2022-05-24 | Here Global B.V. | Highly assisted driving platform |
US10334049B2 (en) * | 2014-07-23 | 2019-06-25 | Here Global B.V. | Highly assisted driving platform |
EP3239958A4 (en) * | 2014-12-26 | 2018-08-15 | The Yokohama Rubber Co., Ltd. | Collision avoidance system and collision avoidance method |
US9527508B1 (en) | 2015-08-13 | 2016-12-27 | Winbond Electronics Corp. | Mobile vehicle safety apparatus and safety monitoring method thereof |
US20170287307A1 (en) * | 2016-03-31 | 2017-10-05 | Robert Bosch Gmbh | Method for furnishing a warning signal, and method for generating a pre-microsleep pattern for detection of an impending microsleep event for a vehicle |
US10152871B2 (en) * | 2016-03-31 | 2018-12-11 | Robert Bosch Gmbh | Method for furnishing a warning signal, and method for generating a pre-microsleep pattern for detection of an impending microsleep event for a vehicle |
CN107273789A (en) * | 2016-03-31 | 2017-10-20 | Robert Bosch GmbH | Method for providing a warning signal to detect an impending microsleep |
US10172566B2 (en) | 2016-05-18 | 2019-01-08 | Airbus Operations Sas | System and method for evaluating action capacities of an individual |
FR3051342A1 (en) * | 2016-05-18 | 2017-11-24 | Airbus Operations Sas | SYSTEM AND METHOD FOR EVALUATING ACTION CAPACITY OF AN INDIVIDUAL. |
US10773750B2 (en) | 2017-03-07 | 2020-09-15 | Continental Automotive Gmbh | Device and method for detecting manual guidance of a steering wheel |
US10635101B2 (en) | 2017-08-21 | 2020-04-28 | Honda Motor Co., Ltd. | Methods and systems for preventing an autonomous vehicle from transitioning from an autonomous driving mode to a manual driving mode based on a risk model |
US10657398B2 (en) * | 2018-06-26 | 2020-05-19 | David Johnson | Sleepy driver alert system and method |
CN109646024A (en) * | 2019-01-09 | 2019-04-19 | Zhejiang BrainCo Technology Co., Ltd. | Fatigue driving detection method, device and computer-readable storage medium |
WO2020204809A1 (en) * | 2019-03-29 | 2020-10-08 | Agency For Science, Technology And Research | Classifying signals for movement control of an autonomous vehicle |
CN116211310A (en) * | 2023-05-09 | 2023-06-06 | China FAW Co., Ltd. | Myoelectric sensor and detection method thereof |
Also Published As
Publication number | Publication date |
---|---|
US20150088397A1 (en) | 2015-03-26 |
US8096946B2 (en) | 2012-01-17 |
WO2000044580A1 (en) | 2000-08-03 |
US6575902B1 (en) | 2003-06-10 |
DE19983911T5 (en) | 2013-01-31 |
US20040044293A1 (en) | 2004-03-04 |
WO2000044580A8 (en) | 2000-09-28 |
AU767533B2 (en) | 2003-11-13 |
AU2271000A (en) | 2000-08-18 |
DE19983911B4 (en) | 2018-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8096946B2 (en) | Vigilance monitoring system | |
US7639146B2 (en) | Blink monitor for detecting blink occurrence in a living subject | |
US8725311B1 (en) | Driver health and fatigue monitoring system and method | |
EP0764000B1 (en) | Apparatus for monitoring and estimating the awakeness of a person | |
US8902070B2 (en) | Eye closure detection using structured illumination | |
US20140276090A1 (en) | Driver health and fatigue monitoring system and method using optics | |
CN107234968A (en) | Automotive control system based on multi-mode biological response | |
WO2015175435A1 (en) | Driver health and fatigue monitoring system and method | |
WO2015174963A1 (en) | Driver health and fatigue monitoring system and method | |
DE102012002037A1 (en) | Device for analyzing state e.g. health of driver, has processing module to analyze time-and-frequency-domain HRV parameters in digital signal to display stress state and activity of nervous system of driver on traffic light system | |
DE202012001096U1 (en) | Device for carrying out driver status analyses | |
US11751784B2 (en) | Systems and methods for detecting drowsiness in a driver of a vehicle | |
KR101259663B1 (en) | Incapacity monitor | |
AU2009253692A1 (en) | Method and device for the detection of microsleep events | |
KR20130064473A (en) | Bio-signal transfer device, vehicle control device, vehicle automatic control system and method using the same | |
WO2018207905A1 (en) | Vehicle driving control system and vehicle driving control method | |
Arefnezhad et al. | Driver drowsiness classification using data fusion of vehicle-based measures and ECG signals | |
Nasri et al. | A Review of Driver Drowsiness Detection Systems: Techniques, Advantages and Limitations | |
WO2008054460A2 (en) | Stay awake | |
Iampetch et al. | EEG-based mental fatigue prediction for driving application | |
Stork et al. | Various approaches to driver fatigue detection: A review | |
KR101932147B1 (en) | Sleepiness diagnostic method and apparatus using electrocardiogram signals for drivers | |
KR100637797B1 (en) | Apparatus for drowsiness detection while driving based on unconstrained respiration measurement | |
Lee et al. | Development of a Real-Time Driver Health Detection System Using a Smart Steering Wheel | |
NAKAGAWA et al. | Basic Study on Assessment of the Driver's Condition Based on Physiological Indices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION