Nothing Special   »   [go: up one dir, main page]

US20150112606A1 - Calculating Pulse Transit Time - Google Patents

Calculating Pulse Transit Time Download PDF

Info

Publication number
US20150112606A1
US20150112606A1 US14/521,829 US201414521829A US2015112606A1 US 20150112606 A1 US20150112606 A1 US 20150112606A1 US 201414521829 A US201414521829 A US 201414521829A US 2015112606 A1 US2015112606 A1 US 2015112606A1
Authority
US
United States
Prior art keywords
subject
implementations
data
time
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/521,829
Inventor
David Da He
Richard Robehr Bijjani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert F Dudley As Trustee Of Quanttus Liquidating Trust
Original Assignee
Quanttus Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Quanttus Inc filed Critical Quanttus Inc
Priority to US14/521,829 priority Critical patent/US20150112606A1/en
Assigned to QUANTTUS, INC. reassignment QUANTTUS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BIJJANI, RICHARD ROBEHR, HE, DAVID DA
Priority to US14/630,288 priority patent/US20150164351A1/en
Publication of US20150112606A1 publication Critical patent/US20150112606A1/en
Assigned to ROBERT F. DUDLEY, AS TRUSTEE OF THE QUANTTUS LIQUIDATING TRUST reassignment ROBERT F. DUDLEY, AS TRUSTEE OF THE QUANTTUS LIQUIDATING TRUST ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: QUANTTUS, INC.
Abandoned legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0453Sensor means for detecting worn on the body to detect health condition by physiological monitoring, e.g. electrocardiogram, temperature, breathing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022Monitoring a patient using a global network, e.g. telephone networks, internet
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/02028Determining haemodynamic parameters not otherwise provided for, e.g. cardiac contractility or left ventricular ejection fraction
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/02055Simultaneously evaluating both cardiovascular condition and temperature
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/021Measuring pressure in heart or blood vessels
    • A61B5/02108Measuring pressure in heart or blood vessels from analysis of pulse wave characteristics
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/021Measuring pressure in heart or blood vessels
    • A61B5/02108Measuring pressure in heart or blood vessels from analysis of pulse wave characteristics
    • A61B5/02125Measuring pressure in heart or blood vessels from analysis of pulse wave characteristics of pulse wave propagation time
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02416Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02416Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B5/02427Details of sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/026Measuring blood flow
    • A61B5/0285Measuring or recording phase velocity of blood waves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/026Measuring blood flow
    • A61B5/029Measuring or recording blood output from the heart, e.g. minute volume
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1102Ballistocardiography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118Determining activity level
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/14542Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring blood gases
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4809Sleep detection, i.e. determining whether a subject is asleep or not
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4815Sleep quality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4818Sleep apnoea
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4833Assessment of subject's compliance to treatment
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/486Bio-feedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/681Wristwatch-type devices
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6824Arm or wrist
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7275Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7278Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7282Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/746Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/06Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/02007Evaluating blood vessel condition, e.g. elasticity, compliance
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02405Determining heart rate variability
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves 
    • A61B5/053Measuring electrical impedance or conductance of a portion of the body
    • A61B5/0531Measuring skin impedance
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816Measuring devices for examining respiratory frequency
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/318Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B5/346Analysis of electrocardiograms
    • A61B5/349Detecting specific parameters of the electrocardiograph cycle
    • A61B5/363Detecting tachycardia or bradycardia
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Definitions

  • This document describes technology related to consumer biometric devices.
  • Various types of sensors can be used for sensing biometric parameters.
  • a method in one aspect, includes obtaining, using a first sensor, a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject. The method also includes obtaining, using a second sensor, a second data set representing time-varying information about motion of the subject at the first body part of a subject. The method also includes identifying, using one or more processors, a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part. The method also includes identifying, using the one or more processors, a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • PTT pulse transit time
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including obtaining a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject.
  • the operations also include obtaining a second data set representing time-varying information about motion of the subject at the first body part of a subject.
  • the operations also include identifying a first point in the first data set. The first point represents an arrival time of the pulse pressure wave at the first body part.
  • the operations also include identifying a second point in the second dataset. The second point represents an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the operations also include computing a pulse transit time (PTT) as a difference between the first and second points.
  • the PTT represents a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • the device also includes a second sensor configured to obtain a second data set representing time-varying information about motion of the subject at the first body part of a subject.
  • the device also includes memory.
  • the device also includes one or more processors.
  • the one or more processors are configured to receive the first and second data sets.
  • the one or more processors are also configured to identify a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part.
  • the one or more processors are also configured to identify a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the one or more processors are also configured to compute a pulse transit time (PTT) as a difference between the first and second points.
  • PTT represents a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • Implementations can include one or more of the following features.
  • the information about the at least one pulse pressure wave includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • data including at least one of the first data set and the second data set is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the first body part is an arm of the subject.
  • the first body part is a wrist of the subject.
  • the first sensor includes an optical sensor and the second sensor includes an accelerometer or a gyroscope.
  • identifying the first point includes computing, by the one or more processors, a cross-correlation of a template segment with each of multiple segments of the first dataset. Identifying the first point also includes identifying, based on the computed cross-correlations, at least one candidate segment of the first dataset as including the first point. Identifying the first point also includes identifying, by the one or more processors, a first feature within the identified candidate segment as the first point.
  • identifying the second point includes determining a reference point in the second data set, the reference point corresponding to substantially the same point in time as the first point in the first data set. Identifying the second point also includes identifying one or more target features within a predetermined time range relative to the reference point. Identifying the second point also includes selecting a time point corresponding to one of the target features as the second point.
  • the target features includes at least one of a peak and a valley.
  • the method also includes computing a blood pressure of the subject as a function of the PTT.
  • the blood pressure includes a systolic pressure and a diastolic pressure.
  • a diastolic pressure is calculated as a linear function of the logarithm of the PTT.
  • a systolic pressure is calculated as a linear function of the diastolic pressure.
  • the pre-determined time range is associated with the systole portion of the subject's heartbeat.
  • the method also includes accepting user-input for initiating computation of the PTT.
  • the method also includes computing arterial stiffness as a function of the PTT.
  • the device also includes a mechanism that allows the device to be worn by the subject.
  • the mechanism does not reduce a mobility of the subject.
  • the one or more processors are also configured to compute a blood pressure of the subject as a function of the PTT.
  • the device also includes an input mechanism configured to accept user-input for initiating computation of the PTT.
  • the one or more processors are also configured to compute arterial stiffness as a function of the PTT.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes detecting arrhythmia of the subject based on the data.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the operations also include detecting arrhythmia of the subject based on the data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to detect arrhythmia of the subject based on the data.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the arrhythmia includes atrial fibrillation (AFIB).
  • AFIB atrial fibrillation
  • the arrhythmia includes atrial flutter.
  • the method also includes identifying, based on gross motion data of the subject, one or more period of high activity of the subject.
  • the data that the arrhythmia detection is based on does not include data collected during the one or more periods of high activity.
  • the data that the arrhythmia detection is based on includes data collected during the one or more periods of high activity.
  • processing the data includes plotting R wave to R wave intervals (RR i ) versus next consecutive R wave to R wave intervals (RR i+1 ).
  • processing the data includes determining whether a spread of plotted data points exceeds a predetermined spread value.
  • the method also includes determining that the subject experienced atrial fibrillation (AFIB) if the spread of the plotted data points exceeds the predetermined spread value.
  • AFIB atrial fibrillation
  • processing the data includes determining whether multiple clusters of plotted data points are offset from a diagonal.
  • the method also includes determining that the subject experienced atrial flutter if there are multiple clusters of plotted data points offset from the diagonal.
  • processing the data includes determining one or more of heart rate, heart rate variability, and blood pressure of the subject.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • Atrial fibrillation is detected if the heart rate variability of the subject crosses a threshold.
  • determining the blood pressure of the subject includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. Determining the blood pressure of the subject also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. Determining the blood pressure of the subject also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject, wherein the PTT is related to an elasticity of one or more blood vessels of the subject. Determining the blood pressure of the subject also includes determining the blood pressure of the subject based on the elasticity of the one or more blood vessels.
  • PTT pulse transit time
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • processing the data includes plotting R wave to R wave intervals (RR i ) versus next consecutive R wave to R wave intervals (RR i+1 ).
  • processing the data includes determining whether a spread of plotted data points exceeds a predetermined spread value.
  • the processor is also configured to determine that the subject experienced atrial fibrillation (AFIB) if the spread of the plotted data points exceeds the predetermined spread value.
  • AFIB atrial fibrillation
  • processing the data includes determining whether multiple clusters of plotted data points are offset from a diagonal.
  • the processor is also configured to determine that the subject experienced atrial flutter if there are multiple clusters of plotted data points offset from the diagonal.
  • a method in another aspect, includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects acquired at a location of each of the subjects.
  • the method also includes processing data that represents time-varying information about motion of the one or more subjects acquired at the location of each of the subjects.
  • the method also includes determining, based on the data, a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects acquired at a location of each of the subjects.
  • the operations also include processing data that represents time-varying information about motion of the one or more subjects acquired at the location of each of the subjects.
  • the operations also include determining, based on the data, a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects.
  • a biofeedback device configured to be worn by one or more subjects includes a light source configured to emit light toward the skin of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor.
  • the processor is also configured to determine, based on the data, a quality of care provided to one or more subjects by a care facility that cares for the one or more subjects.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subjects includes photoplethysmographic (PPG) data and the information about motion of the subjects includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at single locations of each of the subjects.
  • the data is acquired by devices worn by the subjects.
  • the devices are mobile and do not reduce mobility of the subjects.
  • the devices process the data.
  • the single location of each of the subjects is an arm of the subject.
  • the single location is a wrist of the subject.
  • determining a quality of care provided to the one or more subjects includes determining a level of physical activity experienced by each of the one or more subjects by comparing gross motion data of each subject to a threshold value.
  • the threshold is based on a metric defined by a health organization.
  • the level of physical activity includes an amount of time that each subject has exercised over a particular time period.
  • the level of physical activity includes an amount of time or a distance that each subject has walked over a particular time period.
  • the method also includes processing data that represents information about an amount of ultraviolet light that each of the one or more subjects has been exposed to over a particular time period.
  • the method also includes determining an amount of time that each of the one or more subjects has spent outside over the particular time period based on the information about the ultraviolet light.
  • the method also includes comparing the quality of care provided by the care facility to a quality of care provided by another care facility that cares for one or more other subjects.
  • the device also includes an ultraviolet light sensor configured to measure levels of ultraviolet light that each of the one or more subjects is exposed to over a particular time period.
  • the processor is also configured to process data that represents information about the levels of ultraviolet light that each of the one or more subjects is exposed to over the particular time period.
  • the processor is also configured to determine an amount of time that each of the one or more subjects has spent outside over the particular time period based on the information about the levels of ultraviolet light.
  • determining the quality of care provided to the one or more subjects includes determining a level of physical activity experienced by each of the one or more subjects by comparing gross motion data of each subject to a threshold value.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation associated with risk indicated by the data.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation associated with risk indicated by the data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to process the data to determine whether the subject is in a situation associated with risk and to derive a measure of a level of risk associated with the subject.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes using the processed data to derive a measure of a level of risk associated with the subject.
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • PTT pulse transit time
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • the method also includes determining a blood pressure of the subject based on the PTT.
  • the risk includes trauma to the subject and the data is indicative of the existence of the trauma.
  • the method also includes providing the processed data to a party that is responding to the trauma.
  • the processed data is transmitted from a device worn by the subject to a remote device.
  • the remote device is a server associated with an emergency service provider.
  • the processed data is provided to the party before the party has reached the subject.
  • the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in additional subjects acquired at a location of each of the subjects.
  • the method also includes processing data that represents time-varying information about motion of the additional subjects acquired at the location of each of the subjects.
  • the data is acquired while the additional subjects are in the situation associated with the risk, and the risk includes trauma.
  • the method also includes providing the processed data for the subject and the additional subjects to a party that is responding to the trauma, before the party has reached the subjects.
  • the processed data is transmitted from devices worn by the subjects to a remote device.
  • the remote device is a server associated with an emergency service provider.
  • the method also includes providing information to the party that enables the party to assess a level of risk associated with each of the subjects before the party has reached the subjects.
  • the method also includes providing the processed data to a medical facility to which the subject is taken for medical care.
  • the risk includes trauma.
  • providing the processed data to a medical facility includes providing the processed data to an urgent care division of the medical facility.
  • the information is provided to the urgent care division before the subject is treated by the urgent care division.
  • the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in additional subjects acquired at a location of each of the subjects.
  • the method also includes processing data that represents time-varying information about motion of the additional subjects acquired at the location of each of the subjects. The data is acquired while the additional subjects are in the situation associated with the risk.
  • providing the processed data to a medical facility includes providing the processed data to an urgent care division of the medical facility.
  • the information is provided to the urgent care division before one or more of the subjects are treated by the urgent care division.
  • the subjects are treated in an order that is based on a severity of an injury.
  • relatively more severely injured subjects are treated before relatively less severely injured subjects.
  • the processed data is used to determine the subject's compliance with a particular standard of care throughout a progression of steps of the standard of care.
  • the processed data is used to determine whether the subject is receiving care that is appropriate according to a particular standard of care.
  • the data is processed after the subject is in the situation associated with risk.
  • the processing of the data occurs after the data has been acquired and with a short enough delay to enable an effect of the risk to be resolved.
  • the situation includes firefighting.
  • the situation includes a natural disaster or a sudden act of violence.
  • the risk includes one or more of heart failure, emotional stress, abnormal skin temperature, abnormal body temperature, hypertension, heart attack, stroke, arrhythmia, exhaustion, and anxiety.
  • the method also includes determining one or more of a blood pressure, a skin temperature, a body temperature, a heart rate, and a heart rate variability of the subject based on the datasets.
  • the method also includes detecting emotional stress in the subject by determining whether one or more of the determined blood pressure, heart rate, and heart rate variability of the subject is a predetermined amount above a threshold.
  • the data indicates that the subject is about to experience an effect of one of the risks.
  • the risk includes overexposure of the subject to ultraviolet light.
  • the method also includes processing data that represents information about an amount of ultraviolet light that the subject has been exposed to.
  • the method also includes comparing the amount of ultraviolet light that the subject has been exposed to a threshold to determine whether the subject has been overexposed to ultraviolet light.
  • the method also includes alerting the subject if the subject has been overexposed to ultraviolet light.
  • the risk includes trauma to the subject and the data is indicative of the existence of the trauma.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive and process the data from the motion sensor.
  • the processor is also configured to cause the biofeedback device to provide the processed data to a party that is responding to the trauma.
  • the processor is also configured to cause the biofeedback device to provide the processed data to a remote device.
  • the remote device is a server associated with an emergency service provider.
  • the processor is also configured to cause the biofeedback device to provide the processed data to a medical facility to which the subject is taken for medical care.
  • the device also includes a transceiver configured to provide the processed data.
  • the processed data is used to determine the subject's compliance with a particular standard of care throughout a progression of steps of the standard of care.
  • the processed data is used to determine whether the subject is receiving care that is appropriate according to a particular standard of care.
  • the risk includes overexposure of the subject to ultraviolet light.
  • the device also includes an ultraviolet light sensor configured to measure an amount of ultraviolet light that the subject is exposed to.
  • the processor is also configured to process data that represents information about the amount of ultraviolet light that the subject is exposed to.
  • the processor is also configured to compare the amount of ultraviolet light that the subject is exposed to a threshold to determine whether the subject has been overexposed to ultraviolet light.
  • the device is also configured to alert the subject if the subject has been overexposed to ultraviolet light.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes providing information related to the data to a remote device.
  • a system in another aspect, includes a remote device and a biofeedback device configured to be worn by a subject.
  • the biofeedback device includes a light source configured to emit light toward the skin of the subject.
  • the biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to provide information related to the data to a remote device.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include providing information related to the data to a remote device.
  • the biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to provide information related to the data to a remote device.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the remote device is a server.
  • the method also includes determining, based on the data in the first and second datasets, that the subject is experiencing or has experienced a health-related problem.
  • the method also includes causing the remote device to alert one or both of a caregiver and the subject that the subject is experiencing or has experienced a health-related problem.
  • the method also includes causing the remote device to alert the subject that the subject is experiencing a health-related problem.
  • the remote device sends an alert to a device worn by the subject that acquires the data.
  • the remote device sends an alert to a mobile phone of the subject.
  • determining that the subject is experiencing or has experienced a health-related problem includes determining whether a blood pressure of the subject satisfies a threshold.
  • wherein the health-related problem is hypertension.
  • determining that the subject is experiencing or has experienced a health-related problem includes determining a rate of change of a blood pressure of the subject.
  • the medical event is a stroke
  • the subject is determined to be having a stroke if the rate of change of the blood pressure of the subject is positive and above a threshold.
  • the medical event is abnormal heart function
  • the subject is determined to be experiencing abnormal heart function if the rate of change of the blood pressure of the subject is negative and below a threshold.
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • PTT pulse transit time
  • the blood pressure of the subject is determined based on the PTT.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • determining that the subject is experiencing a health-related problem includes determining whether a heart rate of the subject satisfies a threshold.
  • the health-related problem is tachycardia.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys in the first dataset.
  • determining that the subject is experiencing a health-related problem includes determining whether a heart rate variability of the subject satisfies a threshold.
  • the threshold is based on whether the subject experiences arrhythmia.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • determining that the subject has experienced a health-related problem includes determining whether the subject has sustained an impact of a magnitude that satisfies a threshold.
  • determining the magnitude of the impact includes analyzing gross motion data of the subject at the time of the impact.
  • the health-related problem is a concussion.
  • the method also includes determining, based on the data in the first and second datasets, that the subject is about to experience a health-related problem.
  • the method also includes causing the remote device to alert a caregiver that the subject is about to experience a health-related problem.
  • the method also includes causing the remote device to alert the subject that the subject is about to experience a health-related problem.
  • the remote device sends an alert to a device worn by the subject that acquires the data.
  • the remote device sends an alert to a mobile phone of the subject.
  • determining that the subject is about to experience a health-related problem includes determining whether a blood pressure of the subject satisfies a threshold.
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • PTT pulse transit time
  • the blood pressure of the subject is determined based on the PTT.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • determining that the subject is about to experience a health-related problem includes determining whether a heart rate of the subject satisfies a threshold.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys in the first dataset.
  • determining that the subject is about to experience a health-related problem includes determining whether a heart rate variability of the subject satisfies a threshold.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • the method also includes providing location information related to the subject to the remote device.
  • the location information is provided by a location module of a device worn by the subject that acquires the data.
  • the location module is a GPS transponder.
  • the method also includes providing temperature information related to the subject to the remote device.
  • the remote device is a thermostat.
  • the subject is remote from a location that is temperature-controlled by the thermostat.
  • the thermostat is configured to adjust its temperature settings based on the temperature information related to the subject.
  • a time when the thermostat adjusts its temperature settings is based on the location information related to the subject.
  • the thermostat adjusts its temperature settings when the location information indicates that the subject is within a predefined distance from a location that is temperature-controlled by the thermostat.
  • the remote device is a light.
  • the subject is remote from a location that can be illuminated by the light.
  • the light is configured to adjust its lighting settings at a time that is based on the location information related to the subject.
  • the light adjusts its lighting settings when the location information indicates that the subject is within a predefined distance from a location that is lighting-controlled by the light.
  • the method also includes determining that the subject is interacting with a particular object based on a location of the subject.
  • the remote device is a server.
  • the particular object is an advertisement.
  • the particular object is a product display.
  • the particular object is a retail product.
  • the location of the subject is determined by a GPS module of a device worn by the subject that acquires the data.
  • the location of the subject is determined based on a strength of a wireless connection between a device worn by the subject that acquires the data and one or more proximity sensors.
  • a relatively higher strength of the wireless connection between the device and the proximity sensor indicates that the device is relatively closer to the proximity sensor.
  • the wireless connection is a Bluetooth connection.
  • the method also includes determining, based on the processed data, that the subject is experiencing one or more of an increase in heart rate, blood pressure, and respiratory rate while the subject is interacting with the particular object.
  • the method also includes inferring that the subject is interested in the particular object based on one or more of the heart rate, the blood pressure, and the respiratory rate of the subject while the subject is interacting with the particular object.
  • the remote device is an entertainment device.
  • the entertainment device is a television.
  • the entertainment device is an audio output device.
  • the entertainment device is a gaming device.
  • the processed data indicates whether the subject has exercised for a predetermined length of time
  • the entertainment device can be turned on only if the subject has exercised for the predetermined length of time.
  • the entertainment device is configured to provide content personalized for the subject based on a state of the subject as determined from the processed data.
  • the state of the subject includes a level of interest in the content provided by the entertainment device.
  • a rise in one or more of a heart rate, a heart rate variability, an electrical skin impedance, a respiratory rate, and a blood pressure of the subject while the subject is experiencing the content indicates an increased level of interest in the content.
  • the heart rate, the heart rate variability, the electrical skin impedance, and the respiratory rate of the subject are determined from the processed data.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the blood pressure of the subject is determined from the processed data.
  • the entertainment device provides content designed to excite the subject if the heart rate variability of the subject is within a predefined range.
  • the entertainment device provides content designed to excite the subject if one or more of the heart rate, the electrical skin impedance, the respiratory rate, and the blood pressure of the subject is below a respective threshold.
  • the state of the subject includes a level of stress of the subject while the subject is experiencing the content.
  • a rise in one or more of a heart rate, a heart rate variability, an electrical skin impedance, a respiratory rate, and a blood pressure of the subject while the subject is experiencing the content indicates an increased level of interest in the content.
  • the heart rate, the heart rate variability, the electrical skin impedance, and the respiratory rate of the subject are determined from the processed data.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject, wherein the blood pressure of the subject is determined from the processed data.
  • the entertainment device provides content designed to calm the subject if the heart rate variability of the subject is within a predefined range.
  • the entertainment device provides content designed to calm the subject if one or more of the heart rate, the electrical skin impedance, the respiratory rate, and the blood pressure of the subject is above a respective threshold.
  • the entertainment device is a television and the content includes one or more of television shows, movies, and games.
  • the entertainment device is a gaming device that is configured to adjust game settings based on a state of the subject as determined from the processed data.
  • game settings include one or more of difficulty settings, sound settings, and situational settings.
  • the entertainment device is configured to turn off based on a state of the subject as determined from the processed data.
  • the method also includes causing the remote device to adjust a dating preference in a dating profile of the subject based on a state of the subject as determined from the processed data.
  • the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in one or more other subjects acquired at locations on the other subjects.
  • the method also includes processing data that represents time-varying information about motion of the one or more other subjects acquired at the locations on the other subjects.
  • the method also includes determining a compatibility between the subject and each of the other subjects based on states of the subjects as determined from the data.
  • the method also includes ranking the compatibilities between the subject and each of the other subjects.
  • the remote device is a device operated by the subject.
  • the method also includes determining, based on the data in the first and second datasets, that the subject is not adequately alert.
  • determining that the subject is not adequately alert is based on one or more of a heart rate, a respiratory rate, a blood pressure, and an activity level of the subject.
  • determining that the subject is not adequately alert includes determining, based on the processed data, whether one or more of the heart rate, the respiratory rate, the blood pressure, and the activity level of the subject is below a threshold.
  • the method also includes causing the device to activate an alarm if the subject is not adequately alert.
  • the method also includes causing the device to slow down if the subject is not adequately alert.
  • the device is a vehicle.
  • the data is acquired by the device and the device is wearable by the subject.
  • the method also includes causing an alarm of the wearable device to be activated if the subject is not adequately alert.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the processor is also configured to determine, based on the received data, that the subject is experiencing or has experienced a health-related problem.
  • the processor is also configured to determine, based on the received data, that the subject is about to experience a health-related problem.
  • the processor is also configured to cause the remote device to alert a caregiver that the subject is experiencing, has experienced, or is about to experience a health-related problem.
  • the processor is also configured to cause the remote device to alert the subject that the subject is experiencing, has experienced, or is about to experience a health-related problem.
  • the remote device sends an alert to the biofeedback device.
  • the remote device sends an alert to a mobile phone of the subject.
  • the processor is also configured to provide location information related to the subject to the remote device.
  • the biofeedback device also includes a location module configured to provide the location information related to the subject to the remote device.
  • the location module is a GPS transponder.
  • the processor is also configured to provide temperature information related to the subject to the remote device.
  • the processor is also configured to determine that the subject is interacting with a particular object based on a location of the subject.
  • the remote device is a server.
  • the particular object is an advertisement.
  • the particular object is a product display.
  • the particular object is a retail product.
  • the location of the subject is determined by the GPS module of the biofeedback device.
  • the location of the subject is determined based on a strength of a wireless connection between the biofeedback device and one or more proximity sensors.
  • a relatively higher strength of the wireless connection between the biofeedback device and the proximity sensor indicates that the biofeedback device is relatively closer to the proximity sensor.
  • the wireless connection is a Bluetooth connection.
  • the remote device is a device operated by the subject.
  • the processor is also configured to determine, based on the received data, that the subject is not adequately alert.
  • the processor is also configured to cause the biofeedback device to activate an alarm if the subject is not adequately alert.
  • the processor is also configured to cause the device operated by the subject to slow down if the subject is not adequately alert.
  • the device is a vehicle.
  • a method in another aspect, includes deriving a score associated with a state of a subject, the state of the subject being one or more members selected from the group consisting of health, sleep, fitness, and stress. Deriving the score is based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired at a location of the subject.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including deriving a score associated with a state of a subject.
  • the state of the subject is one or more members selected from the group consisting of health, sleep, fitness, and stress. Deriving the score is based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired at a location of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor.
  • the processor is also configured to derive a score associated with a state of the subject, the state of the subject being one or more members selected from the group consisting of health, sleep, fitness, and stress.
  • Implementations can include one or more of the following features.
  • deriving the score is also based on data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the score is a numerical value.
  • the numerical value is between 1 and 100.
  • the numerical value is between 1 and 10.
  • the data is acquired by a device that is worn by the subject and that displays the score.
  • the device worn by the subject derives the score.
  • the device worn by the subject provides the data to a remote device that derives the score.
  • the remote device is a server.
  • the remote device provides the score to the device worn by the subject.
  • the remote device provides the score to a mobile phone of the subject.
  • the score is provided to one or both of the subject and another party.
  • the state of the subject includes a sleep state
  • the score includes a sleep score
  • the sleep score is associated with a level of quality of the subject's sleep.
  • deriving the score includes identifying one or more potential sleep rest periods of the subject based on gross motion data of the subject.
  • deriving the score also includes calculating one or more of an average heart rate, a standard deviation of the average heart rate, and an average heart rate variability of the subject during each of the one or more potential sleep rest periods based on the information about at least one pulse pressure wave propagating through blood in the subject.
  • one or more of the potential sleep rest periods are identified as sleep rest periods by comparing one or more of the average heart rate, the standard deviation of the average heart rate, and the average heart rate variability of the subject during the respective potential sleep rest period to a threshold.
  • the sleep state of the subject is associated with one or more of sleep duration, sleep latency, and sleep staging.
  • deriving the score includes determining one or more of the sleep duration, the sleep latency, and the sleep staging of the subject.
  • the method also includes determining the sleep duration of the subject.
  • determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • the information related to the one or more sleep rest periods includes a time associated with a beginning of each sleep rest period, a time associated with an end of each sleep rest period, gross motion data of the subject during each sleep rest period, and heart rate data of the subject during each sleep rest period.
  • determining the sleep duration of the subject includes determining a percentage of time that the subject was asleep between a time when the subject started to try to fall asleep and a time when the subject awoke based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the method also includes determining the sleep latency of the subject.
  • determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the method also includes determining the sleep staging of the subject.
  • determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • the data is acquired by a device that is worn by the subject.
  • the method also includes causing the device to calculate and display the sleep score when the subject is determined to have awoken.
  • the method also includes providing information to the subject that assists the subject in improving the sleep score.
  • the information includes a recommended sleep schedule.
  • the information is provided to a device that is worn by the subject that acquires the data.
  • the information is provided to a mobile phone of the subject.
  • the state of the subject includes a fitness state
  • the score includes a fitness score
  • the fitness score is associated with one or more of a degree of physical fitness, cardiac condition, coaching, dehydration, social interaction, adherence to a regimen, and coaching effectiveness of the subject.
  • deriving the score includes calculating a resting heart rate of the subject while the subject is inactive based on the information about at least one pulse pressure wave propagating through blood in the subject and gross motion data of the subject.
  • deriving the score also includes calculating a heart rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject. Deriving the score also includes determining that the subject is in the fitness state based on the heart rate and the gross motion data of the subject.
  • deriving the score includes determining a length of time that it takes for the subject's heart rate to transition from the heart rate in the fitness state to the resting heart rate.
  • deriving the score includes determining a length of time that it takes for the subject's heart rate to transition from the resting heart rate to the heart rate in the fitness state.
  • the data is acquired by a device that is worn by the subject.
  • the method also includes causing the device to calculate and display the fitness score when the subject is determined to be in the fitness state.
  • the method also includes causing the device to calculate and display the fitness score when the subject is determined to have transitioned from the fitness state to a non-fitness state.
  • the method also includes providing information to the subject that assists the subject in improving the fitness score.
  • the information includes a recommended fitness routine.
  • the information is provided to a device that is worn by the subject that acquires the data.
  • the information is provided to a mobile phone of the subject.
  • the method also includes embedding a visual indication of one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject into a video showing the subject performing a fitness routine.
  • the visual indications are updated throughout the video according to the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the fitness routine.
  • the method also includes predicting an outcome of an athletic event that the subject is participating in based on one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject during the athletic event.
  • the method also includes comparing one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of other individuals who are participating in the athletic event.
  • the method also includes, while the subject is performing physical activity, comparing one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of one or more individuals who have previously performed the physical activity.
  • performing the physical activity includes performing an athletic event, and the one or more individuals are professional athletes who compete in the athletic event.
  • the state of the subject includes a stress state
  • the score includes a stress score
  • deriving the score includes calculating one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject and information about motion of the subject.
  • the stress state of the subject is associated with hypertension
  • deriving the score includes determining whether the subject is experiencing hypertension by comparing a blood pressure of the subject to a threshold.
  • the stress state of the subject is associated with emotional stress
  • deriving the score includes determining a level of emotional stress experienced by the subject by comparing one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject to a threshold.
  • determining the level of emotional stress experienced by the subject is based at least in part on audio data.
  • the audio data is captured by a microphone of a device that acquires the data in the first dataset.
  • the audio data includes one or both of environmental noise and a tonality of the subject's voice.
  • determining the level of emotional stress experienced by the subject includes analyzing the environmental noise to determine whether the subject is in an environment attributed to an increased emotional stress level.
  • determining the level of emotional stress experienced by the subject includes analyzing the tonality of the subject's voice to determine whether the subject is in a confrontational situation attributed to an increased emotional stress level.
  • the data is acquired by a device that is worn by the subject.
  • the method also includes causing the device to calculate and display the stress score when the subject is determined to be in the stress state.
  • the method also includes providing information to the subject that assists the subject in improving the stress score.
  • the information includes a recommended stress-reducing routine.
  • the information is provided to a device that is worn by the subject that acquires the data.
  • the information is provided to a mobile phone of the subject.
  • the state of the subject includes a sleep state
  • the score includes a sleep score
  • the sleep state of the subject is associated with one or more of sleep duration, sleep latency, and sleep staging
  • deriving the score includes determining one or more of the sleep duration, the sleep latency, and the sleep staging of the subject.
  • the processor is also configured to determine the sleep duration of the subject.
  • determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • determining the sleep duration of the subject includes determining a percentage of time that the subject was asleep between a time when the subject started to try to fall asleep and a time when the subject awoke based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the processor is also configured to determine the sleep latency of the subject.
  • determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • the processor is also configured to determine the sleep staging of the subject.
  • determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • the biofeedback device also includes a display, and the processor is also configured to cause the display to display the sleep score.
  • the processor causes the display to display the sleep score when the subject is determined to have awoken.
  • the state of the subject includes a fitness state
  • the score includes a fitness score
  • the fitness score is associated with one or more of a degree of physical fitness, cardiac condition, coaching, dehydration, social interaction, adherence to a regimen, and coaching effectiveness of the subject.
  • deriving the score includes calculating a resting heart rate of the subject while the subject is inactive based on the information about at least one pulse pressure wave propagating through blood in the subject and gross motion data of the subject.
  • deriving the score also includes calculating a heart rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject. Deriving the score also includes determining that the subject is in the fitness state based on the heart rate and the gross motion data of the subject.
  • deriving the score also includes determining a length of time that it takes for the subject's heart rate to transition from the heart rate in the fitness state to the resting heart rate.
  • deriving the score also includes determining a length of time that it takes for the subject's heart rate to transition from the resting heart rate to the heart rate in the fitness state.
  • the processor is also configured to cause the display to display the fitness score.
  • the processor causes the display to display the fitness score when the subject is determined to be in the fitness state.
  • the processor causes the display to display the fitness score when the subject is determined to have transitioned from the fitness state to a non-fitness state.
  • the processor is also configured to determine one or more of a heart rate, a respiratory rate, and a blood pressure of the subject based on data received from one or more of the light-emitting element, the optical sensor, and the motion sensor.
  • the device also includes a transceiver, and the processor is configured to cause the transceiver to provide one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to a remote device.
  • the processor causes the transceiver to provide one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to a video that shows the subject performing a fitness routine.
  • a visual indication of one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject is embedded into the video.
  • the visual indications are updated throughout the video according to the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the fitness routine.
  • the processor is also configured to predict an outcome of an athletic event that the subject is participating in based on one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the athletic event.
  • the transceiver is configured to communicate with transceivers of other biofeedback devices.
  • the processor is also configured to compare one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of other individuals who are participating in the athletic event.
  • the processor is also configured to, while the subject is performing physical activity, compare one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of one or more individuals who have previously performed the physical activity.
  • performing the physical activity includes performing an athletic event, and the one or more individuals are professional athletes who compete in the athletic event.
  • the state of the subject includes a stress state
  • the score includes a stress score
  • the stress state of the subject is associated with emotional stress
  • deriving the score includes determining a level of emotional stress experienced by the subject by comparing one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject to a threshold.
  • the biofeedback device also includes an audio input device.
  • determining the level of emotional stress experienced by the subject is based at least in part on audio data provided to the processor by the audio input device.
  • the audio data includes one or both of environmental noise and a tonality of the subject's voice.
  • determining the level of emotional stress experienced by the subject includes analyzing the environmental noise to determine whether the subject is in an environment attributed to an increased emotional stress level
  • determining the level of emotional stress experienced by the subject includes analyzing the tonality of the subject's voice to determine whether the subject is in a confrontational situation attributed to an increased emotional stress level.
  • the processor is also configured to cause the display to display the stress score.
  • the processor causes the display to display the stress score when the subject is determined to be in the stress state.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes deriving information about a psychological state of the subject from the processed data.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the operations also include deriving information about a psychological state of the subject from the processed data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to derive information about a psychological state of the subject from the processed data.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the psychological state of the subject includes a state of stress.
  • the method also includes determining one or more of a blood pressure, a heart rate, and a heart rate variability of the subject based on the datasets.
  • the method also includes deriving information about the state of stress of the subject based on one or more of the determined blood pressure, heart rate, and heart rate variability of the subject.
  • the method also includes correlating a level of stress of the subject to an amount of ultraviolet light that the subject has been exposed to.
  • deriving the information includes inferring a relationship between at least some of the processed data and one psychological state of the subject.
  • the method also includes inferring an existence of a second psychological state of the subject by comparing other processed data with the processed data related to the one psychological state.
  • the one psychological state includes a state of relatively lower stress.
  • the one psychological state includes a baseline state of the subject, and the relationship between at least some of the processed data and the one psychological state is inferred prior to the subject performing a polygraph test.
  • the psychological state includes a malicious intent.
  • the psychological state includes lying.
  • a device worn by the subject acquires the data.
  • deriving information about the psychological state of the subject includes determining a baseline state of the subject based on one or more of a blood pressure, a heart rate, a heart rate variability, a respiratory rate, and an electrical skin impedance.
  • the device is worn by the subject for an extended period of time to determine the baseline state of the subject.
  • the device is continuously worn by the subject for more than one day.
  • the processor is also configured to determine one or more of a blood pressure, a heart rate, and a heart rate variability of the subject based on the received data.
  • the processor is also configured to derive information about a state of stress of the subject based on one or more of the determined blood pressure, heart rate, and heart rate variability of the subject.
  • the device also includes an ultraviolet light sensor configured to measure an amount of ultraviolet light that the subject is exposed to.
  • the processor is also configured to correlate a level of stress of the subject to an amount of ultraviolet light that the subject has been exposed to.
  • a method in another aspect, includes processing data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • a method in another aspect, includes processing data in a dataset that represents time-varying information about motion of a subject acquired at a location of the subject. The method also includes determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the method also includes, based on the first and second datasets, determining at least two parameters of the subject, the parameters selected from the group consisting of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology.
  • the method also includes determining a biometric signature of the subject, the biometric signature represented by a multi-dimensional space that is defined by at least two axes, each axis corresponding to at least one of the determined parameters.
  • the method also includes determining whether the biometric signature was captured from a subject who is an expected subject by analyzing features of the biometric signature.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a dataset that represents time-varying information about motion of a subject acquired at a location of the subject.
  • the operations also include determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the operations also include determining at least two parameters of the subject based on the first and second datasets. The parameters are selected from the group consisting of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology.
  • the operations also include determining a biometric signature of the subject.
  • the biometric signature is represented by a multi-dimensional space that is defined by at least two axes. Each axis corresponds to at least one of the determined parameters.
  • the operations also include determining whether the biometric signature was captured from a subject who is an expected subject by analyzing features of the biometric signature.
  • the biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to determine whether one or more segments of the data were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • Implementations can include one or more of the following features.
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the determining includes analyzing other biometric data.
  • the other biometric data includes one or more of electrical skin impedance, respiratory rate, heart rate, heart rate variability, PPG morphology, and vocal sound frequency of the subject.
  • analyzing the other biometric data includes determining whether the subject is under distress.
  • the determining includes analyzing confidential information provided by the subject.
  • the confidential information includes one or more of a password, a personal identification number, and a predefined gesture.
  • the analyzing includes comparing morphological features of different segments of biometric data.
  • the method also includes taking an action when it is determined that one or more of the segments were captured from a subject other than the expected subject.
  • taking an action includes prompting the subject to provide confidential information to authenticate the subject as the expected subject.
  • the expected subject is a subject associated with a particular device that captures the data segments at a location on the expected subject.
  • the determining includes taking account of one or both of a changing level of activity and a changing heart rate of the subject.
  • the method also includes sending information to a device upon determining that the subject is the expected subject.
  • the device is a payment gateway, and the information includes a payment authorization.
  • the device is a lock
  • the information causes a lock to unlock.
  • causing the lock to unlock is also based on a location of the subject.
  • the method also includes sending information to a device upon determining that the subject is under distress.
  • the subject is determined to be under distress if one or more of a heart rate, a blood pressure, and a respiratory rate of the subject surpasses a threshold.
  • the device is a payment gateway
  • the information includes instructions for the payment gateway to prevent the subject from accessing the payment gateway.
  • the device is a lock
  • the information includes instructions for the lock to remain locked.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the method also includes determining whether one or more segments of the datasets were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the method also includes determining a pulse transit time (PTT) based on the datasets, the PTT representing a transit time of a pulse pressure wave within the subject.
  • PTT pulse transit time
  • the method also includes determining a blood pressure of the subject based on the datasets.
  • the determining includes analyzing other biometric data.
  • the other biometric data includes one or more of electrical skin impedance, respiratory rate, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, PPG morphology, and vocal sound frequency of the subject.
  • analyzing the other biometric data includes determining whether the subject is under distress.
  • the morphological features include differences in blood pressure at specific times during each of the data segments.
  • the specific times include times of peaks or valleys in blood pressure during the data segments.
  • the morphological features include differences in blood pressure at successive peaks of blood pressure, successive valleys of blood pressure, or successive peaks and valleys of blood pressure.
  • determining whether one or more segments of the data were captured from a subject other than an expected subject includes analyzing confidential information provided by the subject.
  • the confidential information includes one or more of a password, a personal identification number, and a predefined gesture.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor
  • the processor is also configured to take an action when it is determined that one or more of the segments were captured from a subject other than the expected subject.
  • taking an action includes prompting the subject to provide confidential information to authenticate the subject as the expected subject.
  • the motion sensor is also configured to determine when a subject performs the predefined gesture.
  • the biofeedback device also includes a transceiver configured to send information to a device upon determining that the subject is the expected subject.
  • the device is a payment gateway, and the information includes a payment authorization.
  • the device is a lock
  • the information causes a lock to unlock.
  • the biofeedback device also includes a location module, and causing the lock to unlock is also based on a location of the subject as determined by the location module.
  • the transceiver is also configured to send information to a device upon determining that the subject is under distress.
  • the device is a payment gateway
  • the information includes instructions for the payment gateway to prevent the subject from accessing the payment gateway.
  • the device is a lock
  • the information includes instructions for the lock to remain locked.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes providing, based on the data, information about a medication regimen of the subject.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include providing, based on the data, information about a medication regimen of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to provide, based on the data, information about a medication regimen of the subject.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes determining, based on the data, that the subject has potentially missed a dose of a medication.
  • the method also includes providing a notification indicating that the subject has potentially missed the dose of the medication.
  • determining that the subject has potentially missed a dose of a medication includes determining that a blood pressure of the subject has crossed a threshold.
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • PTT pulse transit time
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • determining that the subject has potentially missed a dose of a medication includes determining that a heart rate of the subject has crossed a threshold.
  • determining that the subject has potentially missed a dose of a medication includes determining that a respiratory rate of the subject has crossed a threshold.
  • the method also includes determining, based on the data, a reaction of the subject to a medication.
  • the method also includes providing a recommended medication regimen of the medication based on the reaction of the subject to the medication.
  • the recommended medication regimen includes one or more recommended dosage timings.
  • the recommended medication regimen also includes one or more recommended dosage amounts. Each of the recommended dosage amounts corresponds to one of the dosage timings.
  • determining a reaction of the subject to a medication includes determining a blood pressure of the subject.
  • the blood pressure of the subject is determined periodically.
  • the recommended dosage timings and amounts are determined so as to maintain a blood pressure of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a heart rate of the subject.
  • the heart rate of the subject is determined periodically.
  • determining a reaction of the subject to a medication includes determining a regularity of a heart rate of the subject.
  • the recommended dosage timings and amounts are determined so as to maintain a heart rate of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a cardiac output of the subject.
  • the recommended dosage timings and amounts are determined so as to maintain a cardiac output of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a temperature of the subject.
  • the recommended dosage timings and amounts are determined so as to maintain the temperature of the subject within a defined range.
  • the recommended dosage timings and amounts are determined so as to maintain a heart rate of the subject within a defined range.
  • determining a reaction of the subject to a medication includes determining a respiratory rate of the subject.
  • the respiratory rate of the subject is determined periodically.
  • the recommended dosage timings and amounts are determined so as to maintain a respiratory rate of the subject within a defined range.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the processor is also configured to determine, based on the data, that the subject has potentially missed a dose of a medication and provide a notification indicating that the subject has potentially missed the dose of the medication.
  • the processor is also configured to determine, based on the data, a reaction of the subject to a medication and provide a recommended medication regimen of the medication based on the reaction of the subject to the medication.
  • the recommended medication regimen includes one or more recommended dosage timings.
  • the recommended medication regimen also includes one or more recommended dosage amounts, each of which corresponds to one of the dosage timings.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject
  • a method in another aspect, includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects acquired at a location of each of the subjects. The method also includes processing data that represents time-varying information about motion of the two or more subjects acquired at the location on each of the subject. The method also includes providing information to a user that reports relative states of the subjects.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects acquired at a location of each of the subjects.
  • the operations also include processing data that represents time-varying information about motion of the two or more subjects acquired at the location on each of the subject.
  • the operations also include providing information to a user that reports relative states of the subjects.
  • a biofeedback device configured to be worn by two or more subjects includes a light source configured to emit light toward the skin of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to provide information to a user that reports relative states of the subjects.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subjects includes photoplethysmographic (PPG) data and the information about motion of the subjects includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at single locations of each of the subjects.
  • the data is acquired by devices worn by the subjects.
  • the devices are mobile and do not reduce mobility of the subjects.
  • the devices process the data.
  • the single location of each of the subjects is an arm of the subject.
  • the single location is a wrist of the subject.
  • the relative states of the subjects are determined based on one or more of respiratory rates, heart rates, and blood pressures of the subjects.
  • the relative states of the subjects are determined by comparing one or more of the respiratory rates, the heart rates, and the blood pressures of the subjects to respective threshold values.
  • devices worn by the subjects acquire the data, and the respiratory rates, the heart rates, and the blood pressures of the subjects are determined according to the data.
  • the method also includes managing the subjects based on the relative states.
  • the method also includes assigning tasks to the subjects based on the relative states of the subjects.
  • one or more of the subjects are put into an athletic contest according to the relative states of the subjects.
  • a subject is put into the athletic contest if one or more of the respiratory rate, the heart rate, and the blood pressure of the subject is above a respective threshold.
  • one or more of the subjects are assigned particular combat tasks according to the relative states of the subjects.
  • a subject is assigned a particular combat task if one or more of the respiratory rate, the heart rate, and the blood pressure of the subject is above a respective threshold.
  • the relative states include one or more of relative psychological states, relative physical states, and relative states of readiness.
  • the two or more subjects are managed based on the relative states.
  • the processor is also configured to assign tasks to the subjects based on the relative states of the subjects.
  • one or more of the subjects are put into an athletic contest according to the relative states of the subjects.
  • one or more of the subjects are assigned particular combat tasks according to the relative states of the subjects.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject while the subject is sleeping. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject while the subject is sleeping. The method also includes determining, based on the data, information about a characteristic of the subject's sleep.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject while the subject is sleeping.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject while the subject is sleeping.
  • the operations also include determining, based on the data, information about a characteristic of the subject's sleep.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to determine, based on the data, information about a characteristic of the subject's sleep.
  • Implementations can include one or more of the following features.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes generating a reduced set of data by excluding data associated with non-sleep periods of the subject.
  • a period of time is identified as a non-sleep period based on gross motion data of the subject.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is above a threshold.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is substantially irregular.
  • a period of time is identified as a sleep period based on gross motion data of the subject.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is below a threshold.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is substantially flat.
  • the method also includes determining a start and an end of the sleep period.
  • determining the start of the sleep period includes identifying a time when the gross motion data falls below a threshold, and determining the end of the sleep period includes identifying a time when the gross motion data rises above a threshold.
  • the method also includes calculating a property of the sleep of the subject based on the data.
  • the property is associated with one or more of heart rate, heart rate variability, activity level, respiratory rate, and blood pressure of the subject.
  • one or more of the heart rate, the heart rate variability, the activity level, the respiratory rate, and the blood pressure of the subject are determined based on the processed data.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • determining the blood pressure of the subject includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. Determining the blood pressure of the subject also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. Determining the blood pressure of the subject also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject, wherein the PTT is related to an internal pressure of one or more blood vessels of the subject. Determining the blood pressure of the subject also includes determining the blood pressure of the subject based on the internal pressure of the one or more blood vessels.
  • PTT pulse transit time
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • the characteristic of the subject's sleep is determined based on the property.
  • the characteristic includes sleep apnea.
  • determining that the subject is experiencing sleep apnea includes identifying a simple signal in a heart rate signal of the subject that is acquired during a sleep period of the subject.
  • determining that the subject is experiencing sleep apnea includes identifying recurring simple signals in the heart rate signal of the subject.
  • the simple signals recur at least every two minutes during the sleep period of the subject.
  • the characteristic includes a quality of the sleep, including one or more of a sleep duration, a sleep latency, a sleep staging, a number of disturbances, and a number of tosses and turns.
  • determining information about a characteristic of the subject's sleep includes determining the sleep duration of the subject.
  • determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • the information related to the one or more sleep rest periods includes a time associated with a beginning of each sleep rest period, a time associated with an end of each sleep rest period, gross motion data of the subject during each sleep rest period, and heart rate data of the subject during each sleep rest period.
  • determining information about a characteristic of the subject's sleep includes determining the sleep latency of the subject.
  • determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • determining information about a characteristic of the subject's sleep includes determining the sleep staging of the subject.
  • determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • the method also includes alerting the subject when the sleep duration exceeds a threshold while the subject is in a light sleep stage.
  • the characteristic includes a sleep disorder.
  • the characteristic includes a level of nocturnal dip of blood pressure.
  • the characteristic includes a sleep period.
  • the method also includes deriving a value representing an evaluation of a state of the subject based on the data.
  • the state of the subject includes a health-related state.
  • the state of the subject is associated with one or more of sleep quality, sleep duration, sleep latency, and sleep staging.
  • the value is provided to the subject or to another party.
  • the value is derived based on data related to motion of the subject.
  • the data is acquired by a device that is worn by the subject and that displays the value.
  • the device derives the value.
  • the device provides the data to a remote device that derives the value.
  • the method also includes processing data that represents information about an amount of ultraviolet light that the subject has been exposed to.
  • the method also includes correlating a characteristic of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • the method also includes correlating a quality of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • the method also includes correlating a duration of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • the processor is also configured to identify a period of time as a non-sleep period based on gross motion data of the subject measured by the motion sensor.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is above a threshold.
  • identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is substantially irregular.
  • the processor is also configured to identify a period of time as a sleep period based on gross motion data of the subject measured by the motion sensor.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is below a threshold.
  • identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is substantially flat.
  • the processor is also configured to determine a start and an end of the sleep period.
  • determining the start of the sleep period includes identifying a time when the gross motion data falls below a threshold, and determining the end of the sleep period includes identifying a time when the gross motion data rises above a threshold.
  • the processor is also configured to calculate a property of the sleep of the subject based on the data.
  • the characteristic of the subject's sleep is determined based on the property, and the characteristic of the subject's sleep includes sleep apnea.
  • the processor is also configured to determine that the subject is experiencing sleep apnea. Determining that the subject is experiencing sleep apnea includes identifying a simple signal in a heart rate signal of the subject that is acquired during a sleep period of the subject.
  • determining that the subject is experiencing sleep apnea includes identifying recurring simple signals in the heart rate signal of the subject.
  • the simple signals recur at least every two minutes during the sleep period of the subject.
  • the characteristic includes a quality of the sleep, including one or more of latency to sleep, number of disturbances, and number of tosses and turns.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the data in the first and second datasets is acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to process the data to derive a measure of alertness of the subject.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the situation includes one in which a likelihood of harm to one or more human lives is increased if the alertness of the subject is below the predetermined amount.
  • the situation is one in which a likelihood of damage to one or more properties is increased if the alertness of the subject is below the predetermined amount.
  • the situation is one in which a likelihood of economic damage is increased if the alertness of the subject is below the predetermined amount.
  • the situation is one or more of air traffic control, intelligence analysis, vehicle driving, machinery driving, security guarding, baggage screening, and aircraft piloting.
  • the method also includes using the processed data to derive a measure of alertness of the subject.
  • the measure of alertness of the subject is based on one or more of a heart rate, a respiratory rate, a blood pressure, and an activity level of the subject.
  • the method also includes activating an alarm on a device worn by the subject if the measure of alertness of the subject falls below a threshold.
  • the device worn by the subject acquires the data.
  • the device worn by the subject processes the data.
  • the method also includes causing a speed of a vehicle being operated by the subject to be decreased if the measure of alertness of the subject falls below a threshold.
  • the method also includes causing an alarm in a vehicle being operated by the subject to be activated if the measure of alertness of the subject falls below a threshold.
  • the method also includes causing a device being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • the method also includes causing an operation switch of a vehicle being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • the method also includes assigning a task to the subject based on the measure of alertness.
  • the subject is put into an athletic contest if the measure of alertness of the subject is above a threshold.
  • the subject is assigned a particular combat task if the measure of alertness of the subject is above a threshold.
  • the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive and process the data from the motion sensor.
  • the biofeedback device also includes a transceiver configured to provide one or both of the processed data and the measure of alertness.
  • the transceiver is also configured to cause a speed of a vehicle being operated by the subject to be decreased if the measure of alertness of the subject falls below a threshold.
  • the transceiver is also configured to cause an alarm in a vehicle being operated by the subject to be activated if the measure of alertness of the subject falls below a threshold.
  • the transceiver is also configured to cause a device being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • the transceiver is also configured to cause an operation switch of a vehicle being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • the processor is also configured to assign a task to the subject based on the measure of alertness.
  • the subject is put into an athletic contest if the measure of alertness of the subject is above a threshold.
  • the subject is assigned a particular combat task if the measure of alertness of the subject is above a threshold.
  • operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • a method in another aspect, includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes predicting a medical event of the subject based on the processed data.
  • one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject.
  • the operations also include predicting a medical event of the subject based on the processed data.
  • the device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject.
  • the optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject.
  • the device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor.
  • the processor is also configured to predict a medical event of the subject based on the data.
  • Implementations can include one or more of the following features.
  • the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • the data is acquired continuously.
  • the data is acquired at a frequency of at least 16 Hz.
  • the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • the data is acquired at a single location of the subject.
  • the data is acquired by a device worn by the subject.
  • the device is mobile and does not reduce a mobility of the subject.
  • the device processes the data.
  • the single location is an arm of the subject.
  • the single location is a wrist of the subject.
  • the method also includes alerting a caregiver when a medical event of the subject is predicted.
  • processing the data includes determining one or more of heart rate, heart rate variability, blood pressure, blood pressure variability, body temperature, skin temperature, vocal tonality, electrical skin impedance, respiratory rate, blood oxygen level, stroke volume, cardiac output, MoCG morphology, and PPG morphology of the subject.
  • predicting the medical event of the subject includes determining whether a heart rate of the subject satisfies a threshold.
  • the medical event is tachycardia.
  • determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • predicting the medical event of the subject includes determining whether a heart rate variability of the subject satisfies a threshold.
  • the threshold is based on whether the subject experiences arrhythmia.
  • determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • the reference points are local maxima or local minima.
  • the reference points are peaks or valleys.
  • predicting the medical event of the subject includes determining whether a blood pressure of the subject satisfies a threshold.
  • the medical event is hypertension.
  • predicting the medical event of the subject includes determining a rate of change of a blood pressure of the subject.
  • the medical event is a stroke
  • a stroke is predicted if the rate of change of the blood pressure of the subject is positive and above a threshold.
  • the medical event is abnormal heart function
  • abnormal heart function is predicted if the rate of change of the blood pressure of the subject is negative and below a threshold.
  • the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject.
  • the method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject.
  • the method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • PTT pulse transit time
  • the blood pressure of the subject is determined based on the PTT.
  • the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject.
  • the processor is also configured to receive data from the motion sensor.
  • the device also includes a transceiver configured to alert a caregiver when a medical event of the subject is predicted.
  • the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • Blood pressure and/or other biometric parameters may be measured based on continuously acquired data, without the need for cuffs, pressure points or electrodes.
  • Continuous acquiring data means acquiring data at a sufficient frequency (e.g., a sufficient number of times per second) to allow for the derivation of the parameters described herein from that data.
  • the data can, for example, be collected at a frequency ranging from 16 Hz to 256 Hz. In certain implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • Vital signs can be measured at one location, using a comfortable and unobtrusive device.
  • the disclosed technology may be integrated with third party devices (for example, mobile devices) thereby allowing for using external sensors such as motion detectors and light sensors disposed in the third party devices.
  • FIG. 1A illustrates pulse transit time (PTT) calculation using an example BCGB plot, and a photoplethysmogram (PPG) plot.
  • PTT pulse transit time
  • PPG photoplethysmogram
  • FIGS. 1B and 1C are example block diagrams of a device that performs biometric measurements based on MoCG and PPG data.
  • FIGS. 1D-1F are plots generated based on data collected using sensors of the device of FIGS. 1B and 1C .
  • FIG. 1G illustrates side and top views of an example configuration of optical sensors that can be used in the device of FIGS. 1B and 1C .
  • FIGS. 2A-2C , 3 , and 4 illustrate plots generated based on data collected by the sensors of the device of FIGS. 1B and 1C .
  • FIGS. 5A-5E illustrate examples of cardiac signals.
  • FIGS. 6A-6C are flowcharts depicting example processes for biometric authentication.
  • FIG. 7A is a flowchart depicting an example of a process for calculating motion pulse transit time (MPTT).
  • FIG. 7B is a flowchart depicting an example of another process for calculating MPTT.
  • FIG. 8 shows examples of heat maps that relate to data collected from the motion sensors of the device of FIGS. 1B and 1C , and are used in determining weights for data corresponding to accelerometers oriented along different axes.
  • FIGS. 9 , 10 A- 10 C, 11 A, and 11 B illustrate plots used in calculating MPTT.
  • FIG. 12 is a flowchart depicting an example of a process for calibration of the device of FIGS. 1B and 1C .
  • FIGS. 13 and 14 illustrate examples related to calibration of the device of FIGS. 1B and 1C .
  • FIGS. 15A-15D and 16 A- 16 C show examples of plots used in detecting various heart conditions.
  • FIG. 17 is a flowchart of an example of a process for detecting arrhythmia.
  • FIG. 18 is an example of a plot of arterial stiffness vs. exercise frequency.
  • FIGS. 19A and 19B are examples of plots used in determining sleep quality and/or sleep disorders.
  • FIG. 20 is an example of a screenshot for showing sleep quality.
  • FIG. 21 is a flowchart depicting an example of a process for determining sleep quality.
  • FIG. 22 is an example of a screenshot for showing a fitness-related metric.
  • FIG. 23 is an example of a screenshot for showing a stress-related metric.
  • FIG. 24 is a flowchart depicting an example of a process for deriving information about a psychological state of a subject.
  • FIG. 25 is a flowchart depicting an example of a process for determining a metric for quality of care provided at a care facility.
  • FIG. 26 shows an example where the technology described is used by emergency responders.
  • FIG. 27 is a flowchart depicting an example of a process for determining relative states of multiple subjects.
  • FIG. 28 is a flowchart depicting an example of a process for predicting a medical event.
  • FIG. 29 is a flowchart depicting an example of a process for determining information about a medication regimen.
  • FIG. 30 shows an example where the technology is used at a medical or caregiving facility.
  • FIG. 31 shows an example of the technology being used with a proximity system.
  • FIGS. 32A and 32B show an example implementation of the device of FIGS. 1B and 1C in the form of a wearable watch.
  • FIG. 33 shows an example of an environment where the technology is used for access control.
  • FIG. 34 shows an example where the technology is used for allowing a user to access/operate a vehicle of other machinery.
  • FIG. 35 shows an example where the technology is used for controlling gaming and/or entertainment systems.
  • FIG. 36 shows an example where the technology is used for controlling various devices connected to a network.
  • FIG. 37 is an example of a screenshot that displays and allows sharing of blood pressure results.
  • FIG. 38 is a flowchart depicting an example of a process for controlling remote devices using the technology described in this document.
  • FIGS. 39A-39C show examples of user interfaces of an application that makes data collected by the device of FIGS. 1B and 1C available to a user.
  • FIG. 40 is an example of a block diagram of a computer system.
  • This document describes technology for determining pulse transit time (PTT) of blood based on motion data such as motioncardiogram (MoCG) data (which is related to, and also referred to in this document as ballistocardiogram (BCG) data) and optical data such as photoplethysmographic (PPG) data.
  • PTT pulse transit time
  • MoCG motioncardiogram
  • BCG ballistocardiogram
  • PPG photoplethysmographic
  • MPTT motion pulse transit time
  • the terms PTT and MPTT may be used interchangeably.
  • This document also describes technology for performing various biometric measurements (e.g., blood pressure, respiratory rate, blood oxygen level, stroke volume, cardiac output, arterial stiffness, and temperature) based on the MoCG data and the PPG data.
  • the MoCG is an example of a motion of the subject.
  • MoCG is a pulsatile motion signal of the body measurable, for example, by a motion sensor such as an accelerometer or a gyroscope.
  • the pulsatile motion signal results from a mechanical motion of portions of the body that occurs in response to mechanical motion of the heart.
  • the pulsatile motion signal can result from mechanical motion of portions of the body that occurs in response to blood being pumped during a heartbeat. This motion is a mechanical reaction of the body to the internal pumping of blood and is externally measurable.
  • the MoCG signal therefore corresponds to, but is delayed from, the heartbeat.
  • the MoCG signal recorded at a given portion of the body therefore represents the motion of the blood due to a heartbeat, but is delayed from, the heart's electrical activation (e.g. when the ventricles are electrically depolarized).
  • PPG data is data optically obtained via a plethysmogram, a volumetric measurement of the vasculature.
  • PPG data can be obtained using an optical device which illuminates the skin and measures changes in light absorption. With each cardiac cycle the heart pumps blood resulting in a pressure pulse wave within the vasculature. This causes time-varying changes in the volume of the vasculature. The changes can be detected, for example, by illuminating the skin with light from a light-emitting diode (LED) and then measuring the amount of light either transmitted or reflected to a detector such as a photodiode. Each cardiac cycle is therefore represented as a pattern of crests and troughs. The shape of the PPG waveform differs from subject to subject, and varies with the location and manner in which the waveform is recorded.
  • LED light-emitting diode
  • FIG. 1A illustrates pulse transit time (PTT) calculation using an example BCG plot 102 , and a photoplethysmogram (PPG) plot 103 .
  • BCG plot 102 can be analyzed to determine points at which a pulse (or pressure wave) originates at a first location on the body. The BCG however, may be measured at a second location on the body.
  • the points e.g., local maxima
  • 108 a, 108 b and 108 c in the BCG plot 102 may represent time points at which corresponding pulses originate at or near the chest. These points are often referred to in this document as pulse origination points.
  • the time of arrival of the pulse at a second location can be determined from PPG data obtained at the second location.
  • the PPG data can be measured at the wrist using one or more optical sensors.
  • Light from the optical sensors i.e., the light sources such as LEDs of the optical sensors
  • the reflected light which is modulated by blood volume changes underneath the skin
  • the output of the photo-detector may be amplified by an amplifier before being converted to a digital signal (for example, by an analog to digital converter (ADC)) that represents the PPG.
  • ADC analog to digital converter
  • the plot 103 of FIG. 1A represents PPG data that can be used to determine the arrival time of the pulses at the wrist.
  • the maximum slope points 109 a, 109 b, and 109 c represent the arrival times of the pulses that originated at the chest at time points represented by 108 a, 108 b, and 108 c, respectively. These points may in general be referred to in this document as pulse arrival points 109 .
  • the plot 103 is synchronized with the BCG plot 102 such that the PTT (or MPTT) 113 between the chest and the wrist can be determined as a time difference between the originating point at the chest and the corresponding arrival point at the wrist.
  • PTT or MPTT
  • the time difference between 108 b and 109 b represents the PTT 113 .
  • the time difference between 108 a and 109 a, or the time difference between 108 c and 109 c can be used in determining the PTT 113 .
  • the technology described in this document allows for determination of PTT from MoCG (or BCG) and PPG data measured at substantially the same location on a human body (e.g., the wrist). This includes identifying, from the PPG data, a time point (e.g., the time points 109 ) at which a pulse wave arrives at the location, identifying, from the BCG data, a time point (e.g., the time points 108 ) at which the pulse originated at a different location on the body (e.g., the heart) from the MoCG data, and determining the PTT 113 as a difference between the two identified time points.
  • a time point e.g., the time points 109
  • a time point e.g., the time points 108
  • FIG. 1B is a block diagram of an example of a device 100 that performs biometric measurements based on MoCG and PPG data.
  • the biometric measurements can be used for monitoring health related parameters, as well as in diagnosing conditions and predicting an onset of such conditions.
  • the device 100 can be a wearable device that a subject can wear on the body.
  • the device 100 can be disposed in a wearable watch, bracelet, anklet, armband, chest-patch, or belt.
  • An example implementation of the device in the form of a wearable watch 3200 is shown in FIGS. 32A and 32B .
  • the watch 3200 includes a case 3202 that is configured to hold the internal components of the watch, including light sources 3204 a, 3204 b, an optical sensor 3206 , a motion sensor 3208 , a processor 3210 , and an ultraviolet light sensor 3212 .
  • the device may also be disposed as a part of a garment worn by the subject.
  • the device 100 may also be disposed in a rug or mat (e.g., a bathroom mat or a shower mat).
  • the device 100 may also be disposed in a separate device carried or worn by the subject.
  • the device 100 can be disposed internally or externally in a watch or mobile device used by the subject.
  • the device 100 can include a transceiver that is configured to communicate wirelessly with another device to perform a biometric monitoring process. For example, data collected and/or computed by the device 100 may be transmitted to an application executing on a mobile device for additional analysis or storage.
  • alerts and messages may be transmitted from a server or mobile device for display on the device 100 .
  • Devices similar to the device 100 are described in U.S. patent application Ser. Nos. 13/166,388 and 13/803,165, and 61/660,987, the contents of which are incorporated by reference herein.
  • Various combinations of the operations described in this document may also be performed by a general purpose computing device that executes appropriate instructions encoded on a non-transitory computer readable storage device such as an optical disk, a hard disk, or a memory device.
  • the device 100 can be configured to make MoCG and PPG measurements either directly (such as when implemented as a part of an armband, wristband, chest patch, undergarment) or indirectly (such as when implemented as part of a mobile device) from a portion of the body proximate to the location of the device.
  • the MoCG data can be measured using one or more motion sensors 105 such as an accelerometer or a gyroscope.
  • the motion sensors 105 include multiple accelerometers (e.g., one for each of the x, y, and z axes) and/or multiple gyroscopes (e.g., one each for measuring tilt, rotation, and yaw). Even though FIG.
  • the device 100 can include one or more sensors to measure or detect ambient conditions.
  • sensors can include, for example, a microphone (e.g., to measure environmental noise), an altimeter, a humidity sensor, a GPS device (for determining geographical location), and an ultraviolet light sensor (e.g. to detect level of sun exposure).
  • the device 100 can be configured to warn the user (for example, by displaying a message) if a measured, derived, or inferred health parameter is outside an acceptable range for the parameter.
  • health parameters can include (without being limited to the following) measured parameters such as heart rate, respiratory rate, or arrhythmia, derived parameters such as blood pressure, stroke volume, or arterial stiffness, and inferred parameters such as mood, stress level, or sleep deprivation.
  • the level of sun exposure (as measured by the ultraviolet light sensor) can be correlated to the mood or stress level of the user, and related suggestions and recommendations can be provided accordingly. For example, if sun exposure above a certain threshold level is known to decrease stress for a particular user, the user may be asked to increase sun exposure during a period when a stress level detected by the device 100 is high.
  • environmental sounds captured by the microphone can be used to contextualize or interpret vital signs data captured using the device 100 .
  • a tonality e.g., amplitude and/or frequency
  • a tonality e.g., amplitude and/or frequency
  • environmental noise can be detected during a user's commute to determine, for example, if, and to what extent driving (or rush hour subway) affects the user's health parameters.
  • the data captured by the microphone can be used to determine and/or confirm if that is attributable to environmental noise (e.g., snoring, or an alarm clock going off).
  • environmental noise e.g., snoring, or an alarm clock going off.
  • an unacceptable condition e.g., a user's increased stress level
  • construction activity determined, for example, via pile driver sounds captured by the microphone
  • the data captured using the motion sensors 105 includes both MoCG data and motion data associated with an activity of the subject.
  • the MoCG data can be filtered out from the combination using, for example, one or more band pass filters (BPF) 125 shown in FIG. 1C .
  • BPF band pass filters
  • a pass band of the BPF 125 can be designed to filter out constant components (e.g., acceleration due to gravity) and high frequency noise components.
  • a pass band of 3-12 Hz may be used for the band pass filter 125 .
  • multiple band pass filters may be used concurrently.
  • a filter with a 3-12 Hz passband and another filter with a 10-30 Hz passband can be used simultaneously to measure different parameters measurable in the two different bands.
  • the band pass filtered accelerometers can be combined to obtain an activity index 127 , which in turn is used in calculating appropriate weights 130 for obtaining updated biometric measurements 132 .
  • the activity index 127 can be less than a threshold value (e.g., 5) indicating, for example, that the band pass filtered accelerometer outputs can be used directly in determining the biometric measurements.
  • the activity index 127 can be higher (e.g., between 5 and 15), indicating that the band pass filtered accelerometer data may need to be adjusted (e.g., by applying a threshold) before being used in determining the biometric measurements.
  • the band pass filtered accelerometer data may be discarded as being unreliable.
  • weights 130 may be adjusted to reflect if and how the band pass filtered data from the accelerometer 105 is used. Examples of band pass filtered accelerometer data are illustrated in FIG. 1F , where plots 170 , 172 , and 174 represent outputs of accelerometers in the x, y, and z axes, respectively.
  • the PPG data can be measured using one or more optical sensors 110 .
  • the optical sensors 110 can include one or more light emitting diodes (LEDs) whose output can be controlled, for example, by a microcontroller.
  • Example configurations of the optical sensors 110 are depicted in FIG. 1G .
  • the optical sensors include a 7.5 mm 2 photodiode with two green LEDs placed within 1.5 mm of either side. The photodiode has an opaque optical shield surrounding the sides. The LEDs can have a peak wavelength of 525 nm and a viewing angle of 60 degrees.
  • the optical sensors 110 In operation, light from the optical sensors 110 (i.e., from the light sources such as LEDs of the optical sensors) is directed toward the skin of the subject, and the reflected light is modulated by blood flow underneath the skin.
  • the optical sensors 110 also include one or more photo-detectors (e.g., photodiodes) that receive the reflected light and provide a resulting signal to the microcontroller.
  • the resulting signal may be amplified by an amplifier before being converted to a digital signal (for example, by an analog to digital converter (ADC)) that is provided to the microcontroller.
  • the PPG signal is synchronized with the heartbeat and can therefore be used to determine the heart rate (HR) 112 of a wearer of the device. This is shown in additional detail in FIG. 1C .
  • the heart rate signal can be within a particular range of the spectrum (e.g., 0 to one half of the sampling frequency) of the PPG signal 150 , and can be isolated using, for example, a band pass filter (BPF) 154 .
  • BPF band pass filter
  • FIG. 1D An example of this is shown in FIG. 1D , where the plot 160 represents raw PPG data, and the plot 162 represents the output of the BPF 154 .
  • the pass band of the filter used for the example depicted in FIG. 1C is 0.4-4 Hz. As seen from FIG. 1C , the low frequency portion of the raw data, as well as the high frequency variations are filtered out in the output plot 162 .
  • a frequency between 75-85 Hz is chosen such that reasonable power saving is achieved, and the optical interferers are aliased into non-biological optical signal frequency range (>10 Hz). For example, if 80 Hz is chosen, then the aliased interferers would be at frequencies such as 20 Hz, 30 Hz, and/or 40 Hz.
  • An appropriate low pass filter e.g., a filter with cut-off frequency of 10 Hz
  • the filtered PPG signal can be interpolated accordingly in time domain without signal loss.
  • the output of the BPF 154 can be used to determine a heart rate 144 of the subject, and can also be combined with the output of the BPF 125 to determine other biometric parameters such as pulse transit time (MPTT) and stroke volume (SV) 145 , as well as other parameters 146 , including, for example, systolic and diastolic blood pressure, stroke volume (SV), and cardiac output (CO).
  • MPTT pulse transit time
  • SV stroke volume
  • CO cardiac output
  • calibration data 155 is used in computing one or more of the parameters 146 .
  • the calibration data 155 can include user-specific calibration information (e.g., constants used in equations) that may be used in computing one or more of the parameters 146 .
  • the calibration data 155 can be computed based on user-provided data. For example, a user may be asked to provide biographical data such as age, height, and weight for use in computing the calibration data. In some implementations, the user can be asked to provide his/her last-known blood-pressure data to determine one or more constants or parameters included in the calibration data 155 . In some cases, a medical professional may measure a user's blood pressure during set up of the device 100 .
  • calibration data 155 can be calculated based on a user action. For example, the user may be asked to hold the device 100 at or near chest level to equalize hydrostatic pressure effects and sense chest vibrations that are used in computing a calibration point. This way, a delay between a chest vibration and the time of arrival of a pulse wave at the wrist (if the device 100 is worn on the wrist) can be used to calibrate for blood pressure for a scenario where there is no height difference between the heart and the measuring point.
  • the calibration data 155 can include information related to skin tone calibration where LED intensity and amplifier gain are adjusted until an optimal DC level is reached.
  • calibration data 155 may be included in the calibration data 155 .
  • the calibration factors may be adjusted retroactively once the user enters valid calibration data.
  • Calibration data may also be imported from the user's medical records if, for example, the device is dispensed to the user by their medical professional.
  • a signal representing respiratory rate is typically within the 0-1 Hz range of PPG, and can be obtained using low pass filtering.
  • FIG. 1C where the PPG data 150 is passed through the low pass filter (LPF) 152 and optionally combined with the output of another LPF 135 (used for low pass filtering the MoCG data) to obtain biometric parameters such as sleep data 142 and respiratory rate 143 .
  • LPF low pass filter
  • FIG. 1E An example of determining the respiratory rate 143 from the PPG data 150 is illustrated in FIG. 1E .
  • the plot 166 represents the raw PPG data
  • the plot 168 shows the output of the LPF 152 representing the low frequency variations due to respiration.
  • biometric parameters may also be computed.
  • blood oxygenation SpO 2
  • the device 100 can also include a computing device 115 that can be configured to compute the biometric parameters, including, for example, blood pressure, respiratory rate, blood oxygen, stroke volume, cardiac output, and temperature.
  • an activity index 148 (which may be the activity index 127 , also shown in FIG. 1C ) can be used in determining a set of weights 147 used in calculating one or more of the biometric parameters 146 .
  • the heart rate information 144 is used in calculating one or more of the biometric parameters 146 .
  • the heart rate information 144 can be obtained from the PPG by detecting peaks and/or valleys in a graphical representation (e.g., the plot 162 shown in FIG. 1C ) of the PPG data 150 . This can include, for example, cross-correlating a portion of the PPG data (e.g., samples or data corresponding to a two second segment of the plot 162 of FIG. 1C ) with similar segments to produce a plot 180 (shown in FIG. 2A ) representing a series of cross-correlation products.
  • two-second segments from the plot 162 are cross-correlated with adjacent (possibly with some partial overlap) two-second segments to produce the plot 180 of FIG. 2A .
  • a particular cross correlation result (for example, one that produces the highest cross-correlation amplitude) can then be selected as a template.
  • the plot 178 shown in FIG. 2B is an example of a template.
  • the template can be adjusted to conform to a desired morphology, allowing for a beat to beat natural variation but discounting noise and non-heartbeat signals.
  • the selected template can then be correlated with segments from the plot 162 (shown in FIG. 1C ) to identify locations of correlation peaks. This is illustrated in FIG. 2A , where the plot 180 represents a series of such peaks.
  • the location of the correlation signal peaks can be used to direct a search for valleys, inflection points, and/or peaks within the band pass filtered PPG signal.
  • the inflection point in this case is defined as the point of maximum slope.
  • FIG. 2C illustrates an example of a PPG signal with identified peaks 181 , inflection points 183 and valleys 185 . For brevity, only a few of the peaks, inflection points, and valleys are marked using the reference numbers 181 , 183 , and 185 , respectively.
  • the instantaneous heart rate for each of the heartbeats can be plotted as shown in FIG. 3 , and can be used for other purposes such as computing other parameters and diagnosing conditions such as arrhythmia.
  • confidence levels associated with a calculated instantaneous heart rate can be determined before being used in any subsequent analysis. For example, if a person suddenly stands up from a sitting position, the instantaneous heart rate during the transition may shoot up. In some implementations, the rate of such rapid increase can include meaningful information. However, in some implementations, the information obtained during this transition may not be reliable as an indicator of the person's health status. Determining confidence levels associated with the computed heart rates can allow for discarding such outliers in subsequent analyses.
  • a given computed instantaneous heart rate can be compared, for example, to the average (or median) instantaneous heart rate over a predetermined time range (e.g., ⁇ 10 seconds) to determine whether the given instantaneous heart rate is reliable. If the given instantaneous heart rate differs (e.g., differs by more than a predetermined amount) from the average heart rate over the predetermined time range, the given instantaneous heart rate may be determined to be unreliable and therefore deweighed in subsequent computations. This allows for selecting reliable data points at the expense of a short latency (10 seconds in the above example).
  • the instantaneous heart-rate data as shown in FIG. 3 can be used for computing instantaneous heart-rate variability (HRV).
  • HRV heart-rate variability
  • FIG. 4 An example of HRV plotted against the corresponding heartbeats is shown in FIG. 4 .
  • the HRV data can be used to calculate a mean HRV for a set of heartbeats.
  • HRV data can be used in detecting conditions such as stress. For example, if the mean HRV is above a certain threshold, the subject may be determined to be under higher than usual stress.
  • HRV can be calculated by computing a variance of individual RR intervals (distance between the ‘R’ points of two consecutive QRS complex curves representing heartbeats, or alternatively the distance between valleys as shown in FIG. 2C ) from the average RR interval, over a period of time (e.g., 5 minutes).
  • the HRV can also be calculated in the frequency domain by comparing the power spectrum at very low frequencies (e.g., 0.04-0.15 Hz) with the power spectrum at slightly higher frequencies (e.g., 0.18 to 0.4 Hz).
  • Cardiac waveform morphology (also referred to as cardiac morphology) can be defined as the shape of a plot representing cardiac activity.
  • FIG. 5A represents a Wiggers diagram, which is a standard diagram used in cardiac physiology.
  • the shape of an electro-cardiogram (ECG) QRS complex 505 represents a morphology associated with a heartbeat.
  • Cardiac morphology depends on where and how cardiac activity is measured.
  • the morphology 510 of a phonocardiogram signal is different from that of the ECG morphology 505 .
  • the morphology associated with ventricular volume 515 is different from the morphology associated with ventricular pressure 520 .
  • FIG. 5B shows an example of a cardiac signal illustrating the morphology 525 associated with a PPG signal.
  • the morphology of a measured PPG signal can be checked to determine whether the measured PPG signal reliably represents heartbeats.
  • the relative separations of the peaks and valleys of the PPG signal are analyzed to determine whether the PPG signal reliably represents heartbeats. For example, a segment of the PPG signal can be determined to represent heartbeats if the following threshold condition is satisfied:
  • the condition above uses the range [0.25, 0.4] as an example, and other values can also be used.
  • the range (or threshold) could be determined for an individual user by using, for example, a range considered to be normal for the particular user.
  • the ratio from the above condition can vary within the range for various conditions of the subject. For example, the ratio can be at a low portion of the range during relaxation or sleep conditions, and at a high portion of the range during stressful events such as anger or fear.
  • other morphology checks can also be performed. For example, one morphology check can involve verifying that at a resting position, the user's systolic amplitude is approximately half of the diastolic amplitude. In some implementations, segments that do not satisfy the morphology check conditions are discarded from being used in biometric parameter computations.
  • Cardiac morphology also typically varies from one person to another due to, for example, unique heart beat signatures, breathing patterns and the unique ‘transmission line’ reflection signatures that are caused by the lengths and stiffness of an individual's arteries.
  • the main peak represents the first systolic peak which is followed by the secondary peak (or bump) representing the early diastolic peak (or reflection).
  • the time between the two peaks is also inversely proportional to arterial stiffness. This is easier to visualize from the first and/or the second derivatives of the PPG signal.
  • FIGS. 5C and 5D show examples of cardiac signals illustrating morphology based on PPG signals. In the example of FIG.
  • cardiac morphology can be used as a biometric identifier.
  • the device 100 described with reference to FIG. 1B can be configured to verify, based on a determined cardiac morphology, that the person wearing the device is the person for whom the device was assigned.
  • the determined cardiac morphology may also be used to uniquely identify a wearer of the device 100 .
  • biometric identification can be used, for example, in security and accessibility applications.
  • the device 100 can be configured to transmit a cardiac morphology based signature to a receiver (e.g., on a mobile phone, or at secured access point) to gain access to a secure resource.
  • a receiver e.g., on a mobile phone, or at secured access point
  • the wearer of the device may be identified based on the identified cardiac morphology of the wearer.
  • FIG. 5E shows examples of cardiac signals illustrating morphology for four different individuals, and illustrates how the cardiac morphology varies from one person to another.
  • multiple measured or derived parameters can be used as a biometric signature to uniquely identify a wearer.
  • a wearer can be identified based on a multi-dimensional space defined based on the measured or derived parameters. Because the parameters vary from one person to another, each person would be mapped to a different region within the multi-dimensional space.
  • a simple two-dimensional example of such a space can be defined, for example, by using heart rate as one axis and PPG shape as the second axis. Because the PPG shape and heart rate varies from one person to another, each person can typically be mapped to a separate region on the two-dimensional plane, and can be identified based on a location of the region. Higher dimensional spaces can be used for robustly identifying individuals among a large population.
  • parameters that can be used as axes for such spaces include cardiac morphology, heart rate, cardiac volume, PPG, or other parameters derived as a function of one or more of these parameters.
  • cardiac morphology can be combined with another parameter such as the MoCG morphology to achieve increased accuracy and/or resolution for bio-authentication applications. Examples of such applications include access control, digital wallet authorization, digital passwords/signature and environmental control.
  • MoCG data can be used to provide a MPTT signature and/or a MoCG signature waveform that may be unique to a particular user.
  • the biometric signature based user identification can be used in electronic payment applications.
  • the device 100 can be configured to communicate with a payment gateway using, for example, near field communication (NFC) or Bluetooth Low Energy (BLE) protocols.
  • the payment gateway can be configured to identify the user based on a corresponding biometric signature to initiate the payment process.
  • the payment gateway can communicate the identification information to a server that stores credit card or bank information of the corresponding user, for example, within a corresponding user account.
  • the server may initiate communications with the payment gateway that result in the credit card being charged or the bank account being debited.
  • the biometric signature based user identification is disabled if the device determines that the wearer is under distress.
  • the device can determine whether the wearer is under distress based on the wearer's vital signs (e.g., such as heart rate (HR), heart rate variability (HRV), blood pressure (BP), and respiratory rate). For example, if a wearer of the device is being forced to access a payment gateway, the device can detect the wearer's distress, as indicated by a sudden increase in HR, BP, and/or respiratory rate, and prevent him or her from accessing the payment gateway.
  • HR heart rate
  • HRV heart rate variability
  • BP blood pressure
  • the device can detect the wearer's distress, as indicated by a sudden increase in HR, BP, and/or respiratory rate, and prevent him or her from unlocking the lock.
  • the wearer's vital signs do not produce a match of the wearer's biometric signature when the wearer is under distress.
  • the wearer when the wearer is under distress, the multi-dimensional space defined based on the measured or derived parameters takes on a modified for that does not match the wearer's biometric signature. As such, a wearer under distress is unable to be identified by the biometric signature.
  • a wearer may exhibit signs that are synonymous with distress when the wearer is not in fact in distress. For example, if the wearer is involved in a non-dangerous and exciting event, such as buying an extremely expensive item, the wearer may experience an increase in HR, BP, and/or respiratory rate that may mistakenly be interpreted by the device as signs of distress. Thus, in some implementations, the wearer is provided with an opportunity to authenticate himself or herself in the event that the device detects false signs of distress or fails to identify the biometric signature of the wearer. The wearer can authenticate himself or herself using confidential information such as a password or a personal identification number that is communicated to the device or a server in communication with the device. In some implementations, the wearer can authenticate himself or herself by performing a private, predefined gesture. The one or more motion sensors of the device can be configured to determine whether the authenticating gesture matches the predefined gesture.
  • FIG. 6A An example process 600 of bio-authenticating a subject is shown in FIG. 6A .
  • a machine such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 600 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 602 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a determination can then be made of whether one or more segments of the dataset were captured from a subject other than an expected subject ( 604 ). The determination can be made by analyzing morphological features of the segments.
  • FIG. 6B Another example process 610610 of bio-authenticating a subject using information about motion of the subject is shown in FIG. 6B .
  • a machine such as a processor, that receives information from the motion sensor 105 of the device 100 can perform one or more steps of the process 610610 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a dataset that represents time-varying information about motion of a subject can be processed 612 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a determination can then be made of whether one or more segments of the dataset were captured from a subject other than an expected subject 614 ). The determination can be made by analyzing morphological features of the segments.
  • FIG. 6C Another example process 620 of bio-authenticating a subject is shown in FIG. 6C .
  • a machine such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 620 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • the machine may also use the calculated MPTT to further generate additional biometric measurements, the processes for which are discussed below.
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 622 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 624 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • at least two parameters of the subject can be determined ( 626 ).
  • the parameters can include one or more of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology.
  • a biometric signature of the subject can then be determined ( 628 ).
  • the biometric signature can be represented in a multi-dimensional space. Each axis can correspond to at least one of the determined parameters.
  • a determination can then be made of whether the biometric signature was captured from a subject who is an expected subject ( 630 ). The determination can be made by analyzing features of the biometric signature.
  • the biometric signature based user identification can be used in providing rewards and/or discounts to a user. For example, if the identified user is determined to be adhering to a particular exercise regimen, reward points or incentives such as discounts on particular products can be credited to the corresponding user account. Therefore, a user can be motivated to keep adhering to particular good practices to keep getting such rewards or discounts.
  • the information collected from the motion sensors 105 and the optical sensors 110 of FIG. 1B is used to calculate the MPTT, which can be used to further calculate the biometric parameters, such as blood pressure, stroke volume, etc.
  • An example process 700 for the MPTT calculation is shown in FIG. 7A .
  • a machine such as a processor, that receives the information from the motion sensors 105 and the optical sensors 110 can perform one or more steps of the process 700700 .
  • the machine may further provide the calculated results to, for example, the wearer, another person who is interested and authorized to receive the information, or another machine for further data processing or data storage.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • the machine may also use the calculated MPTT to further generate additional biometric measurements, the processes for which are discussed below.
  • the MoCG data for use in the MPTT calculation can be preprocessed ( 702 ).
  • the motion sensor or sensors e.g., the accelerometers
  • the motion sensor or sensors collect three sets of MoCG data along three orthogonal axes, x, y, and z, or along polar coordinates.
  • the three sets may be combined by selecting a weight, w x , w y , w z for each set and summing the weighted sets.
  • An example of the weight selection is shown in FIG. 8 , which illustrates two dimensional heat-map diagrams 800 , 802 , and 804 produced from power spectra of MoCG ensembles collected over time.
  • each row in the diagrams represents the power spectrum of a corresponding frame of MoCG data.
  • the colors represent the values of the energy level.
  • the weights w x , w y , w z can be assigned, using respective diagrams, based on the ratio of energy inside the heart rate range to the energy outside the heart rate range. If the power spectra is consistent across the different frames and/or is a harmonic of the already calculated heart rate (as illustrated in the diagram 804 ), the corresponding axis (the z axis in this example) is assigned a higher weight than the other axes.
  • the lines 806 , 808 , and 810 in FIG. 8 represent the first, second, and third harmonic, respectively of the measured heart rate in this time segment.
  • the MoCG data for the MPTT calculation is then calculated as the weighted sum of the three sets of MoCG data for the three axes.
  • a single axis can be selected (e.g., the axis with the highest weight) while ignoring the others.
  • only the z axis can be selected for the example shown in FIG. 8 .
  • axis selection can be performed by independently analyzing each axis and then combining the axes based on agreement of the candidate MPTT values. This may be done, for example, to avoid the calculation of a power spectrum signal without sacrificing on the accuracy.
  • a representative segment of the PPG data is generated ( 704704 ) for calculating the PPT.
  • the representative PPG segment is generated by averaging across multiple PPG segments of the same length.
  • FIG. 9 shows an example of the representative segment 904 of the PPG data used in determining the MPTT.
  • the representative segment 904 in this example is calculated by averaging across multiple segments 906 of equal duration.
  • the MoCG data is then analyzed using the representative segment ( 706 ) to calculate candidate MPTT values.
  • the representative segment can be calculated, for example, by averaging across multiple segments of equal duration arranged on the same time grid as a representative PPG signal.
  • a short segment of the MoCG data 902 (of equal duration to the representative segment 904 ) and the representative segment 904 are aligned in time, for example, by aligning inflection points (or valleys or peaks).
  • the length of the segment 904 and the corresponding MoCG data can be in the order of several seconds. In the example shown in FIG. 9 , the length of the segment 904 is 2 seconds. However segments of other lengths (e.g. 1.5 seconds-5 second) can also be used.
  • the representative segment is generated from data collected when a user is stationary, so that the data does not include a significant amount of unwanted noise.
  • the MPTT is measured as the difference between a time point to when a mid-systole portion 908 of the representative PPG segment 904 is measured, and a second time point representing the portion of MoCG data corresponding to the mid-systole.
  • the MoCG data represents the motion due to an actual heartbeat
  • the PPG data represents a pulse wave arrival recorded at a distance from the heart
  • the second time point generally occurs before to. Since a human body is not a rigid body, as defined by the laws of mechanics, the MoCG pulse arrives at the location where the device is located in a somewhat delayed (but constant per individual) fashion.
  • the portion of MoCG data corresponding to the mid-systole is typically manifested as a peak or valley in the MoCG data, and the MPTT can be determined by identifying the correct peak or valley corresponding to the mid-systole. While mid-systole is used as a reference point in this example, other portions of the cardiac morphology can also be used as the reference point. Based on a priori knowledge of typical MPTT, a predetermined time range relative to to is searched and the peaks and valleys detected within the predetermined time range are flagged as potential candidates for being the correct peak or valley corresponding to the mid-systole. Therefore, the difference between the time point corresponding to each such valley or peak and the time to represents a hypothetical MPTT. The correct MPTT value is determined based on the hypothetical MPTTs, as described using the example below.
  • the predetermined time range can be chosen to be, for example, between 10 to 400 ms, or another duration longer than an actual expected range.
  • seven peaks and valleys 910 , 912 , 913 , 914 , 916 , 918 , 920 , corresponding to time points t 1 , t 2 , t 3 , t 4 , t 5 , t 6 , t 7 , respectively, are identified on the MoCG plot 902 .
  • h 1 t 0 ⁇ t 1
  • h 2 t 0 ⁇ t 2
  • h 3 t 0 ⁇ t 3
  • h 4 t 0 ⁇ t 4
  • h 5 t 0 ⁇ t 5
  • h 6 t 0 ⁇ t 6
  • h 7 t 0 ⁇ t 7 .
  • a longer segment 1000 of the MoCG data (e.g., of 20 second duration, as shown in FIG. 10A ) is aligned with the corresponding PPG data, and the time points corresponding to mid-systoles in the PPG pulses are identified as reference points.
  • the MoCG data is checked at each time point preceding the reference points by h 1 (and possibly within a small time range around such time points) for the presence of a peak or valley. If a peak or valley is detected, it is flagged, and the total number of flagged peaks and valleys for the entire segment of MoCG data are recorded.
  • FIG. 10A illustrates a 20 second segment of MoCG data, along with flagged peaks and valleys corresponding to one particular hypothetical MPTT.
  • the flagged peaks and valleys are identified by markers (e.g., circles) 1008 , 1010 .
  • one of the hypothetical MPTTs is chosen as the true MPTT value, based on the recorded number of peaks or valleys.
  • the hypothetical MPTT that yields the maximum number of peaks or valleys can be chosen as the true MPTT value.
  • the hypothetical MPTTs can be combined together as a weighted sum to obtain the true MPTT value.
  • the weights can be assigned based on, for example, a ratio of the number of flagged peaks (or valleys) to the total number of reference points, and a consistency of the flagged peaks (or valleys) defined as a signal-to-noise ratio:
  • a weight for a given hypothetical MPTT can then be determined as:
  • Weight ((Number of flagged peaks)/(total reference points)) 2 *log(SNR)
  • the predetermined time range can be the duration for which a user wears the device 100 .
  • An example of such a histogram is shown in FIG. 11A , where the y axis represents a calculated MPTT value (averaged over 60 seconds), the y axis represents time, and the darkness of each point represents calculated confidence measure associated with the calculated MPTT.
  • the different horizontal sets represent candidate MPTT values for different time ranges.
  • a representative set can be selected from the candidate sets based on, for example, a priori knowledge about the expected MPTT, and/or confidence measures associated with the points in the set. For example, from FIG.
  • the sets 1111 or 1112 can be selected as the best representative sets for the MPTT, based on the confidence levels associated with the points (as represented by the darkness of the points), as well as a priori knowledge that the MPTT is expected to be within a 250-350 ms range. Therefore, more consistent (and hence reliable) estimates of MPTT values can be identified from the histograms, and the average MPTT value over the predetermined time range can be calculated ( 710 ), for example, as an average of the consistent MPTT values. Inconsistent MPTT values can be discarded from being included in computing the average MPTT. Other parameters such as average SV can also be calculated using similar plots. Before generating such plots, individual estimates of SV (in ml/heartbeat) can be calculated from the amplitude of the MoCG signal based on the fact that SV varies directly with the average amplitude of the MoCG.
  • only one candidate MPTT value can be selected.
  • the candidate MPTT value having the highest weights and/or an appropriate or expected morphology can be selected.
  • a confidence measure can be determined for each measurement of MPTT (or other biometric parameters) to indicate the confidence one has in the reading.
  • An example is shown in FIG. 11B , which illustrates computation of confidence measures 1120 corresponding to the calculated values of MPTT 1115 .
  • the confidence measures can be used, for example, to determine whether a calculated value can be used for subsequent computations.
  • FIG. 7B An example process for calculating MPTT is shown in FIG. 7B .
  • the process can be executed, for example by the device 100 described above with reference to FIG. 1B .
  • Operations of the process can include obtaining a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject ( 722 ).
  • the first data set can be obtained from a first sensor such as a PPG sensor.
  • the operations also include obtaining a second data set representing time-varying information about motion of the subject at the first body part of a subject ( 724 ).
  • the second data set can be obtained from a second sensor such as a motion sensor.
  • the operations further include identifying a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part ( 726 ) and identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject ( 728 ).
  • Identifying the first point can include, for example, computing a cross-correlation of a template segment with each of multiple segments of the first dataset, identifying, based on the computed cross-correlations, at least one candidate segment of the first dataset as including the first point, and identifying a first feature within the identified candidate segment as the first point.
  • Identifying the second point can include, for example, determining a reference point in the second data set, wherein the reference point corresponds to substantially the same point in time as the first point in the first data set.
  • One or more target features can then be identified within a predetermined time range relative to the reference point, and a time point corresponding to one of the target features can be selected as the second point.
  • the operations also include computing MPTT as a difference between the first and second time points ( 730 ).
  • the MPTT represents a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject can then be used in computing various parameters such as blood pressure or arterial stiffness.
  • the calculated MPTT value is related to elasticity of the blood vessels as shown in the following equation:
  • L is the vessel length
  • PWV is the pulse wave velocity
  • E is the Young's modulus
  • h is the vessel wall thickness
  • is the blood density
  • r is the vessel radius.
  • the elasticity is in turn related to the vessel pressure P as:
  • the vessel pressure P can be derived as:
  • the pressure value calculated using (3) represents diastolic pressure (Dia).
  • the systolic pressure (Sys) can then be computed as:
  • A is a universal constant that applies to all users and is unitless
  • B is an individual constant in units of mmHg
  • C is an individual constant in units of mmHg/mg
  • SV is the stroke volume
  • the parameters B and C for calculating the diastolic and systolic pressures may vary from one person to another. Accordingly, a process or device may need to be calibrated for an individual before use. Generally, the calibration is performed the first time the accelerometer and the optical sensor are used for measuring and the algorithms are used for calculating the MPTT, SV, and the other parameters.
  • FIG. 12 An example process 1200 of calibration performed by a machine, such as a processor, is shown in FIG. 12 .
  • the machine receives ( 1202 ) known reference systolic and diastolic pressures (SysO and DiaO), e.g., as input from a wearer. If the pressures are unknown to the wearer, generic values of 120/80 mmHg are used. In such cases, the wearer may be allowed to alter the calibration at a later time when the actual pressures becomes known.
  • the machine also calculates ( 1204 ) the MPTT and the SV using methods described above. The machine then calculates the constants B and C ( 1206 ) for this particular wearer based on the following equations:
  • the values of the parameters are saved or stored ( 1208 ) for the individual.
  • a device e.g., the device 100 including the accelerometer and the optical sensor can be used by multiple people.
  • a calibration is performed for each individual following the process 1200 and a set of calculated parameters are stored in association with the corresponding person.
  • the device may automatically choose a set of stored parameters for use with an individual based on biometric identifications of the individual, or may ask the individual to self-identify and choose the correct set of parameters for use, in case the device is shared among multiple users.
  • blood pressure measurements based on continuously acquired data can be made available for each individual by converting the MPTT and SV into systolic and diastolic pressures as described above.
  • the systolic and diastolic pressures can also be calculated by adding time-varying parameter estimations based on second order parameters.
  • the diastolic pressure can be calculated as:
  • f(.) and g(.) are predetermined functions, and the parameters D and E are time dependent and individual dependent.
  • the parameters can be calibrated when at least two calibration points (e.g., two known sets of systolic and diastolic pressures) at different times are available.
  • the calibrated parameters do not change frequently. These parameters may be affected by arterial diameters, arterial wall thicknesses, arterial lengths, arterial elasticity, and other physical parameters related to the cardiovascular system of a human body.
  • the majority of the volume of blood related to MPTT travels through large arteries, and is less susceptible to hydrostatic changes, temperature, or peripheral tone.
  • Curves representing relationships between MPTT and blood pressure are illustrated in FIG. 13 . As seen from this example, while the curves may differ from one person to another, the general shapes of the curves are similar.
  • FIG. 14 illustrates systolic pressure measured over 90 days after a single calibration, and in the absence of any additional recalibration.
  • a processor e.g., a processor of the computing device 115 (shown in FIG. 1B ), or of an external computing device to which the PPG data and the MoCG data is transmitted
  • BP blood pressure
  • HR HR
  • HRV HR
  • HRV respiratory rate
  • CO cardiac output
  • the processor can be programmed to use the PPG data and accelerometer data to detect arrhythmia or irregular heart rhythms, such as arterial fibrillation (AFIB) or atrial flutter.
  • FIGS. 15A-15D shows graphs in which heart rate data of the wearer of the device 100 is plotted.
  • the graphs show heart rate data plotted over a 24 hour period ( FIG. 15A ), during the day ( FIG. 15B ), and during the night ( FIG. 15C ).
  • each of these graphs includes R wave to R wave interval (RR i ) along the x-axis and RR i+1 along the y-axis.
  • the plotted data can be used to determine whether the subject has a normal heart rhythm or an irregular heart rhythm, as described below.
  • the plots can be updated after predetermined intervals (e.g., every 5-10 minutes) in order to capture any transient anomaly.
  • the PPG and accelerometer signals are used in the manner described above to determine the instantaneous heart rate of the wearer for each heartbeat of the wearer over a period of time (e.g., 20 seconds).
  • the RR values are then determined by examining the instantaneous heart rate curve to determine the time between each of the successive heartbeats. Each RR value is equal to the time between two consecutive heartbeats. Each RR value (RR i ) is then plotted versus the subsequent RR value (RR i+1 ).
  • the graphs shown in FIGS. 15A-15D represent plots of a subject with a normal heart rhythm.
  • a normal heart rhythm the time between beats tends to be fairly consistent.
  • the heart rate tends to increase gradually over time.
  • the individual's heart rate may be significantly higher during such activities (as compared to his or her heart rate at rest), the difference in time between consecutive heartbeats should be fairly consistent over the course of a small number of consecutive heart beats.
  • the RR i vs. RR i+1 plot will typically be fairly linear along a diagonal, as shown in FIG. 15D .
  • FIGS. 16A-16C show heart rate data for individuals with different heart conditions.
  • FIG. 16A shows heart rate data taken over a 24 hour period from an individual having atrial fibrillation (AFIB).
  • FIG. 16B shows heart rate data taken over a 24 hour period from an individual having atrial flutter
  • FIG. 16C shows heart rate data taken over a 24 hour period from an individual having a normal heart rhythm.
  • AFIB is apparent since the spread of the various RR data points from the expected diagonal is greater than a predetermined spread value.
  • AFIB causes erratic beating of the heart resulting in the time between consecutive heartbeats varying significantly from one pair of heartbeats to the next. It is this characteristic that causes the plot of RR i vs. RR i+1 to spread significantly from the expected diagonal (i.e., the diagonal plot of an individual who has a regular heart rhythm (as shown in FIG. 16C )).
  • Atrial flutter can be seen by the multiple clusters of data that are offset from the diagonal. Atrial flutter results in changes in heart rate in multiples, which produces the multiple clusters of data that are offset from the diagonal.
  • the processor can be programmed to alert the wearer in response to detecting such irregular heart rhythms.
  • the processor can activate an audio or visual alarm of the device, which can, for example, instruct the wearer to seek medical attention.
  • FIG. 17 An example process 1700 of detecting arrhythmia of a subject is shown in FIG. 17 .
  • a machine such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 1700 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 1702 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 1704 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Arrhythmia of the subject can be detected based on the processed data ( 1706 ).
  • Arrhythmia can include atrial fibrillation or atrial flutter.
  • Processing the data can include determining whether a spread of plotted R wave to R wave intervals versus next consecutive R wave to R wave intervals exceeds a predetermined spread value.
  • Processing the data can also include determining whether multiple clusters of plotted data points are offset from a diagonal
  • arterial stiffness is an indicator for vascular health (e.g. arteriosclerosis), risk for hypertension, stroke, and heart attack.
  • the processor can therefore be programmed to calculate arterial stiffness as a function of the pulse transit time (MPTT).
  • Certain conventional devices that are used to assess arterial stiffness require devices to be placed at two different locations of the subject (e.g., at the carotid and leg of the subject).
  • the device described herein which is able to collect from a single location of the subject all necessary data for determining arterial stiffness, tends to be more convenient than those conventional devices.
  • the processor can be programmed to inform the wearer of the device of his or her arterial stiffness value by, for example, causing that value to be displayed on the display of the device.
  • the arterial stiffness value can be used as one of multiple factors for assessing the overall health of the wearer.
  • the processor is programmed to use arterial stiffness of the wearer to determine a health metric (e.g., a health score) for the wearer.
  • the health score may be a numerical value. In some cases, the numerical value is between 1 and 10 or between 1 and 100.
  • arterial stiffness As shown in FIG. 18 , the arterial stiffness of a subject tends to decrease as the activity level of the subject (e.g., the number of times per week that the subject exercises) increases.
  • activity level of the subject e.g., the number of times per week that the subject exercises
  • arterial stiffness is one parameter that can be monitored by the device and shared with the user to track the progress of a subject involved in an exercise regimen. This can serve as positive feedback for the user in addition to conventional feedback, such as weight loss.
  • the processor can also be programmed to use the PPG data and accelerometer data to detect sleep disorders, such as sleep apnea, and to deduce sleep quality and sleep stages.
  • sleep disorders such as sleep apnea
  • the processor first analyzes the low frequency components of the accelerometer data to identify sleep rest periods (SRPs), which are periods in which the accelerometer data is substantially flat for a minimum period of time (e.g., 90 seconds).
  • SRPs sleep rest periods
  • the flatness of the accelerometer data indicates that the wearer of the device is not moving during the SRPs.
  • SRPs are periods during which the wearer of the device is likely to be asleep.
  • FIG. 19 illustrates three separate SRPs (SRP 1 , SRP 2 , and SRP 3 ).
  • SRP 1 and SRP 2 and SRP 2 and SRP 3 are respectively separated from one another by a brief period of motion by the wearer of the device.
  • the three SRPs are treated as a single sleep cycle.
  • the processor can, for example, be programmed to treat periods of motion that last less than five minutes as not interrupting a sleep cycle during which that motion occurs.
  • the processor uses the PPG data and the accelerometer data collected during the SRPs to calculate the average heart rate, the standard deviation of the heart rate, the average heart rate variability (HRV), and the average activity level for each of the SRPs.
  • the processor analyzes the complexity of the heart rate signal and the deviation from diagonal of values plotted on an RR i vs. RR i+1 plot. These parameters can be used to confirm that the wearer of the device was sleeping during the SRP being analyzed and to identify certain sleep conditions and sleep disorders, as discussed below.
  • jetlag can also be detected by analyzing heart rate during sleep. For example, an upward heart rate during sleep can indicate a presence of jetlag, and a flat heart rate during sleep can indicate that the subject is not jetlagged.
  • the processor can be programmed to consider the average heart rate, the standard deviation of the heart rate, and the average heart rate variability (HRV) to confirm that the wearer was sleeping during the SRP being considered. For example, the average heart rate, the standard deviation of the heart rate, and the average heart rate variability (HRV) of the subject over the SRP being analyzed is compared to the baselines of these values in the subject. If they fall below the baseline by a predetermined amount, this confirms that the subject was asleep during the period being analyzed.
  • HRV average heart rate variability
  • the processor can determine the number of hours slept by the wearer, the sleep latency of the wearer (e.g., the length of time that it took for the subject to transition from wakefulness to sleep), the number of times that the wearer tossed and turned, and the percent of time that the wearer was asleep between the time that he or she went to bed and got up. In some cases, the processor can further determine the deepness of the sleep of the wearer during each of the SRPs.
  • the deepness of the sleep is sometimes referred to as the sleep stage. For example, if the accelerometer detected minimal movement and the patient's heart rate variability was a predetermined amount below the wearer's baseline heart rate during a portion of the SRP, it can be concluded that the wearer was in a deep sleep during that portion of the SRP. If the accelerometer detected some movement and the patient's heart rate was higher than can be expected of a deep sleep during a portion of the SRP, it can be concluded that the wearer was in REM sleep during that portion of the SRP. Otherwise, it can be concluded that the wearer was in a light sleep during that portion of the SRP.
  • the processor is programmed to use the above-noted parameters (e.g., the number of hours slept by the wearer, the number of times that the wearer tossed and turned, the percent of time that the wearer was asleep between the time that he or she went to bed and got up, and the deepness of sleep) to derive a quality of sleep metric or sleep score.
  • the wearer can monitor his or her sleep score over time in an effort to modify his or her sleep habits and maximize the quality of his or her sleep. It has been found that the use of such scores, as opposed to the various different related parameters, are more easily understood by users.
  • the processor can cause the device to automatically display the sleep score when the wearer is determined to have awoken.
  • the device can determine when the wearer has awoken based on information related to the SRPs. Based on characteristics related to the wearer's sleep, information can be provided to the wearer to assist the wearer in improving his or her sleep score.
  • the wearer can be provided with a recommended sleep schedule. For example, if the wearer is determined to have been getting too little sleep, the recommended sleep schedule may suggest that the wearer go to bed earlier in the evening or sleep in later into the morning.
  • the information can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • FIG. 19B illustrates the heart rate signal of the wearer during a period of time in which the wearer experienced an episode of sleep apnea.
  • the heart rate signal of the wearer is complex from 2:54 AM until about 3:16 AM at which time the heart rate of the wearer spikes suddenly. From 3:16 AM until about 3:30 AM, the heart rate signal is simple (i.e., includes periodicity or a repeating pattern). The presence of a simple heart signal at least every two minutes during an SRP can be indicative of sleep apnea.
  • the processor can be programmed to carry out a multi-step test to detect sleep apnea.
  • the processor analyzes the heart rate throughout the SRP being analyzed. If the difference between the minimum heart rate and the maximum heart rate during the SRP is less than a threshold heart rate differential, then the processor determines that there was no sleep apnea and the test is concluded. If, however, the minimum-maximum heart rate differential exceeds the threshold heart rate differential, then the processor determines that sleep apnea could be the cause and a carries out a further analysis of the SRP. Specifically, the processor analyzes the heart rate variability, the plotted RR points, the complexity of the signal, and the activity level of the subject during the SRP
  • the heart rate variability is lower during the SRP than in neighboring periods, then this weighs against a finding of sleep apnea. If, however, the heart rate variability during the SRP exceeds the heart rate variability during neighboring periods, then this weighs in favor of a finding of sleep apnea.
  • Another factor used to determine whether the wearer has sleep apnea is the complexity of the heart rate signal. If the heart rate signal is complex during the SRP, then this weighs against a finding of sleep apnea. If, however, at least every two minutes, the heart rate signal becomes simple (i.e., has periodicity or a repeating pattern), then this weighs in favor of sleep apnea.
  • Activity level is another factor used to identify sleep apnea. If the activity level of the wearer during the SRP being analyzed (as determined using the accelerometer data) is greater than the activity level of the wearer during neighboring periods, this weighs against a finding of sleep apnea. If, however, the activity level of the wearer during the SRP being analyzed is less than the activity level of the wearer during neighboring periods, this weighs in favor of a finding of sleep apnea.
  • the processor can be programmed to determine the presence or absence of sleep apnea as a function of heart rate, heart rate variability, the location of data points on the RR i vs. RR i+1 plot, the complexity of the heart rate signal, and the activity level of the subject.
  • FIG. 20 shows an example screenshot 2000 on a mobile phone 2002 of a wearer that displays qualities of the wearer's sleep in conjunction with light levels during various times.
  • the wearer slept for 7 hours and 52 minutes total, awoke, 4 times, and has a sleep score of 74.
  • the screenshot also includes two bars: one bar shows times when the wearer had low-quality sleep, and another bar shows the measured light levels during those times. In this way, a correlation is made between the wearer's sleep quality and light levels experienced by the wearer.
  • the screenshot 2000 also includes a link 2004 for the wearer to receive sleeping environment tips that can improve his or her sleep quality.
  • the processor can alert the wearer that he or she may have experienced an irregular sleep pattern.
  • FIG. 21 An example process 2100 of determining information about a characteristic of a subject's sleep is shown in FIG. 21 .
  • a machine such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2100 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2102 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 2104 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • the information about at least one pulse pressure wave propagating through blood in the subject can include photoplethysmographic (PPG) data, and the information about motion of the subject can include one or both of motioncardiogram (MoCG) data and gross motion data.
  • PPG photoplethysmographic
  • MoCG motioncardiogram
  • information about a characteristic of the subject's sleep can be determined ( 2106 ).
  • the characteristic can include a quality of the sleep of the subject.
  • the quality of the sleep of the subject can include one or more of a sleep duration, a sleep latency, a sleep staging, latency to sleep, a number of disturbances, and a number of tosses and turns.
  • the characteristic of the subject's sleep can also include sleep apnea.
  • the processor can also be programmed to perform various fitness applications that allow the wearer to monitor his or her fitness level.
  • the processor can be programmed to analyze the accelerometer data over a given period of time (e.g., 15 minutes) to determine the total number of steps taken by the wearer during that time.
  • the processor is programmed to look for rhythm/cadence to detect walking as opposed to other ordinary motion, such as hand motions and vibrations.
  • the absolute value of the accelerometer data will typically be higher during periods of walking that during periods of most other daily activities.
  • the processor can calculate calories burned over a given period of time by analyzing the activity level of the wearer and/or the heart rate of the user. Using both the activity level and the heart rate to determine calories burned can lead to a more accurate estimation of caloric output.
  • the processor is programmed to provide a fitness score based on certain fitness-related parameters, such as resting heart rate. The more fit an individual is, the lower his or her baseline HR will be. Thus, in some cases, the processor is programmed to determine a fitness score based on the average heart rate of the wearer during sleep periods or periods of inactivity. Additionally, the speed of heart rate recovery can be a strong indicator of a person's fitness level. For example, the more fit an individual is, the faster his or her heart rate returns to the baseline after exercising. Similarly the more fit an individual is, the longer it takes for his or her heart rate to increase during exercise.
  • the processor is programmed to determine an individual's fitness score based on the amount of time that it takes for the individual's heart rate to reach a maximum during exercise and the amount of time that it takes for his or her heart rate to return to the baseline after exercise.
  • the processor can cause the device to automatically display the fitness score when the wearer is determined to be in the fitness state.
  • the fitness score may be displayed when the wearer starts to go for a run, and may be displayed throughout the run.
  • the fitness score may be displayed when the wearer transitions from a fitness state to a non-fitness state.
  • the fitness score may be displayed when the wearer finishes a run.
  • the device can determine when the wearer is in the fitness state based on the gross motion data and the vitals of the wearer, such as the wearer's heart rate. Based on characteristics related to the wearer's fitness, information can be provided to the wearer to assist the wearer in improving his or her fitness score.
  • FIG. 22 shows an example screenshot 2200 displaying a fitness score on a mobile phone 2202 of a wearer.
  • the information on the screenshot indicates that the wearer has improved his or her fitness score by two points.
  • the screenshot also provides the wearer with updated personalized training zones.
  • the personalized training zones represent the heart rate that the wearer should strive to achieve under various exercise conditions. For example, if the wearer is performing extreme exercise, he or she should strive to have a heart rate of more than 151 beats per minute.
  • the wearer can be provided with a recommended fitness routine. For example, it may be determined that the wearer has trouble completing a three-mile run, as indicated by an abnormally high heart rate during the run.
  • the recommended fitness schedule may suggest that the wearer run one mile twice a week for a week in order to improve his or her fitness, thereby allowing the wearer to work up to a fitness level appropriate for safely completing a three-mile run.
  • the information for assisting the wearer can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • the device may have access to other users' vital information and fitness scores, such that a wearer of the device can compare his or her fitness score to those of other people.
  • a wearer of the device may want to follow the same training regimen as the one that the professional athlete follows. However, following the same training regimen does not necessarily produce the same results.
  • a wearer of the device may follow the same training regimen as a professional athlete, but he may not exhibit the same level of effort as the professional athlete.
  • the device can determine the degree of similarity between the wearer's training level and the professional athlete's training level.
  • vital information of a professional athlete from when the athlete performed or is performing a particular training routine is presented to the wearer while the wearer performs the same training routine.
  • a video showing the athlete performing the training routine can include a visual indication of the athlete's BP, HR, and respiratory rate over the course of the training routine.
  • the wearer can determine whether he or she is experiencing a similar BP, HR, and respiratory rate as the athlete, thereby indicating whether the wearer is training with the same intensity as the athlete.
  • the video may be configured to interact with the device such that the video encourages the wearer to try harder if the wearer's intensity is below that of the athlete.
  • the device can continue to monitor the BP, HR, and respiratory rate of the wearer to determine whether the wearer is physically recovering as well as the athlete.
  • the vital information of the professional athlete can be used to determine the athlete's physical state at particular times during competition.
  • the athlete's vital information can represent how the athlete physically feels while completing the last 20 meters of a 100 meter dash, or while catching a game-winning touchdown as time expires. A wearer may desire to recreate this feeling for himself or herself.
  • the device is configured to assist the wearer in recreating similar competition situations.
  • the athlete's vital information may indicate that a wide receiver had a particular BP, HR, and respiratory rate while catching a game-winning touchdown in a championship game.
  • the particular BP, HR, and respiratory rate may be significantly higher than they typically would be due to the intensity and importance of the game situation.
  • a wearer In order to recreate the situation, a wearer cannot simply go to a local football field and catch a pass from a friend because the wearer would not be in the same physical state that the wide receiver was in at the time of the catch. Rather, the user needs to match the wide receiver's BP, HR, and respiratory rate before recreating the catch.
  • the wearer may perform various actions or activities to artificially match the wide receiver's vitals (e.g., running, listening to loud or exciting music, etc.).
  • the device can alert the wearer. At that point, the wearer can recreate the game situation with improved accuracy.
  • the wearer can recreate the game situation with the aid of a virtual reality device, such as a stereoscopic device that creates a computer-simulated environment.
  • a virtual reality device such as a stereoscopic device that creates a computer-simulated environment.
  • the stereoscopic device can be used to aid the wearer in artificially matching his or her vitals with the athlete's by presenting to the wearer the same visuals and sounds that the athlete experienced before the game situation.
  • the stereoscopic device can also be used to recreate the particular game situation or play. That is, rather than catching a real football from a real person, the stereoscopic device can display visuals that simulate the action of catching the game-winning touchdown.
  • a person in a real combat situation typically exhibits increases in BP, HR, and respiratory rate due to the danger of the situation. Training for these situations does not involve the same risk of danger. Thus, such training is typically not performed under the same physical conditions. That is, a trainee does not have the same BP, HR, and respiratory rate that he would otherwise have in a real combat situation.
  • a person's vital information can be used to determine the person's physical state at particular times during a real combat situation. For example, a Navy SEAL may exhibit a particular BP, HR, and respiratory rate while performing a raid of a terrorist hideout. A trainee who is wearing the device may perform various actions or activities to artificially match the Navy SEAL's vitals. When the trainee has achieved a physical state that matches the Navy SEAL's, the device can alert the trainee, who can then recreate a training scenario with improved accuracy.
  • the processor can also be programmed to analyze the PPG data and the accelerometer data in a way to determine the stress level of the wearer of the device.
  • Heart rate (HR), heart rate variability (HRV), blood pressure (BP), and respiratory rate are all indicators of stress. Specifically, the values of these parameters increase as stress levels increase. Thus, by comparing these values to baseline values of the wearer for associated parameters, the level of stress of the wearer can be estimated.
  • the stress level can, for example, be provided to the wearer as a stress score.
  • the processor can cause the device to automatically display the stress score when the wearer is determined to be in a stress state.
  • the device can determine when the wearer is in a stress state based on the vitals of the wearer, such as the wearer's heart rate, heart rate variability, blood pressure, and respiratory rate. Based on characteristics related to the wearer's stress, information can be provided to the wearer to assist the wearer in improving his or her stress score. In some implementations, the wearer can be provided with a recommended stress-reducing routine.
  • the recommended stress-reducing routine may suggest that the wearer meditate at particular times (e.g., once a day) or adjust his or her daily schedule to minimize circumstances that are generally attributed to stress (e.g., sitting in traffic, working too much, etc.).
  • the information can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • FIG. 23 shows an example screenshot 2300 on a mobile phone 2302 of a wearer that includes a number of stress moments experienced by the wearer.
  • the wearer has experienced four stress moments on the current day.
  • a graph indicates the number of stress moments that the wearer has experienced throughout the week.
  • the screenshot includes recommendations for the wearer to reduce his or her stress.
  • the screenshot recommends that the wearer plan some rest, relaxation, and/or a meditation session to reduce stress.
  • the screenshot also includes a link 2304 to a 1-minute relax sessions, during which the mobile phone guides the wearer on a relaxation session.
  • FIG. 24 An example process 2400 of deriving information about a psychological state of a subject is shown in FIG. 24 .
  • a machine such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2400 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2402 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed ( 2404 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Information about a psychological state of the subject can be derived from the processed data ( 2406 ).
  • the psychological state of the subject can be a state of stress, a malicious intent, or a state of lying. Relationships between at least some of the processed data and a psychological state of the subject can be inferred.
  • one or more scores can be derived based on data collected by the device 100 .
  • a machine such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • Operations of the process can include deriving a score that is associated with a state of a subject.
  • the state of the subject can be one or more of a health state, a sleep metric, a fitness state, and a stress state.
  • Deriving the score can be based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Deriving the score can also be based on data in a second dataset that represents time-varying information about motion of the subject.
  • the machine can receive information from the motion sensor 105 of the device 100 .
  • the data produced by the device can be used to assist triage medical personnel in various settings.
  • the device could be worn by military personnel in battle to provide medical personnel with valuable information regarding the vital signs of the military personnel.
  • the devices worn by the military personnel can, for example, be configured to transmit data regarding their vital signs to a central computer manned by medical personnel.
  • the medical personnel can view the vital signs of the various military personnel to prioritize medical care. As a result, the people that most need urgent treatment will receive it first, while those who have less threatening injuries will be attended to later.
  • the devices described herein could be used to assist medical personnel in various other triage settings, such as sites of natural disasters or terrorist attacks.
  • the medical personnel could be provided with a number of devices that could be put on patients in the triage setting as those patients are being assessed.
  • the medical personnel can leave that victim and focus their efforts on victims in more urgent need of medical care. While doing so, the vital signs of those victims who were initially assessed and determined not to require urgent medical care will be monitored and transmitted to a central monitoring station.
  • medical personnel in the area can be directed to that victim to provide the necessary medical care.
  • a machine such as a processor, that receives information from the optical sensors 110 of the device 100 can perform a process for risk assessment.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • the process can include processing data from a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed.
  • the machine can receive information from the motion sensor 105 of the device 100 .
  • the data can be acquired while the subject is in a situation associated with risk. Whether the subject is in a situation associated with risk can be indicated by the data.
  • the risk can be trauma to the subject, and the data can be indicative of the existence of the trauma.
  • the devices described herein could be used to assist medical personnel in a hospital setting. Once a patient is stabilized following triage, he or she is typically monitored based on a provider's standard of care or mandate (e.g., according to an accountable care organization (ACO)). In some implementations, the device can continue to monitor the vital signs of the patient outside of the triage context to ensure that the care that the patient is receiving is appropriate in view of the patient's vitals. A provider's standard of care may require a patient to go through a progression of steps before the patient is deemed to be ready for discharge. The device can monitor the vital signs of the patient during each step of the progression.
  • a provider's standard of care may require a patient to go through a progression of steps before the patient is deemed to be ready for discharge.
  • the device can monitor the vital signs of the patient during each step of the progression.
  • the first step of the progression may involve monitoring the patient's vitals while the patient is resting (e.g., lying down and/or sleeping)
  • the second step of the progression may involve monitoring the patient's vitals while the patient is sitting up in bed
  • the third step of the progression may involve monitoring the patient's vitals while the patient is standing up while being supported
  • the fourth step of the progression may involve monitoring the patient's vitals while the patient is standing up unassisted
  • the fifth step of the progression may involve monitoring the patient's vitals while the patient is walking
  • the device continuously monitors the patient's vitals throughout each of these stages and can present a notification if the vitals indicate that the patient is in a dangerous state (e.g., if the patient is progressing through each step too quickly without giving his or her body a chance to recover). In this way, the device monitors the patient's compliance with the provider's standard of care.
  • the patient's vitals can also serve as an indicator of the quality of care that the patient is receiving at a care facility.
  • the device can monitor the vitals of residents at a nursing home to determine the level of activity that the residents are experiencing. Data from the motion sensor of the device may indicate that the residents typically walk or perform other exercises one hour per day, and data from the ultraviolet light sensor of the device may indicate that the residents typically spend two hours per day outdoors.
  • the monitored vitals can be compared to metrics defined by a health organization (e.g., the American Heart Association) to determine whether the residents are adhering to the organization's recommendations regarding physical activity and other health-related actions.
  • the residents' level of compliance with the organization's recommendations can be used to assess the quality of care at the nursing home.
  • the nursing home may be assigned a quality score based on the monitored vitals and the level of compliance with the organization's recommendations, and multiple nursing homes may be compared and/or ranked according to their quality scores. Similar concepts can also apply in the context of child care.
  • FIG. 25 An example process 2500 of determining a quality of care provided to the one or more subjects by a care facility is shown in FIG. 25 .
  • a machine such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2500 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects can be processed ( 2502 ).
  • Data that represents time-varying information about motion of the one or more subjects can also be processed ( 2504 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects can be determined ( 2506 ). Determining a quality of care can include determining a level of physical activity experienced by each of the one or more subjects. The level of physical activity can be determined by comparing gross motion data gathered by the motion sensor 105 to a threshold. Data that represents information about an amount of ultraviolet light that each of the one or more subjects has been exposed to over a particular time period can also be processed, and an amount of time that each of the one or more subjects has spent outside can be determined.
  • the devices described herein can also be beneficial to first responders, such as firefighters and police offers. By wearing the devices, the first responders will ensure that their vital signs are monitored before, during, and after any stressful events that they experience to ensure that they receive the help they need. This is illustrated in the example of FIG. 26 , where health parameters of one or more firefighters 2605 on a potentially hazardous mission are obtained via devices 100 worn or carried by the firefighters 2605 .
  • the firefighters' vital signs could be obtained by the devices 100 and transmitted to a central monitoring station (e.g., within a fire truck 2610 , or at a fire station) where the vital signs can be monitored to determine whether the firefighters 2605 are well enough to continue fighting a fire or otherwise responding to an emergency.
  • the devices 100 worn or carried by the firefighters 2605 further include GPS transponders.
  • Such devices are particularly beneficial for situations in which one or more first responders 2605 become incapacitated in a dangerous setting.
  • the device could not only send the firefighter's vital sign data to the central monitoring station to alert someone that the firefighter is in need of medical care, the device could also identify the location of the firefighter 2605 to a rescuer 2620 (possibly via a device 100 ) sent to assist the incapacitated firefighter 2605 , such that the rescuer 2620 knows exactly where to go.
  • the communications about the health parameters of the one or more firefighters 2605 can be sent directly to the central monitoring station, or via a server 2630 .
  • the server 2630 determines that a firefighter's mental/physical state is not suitable for continuing the mission, the server 2630 can send a signal to the firefighter (e.g., via the device 100 , or via another communication device) to alert the firefighter 2605 about the situation. For example, if the health condition of the firefighter deteriorates during the mission (e.g., because of excessive smoke inhalation), a signal can be sent to the device 100 to alert the firefighter to take corrective measures.
  • the device 100 can be configured to communicate with the central monitoring station on the fire truck 2610 .
  • the data from the devices 100 can be transmitted to the server 2630 (possibly via the central monitoring station) for determining whether a firefighter 2605 is safe. The determination can also be made at the central monitoring station.
  • the data from the device 100 may also indicate whether the wearer of the device 100 requires assistance from a rescuer 2620 .
  • the server 2630 and/or the central monitoring station can then alert the firefighter 2605 and/or a rescuer 2620 accordingly.
  • his/her location may also be tracked using information transmitted from the corresponding device.
  • the processor can also be programmed to monitor the alertness of the wearer. This can be particularly advantageous for personnel who perform tasks that require attention and concentration, and could result in serious harm or danger if carried out incorrectly. Examples of such personnel include air traffic controllers, pilots, military truck drivers, tanker drivers, security guards, TSA agents, intelligence analysts, etc.
  • the processor can analyze the respiratory rate, heart rate, blood pressure, and activity level of the wearer. Each of these parameters tends to decrease as a subject falls asleep. Thus, the processor can be programmed to conclude that the wearer's alertness level has dropped to an unacceptable level when one or more of those parameters falls a predetermined amount from the baseline of those parameters.
  • the processor can be programmed so that, upon determining that the wearer's alert level has dropped to an unacceptable level, an alarm (e.g., an audible, visual, or tactile alarm) on the device is activated.
  • an alarm e.g., an audible, visual, or tactile alarm
  • the alarm can raise the alertness level of the wearer and thus reduce risk of harm to the wearer and others.
  • the processor can be configured to communicate with the vehicle or machinery for which the wearer is responsible.
  • the device worn by a truck driver can transmit data regarding his or her alertness level to a controller of the truck.
  • the controller can be configured to disable operation of the truck if the alertness level is below an acceptable threshold. For example, the controller can warn the driver that he or she has a certain period of time to pull the truck over before it is disabled. This will encourage the driver to pull off the road and either get some sleep or otherwise increase his or her alertness level before driving the truck again.
  • the alertness data can be stored in a database for later analysis. Studying the alertness data from a large sampling of personnel in a given industry can help regulatory bodies for those industries to draft safety standards that increase or maximize safety while maintaining productivity.
  • alertness data over a period of time for a particular wearer of the device can be analyzed to determine the overall physical and/or mental state of a given wearer (e.g., as opposed to the instantaneous state of the given user). Such information can be used to detect a trend of regressing physical and/or mental state of the given wearer. For example, although a wearer of the device may exhibit vitals that indicate that he is alert enough to perform a particular task (e.g., fly a plane) at a particular time, the wearer's alertness data over a period of time may indicate that the wearer's general alertness is on the decline. This may be due to the wearer's old age. The device can detect such a trend and alert the wearer and/or an external entity that the wearer should be closely monitored.
  • a process can be configured to acquire data while a subject is in a situation that requires a predetermined amount of alertness of the subject.
  • a machine such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of such a process.
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • Operations of the process can include processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • the operations can also include processing data in a second dataset that represents time-varying information about motion of the subject.
  • the machine can receive information from the motion sensor 105 of the device 100 .
  • the data can be acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • the situation can include one or more of air traffic control, intelligence analysis, vehicle driving, machinery driving, security guarding, baggage screening, and aircraft piloting.
  • the devices described herein can also be used as polygraph devices. Like conventional polygraph devices, the devices described herein gather a baseline for the wearer's vital signs (e.g., respiratory rate, electrical skin impedance, heart rate, heart rate variability, and blood pressure) and those baselines can later be compared to associated vital signs recorded during questioning. Because the devices described herein are wearable, untethered, and non-cumbersome, and thus do not reduce the mobility of the wearer, the individual being tested can be required to wear the device for a specified period of time (e.g., 24 hours) before and after questioning without hindering the normal, everyday activities of the individual.
  • a specified period of time e.g., 24 hours
  • the baselines for the subject's vital signs can be more accurately determined. For example, it is less likely that the subject could artificially adjust his or her vital baselines due to the large amounts of data collected to form those baselines. Therefore, the accuracy of the polygraph test can be increased relative to certain conventional polygraph devices.
  • the accelerometer data can be analyzed to identify movements or lack of movements that may indicate that the subject is lying. It is believed, for example, that individuals freeze for a moment when they are caught doing something wrong. In the case of polygraph examinations, it is believed that a subject will freeze when asked a question about the subject's wrongdoing. Thus, by analyzing the accelerometer data of the device, it is possible to identify those times during questioning that the subject freezes. This information can be used to further assess the truthfulness of the subject's response during that time.
  • the processor can be programmed to analyze the PPG data and the accelerometer data to determine the physical and mental readiness of a subject to perform a certain task.
  • General fatigue and stress which can result in a drop in physical and mental readiness, is generally evidenced by an increase in respiratory rate, heart rate, and blood pressure.
  • the processor can be programmed to analyze the wearer's respiratory rate, heart rate, and blood pressure and to indicate a state of unreadiness if those parameters fall a certain amount below the baseline for those parameters.
  • the processor is programmed to also consider other factors in this readiness assessment, including the quality of the wearer's sleep (e.g., the wearer's sleep score) over a period of time (e.g., 24 hours or 48 hours) leading up to the assessment.
  • the quality of the wearer's sleep e.g., the wearer's sleep score
  • a period of time e.g., 24 hours or 48 hours
  • the determination of readiness of wearers of the device can assist leaders of those wearers with maximizing his or her human resources during taxing situations. For example, military leaders can analyze the data of soldiers in their units to determine which of those soldiers is most physically and mentally able to successfully carry out a mission and can staff the mission accordingly. Similarly, coaches may analyze the data of their team members to determine which of those athletes are best physically and mentally fit to play at their top level at any given time during a competition and can use those players that are able to perform at their top level.
  • the physical and mental readiness of a subject can be used by the device to predict a winner of a competition. For example, by analyzing vital signs (e.g., BP, HR, respiratory rate) of the contestant before and during a track race, a change in physical and mental readiness can be inferred.
  • the device can also consider information such as the force exerted against the ground by the contestant and the velocity of the contestant at various points during the race to determine a likelihood that the contestant will win the race.
  • the contestant's device can also consider similar information related to other contestants in determining the likelihood that the contestant will win the race.
  • the device may determine that a first contestant got off to a quicker start than a second contestant in a 100 meter dash based on collected motion data. Historical data may indicate that the contestant who is “first out of the blocks” has a 65% chance of winning the race. Thus, the device can predict the winner of the race within milliseconds of the start of the race.
  • the device can monitor a contestant's performance at an infinite number of intervals while correlating the contestant's performance to the measured vitals.
  • a contestant typically keeps track of his lap times for each of the four laps.
  • the contestant does not typically have access to more detailed data, such as his or her performance over the first 100 meters, the last 100 meters, at various points in the middle of the race, etc.
  • the device can be configured to keep track of the contestant's performance at any time or range of times during the race, and can also correlate the contestant's performance to the vitals measured by the device. For example, the contestant may complete the first lap of the mile in 50 seconds, putting him or her on pace to easily break the world record.
  • the device may determine that the contestant has a BP, HR, and respiratory rate significantly higher than what would typically be seen in someone who has only completed 25% of the race, and thus determine that the contestant likely will not win the race. By exhibiting so much effort early in the race, the contestant burns out and finishes the race with a mediocre time.
  • the contestant can use the performance data and the measured vitals to improve his or her training in the future. For example, the next time the contestant runs a mile, the device may detect that the contestant is exhibiting too much effort early in the race by measuring a high BP, HR, and respiratory rate. The device can be configured to notify the contestant to reserve energy in order to optimize his or her performance.
  • the device can be used to monitor the performance of an entire team of individuals wearing the device. For example, the collective physical and mental readiness of a football team, as well as motion sensor data and information related to other factors, can be used to determine whether the football team is performing to its potential.
  • Information related to the vitals of a first team such as the team's collective BP, HR, and respiratory rate, may indicate that the first team is exhibiting a large amount of effort.
  • Information related to the vitals of a second team may indicate that the second team is exhibiting minimal effort.
  • the second team is winning the football game against the first team, indicating that the first team may have inferior technique or coaching.
  • Information related to a team's vitals can also be used to ensure that the team does not exhibit too much effort early in the season, thereby making it susceptible to “burning out” towards the end of the season.
  • FIG. 27 An example process 2700 of providing information to a user that reports relative states of subjects is shown in FIG. 27 .
  • a machine such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2700 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects can be processed ( 2702 ).
  • Data that represents time-varying information about motion of the two or more subjects can also be processed ( 2704 ).
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • the Information can be provided to a user that reports relative states of the subjects ( 2706 ).
  • the information can be based on the processed data.
  • the relative states of the subjects can include one or more of relative psychological states, relative physical states, and relative states of readiness.
  • the subjects can be put into an athletic contest or assigned a particular combat task according to the relative states of the subjects.
  • the processor can be programmed to analyze the vital signs of multiple users in the moments leading up to a collision. For example, when two players collide during a sporting competition, a large amount of force is absorbed by each player. Force data can be measured by the motion sensor of the device, and the device can determine the magnitude of force absorbed by each player. The device can determine the effect of the force on each player by analyzing the players' vitals (e.g., BP, HR, respiratory rate, body temperature) before, during, and after the collision. The vitals and the force information can be used to determine whether a player has sustained bodily damage due to the impact force. For example, if a player experiences a sudden increase in HR, respiratory rate, and body temperature following a collision, it may be an indication that the player has sustained a concussion.
  • BP BP
  • HR respiratory rate
  • body temperature body temperature
  • a player's bodily reaction to sustaining a concussion is delayed.
  • a player may experience a sudden increase in HR, respiratory rate, and body temperature at some time following a collision, or the player may experience a gradual increase in HR, respiratory rate, and body temperature beginning at the time of the collision.
  • the device can monitor the player's vitals for an extended time following the collision and compare the monitored vital information to vital information of a player who was previously diagnosed with a concussion. In this way, the device can determine vital patterns that are indicative of a person who sustains a concussion. If the device determines that a player has sustained a concussion, the device may be configured to alert the player or a third party.
  • the player may be required to pass a protocol before reentering the game. If the device determines that there is a possibility that the player has sustained a concussion, the device may enter a mode where the player is monitored more closely in order to make a more definitive determination.
  • the devices described herein can also be used as human flight recorders. While accident investigators (e.g., National Transportation Safety Board (NTSB) investigators) have traditionally been limited to analyzing voice recorders and, in some cases, black boxes, after airplane and train crashes, the devices described herein, when worn by the operators of those vehicles, will provide insight into the state of the operator at the time of the crash. For example, by analyzing vital signs of the operator (e.g., the respiratory rate, heart rate, heart rate variability, and blood pressure of the operator) in the moments leading up to the crash, the investigators can learn whether the operator fell asleep, experienced some form of medical emergency, etc. This information is valuable for the investigators to determine whether the crash was the result of the operator's actions as opposed to some other reason, such as mechanical failure.
  • NTSB National Transportation Safety Board
  • information related to the vital signs of the operator as well as information related to the operating characteristics of the car can be used to determine the cause of the accident, the mechanism of injury to the operator, and the impact of the injury to the operator.
  • the mental and/or physiological state of the operator before, during, and/or after the accident can be ascertained.
  • the 60 minutes following a traumatic injury is generally referred to as the “golden hour,” during which there is the highest likelihood that prompt medical treatment will prevent death. It is especially important to quickly gather vital information during this time to assist first responders and doctors in diagnosing and treating the operator.
  • the human flight recorder information can be used by third parties to determine who was at fault in creating the accident. For example, a law enforcement body may analyze the human flight recorder information to determine whether a tort or a crime was committed by an operator. In some implementations, the human flight recorder information can be used to determine an exact time when an event occurred. For example, the information can be used to determine an exact time of death, an exact time when a person went missing (e.g., by being abducted), or an exact time when a person fell down.
  • the data could be analyzed by his or her physician to help diagnose the condition. For example, if a wearer has a heart attack, the data could be analyzed to investigate the variation in the vital signs leading up to the attack. Other data can also be considered, such as the wearer's genetics, epigenetics, diet, exercise practice, and environmental circumstances surrounding the event or condition. This information may be correlated and used to prevent onset of similar conditions in the future, for example, by alerting the user of such a possibility upon detecting similar variations in vital signs.
  • the device is able to determine a “baseline biorhythm” of a wearer based on the wearer's vital signs in various circumstances and environmental environments.
  • the baseline biorhythm is typically unique to each individual.
  • the device is able to detect when the wearer's vital signs are shifting away from the baseline biorhythm. For example, the device may detect that a wearer's biorhythm has gradually shifted over a particular time period, as indicated by variations in the wearer's vital signs.
  • the device may also detect that the wearer has spent minimal time outside over the same time period, as indicated by measurements from the device's ultraviolet light sensor. The device can identify a correlation between the wearer's changed biorhythm and the change in ultraviolet light exposure.
  • the device can identify a correlation between the wearer's changed biorhythm and changes in the weather. For example, the device can consider the wearer's location information in conjunction with weather information from the National Oceanic and Atmospheric Administration to determine the type of weather experienced by the wearer over a particular period of time. The device may identify that the wearer experiences higher BP and HR when the weather is cold and/or rainy and determine that such weather causes increased stress in the wearer.
  • the device can include a temperature sensor for determining the skin temperature of the wearer and an ambient temperature sensor for detecting the ambient temperature.
  • the processor can be programmed to estimate the wearer's core temperature as a function of the measured skin temperature and ambient temperature (e.g., based on the difference between the skin temperature and the ambient temperature).
  • the processor can be programmed to use this data to predict medical conditions before they happen. For example, the heart rate, heart rate variability, and blood pressure of the wearer can be monitored and processed by the processor to make such predictions.
  • a medical event that can be predicted in a subject is tachycardia. Tachycardia is when a subject's heart rate is over 100 beats per minute. If a subject's heart rate is trending upwards, a prediction can be made as to when the subject will experience tachycardia.
  • Other examples of medical events that can be predicted are hypertension and stroke.
  • Hypertension is diagnosed when a subject's blood pressure exceeds 140/90 mmHg. If the increase is rapid, a prediction can be made as to when the subject will have a high likelihood of experiencing a stroke.
  • a subject's blood pressure is decreasing rapidly (e.g., if the rate of change of the blood pressure is negative and below a threshold)
  • a prediction can be made as to whether the subject will have a heart condition.
  • the heart rate variability of the subject is used to predict a medical event, whether the subject experiences arrhythmia (e.g., atrial fibrillation) can determine what an appropriate heart rate variability of the subject is. For example, a subject who experiences arrhythmia may have a high heart rate variability, but this may be normal given the subject's condition.
  • arrhythmia e.g., atrial fibrillation
  • FIG. 28 An example process 2800 of predicting a medical event of a subject is shown in FIG. 28 .
  • a machine such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process ## 00 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2802 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • a medical event of the subject can then be predicted ( 2804 ).
  • the medical event can be predicted based on the processed data.
  • Medical events that can be predicted include tachycardia, hypertension, stroke, and heart condition.
  • the processor can also be programmed to ensure that the wearer of the device is adhering to a prescribed medication regimen. For example, for wearers who are prescribed blood pressure medication, the processor can be programmed to monitor the blood pressure of the wearer and to alert the wearer if, based on the blood pressure data, it appears that the wearer forgot to take his or her medication.
  • the device can be used in this manner to monitor a wearer's adherence to a prescribed medication schedule for any of various other medications that impact the various different vital signs monitored by the device.
  • the processor can also be programmed to determine the effectiveness of a medication. For example, in the context of inhalation medications, it is unknown if generic inhalation medications have the same effectiveness as brand name inhalation medications. One reason for this is that environmental and genetic makeups are generally different between users.
  • the processors can be programmed to monitor the heart rate and the blood oxygenation (SpO 2 ) of wearers of devices who are prescribed generic inhalation medication and wearers of devices who are prescribed name brand inhalation medication.
  • the processors can also consider data related to environment and genetic makeups of the wearers. Data related to the effects of the inhalation medication on the wearers can be used to determine the effectiveness of the generic inhalation medication compared to the effectiveness of the name brand inhalation medication.
  • the device can be used in this manner to monitor the effectiveness of any of various other medications that impact the various different vital signs monitored by the device.
  • the processor can determine a correlation between a particular medication's effectiveness and environmental factors. For example, two wearers of the device who reside in two different extreme environments (e.g., Alaska and Florida) may experience different effects from the particular medication. Differences in the medication's effectiveness may be attributed to the different extreme environments experienced by the wearers. For example, the processor can determine a correlation between the particular medication's effectiveness and the environmental temperature experienced the wearer.
  • two wearers of the device who reside in two different extreme environments (e.g., Alaska and Florida) may experience different effects from the particular medication. Differences in the medication's effectiveness may be attributed to the different extreme environments experienced by the wearers.
  • the processor can determine a correlation between the particular medication's effectiveness and the environmental temperature experienced the wearer.
  • the device may identify a correlation between a particular medication's effectiveness and other environmental factors. For example, differences in a medication's effectiveness between two users may be attributed to the food that people generally eat in a particular region, thereby allowing the device to identify food-drug interaction information related to the medication.
  • the device can be configured to determine an optimal timing and dosage regimen for a particular wearer by monitoring the wearer's vitals while the wearer is under the influence of the medication. For example, a wearer may take a medication to maintain his or her blood pressure below a particular level.
  • the device may determine that the wearer's blood pressure was reduced too much, and recommend that the wearer take a smaller dose the next day. The following day, the wearer may take the dosage amount recommended by the device. The device may determine that the wearer's blood pressure was reduced to the ideal level, but that the wearer may need to take a second small dose of the medication to maintain his or her blood pressure at the ideal level over the course of the day. In this way, the device can continuously refine the wearer's dosage regimen to be custom tailored to the wearer. The device can be used in this manner to determine an optimal dosage regimen for any of various other medications that impact the various different vital signs monitored by the device as described herein.
  • the processor can determine an optimal time for a wearer of the device to take a medication. For example, a doctor typically tell a patient to take particular medications at particular times of the day or under particular circumstances (e.g., in the morning, in the evening, with food, etc.). Such blanket directions do not typically apply to all patients under all circumstances.
  • the processor can monitor the vital signs of the wearer of the device to determine the optimal time for the wearer to take the medication under the current circumstances.
  • the processor can consider characteristics of the particular medication when making the determination.
  • the wearer of the device may take a medication that has a tendency to cause the wearer to be energetic.
  • a doctor may suggest that the medication be taken no later than 3:00 pm to prevent disruption of the wearer's sleep.
  • the processor may determine that the wearer is more energized than usual.
  • the processor may recommend that the wearer take the medication earlier than usual to prevent the wearer from becoming too energized and having his sleep disrupted later.
  • FIG. 29 An example process 2900 of providing information about a medication regimen of a subject is shown in FIG. 29 .
  • a machine such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 2900 .
  • the machine can include the computing device 115 described above with reference to FIG. 1B .
  • data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed ( 2902 ).
  • Data in a second dataset that represents time-varying information about motion of the subject can also be processed.
  • the data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject).
  • Information about a medication regimen of the subject can then be provided ( 2904 ). Based on the data, a determination can be made that the subject has potentially missed a dose of a medication, and a notification can be provided to the subject indicating such. Based on the data, a reaction of the subject to a medication can be determined, and a recommended medication regimen of the medication can be provided to the subject based on the reaction to the medication.
  • the recommended medication regimen can include one or more recommended dosage timings and one or more recommended dosage amounts, each of which corresponds to one of the dosage timings.
  • the device 100 can be configured to communicate with other computing devices.
  • the device 100 can include a transceiver module that can send data to, and receive data from, a server computer.
  • the device 100 can be configured to act as a client within a client-server architecture.
  • the server computer can be configured to receive and store data provided by the device 100 and share the data with other computing devices. This is illustrated in FIG. 30 , which shows an example where, a hospital, nursing home, or elder-care center uses a server computer (or another central computer acting as a hub) 3030 that is configured to receive communications from devices 100 worn by patients or residents 3005 .

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Cardiology (AREA)
  • Physiology (AREA)
  • Pulmonology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Psychiatry (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Hematology (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Vascular Medicine (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)

Abstract

The technology described in this document is embodied in a method that includes obtaining a first data set representing time-varying information on at least one pulse pressure wave at a first body part of a subject. The method also includes obtaining a second data set representing time-varying information about motion of the subject at the first body part of a subject. The method also includes identifying a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part, and identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. Provisional Application 61/894,884, filed on Oct. 23, 2013, and U.S. Provisional Application No. 62/002,531, filed on May 23, 2014, the entire content of which is incorporated herein by reference.
  • TECHNICAL FIELD
  • This document describes technology related to consumer biometric devices.
  • BACKGROUND
  • Various types of sensors can be used for sensing biometric parameters.
  • SUMMARY
  • In one aspect, a method includes obtaining, using a first sensor, a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject. The method also includes obtaining, using a second sensor, a second data set representing time-varying information about motion of the subject at the first body part of a subject. The method also includes identifying, using one or more processors, a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part. The method also includes identifying, using the one or more processors, a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including obtaining a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject. The operations also include obtaining a second data set representing time-varying information about motion of the subject at the first body part of a subject. The operations also include identifying a first point in the first data set. The first point represents an arrival time of the pulse pressure wave at the first body part. The operations also include identifying a second point in the second dataset. The second point represents an earlier time at which the pulse pressure wave traverses a second body part of the subject. The operations also include computing a pulse transit time (PTT) as a difference between the first and second points. The PTT represents a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a first sensor configured to obtain a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject. The device also includes a second sensor configured to obtain a second data set representing time-varying information about motion of the subject at the first body part of a subject. The device also includes memory. The device also includes one or more processors. The one or more processors are configured to receive the first and second data sets. The one or more processors are also configured to identify a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part. The one or more processors are also configured to identify a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The one or more processors are also configured to compute a pulse transit time (PTT) as a difference between the first and second points. The PTT represents a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • Implementations can include one or more of the following features.
  • In some implementations, the information about the at least one pulse pressure wave includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, data including at least one of the first data set and the second data set is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the first body part is an arm of the subject.
  • In some implementations, the first body part is a wrist of the subject.
  • In some implementations, the first sensor includes an optical sensor and the second sensor includes an accelerometer or a gyroscope.
  • In some implementations, identifying the first point includes computing, by the one or more processors, a cross-correlation of a template segment with each of multiple segments of the first dataset. Identifying the first point also includes identifying, based on the computed cross-correlations, at least one candidate segment of the first dataset as including the first point. Identifying the first point also includes identifying, by the one or more processors, a first feature within the identified candidate segment as the first point.
  • In some implementations, identifying the second point includes determining a reference point in the second data set, the reference point corresponding to substantially the same point in time as the first point in the first data set. Identifying the second point also includes identifying one or more target features within a predetermined time range relative to the reference point. Identifying the second point also includes selecting a time point corresponding to one of the target features as the second point.
  • In some implementations, the target features includes at least one of a peak and a valley.
  • In some implementations, the method also includes computing a blood pressure of the subject as a function of the PTT.
  • In some implementations, the blood pressure includes a systolic pressure and a diastolic pressure.
  • In some implementations, a diastolic pressure is calculated as a linear function of the logarithm of the PTT.
  • In some implementations, a systolic pressure is calculated as a linear function of the diastolic pressure.
  • In some implementations, the pre-determined time range is associated with the systole portion of the subject's heartbeat.
  • In some implementations, the method also includes accepting user-input for initiating computation of the PTT.
  • In some implementations, the method also includes computing arterial stiffness as a function of the PTT.
  • In some implementations, the device also includes a mechanism that allows the device to be worn by the subject.
  • In some implementations, the mechanism does not reduce a mobility of the subject.
  • In some implementations, the one or more processors are also configured to compute a blood pressure of the subject as a function of the PTT.
  • In some implementations, the device also includes an input mechanism configured to accept user-input for initiating computation of the PTT.
  • In some implementations, the one or more processors are also configured to compute arterial stiffness as a function of the PTT.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes detecting arrhythmia of the subject based on the data.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The operations also include detecting arrhythmia of the subject based on the data.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to detect arrhythmia of the subject based on the data.
  • Implementations can include one or more of the following features.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the arrhythmia includes atrial fibrillation (AFIB).
  • In some implementations, the arrhythmia includes atrial flutter.
  • In some implementations, the method also includes identifying, based on gross motion data of the subject, one or more period of high activity of the subject.
  • In some implementations, the data that the arrhythmia detection is based on does not include data collected during the one or more periods of high activity.
  • In some implementations, the data that the arrhythmia detection is based on includes data collected during the one or more periods of high activity.
  • In some implementations, processing the data includes plotting R wave to R wave intervals (RRi) versus next consecutive R wave to R wave intervals (RRi+1).
  • In some implementations, processing the data includes determining whether a spread of plotted data points exceeds a predetermined spread value.
  • In some implementations, the method also includes determining that the subject experienced atrial fibrillation (AFIB) if the spread of the plotted data points exceeds the predetermined spread value.
  • In some implementations, processing the data includes determining whether multiple clusters of plotted data points are offset from a diagonal.
  • In some implementations, the method also includes determining that the subject experienced atrial flutter if there are multiple clusters of plotted data points offset from the diagonal.
  • In some implementations, processing the data includes determining one or more of heart rate, heart rate variability, and blood pressure of the subject.
  • In some implementations, determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys.
  • In some implementations, determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, atrial fibrillation is detected if the heart rate variability of the subject crosses a threshold.
  • In some implementations, determining the blood pressure of the subject includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. Determining the blood pressure of the subject also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. Determining the blood pressure of the subject also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject, wherein the PTT is related to an elasticity of one or more blood vessels of the subject. Determining the blood pressure of the subject also includes determining the blood pressure of the subject based on the elasticity of the one or more blood vessels.
  • In some implementations, the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • In some implementations, processing the data includes plotting R wave to R wave intervals (RRi) versus next consecutive R wave to R wave intervals (RRi+1).
  • In some implementations, processing the data includes determining whether a spread of plotted data points exceeds a predetermined spread value.
  • In some implementations, the processor is also configured to determine that the subject experienced atrial fibrillation (AFIB) if the spread of the plotted data points exceeds the predetermined spread value.
  • In some implementations, processing the data includes determining whether multiple clusters of plotted data points are offset from a diagonal.
  • In some implementations, the processor is also configured to determine that the subject experienced atrial flutter if there are multiple clusters of plotted data points offset from the diagonal.
  • In another aspect, a method includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects acquired at a location of each of the subjects. The method also includes processing data that represents time-varying information about motion of the one or more subjects acquired at the location of each of the subjects. The method also includes determining, based on the data, a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects acquired at a location of each of the subjects. The operations also include processing data that represents time-varying information about motion of the one or more subjects acquired at the location of each of the subjects. The operations also include determining, based on the data, a quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects.
  • In another aspect, a biofeedback device configured to be worn by one or more subjects includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to determine, based on the data, a quality of care provided to one or more subjects by a care facility that cares for the one or more subjects.
  • Implementations can include one or more of the following features.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subjects includes photoplethysmographic (PPG) data and the information about motion of the subjects includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at single locations of each of the subjects.
  • In some implementations, the data is acquired by devices worn by the subjects.
  • In some implementations, the devices are mobile and do not reduce mobility of the subjects.
  • In some implementations, the devices process the data.
  • In some implementations, the single location of each of the subjects is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, determining a quality of care provided to the one or more subjects includes determining a level of physical activity experienced by each of the one or more subjects by comparing gross motion data of each subject to a threshold value.
  • In some implementations, the threshold is based on a metric defined by a health organization.
  • In some implementations, the level of physical activity includes an amount of time that each subject has exercised over a particular time period.
  • In some implementations, the level of physical activity includes an amount of time or a distance that each subject has walked over a particular time period.
  • In some implementations, the method also includes processing data that represents information about an amount of ultraviolet light that each of the one or more subjects has been exposed to over a particular time period.
  • In some implementations, the method also includes determining an amount of time that each of the one or more subjects has spent outside over the particular time period based on the information about the ultraviolet light.
  • In some implementations, the method also includes comparing the quality of care provided by the care facility to a quality of care provided by another care facility that cares for one or more other subjects.
  • In some implementations, the device also includes an ultraviolet light sensor configured to measure levels of ultraviolet light that each of the one or more subjects is exposed to over a particular time period.
  • In some implementations, the processor is also configured to process data that represents information about the levels of ultraviolet light that each of the one or more subjects is exposed to over the particular time period.
  • In some implementations, the processor is also configured to determine an amount of time that each of the one or more subjects has spent outside over the particular time period based on the information about the levels of ultraviolet light.
  • In some implementations, determining the quality of care provided to the one or more subjects includes determining a level of physical activity experienced by each of the one or more subjects by comparing gross motion data of each subject to a threshold value.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation associated with risk indicated by the data.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation associated with risk indicated by the data.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor. The processor is also configured to process the data to determine whether the subject is in a situation associated with risk and to derive a measure of a level of risk associated with the subject.
  • Implementations can include one or more of the following features.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the method also includes using the processed data to derive a measure of a level of risk associated with the subject.
  • In some implementations, the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. The method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In some implementations, the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • In some implementations, the method also includes determining a blood pressure of the subject based on the PTT.
  • In some implementations, the risk includes trauma to the subject and the data is indicative of the existence of the trauma.
  • In some implementations, the method also includes providing the processed data to a party that is responding to the trauma.
  • In some implementations, the processed data is transmitted from a device worn by the subject to a remote device.
  • In some implementations, the remote device is a server associated with an emergency service provider.
  • In some implementations, the processed data is provided to the party before the party has reached the subject.
  • In some implementations, the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in additional subjects acquired at a location of each of the subjects. The method also includes processing data that represents time-varying information about motion of the additional subjects acquired at the location of each of the subjects. The data is acquired while the additional subjects are in the situation associated with the risk, and the risk includes trauma.
  • In some implementations, the method also includes providing the processed data for the subject and the additional subjects to a party that is responding to the trauma, before the party has reached the subjects.
  • In some implementations, the processed data is transmitted from devices worn by the subjects to a remote device.
  • In some implementations, the remote device is a server associated with an emergency service provider.
  • In some implementations, the method also includes providing information to the party that enables the party to assess a level of risk associated with each of the subjects before the party has reached the subjects.
  • In some implementations, the method also includes providing the processed data to a medical facility to which the subject is taken for medical care.
  • In some implementations, the risk includes trauma.
  • In some implementations, providing the processed data to a medical facility includes providing the processed data to an urgent care division of the medical facility.
  • In some implementations, the information is provided to the urgent care division before the subject is treated by the urgent care division.
  • In some implementations, the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in additional subjects acquired at a location of each of the subjects. The method also includes processing data that represents time-varying information about motion of the additional subjects acquired at the location of each of the subjects. The data is acquired while the additional subjects are in the situation associated with the risk.
  • In some implementations, providing the processed data to a medical facility includes providing the processed data to an urgent care division of the medical facility.
  • In some implementations, the information is provided to the urgent care division before one or more of the subjects are treated by the urgent care division.
  • In some implementations, the subjects are treated in an order that is based on a severity of an injury.
  • In some implementations, relatively more severely injured subjects are treated before relatively less severely injured subjects.
  • In some implementations, the processed data is used to determine the subject's compliance with a particular standard of care throughout a progression of steps of the standard of care.
  • In some implementations, the processed data is used to determine whether the subject is receiving care that is appropriate according to a particular standard of care.
  • In some implementations, the data is processed after the subject is in the situation associated with risk.
  • In some implementations, the processing of the data occurs after the data has been acquired and with a short enough delay to enable an effect of the risk to be resolved.
  • In some implementations, the situation includes firefighting.
  • In some implementations, the situation includes a natural disaster or a sudden act of violence.
  • In some implementations, the risk includes one or more of heart failure, emotional stress, abnormal skin temperature, abnormal body temperature, hypertension, heart attack, stroke, arrhythmia, exhaustion, and anxiety.
  • In some implementations, the method also includes determining one or more of a blood pressure, a skin temperature, a body temperature, a heart rate, and a heart rate variability of the subject based on the datasets. The method also includes detecting emotional stress in the subject by determining whether one or more of the determined blood pressure, heart rate, and heart rate variability of the subject is a predetermined amount above a threshold.
  • In some implementations, the data indicates that the subject is about to experience an effect of one of the risks.
  • In some implementations, the risk includes overexposure of the subject to ultraviolet light.
  • In some implementations, the method also includes processing data that represents information about an amount of ultraviolet light that the subject has been exposed to.
  • In some implementations, the method also includes comparing the amount of ultraviolet light that the subject has been exposed to a threshold to determine whether the subject has been overexposed to ultraviolet light.
  • In some implementations, the method also includes alerting the subject if the subject has been overexposed to ultraviolet light.
  • In some implementations, the risk includes trauma to the subject and the data is indicative of the existence of the trauma.
  • In some implementations, the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In some implementations, the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The processor is also configured to receive and process the data from the motion sensor.
  • In some implementations, the processor is also configured to cause the biofeedback device to provide the processed data to a party that is responding to the trauma.
  • In some implementations, the processor is also configured to cause the biofeedback device to provide the processed data to a remote device.
  • In some implementations, the remote device is a server associated with an emergency service provider.
  • In some implementations, the processor is also configured to cause the biofeedback device to provide the processed data to a medical facility to which the subject is taken for medical care.
  • In some implementation, the device also includes a transceiver configured to provide the processed data.
  • In some implementations, the processed data is used to determine the subject's compliance with a particular standard of care throughout a progression of steps of the standard of care.
  • In some implementations, the processed data is used to determine whether the subject is receiving care that is appropriate according to a particular standard of care.
  • In some implementations, the risk includes overexposure of the subject to ultraviolet light.
  • In some implementations, the device also includes an ultraviolet light sensor configured to measure an amount of ultraviolet light that the subject is exposed to.
  • In some implementations, the processor is also configured to process data that represents information about the amount of ultraviolet light that the subject is exposed to.
  • In some implementations, the processor is also configured to compare the amount of ultraviolet light that the subject is exposed to a threshold to determine whether the subject has been overexposed to ultraviolet light.
  • In some implementations, the device is also configured to alert the subject if the subject has been overexposed to ultraviolet light.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes providing information related to the data to a remote device.
  • In another aspect, a system includes a remote device and a biofeedback device configured to be worn by a subject. The biofeedback device includes a light source configured to emit light toward the skin of the subject. The biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor. The processor is also configured to provide information related to the data to a remote device.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The operations also include providing information related to the data to a remote device.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor. The processor is also configured to provide information related to the data to a remote device.
  • Implementations can include one or more of the following features.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the remote device is a server.
  • In some implementations, the method also includes determining, based on the data in the first and second datasets, that the subject is experiencing or has experienced a health-related problem.
  • In some implementations, the method also includes causing the remote device to alert one or both of a caregiver and the subject that the subject is experiencing or has experienced a health-related problem.
  • In some implementations, the method also includes causing the remote device to alert the subject that the subject is experiencing a health-related problem.
  • In some implementations, the remote device sends an alert to a device worn by the subject that acquires the data.
  • In some implementations, the remote device sends an alert to a mobile phone of the subject.
  • In some implementations, determining that the subject is experiencing or has experienced a health-related problem includes determining whether a blood pressure of the subject satisfies a threshold.
  • In some implementations, wherein the health-related problem is hypertension.
  • In some implementations, determining that the subject is experiencing or has experienced a health-related problem includes determining a rate of change of a blood pressure of the subject.
  • In some implementations, the medical event is a stroke, and the subject is determined to be having a stroke if the rate of change of the blood pressure of the subject is positive and above a threshold.
  • In some implementations, the medical event is abnormal heart function, and the subject is determined to be experiencing abnormal heart function if the rate of change of the blood pressure of the subject is negative and below a threshold.
  • In some implementations, the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. The method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In some implementations, the blood pressure of the subject is determined based on the PTT.
  • In some implementations, the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • In some implementations, determining that the subject is experiencing a health-related problem includes determining whether a heart rate of the subject satisfies a threshold.
  • In some implementations, the health-related problem is tachycardia.
  • In some implementations, determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys in the first dataset.
  • In some implementations, determining that the subject is experiencing a health-related problem includes determining whether a heart rate variability of the subject satisfies a threshold.
  • In some implementations, the threshold is based on whether the subject experiences arrhythmia.
  • In some implementations, determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys.
  • In some implementations, determining that the subject has experienced a health-related problem includes determining whether the subject has sustained an impact of a magnitude that satisfies a threshold.
  • In some implementations, determining the magnitude of the impact includes analyzing gross motion data of the subject at the time of the impact.
  • In some implementations, the health-related problem is a concussion.
  • In some implementations, the method also includes determining, based on the data in the first and second datasets, that the subject is about to experience a health-related problem.
  • In some implementations, the method also includes causing the remote device to alert a caregiver that the subject is about to experience a health-related problem.
  • In some implementations, the method also includes causing the remote device to alert the subject that the subject is about to experience a health-related problem.
  • In some implementations, the remote device sends an alert to a device worn by the subject that acquires the data.
  • In some implementations, the remote device sends an alert to a mobile phone of the subject.
  • In some implementations, determining that the subject is about to experience a health-related problem includes determining whether a blood pressure of the subject satisfies a threshold.
  • In some implementations, the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. The method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In some implementations, the blood pressure of the subject is determined based on the PTT.
  • In some implementations, the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • In some implementations, determining that the subject is about to experience a health-related problem includes determining whether a heart rate of the subject satisfies a threshold.
  • In some implementations, determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys in the first dataset.
  • In some implementations, determining that the subject is about to experience a health-related problem includes determining whether a heart rate variability of the subject satisfies a threshold.
  • In some implementations, determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys.
  • In some implementations, the method also includes providing location information related to the subject to the remote device.
  • In some implementations, the location information is provided by a location module of a device worn by the subject that acquires the data.
  • In some implementations, the location module is a GPS transponder.
  • In some implementations, the method also includes providing temperature information related to the subject to the remote device.
  • In some implementations, the remote device is a thermostat.
  • In some implementations, the subject is remote from a location that is temperature-controlled by the thermostat.
  • In some implementations, the thermostat is configured to adjust its temperature settings based on the temperature information related to the subject.
  • In some implementations, a time when the thermostat adjusts its temperature settings is based on the location information related to the subject.
  • In some implementations, the thermostat adjusts its temperature settings when the location information indicates that the subject is within a predefined distance from a location that is temperature-controlled by the thermostat.
  • In some implementations, the remote device is a light.
  • In some implementations, the subject is remote from a location that can be illuminated by the light.
  • In some implementations, the light is configured to adjust its lighting settings at a time that is based on the location information related to the subject.
  • In some implementations, the light adjusts its lighting settings when the location information indicates that the subject is within a predefined distance from a location that is lighting-controlled by the light.
  • In some implementations, the method also includes determining that the subject is interacting with a particular object based on a location of the subject.
  • In some implementations, the remote device is a server.
  • In some implementations, the particular object is an advertisement.
  • In some implementations, the particular object is a product display.
  • In some implementations, the particular object is a retail product.
  • In some implementations, the location of the subject is determined by a GPS module of a device worn by the subject that acquires the data.
  • In some implementations, the location of the subject is determined based on a strength of a wireless connection between a device worn by the subject that acquires the data and one or more proximity sensors.
  • In some implementations, a relatively higher strength of the wireless connection between the device and the proximity sensor indicates that the device is relatively closer to the proximity sensor.
  • In some implementations, the wireless connection is a Bluetooth connection.
  • In some implementations, the method also includes determining, based on the processed data, that the subject is experiencing one or more of an increase in heart rate, blood pressure, and respiratory rate while the subject is interacting with the particular object.
  • In some implementations, the method also includes inferring that the subject is interested in the particular object based on one or more of the heart rate, the blood pressure, and the respiratory rate of the subject while the subject is interacting with the particular object.
  • In some implementations, the remote device is an entertainment device.
  • In some implementations, the entertainment device is a television.
  • In some implementations, the entertainment device is an audio output device.
  • In some implementations, the entertainment device is a gaming device.
  • In some implementations, the processed data indicates whether the subject has exercised for a predetermined length of time, and the entertainment device can be turned on only if the subject has exercised for the predetermined length of time.
  • In some implementations, the entertainment device is configured to provide content personalized for the subject based on a state of the subject as determined from the processed data.
  • In some implementations, the state of the subject includes a level of interest in the content provided by the entertainment device.
  • In some implementations, a rise in one or more of a heart rate, a heart rate variability, an electrical skin impedance, a respiratory rate, and a blood pressure of the subject while the subject is experiencing the content indicates an increased level of interest in the content.
  • In some implementations, the heart rate, the heart rate variability, the electrical skin impedance, and the respiratory rate of the subject are determined from the processed data.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The blood pressure of the subject is determined from the processed data.
  • In some implementations, the entertainment device provides content designed to excite the subject if the heart rate variability of the subject is within a predefined range.
  • In some implementations, the entertainment device provides content designed to excite the subject if one or more of the heart rate, the electrical skin impedance, the respiratory rate, and the blood pressure of the subject is below a respective threshold.
  • In some implementations, the state of the subject includes a level of stress of the subject while the subject is experiencing the content.
  • In some implementations, a rise in one or more of a heart rate, a heart rate variability, an electrical skin impedance, a respiratory rate, and a blood pressure of the subject while the subject is experiencing the content indicates an increased level of interest in the content.
  • In some implementations, the heart rate, the heart rate variability, the electrical skin impedance, and the respiratory rate of the subject are determined from the processed data.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject, wherein the blood pressure of the subject is determined from the processed data.
  • In some implementations, the entertainment device provides content designed to calm the subject if the heart rate variability of the subject is within a predefined range.
  • In some implementations, the entertainment device provides content designed to calm the subject if one or more of the heart rate, the electrical skin impedance, the respiratory rate, and the blood pressure of the subject is above a respective threshold.
  • In some implementations, the entertainment device is a television and the content includes one or more of television shows, movies, and games.
  • In some implementations, the entertainment device is a gaming device that is configured to adjust game settings based on a state of the subject as determined from the processed data.
  • In some implementations, game settings include one or more of difficulty settings, sound settings, and situational settings.
  • In some implementations, the entertainment device is configured to turn off based on a state of the subject as determined from the processed data.
  • In some implementations, the method also includes causing the remote device to adjust a dating preference in a dating profile of the subject based on a state of the subject as determined from the processed data.
  • In some implementations, the method also includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in one or more other subjects acquired at locations on the other subjects. The method also includes processing data that represents time-varying information about motion of the one or more other subjects acquired at the locations on the other subjects. The method also includes determining a compatibility between the subject and each of the other subjects based on states of the subjects as determined from the data.
  • In some implementations, the method also includes ranking the compatibilities between the subject and each of the other subjects.
  • In some implementations, the remote device is a device operated by the subject.
  • In some implementations, the method also includes determining, based on the data in the first and second datasets, that the subject is not adequately alert.
  • In some implementations, determining that the subject is not adequately alert is based on one or more of a heart rate, a respiratory rate, a blood pressure, and an activity level of the subject.
  • In some implementations, determining that the subject is not adequately alert includes determining, based on the processed data, whether one or more of the heart rate, the respiratory rate, the blood pressure, and the activity level of the subject is below a threshold.
  • In some implementations, the method also includes causing the device to activate an alarm if the subject is not adequately alert.
  • In some implementations, the method also includes causing the device to slow down if the subject is not adequately alert.
  • In some implementations, the device is a vehicle.
  • In some implementations, the data is acquired by the device and the device is wearable by the subject.
  • In some implementations, the method also includes causing an alarm of the wearable device to be activated if the subject is not adequately alert.
  • In some implementations, the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The processor is also configured to receive data from the motion sensor.
  • In some implementations, the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In some implementations, the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The processor is also configured to receive data from the motion sensor.
  • In some implementations, the processor is also configured to determine, based on the received data, that the subject is experiencing or has experienced a health-related problem.
  • In some implementations, the processor is also configured to determine, based on the received data, that the subject is about to experience a health-related problem.
  • In some implementations, the processor is also configured to cause the remote device to alert a caregiver that the subject is experiencing, has experienced, or is about to experience a health-related problem.
  • In some implementations, the processor is also configured to cause the remote device to alert the subject that the subject is experiencing, has experienced, or is about to experience a health-related problem.
  • In some implementations, the remote device sends an alert to the biofeedback device.
  • In some implementations, the remote device sends an alert to a mobile phone of the subject.
  • In some implementations, the processor is also configured to provide location information related to the subject to the remote device.
  • In some implementation, the biofeedback device also includes a location module configured to provide the location information related to the subject to the remote device.
  • In some implementations, the location module is a GPS transponder.
  • In some implementations, the processor is also configured to provide temperature information related to the subject to the remote device.
  • In some implementations, the processor is also configured to determine that the subject is interacting with a particular object based on a location of the subject.
  • In some implementations, the remote device is a server.
  • In some implementations, the particular object is an advertisement.
  • In some implementations, the particular object is a product display.
  • In some implementations, the particular object is a retail product.
  • In some implementations, the location of the subject is determined by the GPS module of the biofeedback device.
  • In some implementations, the location of the subject is determined based on a strength of a wireless connection between the biofeedback device and one or more proximity sensors.
  • In some implementations, a relatively higher strength of the wireless connection between the biofeedback device and the proximity sensor indicates that the biofeedback device is relatively closer to the proximity sensor.
  • In some implementations, the wireless connection is a Bluetooth connection.
  • In some implementations, the remote device is a device operated by the subject.
  • In some implementations, the processor is also configured to determine, based on the received data, that the subject is not adequately alert.
  • In some implementations, the processor is also configured to cause the biofeedback device to activate an alarm if the subject is not adequately alert.
  • In some implementations, the processor is also configured to cause the device operated by the subject to slow down if the subject is not adequately alert.
  • In some implementations, the device is a vehicle.
  • In another aspect, a method includes deriving a score associated with a state of a subject, the state of the subject being one or more members selected from the group consisting of health, sleep, fitness, and stress. Deriving the score is based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired at a location of the subject.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including deriving a score associated with a state of a subject. The state of the subject is one or more members selected from the group consisting of health, sleep, fitness, and stress. Deriving the score is based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired at a location of the subject.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to derive a score associated with a state of the subject, the state of the subject being one or more members selected from the group consisting of health, sleep, fitness, and stress.
  • Implementations can include one or more of the following features.
  • In some implementations, deriving the score is also based on data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the score is a numerical value.
  • In some implementations, the numerical value is between 1 and 100.
  • In some implementations, the numerical value is between 1 and 10.
  • In some implementations, the data is acquired by a device that is worn by the subject and that displays the score.
  • In some implementations, the device worn by the subject derives the score.
  • In some implementations, the device worn by the subject provides the data to a remote device that derives the score.
  • In some implementations, the remote device is a server.
  • In some implementations, the remote device provides the score to the device worn by the subject.
  • In some implementations, the remote device provides the score to a mobile phone of the subject.
  • In some implementations, the score is provided to one or both of the subject and another party.
  • In some implementations, the state of the subject includes a sleep state, and the score includes a sleep score.
  • In some implementations, the sleep score is associated with a level of quality of the subject's sleep.
  • In some implementations, deriving the score includes identifying one or more potential sleep rest periods of the subject based on gross motion data of the subject.
  • In some implementations, deriving the score also includes calculating one or more of an average heart rate, a standard deviation of the average heart rate, and an average heart rate variability of the subject during each of the one or more potential sleep rest periods based on the information about at least one pulse pressure wave propagating through blood in the subject.
  • In some implementations, one or more of the potential sleep rest periods are identified as sleep rest periods by comparing one or more of the average heart rate, the standard deviation of the average heart rate, and the average heart rate variability of the subject during the respective potential sleep rest period to a threshold.
  • In some implementations, the sleep state of the subject is associated with one or more of sleep duration, sleep latency, and sleep staging.
  • In some implementations, deriving the score includes determining one or more of the sleep duration, the sleep latency, and the sleep staging of the subject.
  • In some implementations, the method also includes determining the sleep duration of the subject.
  • In some implementations, determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • In some implementations, the information related to the one or more sleep rest periods includes a time associated with a beginning of each sleep rest period, a time associated with an end of each sleep rest period, gross motion data of the subject during each sleep rest period, and heart rate data of the subject during each sleep rest period.
  • In some implementations, determining the sleep duration of the subject includes determining a percentage of time that the subject was asleep between a time when the subject started to try to fall asleep and a time when the subject awoke based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • In some implementations, the method also includes determining the sleep latency of the subject.
  • In some implementations, determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • In some implementations, the method also includes determining the sleep staging of the subject.
  • In some implementations, determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • In some implementations, the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • In some implementations, the data is acquired by a device that is worn by the subject.
  • In some implementations, the method also includes causing the device to calculate and display the sleep score when the subject is determined to have awoken.
  • In some implementations, the method also includes providing information to the subject that assists the subject in improving the sleep score.
  • In some implementations, the information includes a recommended sleep schedule.
  • In some implementations, the information is provided to a device that is worn by the subject that acquires the data.
  • In some implementations, the information is provided to a mobile phone of the subject.
  • In some implementations, the state of the subject includes a fitness state, and the score includes a fitness score.
  • In some implementations, the fitness score is associated with one or more of a degree of physical fitness, cardiac condition, coaching, dehydration, social interaction, adherence to a regimen, and coaching effectiveness of the subject.
  • In some implementations, deriving the score includes calculating a resting heart rate of the subject while the subject is inactive based on the information about at least one pulse pressure wave propagating through blood in the subject and gross motion data of the subject.
  • In some implementations, deriving the score also includes calculating a heart rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject. Deriving the score also includes determining that the subject is in the fitness state based on the heart rate and the gross motion data of the subject.
  • In some implementations, deriving the score includes determining a length of time that it takes for the subject's heart rate to transition from the heart rate in the fitness state to the resting heart rate.
  • In some implementations, deriving the score includes determining a length of time that it takes for the subject's heart rate to transition from the resting heart rate to the heart rate in the fitness state.
  • In some implementations, the data is acquired by a device that is worn by the subject.
  • In some implementations, the method also includes causing the device to calculate and display the fitness score when the subject is determined to be in the fitness state.
  • In some implementations, the method also includes causing the device to calculate and display the fitness score when the subject is determined to have transitioned from the fitness state to a non-fitness state.
  • In some implementations, the method also includes providing information to the subject that assists the subject in improving the fitness score.
  • In some implementations, the information includes a recommended fitness routine.
  • In some implementations, the information is provided to a device that is worn by the subject that acquires the data.
  • In some implementations, the information is provided to a mobile phone of the subject.
  • In some implementations, the method also includes embedding a visual indication of one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject into a video showing the subject performing a fitness routine.
  • In some implementations, the visual indications are updated throughout the video according to the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the fitness routine.
  • In some implementations, the method also includes predicting an outcome of an athletic event that the subject is participating in based on one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject during the athletic event.
  • In some implementations, the method also includes comparing one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of other individuals who are participating in the athletic event.
  • In some implementations, the method also includes, while the subject is performing physical activity, comparing one or more of the fitness score, a heart rate, a respiratory rate, and a blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of one or more individuals who have previously performed the physical activity.
  • In some implementations, performing the physical activity includes performing an athletic event, and the one or more individuals are professional athletes who compete in the athletic event.
  • In some implementations, the state of the subject includes a stress state, and the score includes a stress score.
  • In some implementations, deriving the score includes calculating one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject and information about motion of the subject.
  • In some implementations, the stress state of the subject is associated with hypertension, and deriving the score includes determining whether the subject is experiencing hypertension by comparing a blood pressure of the subject to a threshold.
  • In some implementations, the stress state of the subject is associated with emotional stress, and deriving the score includes determining a level of emotional stress experienced by the subject by comparing one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject to a threshold.
  • In some implementations, determining the level of emotional stress experienced by the subject is based at least in part on audio data.
  • In some implementations, the audio data is captured by a microphone of a device that acquires the data in the first dataset.
  • In some implementations, the audio data includes one or both of environmental noise and a tonality of the subject's voice.
  • In some implementations, determining the level of emotional stress experienced by the subject includes analyzing the environmental noise to determine whether the subject is in an environment attributed to an increased emotional stress level.
  • In some implementations, determining the level of emotional stress experienced by the subject includes analyzing the tonality of the subject's voice to determine whether the subject is in a confrontational situation attributed to an increased emotional stress level.
  • In some implementations, the data is acquired by a device that is worn by the subject.
  • In some implementations, the method also includes causing the device to calculate and display the stress score when the subject is determined to be in the stress state.
  • In some implementations, the method also includes providing information to the subject that assists the subject in improving the stress score.
  • In some implementations, the information includes a recommended stress-reducing routine.
  • In some implementations, the information is provided to a device that is worn by the subject that acquires the data.
  • In some implementations, the information is provided to a mobile phone of the subject.
  • In some implementations, the state of the subject includes a sleep state, and the score includes a sleep score.
  • In some implementations, the sleep state of the subject is associated with one or more of sleep duration, sleep latency, and sleep staging, and deriving the score includes determining one or more of the sleep duration, the sleep latency, and the sleep staging of the subject.
  • In some implementations, the processor is also configured to determine the sleep duration of the subject.
  • In some implementations, determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • In some implementations, determining the sleep duration of the subject includes determining a percentage of time that the subject was asleep between a time when the subject started to try to fall asleep and a time when the subject awoke based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • In some implementations, the processor is also configured to determine the sleep latency of the subject.
  • In some implementations, determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • In some implementations, the processor is also configured to determine the sleep staging of the subject.
  • In some implementations, determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • In some implementations, the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • In some implementations, the biofeedback device also includes a display, and the processor is also configured to cause the display to display the sleep score.
  • In some implementations, the processor causes the display to display the sleep score when the subject is determined to have awoken.
  • In some implementations, the state of the subject includes a fitness state, and the score includes a fitness score.
  • In some implementations, the fitness score is associated with one or more of a degree of physical fitness, cardiac condition, coaching, dehydration, social interaction, adherence to a regimen, and coaching effectiveness of the subject.
  • In some implementations, deriving the score includes calculating a resting heart rate of the subject while the subject is inactive based on the information about at least one pulse pressure wave propagating through blood in the subject and gross motion data of the subject.
  • In some implementations, deriving the score also includes calculating a heart rate of the subject based on the information about at least one pulse pressure wave propagating through blood in the subject. Deriving the score also includes determining that the subject is in the fitness state based on the heart rate and the gross motion data of the subject.
  • In some implementations, deriving the score also includes determining a length of time that it takes for the subject's heart rate to transition from the heart rate in the fitness state to the resting heart rate.
  • In some implementations, deriving the score also includes determining a length of time that it takes for the subject's heart rate to transition from the resting heart rate to the heart rate in the fitness state.
  • In some implementations, the processor is also configured to cause the display to display the fitness score.
  • In some implementations, the processor causes the display to display the fitness score when the subject is determined to be in the fitness state.
  • In some implementations, the processor causes the display to display the fitness score when the subject is determined to have transitioned from the fitness state to a non-fitness state.
  • In some implementations, the processor is also configured to determine one or more of a heart rate, a respiratory rate, and a blood pressure of the subject based on data received from one or more of the light-emitting element, the optical sensor, and the motion sensor.
  • In some implementations, the device also includes a transceiver, and the processor is configured to cause the transceiver to provide one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to a remote device.
  • In some implementations, the processor causes the transceiver to provide one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to a video that shows the subject performing a fitness routine. A visual indication of one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject is embedded into the video.
  • In some implementations, the visual indications are updated throughout the video according to the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the fitness routine.
  • In some implementations, the processor is also configured to predict an outcome of an athletic event that the subject is participating in based on one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject during the athletic event.
  • In some implementations, the transceiver is configured to communicate with transceivers of other biofeedback devices.
  • In some implementations, the processor is also configured to compare one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of other individuals who are participating in the athletic event.
  • In some implementations, the processor is also configured to, while the subject is performing physical activity, compare one or more of the fitness score, the heart rate, the respiratory rate, and the blood pressure of the subject to fitness scores, heart rates, respiratory rates, and blood pressures of one or more individuals who have previously performed the physical activity.
  • In some implementations, performing the physical activity includes performing an athletic event, and the one or more individuals are professional athletes who compete in the athletic event.
  • In some implementations, the state of the subject includes a stress state, and the score includes a stress score.
  • In some implementations, the stress state of the subject is associated with emotional stress, and deriving the score includes determining a level of emotional stress experienced by the subject by comparing one or more of a heart rate, a heart rate variability, a blood pressure, an electrical skin impedance, and a respiratory rate of the subject to a threshold.
  • In some implementations, the biofeedback device also includes an audio input device.
  • In some implementations, determining the level of emotional stress experienced by the subject is based at least in part on audio data provided to the processor by the audio input device.
  • In some implementations, the audio data includes one or both of environmental noise and a tonality of the subject's voice.
  • In some implementations, determining the level of emotional stress experienced by the subject includes analyzing the environmental noise to determine whether the subject is in an environment attributed to an increased emotional stress level
  • In some implementations, determining the level of emotional stress experienced by the subject includes analyzing the tonality of the subject's voice to determine whether the subject is in a confrontational situation attributed to an increased emotional stress level.
  • In some implementations, the processor is also configured to cause the display to display the stress score.
  • In some implementations, the processor causes the display to display the stress score when the subject is determined to be in the stress state.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes deriving information about a psychological state of the subject from the processed data.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The operations also include deriving information about a psychological state of the subject from the processed data.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to derive information about a psychological state of the subject from the processed data.
  • Implementations can include one or more of the following features.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the psychological state of the subject includes a state of stress.
  • In some implementations, the method also includes determining one or more of a blood pressure, a heart rate, and a heart rate variability of the subject based on the datasets. The method also includes deriving information about the state of stress of the subject based on one or more of the determined blood pressure, heart rate, and heart rate variability of the subject.
  • In some implementations, the method also includes correlating a level of stress of the subject to an amount of ultraviolet light that the subject has been exposed to.
  • In some implementations, deriving the information includes inferring a relationship between at least some of the processed data and one psychological state of the subject.
  • In some implementations, the method also includes inferring an existence of a second psychological state of the subject by comparing other processed data with the processed data related to the one psychological state.
  • In some implementations, the one psychological state includes a state of relatively lower stress.
  • In some implementations, the one psychological state includes a baseline state of the subject, and the relationship between at least some of the processed data and the one psychological state is inferred prior to the subject performing a polygraph test.
  • In some implementations, the psychological state includes a malicious intent.
  • In some implementations, the psychological state includes lying.
  • In some implementations, a device worn by the subject acquires the data.
  • In some implementations, deriving information about the psychological state of the subject includes determining a baseline state of the subject based on one or more of a blood pressure, a heart rate, a heart rate variability, a respiratory rate, and an electrical skin impedance.
  • In some implementations, the device is worn by the subject for an extended period of time to determine the baseline state of the subject.
  • In some implementations, the device is continuously worn by the subject for more than one day.
  • In some implementations, the processor is also configured to determine one or more of a blood pressure, a heart rate, and a heart rate variability of the subject based on the received data. The processor is also configured to derive information about a state of stress of the subject based on one or more of the determined blood pressure, heart rate, and heart rate variability of the subject.
  • In some implementations, the device also includes an ultraviolet light sensor configured to measure an amount of ultraviolet light that the subject is exposed to.
  • In some implementations, the processor is also configured to correlate a level of stress of the subject to an amount of ultraviolet light that the subject has been exposed to.
  • In another aspect, a method includes processing data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • In another aspect, a method includes processing data in a dataset that represents time-varying information about motion of a subject acquired at a location of the subject. The method also includes determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes, based on the first and second datasets, determining at least two parameters of the subject, the parameters selected from the group consisting of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology. The method also includes determining a biometric signature of the subject, the biometric signature represented by a multi-dimensional space that is defined by at least two axes, each axis corresponding to at least one of the determined parameters. The method also includes determining whether the biometric signature was captured from a subject who is an expected subject by analyzing features of the biometric signature.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The operations also include determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a dataset that represents time-varying information about motion of a subject acquired at a location of the subject. The operations also include determining whether one or more segments of the dataset were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The operations also include determining at least two parameters of the subject based on the first and second datasets. The parameters are selected from the group consisting of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology. The operations also include determining a biometric signature of the subject. The biometric signature is represented by a multi-dimensional space that is defined by at least two axes. Each axis corresponds to at least one of the determined parameters. The operations also include determining whether the biometric signature was captured from a subject who is an expected subject by analyzing features of the biometric signature.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The biofeedback device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The biofeedback device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor. The processor is also configured to determine whether one or more segments of the data were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • Implementations can include one or more of the following features.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the determining includes analyzing other biometric data.
  • In some implementations, the other biometric data includes one or more of electrical skin impedance, respiratory rate, heart rate, heart rate variability, PPG morphology, and vocal sound frequency of the subject.
  • In some implementations, analyzing the other biometric data includes determining whether the subject is under distress.
  • In some implementations, the determining includes analyzing confidential information provided by the subject.
  • In some implementations, the confidential information includes one or more of a password, a personal identification number, and a predefined gesture.
  • In some implementations, the analyzing includes comparing morphological features of different segments of biometric data.
  • In some implementations, the method also includes taking an action when it is determined that one or more of the segments were captured from a subject other than the expected subject.
  • In some implementations, taking an action includes prompting the subject to provide confidential information to authenticate the subject as the expected subject.
  • In some implementations, the expected subject is a subject associated with a particular device that captures the data segments at a location on the expected subject.
  • In some implementations, the determining includes taking account of one or both of a changing level of activity and a changing heart rate of the subject.
  • In some implementations, the method also includes sending information to a device upon determining that the subject is the expected subject.
  • In some implementations, the device is a payment gateway, and the information includes a payment authorization.
  • In some implementations, the device is a lock, and the information causes a lock to unlock.
  • In some implementations, causing the lock to unlock is also based on a location of the subject.
  • In some implementations, the method also includes sending information to a device upon determining that the subject is under distress.
  • In some implementations, the subject is determined to be under distress if one or more of a heart rate, a blood pressure, and a respiratory rate of the subject surpasses a threshold.
  • In some implementations, the device is a payment gateway, and the information includes instructions for the payment gateway to prevent the subject from accessing the payment gateway.
  • In some implementations, the device is a lock, and the information includes instructions for the lock to remain locked.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject. The method also includes determining whether one or more segments of the datasets were captured from a subject other than an expected subject by analyzing morphological features of the segments.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the method also includes determining a pulse transit time (PTT) based on the datasets, the PTT representing a transit time of a pulse pressure wave within the subject.
  • In some implementations, the method also includes determining a blood pressure of the subject based on the datasets.
  • In some implementations, the determining includes analyzing other biometric data.
  • In some implementations, the other biometric data includes one or more of electrical skin impedance, respiratory rate, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, PPG morphology, and vocal sound frequency of the subject.
  • In some implementations, analyzing the other biometric data includes determining whether the subject is under distress.
  • In some implementations, the morphological features include differences in blood pressure at specific times during each of the data segments.
  • In some implementations, the specific times include times of peaks or valleys in blood pressure during the data segments.
  • In some implementations, the morphological features include differences in blood pressure at successive peaks of blood pressure, successive valleys of blood pressure, or successive peaks and valleys of blood pressure.
  • In some implementations, determining whether one or more segments of the data were captured from a subject other than an expected subject includes analyzing confidential information provided by the subject.
  • In some implementations, the confidential information includes one or more of a password, a personal identification number, and a predefined gesture.
  • In some implementations, the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The processor is also configured to receive data from the motion sensor
  • In some implementations, the processor is also configured to take an action when it is determined that one or more of the segments were captured from a subject other than the expected subject.
  • In some implementations, taking an action includes prompting the subject to provide confidential information to authenticate the subject as the expected subject.
  • In some implementations, the motion sensor is also configured to determine when a subject performs the predefined gesture.
  • In some implementations, the biofeedback device also includes a transceiver configured to send information to a device upon determining that the subject is the expected subject.
  • In some implementations, the device is a payment gateway, and the information includes a payment authorization.
  • In some implementations, the device is a lock, and the information causes a lock to unlock.
  • In some implementations, the biofeedback device also includes a location module, and causing the lock to unlock is also based on a location of the subject as determined by the location module.
  • In some implementations, the transceiver is also configured to send information to a device upon determining that the subject is under distress.
  • In some implementations, the device is a payment gateway, and the information includes instructions for the payment gateway to prevent the subject from accessing the payment gateway.
  • In some implementations, the device is a lock, and the information includes instructions for the lock to remain locked.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes providing, based on the data, information about a medication regimen of the subject.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The operations also include providing, based on the data, information about a medication regimen of the subject.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor. The processor is also configured to provide, based on the data, information about a medication regimen of the subject.
  • Implementations can include one or more of the following features.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the method also includes determining, based on the data, that the subject has potentially missed a dose of a medication. The method also includes providing a notification indicating that the subject has potentially missed the dose of the medication.
  • In some implementations, determining that the subject has potentially missed a dose of a medication includes determining that a blood pressure of the subject has crossed a threshold.
  • In some implementations, the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. The method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject. The blood pressure of the subject is determined based on the PTT.
  • In some implementations, the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • In some implementations, determining that the subject has potentially missed a dose of a medication includes determining that a heart rate of the subject has crossed a threshold.
  • In some implementations, determining that the subject has potentially missed a dose of a medication includes determining that a respiratory rate of the subject has crossed a threshold.
  • In some implementations, the method also includes determining, based on the data, a reaction of the subject to a medication. The method also includes providing a recommended medication regimen of the medication based on the reaction of the subject to the medication.
  • In some implementations, the recommended medication regimen includes one or more recommended dosage timings. The recommended medication regimen also includes one or more recommended dosage amounts. Each of the recommended dosage amounts corresponds to one of the dosage timings.
  • In some implementations, determining a reaction of the subject to a medication includes determining a blood pressure of the subject.
  • In some implementations, the blood pressure of the subject is determined periodically.
  • In some implementations, the recommended dosage timings and amounts are determined so as to maintain a blood pressure of the subject within a defined range.
  • In some implementations, determining a reaction of the subject to a medication includes determining a heart rate of the subject.
  • In some implementations, the heart rate of the subject is determined periodically.
  • In some implementations, determining a reaction of the subject to a medication includes determining a regularity of a heart rate of the subject.
  • In some implementations, the recommended dosage timings and amounts are determined so as to maintain a heart rate of the subject within a defined range.
  • In some implementations, determining a reaction of the subject to a medication includes determining a cardiac output of the subject.
  • In some implementations, the recommended dosage timings and amounts are determined so as to maintain a cardiac output of the subject within a defined range.
  • In some implementations, determining a reaction of the subject to a medication includes determining a temperature of the subject.
  • In some implementations, the recommended dosage timings and amounts are determined so as to maintain the temperature of the subject within a defined range.
  • In some implementations, the recommended dosage timings and amounts are determined so as to maintain a heart rate of the subject within a defined range.
  • In some implementations, determining a reaction of the subject to a medication includes determining a respiratory rate of the subject.
  • In some implementations, the respiratory rate of the subject is determined periodically.
  • In some implementations, the recommended dosage timings and amounts are determined so as to maintain a respiratory rate of the subject within a defined range.
  • In some implementations, the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The processor is also configured to receive data from the motion sensor.
  • In some implementations, the processor is also configured to determine, based on the data, that the subject has potentially missed a dose of a medication and provide a notification indicating that the subject has potentially missed the dose of the medication.
  • In some implementations, the processor is also configured to determine, based on the data, a reaction of the subject to a medication and provide a recommended medication regimen of the medication based on the reaction of the subject to the medication.
  • In some implementations, the recommended medication regimen includes one or more recommended dosage timings. The recommended medication regimen also includes one or more recommended dosage amounts, each of which corresponds to one of the dosage timings.
  • In some implementations, the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject
  • In another aspect, a method includes processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects acquired at a location of each of the subjects. The method also includes processing data that represents time-varying information about motion of the two or more subjects acquired at the location on each of the subject. The method also includes providing information to a user that reports relative states of the subjects.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects acquired at a location of each of the subjects. The operations also include processing data that represents time-varying information about motion of the two or more subjects acquired at the location on each of the subject. The operations also include providing information to a user that reports relative states of the subjects.
  • In another aspect, a biofeedback device configured to be worn by two or more subjects includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to provide information to a user that reports relative states of the subjects.
  • Implementations can include one or more of the following features.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subjects includes photoplethysmographic (PPG) data and the information about motion of the subjects includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at single locations of each of the subjects.
  • In some implementations, the data is acquired by devices worn by the subjects.
  • In some implementations, the devices are mobile and do not reduce mobility of the subjects.
  • In some implementations, the devices process the data.
  • In some implementations, the single location of each of the subjects is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the relative states of the subjects are determined based on one or more of respiratory rates, heart rates, and blood pressures of the subjects.
  • In some implementations, the relative states of the subjects are determined by comparing one or more of the respiratory rates, the heart rates, and the blood pressures of the subjects to respective threshold values.
  • In some implementations, devices worn by the subjects acquire the data, and the respiratory rates, the heart rates, and the blood pressures of the subjects are determined according to the data.
  • In some implementations, the method also includes managing the subjects based on the relative states.
  • In some implementations, the method also includes assigning tasks to the subjects based on the relative states of the subjects.
  • In some implementations, one or more of the subjects are put into an athletic contest according to the relative states of the subjects.
  • In some implementations, a subject is put into the athletic contest if one or more of the respiratory rate, the heart rate, and the blood pressure of the subject is above a respective threshold.
  • In some implementations, one or more of the subjects are assigned particular combat tasks according to the relative states of the subjects.
  • In some implementations, a subject is assigned a particular combat task if one or more of the respiratory rate, the heart rate, and the blood pressure of the subject is above a respective threshold.
  • In some implementations, the relative states include one or more of relative psychological states, relative physical states, and relative states of readiness.
  • In some implementations, the two or more subjects are managed based on the relative states.
  • In some implementations, the processor is also configured to assign tasks to the subjects based on the relative states of the subjects.
  • In some implementations, one or more of the subjects are put into an athletic contest according to the relative states of the subjects.
  • In some implementations, one or more of the subjects are assigned particular combat tasks according to the relative states of the subjects.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject while the subject is sleeping. The method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject while the subject is sleeping. The method also includes determining, based on the data, information about a characteristic of the subject's sleep.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject while the subject is sleeping. The operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject while the subject is sleeping. The operations also include determining, based on the data, information about a characteristic of the subject's sleep.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The device also includes a processor configured to receive data from one or more of the light-emitting element, the optical sensor, and the motion sensor. The processor is also configured to determine, based on the data, information about a characteristic of the subject's sleep.
  • Implementations can include one or more of the following features.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the method also includes generating a reduced set of data by excluding data associated with non-sleep periods of the subject.
  • In some implementations, a period of time is identified as a non-sleep period based on gross motion data of the subject.
  • In some implementations, identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is above a threshold.
  • In some implementations, identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is substantially irregular.
  • In some implementations, a period of time is identified as a sleep period based on gross motion data of the subject.
  • In some implementations, identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is below a threshold.
  • In some implementations, identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is substantially flat.
  • In some implementations, the method also includes determining a start and an end of the sleep period.
  • In some implementations, determining the start of the sleep period includes identifying a time when the gross motion data falls below a threshold, and determining the end of the sleep period includes identifying a time when the gross motion data rises above a threshold.
  • In some implementations, the method also includes calculating a property of the sleep of the subject based on the data.
  • In some implementations, the property is associated with one or more of heart rate, heart rate variability, activity level, respiratory rate, and blood pressure of the subject.
  • In some implementations, one or more of the heart rate, the heart rate variability, the activity level, the respiratory rate, and the blood pressure of the subject are determined based on the processed data.
  • In some implementations, determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys.
  • In some implementations, determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, determining the blood pressure of the subject includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. Determining the blood pressure of the subject also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. Determining the blood pressure of the subject also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject, wherein the PTT is related to an internal pressure of one or more blood vessels of the subject. Determining the blood pressure of the subject also includes determining the blood pressure of the subject based on the internal pressure of the one or more blood vessels.
  • In some implementations, the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • In some implementations, the characteristic of the subject's sleep is determined based on the property.
  • In some implementations, the characteristic includes sleep apnea.
  • In some implementations, determining that the subject is experiencing sleep apnea includes identifying a simple signal in a heart rate signal of the subject that is acquired during a sleep period of the subject.
  • In some implementations, determining that the subject is experiencing sleep apnea includes identifying recurring simple signals in the heart rate signal of the subject.
  • In some implementations, the simple signals recur at least every two minutes during the sleep period of the subject.
  • In some implementations, the characteristic includes a quality of the sleep, including one or more of a sleep duration, a sleep latency, a sleep staging, a number of disturbances, and a number of tosses and turns.
  • In some implementations, determining information about a characteristic of the subject's sleep includes determining the sleep duration of the subject.
  • In some implementations, determining the sleep duration of the subject includes determining a total length of time during which the subject was asleep based on information related to one or more sleep rest periods of the subject.
  • In some implementations, the information related to the one or more sleep rest periods includes a time associated with a beginning of each sleep rest period, a time associated with an end of each sleep rest period, gross motion data of the subject during each sleep rest period, and heart rate data of the subject during each sleep rest period.
  • In some implementations, determining information about a characteristic of the subject's sleep includes determining the sleep latency of the subject.
  • In some implementations, determining the sleep latency of the subject includes determining a length of time that it takes for the subject to transition from a state of wakefulness to the sleep state based on information related to one or more sleep rest periods of the subject and gross motion data of the subject before the subject fell asleep.
  • In some implementations, determining information about a characteristic of the subject's sleep includes determining the sleep staging of the subject.
  • In some implementations, determining the sleep staging of the subject includes determining a deepness of the subject's sleep during a portion of each of one or more sleep rest periods of the subject based on information related to the one or more sleep rest periods.
  • In some implementations, the sleep staging of the subject is determined based on at least a heart rate and gross motion data of the subject during one or more of the portions of the sleep rest periods.
  • In some implementations, the method also includes alerting the subject when the sleep duration exceeds a threshold while the subject is in a light sleep stage.
  • In some implementations, the characteristic includes a sleep disorder.
  • In some implementations, the characteristic includes a level of nocturnal dip of blood pressure.
  • In some implementations, the characteristic includes a sleep period.
  • In some implementations, the method also includes deriving a value representing an evaluation of a state of the subject based on the data.
  • In some implementations, the state of the subject includes a health-related state.
  • In some implementations, the state of the subject is associated with one or more of sleep quality, sleep duration, sleep latency, and sleep staging.
  • In some implementations, the value is provided to the subject or to another party.
  • In some implementations, the value is derived based on data related to motion of the subject.
  • In some implementations, the data is acquired by a device that is worn by the subject and that displays the value.
  • In some implementations, the device derives the value.
  • In some implementations, the device provides the data to a remote device that derives the value.
  • In some implementations, the method also includes processing data that represents information about an amount of ultraviolet light that the subject has been exposed to.
  • In some implementations, the method also includes correlating a characteristic of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • In some implementations, the method also includes correlating a quality of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • In some implementations, the method also includes correlating a duration of the subject's sleep to the amount of ultraviolet light that the subject has been exposed to.
  • In some implementations, the processor is also configured to identify a period of time as a non-sleep period based on gross motion data of the subject measured by the motion sensor.
  • In some implementations, identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is above a threshold.
  • In some implementations, identifying the period of time as a non-sleep period includes determining that the gross motion data during the period of time is substantially irregular.
  • In some implementations, the processor is also configured to identify a period of time as a sleep period based on gross motion data of the subject measured by the motion sensor.
  • In some implementations, identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is below a threshold.
  • In some implementations, identifying the period of time as a sleep period includes determining that the gross motion data during the period of time is substantially flat.
  • In some implementations, the processor is also configured to determine a start and an end of the sleep period.
  • In some implementations, determining the start of the sleep period includes identifying a time when the gross motion data falls below a threshold, and determining the end of the sleep period includes identifying a time when the gross motion data rises above a threshold.
  • In some implementations, the processor is also configured to calculate a property of the sleep of the subject based on the data.
  • In some implementations, the characteristic of the subject's sleep is determined based on the property, and the characteristic of the subject's sleep includes sleep apnea.
  • In some implementations, the processor is also configured to determine that the subject is experiencing sleep apnea. Determining that the subject is experiencing sleep apnea includes identifying a simple signal in a heart rate signal of the subject that is acquired during a sleep period of the subject.
  • In some implementations, determining that the subject is experiencing sleep apnea includes identifying recurring simple signals in the heart rate signal of the subject.
  • In some implementations, the simple signals recur at least every two minutes during the sleep period of the subject.
  • In some implementations, the characteristic includes a quality of the sleep, including one or more of latency to sleep, number of disturbances, and number of tosses and turns.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data in the first and second datasets is acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The data is acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor. The processor is also configured to process the data to derive a measure of alertness of the subject.
  • Implementations can include one or more of the following features.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the situation includes one in which a likelihood of harm to one or more human lives is increased if the alertness of the subject is below the predetermined amount.
  • In some implementations, the situation is one in which a likelihood of damage to one or more properties is increased if the alertness of the subject is below the predetermined amount.
  • In some implementations, the situation is one in which a likelihood of economic damage is increased if the alertness of the subject is below the predetermined amount.
  • In some implementations, the situation is one or more of air traffic control, intelligence analysis, vehicle driving, machinery driving, security guarding, baggage screening, and aircraft piloting.
  • In some implementations, the method also includes using the processed data to derive a measure of alertness of the subject.
  • In some implementations, the measure of alertness of the subject is based on one or more of a heart rate, a respiratory rate, a blood pressure, and an activity level of the subject.
  • In some implementations, the method also includes activating an alarm on a device worn by the subject if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the device worn by the subject acquires the data.
  • In some implementations, the device worn by the subject processes the data.
  • In some implementations, the method also includes causing a speed of a vehicle being operated by the subject to be decreased if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the method also includes causing an alarm in a vehicle being operated by the subject to be activated if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the method also includes causing a device being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the method also includes causing an operation switch of a vehicle being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the method also includes assigning a task to the subject based on the measure of alertness.
  • In some implementations, the subject is put into an athletic contest if the measure of alertness of the subject is above a threshold.
  • In some implementations, the subject is assigned a particular combat task if the measure of alertness of the subject is above a threshold.
  • In some implementations, the biofeedback device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The processor is also configured to receive and process the data from the motion sensor.
  • In some implementations, the biofeedback device also includes a transceiver configured to provide one or both of the processed data and the measure of alertness.
  • In some implementations, the transceiver is also configured to cause a speed of a vehicle being operated by the subject to be decreased if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the transceiver is also configured to cause an alarm in a vehicle being operated by the subject to be activated if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the transceiver is also configured to cause a device being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the transceiver is also configured to cause an operation switch of a vehicle being operated by the subject to be turned off if the measure of alertness of the subject falls below a threshold.
  • In some implementations, the processor is also configured to assign a task to the subject based on the measure of alertness.
  • In some implementations, the subject is put into an athletic contest if the measure of alertness of the subject is above a threshold.
  • In some implementations, the subject is assigned a particular combat task if the measure of alertness of the subject is above a threshold.
  • In some implementations, operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In another aspect, a method includes processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The method also includes predicting a medical event of the subject based on the processed data.
  • In another aspect, one or more machine-readable storage devices stores instructions that are executable by one or more processing devices to perform operations including processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject acquired at a location of the subject. The operations also include predicting a medical event of the subject based on the processed data.
  • In another aspect, a biofeedback device configured to be worn by a subject includes a light source configured to emit light toward the skin of the subject. The device also includes an optical sensor configured to receive the emitted light after the emitted light reflects off of the skin of the subject. The optical sensor is also configured to provide data that corresponds to a characteristic of the received light, the data representing time-varying information about at least one pulse pressure wave propagating through blood in the subject acquired by the optical sensor at a location of the subject. The device also includes a processor configured to receive data from one or both of the light-emitting element and the optical sensor. The processor is also configured to predict a medical event of the subject based on the data.
  • Implementations can include one or more of the following features.
  • In some implementations, the method also includes processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • In some implementations, the information about at least one pulse pressure wave propagating through blood in the subject includes photoplethysmographic (PPG) data and the information about motion of the subject includes one or both of motioncardiogram (MoCG) data and gross motion data.
  • In some implementations, the data is acquired continuously.
  • In some implementations, the data is acquired at a frequency of at least 16 Hz.
  • In some implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz.
  • In some implementations, the data is acquired at a single location of the subject.
  • In some implementations, the data is acquired by a device worn by the subject.
  • In some implementations, the device is mobile and does not reduce a mobility of the subject.
  • In some implementations, the device processes the data.
  • In some implementations, the single location is an arm of the subject.
  • In some implementations, the single location is a wrist of the subject.
  • In some implementations, the method also includes alerting a caregiver when a medical event of the subject is predicted.
  • In some implementations, processing the data includes determining one or more of heart rate, heart rate variability, blood pressure, blood pressure variability, body temperature, skin temperature, vocal tonality, electrical skin impedance, respiratory rate, blood oxygen level, stroke volume, cardiac output, MoCG morphology, and PPG morphology of the subject.
  • In some implementations, predicting the medical event of the subject includes determining whether a heart rate of the subject satisfies a threshold.
  • In some implementations, the medical event is tachycardia.
  • In some implementations, determining the heart rate of the subject includes calculating a distance between two consecutive reference points in the first dataset, the distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys.
  • In some implementations, predicting the medical event of the subject includes determining whether a heart rate variability of the subject satisfies a threshold.
  • In some implementations, the threshold is based on whether the subject experiences arrhythmia.
  • In some implementations, determining the heart rate variability of the subject includes calculating distances between multiple pairs of consecutive reference points in the first dataset, each distance representing a time that has elapsed between two consecutive heartbeats of the subject.
  • In some implementations, the reference points are local maxima or local minima.
  • In some implementations, the reference points are peaks or valleys.
  • In some implementations, predicting the medical event of the subject includes determining whether a blood pressure of the subject satisfies a threshold.
  • In some implementations, the medical event is hypertension.
  • In some implementations, predicting the medical event of the subject includes determining a rate of change of a blood pressure of the subject.
  • In some implementations, the medical event is a stroke, and a stroke is predicted if the rate of change of the blood pressure of the subject is positive and above a threshold.
  • In some implementations, the medical event is abnormal heart function, and abnormal heart function is predicted if the rate of change of the blood pressure of the subject is negative and below a threshold.
  • In some implementations, the method also includes identifying a first point in the first dataset, the first point representing an arrival time of the pulse pressure wave at a first body part of the subject. The method also includes identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject. The method also includes computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
  • In some implementations, the blood pressure of the subject is determined based on the PTT.
  • In some implementations, the first body part is the location of the subject at which the data in the first data set is acquired, and the second body part is the heart of the subject.
  • In some implementations, the device also includes a motion sensor configured to provide data that represents time-varying information about motion of the subject acquired by the motion sensor at the location of the subject. The processor is also configured to receive data from the motion sensor.
  • In some implementations, the device also includes a transceiver configured to alert a caregiver when a medical event of the subject is predicted.
  • In some implementations, the operations also include processing data in a second dataset that represents time-varying information about motion of the subject acquired at the location of the subject.
  • Aspects can include one or more of the following advantages.
  • Particular implementations may realize one, or more of the following advantages. Blood pressure and/or other biometric parameters may be measured based on continuously acquired data, without the need for cuffs, pressure points or electrodes. “Continuously” acquiring data, as used herein, means acquiring data at a sufficient frequency (e.g., a sufficient number of times per second) to allow for the derivation of the parameters described herein from that data. The data can, for example, be collected at a frequency ranging from 16 Hz to 256 Hz. In certain implementations, the data is acquired at a frequency of between 75 Hz and 85 Hz. Vital signs can be measured at one location, using a comfortable and unobtrusive device. By providing an ability to capture continuous measurements 24 hour a day, a new paradigm in monitoring health can be enabled, thereby allowing for recording transient medical events that may otherwise go undetected. The disclosed technology may be integrated with third party devices (for example, mobile devices) thereby allowing for using external sensors such as motion detectors and light sensors disposed in the third party devices.
  • The details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A illustrates pulse transit time (PTT) calculation using an example BCGB plot, and a photoplethysmogram (PPG) plot.
  • FIGS. 1B and 1C are example block diagrams of a device that performs biometric measurements based on MoCG and PPG data.
  • FIGS. 1D-1F are plots generated based on data collected using sensors of the device of FIGS. 1B and 1C.
  • FIG. 1G illustrates side and top views of an example configuration of optical sensors that can be used in the device of FIGS. 1B and 1C.
  • FIGS. 2A-2C, 3, and 4 illustrate plots generated based on data collected by the sensors of the device of FIGS. 1B and 1C.
  • FIGS. 5A-5E illustrate examples of cardiac signals.
  • FIGS. 6A-6C are flowcharts depicting example processes for biometric authentication.
  • FIG. 7A is a flowchart depicting an example of a process for calculating motion pulse transit time (MPTT).
  • FIG. 7B is a flowchart depicting an example of another process for calculating MPTT.
  • FIG. 8 shows examples of heat maps that relate to data collected from the motion sensors of the device of FIGS. 1B and 1C, and are used in determining weights for data corresponding to accelerometers oriented along different axes.
  • FIGS. 9, 10A-10C, 11A, and 11B illustrate plots used in calculating MPTT.
  • FIG. 12 is a flowchart depicting an example of a process for calibration of the device of FIGS. 1B and 1C.
  • FIGS. 13 and 14 illustrate examples related to calibration of the device of FIGS. 1B and 1C.
  • FIGS. 15A-15D and 16A-16C show examples of plots used in detecting various heart conditions.
  • FIG. 17 is a flowchart of an example of a process for detecting arrhythmia.
  • FIG. 18 is an example of a plot of arterial stiffness vs. exercise frequency.
  • FIGS. 19A and 19B are examples of plots used in determining sleep quality and/or sleep disorders.
  • FIG. 20 is an example of a screenshot for showing sleep quality.
  • FIG. 21 is a flowchart depicting an example of a process for determining sleep quality.
  • FIG. 22 is an example of a screenshot for showing a fitness-related metric.
  • FIG. 23 is an example of a screenshot for showing a stress-related metric.
  • FIG. 24 is a flowchart depicting an example of a process for deriving information about a psychological state of a subject.
  • FIG. 25 is a flowchart depicting an example of a process for determining a metric for quality of care provided at a care facility.
  • FIG. 26 shows an example where the technology described is used by emergency responders.
  • FIG. 27 is a flowchart depicting an example of a process for determining relative states of multiple subjects.
  • FIG. 28 is a flowchart depicting an example of a process for predicting a medical event.
  • FIG. 29 is a flowchart depicting an example of a process for determining information about a medication regimen.
  • FIG. 30 shows an example where the technology is used at a medical or caregiving facility.
  • FIG. 31 shows an example of the technology being used with a proximity system.
  • FIGS. 32A and 32B show an example implementation of the device of FIGS. 1B and 1C in the form of a wearable watch.
  • FIG. 33 shows an example of an environment where the technology is used for access control.
  • FIG. 34 shows an example where the technology is used for allowing a user to access/operate a vehicle of other machinery.
  • FIG. 35 shows an example where the technology is used for controlling gaming and/or entertainment systems.
  • FIG. 36 shows an example where the technology is used for controlling various devices connected to a network.
  • FIG. 37 is an example of a screenshot that displays and allows sharing of blood pressure results.
  • FIG. 38 is a flowchart depicting an example of a process for controlling remote devices using the technology described in this document.
  • FIGS. 39A-39C show examples of user interfaces of an application that makes data collected by the device of FIGS. 1B and 1C available to a user.
  • FIG. 40 is an example of a block diagram of a computer system.
  • DETAILED DESCRIPTION
  • This document describes technology for determining pulse transit time (PTT) of blood based on motion data such as motioncardiogram (MoCG) data (which is related to, and also referred to in this document as ballistocardiogram (BCG) data) and optical data such as photoplethysmographic (PPG) data. When determined using motion data of the body, PTT can also be referred to as motion pulse transit time (MPTT). In this document, the terms PTT and MPTT may be used interchangeably. This document also describes technology for performing various biometric measurements (e.g., blood pressure, respiratory rate, blood oxygen level, stroke volume, cardiac output, arterial stiffness, and temperature) based on the MoCG data and the PPG data. The MoCG is an example of a motion of the subject. For example, MoCG is a pulsatile motion signal of the body measurable, for example, by a motion sensor such as an accelerometer or a gyroscope. The pulsatile motion signal results from a mechanical motion of portions of the body that occurs in response to mechanical motion of the heart. For example, the pulsatile motion signal can result from mechanical motion of portions of the body that occurs in response to blood being pumped during a heartbeat. This motion is a mechanical reaction of the body to the internal pumping of blood and is externally measurable. The MoCG signal therefore corresponds to, but is delayed from, the heartbeat. The MoCG signal recorded at a given portion of the body therefore represents the motion of the blood due to a heartbeat, but is delayed from, the heart's electrical activation (e.g. when the ventricles are electrically depolarized).
  • PPG data is data optically obtained via a plethysmogram, a volumetric measurement of the vasculature. PPG data can be obtained using an optical device which illuminates the skin and measures changes in light absorption. With each cardiac cycle the heart pumps blood resulting in a pressure pulse wave within the vasculature. This causes time-varying changes in the volume of the vasculature. The changes can be detected, for example, by illuminating the skin with light from a light-emitting diode (LED) and then measuring the amount of light either transmitted or reflected to a detector such as a photodiode. Each cardiac cycle is therefore represented as a pattern of crests and troughs. The shape of the PPG waveform differs from subject to subject, and varies with the location and manner in which the waveform is recorded.
  • FIG. 1A illustrates pulse transit time (PTT) calculation using an example BCG plot 102, and a photoplethysmogram (PPG) plot 103. BCG plot 102 can be analyzed to determine points at which a pulse (or pressure wave) originates at a first location on the body. The BCG however, may be measured at a second location on the body. For example, the points (e.g., local maxima) 108 a, 108 b and 108 c in the BCG plot 102 may represent time points at which corresponding pulses originate at or near the chest. These points are often referred to in this document as pulse origination points.
  • The time of arrival of the pulse at a second location (e.g., the wrist) can be determined from PPG data obtained at the second location. For example, the PPG data can be measured at the wrist using one or more optical sensors. Light from the optical sensors (i.e., the light sources such as LEDs of the optical sensors) is directed toward the skin of the subject, and the reflected light (which is modulated by blood volume changes underneath the skin) is measured using one or more photo-detectors (e.g., photodiodes). The output of the photo-detector may be amplified by an amplifier before being converted to a digital signal (for example, by an analog to digital converter (ADC)) that represents the PPG.
  • The plot 103 of FIG. 1A represents PPG data that can be used to determine the arrival time of the pulses at the wrist. For example, the maximum slope points 109 a, 109 b, and 109 c (109 in general) represent the arrival times of the pulses that originated at the chest at time points represented by 108 a, 108 b, and 108 c, respectively. These points may in general be referred to in this document as pulse arrival points 109. The plot 103 is synchronized with the BCG plot 102 such that the PTT (or MPTT) 113 between the chest and the wrist can be determined as a time difference between the originating point at the chest and the corresponding arrival point at the wrist. In the example shown in FIG. 1A, the time difference between 108 b and 109 b represents the PTT 113. Similarly, the time difference between 108 a and 109 a, or the time difference between 108 c and 109 c can be used in determining the PTT 113.
  • The technology described in this document allows for determination of PTT from MoCG (or BCG) and PPG data measured at substantially the same location on a human body (e.g., the wrist). This includes identifying, from the PPG data, a time point (e.g., the time points 109) at which a pulse wave arrives at the location, identifying, from the BCG data, a time point (e.g., the time points 108) at which the pulse originated at a different location on the body (e.g., the heart) from the MoCG data, and determining the PTT 113 as a difference between the two identified time points.
  • FIG. 1B is a block diagram of an example of a device 100 that performs biometric measurements based on MoCG and PPG data. The biometric measurements can be used for monitoring health related parameters, as well as in diagnosing conditions and predicting an onset of such conditions. In some implementations, the device 100 can be a wearable device that a subject can wear on the body. For example, the device 100 can be disposed in a wearable watch, bracelet, anklet, armband, chest-patch, or belt. An example implementation of the device in the form of a wearable watch 3200 is shown in FIGS. 32A and 32B. The watch 3200 includes a case 3202 that is configured to hold the internal components of the watch, including light sources 3204a, 3204b, an optical sensor 3206, a motion sensor 3208, a processor 3210, and an ultraviolet light sensor 3212.
  • In some implementations, the device may also be disposed as a part of a garment worn by the subject. The device 100 may also be disposed in a rug or mat (e.g., a bathroom mat or a shower mat). The device 100 may also be disposed in a separate device carried or worn by the subject. For example, the device 100 can be disposed internally or externally in a watch or mobile device used by the subject. In some implementations, the device 100 can include a transceiver that is configured to communicate wirelessly with another device to perform a biometric monitoring process. For example, data collected and/or computed by the device 100 may be transmitted to an application executing on a mobile device for additional analysis or storage. On the other hand, alerts and messages may be transmitted from a server or mobile device for display on the device 100. Devices similar to the device 100 are described in U.S. patent application Ser. Nos. 13/166,388 and 13/803,165, and 61/660,987, the contents of which are incorporated by reference herein. Various combinations of the operations described in this document may also be performed by a general purpose computing device that executes appropriate instructions encoded on a non-transitory computer readable storage device such as an optical disk, a hard disk, or a memory device.
  • The device 100 can be configured to make MoCG and PPG measurements either directly (such as when implemented as a part of an armband, wristband, chest patch, undergarment) or indirectly (such as when implemented as part of a mobile device) from a portion of the body proximate to the location of the device. The MoCG data can be measured using one or more motion sensors 105 such as an accelerometer or a gyroscope. In some implementations, the motion sensors 105 include multiple accelerometers (e.g., one for each of the x, y, and z axes) and/or multiple gyroscopes (e.g., one each for measuring tilt, rotation, and yaw). Even though FIG. 1B shows only motion sensors and optical sensors, other types of sensors such as electric impedance sensors (including electrical skin impedance sensors, such as Galvanic skin resistance sensors), hydration level sensors, skin reflection index sensors, and strain sensors can also be used in performing one or more of the measurements described in this document. In some implementations, one or more of the sensors may be located in an external device such as a mobile device. For example, motion sensors and a camera disposed in a mobile device may be used in place of the motion sensors 105 and optical sensor 110, respectively. In some implementations, the device 100 can include one or more sensors to measure or detect ambient conditions. Such sensors can include, for example, a microphone (e.g., to measure environmental noise), an altimeter, a humidity sensor, a GPS device (for determining geographical location), and an ultraviolet light sensor (e.g. to detect level of sun exposure).
  • In some implementations, the device 100 can be configured to warn the user (for example, by displaying a message) if a measured, derived, or inferred health parameter is outside an acceptable range for the parameter. Examples of such health parameters can include (without being limited to the following) measured parameters such as heart rate, respiratory rate, or arrhythmia, derived parameters such as blood pressure, stroke volume, or arterial stiffness, and inferred parameters such as mood, stress level, or sleep deprivation. In one example, the level of sun exposure (as measured by the ultraviolet light sensor) can be correlated to the mood or stress level of the user, and related suggestions and recommendations can be provided accordingly. For example, if sun exposure above a certain threshold level is known to decrease stress for a particular user, the user may be asked to increase sun exposure during a period when a stress level detected by the device 100 is high.
  • In some implementations, environmental sounds captured by the microphone can be used to contextualize or interpret vital signs data captured using the device 100. For example, a tonality (e.g., amplitude and/or frequency) of a user's voice can be analyzed to determine if the user is in a confrontational situation (e.g., at work or at home) that can be attributed to an unacceptable level of a particular health parameter (e.g., stress). In another example, environmental noise can be detected during a user's commute to determine, for example, if, and to what extent driving (or rush hour subway) affects the user's health parameters. In yet another example, if a user is detected to be having a disturbed sleep pattern, the data captured by the microphone can be used to determine and/or confirm if that is attributable to environmental noise (e.g., snoring, or an alarm clock going off). In another example, if an unacceptable condition (e.g., a user's increased stress level) coincides with construction activity (determined, for example, via pile driver sounds captured by the microphone), a determination may be made that the unacceptable condition is likely due to the sounds coming from the construction site.
  • The data captured using the motion sensors 105 includes both MoCG data and motion data associated with an activity of the subject. The MoCG data can be filtered out from the combination using, for example, one or more band pass filters (BPF) 125 shown in FIG. 1C. In some implementations, a pass band of the BPF 125 can be designed to filter out constant components (e.g., acceleration due to gravity) and high frequency noise components. For example, in some cases, a pass band of 3-12 Hz may be used for the band pass filter 125. In other cases multiple band pass filters may be used concurrently. For example, a filter with a 3-12 Hz passband and another filter with a 10-30 Hz passband can be used simultaneously to measure different parameters measurable in the two different bands. In some implementations, the band pass filtered accelerometers can be combined to obtain an activity index 127, which in turn is used in calculating appropriate weights 130 for obtaining updated biometric measurements 132. For example, if the subject is sitting still, the activity index 127 can be less than a threshold value (e.g., 5) indicating, for example, that the band pass filtered accelerometer outputs can be used directly in determining the biometric measurements. In another example, if the subject is running, the activity index 127 can be higher (e.g., between 5 and 15), indicating that the band pass filtered accelerometer data may need to be adjusted (e.g., by applying a threshold) before being used in determining the biometric measurements. In some implementations, if the activity index is higher than an upper threshold value (e.g., 15), the band pass filtered accelerometer data may be discarded as being unreliable. In some implementations, weights 130 may be adjusted to reflect if and how the band pass filtered data from the accelerometer 105 is used. Examples of band pass filtered accelerometer data are illustrated in FIG. 1F, where plots 170, 172, and 174 represent outputs of accelerometers in the x, y, and z axes, respectively.
  • In some implementations, the PPG data can be measured using one or more optical sensors 110. In some implementations, the optical sensors 110 can include one or more light emitting diodes (LEDs) whose output can be controlled, for example, by a microcontroller. Example configurations of the optical sensors 110 are depicted in FIG. 1G. In some implementations, the optical sensors include a 7.5 mm2 photodiode with two green LEDs placed within 1.5 mm of either side. The photodiode has an opaque optical shield surrounding the sides. The LEDs can have a peak wavelength of 525 nm and a viewing angle of 60 degrees.
  • In operation, light from the optical sensors 110 (i.e., from the light sources such as LEDs of the optical sensors) is directed toward the skin of the subject, and the reflected light is modulated by blood flow underneath the skin. The optical sensors 110 also include one or more photo-detectors (e.g., photodiodes) that receive the reflected light and provide a resulting signal to the microcontroller. The resulting signal may be amplified by an amplifier before being converted to a digital signal (for example, by an analog to digital converter (ADC)) that is provided to the microcontroller. The PPG signal is synchronized with the heartbeat and can therefore be used to determine the heart rate (HR) 112 of a wearer of the device. This is shown in additional detail in FIG. 1C. In some implementations, the heart rate signal can be within a particular range of the spectrum (e.g., 0 to one half of the sampling frequency) of the PPG signal 150, and can be isolated using, for example, a band pass filter (BPF) 154. An example of this is shown in FIG. 1D, where the plot 160 represents raw PPG data, and the plot 162 represents the output of the BPF 154. The pass band of the filter used for the example depicted in FIG. 1C is 0.4-4 Hz. As seen from FIG. 1C, the low frequency portion of the raw data, as well as the high frequency variations are filtered out in the output plot 162.
  • In some implementations, it could be desirable to sample the optical PPG sensor at a low frequency to achieve power savings. However, a low sampling frequency can cause interference between the optical sensors and artificial light sources, which usually oscillate at the frequency of 60 Hz and 120 Hz in North America, and 50 Hz and 100 Hz worldwide. If the sampling rate is lower than the Nyquist rate corresponding to the maximum frequency (e.g., 120 Hz*2=240 Hz) then aliasing would occur. For example, if the PPG sensor is sampled at 121 Hz, then a 120 Hz interfering source will alias as 1 Hz, which is within the frequency of heart rate and could cause confusion. In some implementations, a frequency between 75-85 Hz is chosen such that reasonable power saving is achieved, and the optical interferers are aliased into non-biological optical signal frequency range (>10 Hz). For example, if 80 Hz is chosen, then the aliased interferers would be at frequencies such as 20 Hz, 30 Hz, and/or 40 Hz. An appropriate low pass filter (e.g., a filter with cut-off frequency of 10 Hz) could then be used to eliminate the interferers while preserving the PPG signal. If a finer time resolution is desired (e.g., corresponding to 256 Hz), the filtered PPG signal can be interpolated accordingly in time domain without signal loss.
  • In some implementations, the output of the BPF 154 can be used to determine a heart rate 144 of the subject, and can also be combined with the output of the BPF 125 to determine other biometric parameters such as pulse transit time (MPTT) and stroke volume (SV) 145, as well as other parameters 146, including, for example, systolic and diastolic blood pressure, stroke volume (SV), and cardiac output (CO).
  • In some implementations, calibration data 155 is used in computing one or more of the parameters 146. For example, the calibration data 155 can include user-specific calibration information (e.g., constants used in equations) that may be used in computing one or more of the parameters 146. In some implementations, the calibration data 155 can be computed based on user-provided data. For example, a user may be asked to provide biographical data such as age, height, and weight for use in computing the calibration data. In some implementations, the user can be asked to provide his/her last-known blood-pressure data to determine one or more constants or parameters included in the calibration data 155. In some cases, a medical professional may measure a user's blood pressure during set up of the device 100. In some implementations, calibration data 155 can be calculated based on a user action. For example, the user may be asked to hold the device 100 at or near chest level to equalize hydrostatic pressure effects and sense chest vibrations that are used in computing a calibration point. This way, a delay between a chest vibration and the time of arrival of a pulse wave at the wrist (if the device 100 is worn on the wrist) can be used to calibrate for blood pressure for a scenario where there is no height difference between the heart and the measuring point. In some implementations, the calibration data 155 can include information related to skin tone calibration where LED intensity and amplifier gain are adjusted until an optimal DC level is reached. If no user-specific calibration data is available, standard calibration values (for example parameters to get a standard 120/80 mmHg sys/dia measurements) may be included in the calibration data 155. In some implementations, the calibration factors may be adjusted retroactively once the user enters valid calibration data. Calibration data may also be imported from the user's medical records if, for example, the device is dispensed to the user by their medical professional.
  • Because the baseline of PPG is modulated by respiration, a signal representing respiratory rate is typically within the 0-1 Hz range of PPG, and can be obtained using low pass filtering. This is illustrated in FIG. 1C, where the PPG data 150 is passed through the low pass filter (LPF) 152 and optionally combined with the output of another LPF 135 (used for low pass filtering the MoCG data) to obtain biometric parameters such as sleep data 142 and respiratory rate 143. An example of determining the respiratory rate 143 from the PPG data 150 is illustrated in FIG. 1E. In this example, the plot 166 represents the raw PPG data, and the plot 168 shows the output of the LPF 152 representing the low frequency variations due to respiration.
  • In some implementations, other biometric parameters may also be computed. For example, by using multiple LEDs of different colors in the optical sensor 110, blood oxygenation (SpO2) can be obtained using pulse oximetry theory. Computation of other biometric parameters is described below. Referring again to FIG. 1B, the device 100 can also include a computing device 115 that can be configured to compute the biometric parameters, including, for example, blood pressure, respiratory rate, blood oxygen, stroke volume, cardiac output, and temperature. In some implementations, an activity index 148 (which may be the activity index 127, also shown in FIG. 1C) can be used in determining a set of weights 147 used in calculating one or more of the biometric parameters 146.
  • As seen from FIG. 1C, the heart rate information 144 is used in calculating one or more of the biometric parameters 146. In some implementations, the heart rate information 144 can be obtained from the PPG by detecting peaks and/or valleys in a graphical representation (e.g., the plot 162 shown in FIG. 1C) of the PPG data 150. This can include, for example, cross-correlating a portion of the PPG data (e.g., samples or data corresponding to a two second segment of the plot 162 of FIG. 1C) with similar segments to produce a plot 180 (shown in FIG. 2A) representing a series of cross-correlation products. In one example, two-second segments from the plot 162 are cross-correlated with adjacent (possibly with some partial overlap) two-second segments to produce the plot 180 of FIG. 2A. A particular cross correlation result (for example, one that produces the highest cross-correlation amplitude) can then be selected as a template. The plot 178 shown in FIG. 2B is an example of a template. In some implementations, the template can be adjusted to conform to a desired morphology, allowing for a beat to beat natural variation but discounting noise and non-heartbeat signals.
  • The selected template can then be correlated with segments from the plot 162 (shown in FIG. 1C) to identify locations of correlation peaks. This is illustrated in FIG. 2A, where the plot 180 represents a series of such peaks. The location of the correlation signal peaks can be used to direct a search for valleys, inflection points, and/or peaks within the band pass filtered PPG signal. The inflection point in this case is defined as the point of maximum slope. FIG. 2C illustrates an example of a PPG signal with identified peaks 181, inflection points 183 and valleys 185. For brevity, only a few of the peaks, inflection points, and valleys are marked using the reference numbers 181, 183, and 185, respectively. The distance between two consecutive valleys (or inflection points or peaks) represents a time difference between two consecutive heartbeats, and can be used to compute instantaneous heart rate. For example, if two valleys (or inflection points or peaks) are separated by 141 samples, and if the sampling rate is 128 Hz, the instantaneous heart-rate can be computed as 60*128/141=54.47 beats-per-minute (BPM). The instantaneous heart rate for each of the heartbeats can be plotted as shown in FIG. 3, and can be used for other purposes such as computing other parameters and diagnosing conditions such as arrhythmia.
  • In some implementations, confidence levels associated with a calculated instantaneous heart rate can be determined before being used in any subsequent analysis. For example, if a person suddenly stands up from a sitting position, the instantaneous heart rate during the transition may shoot up. In some implementations, the rate of such rapid increase can include meaningful information. However, in some implementations, the information obtained during this transition may not be reliable as an indicator of the person's health status. Determining confidence levels associated with the computed heart rates can allow for discarding such outliers in subsequent analyses. In some implementations, a given computed instantaneous heart rate can be compared, for example, to the average (or median) instantaneous heart rate over a predetermined time range (e.g., ±10 seconds) to determine whether the given instantaneous heart rate is reliable. If the given instantaneous heart rate differs (e.g., differs by more than a predetermined amount) from the average heart rate over the predetermined time range, the given instantaneous heart rate may be determined to be unreliable and therefore deweighed in subsequent computations. This allows for selecting reliable data points at the expense of a short latency (10 seconds in the above example).
  • In some implementations, the instantaneous heart-rate data as shown in FIG. 3 can be used for computing instantaneous heart-rate variability (HRV). An example of HRV plotted against the corresponding heartbeats is shown in FIG. 4. As shown in FIG. 4, the HRV data can be used to calculate a mean HRV for a set of heartbeats. In some implementations, HRV data can be used in detecting conditions such as stress. For example, if the mean HRV is above a certain threshold, the subject may be determined to be under higher than usual stress. In the time domain, HRV can be calculated by computing a variance of individual RR intervals (distance between the ‘R’ points of two consecutive QRS complex curves representing heartbeats, or alternatively the distance between valleys as shown in FIG. 2C) from the average RR interval, over a period of time (e.g., 5 minutes). Alternatively, the HRV can also be calculated in the frequency domain by comparing the power spectrum at very low frequencies (e.g., 0.04-0.15 Hz) with the power spectrum at slightly higher frequencies (e.g., 0.18 to 0.4 Hz).
  • Cardiac waveform morphology (also referred to as cardiac morphology) can be defined as the shape of a plot representing cardiac activity. FIG. 5A represents a Wiggers diagram, which is a standard diagram used in cardiac physiology. Referring to FIG. 5A, the shape of an electro-cardiogram (ECG) QRS complex 505 represents a morphology associated with a heartbeat. Cardiac morphology depends on where and how cardiac activity is measured. For example, the morphology 510 of a phonocardiogram signal is different from that of the ECG morphology 505. In another example, the morphology associated with ventricular volume 515 is different from the morphology associated with ventricular pressure 520.
  • FIG. 5B shows an example of a cardiac signal illustrating the morphology 525 associated with a PPG signal. The morphology of a measured PPG signal can be checked to determine whether the measured PPG signal reliably represents heartbeats. In some implementations, the relative separations of the peaks and valleys of the PPG signal are analyzed to determine whether the PPG signal reliably represents heartbeats. For example, a segment of the PPG signal can be determined to represent heartbeats if the following threshold condition is satisfied:

  • 0.25<Median (peak to valley distances)/Median (valley to valley distances)<0.4
  • The condition above uses the range [0.25, 0.4] as an example, and other values can also be used. For example, the range (or threshold) could be determined for an individual user by using, for example, a range considered to be normal for the particular user. The ratio from the above condition can vary within the range for various conditions of the subject. For example, the ratio can be at a low portion of the range during relaxation or sleep conditions, and at a high portion of the range during stressful events such as anger or fear. In some implementations, other morphology checks can also be performed. For example, one morphology check can involve verifying that at a resting position, the user's systolic amplitude is approximately half of the diastolic amplitude. In some implementations, segments that do not satisfy the morphology check conditions are discarded from being used in biometric parameter computations.
  • Cardiac morphology also typically varies from one person to another due to, for example, unique heart beat signatures, breathing patterns and the unique ‘transmission line’ reflection signatures that are caused by the lengths and stiffness of an individual's arteries. In A typical PPG signal the main peak represents the first systolic peak which is followed by the secondary peak (or bump) representing the early diastolic peak (or reflection). The time between the two peaks is also inversely proportional to arterial stiffness. This is easier to visualize from the first and/or the second derivatives of the PPG signal. FIGS. 5C and 5D show examples of cardiac signals illustrating morphology based on PPG signals. In the example of FIG. 5C, the derivative 538 of the PPG signal 539 shows a discernible second peak 540, whereas in the example of FIG. 5D, the corresponding second peak 545 is comparatively weaker. However, the example of FIG. 5D shows the presence of a third peak 550. Therefore, in some implementations, cardiac morphology can be used as a biometric identifier. For example, the device 100 described with reference to FIG. 1B can be configured to verify, based on a determined cardiac morphology, that the person wearing the device is the person for whom the device was assigned. In some implementations, the determined cardiac morphology may also be used to uniquely identify a wearer of the device 100. Such biometric identification can be used, for example, in security and accessibility applications. For example, the device 100 can be configured to transmit a cardiac morphology based signature to a receiver (e.g., on a mobile phone, or at secured access point) to gain access to a secure resource. In some implementations, when a same device is used by multiple individuals (e.g., different members of a family), the wearer of the device may be identified based on the identified cardiac morphology of the wearer. FIG. 5E shows examples of cardiac signals illustrating morphology for four different individuals, and illustrates how the cardiac morphology varies from one person to another.
  • Security Applications
  • In some implementations, multiple measured or derived parameters can be used as a biometric signature to uniquely identify a wearer. For example, a wearer can be identified based on a multi-dimensional space defined based on the measured or derived parameters. Because the parameters vary from one person to another, each person would be mapped to a different region within the multi-dimensional space. A simple two-dimensional example of such a space can be defined, for example, by using heart rate as one axis and PPG shape as the second axis. Because the PPG shape and heart rate varies from one person to another, each person can typically be mapped to a separate region on the two-dimensional plane, and can be identified based on a location of the region. Higher dimensional spaces can be used for robustly identifying individuals among a large population. Examples of parameters that can be used as axes for such spaces include cardiac morphology, heart rate, cardiac volume, PPG, or other parameters derived as a function of one or more of these parameters. In another example, cardiac morphology can be combined with another parameter such as the MoCG morphology to achieve increased accuracy and/or resolution for bio-authentication applications. Examples of such applications include access control, digital wallet authorization, digital passwords/signature and environmental control. In such cases, MoCG data can be used to provide a MPTT signature and/or a MoCG signature waveform that may be unique to a particular user.
  • In some implementations, the biometric signature based user identification can be used in electronic payment applications. In some implementations, the device 100 can be configured to communicate with a payment gateway using, for example, near field communication (NFC) or Bluetooth Low Energy (BLE) protocols. The payment gateway can be configured to identify the user based on a corresponding biometric signature to initiate the payment process. The payment gateway can communicate the identification information to a server that stores credit card or bank information of the corresponding user, for example, within a corresponding user account. Upon receiving identification of the user, the server may initiate communications with the payment gateway that result in the credit card being charged or the bank account being debited.
  • In some implementations, the biometric signature based user identification is disabled if the device determines that the wearer is under distress. The device can determine whether the wearer is under distress based on the wearer's vital signs (e.g., such as heart rate (HR), heart rate variability (HRV), blood pressure (BP), and respiratory rate). For example, if a wearer of the device is being forced to access a payment gateway, the device can detect the wearer's distress, as indicated by a sudden increase in HR, BP, and/or respiratory rate, and prevent him or her from accessing the payment gateway. Similarly, in some examples, if a wearer of the device is being forced to unlock a lock (e.g., a lock on a door of the wearer's home), the device can detect the wearer's distress, as indicated by a sudden increase in HR, BP, and/or respiratory rate, and prevent him or her from unlocking the lock.
  • In some implementations, the wearer's vital signs do not produce a match of the wearer's biometric signature when the wearer is under distress. For example, when the wearer is under distress, the multi-dimensional space defined based on the measured or derived parameters takes on a modified for that does not match the wearer's biometric signature. As such, a wearer under distress is unable to be identified by the biometric signature.
  • In some cases, a wearer may exhibit signs that are synonymous with distress when the wearer is not in fact in distress. For example, if the wearer is involved in a non-dangerous and exciting event, such as buying an extremely expensive item, the wearer may experience an increase in HR, BP, and/or respiratory rate that may mistakenly be interpreted by the device as signs of distress. Thus, in some implementations, the wearer is provided with an opportunity to authenticate himself or herself in the event that the device detects false signs of distress or fails to identify the biometric signature of the wearer. The wearer can authenticate himself or herself using confidential information such as a password or a personal identification number that is communicated to the device or a server in communication with the device. In some implementations, the wearer can authenticate himself or herself by performing a private, predefined gesture. The one or more motion sensors of the device can be configured to determine whether the authenticating gesture matches the predefined gesture.
  • An example process 600 of bio-authenticating a subject is shown in FIG. 6A. A machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 600. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 600, initially, data in a dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (602). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). A determination can then be made of whether one or more segments of the dataset were captured from a subject other than an expected subject (604). The determination can be made by analyzing morphological features of the segments.
  • Another example process 610610 of bio-authenticating a subject using information about motion of the subject is shown in FIG. 6B. A machine, such as a processor, that receives information from the motion sensor 105 of the device 100 can perform one or more steps of the process 610610. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 610, initially, data in a dataset that represents time-varying information about motion of a subject can be processed 612). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). A determination can then be made of whether one or more segments of the dataset were captured from a subject other than an expected subject 614). The determination can be made by analyzing morphological features of the segments.
  • Another example process 620 of bio-authenticating a subject is shown in FIG. 6C. A machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 620. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. The machine may also use the calculated MPTT to further generate additional biometric measurements, the processes for which are discussed below. In the process 620, initially, data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (622). Data in a second dataset that represents time-varying information about motion of the subject can also be processed (624). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Based on the first and second datasets, at least two parameters of the subject can be determined (626). The parameters can include one or more of blood pressure, respiratory rate, blood oxygen levels, heart rate, heart rate variability, stroke volume, cardiac output, MoCG morphology, and PPG morphology. A biometric signature of the subject can then be determined (628). In some implementations, the (628). The biometric signature can be represented in a multi-dimensional space. Each axis can correspond to at least one of the determined parameters. A determination can then be made of whether the biometric signature was captured from a subject who is an expected subject (630). The determination can be made by analyzing features of the biometric signature.
  • In some implementations, the biometric signature based user identification can be used in providing rewards and/or discounts to a user. For example, if the identified user is determined to be adhering to a particular exercise regimen, reward points or incentives such as discounts on particular products can be credited to the corresponding user account. Therefore, a user can be motivated to keep adhering to particular good practices to keep getting such rewards or discounts.
  • Motion Pulse Transit Time (MPTT) Calculation
  • The information collected from the motion sensors 105 and the optical sensors 110 of FIG. 1B is used to calculate the MPTT, which can be used to further calculate the biometric parameters, such as blood pressure, stroke volume, etc. An example process 700 for the MPTT calculation is shown in FIG. 7A. A machine, such as a processor, that receives the information from the motion sensors 105 and the optical sensors 110 can perform one or more steps of the process 700700. The machine may further provide the calculated results to, for example, the wearer, another person who is interested and authorized to receive the information, or another machine for further data processing or data storage. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. The machine may also use the calculated MPTT to further generate additional biometric measurements, the processes for which are discussed below.
  • In the process 700, initially, the MoCG data for use in the MPTT calculation can be preprocessed (702). During any time period, the motion sensor or sensors (e.g., the accelerometers) collect three sets of MoCG data along three orthogonal axes, x, y, and z, or along polar coordinates. The three sets may be combined by selecting a weight, wx, wy, wz for each set and summing the weighted sets. An example of the weight selection is shown in FIG. 8, which illustrates two dimensional heat-map diagrams 800, 802, and 804 produced from power spectra of MoCG ensembles collected over time. In each of the diagrams 800, 802, 804, the horizontal axis represents the frequency and the vertical axis represents frames of MoCG data collected over time. Therefore each row in the diagrams represents the power spectrum of a corresponding frame of MoCG data. The colors represent the values of the energy level. The weights wx, wy, wz can be assigned, using respective diagrams, based on the ratio of energy inside the heart rate range to the energy outside the heart rate range. If the power spectra is consistent across the different frames and/or is a harmonic of the already calculated heart rate (as illustrated in the diagram 804), the corresponding axis (the z axis in this example) is assigned a higher weight than the other axes. The lines 806, 808, and 810 in FIG. 8 represent the first, second, and third harmonic, respectively of the measured heart rate in this time segment. In the example shown in FIG. 8, the assigned weights are wx=0.03, wy=0.15, and wz=0.95. The MoCG data for the MPTT calculation is then calculated as the weighted sum of the three sets of MoCG data for the three axes. Alternatively, a single axis can be selected (e.g., the axis with the highest weight) while ignoring the others. For example, only the z axis can be selected for the example shown in FIG. 8. In some implementations, axis selection can be performed by independently analyzing each axis and then combining the axes based on agreement of the candidate MPTT values. This may be done, for example, to avoid the calculation of a power spectrum signal without sacrificing on the accuracy.
  • Referring again to FIG. 7A, a representative segment of the PPG data is generated (704704) for calculating the PPT. In some implementations, the representative PPG segment is generated by averaging across multiple PPG segments of the same length. FIG. 9 shows an example of the representative segment 904 of the PPG data used in determining the MPTT. The representative segment 904 in this example is calculated by averaging across multiple segments 906 of equal duration. The MoCG data is then analyzed using the representative segment (706) to calculate candidate MPTT values. The representative segment can be calculated, for example, by averaging across multiple segments of equal duration arranged on the same time grid as a representative PPG signal. A short segment of the MoCG data 902 (of equal duration to the representative segment 904) and the representative segment 904 are aligned in time, for example, by aligning inflection points (or valleys or peaks). The length of the segment 904 and the corresponding MoCG data can be in the order of several seconds. In the example shown in FIG. 9, the length of the segment 904 is 2 seconds. However segments of other lengths (e.g. 1.5 seconds-5 second) can also be used. In some implementations, the representative segment is generated from data collected when a user is stationary, so that the data does not include a significant amount of unwanted noise.
  • In some implementations, the MPTT is measured as the difference between a time point to when a mid-systole portion 908 of the representative PPG segment 904 is measured, and a second time point representing the portion of MoCG data corresponding to the mid-systole. Because the MoCG data represents the motion due to an actual heartbeat, and the PPG data represents a pulse wave arrival recorded at a distance from the heart, the second time point generally occurs before to. Since a human body is not a rigid body, as defined by the laws of mechanics, the MoCG pulse arrives at the location where the device is located in a somewhat delayed (but constant per individual) fashion. The portion of MoCG data corresponding to the mid-systole is typically manifested as a peak or valley in the MoCG data, and the MPTT can be determined by identifying the correct peak or valley corresponding to the mid-systole. While mid-systole is used as a reference point in this example, other portions of the cardiac morphology can also be used as the reference point. Based on a priori knowledge of typical MPTT, a predetermined time range relative to to is searched and the peaks and valleys detected within the predetermined time range are flagged as potential candidates for being the correct peak or valley corresponding to the mid-systole. Therefore, the difference between the time point corresponding to each such valley or peak and the time to represents a hypothetical MPTT. The correct MPTT value is determined based on the hypothetical MPTTs, as described using the example below.
  • The predetermined time range can be chosen to be, for example, between 10 to 400 ms, or another duration longer than an actual expected range. Within the predetermined time range, seven peaks and valleys 910, 912, 913, 914, 916, 918, 920, corresponding to time points t1, t2, t3, t4, t5, t6, t7, respectively, are identified on the MoCG plot 902. Accordingly, seven hypothetical MPTTs are determined as, h1=t0−t1, h2=t0−t2, h3=t0−t3, h4=t0−t4, h5=t0−t5, h6=t0−t6, and h7=t0−t7.
  • Next, for a given hypothetical MPTT (e.g., h1), a longer segment 1000 of the MoCG data (e.g., of 20 second duration, as shown in FIG. 10A) is aligned with the corresponding PPG data, and the time points corresponding to mid-systoles in the PPG pulses are identified as reference points. The MoCG data is checked at each time point preceding the reference points by h1 (and possibly within a small time range around such time points) for the presence of a peak or valley. If a peak or valley is detected, it is flagged, and the total number of flagged peaks and valleys for the entire segment of MoCG data are recorded. FIG. 10A illustrates a 20 second segment of MoCG data, along with flagged peaks and valleys corresponding to one particular hypothetical MPTT. In the example of FIG. 10A, the flagged peaks and valleys are identified by markers (e.g., circles) 1008, 1010.
  • The above process is repeated for each of the hypothetical MPTTs and the total number of peaks or valleys are recorded for each case. The plots corresponding to two other hypotheses are illustrated in FIGS. 10B and 10C. In some implementations, one of the hypothetical MPTTs is chosen as the true MPTT value, based on the recorded number of peaks or valleys. For example, the hypothetical MPTT that yields the maximum number of peaks or valleys can be chosen as the true MPTT value. In some implementations, the hypothetical MPTTs can be combined together as a weighted sum to obtain the true MPTT value. The weights can be assigned based on, for example, a ratio of the number of flagged peaks (or valleys) to the total number of reference points, and a consistency of the flagged peaks (or valleys) defined as a signal-to-noise ratio:

  • SNR=mean (amplitudes of flagged peaks)/standard deviation (amplitudes of flagged peaks)
  • A weight for a given hypothetical MPTT can then be determined as:

  • Weight=((Number of flagged peaks)/(total reference points))2*log(SNR)
  • Next, a 2D histogram or is generated (708) from the MPTT values calculated during a predetermined time range. For example, the predetermined time range can be the duration for which a user wears the device 100. An example of such a histogram is shown in FIG. 11A, where the y axis represents a calculated MPTT value (averaged over 60 seconds), the y axis represents time, and the darkness of each point represents calculated confidence measure associated with the calculated MPTT. The different horizontal sets represent candidate MPTT values for different time ranges. A representative set can be selected from the candidate sets based on, for example, a priori knowledge about the expected MPTT, and/or confidence measures associated with the points in the set. For example, from FIG. 11A, the sets 1111 or 1112 can be selected as the best representative sets for the MPTT, based on the confidence levels associated with the points (as represented by the darkness of the points), as well as a priori knowledge that the MPTT is expected to be within a 250-350 ms range. Therefore, more consistent (and hence reliable) estimates of MPTT values can be identified from the histograms, and the average MPTT value over the predetermined time range can be calculated (710), for example, as an average of the consistent MPTT values. Inconsistent MPTT values can be discarded from being included in computing the average MPTT. Other parameters such as average SV can also be calculated using similar plots. Before generating such plots, individual estimates of SV (in ml/heartbeat) can be calculated from the amplitude of the MoCG signal based on the fact that SV varies directly with the average amplitude of the MoCG.
  • In some implementations, only one candidate MPTT value can be selected. For example, the candidate MPTT value having the highest weights and/or an appropriate or expected morphology can be selected. In some implementations, a confidence measure can be determined for each measurement of MPTT (or other biometric parameters) to indicate the confidence one has in the reading. An example is shown in FIG. 11B, which illustrates computation of confidence measures 1120 corresponding to the calculated values of MPTT 1115. The confidence measures can be used, for example, to determine whether a calculated value can be used for subsequent computations.
  • An example process for calculating MPTT is shown in FIG. 7B. The process can be executed, for example by the device 100 described above with reference to FIG. 1B. Operations of the process can include obtaining a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject (722). The first data set can be obtained from a first sensor such as a PPG sensor. The operations also include obtaining a second data set representing time-varying information about motion of the subject at the first body part of a subject (724). The second data set can be obtained from a second sensor such as a motion sensor.
  • The operations further include identifying a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part (726) and identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject (728). Identifying the first point can include, for example, computing a cross-correlation of a template segment with each of multiple segments of the first dataset, identifying, based on the computed cross-correlations, at least one candidate segment of the first dataset as including the first point, and identifying a first feature within the identified candidate segment as the first point. Identifying the second point can include, for example, determining a reference point in the second data set, wherein the reference point corresponds to substantially the same point in time as the first point in the first data set. One or more target features can then be identified within a predetermined time range relative to the reference point, and a time point corresponding to one of the target features can be selected as the second point.
  • The operations also include computing MPTT as a difference between the first and second time points (730). The MPTT represents a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject can then be used in computing various parameters such as blood pressure or arterial stiffness.
  • Use of the MPTT and SV Values
  • The calculated MPTT value is related to elasticity of the blood vessels as shown in the following equation:
  • PTT = L PWV = L Eh 2 ρ r , ( 1 )
  • where L is the vessel length, PWV is the pulse wave velocity, E is the Young's modulus, h is the vessel wall thickness, ρ is the blood density, and r is the vessel radius.
  • The elasticity is in turn related to the vessel pressure P as:

  • E=E 0 e αP,   (2)
  • where E0 is an elasticity parameter, and α is about 0.017 mmHg−1. Based on (1) and (2), the vessel pressure P can be derived as:

  • P=A ln(PTT)+B,   (3)
  • where A and B are parameters calculated as follows:
  • A = - 2 α B = 1 α ln ( 2 L 2 ρ r E o h )
  • The pressure value calculated using (3) represents diastolic pressure (Dia). The systolic pressure (Sys) can then be computed as:

  • Sys=Dia+C*SV,   (5)
  • where A is a universal constant that applies to all users and is unitless, B is an individual constant in units of mmHg, C is an individual constant in units of mmHg/mg, and SV is the stroke volume.
  • Calibration
  • The parameters B and C for calculating the diastolic and systolic pressures may vary from one person to another. Accordingly, a process or device may need to be calibrated for an individual before use. Generally, the calibration is performed the first time the accelerometer and the optical sensor are used for measuring and the algorithms are used for calculating the MPTT, SV, and the other parameters.
  • An example process 1200 of calibration performed by a machine, such as a processor, is shown in FIG. 12. The machine receives (1202) known reference systolic and diastolic pressures (SysO and DiaO), e.g., as input from a wearer. If the pressures are unknown to the wearer, generic values of 120/80 mmHg are used. In such cases, the wearer may be allowed to alter the calibration at a later time when the actual pressures becomes known. The machine also calculates (1204) the MPTT and the SV using methods described above. The machine then calculates the constants B and C (1206) for this particular wearer based on the following equations:

  • B=refDia−A ln(MPTT), and

  • C=(refSys−refDia0)/SV.
  • The values of the parameters are saved or stored (1208) for the individual. In some situations, a device (e.g., the device 100) including the accelerometer and the optical sensor can be used by multiple people. A calibration is performed for each individual following the process 1200 and a set of calculated parameters are stored in association with the corresponding person. The device may automatically choose a set of stored parameters for use with an individual based on biometric identifications of the individual, or may ask the individual to self-identify and choose the correct set of parameters for use, in case the device is shared among multiple users.
  • After the calibration, blood pressure measurements based on continuously acquired data can be made available for each individual by converting the MPTT and SV into systolic and diastolic pressures as described above.
  • In some implementations, the systolic and diastolic pressures can also be calculated by adding time-varying parameter estimations based on second order parameters. For example, the diastolic pressure can be calculated as:

  • Dia=B+A*ln(MPTT)+D*f(HR)+E*g(temperature)
  • where f(.) and g(.) are predetermined functions, and the parameters D and E are time dependent and individual dependent. The parameters can be calibrated when at least two calibration points (e.g., two known sets of systolic and diastolic pressures) at different times are available.
  • Generally, the calibrated parameters do not change frequently. These parameters may be affected by arterial diameters, arterial wall thicknesses, arterial lengths, arterial elasticity, and other physical parameters related to the cardiovascular system of a human body. The majority of the volume of blood related to MPTT travels through large arteries, and is less susceptible to hydrostatic changes, temperature, or peripheral tone. Curves representing relationships between MPTT and blood pressure are illustrated in FIG. 13. As seen from this example, while the curves may differ from one person to another, the general shapes of the curves are similar.
  • Because multiple calibration points for a given person appear to remain on the corresponding curve, consistent data may be obtained for a reasonably long time after one calibration. With the system being calibrated around the reference ‘normal’ blood pressure values, if the user's blood pressure deviates from the original calibration values over time, the device will correctly identify that the BP values are different but with reduced accuracy. At that point the device may alert the user that calibration is required. In some cases, the device may not require recalibration for several months. As an example, FIG. 14 illustrates systolic pressure measured over 90 days after a single calibration, and in the absence of any additional recalibration.
  • In addition to using the PPG data and accelerometer data (e.g., MoCG data) discussed above to determine certain vital signs (e.g., blood pressure (BP), HR, HRV, respiratory rate, blood oxygen levels, SV, and cardiac output (CO)) of the wearer of the device, a processor (e.g., a processor of the computing device 115 (shown in FIG. 1B), or of an external computing device to which the PPG data and the MoCG data is transmitted) can be programmed to use this data to detect or predict certain health-related conditions.
  • Detection of Irregular Heart Rhythms
  • The processor can be programmed to use the PPG data and accelerometer data to detect arrhythmia or irregular heart rhythms, such as arterial fibrillation (AFIB) or atrial flutter. FIGS. 15A-15D shows graphs in which heart rate data of the wearer of the device 100 is plotted. The graphs show heart rate data plotted over a 24 hour period (FIG. 15A), during the day (FIG. 15B), and during the night (FIG. 15C). Specifically, each of these graphs includes R wave to R wave interval (RRi) along the x-axis and RRi+1 along the y-axis. The plotted data can be used to determine whether the subject has a normal heart rhythm or an irregular heart rhythm, as described below. The plots can be updated after predetermined intervals (e.g., every 5-10 minutes) in order to capture any transient anomaly.
  • To populate the graphs shown in FIGS. 15A-15C, the PPG and accelerometer signals are used in the manner described above to determine the instantaneous heart rate of the wearer for each heartbeat of the wearer over a period of time (e.g., 20 seconds). The RR values are then determined by examining the instantaneous heart rate curve to determine the time between each of the successive heartbeats. Each RR value is equal to the time between two consecutive heartbeats. Each RR value (RRi) is then plotted versus the subsequent RR value (RRi+1).
  • The graphs shown in FIGS. 15A-15D represent plots of a subject with a normal heart rhythm. With a normal heart rhythm, the time between beats tends to be fairly consistent. For example, while a healthy individual's heart rate increases as a result of certain activities, such as exercise, the heart rate tends to increase gradually over time. Thus, while the individual's heart rate may be significantly higher during such activities (as compared to his or her heart rate at rest), the difference in time between consecutive heartbeats should be fairly consistent over the course of a small number of consecutive heart beats. Similarly, while a healthy individual's heart rate may decrease significantly as he or she recovers from such activities, the heart rate tends to decrease gradually over time meaning that the difference in time between consecutive heartbeats should be fairly consistent during such a recovery period. Thus, in a healthy individual, the RRi vs. RRi+1 plot will typically be fairly linear along a diagonal, as shown in FIG. 15D.
  • FIGS. 16A-16C show heart rate data for individuals with different heart conditions. For example, FIG. 16A shows heart rate data taken over a 24 hour period from an individual having atrial fibrillation (AFIB). FIG. 16B shows heart rate data taken over a 24 hour period from an individual having atrial flutter, and FIG. 16C shows heart rate data taken over a 24 hour period from an individual having a normal heart rhythm. Referring first to FIG. 16A, AFIB is apparent since the spread of the various RR data points from the expected diagonal is greater than a predetermined spread value. AFIB causes erratic beating of the heart resulting in the time between consecutive heartbeats varying significantly from one pair of heartbeats to the next. It is this characteristic that causes the plot of RRi vs. RRi+1 to spread significantly from the expected diagonal (i.e., the diagonal plot of an individual who has a regular heart rhythm (as shown in FIG. 16C)).
  • Referring now to FIG. 16B, atrial flutter can be seen by the multiple clusters of data that are offset from the diagonal. Atrial flutter results in changes in heart rate in multiples, which produces the multiple clusters of data that are offset from the diagonal.
  • In addition to being programmed to detect irregular heart rhythms, such as arterial fibrillation (AFIB) or atrial flutter, the processor can be programmed to alert the wearer in response to detecting such irregular heart rhythms. For example, the processor can activate an audio or visual alarm of the device, which can, for example, instruct the wearer to seek medical attention.
  • An example process 1700 of detecting arrhythmia of a subject is shown in FIG. 17. A machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 1700. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 1700, initially, data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (1702). Data in a second dataset that represents time-varying information about motion of the subject can also be processed (1704). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Arrhythmia of the subject can be detected based on the processed data (1706). Arrhythmia can include atrial fibrillation or atrial flutter. Processing the data can include determining whether a spread of plotted R wave to R wave intervals versus next consecutive R wave to R wave intervals exceeds a predetermined spread value. Processing the data can also include determining whether multiple clusters of plotted data points are offset from a diagonal
  • Detection of Arterial Stiffness
  • Another health-related characteristic that can be detected by the device described herein is arterial stiffness, which is an indicator for vascular health (e.g. arteriosclerosis), risk for hypertension, stroke, and heart attack. The stiffer the arteries, the faster the blood wave travels (due to fluid dynamics) and thus the shorter the MPTT. The processor can therefore be programmed to calculate arterial stiffness as a function of the pulse transit time (MPTT).
  • Certain conventional devices that are used to assess arterial stiffness require devices to be placed at two different locations of the subject (e.g., at the carotid and leg of the subject). Thus, the device described herein, which is able to collect from a single location of the subject all necessary data for determining arterial stiffness, tends to be more convenient than those conventional devices.
  • The processor can be programmed to inform the wearer of the device of his or her arterial stiffness value by, for example, causing that value to be displayed on the display of the device. In addition, the arterial stiffness value can be used as one of multiple factors for assessing the overall health of the wearer. In some cases, for example, the processor is programmed to use arterial stiffness of the wearer to determine a health metric (e.g., a health score) for the wearer. The health score may be a numerical value. In some cases, the numerical value is between 1 and 10 or between 1 and 100.
  • As shown in FIG. 18, the arterial stiffness of a subject tends to decrease as the activity level of the subject (e.g., the number of times per week that the subject exercises) increases. Thus, arterial stiffness is one parameter that can be monitored by the device and shared with the user to track the progress of a subject involved in an exercise regimen. This can serve as positive feedback for the user in addition to conventional feedback, such as weight loss.
  • Detection of Sleep Conditions
  • The processor can also be programmed to use the PPG data and accelerometer data to detect sleep disorders, such as sleep apnea, and to deduce sleep quality and sleep stages. Referring to FIG. 19, to analyze the sleep of the wearer of the device, the processor first analyzes the low frequency components of the accelerometer data to identify sleep rest periods (SRPs), which are periods in which the accelerometer data is substantially flat for a minimum period of time (e.g., 90 seconds). The flatness of the accelerometer data indicates that the wearer of the device is not moving during the SRPs. Thus, SRPs are periods during which the wearer of the device is likely to be asleep.
  • FIG. 19 illustrates three separate SRPs (SRP1, SRP2, and SRP3). SRP1 and SRP2 and SRP2 and SRP3 are respectively separated from one another by a brief period of motion by the wearer of the device. However, for purposes of analyzing the heart rate signal for sleep conditions, the three SRPs are treated as a single sleep cycle. The processor can, for example, be programmed to treat periods of motion that last less than five minutes as not interrupting a sleep cycle during which that motion occurs.
  • After identifying the SRPs, the processor uses the PPG data and the accelerometer data collected during the SRPs to calculate the average heart rate, the standard deviation of the heart rate, the average heart rate variability (HRV), and the average activity level for each of the SRPs. In addition, the processor analyzes the complexity of the heart rate signal and the deviation from diagonal of values plotted on an RRi vs. RRi+1 plot. These parameters can be used to confirm that the wearer of the device was sleeping during the SRP being analyzed and to identify certain sleep conditions and sleep disorders, as discussed below. In some implementations, jetlag can also be detected by analyzing heart rate during sleep. For example, an upward heart rate during sleep can indicate a presence of jetlag, and a flat heart rate during sleep can indicate that the subject is not jetlagged.
  • Because lack of motion cannot alone be used to determine whether the wearer of the device was sleeping, the processor can be programmed to consider the average heart rate, the standard deviation of the heart rate, and the average heart rate variability (HRV) to confirm that the wearer was sleeping during the SRP being considered. For example, the average heart rate, the standard deviation of the heart rate, and the average heart rate variability (HRV) of the subject over the SRP being analyzed is compared to the baselines of these values in the subject. If they fall below the baseline by a predetermined amount, this confirms that the subject was asleep during the period being analyzed.
  • Once the processor has confirmed during which of the identified SRPs the wearer was sleeping, the data collected during those periods can be analyzed to provide detailed information about the wearer's sleep and to deduce the sleep quality. For example, by analyzing the PPG data and the accelerometer data during the relevant time periods, the processor can determine the number of hours slept by the wearer, the sleep latency of the wearer (e.g., the length of time that it took for the subject to transition from wakefulness to sleep), the number of times that the wearer tossed and turned, and the percent of time that the wearer was asleep between the time that he or she went to bed and got up. In some cases, the processor can further determine the deepness of the sleep of the wearer during each of the SRPs. The deepness of the sleep is sometimes referred to as the sleep stage. For example, if the accelerometer detected minimal movement and the patient's heart rate variability was a predetermined amount below the wearer's baseline heart rate during a portion of the SRP, it can be concluded that the wearer was in a deep sleep during that portion of the SRP. If the accelerometer detected some movement and the patient's heart rate was higher than can be expected of a deep sleep during a portion of the SRP, it can be concluded that the wearer was in REM sleep during that portion of the SRP. Otherwise, it can be concluded that the wearer was in a light sleep during that portion of the SRP.
  • In some cases, the processor is programmed to use the above-noted parameters (e.g., the number of hours slept by the wearer, the number of times that the wearer tossed and turned, the percent of time that the wearer was asleep between the time that he or she went to bed and got up, and the deepness of sleep) to derive a quality of sleep metric or sleep score. The wearer can monitor his or her sleep score over time in an effort to modify his or her sleep habits and maximize the quality of his or her sleep. It has been found that the use of such scores, as opposed to the various different related parameters, are more easily understood by users.
  • In some cases, the processor can cause the device to automatically display the sleep score when the wearer is determined to have awoken. The device can determine when the wearer has awoken based on information related to the SRPs. Based on characteristics related to the wearer's sleep, information can be provided to the wearer to assist the wearer in improving his or her sleep score. In some implementations, the wearer can be provided with a recommended sleep schedule. For example, if the wearer is determined to have been getting too little sleep, the recommended sleep schedule may suggest that the wearer go to bed earlier in the evening or sleep in later into the morning. The information can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • As noted above, in addition to generally determining the quality of the wearer's sleep, the processor can detect certain sleep disorders, such as sleep apnea. FIG. 19B illustrates the heart rate signal of the wearer during a period of time in which the wearer experienced an episode of sleep apnea. Referring to FIG. 19B, the heart rate signal of the wearer is complex from 2:54 AM until about 3:16 AM at which time the heart rate of the wearer spikes suddenly. From 3:16 AM until about 3:30 AM, the heart rate signal is simple (i.e., includes periodicity or a repeating pattern). The presence of a simple heart signal at least every two minutes during an SRP can be indicative of sleep apnea.
  • The processor can be programmed to carry out a multi-step test to detect sleep apnea. First, the processor analyzes the heart rate throughout the SRP being analyzed. If the difference between the minimum heart rate and the maximum heart rate during the SRP is less than a threshold heart rate differential, then the processor determines that there was no sleep apnea and the test is concluded. If, however, the minimum-maximum heart rate differential exceeds the threshold heart rate differential, then the processor determines that sleep apnea could be the cause and a carries out a further analysis of the SRP. Specifically, the processor analyzes the heart rate variability, the plotted RR points, the complexity of the signal, and the activity level of the subject during the SRP
  • If the heart rate variability is lower during the SRP than in neighboring periods, then this weighs against a finding of sleep apnea. If, however, the heart rate variability during the SRP exceeds the heart rate variability during neighboring periods, then this weighs in favor of a finding of sleep apnea.
  • Similarly, if the spread of data points in an RRi vs. RRi+1 plot largely lie along the diagonal, this weighs against a finding of sleep apnea. If, however, the data points are spread from the diagonal, then this weighs in favor of a finding of sleep apnea. The data points would be expected to spread from the diagonal during a sleep apnea episode because the wearer's heart rate would drastically increase in a very short period of time due to lack of oxygen in the wearer's blood. This drastic increase in a short period of time would translate to a larger than normal discrepancy between the RRi and RRi+1 values during that time period.
  • Another factor used to determine whether the wearer has sleep apnea is the complexity of the heart rate signal. If the heart rate signal is complex during the SRP, then this weighs against a finding of sleep apnea. If, however, at least every two minutes, the heart rate signal becomes simple (i.e., has periodicity or a repeating pattern), then this weighs in favor of sleep apnea.
  • Activity level is another factor used to identify sleep apnea. If the activity level of the wearer during the SRP being analyzed (as determined using the accelerometer data) is greater than the activity level of the wearer during neighboring periods, this weighs against a finding of sleep apnea. If, however, the activity level of the wearer during the SRP being analyzed is less than the activity level of the wearer during neighboring periods, this weighs in favor of a finding of sleep apnea.
  • The processor can be programmed to determine the presence or absence of sleep apnea as a function of heart rate, heart rate variability, the location of data points on the RRi vs. RRi+1 plot, the complexity of the heart rate signal, and the activity level of the subject.
  • In some cases, the processor can be programmed to determine a correlation between the wearer's sleep quality and an amount of light that the wearer is exposed to. FIG. 20 shows an example screenshot 2000 on a mobile phone 2002 of a wearer that displays qualities of the wearer's sleep in conjunction with light levels during various times. In this example, the wearer slept for 7 hours and 52 minutes total, awoke, 4 times, and has a sleep score of 74. The screenshot also includes two bars: one bar shows times when the wearer had low-quality sleep, and another bar shows the measured light levels during those times. In this way, a correlation is made between the wearer's sleep quality and light levels experienced by the wearer. The screenshot 2000 also includes a link 2004 for the wearer to receive sleeping environment tips that can improve his or her sleep quality.
  • Upon detecting an episode of sleep apnea, the processor can alert the wearer that he or she may have experienced an irregular sleep pattern.
  • An example process 2100 of determining information about a characteristic of a subject's sleep is shown in FIG. 21. A machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2100. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 2100, initially, data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (2102). Data in a second dataset that represents time-varying information about motion of the subject can also be processed (2104). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). The information about at least one pulse pressure wave propagating through blood in the subject can include photoplethysmographic (PPG) data, and the information about motion of the subject can include one or both of motioncardiogram (MoCG) data and gross motion data. Based on the data, information about a characteristic of the subject's sleep can be determined (2106). The characteristic can include a quality of the sleep of the subject. The quality of the sleep of the subject can include one or more of a sleep duration, a sleep latency, a sleep staging, latency to sleep, a number of disturbances, and a number of tosses and turns. The characteristic of the subject's sleep can also include sleep apnea.
  • Fitness-Related Applications
  • The processor can also be programmed to perform various fitness applications that allow the wearer to monitor his or her fitness level. As an example, the processor can be programmed to analyze the accelerometer data over a given period of time (e.g., 15 minutes) to determine the total number of steps taken by the wearer during that time. The processor is programmed to look for rhythm/cadence to detect walking as opposed to other ordinary motion, such as hand motions and vibrations. The absolute value of the accelerometer data will typically be higher during periods of walking that during periods of most other daily activities.
  • In addition, the processor can calculate calories burned over a given period of time by analyzing the activity level of the wearer and/or the heart rate of the user. Using both the activity level and the heart rate to determine calories burned can lead to a more accurate estimation of caloric output.
  • In some cases, the processor is programmed to provide a fitness score based on certain fitness-related parameters, such as resting heart rate. The more fit an individual is, the lower his or her baseline HR will be. Thus, in some cases, the processor is programmed to determine a fitness score based on the average heart rate of the wearer during sleep periods or periods of inactivity. Additionally, the speed of heart rate recovery can be a strong indicator of a person's fitness level. For example, the more fit an individual is, the faster his or her heart rate returns to the baseline after exercising. Similarly the more fit an individual is, the longer it takes for his or her heart rate to increase during exercise. Thus, in certain cases, the processor is programmed to determine an individual's fitness score based on the amount of time that it takes for the individual's heart rate to reach a maximum during exercise and the amount of time that it takes for his or her heart rate to return to the baseline after exercise.
  • In some cases, the processor can cause the device to automatically display the fitness score when the wearer is determined to be in the fitness state. For example, the fitness score may be displayed when the wearer starts to go for a run, and may be displayed throughout the run. In some implementations, the fitness score may be displayed when the wearer transitions from a fitness state to a non-fitness state. For example, the fitness score may be displayed when the wearer finishes a run. In some implementations, the device can determine when the wearer is in the fitness state based on the gross motion data and the vitals of the wearer, such as the wearer's heart rate. Based on characteristics related to the wearer's fitness, information can be provided to the wearer to assist the wearer in improving his or her fitness score.
  • FIG. 22 shows an example screenshot 2200 displaying a fitness score on a mobile phone 2202 of a wearer. The information on the screenshot indicates that the wearer has improved his or her fitness score by two points. The screenshot also provides the wearer with updated personalized training zones. The personalized training zones represent the heart rate that the wearer should strive to achieve under various exercise conditions. For example, if the wearer is performing extreme exercise, he or she should strive to have a heart rate of more than 151 beats per minute.
  • In some implementations, the wearer can be provided with a recommended fitness routine. For example, it may be determined that the wearer has trouble completing a three-mile run, as indicated by an abnormally high heart rate during the run. The recommended fitness schedule may suggest that the wearer run one mile twice a week for a week in order to improve his or her fitness, thereby allowing the wearer to work up to a fitness level appropriate for safely completing a three-mile run. The information for assisting the wearer can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • In some implementations, the device may have access to other users' vital information and fitness scores, such that a wearer of the device can compare his or her fitness score to those of other people. For example, a professional athlete who uses the device while training exhibits particular vital information and fitness scores. A wearer of the device may want to follow the same training regimen as the one that the professional athlete follows. However, following the same training regimen does not necessarily produce the same results. For example, a wearer of the device may follow the same training regimen as a professional athlete, but he may not exhibit the same level of effort as the professional athlete. By gaining access to the professional athlete's vital information and comparing it to the wearer's vital information, the device can determine the degree of similarity between the wearer's training level and the professional athlete's training level.
  • In some implementations, vital information of a professional athlete from when the athlete performed or is performing a particular training routine is presented to the wearer while the wearer performs the same training routine. For example, a video showing the athlete performing the training routine can include a visual indication of the athlete's BP, HR, and respiratory rate over the course of the training routine. As the wearer performs the same training routine while watching the video, the wearer can determine whether he or she is experiencing a similar BP, HR, and respiratory rate as the athlete, thereby indicating whether the wearer is training with the same intensity as the athlete. The video may be configured to interact with the device such that the video encourages the wearer to try harder if the wearer's intensity is below that of the athlete. Similarly, after training, the device can continue to monitor the BP, HR, and respiratory rate of the wearer to determine whether the wearer is physically recovering as well as the athlete.
  • The vital information of the professional athlete can be used to determine the athlete's physical state at particular times during competition. For example, the athlete's vital information can represent how the athlete physically feels while completing the last 20 meters of a 100 meter dash, or while catching a game-winning touchdown as time expires. A wearer may desire to recreate this feeling for himself or herself. In some implementations, the device is configured to assist the wearer in recreating similar competition situations. For example, the athlete's vital information may indicate that a wide receiver had a particular BP, HR, and respiratory rate while catching a game-winning touchdown in a championship game. The particular BP, HR, and respiratory rate may be significantly higher than they typically would be due to the intensity and importance of the game situation. In order to recreate the situation, a wearer cannot simply go to a local football field and catch a pass from a friend because the wearer would not be in the same physical state that the wide receiver was in at the time of the catch. Rather, the user needs to match the wide receiver's BP, HR, and respiratory rate before recreating the catch. The wearer may perform various actions or activities to artificially match the wide receiver's vitals (e.g., running, listening to loud or exciting music, etc.). When the wearer has achieved a physical state that matches the athlete's, the device can alert the wearer. At that point, the wearer can recreate the game situation with improved accuracy.
  • In some implementations, the wearer can recreate the game situation with the aid of a virtual reality device, such as a stereoscopic device that creates a computer-simulated environment. For example, the stereoscopic device can be used to aid the wearer in artificially matching his or her vitals with the athlete's by presenting to the wearer the same visuals and sounds that the athlete experienced before the game situation. Once the wearer has achieved a matching physical state, the stereoscopic device can also be used to recreate the particular game situation or play. That is, rather than catching a real football from a real person, the stereoscopic device can display visuals that simulate the action of catching the game-winning touchdown.
  • Concepts similar to those described above can also apply in the context of combat training A person in a real combat situation typically exhibits increases in BP, HR, and respiratory rate due to the danger of the situation. Training for these situations does not involve the same risk of danger. Thus, such training is typically not performed under the same physical conditions. That is, a trainee does not have the same BP, HR, and respiratory rate that he would otherwise have in a real combat situation. In some implementations, a person's vital information can be used to determine the person's physical state at particular times during a real combat situation. For example, a Navy SEAL may exhibit a particular BP, HR, and respiratory rate while performing a raid of a terrorist hideout. A trainee who is wearing the device may perform various actions or activities to artificially match the Navy SEAL's vitals. When the trainee has achieved a physical state that matches the Navy SEAL's, the device can alert the trainee, who can then recreate a training scenario with improved accuracy.
  • Monitoring Stress Levels
  • The processor can also be programmed to analyze the PPG data and the accelerometer data in a way to determine the stress level of the wearer of the device. Heart rate (HR), heart rate variability (HRV), blood pressure (BP), and respiratory rate are all indicators of stress. Specifically, the values of these parameters increase as stress levels increase. Thus, by comparing these values to baseline values of the wearer for associated parameters, the level of stress of the wearer can be estimated. The stress level can, for example, be provided to the wearer as a stress score.
  • In some cases, the processor can cause the device to automatically display the stress score when the wearer is determined to be in a stress state. The device can determine when the wearer is in a stress state based on the vitals of the wearer, such as the wearer's heart rate, heart rate variability, blood pressure, and respiratory rate. Based on characteristics related to the wearer's stress, information can be provided to the wearer to assist the wearer in improving his or her stress score. In some implementations, the wearer can be provided with a recommended stress-reducing routine. For example, the recommended stress-reducing routine may suggest that the wearer meditate at particular times (e.g., once a day) or adjust his or her daily schedule to minimize circumstances that are generally attributed to stress (e.g., sitting in traffic, working too much, etc.). The information can be provided on the display of the device or on a separate device, such as a mobile phone of the wearer.
  • FIG. 23 shows an example screenshot 2300 on a mobile phone 2302 of a wearer that includes a number of stress moments experienced by the wearer. In this example, the wearer has experienced four stress moments on the current day. A graph indicates the number of stress moments that the wearer has experienced throughout the week. The screenshot includes recommendations for the wearer to reduce his or her stress. In this example, the screenshot recommends that the wearer plan some rest, relaxation, and/or a meditation session to reduce stress. The screenshot also includes a link 2304 to a 1-minute relax sessions, during which the mobile phone guides the wearer on a relaxation session.
  • An example process 2400 of deriving information about a psychological state of a subject is shown in FIG. 24. A machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2400. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 2400, initially, data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (2402). Data in a second dataset that represents time-varying information about motion of the subject can also be processed (2404). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Information about a psychological state of the subject can be derived from the processed data (2406). The psychological state of the subject can be a state of stress, a malicious intent, or a state of lying. Relationships between at least some of the processed data and a psychological state of the subject can be inferred.
  • Health Metrics
  • As described above, one or more scores, also referred to as health metrics, can be derived based on data collected by the device 100. A machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. Operations of the process can include deriving a score that is associated with a state of a subject. The state of the subject can be one or more of a health state, a sleep metric, a fitness state, and a stress state. Deriving the score can be based on data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject. The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Deriving the score can also be based on data in a second dataset that represents time-varying information about motion of the subject. The machine can receive information from the motion sensor 105 of the device 100.
  • Triage Applications
  • The data produced by the device can be used to assist triage medical personnel in various settings. As an example, the device could be worn by military personnel in battle to provide medical personnel with valuable information regarding the vital signs of the military personnel. The devices worn by the military personnel can, for example, be configured to transmit data regarding their vital signs to a central computer manned by medical personnel. In the event that that multiple casualties are suffered at the same time, the medical personnel can view the vital signs of the various military personnel to prioritize medical care. As a result, the people that most need urgent treatment will receive it first, while those who have less threatening injuries will be attended to later.
  • In addition to being used for military personnel, the devices described herein could be used to assist medical personnel in various other triage settings, such as sites of natural disasters or terrorist attacks. For example, the medical personnel could be provided with a number of devices that could be put on patients in the triage setting as those patients are being assessed. In this way, after the medical personnel have performed an initial assessment of a victim and determined that he or she does not require urgent medical care, the medical personnel can leave that victim and focus their efforts on victims in more urgent need of medical care. While doing so, the vital signs of those victims who were initially assessed and determined not to require urgent medical care will be monitored and transmitted to a central monitoring station. Thus, in the event that the condition of one of those victims being monitored deteriorates to the point of requiring urgent medical attention, medical personnel in the area can be directed to that victim to provide the necessary medical care.
  • A machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform a process for risk assessment. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. The process can include processing data from a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject. The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Data in a second dataset that represents time-varying information about motion of the subject can also be processed. The machine can receive information from the motion sensor 105 of the device 100. The data can be acquired while the subject is in a situation associated with risk. Whether the subject is in a situation associated with risk can be indicated by the data. The risk can be trauma to the subject, and the data can be indicative of the existence of the trauma.
  • In additional to being used in the triage context, the devices described herein could be used to assist medical personnel in a hospital setting. Once a patient is stabilized following triage, he or she is typically monitored based on a provider's standard of care or mandate (e.g., according to an accountable care organization (ACO)). In some implementations, the device can continue to monitor the vital signs of the patient outside of the triage context to ensure that the care that the patient is receiving is appropriate in view of the patient's vitals. A provider's standard of care may require a patient to go through a progression of steps before the patient is deemed to be ready for discharge. The device can monitor the vital signs of the patient during each step of the progression. For example, the first step of the progression may involve monitoring the patient's vitals while the patient is resting (e.g., lying down and/or sleeping), the second step of the progression may involve monitoring the patient's vitals while the patient is sitting up in bed, the third step of the progression may involve monitoring the patient's vitals while the patient is standing up while being supported, the fourth step of the progression may involve monitoring the patient's vitals while the patient is standing up unassisted, and the fifth step of the progression may involve monitoring the patient's vitals while the patient is walking The device continuously monitors the patient's vitals throughout each of these stages and can present a notification if the vitals indicate that the patient is in a dangerous state (e.g., if the patient is progressing through each step too quickly without giving his or her body a chance to recover). In this way, the device monitors the patient's compliance with the provider's standard of care.
  • In some implementations, the patient's vitals can also serve as an indicator of the quality of care that the patient is receiving at a care facility. For example, the device can monitor the vitals of residents at a nursing home to determine the level of activity that the residents are experiencing. Data from the motion sensor of the device may indicate that the residents typically walk or perform other exercises one hour per day, and data from the ultraviolet light sensor of the device may indicate that the residents typically spend two hours per day outdoors. The monitored vitals can be compared to metrics defined by a health organization (e.g., the American Heart Association) to determine whether the residents are adhering to the organization's recommendations regarding physical activity and other health-related actions. The residents' level of compliance with the organization's recommendations can be used to assess the quality of care at the nursing home. In some implementations, the nursing home may be assigned a quality score based on the monitored vitals and the level of compliance with the organization's recommendations, and multiple nursing homes may be compared and/or ranked according to their quality scores. Similar concepts can also apply in the context of child care.
  • An example process 2500 of determining a quality of care provided to the one or more subjects by a care facility is shown in FIG. 25. A machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2500. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 2500, initially, data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of one or more subjects can be processed (2502). Data that represents time-varying information about motion of the one or more subjects can also be processed (2504). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). A quality of care provided to the one or more subjects by a care facility that cares for the one or more subjects can be determined (2506). Determining a quality of care can include determining a level of physical activity experienced by each of the one or more subjects. The level of physical activity can be determined by comparing gross motion data gathered by the motion sensor 105 to a threshold. Data that represents information about an amount of ultraviolet light that each of the one or more subjects has been exposed to over a particular time period can also be processed, and an amount of time that each of the one or more subjects has spent outside can be determined.
  • First Responder Applications
  • The devices described herein can also be beneficial to first responders, such as firefighters and police offers. By wearing the devices, the first responders will ensure that their vital signs are monitored before, during, and after any stressful events that they experience to ensure that they receive the help they need. This is illustrated in the example of FIG. 26, where health parameters of one or more firefighters 2605 on a potentially hazardous mission are obtained via devices 100 worn or carried by the firefighters 2605. In this example, the firefighters' vital signs could be obtained by the devices 100 and transmitted to a central monitoring station (e.g., within a fire truck 2610, or at a fire station) where the vital signs can be monitored to determine whether the firefighters 2605 are well enough to continue fighting a fire or otherwise responding to an emergency. In the event that a firefighter's health is considered to be in jeopardy based on his monitored vital signs, that firefighter could be prevented from continuing to fight the fire or respond to the emergency, for example, by sending an alert to the firefighter 2605 to retreat to a safe location.
  • In certain implementations, the devices 100 worn or carried by the firefighters 2605 further include GPS transponders. Such devices are particularly beneficial for situations in which one or more first responders 2605 become incapacitated in a dangerous setting. For example, in the event that a firefighter 2605 has a heart attack while fighting a fire inside a building, the device could not only send the firefighter's vital sign data to the central monitoring station to alert someone that the firefighter is in need of medical care, the device could also identify the location of the firefighter 2605 to a rescuer 2620 (possibly via a device 100) sent to assist the incapacitated firefighter 2605, such that the rescuer 2620 knows exactly where to go.
  • The communications about the health parameters of the one or more firefighters 2605 can be sent directly to the central monitoring station, or via a server 2630. In some implementations, if the server 2630 determines that a firefighter's mental/physical state is not suitable for continuing the mission, the server 2630 can send a signal to the firefighter (e.g., via the device 100, or via another communication device) to alert the firefighter 2605 about the situation. For example, if the health condition of the firefighter deteriorates during the mission (e.g., because of excessive smoke inhalation), a signal can be sent to the device 100 to alert the firefighter to take corrective measures.
  • In some implementations, the device 100 can be configured to communicate with the central monitoring station on the fire truck 2610. The data from the devices 100 can be transmitted to the server 2630 (possibly via the central monitoring station) for determining whether a firefighter 2605 is safe. The determination can also be made at the central monitoring station. The data from the device 100 may also indicate whether the wearer of the device 100 requires assistance from a rescuer 2620. The server 2630 and/or the central monitoring station can then alert the firefighter 2605 and/or a rescuer 2620 accordingly. In some implementations, if another individual (i.e., someone not in the firefighting team) is wearing a device 100, his/her location may also be tracked using information transmitted from the corresponding device.
  • Alertness Monitoring
  • The processor can also be programmed to monitor the alertness of the wearer. This can be particularly advantageous for personnel who perform tasks that require attention and concentration, and could result in serious harm or danger if carried out incorrectly. Examples of such personnel include air traffic controllers, pilots, military truck drivers, tanker drivers, security guards, TSA agents, intelligence analysts, etc.
  • To monitor the alertness of the wearer, the processor can analyze the respiratory rate, heart rate, blood pressure, and activity level of the wearer. Each of these parameters tends to decrease as a subject falls asleep. Thus, the processor can be programmed to conclude that the wearer's alertness level has dropped to an unacceptable level when one or more of those parameters falls a predetermined amount from the baseline of those parameters.
  • The processor can be programmed so that, upon determining that the wearer's alert level has dropped to an unacceptable level, an alarm (e.g., an audible, visual, or tactile alarm) on the device is activated. The alarm can raise the alertness level of the wearer and thus reduce risk of harm to the wearer and others.
  • As noted above, some wearers that may benefit from this application of the device are those wearers that drive vehicles or operate machinery that could present a danger if driven or operated incorrectly. In those cases, the processor can be configured to communicate with the vehicle or machinery for which the wearer is responsible. As an example, the device worn by a truck driver can transmit data regarding his or her alertness level to a controller of the truck. The controller can be configured to disable operation of the truck if the alertness level is below an acceptable threshold. For example, the controller can warn the driver that he or she has a certain period of time to pull the truck over before it is disabled. This will encourage the driver to pull off the road and either get some sleep or otherwise increase his or her alertness level before driving the truck again.
  • As an alternative to or in addition to taking the actions discussed above in response to detecting a potentially unsafe alertness level, the alertness data can be stored in a database for later analysis. Studying the alertness data from a large sampling of personnel in a given industry can help regulatory bodies for those industries to draft safety standards that increase or maximize safety while maintaining productivity.
  • Similarly, alertness data over a period of time for a particular wearer of the device can be analyzed to determine the overall physical and/or mental state of a given wearer (e.g., as opposed to the instantaneous state of the given user). Such information can be used to detect a trend of regressing physical and/or mental state of the given wearer. For example, although a wearer of the device may exhibit vitals that indicate that he is alert enough to perform a particular task (e.g., fly a plane) at a particular time, the wearer's alertness data over a period of time may indicate that the wearer's general alertness is on the decline. This may be due to the wearer's old age. The device can detect such a trend and alert the wearer and/or an external entity that the wearer should be closely monitored.
  • In some implementations, a process can be configured to acquire data while a subject is in a situation that requires a predetermined amount of alertness of the subject. A machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of such a process. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. Operations of the process can include processing data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in the subject. The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). The operations can also include processing data in a second dataset that represents time-varying information about motion of the subject. The machine can receive information from the motion sensor 105 of the device 100. The data can be acquired while the subject is in a situation that requires at least a predetermined amount of alertness of the subject. The situation can include one or more of air traffic control, intelligence analysis, vehicle driving, machinery driving, security guarding, baggage screening, and aircraft piloting.
  • Detection of Malicious Intent
  • The devices described herein can also be used as polygraph devices. Like conventional polygraph devices, the devices described herein gather a baseline for the wearer's vital signs (e.g., respiratory rate, electrical skin impedance, heart rate, heart rate variability, and blood pressure) and those baselines can later be compared to associated vital signs recorded during questioning. Because the devices described herein are wearable, untethered, and non-cumbersome, and thus do not reduce the mobility of the wearer, the individual being tested can be required to wear the device for a specified period of time (e.g., 24 hours) before and after questioning without hindering the normal, everyday activities of the individual. As a result of the long period of time for which the subject wears the device, the baselines for the subject's vital signs can be more accurately determined. For example, it is less likely that the subject could artificially adjust his or her vital baselines due to the large amounts of data collected to form those baselines. Therefore, the accuracy of the polygraph test can be increased relative to certain conventional polygraph devices.
  • In addition to monitoring the above-noted vital signs of the subject determine whether the subject is answering questions truthfully, the accelerometer data can be analyzed to identify movements or lack of movements that may indicate that the subject is lying. It is believed, for example, that individuals freeze for a moment when they are caught doing something wrong. In the case of polygraph examinations, it is believed that a subject will freeze when asked a question about the subject's wrongdoing. Thus, by analyzing the accelerometer data of the device, it is possible to identify those times during questioning that the subject freezes. This information can be used to further assess the truthfulness of the subject's response during that time.
  • Readiness Detection
  • In addition to those applications discussed above, the processor can be programmed to analyze the PPG data and the accelerometer data to determine the physical and mental readiness of a subject to perform a certain task. General fatigue and stress, which can result in a drop in physical and mental readiness, is generally evidenced by an increase in respiratory rate, heart rate, and blood pressure. Thus, in order to determine a wearer's physical and/or mental readiness, the processor can be programmed to analyze the wearer's respiratory rate, heart rate, and blood pressure and to indicate a state of unreadiness if those parameters fall a certain amount below the baseline for those parameters. In certain cases, the processor is programmed to also consider other factors in this readiness assessment, including the quality of the wearer's sleep (e.g., the wearer's sleep score) over a period of time (e.g., 24 hours or 48 hours) leading up to the assessment.
  • The determination of readiness of wearers of the device can assist leaders of those wearers with maximizing his or her human resources during taxing situations. For example, military leaders can analyze the data of soldiers in their units to determine which of those soldiers is most physically and mentally able to successfully carry out a mission and can staff the mission accordingly. Similarly, coaches may analyze the data of their team members to determine which of those athletes are best physically and mentally fit to play at their top level at any given time during a competition and can use those players that are able to perform at their top level.
  • In some implementations, the physical and mental readiness of a subject, as well as motion sensor data and information related to other factors, can be used by the device to predict a winner of a competition. For example, by analyzing vital signs (e.g., BP, HR, respiratory rate) of the contestant before and during a track race, a change in physical and mental readiness can be inferred. The device can also consider information such as the force exerted against the ground by the contestant and the velocity of the contestant at various points during the race to determine a likelihood that the contestant will win the race. The contestant's device can also consider similar information related to other contestants in determining the likelihood that the contestant will win the race. For example, the device may determine that a first contestant got off to a quicker start than a second contestant in a 100 meter dash based on collected motion data. Historical data may indicate that the contestant who is “first out of the blocks” has a 65% chance of winning the race. Thus, the device can predict the winner of the race within milliseconds of the start of the race.
  • In some implementations, the device can monitor a contestant's performance at an infinite number of intervals while correlating the contestant's performance to the measured vitals. During a one mile track race, a contestant typically keeps track of his lap times for each of the four laps. However, the contestant does not typically have access to more detailed data, such as his or her performance over the first 100 meters, the last 100 meters, at various points in the middle of the race, etc. The device can be configured to keep track of the contestant's performance at any time or range of times during the race, and can also correlate the contestant's performance to the vitals measured by the device. For example, the contestant may complete the first lap of the mile in 50 seconds, putting him or her on pace to easily break the world record. However, the device may determine that the contestant has a BP, HR, and respiratory rate significantly higher than what would typically be seen in someone who has only completed 25% of the race, and thus determine that the contestant likely will not win the race. By exhibiting so much effort early in the race, the contestant burns out and finishes the race with a mediocre time. In some implementations, the contestant can use the performance data and the measured vitals to improve his or her training in the future. For example, the next time the contestant runs a mile, the device may detect that the contestant is exhibiting too much effort early in the race by measuring a high BP, HR, and respiratory rate. The device can be configured to notify the contestant to reserve energy in order to optimize his or her performance.
  • Similarly, in some implementations, the device can be used to monitor the performance of an entire team of individuals wearing the device. For example, the collective physical and mental readiness of a football team, as well as motion sensor data and information related to other factors, can be used to determine whether the football team is performing to its potential. Information related to the vitals of a first team, such as the team's collective BP, HR, and respiratory rate, may indicate that the first team is exhibiting a large amount of effort. Information related to the vitals of a second team may indicate that the second team is exhibiting minimal effort. However, the second team is winning the football game against the first team, indicating that the first team may have inferior technique or coaching. Such information can be used during training to indicate areas where the team needs to improve their technique. Information related to a team's vitals can also be used to ensure that the team does not exhibit too much effort early in the season, thereby making it susceptible to “burning out” towards the end of the season.
  • An example process 2700 of providing information to a user that reports relative states of subjects is shown in FIG. 27. A machine, such as a processor, that receives information from the motion sensor 105 and the optical sensors 110 of the device 100 can perform one or more steps of the process 2700. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 2700, initially, data that represents time-varying information about at least one pulse pressure wave propagating through blood in each of two or more subjects can be processed (2702). Data that represents time-varying information about motion of the two or more subjects can also be processed (2704). The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Information can be provided to a user that reports relative states of the subjects (2706). The information can be based on the processed data. The relative states of the subjects can include one or more of relative psychological states, relative physical states, and relative states of readiness. The subjects can be put into an athletic contest or assigned a particular combat task according to the relative states of the subjects.
  • Correlation between Impact Force and Vitals of Multiple Users
  • In some implementations, the processor can be programmed to analyze the vital signs of multiple users in the moments leading up to a collision. For example, when two players collide during a sporting competition, a large amount of force is absorbed by each player. Force data can be measured by the motion sensor of the device, and the device can determine the magnitude of force absorbed by each player. The device can determine the effect of the force on each player by analyzing the players' vitals (e.g., BP, HR, respiratory rate, body temperature) before, during, and after the collision. The vitals and the force information can be used to determine whether a player has sustained bodily damage due to the impact force. For example, if a player experiences a sudden increase in HR, respiratory rate, and body temperature following a collision, it may be an indication that the player has sustained a concussion.
  • In some cases, a player's bodily reaction to sustaining a concussion is delayed. For example, a player may experience a sudden increase in HR, respiratory rate, and body temperature at some time following a collision, or the player may experience a gradual increase in HR, respiratory rate, and body temperature beginning at the time of the collision. The device can monitor the player's vitals for an extended time following the collision and compare the monitored vital information to vital information of a player who was previously diagnosed with a concussion. In this way, the device can determine vital patterns that are indicative of a person who sustains a concussion. If the device determines that a player has sustained a concussion, the device may be configured to alert the player or a third party. The player may be required to pass a protocol before reentering the game. If the device determines that there is a possibility that the player has sustained a concussion, the device may enter a mode where the player is monitored more closely in order to make a more definitive determination.
  • Human Flight Recorder
  • The devices described herein can also be used as human flight recorders. While accident investigators (e.g., National Transportation Safety Board (NTSB) investigators) have traditionally been limited to analyzing voice recorders and, in some cases, black boxes, after airplane and train crashes, the devices described herein, when worn by the operators of those vehicles, will provide insight into the state of the operator at the time of the crash. For example, by analyzing vital signs of the operator (e.g., the respiratory rate, heart rate, heart rate variability, and blood pressure of the operator) in the moments leading up to the crash, the investigators can learn whether the operator fell asleep, experienced some form of medical emergency, etc. This information is valuable for the investigators to determine whether the crash was the result of the operator's actions as opposed to some other reason, such as mechanical failure.
  • For example, in the context of a car accident, information related to the vital signs of the operator as well as information related to the operating characteristics of the car (e.g., the speed, direction, and breaking, as measured by other sensors) can be used to determine the cause of the accident, the mechanism of injury to the operator, and the impact of the injury to the operator. In this way, the mental and/or physiological state of the operator before, during, and/or after the accident can be ascertained. The 60 minutes following a traumatic injury is generally referred to as the “golden hour,” during which there is the highest likelihood that prompt medical treatment will prevent death. It is especially important to quickly gather vital information during this time to assist first responders and doctors in diagnosing and treating the operator.
  • In some implementations, the human flight recorder information can be used by third parties to determine who was at fault in creating the accident. For example, a law enforcement body may analyze the human flight recorder information to determine whether a tort or a crime was committed by an operator. In some implementations, the human flight recorder information can be used to determine an exact time when an event occurred. For example, the information can be used to determine an exact time of death, an exact time when a person went missing (e.g., by being abducted), or an exact time when a person fell down.
  • Similarly, after a wearer of the device experiences a period of illness or discomfort, the data could be analyzed by his or her physician to help diagnose the condition. For example, if a wearer has a heart attack, the data could be analyzed to investigate the variation in the vital signs leading up to the attack. Other data can also be considered, such as the wearer's genetics, epigenetics, diet, exercise practice, and environmental circumstances surrounding the event or condition. This information may be correlated and used to prevent onset of similar conditions in the future, for example, by alerting the user of such a possibility upon detecting similar variations in vital signs.
  • In some implementations, the device is able to determine a “baseline biorhythm” of a wearer based on the wearer's vital signs in various circumstances and environmental environments. The baseline biorhythm is typically unique to each individual. Once the baseline biorhythm is established and substantially refined, the device is able to detect when the wearer's vital signs are shifting away from the baseline biorhythm. For example, the device may detect that a wearer's biorhythm has gradually shifted over a particular time period, as indicated by variations in the wearer's vital signs. The device may also detect that the wearer has spent minimal time outside over the same time period, as indicated by measurements from the device's ultraviolet light sensor. The device can identify a correlation between the wearer's changed biorhythm and the change in ultraviolet light exposure.
  • In some implementations, the device can identify a correlation between the wearer's changed biorhythm and changes in the weather. For example, the device can consider the wearer's location information in conjunction with weather information from the National Oceanic and Atmospheric Administration to determine the type of weather experienced by the wearer over a particular period of time. The device may identify that the wearer experiences higher BP and HR when the weather is cold and/or rainy and determine that such weather causes increased stress in the wearer.
  • Detection of Temperature
  • In addition to using the accelerometer and optical sensor to determine vital signs of the wearer, the device can include a temperature sensor for determining the skin temperature of the wearer and an ambient temperature sensor for detecting the ambient temperature. The processor can be programmed to estimate the wearer's core temperature as a function of the measured skin temperature and ambient temperature (e.g., based on the difference between the skin temperature and the ambient temperature).
  • Prediction of Medical Events
  • While certain examples discussed above relate to the use of PPG data and accelerometer data (e.g., MoCG data) to diagnose medical conditions or events that were already experienced by the user, in certain implementations, the processor can be programmed to use this data to predict medical conditions before they happen. For example, the heart rate, heart rate variability, and blood pressure of the wearer can be monitored and processed by the processor to make such predictions. One example of a medical event that can be predicted in a subject is tachycardia. Tachycardia is when a subject's heart rate is over 100 beats per minute. If a subject's heart rate is trending upwards, a prediction can be made as to when the subject will experience tachycardia. Other examples of medical events that can be predicted are hypertension and stroke. For example, if a subject's blood pressure is increasing over time (e.g., if the rate of change of the blood pressure is above a threshold), a prediction can be made as to when the subject will experience hypertension. Hypertension is diagnosed when a subject's blood pressure exceeds 140/90 mmHg. If the increase is rapid, a prediction can be made as to when the subject will have a high likelihood of experiencing a stroke. Similarly, if a subject's blood pressure is decreasing rapidly (e.g., if the rate of change of the blood pressure is negative and below a threshold), a prediction can be made as to whether the subject will have a heart condition.
  • In cases where the heart rate variability of the subject is used to predict a medical event, whether the subject experiences arrhythmia (e.g., atrial fibrillation) can determine what an appropriate heart rate variability of the subject is. For example, a subject who experiences arrhythmia may have a high heart rate variability, but this may be normal given the subject's condition.
  • An example process 2800 of predicting a medical event of a subject is shown in FIG. 28. A machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process ##00. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 2800, initially, data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (2802). Data in a second dataset that represents time-varying information about motion of the subject can also be processed. The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). A medical event of the subject can then be predicted (2804). The medical event can be predicted based on the processed data. Medical events that can be predicted include tachycardia, hypertension, stroke, and heart condition.
  • Medication Compliance
  • The processor can also be programmed to ensure that the wearer of the device is adhering to a prescribed medication regimen. For example, for wearers who are prescribed blood pressure medication, the processor can be programmed to monitor the blood pressure of the wearer and to alert the wearer if, based on the blood pressure data, it appears that the wearer forgot to take his or her medication. The device can be used in this manner to monitor a wearer's adherence to a prescribed medication schedule for any of various other medications that impact the various different vital signs monitored by the device.
  • Medication Effectiveness
  • The processor can also be programmed to determine the effectiveness of a medication. For example, in the context of inhalation medications, it is unknown if generic inhalation medications have the same effectiveness as brand name inhalation medications. One reason for this is that environmental and genetic makeups are generally different between users. The processors can be programmed to monitor the heart rate and the blood oxygenation (SpO2) of wearers of devices who are prescribed generic inhalation medication and wearers of devices who are prescribed name brand inhalation medication. The processors can also consider data related to environment and genetic makeups of the wearers. Data related to the effects of the inhalation medication on the wearers can be used to determine the effectiveness of the generic inhalation medication compared to the effectiveness of the name brand inhalation medication. The device can be used in this manner to monitor the effectiveness of any of various other medications that impact the various different vital signs monitored by the device.
  • In some implementations, the processor can determine a correlation between a particular medication's effectiveness and environmental factors. For example, two wearers of the device who reside in two different extreme environments (e.g., Alaska and Florida) may experience different effects from the particular medication. Differences in the medication's effectiveness may be attributed to the different extreme environments experienced by the wearers. For example, the processor can determine a correlation between the particular medication's effectiveness and the environmental temperature experienced the wearer.
  • In some implementations, the device may identify a correlation between a particular medication's effectiveness and other environmental factors. For example, differences in a medication's effectiveness between two users may be attributed to the food that people generally eat in a particular region, thereby allowing the device to identify food-drug interaction information related to the medication.
  • Because everyone has a different genetic makeup, different people may require different dosages and dosage timings of a particular medication. For example, a person with a relatively fast metabolism may be able to increase the effectiveness of a medication by taking multiple small doses of the medication over the course of the day. In contrast, a person with a relatively slow metabolism may benefit from taking fewer large doses. The device can be configured to determine an optimal timing and dosage regimen for a particular wearer by monitoring the wearer's vitals while the wearer is under the influence of the medication. For example, a wearer may take a medication to maintain his or her blood pressure below a particular level. After the wearer takes the general recommended dose of the medication, the device may determine that the wearer's blood pressure was reduced too much, and recommend that the wearer take a smaller dose the next day. The following day, the wearer may take the dosage amount recommended by the device. The device may determine that the wearer's blood pressure was reduced to the ideal level, but that the wearer may need to take a second small dose of the medication to maintain his or her blood pressure at the ideal level over the course of the day. In this way, the device can continuously refine the wearer's dosage regimen to be custom tailored to the wearer. The device can be used in this manner to determine an optimal dosage regimen for any of various other medications that impact the various different vital signs monitored by the device as described herein.
  • In some implementations, the processor can determine an optimal time for a wearer of the device to take a medication. For example, a doctor typically tell a patient to take particular medications at particular times of the day or under particular circumstances (e.g., in the morning, in the evening, with food, etc.). Such blanket directions do not typically apply to all patients under all circumstances. The processor can monitor the vital signs of the wearer of the device to determine the optimal time for the wearer to take the medication under the current circumstances. The processor can consider characteristics of the particular medication when making the determination.
  • For example, the wearer of the device may take a medication that has a tendency to cause the wearer to be energetic. A doctor may suggest that the medication be taken no later than 3:00 pm to prevent disruption of the wearer's sleep. By analyzing the wearer's vital signs, such as the wearer's heart rate and respiratory rate over the course of a particular day, the processor may determine that the wearer is more energized than usual. The processor may recommend that the wearer take the medication earlier than usual to prevent the wearer from becoming too energized and having his sleep disrupted later.
  • An example process 2900 of providing information about a medication regimen of a subject is shown in FIG. 29. A machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 2900. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 2900, initially, data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (2902). Data in a second dataset that represents time-varying information about motion of the subject can also be processed. The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Information about a medication regimen of the subject can then be provided (2904). Based on the data, a determination can be made that the subject has potentially missed a dose of a medication, and a notification can be provided to the subject indicating such. Based on the data, a reaction of the subject to a medication can be determined, and a recommended medication regimen of the medication can be provided to the subject based on the reaction to the medication. The recommended medication regimen can include one or more recommended dosage timings and one or more recommended dosage amounts, each of which corresponds to one of the dosage timings.
  • Connectivity with Other Devices
  • In some implementations, the device 100 can be configured to communicate with other computing devices. For example, the device 100 can include a transceiver module that can send data to, and receive data from, a server computer. In such cases, the device 100 can be configured to act as a client within a client-server architecture. The server computer can be configured to receive and store data provided by the device 100 and share the data with other computing devices. This is illustrated in FIG. 30, which shows an example where, a hospital, nursing home, or elder-care center uses a server computer (or another central computer acting as a hub) 3030 that is configured to receive communications from devices 100 worn by patients or residents 3005. In such cases, the server computer 3030 can be configured to determine, based on data received from a particular device 100, that the wearer of the device 100 is in need of assistance. The server computer can be configured to alert appropriate personnel (e.g., medical personnel 3007) accordingly. For example, based on data (e.g., heart rate or blood pressure) received from a particular device 100, the server computer 3030 may determine that the wearer of the particular device 100 is experiencing (or is likely to experience) a health-related emergency, and alert appropriate caregivers 3007 automatically (e.g., by sending a message to a computing device 3040 at a caregivers' station, sending a text message or paging message to the caregivers, triggering an alarm, or initiating an emergency call). In some implementations, in addition to health related information, the data received from the device 100 can include additional information (e.g. location data) that can be used in contextualizing the health information. For example, if the data received from the device 100 indicates that a patient is in a horizontal position at 2:00 AM, the situation may be determined as normal. However, if accompanying location data (provided, for example, by a GPS unit within the device 100) shows that the patient is in a corridor or bathroom, the server computer may determine that a potentially dangerous event (e.g., a fall or loss of consciousness) has occurred. In some implementations, the device 100 itself may make such a determination and forward the information to the server computer 3030 for taking an appropriate action.
  • In some implementations, the device 100 can be configured to communicate over a network (e.g., a Wi-Fi network) with other devices connected to the network. For example, the device 100 can be configured to communicate with a Wi-Fi enabled thermostat to facilitate control of ambient temperature based on vital signs data collected by the device 100. For example, temperature data collected using the device 100 can be used to determine that the wearer is cold, and the temperature can be increased accordingly. In another example, location data provided by the device 100 (possibly through a server computer) can be used to determine that the wearer is not at home, and the thermostat can be instructed to switch off the heating or cooling system accordingly. Location data can also be used, for example, to determine that the wearer is returning home, and the heating or cooling system can be switched on in advance.
  • Referring to FIG. 31, the device 100 (e.g., the wearable watch 3200 of FIGS. 32A and 32B) can be configured to wirelessly communicate (e.g., via a Bluetooth connection) with a proximity system 3100 that is configured to identify the location of the watch 3200. One or more proximity sensors 3102 positioned throughout a store can monitor the location of the watch 3200, thus determining the wearer's tendencies in the store. The location of the watch 3200 can be determined based on a strength of a wireless communication signal between the watch 3200 and one or more of the proximity sensors 3102. In some implementations, the proximity sensors 3102 are iBeacons™. The location information can be used to determine particular products and/or advertisements that the wearer expressed interest in. For example, the proximity system 3100 can determine that a wearer of the watch 3200 spent a particular amount of time at a location near a display for a newly-released smartphone 3104, thus making the inference that the wearer was examining and/or interacting with the display and the smartphone 3104. The information measured by the proximity system 3100 can be compared to vital information collected by the watch 32003 during the same time period to determine the wearer's reaction to the display and the smartphone 3104. For example, the wearer may have experienced an increase in heart rate, blood pressure, and respiratory rate while considering the display and the smartphone 3104, thereby indicating that the wearer is interested in and/or excited about the smartphone 3104. In some implementations, the wearer's vital signs may indicate that a particular display, product, and/or advertisement scares the wearer or causes the wearer to feel stress, as indicated by the measured vital signs.
  • In some implementations, the device can be configured to wirelessly communicate (e.g., via a Bluetooth connection) with other devices. Multiple devices can create a mesh network, with each device representing a node that relays data for the network. In this manner, a wearer who is in a location where other forms of communication are not available may still be able to communicate with the mesh network via the device. For example, a wearer who is in an underground tunnel may not have access to a cellular or Wi-Fi network, but may still be able to communicate with devices of other wearers. Such mesh network communication can be beneficial in certain emergency situations. For example, a wearer of the device who is performing an underground construction project may become lost and/or trapped, and the wearer may not have access to a cellular network to call for help. However, the wearer may be able to manually notify another wearer of the emergency condition via the mesh network of connected devices.
  • In some implementations, the device can detect and emergency condition based on the vitals of the wearer. For example, the device may detect a sudden increase in blood pressure, heart rate, and/or respiratory rate and infer that the wearer is under distress. Upon such a determination, the device can be configured to automatically establish a wireless Bluetooth connection with any other devices within range in order to notify wearers of the other devices of the emergency condition. The wireless Bluetooth connection may be capable of relaying information to other wearers that can be used to assist the other wearers in locating the distressed wearer. For example, the signal strength of the Bluetooth connection can be monitored to determine whether a potential rescuer is getting closer to the distressed wearer.
  • Multiple devices 100 can be used to measure environmental characteristics. In some implementations, multiple devices 100 can be configured to communicate with a Wi-Fi enabled thermostat to facilitate control of ambient temperature in public places based on users' vital signs data collected by the devices 100. Temperature data collected by the devices 100 can be used to determine that the wearers are cold, and the temperature in the public place can be increased accordingly. For example, temperature data collected by devices 100 worn by users who are together in a room can be used to determine that at least some of the wearers are cold, and the temperature of the room can be increased accordingly.
  • Further, location data provided by the GPS transponder of the devices 100 can be used to determine public places that are not occupied by users, and the thermostat can be instructed to switch off the heating or cooling system accordingly. Similarly, location data can also be used, for example, to determine that users are about to be at a particular public place, and the heating or cooling system can be switched on in advance.
  • In some implementations, location data and temperature data provided by the devices 100 can be used to determine that nobody is in a particular subway car, and the heating or cooling system in the particular subway car can be switched off accordingly. Similarly, location and temperature data provided by the devices 100 can be used to determine that one or more users of the devices 100 are about to occupy a particular subway car, and the heating or cooling system of the particular subway car can be switched on in advance (e.g., to allow the subway car to assume an appropriate temperature in advance of being occupied).
  • Because the data from the device 100 can be used to identify a wearer, as well as make various inferences about the state of the body (activity, tiredness, stress level, sleep pattern, etc.) and/or mind (mood, alertness, etc.) of the wearer, different types of personalization can be facilitated accordingly, via communications with appropriate devices and systems. Examples of such personalization can include providing mood-based lighting or music and activity-based temperature control. In some implementations, an entertainment device such as a smart TV can be configured to provide personalized suggestions for TV shows, movies, or games based on a state of a user's body and/or mind as determined from data received from the device 100.
  • In some implementations, data from the device 100 can be used to cause a particular TV show or movie to be dynamically changed. For example, a TV show or a movie can have multiple pre-made endings. The device 100 can consider the wearer's vitals, such as blood pressure, heart rate, and respiratory rate, to make inferences about the physical and/or mental state of the wearer. The device 100 can then cause the particular TV show or movie to be dynamically altered based on the state of the wearer. For example, if the wearer's vitals indicate that the wearer is bored (e.g., as indicated by a reduced heart rate and/or respiratory rate), the device 100 may cause the TV show or movie to dynamically adapt and play a more exciting alternate ending. On the other hand, if the wearer's vitals indicate that the wearer is scared or upset by the content of the TV show or movie (e.g., as indicated by an increase in blood pressure, heart rate, and/or respiratory rate), the device 100 may cause the TV show or movie to dynamically adapt and play a toned-down alternate ending. The device 100 can be used in a similar manner to dynamically alter audio output devices (e.g., stereos or entertainment systems), video games, and other entertainment mediums, as described in more detail below.
  • In some implementations, the device 100 can be used to facilitate access control. An example of such an environment 3300 is shown in FIG. 33. In the example of FIG. 33, a biometric signature (e.g., one based on cardiac morphology, or a combination of one or more parameters detected, derived using the device 100) of a wearer of the device 100 can be used in conjunction with location data to determine that a wearer is proximate to an access point 3310 such as a door or turnstile. A network-connected lock or another access control mechanism 3320 associated with the access point 3310 can be activated based on determining that the biometric signature corresponds to a wearer authorized to access the corresponding access-controlled premises.
  • In some implementations, information related to the biometric signature of the user can be provided to the access control mechanism 3320 via a remote server 3330 that communicates with the device 100. For example, the remote server 3330 can determine, based on data received from the device 100, whether a biometric signature of the user corresponds to a user authorized to access the controlled premises. If the server 3330 determines that the user is authorized to access the premises, the server 3330 can then send a signal to the access control mechanism 3320 to unlock the access point 3310. In some implementations, the communications between the server 3330 and the device 100 can be via a local hub 3340 (e.g., a proximity sensor) that communicates with the server 3330 to forward information received from the device 100. In some implementations, the local hub 3340 can be configured to process the information received from the device 100 and directly transmit a signal to the access control mechanism 3320 accordingly. The access control mechanism can also be configured to communicate directly with the device 100. In such cases, information from the device 100 is transmitted to the access control mechanism 3320, which unlocks itself upon determining, based on the received information, that the corresponding user is authorized to access the controlled premises.
  • In some implementations, the biometric signature can be used to allow the wearer to access/operate a vehicle or another access-controlled machine. This is illustrated in the example depicted in FIG. 34. In the example of FIG. 34, data from the device 100 can be used to identify whether an individual is authorized to operate a vehicle or machine 3410, and/or determine whether the physical and/or mental state of the individual is appropriate for handling or operating the vehicle or machine. In some implementations, information about the user can be provided from the device 100 to a remote server 3430 either directly or via a transceiver module 3440 deployed on the vehicle or machine. The remote server 3430 (or the transceiver module 3440) can determine, based on data received from the device 100, whether a biometric signature of the user corresponds to a user authorized to access the controlled premises. The server 3430 (or the transceiver module 3440) can also determine, for example, whether the user possesses sufficient mental/physical capability for operating the vehicle or machine. In one example, data from the device 100 can be used to prevent a pilot from operating an airplane if his/her vital signs indicate an alertness level less than a threshold. In another example, data from the device 100 can be used to prevent a driver from operating a vehicle if his/her stress level is determined to be higher than a threshold level. This can help, for example, reduce occurrences of stress-related traffic issues (e.g., road rage) and accidents. In some implementations, if the server 3430 determines that a user's mental/physical state is not suitable for operating the vehicle or machine, the server 3430 can then send a signal to the transceiver module 3440 to shut down the vehicle or machine, or otherwise alert the user about the situation. In some implementations, the server 3430 (or the transceiver module 3440) can send a signal to the device 100 to alert the user. For example, if the alertness of the user is waning during the operation of the vehicle (e.g., because of the user dozing off on the wheel), a signal can be sent to the device 100 to alert the user to take corrective measures.
  • In some implementations, the device 100 can be configured to communicate with the transceiver module 3440 of the vehicle. In such cases, the transceiver module 3440 can be configured to provide feedback to other modules in the vehicle based on data received from the device 100 (either directly, or via the server 3430). For example, the transceiver module 3440 of the car can be configured to provide feedback signals to a temperature control system of the vehicle to adjust the temperature based on vital signs data collected by the device 100. In another example, the transceiver module 3440 may use data from the device 100 to provide feedback to a collision avoidance system that, for example, triggers an alarm (and/or slows the vehicle down) upon determining that a driver wearing the device 100 is not adequately alert. In another example, the transceiver module 3440 may use data from the device 100 to turn off an operation switch (e.g., an ignition) of the vehicle. In some implementations, in case of accidents, the data from the device 100 can be transmitted (possibly via the transceiver module 3440) to appropriate authorities for determining a nature of resources to be sent to the accident scene. For example, the data from the device 100 may indicate that a driver wearing the device 100 requires the assistance of a standard paramedic, or the data from the device 100 may indicate that the driver requires the assistance of a trauma unit. The data from the device 100 may also indicate whether the wearer of the device 100 requires immediate attention from rescue workers, or alternatively whether the wearer of the device 100 can be treated at a later time (e.g., in order to first treat others involved in the accident).
  • In some implementations, the device 100 can be configured to communicate with a gaming device such as a video game console. This is illustrated in the example depicted in FIG. 35. In the example of FIG. 35, data from the device 100 can be used to control a gaming device 3510 based on an identity and/or state of the body of the user. For example, one or more of blood pressure data, respiratory rate, and heart rate obtained using the device 100 can be used to determine an interest level or engagement level of the user. If the user is determined to show more interest in certain game situations as opposed to others, the gaming device can be configured to adaptively provide game situations that the user is interested in. If the data from the device 100 indicates a low level of interest, steps can be taken (e.g. increasing the background sound level, playing a stimulating track, or introducing additional challenges) to increase the interest level of the user. This way, games being played on the gaming device 3510 can be made more appealing to the user. In some implementations, the gaming device 3510 can be configured to be turned off if the user's body state is determined to be in a potentially harmful condition. For example, if the blood pressure or heart rate data from the device 100 indicates that the stress level of the user is above a threshold, the gaming device can be instructed to shut down to prevent the user from continuing to play.
  • In some implementations, information from the device 100 can be provided to a remote server 3530 either directly, or via a local hub 3540 that communicates with the server 3530. The information from the server 3530 can also be transmitted, for example, either directly or via the local hub 3540 to the gaming device 3510. In some implementations, the gaming device 3510 can be configured to receive data directly from the device 100 (or via the local hub 3540) and change the game situations accordingly.
  • In some implementations, operations of the entertainment or gaming devices can be linked to data obtained from the device 100. For example, if a user opts to force himself to exercise, he can choose a configuration in which a gaming device 3510 or TV 3520 will be switched on only if he has exercised for a predetermined length of time during a given time period. In some implementations, if data from the device 100 indicates that the user has fallen asleep, the entertainment device (e.g., the TV 3520) may also be switched off based on such data.
  • Further, as shown in the example depicted in FIG. 36, the device 100 can alternatively or additionally be linked to other types of devices, such as lighting units 3610, thermostats 3620, etc., that can be adjusted based on data from the device 100. For example, biometric signature or health data obtained using the device 100 can be used in determining if a user is hot or cold, and the thermostat 3620 can be adjusted accordingly. In another example, data from the device 100 can be used in determining that a user is approaching a room, and the lights in the room can be turned on via communications with the lighting unit 3610. The data about the user can be provided to the lighting unit 3610 or thermostat 3620 via a remote server 3630 that communicates with the device 100. In one example, if the server 3630 determines, based on data received from the device 100, that the user is feeling too cold, the server 3630 can then send a signal to the thermostat 3620 to increase the temperature of the room. In some implementations, the communications between the server 3630 and the device 100 can be via a local hub 3640 (e.g., a proximity sensor) that communicates with the server 3630 to forward information received from the device 100. In some implementations, the local hub 3640 can be configured to process the information received from the device 100 and directly transmit a signal to, for example, the lighting unit 3610 or the thermostat 3620, accordingly. In some implementations, the network connected lighting unit 3610 or thermostat 3620 can be configured to communicate directly with the device 100. In such cases, information from the device 100 can be transmitted to the thermostat 3620, which adjusts the temperature upon determining, based on the received information, that the corresponding user uncomfortable at a current temperature setting. In some implementations, network connected devices such as the lighting unit 3610, thermostat 3620, gaming device 3510, or TV 3520, can be turned off or adjusted upon receiving data indicating that the user has fallen asleep.
  • The interest level or engagement level determination, as described above with respect to a gaming device, can also be used for other applications. For example, upon authorization from a user, such information may be used by a dating or matchmaking service. For example, by reviewing a user's vital signs while the user is on a date, a determination can be made whether the user is interested in the other person or not. If the interest level is not determined to satisfy a threshold level, the dating or match-making service may refrain from suggesting persons with similar profiles. On the other hand, if the interest level is determined to be high (i.e., the interest level satisfies a threshold condition), the dating or match-making service may suggest to the user other persons with similar profiles. The interest level based suggestions can be provided, for example, by a processing device that receives the user's data and retrieves potential matches from a database. In some implementations, the process can be made completely automated to avoid the user's personal data being exposed to human personnel. In some implementations, the user's data can be anonymized such that a particular user cannot be identified by human personnel. In some implementations, some of the data or feedback received from the device 100 can be stored within a profile of the user (based on authorization and permissions from the user) to suggest future matches that the user is more likely to be interested in.
  • In some implementations, information based on the data collected by the device 100 can be made available to the user, for example, via an application executing on a smartphone device. The application can include one or more user interfaces that allow the user to review the variations over the course of a particular time period (e.g., a day, overnight, a week, or a month) or during a particular event (e.g., a meeting, an exercise session, or a date). Examples of such user interfaces 3900, 3925, and 3950 are shown in FIGS. 39A-39C. A user interface such as the example user-interface 3900 can enable a user to see how various events in his/her life affect stress levels, and possibly take action accordingly. For example, the user interface 3900 can indicate that the user tends to become stressed when attending to work-related e-mails late at night. The user may then make a conscious effort to avoid looking at work-related emails late at night to alleviate stress. If a particular activity is determined to have a beneficial effect on the user, the user can make an effort to increase such activities in his/her daily life. The user interface 3900 can include suggestions for improving stress levels, and show a graphical representation of the stress level variations over a period of time (e.g., a week).
  • In some implementations, a user interface 3925 can show variations of the vital signs during a certain activity (e.g., listening to music or running) For example, the user interface 3925 can show variation in heart rate for a running session and graphically compare the variation with other baselines such as the user's own variation from a previous time, or a professional athlete's variations for a similar activity. The user can then determine if his/her fitness level is improving or deteriorating. In some implementations, a user interface such as the example user interface 3950 can be configured to display various vital signals (e.g., heart rate, cardiac power, heart rate volume, recovery rate, etc.) related to the cardiac health of the user.
  • FIG. 37 shows an example screenshot 3700 on a mobile phone 3702 of a wearer for the wearer to view and share his or her blood pressure results. In this example, the wearer's average blood pressure is 136/86 mmHg. A graph displays the wearer's blood pressure over a number of days. The wearer has the option to share the blood pressure data with other people via a secure link. The wearer can also choose to share other information with other people, such as the wearer's medication data, activity data, and sleep data.
  • An example process 3800 of providing information related to the processed data to a remote device is shown in FIG. 38. A machine, such as a processor, that receives information from the optical sensors 110 of the device 100 can perform one or more steps of the process 3800. In some implementations, the machine can include the computing device 115 described above with reference to FIG. 1B. In the process 3800, initially, data in a first dataset that represents time-varying information about at least one pulse pressure wave propagating through blood in a subject can be processed (3802). Data in a second dataset that represents time-varying information about motion of the subject can also be processed. The data can be acquired at a location of the subject (e.g., the arm or the wrist of the subject). Information related to the processed data can then be provided to a remote device (3804). The remote device can be a server, a thermostat, a light, an entertainment device, a television, an audio output device, or a gaming device. The remote device can operate based on the processed data.
  • Computing Device
  • FIG. 40 is block diagram of an example computer system 4000 that can be used for performing one or more operations related to the technology described above. In some implementations, the computer system 4000 can be used to implement any portion, module, unit or subunit of the device 100, or computing devices and processors referenced above. The system 4000 includes a processor 4010, a memory 4020, a storage device 4030, and an input/output device 4040. Each of the components 4010, 4020, 4030, and 4040 can be interconnected, for example, using a system bus 4050. The processor 4010 is capable of processing instructions for execution within the system 4000. In one implementation, the processor 4010 is a single-threaded processor. In another implementation, the processor 4010 is a multi-threaded processor. The processor 4010 is capable of processing instructions stored in the memory 4020 or on the storage device 4030.
  • The memory 4020 stores information within the system 4000. In one implementation, the memory 4020 is a computer-readable storage device that includes a non-transitory computer readable medium. In general, non-transitory computer readable medium is a tangible storage medium for storing computer readable instructions and/or data. In some cases, the storage medium can be configured such that stored instructions or data are erased or replaced by new instructions and/or data. Examples of such non-transitory computer readable medium include a hard disk, solid-state storage device, magnetic memory or an optical disk. In one implementation, the memory 4020 is a volatile memory unit. In another implementation, the memory 4020 is a non-volatile memory unit.
  • The storage device 4030 is capable of providing mass storage for the system 4000. In one implementation, the storage device 4030 is a computer-readable medium. In various different implementations, the storage device 4030 can include, for example, a hard disk device, an optical disk device, or some other large capacity storage device.
  • The input/output device 4040 provides input/output operations for the system 4000. In one implementation, the input/output device 4040 can include one or more of a network interface devices, e.g., an Ethernet card, a serial communication device, e.g., an RS-232 port, and/or a wireless interface device, e.g., and 802.11 card. In another implementation, the input/output device can include driver devices configured to receive input data and send output data to other input/output devices, e.g., keyboard, printer and display devices.
  • Although an example processing system has been described in FIG. 40, implementations of the subject matter and the functional operations described in this specification can be implemented in other types of digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a tangible program carrier, for example a computer-readable medium, for execution by, or to control the operation of, a processing system. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, or a combination of one or more of them.
  • The term “processing system” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The processing system can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program, a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example, semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • Implementations of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described is this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client server relationship to each other.
  • While this specification contains many specific implementation details, these should not be construed as limitations on the scope of what may be claimed, but rather as descriptions of features that may be specific to particular implementations. Certain features that are described in this specification in the context of separate implementations can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can, in some cases, be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
  • A number of implementations of the invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the technology described in this document. Accordingly, other implementations are within the scope of the following claims.

Claims (40)

1. A method comprising:
obtaining, using a first sensor, a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject;
obtaining, using a second sensor, a second data set representing time-varying information about motion of the subject at the first body part of a subject;
identifying, using one or more processors, a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part;
identifying, using the one or more processors, a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject; and
computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
2. The method of claim 1, wherein the information about the at least one pulse pressure wave comprises photoplethysmographic (PPG) data and the information about motion of the subject comprises one or both of motioncardiogram (MoCG) data and gross motion data.
3. The method of claim 1, wherein data comprising at least one of the first data set and the second data set is acquired continuously.
4. The method of claim 3, wherein the data is acquired at a frequency of at least 16 Hz.
5. The method of claim 3, wherein the data is acquired at a frequency of between 75 Hz and 85 Hz.
6. The method of claim 3, wherein the data is acquired by a device worn by the subject.
7. The method of claim 6, wherein the device is mobile and does not reduce a mobility of the subject.
8. The method of claim 7, wherein the device processes the data.
9. The method of claim 1, wherein the first body part is an arm of the subject.
10. The method of claim 1, wherein the first body part is a wrist of the subject.
11. The method of claim 1, wherein the first sensor comprises an optical sensor and the second sensor comprises an accelerometer or a gyroscope.
12. The method of claim 1, wherein identifying the first point comprises:
computing, by the one or more processors, a cross-correlation of a template segment with each of multiple segments of the first dataset;
identifying, based on the computed cross-correlations, at least one candidate segment of the first dataset as including the first point; and
identifying, by the one or more processors, a first feature within the identified candidate segment as the first point.
13. The method of claim 1, wherein identifying the second point comprises:
determining a reference point in the second data set, the reference point corresponding to substantially the same point in time as the first point in the first data set;
identifying one or more target features within a predetermined time range relative to the reference point; and
selecting a time point corresponding to one of the target features as the second point.
14. The method of claim 13, wherein the target features comprises at least one of a peak and a valley.
15. The method of claim 1, further comprising computing a blood pressure of the subject as a function of the PTT.
16. The method of claim 15, wherein the blood pressure includes a systolic pressure and a diastolic pressure.
17. The method of claim 16, wherein a systolic pressure is calculated as a linear function of the diastolic pressure.
18. The method of claim 13, in which the pre-determined time range is associated with the systole portion of the subject's heartbeat.
19. The method of claim 1, further comprising accepting user-input for initiating computation of the PTT.
20. The method of claim 1, further comprising computing arterial stiffness as a function of the PTT.
21. A biofeedback device configured to be worn by a subject, the device comprising:
a first sensor configured to obtain a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject;
a second sensor configured to obtain a second data set representing time-varying information about motion of the subject at the first body part of a subject;
memory; and
one or more processors configured to:
receive the first and second data sets,
identify a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part,
identify a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject, and
compute a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
22. The device of claim 21, wherein the information about the at least one pulse pressure wave comprises photoplethysmographic (PPG) data and the information about motion of the subject comprises one or both of motioncardiogram (MoCG) data and gross motion data.
23. The device of claim 21, wherein data comprising at least one of the first data set and the second data set is acquired continuously.
24. The device of claim 23, wherein the data is acquired at a frequency of at least 16 Hz.
25. The device of claim 23, wherein the data is acquired at a frequency of between 75 Hz and 85 Hz.
26. The device of claim 21, further comprising a mechanism that allows the device to be worn by the subject.
27. The device of claim 21, wherein the mechanism does not reduce a mobility of the subject.
28. The device of claim 21, wherein the first body part is an arm of the subject.
29. The device of claim 21, wherein the first body part is a wrist of the subject.
30. The device of claim 21, wherein the first sensor comprises an optical sensor and the second sensor comprises an accelerometer or a gyroscope.
31. The device of claim 21, wherein identifying the first point comprises:
computing, by the one or more processors, a cross-correlation of a template segment with each of multiple segments of the first dataset;
identifying, based on the computed cross-correlations, at least one candidate segment of the first dataset as including the first point; and
identifying, by the one or more processors, a first feature within the identified candidate segment as the first point.
32. The device of claim 21, wherein identifying the second point comprises:
determining a reference point in the second data set, the reference point corresponding to substantially the same point in time as the first point in the first data set;
identifying one or more target features within a predetermined time range relative to the reference point; and
selecting a time point corresponding to one of the target features as the second point.
33. The device of claim 32, wherein the target features comprises at least one of a peak and a valley.
34. The device of claim 21, wherein the one or more processors are further configured to compute a blood pressure of the subject as a function of the PTT.
35. The device of claim 34, wherein the blood pressure includes a systolic pressure and a diastolic pressure.
36. The device of claim 35, wherein a systolic pressure is calculated as a function of the diastolic pressure.
37. The device of claim 32, in which the pre-determined time range is associated with the systole portion of the subject's heartbeat.
38. The device of claim 21, further comprising an input mechanism configured to accept user-input for initiating computation of the PTT.
39. The device of claim 21, wherein the one or more processors are further configured to compute arterial stiffness as a function of the PTT.
40. One or more machine-readable storage devices storing instructions that are executable by one or more processing devices to perform operations comprising:
obtaining a first data set representing time-varying information on at least one pulse pressure wave within vasculature at a first body part of a subject;
obtaining a second data set representing time-varying information about motion of the subject at the first body part of a subject;
identifying a first point in the first data set, the first point representing an arrival time of the pulse pressure wave at the first body part;
identifying a second point in the second dataset, the second point representing an earlier time at which the pulse pressure wave traverses a second body part of the subject; and
computing a pulse transit time (PTT) as a difference between the first and second points, the PTT representing a time taken by the pulse pressure wave to travel from the second body part to the first body part of the subject.
US14/521,829 2013-10-23 2014-10-23 Calculating Pulse Transit Time Abandoned US20150112606A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/521,829 US20150112606A1 (en) 2013-10-23 2014-10-23 Calculating Pulse Transit Time
US14/630,288 US20150164351A1 (en) 2013-10-23 2015-02-24 Calculating pulse transit time from chest vibrations

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361894884P 2013-10-23 2013-10-23
US201462002531P 2014-05-23 2014-05-23
US14/521,829 US20150112606A1 (en) 2013-10-23 2014-10-23 Calculating Pulse Transit Time

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/630,288 Continuation-In-Part US20150164351A1 (en) 2013-10-23 2015-02-24 Calculating pulse transit time from chest vibrations

Publications (1)

Publication Number Publication Date
US20150112606A1 true US20150112606A1 (en) 2015-04-23

Family

ID=52825691

Family Applications (10)

Application Number Title Priority Date Filing Date
US14/521,897 Abandoned US20150112208A1 (en) 2013-10-23 2014-10-23 Medication management
US14/521,767 Abandoned US20150112154A1 (en) 2013-10-23 2014-10-23 Biometrics in risk situations
US14/521,822 Abandoned US20150112155A1 (en) 2013-10-23 2014-10-23 Sleep parameters
US14/522,132 Abandoned US20150112158A1 (en) 2013-10-23 2014-10-23 Health Metrics
US14/522,398 Abandoned US20150112159A1 (en) 2013-10-23 2014-10-23 Alertness Detection
US14/521,907 Abandoned US20150112157A1 (en) 2013-10-23 2014-10-23 Arrhythmia detection
US14/521,829 Abandoned US20150112606A1 (en) 2013-10-23 2014-10-23 Calculating Pulse Transit Time
US14/522,230 Expired - Fee Related US9396643B2 (en) 2013-10-23 2014-10-23 Biometric authentication
US14/521,823 Abandoned US20150112156A1 (en) 2013-10-23 2014-10-23 Predicting medical events
US14/522,157 Expired - Fee Related US9396642B2 (en) 2013-10-23 2014-10-23 Control using connected biometric devices

Family Applications Before (6)

Application Number Title Priority Date Filing Date
US14/521,897 Abandoned US20150112208A1 (en) 2013-10-23 2014-10-23 Medication management
US14/521,767 Abandoned US20150112154A1 (en) 2013-10-23 2014-10-23 Biometrics in risk situations
US14/521,822 Abandoned US20150112155A1 (en) 2013-10-23 2014-10-23 Sleep parameters
US14/522,132 Abandoned US20150112158A1 (en) 2013-10-23 2014-10-23 Health Metrics
US14/522,398 Abandoned US20150112159A1 (en) 2013-10-23 2014-10-23 Alertness Detection
US14/521,907 Abandoned US20150112157A1 (en) 2013-10-23 2014-10-23 Arrhythmia detection

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/522,230 Expired - Fee Related US9396643B2 (en) 2013-10-23 2014-10-23 Biometric authentication
US14/521,823 Abandoned US20150112156A1 (en) 2013-10-23 2014-10-23 Predicting medical events
US14/522,157 Expired - Fee Related US9396642B2 (en) 2013-10-23 2014-10-23 Control using connected biometric devices

Country Status (6)

Country Link
US (10) US20150112208A1 (en)
EP (1) EP3060107A1 (en)
JP (1) JP2016538097A (en)
KR (1) KR20160075677A (en)
CA (1) CA2928197A1 (en)
WO (1) WO2015061579A1 (en)

Cited By (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160007931A1 (en) * 2013-04-05 2016-01-14 Healbe Corporation Method for determining a person's sleeping phase which is favourable for waking up
US20160015314A1 (en) * 2014-07-21 2016-01-21 Withings System and Method to Monitor and Assist Individual's Sleep
US20160045131A1 (en) * 2014-08-18 2016-02-18 Cameron Health, Inc. Cardiac rate tracking in an implantable medical device
WO2016065476A1 (en) * 2014-10-30 2016-05-06 2352409 Ontario Inc. A wearable device and method for non-invasive monitoring continuous blood pressure and other physiological parameters with reduced movement artifacts
US20160150985A1 (en) * 2014-12-02 2016-06-02 Samsung Electronics Co., Ltd. Apparatus and method for measuring pulse wave
US20160278644A1 (en) * 2015-03-25 2016-09-29 Quanttus, Inc. Contact-less blood pressure measurement
USD772260S1 (en) * 2014-11-14 2016-11-22 Volvo Car Corporation Display screen with graphical user interface
USD772904S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with transitional graphical user interface
USD772903S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with transitional graphical user interface
USD772905S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with graphical user interface
US20170020443A1 (en) * 2015-07-20 2017-01-26 iFeel Healthy Ltd. Methods and systems of controlling a subject's body feature having a periodic wave function
US20170042471A1 (en) * 2015-08-10 2017-02-16 Murata Manufacturing Co., Ltd. Monitoring of sleep phenomena
US20170046508A1 (en) * 2015-08-11 2017-02-16 Suprema Inc. Biometric authentication using gesture
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US9600080B2 (en) 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
US20170126613A1 (en) * 2015-11-03 2017-05-04 Joiiup Technology Inc. Instant information exchange system and method for online sports teams
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
CN107049293A (en) * 2016-02-02 2017-08-18 富士通株式会社 Sensor information processing equipment
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US20170290526A1 (en) * 2016-04-07 2017-10-12 Oregon Health & Science University Telecentive spirometer
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
USD803240S1 (en) * 2015-12-09 2017-11-21 Facebook, Inc. Display screen with animated graphical user interface
US9837760B2 (en) 2015-11-04 2017-12-05 Google Inc. Connectors for connecting electronics embedded in garments to external devices
US9848780B1 (en) * 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
US20180000426A1 (en) * 2016-06-29 2018-01-04 Samsung Electronics Co., Ltd. System and Method for Providing a Real-Time Signal Segmentation and Fiducial Points Alignment Framework
WO2018013569A1 (en) * 2016-07-11 2018-01-18 Mc10, Inc. Multi-sensor blood pressure measurement system
US9892247B2 (en) * 2015-12-30 2018-02-13 Motorola Mobility Llc Multimodal biometric authentication system and method with photoplethysmography (PPG) bulk absorption biometric
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9933908B2 (en) 2014-08-15 2018-04-03 Google Llc Interactive textiles
US9955303B2 (en) 2015-07-21 2018-04-24 IP Funding Group, LLC Determining relative position with a BLE beacon
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US10064582B2 (en) 2015-01-19 2018-09-04 Google Llc Noninvasive determination of cardiac health and other functional states and trends for human physiological systems
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US10139916B2 (en) 2015-04-30 2018-11-27 Google Llc Wide-field radar-based gesture recognition
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
US20190065970A1 (en) * 2017-08-30 2019-02-28 P Tech, Llc Artificial intelligence and/or virtual reality for activity optimization/personalization
US10241581B2 (en) 2015-04-30 2019-03-26 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10244985B1 (en) 2017-12-28 2019-04-02 Saleem Sayani Wearable diagnostic device
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US10300370B1 (en) 2015-10-06 2019-05-28 Google Llc Advanced gaming and virtual reality control using radar
US10310620B2 (en) 2015-04-30 2019-06-04 Google Llc Type-agnostic RF signal representations
US10349887B1 (en) 2015-06-14 2019-07-16 Facense Ltd. Blood pressure measuring smartglasses
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US10376163B1 (en) 2015-06-14 2019-08-13 Facense Ltd. Blood pressure from inward-facing head-mounted cameras
US10463299B1 (en) * 2013-02-22 2019-11-05 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
US10517489B2 (en) 2014-09-08 2019-12-31 Apple Inc. Wrist worn accelerometer for pulse transit time (PTT) measurements of blood pressure
US10542961B2 (en) 2015-06-15 2020-01-28 The Research Foundation For The State University Of New York System and method for infrasonic cardiac monitoring
US10559145B1 (en) * 2019-07-17 2020-02-11 Abdulaziz Mohammed Almehmadi Systems and methods for providing behavioral based intention detection
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
US20200069245A1 (en) * 2016-06-13 2020-03-05 Medtronic, Inc. Multi-parameter prediction of acute cardiac episodes and attacks
US10638938B1 (en) 2015-06-14 2020-05-05 Facense Ltd. Eyeglasses to detect abnormal medical events including stroke and migraine
US10646121B2 (en) 2015-09-29 2020-05-12 Apple Inc. Pressure measurement designs
US10667697B2 (en) 2015-06-14 2020-06-02 Facense Ltd. Identification of posture-related syncope using head-mounted sensors
US10702171B2 (en) 2014-09-08 2020-07-07 Apple Inc. Systems, devices, and methods for measuring blood pressure of a user
US10772512B2 (en) 2014-09-08 2020-09-15 Apple Inc. Blood pressure monitoring using a multi-function wrist-worn device
US10779738B2 (en) 2014-09-08 2020-09-22 Apple Inc. Electrical coupling of pulse transit time (PTT) measurement system to heart for blood pressure measurement
US10786161B1 (en) 2013-11-27 2020-09-29 Bodymatter, Inc. Method for collection of blood pressure measurement
US10791938B2 (en) 2015-06-14 2020-10-06 Facense Ltd. Smartglasses for detecting congestive heart failure
US10799122B2 (en) 2015-06-14 2020-10-13 Facense Ltd. Utilizing correlations between PPG signals and iPPG signals to improve detection of physiological responses
US10849555B1 (en) 2015-09-29 2020-12-01 Apple Inc. Devices and systems for correcting errors in blood pressure measurements
US10869627B2 (en) * 2017-07-05 2020-12-22 Osr Enterprises Ag System and method for fusing information related to a driver of a vehicle
CN112168139A (en) * 2019-07-05 2021-01-05 腾讯科技(深圳)有限公司 Health monitoring method and device and storage medium
US10896375B2 (en) * 2017-07-11 2021-01-19 International Business Machines Corporation Cognitive replication through augmented reality
US10905382B2 (en) 2016-10-25 2021-02-02 Samsung Electronics Co., Ltd. Bio-signal quality assessment apparatus and method and bio-signal measurement parameter optimization apparatus and method
US10939834B2 (en) * 2017-05-01 2021-03-09 Samsung Electronics Company, Ltd. Determining cardiovascular features using camera-based sensing
US10973422B2 (en) * 2016-01-22 2021-04-13 Fitbit, Inc. Photoplethysmography-based pulse wave analysis using a wearable device
US10980428B2 (en) 2016-12-15 2021-04-20 ViviPulse, LLC Wearable pulse waveform measurement system and method
WO2021125677A1 (en) * 2019-12-18 2021-06-24 Samsung Electronics Co., Ltd. Electronic device for recognizing gesture of user from sensor signal of user and method for recognizing gesture using the same
US20210216904A1 (en) * 2020-01-13 2021-07-15 International Business Machines Corporation Knowledge Aided Feature Engineering
US11064892B2 (en) 2015-06-14 2021-07-20 Facense Ltd. Detecting a transient ischemic attack using photoplethysmogram signals
US11076788B2 (en) * 2014-12-30 2021-08-03 Nitto Denko Corporation Method and apparatus for deriving a mental state of a subject
US11103139B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Detecting fever from video images and a baseline
US11103140B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Monitoring blood sugar level with a comfortable head-mounted device
US20210275034A1 (en) * 2015-06-14 2021-09-09 Facense Ltd. Wearable-based health state verification for physical access authorization
US11154203B2 (en) 2015-06-14 2021-10-26 Facense Ltd. Detecting fever from images and temperatures
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US11179049B2 (en) 2016-02-29 2021-11-23 Fitbit, Inc. Intelligent inflatable cuff for arm-based blood pressure measurement
US11311198B2 (en) * 2015-03-25 2022-04-26 Tata Consultancy Services Limited System and method for determining psychological stress of a person
US11361445B2 (en) 2020-07-08 2022-06-14 Nec Corporation Of America Image analysis for detecting mask compliance
US11419509B1 (en) * 2016-08-18 2022-08-23 Verily Life Sciences Llc Portable monitor for heart rate detection
US11432775B2 (en) 2018-12-21 2022-09-06 Samsung Electronics Co., Ltd. Apparatus and method for estimating blood pressure
US11445986B2 (en) * 2018-01-30 2022-09-20 Gaia Connect Inc. Health monitor wearable device
US20220304622A1 (en) * 2019-06-17 2022-09-29 Happy Health, Inc. Wearable device operable to detect and/or prepare a user for sleep
US11543785B2 (en) * 2017-12-06 2023-01-03 Sintokogio, Ltd. Industrial machine startup control system, startup control method, and program
US11583197B2 (en) * 2018-12-05 2023-02-21 Acer Incorporated Method and device for detecting cardiac arrhythmia based on photoplethysmographic signal
US11589758B2 (en) 2016-01-25 2023-02-28 Fitbit, Inc. Calibration of pulse-transit-time to blood pressure model using multiple physiological sensors and various methods for blood pressure variation
US11612352B1 (en) * 2013-02-22 2023-03-28 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
US11622730B2 (en) 2014-11-17 2023-04-11 Rochester Institute Of Technology Pulse wave velocity, arterial compliance, and blood pressure
US11666271B2 (en) * 2020-12-09 2023-06-06 Medtronic, Inc. Detection and monitoring of sleep apnea conditions
WO2023152622A1 (en) * 2022-02-08 2023-08-17 Bioenergetics Systems Technology Group Limited A wearabale monitor with photoplethysmogram sensor for determining emotion level
US11800984B2 (en) 2020-07-08 2023-10-31 Nec Corporation Of America Sensor fusion for measurement of physiological parameters
US11872053B1 (en) * 2013-02-22 2024-01-16 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
WO2024049973A1 (en) * 2022-09-02 2024-03-07 Board Of Regents Of The University Of Nebraska Systems and methods for determination of pulse arrival time with wearable electronic devices
US11957454B2 (en) * 2017-06-23 2024-04-16 Koninklijke Philips N.V. Device, system and method for detection of pulse and/or pulse-related information of a patient
US12059233B2 (en) 2016-03-07 2024-08-13 Fitbit, Inc. Blood pressure sensors

Families Citing this family (132)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9946844B2 (en) * 2013-02-22 2018-04-17 Cloud Dx, Inc. Systems and methods for monitoring patient medication adherence
US11693383B1 (en) * 2013-05-31 2023-07-04 Signify Holding B.V. Systems and methods for providing hub-based motion detection using distributed, light-based motion sensors
US9554465B1 (en) 2013-08-27 2017-01-24 Flextronics Ap, Llc Stretchable conductor design and methods of making
US9674949B1 (en) 2013-08-27 2017-06-06 Flextronics Ap, Llc Method of making stretchable interconnect using magnet wires
US10231333B1 (en) 2013-08-27 2019-03-12 Flextronics Ap, Llc. Copper interconnect for PTH components assembly
JP6075277B2 (en) * 2013-12-04 2017-02-08 オムロンヘルスケア株式会社 User authentication system
US9338915B1 (en) 2013-12-09 2016-05-10 Flextronics Ap, Llc Method of attaching electronic module on fabrics by stitching plated through holes
US10015880B1 (en) 2013-12-09 2018-07-03 Multek Technologies Ltd. Rip stop on flex and rigid flex circuits
US9659478B1 (en) * 2013-12-16 2017-05-23 Multek Technologies, Ltd. Wearable electronic stress and strain indicator
US9694156B2 (en) 2014-06-05 2017-07-04 Eight Sleep Inc. Bed device system and methods
US20150351556A1 (en) 2014-06-05 2015-12-10 Morphy Inc. Bed device system and methods
WO2015193551A1 (en) * 2014-06-18 2015-12-23 Medieta Oy Method, device and arrangement for determining pulse transit time
CA2960601A1 (en) 2014-09-09 2016-03-17 Torvec, Inc. Methods and apparatus for monitoring alertness of an individual utilizing a wearable device and providing notification
US20170277858A1 (en) * 2014-09-19 2017-09-28 Shinano Kenshi Co., Ltd. System for predicting risk of onset of cerebrovascular disease
AU2015331309B2 (en) * 2014-10-16 2019-08-22 Teijin Limited Protective equipment comprising alarm system
US10168430B2 (en) 2014-11-17 2019-01-01 Adam Sobol Wireless devices and systems for tracking patients and methods for using the like
GB2534678B (en) * 2014-12-18 2017-08-09 Wearable Tech Ltd Issuing alarm signal to operatives
CL2015001149A1 (en) * 2015-04-30 2015-08-28 Pontificia Universidad Católica De Chile Method and device for detecting and recording emotional events, in which the method comprises energizing sensors and other devices, recording and storing electro-dermal activity, ambient sound, temperature and movement of the user, setting the normal conditions of the variables, detecting at least an emotional event, send the data to a device for storage, processing and visualization of information, analyze the data and associate events with audio records
CA2985452A1 (en) * 2015-05-08 2016-11-17 Eight Sleep Inc. Vibrating alarm system and operating methods
US20160361032A1 (en) * 2015-05-14 2016-12-15 Abraham Carter Systems and Methods for Wearable Health Alerts
US11147505B1 (en) 2015-06-01 2021-10-19 Verily Life Sciences Llc Methods, systems and devices for identifying an abnormal sleep condition
KR102436726B1 (en) * 2015-06-15 2022-08-26 삼성전자주식회사 Method and apparatus of evaluating physiological aging level
US10194871B2 (en) 2015-09-25 2019-02-05 Sanmina Corporation Vehicular health monitoring system and method
US10678890B2 (en) 2015-08-06 2020-06-09 Microsoft Technology Licensing, Llc Client computing device health-related suggestions
WO2017040333A1 (en) * 2015-08-28 2017-03-09 Awarables, Inc. Visualizing, scoring, recording, and analyzing sleep data and hypnograms
WO2017040331A1 (en) * 2015-08-28 2017-03-09 Awarables, Inc. Determining sleep stages and sleep events using sensor data
US10709371B2 (en) 2015-09-09 2020-07-14 WellBrain, Inc. System and methods for serving a custom meditation program to a patient
ES2607721B2 (en) * 2015-10-02 2019-07-04 Univ Catalunya Politecnica Method and apparatus for estimating the transit time of the aortic pulse from measured time intervals between fiducial points of the balistocardiogram
CN105167759A (en) * 2015-10-09 2015-12-23 谢洪武 Human pulse wave velocity measuring method and system based on intelligent mobile phone
TWI610657B (en) * 2015-11-13 2018-01-11 慶旺科技股份有限公司 A wearable device which diagnoses personal cardiac health condition by monitoring and analyzing heartbeat and the method thereof
US10105092B2 (en) 2015-11-16 2018-10-23 Eight Sleep Inc. Detecting sleeping disorders
US10154932B2 (en) 2015-11-16 2018-12-18 Eight Sleep Inc. Adjustable bedframe and operating methods for health monitoring
US10758185B2 (en) 2015-11-25 2020-09-01 Texas Instruments Incorporated Heart rate estimation apparatus using digital automatic gain control
US10321831B2 (en) 2015-11-25 2019-06-18 Texas Instruments Incorporated Heart rate estimation apparatus with state sequence optimization
US9709417B1 (en) 2015-12-29 2017-07-18 Ebay Inc. Proactive re-routing of vehicles using passive monitoring of occupant frustration level
US9989369B2 (en) 2015-12-29 2018-06-05 Ebay Inc. Proactive re-routing of vehicles to control traffic flow
US9792814B2 (en) * 2015-12-29 2017-10-17 Ebay Inc. Traffic disruption detection using passive monitoring of vehicle occupant frustration level
US10803145B2 (en) * 2016-02-05 2020-10-13 The Intellectual Property Network, Inc. Triggered responses based on real-time electroencephalography
KR20170096323A (en) * 2016-02-16 2017-08-24 삼성전자주식회사 Method and apparatus of providing matching degree of real life rhythm of user and circadian rhythm of user
CN105748051B (en) * 2016-02-18 2018-10-09 京东方科技集团股份有限公司 A kind of blood pressure measuring device
CA3014812A1 (en) * 2016-02-18 2017-08-24 Curaegis Technologies, Inc. Alertness prediction system and method
US10747850B2 (en) 2016-03-29 2020-08-18 International Business Machines Corporation Medication scheduling and alerts
CN105877941A (en) * 2016-04-06 2016-08-24 吉林大学 Neurosurgery sickbed auxiliary device
EP3427648A4 (en) * 2016-04-15 2019-11-27 Omron Corporation Biological information analysis device, biological information analysis system, program, and biological information analysis method
US9762581B1 (en) 2016-04-15 2017-09-12 Striiv, Inc. Multifactor authentication through wearable electronic device
KR20190003532A (en) * 2016-04-19 2019-01-09 데이진 가부시키가이샤 Goods with alarm system
CA3027168C (en) 2016-04-27 2021-03-30 BRYX, Inc. Method, apparatus and computer-readable medium for aiding emergency response
KR20170124943A (en) * 2016-05-03 2017-11-13 삼성전자주식회사 Apparatus and method for extracting cardiovascular characteristic
TWI711429B (en) 2016-05-09 2020-12-01 香港商倍靈科技有限公司 Wearable device for healthcare and method thereof
DE102016211197B4 (en) * 2016-06-22 2018-11-08 Audi Ag Authenticate using vital signs
WO2018008666A1 (en) * 2016-07-07 2018-01-11 国立研究開発法人産業技術総合研究所 Physiological condition assessing device, physiological condition assessing method, program for physiological condition assessing device, and physiological condition assessing system
US11064893B2 (en) 2016-07-20 2021-07-20 Samsung Electronics Co., Ltd. Real time authentication based on blood flow parameters
KR101814382B1 (en) * 2016-08-05 2018-01-04 울산대학교 산학협력단 Apparatus and method for diagnosing blood circulatory disturbance
DE102016215250A1 (en) * 2016-08-16 2018-02-22 Audi Ag A method of operating a motor vehicle using a user's mobile terminal and physiological vital signs
US10602964B2 (en) * 2016-08-17 2020-03-31 Koninklijke Philips N.V. Location, activity, and health compliance monitoring using multidimensional context analysis
US11207021B2 (en) * 2016-09-06 2021-12-28 Fitbit, Inc Methods and systems for labeling sleep states
US10478590B2 (en) 2016-09-16 2019-11-19 Bose Corporation Sleep assistance device for multiple users
US11594111B2 (en) 2016-09-16 2023-02-28 Bose Corporation Intelligent wake-up system
US10434279B2 (en) 2016-09-16 2019-10-08 Bose Corporation Sleep assistance device
US10653856B2 (en) 2016-09-16 2020-05-19 Bose Corporation Sleep system
US10517527B2 (en) 2016-09-16 2019-12-31 Bose Corporation Sleep quality scoring and improvement
US10963146B2 (en) 2016-09-16 2021-03-30 Bose Corporation User interface for a sleep system
US10561362B2 (en) * 2016-09-16 2020-02-18 Bose Corporation Sleep assessment using a home sleep system
CN107865647B (en) * 2016-09-28 2020-01-14 京东方科技集团股份有限公司 Blood pressure detection device and method for calibrating blood pressure detection device
CN106473750B (en) * 2016-10-08 2019-03-26 西安电子科技大学 Personal identification method based on photoplethysmographic optimal period waveform
TWI594728B (en) * 2016-10-14 2017-08-11 麗寶大數據股份有限公司 Carpet type body fat meter
US10716518B2 (en) 2016-11-01 2020-07-21 Microsoft Technology Licensing, Llc Blood pressure estimation by wearable computing device
US20180174146A1 (en) * 2016-12-15 2018-06-21 Parveen Bansal Situational access override
US11670422B2 (en) 2017-01-13 2023-06-06 Microsoft Technology Licensing, Llc Machine-learning models for predicting decompensation risk
JP6702559B2 (en) * 2017-02-10 2020-06-03 株式会社東芝 Electronic device, method and program
US10749863B2 (en) * 2017-02-22 2020-08-18 Intel Corporation System, apparatus and method for providing contextual data in a biometric authentication system
KR102002638B1 (en) * 2017-03-06 2019-07-22 계명대학교 산학협력단 arrhythmia diagnostic method and apparatus using signal of electrocardiogram for driver
US11123014B2 (en) 2017-03-21 2021-09-21 Stryker Corporation Systems and methods for ambient energy powered physiological parameter monitoring
US11602656B2 (en) 2017-05-02 2023-03-14 Kyndryl, Inc. Cognitive solution to enhance firefighting capabilities
US10699247B2 (en) 2017-05-16 2020-06-30 Under Armour, Inc. Systems and methods for providing health task notifications
US11266346B2 (en) * 2017-06-07 2022-03-08 Electronics And Telecommunications Research Institute Method and apparatus for determining sleep state using biometric information and motion information
US11547351B2 (en) * 2017-06-22 2023-01-10 Sharp Kabushiki Kaisha Body condition managing device and body condition managing method
DE102017211631A1 (en) * 2017-07-07 2019-01-10 Bundesdruckerei Gmbh Electronic system and method for classifying a physiological condition
KR20200032703A (en) 2017-07-26 2020-03-26 닛토덴코 가부시키가이샤 Photoplethysmography (PPG) device and method for determining physiological changes
CN107516075B (en) * 2017-08-03 2020-10-09 安徽华米智能科技有限公司 Electrocardiosignal detection method and device and electronic equipment
CN110996796B (en) * 2017-08-08 2023-01-31 索尼公司 Information processing apparatus, method, and program
CN107707525B (en) * 2017-08-24 2020-06-19 大唐终端技术有限公司 Authentication method and device for talkback terminal
KR101962812B1 (en) * 2017-10-13 2019-03-28 아주대학교산학협력단 Method and apparatus for detecting REM sleep based on PPG
US20190021633A1 (en) * 2017-11-21 2019-01-24 Ling Wang Detecting respiratory rates in audio using an adaptive low-pass filter
EP3505051A1 (en) * 2017-12-29 2019-07-03 Sanmina Corporation Vehicular health monitoring system and method
WO2019139939A1 (en) 2018-01-09 2019-07-18 Eight Sleep, Inc. Systems and methods for detecting a biological signal of a user of an article of furniture
WO2019143953A1 (en) 2018-01-19 2019-07-25 Eight Sleep Inc. Sleep pod
US10284552B1 (en) 2018-06-06 2019-05-07 Capital One Services, Llc Systems and methods for using micro accelerations as a biometric identification factor
US11883142B2 (en) * 2018-03-15 2024-01-30 AtCor Medical Pty, Ltd. System and method for cardiovascular health monitoring
US11006875B2 (en) 2018-03-30 2021-05-18 Intel Corporation Technologies for emotion prediction based on breathing patterns
JP2019187678A (en) * 2018-04-23 2019-10-31 日本電信電話株式会社 Exercise performance estimation device, method, program
US11707225B2 (en) * 2018-04-27 2023-07-25 Samsung Electronics Co., Ltd. Bio-sensing based monitoring of health
CN108937860B (en) * 2018-06-06 2021-02-02 歌尔科技有限公司 Motion state monitoring method, system and equipment and storage medium
US10587615B2 (en) 2018-06-06 2020-03-10 Capital One Services, Llc Systems and methods for using micro accelerations as a biometric identification factor
KR102564269B1 (en) 2018-06-07 2023-08-07 삼성전자주식회사 Electronic apparatus for providing exercise information using biometric information and operating method thereof
US10380813B1 (en) 2018-07-19 2019-08-13 Capital One Services, Llc Systems and methods for using motion pattern of a user for authentication
US10621322B2 (en) 2018-08-02 2020-04-14 Capital One Services, Llc Platform for distinguishing human from machine input
GB2579656A (en) 2018-12-11 2020-07-01 Ge Aviat Systems Ltd Method of assessing a pilot emotional state
JP7028339B2 (en) * 2018-12-19 2022-03-02 日本電気株式会社 Information processing equipment, wearable equipment, information processing methods and storage media
US10957140B2 (en) * 2018-12-28 2021-03-23 Intel Corporation Multi-factor biometric authentication
KR102243012B1 (en) * 2019-02-13 2021-04-22 와이케이씨테크(주) Estimation method of blood vessel elasticity and arrhythmia using skin image
US11445927B2 (en) 2019-02-13 2022-09-20 Viavi Solutions Inc. Baseline correction and extraction of heartbeat profiles
EP3701863A1 (en) 2019-02-26 2020-09-02 Polar Electro Oy Cardiogram measurements
US10748656B1 (en) * 2019-03-12 2020-08-18 Harmonize Inc. Population health platform
CN110025321B (en) * 2019-03-20 2021-08-31 华为技术有限公司 Psychological stress assessment method and related equipment
WO2020198169A1 (en) * 2019-03-22 2020-10-01 Sibel Inc. Wireless communication system for wearable medical sensors
JP7246609B2 (en) * 2019-03-28 2023-03-28 京セラドキュメントソリューションズ株式会社 image forming device
WO2020202173A1 (en) * 2019-04-02 2020-10-08 Myelin Foundry Private Limited System and method for predicting wellness metrics
KR102259285B1 (en) * 2019-04-25 2021-06-01 서울대학교산학협력단 Apparatus and method for measuring blood pressure
KR102277105B1 (en) * 2019-06-03 2021-07-14 계명대학교 산학협력단 Non-contact system of measuring blood pressure and its way to working
CN112089423B (en) * 2019-06-18 2024-05-17 北京京东尚科信息技术有限公司 Sleep information determining method, device and equipment
WO2021016136A1 (en) * 2019-07-19 2021-01-28 Barnacka Anna System and method for heart rhythm detection and reporting
WO2021030737A1 (en) * 2019-08-14 2021-02-18 Cardio Ring Technologies, Inc. Blood pressure measurement systems and methods
CN113347916A (en) 2019-10-15 2021-09-03 因普瑞缇夫护理公司 System and method for multivariate stroke detection
JP7476514B2 (en) * 2019-10-29 2024-05-01 オムロンヘルスケア株式会社 Sphygmomanometer, method of operating the same, and program
WO2021211634A1 (en) * 2020-04-14 2021-10-21 Rubyelf Llc System and method of assessing intra-arterial fluid volume using intelligent pulse averaging with integrated ekg and ppg sensors
US20210275110A1 (en) * 2019-12-30 2021-09-09 RubyElf, LLC Systems For Synchronizing Different Devices To A Cardiac Cycle And For Generating Pulse Waveforms From Synchronized ECG and PPG Systems
RU2740601C1 (en) * 2020-03-05 2021-01-15 Федеральное государственное бюджетное образовательное учреждение высшего образования Читинская государственная медицинская академия Министерства здравоохранения российской федерации Method for prediction of risk of developing ischemic stroke in females over 50 years old
WO2021225573A1 (en) * 2020-05-04 2021-11-11 Hewlett-Packard Development Company, L.P. Partial signatures based on environmental characteristics
WO2022060991A1 (en) * 2020-09-16 2022-03-24 Oregon Health & Science University Wearable photoplethysmography device for detecting clinical decompensation based on heart rate variability
US12039025B2 (en) * 2020-11-19 2024-07-16 Jvckenwood Corporation Biometric authentication through vascular monitoring
US11995166B2 (en) 2020-12-17 2024-05-28 Jvckenwood Corporation Biometric authentication through vascular studies
CN112967801A (en) * 2021-01-28 2021-06-15 安徽华米健康科技有限公司 PAI value processing method, PAI value processing device, PAI value processing equipment and storage medium
US11468992B2 (en) 2021-02-04 2022-10-11 Harmonize Inc. Predicting adverse health events using a measure of adherence to a testing routine
KR102560787B1 (en) 2021-02-04 2023-07-26 삼성전자주식회사 Apparatus and method for estimating biological information, and electronic system having the same
KR20220125425A (en) * 2021-03-05 2022-09-14 삼성전자주식회사 Method for providing information on sleep quality and electronic device for supporting the same
TWI799821B (en) * 2021-03-30 2023-04-21 許維綸 Hazard Prediction and Prevention System
KR102570742B1 (en) * 2021-04-23 2023-08-24 우정하 Smart lifesaving alarm system
KR20230078415A (en) * 2021-11-26 2023-06-02 삼성전자주식회사 Electronic apparatus and controlling method thereof
WO2023150749A1 (en) * 2022-02-07 2023-08-10 Zoll Medical Corporation Patient engagement for wearable medical devices
JP2023158983A (en) * 2022-04-19 2023-10-31 株式会社エー・アンド・デイ Blood pressure measuring apparatus
TWI822234B (en) * 2022-08-08 2023-11-11 簡國隆 Disaster site control device and disaster site control system

Family Cites Families (115)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4913150A (en) 1986-08-18 1990-04-03 Physio-Control Corporation Method and apparatus for the automatic calibration of signals employed in oximetry
EP0267884A3 (en) 1986-11-10 1990-01-17 Saccinni ved. Vigna, Luisa Device for connecting electrical cables
US4854699A (en) 1987-11-02 1989-08-08 Nippon Colin Co., Ltd. Backscatter oximeter
US5289824A (en) 1991-12-26 1994-03-01 Instromedix, Inc. Wrist-worn ECG monitor
FI92139C (en) 1992-02-28 1994-10-10 Matti Myllymaeki Monitoring device for the health condition, which is attached to the wrist
US5692501A (en) 1993-09-20 1997-12-02 Minturn; Paul Scientific wellness personal/clinical/laboratory assessments, profile and health risk managment system with insurability rankings on cross-correlated 10-point optical health/fitness/wellness scales
US5836884A (en) * 1993-12-17 1998-11-17 Pulse Metric, Inc. Method for diagnosing, monitoring and treating hypertension and other cardiac problems
US5622178A (en) * 1994-05-04 1997-04-22 Spacelabs Medical, Inc. System and method for dynamically displaying cardiac interval data using scatter-plots
US6266623B1 (en) 1994-11-21 2001-07-24 Phatrat Technology, Inc. Sport monitoring apparatus for determining loft time, speed, power absorbed and other factors such as height
CN1255078C (en) 1996-06-12 2006-05-10 精工爱普生株式会社 Thermometer
US8734339B2 (en) * 1996-12-16 2014-05-27 Ip Holdings, Inc. Electronic skin patch for real time monitoring of cardiac activity and personal health management
US6008703A (en) 1997-01-31 1999-12-28 Massachusetts Institute Of Technology Digital compensation for wideband modulation of a phase locked loop frequency synthesizer
US5818788A (en) 1997-05-30 1998-10-06 Nec Corporation Circuit technique for logic integrated DRAM with SIMD architecture and a method for controlling low-power, high-speed and highly reliable operation
JP2004513669A (en) 1999-10-08 2004-05-13 ヘルセテック インコーポレイテッド Integrated calorie management system
US6527711B1 (en) 1999-10-18 2003-03-04 Bodymedia, Inc. Wearable human physiological data sensors and reporting system therefor
US6480733B1 (en) 1999-11-10 2002-11-12 Pacesetter, Inc. Method for monitoring heart failure
FI115289B (en) 2000-02-23 2005-04-15 Polar Electro Oy Measurement of an organism's energy metabolism and glucose levels
US6452149B1 (en) 2000-03-07 2002-09-17 Kabushiki Kaisha Toshiba Image input system including solid image sensing section and signal processing section
US7261690B2 (en) 2000-06-16 2007-08-28 Bodymedia, Inc. Apparatus for monitoring health, wellness and fitness
US7689437B1 (en) 2000-06-16 2010-03-30 Bodymedia, Inc. System for monitoring health, wellness and fitness
US6605038B1 (en) 2000-06-16 2003-08-12 Bodymedia, Inc. System for monitoring health, wellness and fitness
AU2001270092A1 (en) * 2000-06-23 2002-01-08 Bodymedia, Inc. System for monitoring health, wellness and fitness
EP2273778A1 (en) 2000-07-28 2011-01-12 Melexis Tessenderlo NV Precise MOS imager transfer function control for expanded dynamic range imaging
AU2002255568B8 (en) 2001-02-20 2014-01-09 Adidas Ag Modular personal network systems and methods
US7054679B2 (en) 2001-10-31 2006-05-30 Robert Hirsh Non-invasive method and device to monitor cardiac parameters
US7946959B2 (en) 2002-05-30 2011-05-24 Nike, Inc. Training scripts
US7020508B2 (en) 2002-08-22 2006-03-28 Bodymedia, Inc. Apparatus for detecting human physiological and contextual information
CZ2005209A3 (en) * 2002-09-10 2005-12-14 Ivi Smart Technologies, Inc. Safe biometric verification of identity
US6791462B2 (en) 2002-09-18 2004-09-14 Sang J. Choi Sleepy alarm system activated by heart pulse meter
US8157731B2 (en) 2002-10-09 2012-04-17 Bodymedia, Inc. Method and apparatus for auto journaling of continuous or discrete body states utilizing physiological and/or contextual parameters
US7349574B1 (en) 2002-10-11 2008-03-25 Sensata Technologies, Inc. System and method for processing non-linear image data from a digital imager
US7218966B2 (en) * 2003-04-11 2007-05-15 Cardiac Pacemakers, Inc. Multi-parameter arrhythmia discrimination
US20070159926A1 (en) 2003-04-17 2007-07-12 Nike, Inc. Adaptive Watch
EP1667579A4 (en) 2003-09-12 2008-06-11 Bodymedia Inc Method and apparatus for measuring heart related parameters
DE102004032812B4 (en) 2003-11-11 2006-07-20 Dräger Safety AG & Co. KGaA Combination sensor for physiological measurements
US20050209512A1 (en) * 2004-03-16 2005-09-22 Heruth Kenneth T Detecting sleep
US7717848B2 (en) * 2004-03-16 2010-05-18 Medtronic, Inc. Collecting sleep quality information via a medical device
WO2005092177A1 (en) 2004-03-22 2005-10-06 Bodymedia, Inc. Non-invasive temperature monitoring device
US8172761B1 (en) * 2004-09-28 2012-05-08 Impact Sports Technologies, Inc. Monitoring device with an accelerometer, method and system
JP4487730B2 (en) 2004-11-02 2010-06-23 株式会社日立製作所 Life status notification system
US7254516B2 (en) 2004-12-17 2007-08-07 Nike, Inc. Multi-sensor monitoring of athletic performance
US7319425B2 (en) 2005-03-21 2008-01-15 Massachusetts Institute Of Technology Comparator-based switched capacitor circuit for scaled semiconductor fabrication processes
US20070010748A1 (en) 2005-07-06 2007-01-11 Rauch Steven D Ambulatory monitors
US20070032731A1 (en) 2005-08-05 2007-02-08 Lovejoy Jeffrey L Non-invasive pulse rate detection via headphone mounted electrodes / monitoring system
US7534206B1 (en) 2005-09-19 2009-05-19 Garmin Ltd. Navigation-assisted fitness and dieting device
US7657307B2 (en) 2005-10-31 2010-02-02 Medtronic, Inc. Method of and apparatus for classifying arrhythmias using scatter plot analysis
US20070197881A1 (en) 2006-02-22 2007-08-23 Wolf James L Wireless Health Monitor Device and System with Cognition
US20070232454A1 (en) * 2006-03-28 2007-10-04 David Kagan Fitness assessment
US7539532B2 (en) 2006-05-12 2009-05-26 Bao Tran Cuffless blood pressure monitoring appliance
US8684922B2 (en) 2006-05-12 2014-04-01 Bao Tran Health monitoring system
US8500636B2 (en) 2006-05-12 2013-08-06 Bao Tran Health monitoring appliance
US7558622B2 (en) * 2006-05-24 2009-07-07 Bao Tran Mesh network stroke monitoring appliance
US8684900B2 (en) 2006-05-16 2014-04-01 Bao Tran Health monitoring appliance
US7539533B2 (en) 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
RU2009114727A (en) 2006-09-18 2010-10-27 Конинклейке Филипс Электроникс, Н.В. (Nl) MONITORING AND SIGNALING BASED ON THE INTERNET PROTOCOL
US20080076972A1 (en) * 2006-09-21 2008-03-27 Apple Inc. Integrated sensors for tracking performance metrics
DE102006057709B4 (en) 2006-12-07 2015-04-02 Dräger Medical GmbH Apparatus and method for determining a respiratory rate
KR20080069851A (en) 2007-01-24 2008-07-29 삼성전자주식회사 Biosignal-measuring sensor instrument and headset having the sensor instrument and pendant having the sensor instrument
US7846104B2 (en) 2007-02-08 2010-12-07 Heart Force Medical Inc. Monitoring physiological condition and detecting abnormalities
EP2750098A3 (en) 2007-02-16 2014-08-06 BodyMedia, Inc. Systems and methods for understanding and applying the physiological and contextual life patterns of an individual or set of individuals
US20080287821A1 (en) * 2007-03-30 2008-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US8140154B2 (en) * 2007-06-13 2012-03-20 Zoll Medical Corporation Wearable medical treatment device
US10154790B2 (en) 2007-08-21 2018-12-18 University College Dublin, National University Of Ireland Method and system for monitoring sleep
US8764653B2 (en) 2007-08-22 2014-07-01 Bozena Kaminska Apparatus for signal detection, processing and communication
WO2009108228A1 (en) * 2008-02-25 2009-09-03 Kingsdown, Inc. Systems and methods for controlling a bedroom environment and for providing sleep data
EP2116183B1 (en) 2008-05-07 2012-02-01 CSEM Centre Suisse d'Electronique et de Microtechnique SA Robust opto-electrical ear located cardiovascular monitoring device
US20100056878A1 (en) 2008-08-28 2010-03-04 Partin Dale L Indirectly coupled personal monitor for obtaining at least one physiological parameter of a subject
US20100076276A1 (en) 2008-09-25 2010-03-25 Nellcor Puritan Bennett Llc Medical Sensor, Display, and Technique For Using The Same
US8355769B2 (en) 2009-03-17 2013-01-15 Advanced Brain Monitoring, Inc. System for the assessment of sleep quality in adults and children
WO2010108287A1 (en) 2009-03-23 2010-09-30 Hongyue Luo A wearable intelligent healthcare system and method
WO2010111489A2 (en) 2009-03-27 2010-09-30 LifeWatch Corp. Methods and apparatus for processing physiological data acquired from an ambulatory physiological monitoring unit
EP2427102A2 (en) 2009-05-04 2012-03-14 MediSense Technologies, LLC System and method for monitoring blood glucose levels non-invasively
US20100292589A1 (en) 2009-05-13 2010-11-18 Jesse Bruce Goodman Hypothenar sensor
US8738118B2 (en) * 2009-05-20 2014-05-27 Sotera Wireless, Inc. Cable system for generating signals for detecting motion and measuring vital signs
EP2448474B1 (en) * 2009-06-23 2019-09-18 Boris Leschinsky Devices for remote ischemic preconditioning and near-continuous blood pressure monitoring
US20100331631A1 (en) 2009-06-30 2010-12-30 Nellcor Puritan Bennett Llc Oxygen saturation ear sensor design that optimizes both attachment method and signal quality
US20110066041A1 (en) * 2009-09-15 2011-03-17 Texas Instruments Incorporated Motion/activity, heart-rate and respiration from a single chest-worn sensor, circuits, devices, processes and systems
EP3735899B1 (en) 2009-10-15 2023-11-29 Masimo Corporation Acoustic respiratory monitoring sensor having multiple sensing elements
US20120209126A1 (en) 2009-10-20 2012-08-16 Widemed Ltd. Method and system for detecting cardiac arrhythmia
US9585589B2 (en) * 2009-12-31 2017-03-07 Cerner Innovation, Inc. Computerized systems and methods for stability-theoretic prediction and prevention of sudden cardiac death
CA2998940A1 (en) * 2010-01-14 2011-07-21 PhysIQ Inc. Multivariate residual-based health index for human health monitoring
KR20130051922A (en) * 2010-03-04 2013-05-21 뉴미트라 엘엘씨 Devices and methods for treating psychological disorders
WO2011113070A1 (en) 2010-03-07 2011-09-15 Centauri Medical, INC. Systems, devices and methods for preventing, detecting, and treating pressure-induced ischemia, pressure ulcers, and other conditions
US20110224499A1 (en) 2010-03-10 2011-09-15 Sotera Wireless, Inc. Body-worn vital sign monitor
RU2597774C2 (en) * 2010-07-21 2016-09-20 Конинклейке Филипс Электроникс Н.В. Detection and monitoring of abdominal aortic aneurysm
WO2012024106A2 (en) * 2010-08-17 2012-02-23 University Of Florida Research Foundation, Inc. Central site photoplethysmography, medication administration, and safety
US20120075122A1 (en) 2010-09-24 2012-03-29 Honeywell International Inc. Alert generation and related aircraft operating methods
US9167991B2 (en) 2010-09-30 2015-10-27 Fitbit, Inc. Portable monitoring devices and methods of operating same
JP5844389B2 (en) 2011-02-09 2016-01-13 マサチューセッツ インスティテュート オブ テクノロジー Ear-mounted multiple vital signs monitor
US20120220835A1 (en) * 2011-02-14 2012-08-30 Wayne Chung Wireless physiological sensor system and method
US8519835B2 (en) 2011-03-02 2013-08-27 Htc Corporation Systems and methods for sensory feedback
US8568330B2 (en) 2011-03-08 2013-10-29 Pulsaw Informatics, Inc. Composite human physiological stress index based on heart beat and sleep and/or activity history data including actigraphy
US8725311B1 (en) 2011-03-14 2014-05-13 American Vehicular Sciences, LLC Driver health and fatigue monitoring system and method
US20140089672A1 (en) * 2012-09-25 2014-03-27 Aliphcom Wearable device and method to generate biometric identifier for authentication using near-field communications
WO2013066642A1 (en) 2011-10-19 2013-05-10 Scanadu Incorporated Automated personal medical diagnostic system, method, and arrangement
US10006896B2 (en) * 2011-11-14 2018-06-26 University of Pittsburgh—of the Commonwealth System of Higher Education Method, apparatus and system for food intake and physical activity assessment
EP2747649A1 (en) 2011-12-20 2014-07-02 Koninklijke Philips N.V. Method and apparatus for monitoring the baroreceptor reflex of a user
US10251568B2 (en) * 2012-01-16 2019-04-09 Agency For Science, Technology And Research Method and system for optical blood pressure monitoring
US9186077B2 (en) 2012-02-16 2015-11-17 Google Technology Holdings LLC Method and device with customizable power management
RU2518134C2 (en) 2012-02-24 2014-06-10 Хилби Корпорейшн Method for determining individual's blood glucose concentration
US10219709B2 (en) * 2012-03-28 2019-03-05 Wayne State University Sensor and method for continuous health monitoring
US20130338460A1 (en) 2012-06-18 2013-12-19 David Da He Wearable Device for Continuous Cardiac Monitoring
US8954135B2 (en) * 2012-06-22 2015-02-10 Fitbit, Inc. Portable biometric monitoring devices and methods of operating same
US9044171B2 (en) * 2012-06-22 2015-06-02 Fitbit, Inc. GPS power conservation using environmental data
CA2912358A1 (en) 2012-08-10 2014-02-13 Cnv Systems Ltd. Mobile device system for measurement of cardiovascular health
EP2892421A1 (en) 2012-09-04 2015-07-15 Whoop, Inc. Systems, devices and methods for continuous heart rate monitoring and interpretation
EP2898470A4 (en) 2012-09-21 2017-09-06 CardioMems, Inc. Method and system for trend-based patient management
US20140085050A1 (en) 2012-09-25 2014-03-27 Aliphcom Validation of biometric identification used to authenticate identity of a user of wearable sensors
US20140089673A1 (en) 2012-09-25 2014-03-27 Aliphcom Biometric identification method and apparatus to authenticate identity of a user of a wearable device that includes sensors
US9098991B2 (en) 2013-01-15 2015-08-04 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US20140235979A1 (en) 2013-02-20 2014-08-21 Perminova Inc. Necklace-shaped physiological monitor
US9320434B2 (en) 2013-03-04 2016-04-26 Hello Inc. Patient monitoring systems and messages that send alerts to patients only when the patient is awake
US20140275883A1 (en) * 2013-03-14 2014-09-18 Covidien Lp Wireless sensors
US9558336B2 (en) * 2013-10-04 2017-01-31 Salutron Inc. Persistent authentication using sensors of a user-wearable device
US20150173674A1 (en) * 2013-12-20 2015-06-25 Diabetes Sentry Products Inc. Detecting and communicating health conditions

Cited By (175)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10463299B1 (en) * 2013-02-22 2019-11-05 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
US11872053B1 (en) * 2013-02-22 2024-01-16 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
US11612352B1 (en) * 2013-02-22 2023-03-28 Cloud Dx, Inc. Systems and methods for monitoring medication effectiveness
US20160007931A1 (en) * 2013-04-05 2016-01-14 Healbe Corporation Method for determining a person's sleeping phase which is favourable for waking up
US11224385B2 (en) * 2013-04-05 2022-01-18 Healbe Corporation Method for determining a person's sleeping phase which is favourable for waking up
US10786161B1 (en) 2013-11-27 2020-09-29 Bodymatter, Inc. Method for collection of blood pressure measurement
US11684270B2 (en) 2013-11-27 2023-06-27 Bodymatter, Inc. Method for collection of blood pressure measurement
US10509478B2 (en) 2014-06-03 2019-12-17 Google Llc Radar-based gesture-recognition from a surface radar field on which an interaction is sensed
US9971415B2 (en) 2014-06-03 2018-05-15 Google Llc Radar-based gesture-recognition through a wearable device
US10948996B2 (en) 2014-06-03 2021-03-16 Google Llc Radar-based gesture-recognition at a surface of an object
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US10278638B2 (en) * 2014-07-21 2019-05-07 Withings System and method to monitor and assist individual's sleep
US20160015314A1 (en) * 2014-07-21 2016-01-21 Withings System and Method to Monitor and Assist Individual's Sleep
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US10642367B2 (en) 2014-08-07 2020-05-05 Google Llc Radar-based gesture sensing and data transmission
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9933908B2 (en) 2014-08-15 2018-04-03 Google Llc Interactive textiles
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US9895071B2 (en) 2014-08-18 2018-02-20 Cameron Health, Inc. Cardiac rate tracking in an implantable medical device
US9451892B2 (en) * 2014-08-18 2016-09-27 Cameron Health, Inc. Cardiac rate tracking in an implantable medical device
US20160045131A1 (en) * 2014-08-18 2016-02-18 Cameron Health, Inc. Cardiac rate tracking in an implantable medical device
US11221682B2 (en) 2014-08-22 2022-01-11 Google Llc Occluded gesture recognition
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US11816101B2 (en) 2014-08-22 2023-11-14 Google Llc Radar recognition-aided search
US10936081B2 (en) 2014-08-22 2021-03-02 Google Llc Occluded gesture recognition
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US10409385B2 (en) 2014-08-22 2019-09-10 Google Llc Occluded gesture recognition
US10779738B2 (en) 2014-09-08 2020-09-22 Apple Inc. Electrical coupling of pulse transit time (PTT) measurement system to heart for blood pressure measurement
US10772512B2 (en) 2014-09-08 2020-09-15 Apple Inc. Blood pressure monitoring using a multi-function wrist-worn device
US11918324B2 (en) 2014-09-08 2024-03-05 Apple Inc. Electrical coupling of pulse transit time (PTT) measurement system to heart for blood pressure measurement
US10702171B2 (en) 2014-09-08 2020-07-07 Apple Inc. Systems, devices, and methods for measuring blood pressure of a user
US10517489B2 (en) 2014-09-08 2019-12-31 Apple Inc. Wrist worn accelerometer for pulse transit time (PTT) measurements of blood pressure
US11163371B2 (en) 2014-10-02 2021-11-02 Google Llc Non-line-of-sight radar-based gesture recognition
US10664059B2 (en) 2014-10-02 2020-05-26 Google Llc Non-line-of-sight radar-based gesture recognition
US9600080B2 (en) 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
WO2016065476A1 (en) * 2014-10-30 2016-05-06 2352409 Ontario Inc. A wearable device and method for non-invasive monitoring continuous blood pressure and other physiological parameters with reduced movement artifacts
USD772903S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with transitional graphical user interface
USD772905S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with graphical user interface
USD772904S1 (en) * 2014-11-14 2016-11-29 Volvo Car Corporation Display screen with transitional graphical user interface
USD772260S1 (en) * 2014-11-14 2016-11-22 Volvo Car Corporation Display screen with graphical user interface
US11622730B2 (en) 2014-11-17 2023-04-11 Rochester Institute Of Technology Pulse wave velocity, arterial compliance, and blood pressure
US20160150985A1 (en) * 2014-12-02 2016-06-02 Samsung Electronics Co., Ltd. Apparatus and method for measuring pulse wave
US10159417B2 (en) * 2014-12-02 2018-12-25 Samsung Electronics Co., Ltd. Apparatus and method for measuring pulse wave
US11076788B2 (en) * 2014-12-30 2021-08-03 Nitto Denko Corporation Method and apparatus for deriving a mental state of a subject
US10064582B2 (en) 2015-01-19 2018-09-04 Google Llc Noninvasive determination of cardiac health and other functional states and trends for human physiological systems
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US11219412B2 (en) 2015-03-23 2022-01-11 Google Llc In-ear health monitoring
US11311198B2 (en) * 2015-03-25 2022-04-26 Tata Consultancy Services Limited System and method for determining psychological stress of a person
US20160278644A1 (en) * 2015-03-25 2016-09-29 Quanttus, Inc. Contact-less blood pressure measurement
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US9848780B1 (en) * 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
US11709552B2 (en) 2015-04-30 2023-07-25 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10664061B2 (en) 2015-04-30 2020-05-26 Google Llc Wide-field radar-based gesture recognition
US10310620B2 (en) 2015-04-30 2019-06-04 Google Llc Type-agnostic RF signal representations
US10241581B2 (en) 2015-04-30 2019-03-26 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10139916B2 (en) 2015-04-30 2018-11-27 Google Llc Wide-field radar-based gesture recognition
US10817070B2 (en) 2015-04-30 2020-10-27 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10496182B2 (en) 2015-04-30 2019-12-03 Google Llc Type-agnostic RF signal representations
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US10155274B2 (en) 2015-05-27 2018-12-18 Google Llc Attaching electronic components to interactive textiles
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
US10936085B2 (en) 2015-05-27 2021-03-02 Google Llc Gesture detection and interactions
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US10203763B1 (en) 2015-05-27 2019-02-12 Google Inc. Gesture detection and interactions
US10572027B2 (en) 2015-05-27 2020-02-25 Google Llc Gesture detection and interactions
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US10799122B2 (en) 2015-06-14 2020-10-13 Facense Ltd. Utilizing correlations between PPG signals and iPPG signals to improve detection of physiological responses
US11103139B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Detecting fever from video images and a baseline
US11154203B2 (en) 2015-06-14 2021-10-26 Facense Ltd. Detecting fever from images and temperatures
US10349887B1 (en) 2015-06-14 2019-07-16 Facense Ltd. Blood pressure measuring smartglasses
US20210275034A1 (en) * 2015-06-14 2021-09-09 Facense Ltd. Wearable-based health state verification for physical access authorization
US11103140B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Monitoring blood sugar level with a comfortable head-mounted device
US10791938B2 (en) 2015-06-14 2020-10-06 Facense Ltd. Smartglasses for detecting congestive heart failure
US10667697B2 (en) 2015-06-14 2020-06-02 Facense Ltd. Identification of posture-related syncope using head-mounted sensors
US10376163B1 (en) 2015-06-14 2019-08-13 Facense Ltd. Blood pressure from inward-facing head-mounted cameras
US11986273B2 (en) 2015-06-14 2024-05-21 Facense Ltd. Detecting alcohol intoxication from video images
US20240108228A1 (en) * 2015-06-14 2024-04-04 Facense Ltd. Reservation booking using wearable-based health state verifications
US11903680B2 (en) * 2015-06-14 2024-02-20 Facense Ltd. Wearable-based health state verification for physical access authorization
US10638938B1 (en) 2015-06-14 2020-05-05 Facense Ltd. Eyeglasses to detect abnormal medical events including stroke and migraine
US11064892B2 (en) 2015-06-14 2021-07-20 Facense Ltd. Detecting a transient ischemic attack using photoplethysmogram signals
US10542961B2 (en) 2015-06-15 2020-01-28 The Research Foundation For The State University Of New York System and method for infrasonic cardiac monitoring
US11478215B2 (en) 2015-06-15 2022-10-25 The Research Foundation for the State University o System and method for infrasonic cardiac monitoring
US20170020443A1 (en) * 2015-07-20 2017-01-26 iFeel Healthy Ltd. Methods and systems of controlling a subject's body feature having a periodic wave function
CN106650196A (en) * 2015-07-20 2017-05-10 我感觉健康有限责任公司 Methods and systems of controlling a subject's body feature having a periodic wave function
US10307100B2 (en) * 2015-07-20 2019-06-04 iFeel Healthy Ltd. Methods and systems of controlling a subject's body feature having a periodic wave function
US9955303B2 (en) 2015-07-21 2018-04-24 IP Funding Group, LLC Determining relative position with a BLE beacon
US10413233B2 (en) * 2015-08-10 2019-09-17 Murata Manufacturing Co., Ltd. Monitoring of sleep phenomena
US20170042471A1 (en) * 2015-08-10 2017-02-16 Murata Manufacturing Co., Ltd. Monitoring of sleep phenomena
US10733274B2 (en) * 2015-08-11 2020-08-04 Suprema Inc. Biometric authentication using gesture
US20170046508A1 (en) * 2015-08-11 2017-02-16 Suprema Inc. Biometric authentication using gesture
US10849555B1 (en) 2015-09-29 2020-12-01 Apple Inc. Devices and systems for correcting errors in blood pressure measurements
US10646121B2 (en) 2015-09-29 2020-05-12 Apple Inc. Pressure measurement designs
US10881307B1 (en) 2015-09-29 2021-01-05 Apple Inc. Devices and systems for correcting errors in blood pressure measurements
US11175743B2 (en) 2015-10-06 2021-11-16 Google Llc Gesture recognition using multiple antenna
US10379621B2 (en) 2015-10-06 2019-08-13 Google Llc Gesture component with gesture library
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
US12117560B2 (en) 2015-10-06 2024-10-15 Google Llc Radar-enabled sensor fusion
US12085670B2 (en) 2015-10-06 2024-09-10 Google Llc Advanced gaming and virtual reality control using radar
US11256335B2 (en) 2015-10-06 2022-02-22 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US11698438B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US11698439B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US10908696B2 (en) 2015-10-06 2021-02-02 Google Llc Advanced gaming and virtual reality control using radar
US11693092B2 (en) 2015-10-06 2023-07-04 Google Llc Gesture recognition using multiple antenna
US11385721B2 (en) 2015-10-06 2022-07-12 Google Llc Application-based signal processing parameters in radar-based detection
US10459080B1 (en) 2015-10-06 2019-10-29 Google Llc Radar-based object detection for vehicles
US10768712B2 (en) 2015-10-06 2020-09-08 Google Llc Gesture component with gesture library
US10503883B1 (en) 2015-10-06 2019-12-10 Google Llc Radar-based authentication
US11481040B2 (en) 2015-10-06 2022-10-25 Google Llc User-customizable machine-learning in radar-based gesture detection
US11132065B2 (en) 2015-10-06 2021-09-28 Google Llc Radar-enabled sensor fusion
US11656336B2 (en) 2015-10-06 2023-05-23 Google Llc Advanced gaming and virtual reality control using radar
US10823841B1 (en) 2015-10-06 2020-11-03 Google Llc Radar imaging on a mobile computing device
US11592909B2 (en) 2015-10-06 2023-02-28 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US10705185B1 (en) 2015-10-06 2020-07-07 Google Llc Application-based signal processing parameters in radar-based detection
US10401490B2 (en) 2015-10-06 2019-09-03 Google Llc Radar-enabled sensor fusion
US10300370B1 (en) 2015-10-06 2019-05-28 Google Llc Advanced gaming and virtual reality control using radar
US11080556B1 (en) 2015-10-06 2021-08-03 Google Llc User-customizable machine-learning in radar-based gesture detection
US10310621B1 (en) 2015-10-06 2019-06-04 Google Llc Radar gesture sensing using existing data protocols
US10540001B1 (en) 2015-10-06 2020-01-21 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US20170126613A1 (en) * 2015-11-03 2017-05-04 Joiiup Technology Inc. Instant information exchange system and method for online sports teams
US9837760B2 (en) 2015-11-04 2017-12-05 Google Inc. Connectors for connecting electronics embedded in garments to external devices
USD803240S1 (en) * 2015-12-09 2017-11-21 Facebook, Inc. Display screen with animated graphical user interface
US9892247B2 (en) * 2015-12-30 2018-02-13 Motorola Mobility Llc Multimodal biometric authentication system and method with photoplethysmography (PPG) bulk absorption biometric
US20230277075A1 (en) * 2016-01-22 2023-09-07 Fitbit, Inc. Photoplethysmography-Based Pulse Wave Analysis Using a Wearable Device
US10973422B2 (en) * 2016-01-22 2021-04-13 Fitbit, Inc. Photoplethysmography-based pulse wave analysis using a wearable device
US11684281B2 (en) * 2016-01-22 2023-06-27 Fitbit, Inc. Photoplethysmography-based pulse wave analysis using a wearable device
US12070297B2 (en) * 2016-01-22 2024-08-27 Fitbit, Inc. Photoplethysmography-based pulse wave analysis using a wearable device
US20210369128A1 (en) * 2016-01-22 2021-12-02 Fitbit, Inc. Photoplethysmography-based pulse wave analysis using a wearable device
US11589758B2 (en) 2016-01-25 2023-02-28 Fitbit, Inc. Calibration of pulse-transit-time to blood pressure model using multiple physiological sensors and various methods for blood pressure variation
CN107049293A (en) * 2016-02-02 2017-08-18 富士通株式会社 Sensor information processing equipment
US11179049B2 (en) 2016-02-29 2021-11-23 Fitbit, Inc. Intelligent inflatable cuff for arm-based blood pressure measurement
US12059233B2 (en) 2016-03-07 2024-08-13 Fitbit, Inc. Blood pressure sensors
US20170290526A1 (en) * 2016-04-07 2017-10-12 Oregon Health & Science University Telecentive spirometer
US11140787B2 (en) 2016-05-03 2021-10-05 Google Llc Connecting an electronic component to an interactive textile
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
US11850063B2 (en) * 2016-06-13 2023-12-26 Medtronic, Inc. Multi-parameter prediction of acute cardiac episodes and attacks
US20200069245A1 (en) * 2016-06-13 2020-03-05 Medtronic, Inc. Multi-parameter prediction of acute cardiac episodes and attacks
TWI720215B (en) * 2016-06-29 2021-03-01 南韓商三星電子股份有限公司 System and method for providing a real-time signal segmentation and fiducial points alignment framework
US10426411B2 (en) * 2016-06-29 2019-10-01 Samsung Electronics Co., Ltd. System and method for providing a real-time signal segmentation and fiducial points alignment framework
CN107536599A (en) * 2016-06-29 2018-01-05 三星电子株式会社 The System and method for of live signal segmentation and datum mark alignment framework is provided
JP2018000952A (en) * 2016-06-29 2018-01-11 三星電子株式会社Samsung Electronics Co.,Ltd. System and method for providing real-time signal segmentation and fiducial point alignment framework
US20180000426A1 (en) * 2016-06-29 2018-01-04 Samsung Electronics Co., Ltd. System and Method for Providing a Real-Time Signal Segmentation and Fiducial Points Alignment Framework
WO2018013569A1 (en) * 2016-07-11 2018-01-18 Mc10, Inc. Multi-sensor blood pressure measurement system
CN109688910A (en) * 2016-07-11 2019-04-26 Mc10股份有限公司 Multisensor blood pressure measuring system
US11963748B2 (en) 2016-08-18 2024-04-23 Verily Life Sciences Llc Portable monitor for heart rate detection
US11419509B1 (en) * 2016-08-18 2022-08-23 Verily Life Sciences Llc Portable monitor for heart rate detection
US10905382B2 (en) 2016-10-25 2021-02-02 Samsung Electronics Co., Ltd. Bio-signal quality assessment apparatus and method and bio-signal measurement parameter optimization apparatus and method
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
US10980428B2 (en) 2016-12-15 2021-04-20 ViviPulse, LLC Wearable pulse waveform measurement system and method
US10939834B2 (en) * 2017-05-01 2021-03-09 Samsung Electronics Company, Ltd. Determining cardiovascular features using camera-based sensing
US11957454B2 (en) * 2017-06-23 2024-04-16 Koninklijke Philips N.V. Device, system and method for detection of pulse and/or pulse-related information of a patient
US10869627B2 (en) * 2017-07-05 2020-12-22 Osr Enterprises Ag System and method for fusing information related to a driver of a vehicle
US10896375B2 (en) * 2017-07-11 2021-01-19 International Business Machines Corporation Cognitive replication through augmented reality
US10896376B2 (en) * 2017-07-11 2021-01-19 International Business Machines Corporation Cognitive replication through augmented reality
US11687800B2 (en) * 2017-08-30 2023-06-27 P Tech, Llc Artificial intelligence and/or virtual reality for activity optimization/personalization
US20190065970A1 (en) * 2017-08-30 2019-02-28 P Tech, Llc Artificial intelligence and/or virtual reality for activity optimization/personalization
US12014289B2 (en) 2017-08-30 2024-06-18 P Tech, Llc Artificial intelligence and/or virtual reality for activity optimization/personalization
US11543785B2 (en) * 2017-12-06 2023-01-03 Sintokogio, Ltd. Industrial machine startup control system, startup control method, and program
US10244985B1 (en) 2017-12-28 2019-04-02 Saleem Sayani Wearable diagnostic device
US11076805B2 (en) 2017-12-28 2021-08-03 Saleem Sayani Wearable diagnostic device
US11445986B2 (en) * 2018-01-30 2022-09-20 Gaia Connect Inc. Health monitor wearable device
US11583197B2 (en) * 2018-12-05 2023-02-21 Acer Incorporated Method and device for detecting cardiac arrhythmia based on photoplethysmographic signal
US11432775B2 (en) 2018-12-21 2022-09-06 Samsung Electronics Co., Ltd. Apparatus and method for estimating blood pressure
US20220304622A1 (en) * 2019-06-17 2022-09-29 Happy Health, Inc. Wearable device operable to detect and/or prepare a user for sleep
CN112168139A (en) * 2019-07-05 2021-01-05 腾讯科技(深圳)有限公司 Health monitoring method and device and storage medium
US10559145B1 (en) * 2019-07-17 2020-02-11 Abdulaziz Mohammed Almehmadi Systems and methods for providing behavioral based intention detection
US11460928B2 (en) 2019-12-18 2022-10-04 Samsung Electronics Co., Ltd. Electronic device for recognizing gesture of user from sensor signal of user and method for recognizing gesture using the same
WO2021125677A1 (en) * 2019-12-18 2021-06-24 Samsung Electronics Co., Ltd. Electronic device for recognizing gesture of user from sensor signal of user and method for recognizing gesture using the same
US20210216904A1 (en) * 2020-01-13 2021-07-15 International Business Machines Corporation Knowledge Aided Feature Engineering
US11599826B2 (en) * 2020-01-13 2023-03-07 International Business Machines Corporation Knowledge aided feature engineering
US11800984B2 (en) 2020-07-08 2023-10-31 Nec Corporation Of America Sensor fusion for measurement of physiological parameters
US11361445B2 (en) 2020-07-08 2022-06-14 Nec Corporation Of America Image analysis for detecting mask compliance
US11666271B2 (en) * 2020-12-09 2023-06-06 Medtronic, Inc. Detection and monitoring of sleep apnea conditions
WO2023152622A1 (en) * 2022-02-08 2023-08-17 Bioenergetics Systems Technology Group Limited A wearabale monitor with photoplethysmogram sensor for determining emotion level
WO2024049973A1 (en) * 2022-09-02 2024-03-07 Board Of Regents Of The University Of Nebraska Systems and methods for determination of pulse arrival time with wearable electronic devices

Also Published As

Publication number Publication date
US9396642B2 (en) 2016-07-19
CA2928197A1 (en) 2015-04-30
JP2016538097A (en) 2016-12-08
US20150112154A1 (en) 2015-04-23
US20150112158A1 (en) 2015-04-23
KR20160075677A (en) 2016-06-29
US20150109124A1 (en) 2015-04-23
US20150112157A1 (en) 2015-04-23
US20150112452A1 (en) 2015-04-23
US9396643B2 (en) 2016-07-19
US20150112155A1 (en) 2015-04-23
US20150112156A1 (en) 2015-04-23
US20150112208A1 (en) 2015-04-23
US20150112159A1 (en) 2015-04-23
WO2015061579A1 (en) 2015-04-30
EP3060107A1 (en) 2016-08-31

Similar Documents

Publication Publication Date Title
US9396642B2 (en) Control using connected biometric devices
US20160302677A1 (en) Calibrating for Blood Pressure Using Height Difference
US20150164351A1 (en) Calculating pulse transit time from chest vibrations
RU2656559C2 (en) Method and device for determining life-important parameters
US11992288B2 (en) Systems and methods for quantification of, and prediction of smoking behavior
CN107708548B (en) System and method for quantification and prediction of smoking behavior
KR102318887B1 (en) Wearable electronic device and method for controlling thereof
US20240350098A1 (en) System for monitoring and providing alerts of a fall risk by predicting risk of experiencing symptoms related to abnormal blood pressure(s) and/or heart rate
EP4032469A1 (en) Systems and methods of multispectral blood measurement
US20200035337A1 (en) Method and product for determining a state value, a value representing the state of a subject
US20220296847A1 (en) Wearable device operable to detect and/or manage user stress
US20210244365A1 (en) Non-invasive epidermal health-monitoring sensor, patch system and method, and epidemiological monitoring and tracking system related thereto
WO2016137698A1 (en) Calculating pulse transit time from chest vibrations
CA3124360A1 (en) System and method for blood pressure monitoring with subject awareness information
US20220167859A1 (en) System and method for blood pressure monitoring with subject awareness information
US12076161B2 (en) Unobtrusive symptoms monitoring for allergic asthma patients
CA3115419A1 (en) Non-invasive epidermal health-monitoring sensor, patch system and method, and epidemiological monitoring and tracking system related thereto
US20240081647A1 (en) Systems and methods for assisting in smoking cessation
Ishaque Heart-rate Variability Analysis for Stress Assessment in a Video-Game Setup

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUANTTUS, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HE, DAVID DA;BIJJANI, RICHARD ROBEHR;REEL/FRAME:034118/0010

Effective date: 20141030

AS Assignment

Owner name: ROBERT F. DUDLEY, AS TRUSTEE OF THE QUANTTUS LIQUI

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:QUANTTUS, INC.;REEL/FRAME:041019/0850

Effective date: 20161228

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION