CN113936421A - Detecting falls using a mobile device
- Publication number: CN113936421A
- Application number: CN202110792593.XA
- Authority: CN (China)
- Prior art keywords: user, mobile device, fall, data, impact
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0438—Sensor means for detecting
- G08B21/0446—Sensor means for detecting worn on the body to detect changes of posture, e.g. a fall, inclination, acceleration, gait
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
- A61B5/1117—Fall detection
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/746—Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/08—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using communication transmission lines
Abstract
The invention is entitled "Detecting falls using a mobile device." In an exemplary method, a mobile device receives motion data acquired by one or more sensors over a period of time, wherein the one or more sensors are worn by a user. The mobile device determines, based on the motion data, an impact experienced by the user during the period of time, and determines one or more characteristics of the user. The mobile device determines a likelihood that the user needs help after the impact based on the motion data and the one or more characteristics of the user, and generates one or more notifications based on the likelihood.
Description
Cross Reference to Related Applications
This application is a continuation-in-part of U.S. patent application No. 16/852,370, filed on April 17, 2020, which is a continuation-in-part of U.S. application No. 16/128,464, filed on September 11, 2018, which claims priority to U.S. provisional application No. 62/565,988, filed on September 29, 2017, each of which is incorporated herein by reference in its entirety.
Technical Field
The present disclosure relates to techniques for determining whether a user has fallen using a mobile device.
Background
A motion sensor is a device that measures motion experienced by an object (e.g., velocity or acceleration of the object with respect to time, orientation or change in orientation of the object with respect to time, etc.). In some cases, a mobile device (e.g., a cellular phone, a smartphone, a tablet, a wearable electronic device such as a smart watch, etc.) may include one or more motion sensors that determine motion experienced by the mobile device over a period of time. If the mobile device is worn by a user, measurements taken by the motion sensor may be used to determine the motion experienced by the user over a period of time.
Disclosure of Invention
Systems, methods, devices, and non-transitory computer-readable media are disclosed herein for electronically determining whether a user has fallen using a mobile device.
In an aspect, a method includes acquiring, by a mobile device, motion data indicative of motion measured by a motion sensor over a period of time. The motion sensor is worn or carried by a user. The method also includes determining, by the mobile device, an impact experienced by the user based on the motion data, the impact occurring during a first interval of the time period. The method also includes determining, by the mobile device, one or more first motion characteristics of the user during a second interval of the time period based on the motion data. The second interval occurs before the first interval. The method also includes determining, by the mobile device, one or more second motion characteristics of the user during a third interval of the time period based on the motion data. The third interval occurs after the first interval. The method further includes determining, by the mobile device, that the user has fallen based on the impact, the one or more first motion characteristics of the user, and the one or more second motion characteristics of the user, and, in response to determining that the user has fallen, generating, by the mobile device, a notification indicating that the user has fallen.
Implementations of this aspect may include one or more of the following features.
In some implementations, determining the one or more first motion characteristics may include determining that the user is walking during the second interval based on the motion data.
In some implementations, determining the one or more first motion characteristics may include determining that the user is ascending stairs or descending stairs during the second interval based on the motion data.
In some implementations, determining the one or more first motion characteristics may include determining, based on the motion data, that the user is moving a body part according to a flailing motion or a bracing motion during the second interval.
In some implementations, determining the one or more second motion characteristics may include determining that the user is walking during a third interval based on the motion data.
In some implementations, determining the one or more second motion characteristics may include determining that the user is standing during a third interval based on the motion data.
In some implementations, determining the one or more second motion characteristics may include determining, based on the motion data, that the orientation of the body part of the user changed N or more times during the third interval.
In some implementations, determining that the user has fallen may include determining, based on the motion data, that the impact is greater than a first threshold, and determining, based on the motion data, that the motion of the user is impaired during a third interval.
In some implementations, determining that the user has fallen may include determining, based on the motion data, that the impact is less than a first threshold and greater than a second threshold; determining, based on the motion data, that the user is at least one of walking, ascending stairs, or descending stairs during the second interval; determining, based on the motion data, that the user is moving a body part according to a flailing motion or a bracing motion during the second interval; and determining, based on the motion data, that the motion of the user is impaired during the third interval.
In some implementations, generating the notification can include presenting an indication on at least one of a display device or an audio device of the mobile device that the user has fallen.
In some implementations, generating the notification can include transmitting data to a communication device remote from the mobile device, the data including an indication that the user has fallen.
In some implementations, determining that the user has fallen may include generating a statistical model based on one or more sampled impacts, one or more sampled first motion characteristics, and one or more sampled second motion characteristics. The one or more sampled impacts, the one or more sampled first motion characteristics, and the one or more sampled second motion characteristics may be determined based on sample motion data. The sample motion data may be indicative of motion measured by one or more additional motion sensors over one or more additional time periods, wherein each additional motion sensor is worn by a respective additional user.
In some implementations, the statistical model can be a bayesian statistical model.
In some implementations, the one or more sampled first motion characteristics may include an indication of a type of activity performed by a particular additional user relative to the sample motion data, an indication of a level of activity of the particular additional user relative to the sample motion data, and/or an indication of a walking speed of the particular additional user relative to the sample motion data.
In some implementations, the method may be performed by a co-processor of a mobile device. The co-processor may be configured to receive motion data acquired from the one or more motion sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In some implementations, the mobile device can include a motion sensor.
In some implementations, the mobile device can be worn on the user's arm or wrist when the sensor measures motion.
In some implementations, the mobile device can be a wearable mobile device.
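To make the two-threshold logic of this aspect concrete, the following is a minimal sketch in Python. The threshold values, function names, and the reduction of the pre- and post-impact intervals to boolean features are illustrative assumptions, not values specified in this disclosure.

```python
# Hypothetical sketch of fall detection from an impact plus pre/post-impact
# motion characteristics. Thresholds and feature names are assumptions.
import numpy as np

HIGH_IMPACT_G = 3.0  # hypothetical first threshold (in g)
LOW_IMPACT_G = 1.5   # hypothetical second threshold (in g)

def peak_impact(accel_impact: np.ndarray) -> float:
    """Peak acceleration magnitude (in g) over the (N, 3) impact interval."""
    return float(np.max(np.linalg.norm(accel_impact, axis=1)))

def detect_fall(accel_impact: np.ndarray,
                walking_or_stairs_before: bool,  # first motion characteristic
                flail_or_brace_before: bool,     # first motion characteristic
                motion_impaired_after: bool      # second motion characteristic
                ) -> bool:
    impact = peak_impact(accel_impact)
    if impact > HIGH_IMPACT_G:
        # A large impact followed by impaired motion is treated as a fall.
        return motion_impaired_after
    if impact > LOW_IMPACT_G:
        # A moderate impact also requires corroborating pre-impact behavior.
        return (walking_or_stairs_before
                and flail_or_brace_before
                and motion_impaired_after)
    return False
```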
In another aspect, a method includes acquiring, by a mobile device, a first signal indicative of an acceleration measured by an accelerometer over a time period, and a second signal indicative of an orientation measured by an orientation sensor over the time period, wherein the accelerometer and the orientation sensor are physically coupled to a user. The method also includes determining, by the mobile device, rotation data indicative of an amount of rotation experienced by the user during the time period, determining, by the mobile device, that the user has tumbled based on the rotation data, and, in response to determining that the user has tumbled, generating, by the mobile device, a notification indicating that the user has tumbled.
Implementations of this aspect may include one or more of the following features.
In some implementations, the rotation data can include a third signal indicative of a rate of rotation experienced by the user during the time period.
In some implementations, the rotation data can include an indication of one or more axes of rotation of the user in the reference coordinate system during the time period.
In some implementations, the rotation data can include an indication of an average axis of rotation of the user during the time period.
In some implementations, determining that the user has tumbled may include determining a change between one or more axes of rotation of the user during the time period and an average axis of rotation of the user during the time period.
In some implementations, determining that the user has tumbled may include determining that the change is less than a first threshold, and, in response to determining that the change is less than the first threshold, determining, based on the third signal, a fourth signal corresponding to an angular displacement of the user during the time period.
In some implementations, determining the fourth signal can include integrating the third signal with respect to the time period.
In some implementations, determining that the user has tumbled may include determining that the angular displacement of the user during the time period is greater than a second threshold; determining that at least one of the one or more axes of rotation of the user during the time period is greater than a third threshold; and determining that the user has tumbled in response to determining that the angular displacement experienced by the user during the time period is greater than the second threshold and that at least one of the one or more axes of rotation of the user during the time period is greater than the third threshold.
In some implementations, generating the notification can include presenting an indication on at least one of a display device or an audio device of the mobile device that the user has tumbled.
In some implementations, generating the notification can include transmitting data to a communication device remote from the mobile device, the data including an indication that the user has tumbled.
In some implementations, the method may be performed by a co-processor of a mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
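The tumble determination in this aspect can be illustrated with a short sketch: the instantaneous rotation axes are compared against the average axis of rotation, and the rotation-rate signal is integrated to obtain the angular displacement. The thresholds and the simple Riemann-sum integration are assumptions chosen for illustration only.

```python
# Hypothetical sketch of the tumble check described above.
import numpy as np

AXIS_CHANGE_MAX_RAD = 0.3         # hypothetical first threshold (axis stability)
MIN_DISPLACEMENT_RAD = np.pi / 2  # hypothetical second threshold (total rotation)

def detect_tumble(rotation_rate: np.ndarray, axes: np.ndarray, dt: float) -> bool:
    """rotation_rate: (N,) rad/s; axes: (N, 3) unit rotation axes; dt: seconds."""
    mean_axis = axes.mean(axis=0)
    mean_axis /= np.linalg.norm(mean_axis)
    # Change between the instantaneous rotation axes and the average axis.
    change = float(np.mean(np.arccos(np.clip(axes @ mean_axis, -1.0, 1.0))))
    if change >= AXIS_CHANGE_MAX_RAD:
        return False  # axis wanders too much to be a coherent tumble
    # The "fourth signal": angular displacement, obtained by integrating the
    # rotation-rate signal over the time period (simple Riemann sum).
    displacement = float(np.sum(rotation_rate) * dt)
    return displacement > MIN_DISPLACEMENT_RAD
```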
In another aspect, a method includes obtaining, by a mobile device, motion data indicative of motion measured by one or more motion sensors over a first time period. One or more motion sensors are worn by the user. The method also includes determining, by the mobile device, that the user has fallen based on the motion data, and in response to determining that the user has fallen, generating, by the mobile device, one or more notifications indicating that the user has fallen.
Implementations of this aspect may include one or more of the following features.
In some implementations, generating the one or more notifications can include presenting a first notification to the user indicating that the user has fallen.
In some implementations, the first notification can include at least one of a visual message, an audio message, or a tactile message.
In some implementations, generating the one or more notifications can include receiving, by the mobile device, an input from the user in response to the first notification. The input may indicate a user request for assistance. Additionally, generating the one or more notifications may include, in response to receiving the input, transmitting a second notification to the communication device remote from the mobile device indicating the request for assistance.
In some implementations, the communication device may be an emergency response system.
In some implementations, the second notification can indicate a location of the mobile device.
In some implementations, generating the one or more notifications can include determining, by the mobile device, that the user has not moved during a second time period after the user has fallen, and in response to determining that the user has not moved during the second time period, transmitting a second notification indicating the request for assistance to a communication device remote from the mobile device.
In some implementations, generating the one or more notifications can include determining, by the mobile device, that the user has moved during a second time period after the user has fallen, and refraining from transmitting a second notification indicating a request for assistance to a communication device remote from the mobile device in response to determining that the user has moved during the second time period.
In some implementations, one or more notifications can be generated according to a state machine.
In some implementations, the one or more motion sensors may include at least one of an accelerometer or a gyroscope.
In some implementations, the mobile device can be a wearable mobile device.
In some implementations, determining that the user has fallen may include determining that the user experienced an impact based on the motion data.
In some implementations, determining that the user has fallen may include determining behavior of the user during a first time period.
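As a sketch of how the notifications in this aspect might be generated according to a state machine, consider the following. The states, transitions, and timeout behavior are assumptions, not the state machine defined in this disclosure.

```python
# Hypothetical notification state machine for the fall-response flow above.
from enum import Enum, auto

class FallState(Enum):
    IDLE = auto()
    FALL_DETECTED = auto()  # first notification presented to the user
    WAITING = auto()        # watching for user movement or a user response
    SOS_SENT = auto()       # second notification sent to a remote device
    DISMISSED = auto()

def next_state(state: FallState, *, user_requested_help: bool,
               user_moved: bool, timed_out: bool) -> FallState:
    if state is FallState.FALL_DETECTED:
        return FallState.WAITING
    if state is FallState.WAITING:
        if user_requested_help:
            return FallState.SOS_SENT   # user input requesting assistance
        if user_moved:
            return FallState.DISMISSED  # refrain from transmitting
        if timed_out:
            return FallState.SOS_SENT   # no movement during the second period
    return state
```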
In another aspect, a method includes acquiring, by a mobile device, sample data generated by a plurality of sensors over a period of time. The plurality of sensors are worn by a user. The sample data includes motion data indicative of motion of the user acquired from one or more motion sensors of the plurality of sensors, and at least one of: location data indicative of a location of the mobile device acquired from one or more location sensors of the plurality of sensors, altitude data indicative of an altitude of the mobile device acquired from one or more altitude sensors of the plurality of sensors, or heart rate data indicative of a heart rate of the user acquired from one or more heart rate sensors of the plurality of sensors. The method also includes determining, by the mobile device, that the user has fallen based on the sample data, and, in response to determining that the user has fallen, generating, by the mobile device, one or more notifications indicating that the user has fallen.
Implementations of this aspect may include one or more of the following features.
In some implementations, the one or more motion sensors may include at least one of an accelerometer or a gyroscope.
In some implementations, acquiring the motion data may include acquiring acceleration data using an accelerometer during a first time interval during the time period. The gyroscope may be disabled during a first time interval. Acquiring the motion data may further include determining that movement of the user during the first time interval exceeds a threshold level based on acceleration data acquired during the first time interval, and in response to determining that the movement of the user during the first time interval exceeds the threshold level, acquiring the acceleration data using the accelerometer and acquiring the gyroscope data using the gyroscope during a second time interval subsequent to the first time interval.
In some implementations, the one or more altitude sensors may include at least one of an altimeter or a barometer.
In some implementations, the one or more position sensors may include at least one of a wireless transceiver or a global navigation satellite system receiver.
In some implementations, determining that the user has fallen can include determining a change in orientation of the mobile device during a period of time based on the motion data, and determining that the user has fallen based on the change in orientation.
In some implementations, determining that the user has fallen may include determining, based on the motion data, an impact experienced by the user during the time period, and determining, based on the impact, that the user has fallen.
In some implementations, determining that the user has fallen can include determining a change in height of the mobile device during a time period based on the height data, and determining that the user has fallen based on the change in height.
In some implementations, determining that the user has fallen may include determining a change in the heart rate of the user during the time period based on the heart rate data, and determining that the user has fallen based on the change in the heart rate.
In some implementations, determining the change in the heart rate of the user during the time period may include determining a rate of decay of the heart rate of the user during the time period.
In some implementations, determining that the user has fallen can include determining an environmental condition at the location of the mobile device based on the location data, and determining that the user has fallen based on the environmental condition.
In some implementations, the environmental condition may be weather.
In some implementations, the mobile device can determine that the user has fallen based on the motion data, the location data, the altitude data, and the heart rate data.
In some implementations, the mobile device can be a wearable mobile device.
In some implementations, generating the one or more notifications can include transmitting the notifications to a communication device remote from the mobile device.
In some implementations, the communication device may be an emergency response system.
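One implementation detail in this aspect, gating the gyroscope on accelerometer-observed movement to save power, can be sketched as follows. The threshold value and window handling are assumptions.

```python
# Hypothetical gyroscope gating: the accelerometer runs continuously, and the
# gyroscope is enabled only when accelerometer data from the first interval
# shows movement above a threshold level.
import numpy as np

MOVEMENT_THRESHOLD_G = 1.2  # hypothetical threshold level (in g)

def should_enable_gyroscope(accel_window: np.ndarray) -> bool:
    """accel_window: (N, 3) accelerometer samples from the first time interval."""
    magnitudes = np.linalg.norm(accel_window, axis=1)
    return bool(np.max(magnitudes) > MOVEMENT_THRESHOLD_G)
```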
In another aspect, a method includes: receiving, by a mobile device, motion data acquired by one or more sensors over a period of time, wherein the one or more sensors are worn by a user; determining, by the mobile device, based on the motion data, an impact experienced by the user during the time period; determining, by the mobile device, one or more characteristics of the user; determining, by the mobile device, a likelihood that the user needs assistance after the impact based on the motion data and the one or more characteristics of the user; and generating, by the mobile device, one or more notifications based on the likelihood.
Implementations of this aspect may include one or more of the following features.
In some implementations, the one or more characteristics of the user may include an age of the user. The likelihood may increase as the user ages.
In some implementations, the one or more characteristics of the user may include a gender of the user. The likelihood may depend on the gender of the user.
In some implementations, the one or more characteristics of the user may include a historical physical activity level of the user. The likelihood may increase as the user's historical physical activity level decreases. In some implementations, the historical physical activity level may indicate a frequency of movement of the user prior to the impact. In some implementations, the historical physical activity level may indicate the intensity of movement of the user prior to the impact.
In some implementations, the one or more characteristics of the user may include a vascular health of the user. The likelihood may increase as the user's vascular health decreases. In some implementations, the vascular health of the user may be determined based on the user's maximal oxygen uptake (VO2 max).
In some implementations, the one or more characteristics of the user may include a historical walking speed of the user. The likelihood may increase as the user's historical walking speed decreases.
In some implementations, generating the one or more notifications can include determining that the likelihood exceeds a threshold level, and in response to determining that the likelihood exceeds the threshold level, generating the one or more notifications.
In some implementations, generating the one or more notifications can include transmitting a first notification to a communication device remote from the mobile device, the first notification including an indication that the user has fallen. The communication device may be an emergency response system.
In some implementations, the mobile device can be a wearable mobile device.
In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device.
In some implementations, at least some of the one or more sensors can be remote from the mobile device.
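The disclosure states only the direction of each characteristic's effect on the likelihood; a concrete scoring rule is left open. The logistic form and weights below are therefore assumptions, sketched purely for illustration.

```python
# Hypothetical likelihood that the user needs assistance after an impact,
# combining user characteristics. Weights and the logistic form are assumed.
import math

def assistance_likelihood(age_years: float,
                          activity_level: float,     # historical physical activity
                          vo2max: float,             # proxy for vascular health
                          walking_speed_mps: float,  # historical walking speed
                          ) -> float:
    # Higher age raises the likelihood; higher activity level, vascular
    # health (VO2 max), and historical walking speed lower it.
    z = (0.05 * age_years
         - 0.8 * activity_level
         - 0.04 * vo2max
         - 0.6 * walking_speed_mps)
    return 1.0 / (1.0 + math.exp(-z))  # squash into a [0, 1] likelihood

# Generate notifications only when the likelihood exceeds a threshold level.
if assistance_likelihood(82.0, 0.2, 20.0, 0.7) > 0.5:
    print("transmit notification to emergency response system")
```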
In another aspect, a method includes obtaining, by a mobile device, a database comprising a plurality of data records. Each data record includes an indication of a respective impact previously experienced by a user of the mobile device, and sensor data generated by one or more first sensors worn by the user during the impact. The method also includes obtaining, by the mobile device, additional sensor data generated by one or more second sensors worn by the user over a period of time; determining, by the mobile device, whether the user has fallen during the time period based on the database and the additional sensor data; and generating, by the mobile device, one or more notifications based on the determination of whether the user has fallen during the time period.
Implementations of this aspect may include one or more of the following features.
In some implementations, the method can include generating additional data records based on the additional sensor data; and including the additional data record in a database.
In some implementations, the database can be stored on a storage device of the mobile device.
In some implementations, the method can include generating, by the mobile device, one or more clusters in the plurality of data records based on similarities between the plurality of data records. One or more clusters may be generated using K-means clustering.
In some implementations, the one or more first sensors and the one or more second sensors can include at least one of an accelerometer or an orientation sensor. For each data record, the sensor data may include one or more first signals indicative of acceleration measured by the accelerometer during an impact associated with the data record and one or more second signals indicative of orientation measured by the orientation sensor during an impact associated with the data record.
In some implementations, the additional sensor data can include one or more additional first signals indicative of acceleration measured by the accelerometer during the time period, and one or more additional second signals indicative of orientation measured by the orientation sensor during the time period. Each data record may include metadata about the impact associated with the data record. The metadata may include at least one of: an indication of a respective time of impact associated with the data record or an indication of a respective day of the week of impact associated with the data record.
In some implementations, determining whether the user has fallen may include determining that the user has experienced an impact based on the additional sensor data.
In some implementations, determining whether the user has fallen may include determining, based on the additional sensor data, a likelihood that the impact corresponds to the user falling in response to determining that the user has experienced the impact.
In some implementations, determining whether the user has fallen may include determining that the user has fallen based on the determined likelihood, and determining a similarity metric indicative of a similarity between the additional sensor data and the sensor data of the one or more clusters in the plurality of data records in response to determining that the user has fallen.
In some implementations, generating one or more notifications can include determining that the similarity metric is less than a threshold level, and generating a first notification to the user to confirm whether the user has fallen in response to determining that the similarity metric is less than the threshold level.
In some implementations, generating one or more notifications can include receiving input from the user indicating that the user has fallen; and in response to receiving the input, transmitting a second notification to a communication device remote from the mobile device. The second notification may include an indication that the user has fallen. The communication device may be an emergency response system.
In some implementations, determining whether the user has fallen during the time period can include determining, based on the additional sensor data, that the user has experienced multiple impacts during the time period; determining that the multiple impacts are similar to each other based on the additional sensor data; and responsive to determining that the user experienced multiple impacts during the time period and determining that the multiple impacts are similar to each other, determining that the user is unlikely to have fallen during the time period.
In some implementations, determining whether the user has fallen during the time period can include determining, based on the additional sensor data, that the user has experienced a periodic sequence of multiple impacts during the time period; and responsive to determining that the user has experienced multiple impacts in a periodic sequence during the time period, determining that the user is unlikely to have fallen during the time period.
In some implementations, determining whether the user has fallen during the time period can include determining smoothness of movement of the user during the time period based on the additional sensor data; and determining whether the user has fallen during the time period based on the smoothness of the movement of the user.
In some implementations, determining whether the user has fallen during the time period can include determining, based on the additional sensor data, an acceleration of the user during the time period relative to a first direction and a second direction orthogonal to the first direction; determining that the acceleration in the first direction is greater than a first threshold and the acceleration in the second direction is less than a second threshold; in response to determining that the acceleration in the first direction is greater than a first threshold and the acceleration in the second direction is less than a second threshold, determining that the user is unlikely to have fallen during the time period. The first direction may be orthogonal to the direction of gravity. The second direction may be parallel to the direction of gravity.
In some implementations, at least some of the one or more first sensors or the one or more second sensors can be disposed on or in the mobile device.
In some implementations, at least some of the one or more first sensors or the one or more second sensors can be remote from the mobile device.
In some implementations, at least some of the one or more first sensors may be identical to at least some of the one or more second sensors.
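To illustrate the clustering of past impact records and the similarity metric discussed in this aspect, here is a small sketch using plain K-means over hypothetical per-impact feature vectors. The feature choice and the distance-based similarity score are assumptions; the disclosure names K-means clustering but not these details.

```python
# Hypothetical clustering of a user's impact records and similarity scoring
# of a new impact against the resulting clusters.
import numpy as np

def kmeans(features: np.ndarray, k: int, iters: int = 50) -> np.ndarray:
    """Plain K-means over (n, d) feature vectors; returns (k, d) centers."""
    rng = np.random.default_rng(0)
    centers = features[rng.choice(len(features), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(features[:, None] - centers[None], axis=2)
        labels = np.argmin(dists, axis=1)  # assign each record to a cluster
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return centers

def similarity_to_history(new_impact: np.ndarray, centers: np.ndarray) -> float:
    """Similarity metric: closer to a cluster of past impacts => more similar."""
    dist = float(np.min(np.linalg.norm(centers - new_impact, axis=1)))
    return 1.0 / (1.0 + dist)  # compared against a threshold level downstream
```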
In another aspect, a method includes: receiving, by a mobile device, motion data acquired by one or more sensors, wherein the one or more sensors are worn by a user; determining, by the mobile device, based on the motion data, that the user has fallen at a first time; determining, by the mobile device, based on the motion data, whether the user has moved between a second time and a third time after the first time; and, upon determining that the user has not moved between the second time and the third time, initiating, at a fourth time after the third time, a communication requesting an emergency response service, the communication including an indication that the user has fallen and a location of the user.
Implementations of this aspect may include one or more of the following features.
In some implementations, the one or more sensors include one or more accelerometers. The motion data may include one or more acceleration signals acquired by the one or more accelerometers. Determining whether the user has moved between the second time and the third time may include determining a change in the one or more acceleration signals between the second time and the third time.

In some implementations, the one or more sensors include one or more orientation sensors. The motion data may include one or more orientation signals acquired by the one or more orientation sensors. Determining whether the user has moved between the second time and the third time may include determining a change in the one or more orientation signals between the second time and the third time.
In some implementations, determining whether the user has moved between the second time and the third time may include determining whether the user is walking between the second time and the third time.
In some implementations, determining whether the user has moved between the second time and the third time may include determining whether the user is standing between the second time and the third time.
In some implementations, the method may include refraining from initiating the communication requesting the emergency response service upon determining that the user has moved between the second time and the third time.

In some implementations, the method can include receiving, by the mobile device, a command from the user to terminate the communication after initiating the communication requesting the emergency response service; and terminating the communication in response to receiving the command to terminate the communication.
In some implementations, the mobile device can be a wearable mobile device.
In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device.
In some implementations, at least some of the one or more sensors can be remote from the mobile device.
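The timed escalation in this aspect (a fall at a first time, a movement check over a later window, then an emergency call) can be sketched as follows. The quiescence test on the acceleration magnitude and its threshold are assumptions.

```python
# Hypothetical movement check between the second and third times, followed by
# initiating an emergency communication at the fourth time if no movement.
import numpy as np

QUIET_BAND_G = 0.05  # hypothetical "no movement" variation around 1 g

def user_moved(accel: np.ndarray, t: np.ndarray, t2: float, t3: float) -> bool:
    """accel: (N, 3) samples in g; t: (N,) timestamps in seconds."""
    window = accel[(t >= t2) & (t <= t3)]
    if len(window) == 0:
        return False  # no samples in the window; treat as no movement
    magnitudes = np.linalg.norm(window, axis=1)
    return bool(magnitudes.max() - magnitudes.min() > QUIET_BAND_G)

def maybe_request_emergency_services(accel, t, t2, t3, location: str) -> None:
    if not user_moved(accel, t, t2, t3):
        # Communication includes the fall indication and the user's location.
        print(f"SOS: user has fallen; location = {location}")
```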
In another aspect, a method comprises: obtaining, by a mobile device, sample data generated by one or more sensors over a period of time, wherein the one or more sensors are worn by a user; determining, by the mobile device, that the user has fallen based on the sample data; determining, by the mobile device, a severity of an injury suffered by the user based on the sample data; and generating, by the mobile device, one or more notifications based on the determination that the user has fallen and the determined severity of the injury.
Implementations of this aspect may include one or more of the following features.
In some implementations, the one or more sensors can include at least one of an accelerometer, an orientation sensor, or an altimeter.
In some implementations, the sample data may include motion data indicative of motion of the user over a period of time. Determining that the user has fallen may comprise determining a first impact experienced by the user during the time period based on the motion data, and determining a change in orientation of a part of the user's body during the time period based on the motion data.
In some implementations, the portion of the user's body can include the user's wrist.
In some implementations, the motion data may include a first signal indicative of an acceleration measured by the accelerometer over a time period and a second signal indicative of an orientation measured by the orientation sensor over the time period.
In some implementations, determining the severity of the injury suffered by the user may include determining the severity of the first impact experienced by the user based on the first signal and the second signal.
In some implementations, determining the severity of the injury suffered by the user may include determining, based on the first signal and the second signal, that the user has experienced multiple impacts, including a first impact, during the time period.
In some implementations, determining the severity of the injury suffered by the user can include determining a first set of characteristics associated with the first impact based on the first signal and the second signal; determining a second set of characteristics associated with a second impact experienced by the user during the time period based on the first signal and the second signal; and determining a similarity between the first set of characteristics and the second set of characteristics.
In some implementations, the motion data may include a third signal indicative of the altitude measured by the altimeter over the period of time. Determining the severity of the injury suffered by the user may comprise determining the distance the user has fallen within the time period based on the third signal. Determining the severity of the injury suffered by the user may comprise determining the severity of the injury based on the determined severity of the impact experienced by the user and the determined distance over which the user has fallen.
In some implementations, generating the one or more notifications can include transmitting the first notification to a communication device remote from the mobile device. The first notification may include an indication that the user has fallen and an indication of the severity of the determined injury suffered by the user. The communication device may be an emergency response system.
In some implementations, the mobile device can be a wearable mobile device.
In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device.
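A rough sketch of the severity determination in this aspect, combining the impact magnitude from the accelerometer with the fall distance from the altimeter, follows. The scoring rule, weights, and cutoff are assumptions, not values from this disclosure.

```python
# Hypothetical injury-severity estimate from impact magnitude and fall distance.
import numpy as np

def injury_severity(accel: np.ndarray, altitude_m: np.ndarray) -> str:
    """accel: (N, 3) samples in g; altitude_m: (M,) altimeter samples in meters."""
    impact_g = float(np.max(np.linalg.norm(accel, axis=1)))
    fall_distance_m = float(np.max(altitude_m) - np.min(altitude_m))
    score = impact_g + 2.0 * fall_distance_m  # assumed weighting of distance
    return "severe" if score >= 4.0 else "minor"  # assumed cutoff
```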
Particular implementations provide at least the following advantages. In some cases, implementations described herein may be used to determine whether a user has fallen and, in response, automatically take appropriate action. For example, a mobile device worn or carried by a user may monitor the user's movements (e.g., walking, running, sitting, lying down, participating in athletic or physical activities, etc.) as the user goes about their daily life. Based on the user's movements, the mobile device can determine whether the user has fallen, and if so, whether the user may require assistance (e.g., physical assistance in standing up and/or recovering from a fall, medical assistance for treating an injury suffered in the fall, etc.). In response, the mobile device may automatically notify others (e.g., a caregiver, doctor, medical responder, or bystander) of the situation so that they may provide assistance to the user. Thus, help can be provided to the user more quickly and efficiently.
Additionally, implementations described herein can be used to more accurately determine whether a user has fallen and/or whether the user may need help. Accordingly, resources can be utilized more efficiently. For example, the mobile device may determine whether the user has fallen and/or whether the user may need help with fewer false positives. Thus, when the user does not need help, the mobile device is less likely to expend computing resources and/or network resources generating notifications and transmitting them to others. In addition, medical and logistical resources can be deployed with greater confidence that the user actually needs them, thereby reducing the likelihood of waste. Accordingly, resources can be used more efficiently and in a manner that improves the effective responsiveness of the system.
Other implementations relate to systems, devices, and non-transitory computer-readable media that include computer-executable instructions for performing the techniques described herein.
The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
Drawings
Fig. 1 is an illustration of an example system for determining whether a user has fallen and/or may need assistance.
Fig. 2A is a diagram illustrating an example location of a mobile device on a user's body.
Fig. 2B is a diagram illustrating an example orientation axis relative to a mobile device.
FIG. 3 is a diagram illustrating an example acceleration signal acquired by a mobile device.
Fig. 4 is an illustration of an example heat map of motion data collected from a sample group of users who did not fall while performing athletic activities, and data points indicative of motion data collected from a sample group of users who fell.
Fig. 5 is an illustration of another example heat map of motion data collected from a sample group of users who did not fall while performing athletic activities, and data points indicative of motion data collected from a sample group of users who fell.
Fig. 6 is an illustration of another example heat map of motion data collected from a sample population of users who did not fall while performing physical activities, and data points indicative of motion data collected from a sample population of users who fell.
FIG. 7 is an illustration of an example technique for determining a duration in which a magnitude of an acceleration vector is below a threshold.
Fig. 8 is an illustration of an example decision tree for determining whether a user exhibits signs of trauma and/or athletic injury.
Fig. 9 is an illustration of an example decision tree for determining that a user has fallen and may need help, or that the user has not fallen or has fallen but does not need help.
Fig. 10 is an illustration of an example scatter plot of pose angle data collected from a sample group of users who have fallen.
FIG. 11A shows a graph of an example acceleration signal acquired by a mobile device within a sliding sample window.
FIG. 11B shows a graph of an example rotation rate signal relative to an inertial frame acquired by a mobile device within a sliding sample window.
FIG. 11C shows a rotation axis signal indicating the instantaneous rotation axis of the mobile device within the sliding sample window.
Fig. 11D shows a total angular displacement signal corresponding to the total angular displacement of the mobile device within the sliding sample window.
FIG. 12 is an illustration of an example state machine for determining whether to transmit a distress call using a mobile device.
Fig. 13 is a schematic representation of a fall detection technique based on user-specific sensitivity.
FIG. 14 is an illustration of an example state machine for selectively enabling and disabling a gyroscope.
Fig. 15 is a schematic representation of a fall detection technique based on multiple types of sensor measurements.
FIG. 16 is a schematic representation of an example of using an accelerometer and a gyroscope in conjunction to determine information about a user's motion.
Fig. 17 is a schematic representation of an example fall classifier.
Fig. 18 is a schematic representation of an example fall sensor fusion module.
Fig. 19 is a schematic representation of an example distress call module.
Figure 20 is a schematic representation of an example fall state machine.
Fig. 21 is an illustration of different types of falls that may be experienced by a user.
Fig. 22A is a schematic representation of an example fall detection module.
Fig. 22B is a flow chart of an example process for determining the likelihood that a user has fallen.
Fig. 23 shows an example line graph of a user's historical physical activity level and the expected severity of a fall experienced by the user.
Fig. 24 shows example acceleration data collected by accelerometers positioned on the body of a user before, during and after a fall.
FIG. 25 is a flow diagram of an example process for generating and transmitting notifications.
FIG. 26 is an illustration of an example state machine for determining whether to generate and transmit a notification.
Fig. 27 is a schematic representation of a fall detection module for determining the severity of an injury to a user due to a fall.
Fig. 28 is a flow chart of an example process for making a user-specific determination of whether a user has fallen.
Fig. 29 is a flow chart of an example process for determining whether a user has fallen and/or may need help.
FIG. 30 is a flow diagram of an example process for determining whether a user has fallen and/or may need help.
Fig. 31 is a flow chart of an example process for determining whether a user has fallen and/or may need help.
Fig. 32 is a flow chart of an example process for determining whether a user has fallen and/or may need help.
Fig. 33 is a flow chart of an example process for determining whether a user has fallen and/or may need help.
Fig. 34 is a flow chart of an example process for determining whether a user has fallen and/or may need help.
Fig. 35 is a flow chart of an example process for determining whether a user has fallen and/or may need help.
Fig. 36 is a flow chart of an example process for determining whether a user has fallen and/or may need help.
Fig. 37 is a block diagram of an example architecture for implementing the features and processes described with reference to fig. 1-36.
Detailed Description
SUMMARY
Fig. 1 illustrates an example system 100 for determining whether a user has fallen and/or may need assistance. The system 100 includes a mobile device 102, a server computer system 104, a communication device 106, and a network 108.
In an example use of the system 100, the user 110 positions the mobile device 102 on his body and goes about his daily life. This may include, for example, walking, running, sitting, lying down, participating in sports or physical activities, or any other physical activity. During this time, the mobile device 102 collects sensor data regarding the movement of the mobile device 102, the orientation of the mobile device 102, and/or other dynamic attributes. Based on this information, the system 100 determines whether the user has fallen and, if so, whether the user may need help.
For example, the user 110 may trip and fall to the ground while walking. After the fall, the user 110 may be unable to stand up on his own and/or may have suffered an injury from the fall. Thus, he may need assistance, such as physical assistance in standing up and/or recovering from the fall, medical assistance to treat an injury suffered in the fall, or other assistance. In response, the system 100 may automatically notify others of the situation. For example, the mobile device 102 can generate and transmit a notification to one or more of the communication devices 106 to notify one or more users 112 (e.g., caregivers, doctors, medical responders, emergency contacts, etc.) of the situation so that they can take action. As another example, the mobile device 102 can generate a notification and transmit it to one or more bystanders in the vicinity of the user (e.g., by broadcasting a visual alert and/or an audible alert) so that they can take action. As another example, the mobile device 102 can generate a notification and transmit it to the server computer system 104 (e.g., to forward the notification to others and/or to store the information for future analysis). Thus, help may be provided to the user 110 more quickly and efficiently.
In some cases, the system 100 may determine that the user 110 has experienced an external force but has not fallen and does not need help. For example, the user 110 may have been bumped (e.g., hit by another person while playing basketball) during an athletic activity, but did not fall as a result of the contact and can recover without the assistance of another person. In such cases, the system 100 can refrain from generating and transmitting notifications to others.
In some cases, the system 100 may determine that the user 110 has fallen, but that the user does not need help. For example, the user 110 may fall as part of a sporting activity (e.g., slipping while skiing), but can recover without the assistance of others. Thus, the system 100 can refrain from generating and transmitting notifications to others.
In some cases, the system 100 may determine that the user 110 has fallen, determine the severity of the injury suffered by the user 110 as a result of the fall, and in response perform one or more actions (or refrain from performing one or more actions). For example, the system 100 may determine that the user 110 has fallen but has not suffered an injury, or has suffered only a minor injury, such that he can recover without the assistance of others. In response, the system 100 may refrain from generating and transmitting a notification to others. As another example, the system 100 may determine that the user 110 has fallen and has suffered a more serious injury, such that he may need help. In response, the system 100 may generate a notification and transmit the notification to others to assist the user.
In some cases, the system 100 may make these determinations based on motion data acquired before, during, and/or after an impact experienced by the user 110. For example, the mobile device 102 may collect motion data (e.g., acceleration signals acquired by a motion sensor of the mobile device 102), and the system 100 may use the motion data to identify a point in time at which the user experienced an impact. Upon identifying the time of impact, the system 100 may analyze the motion data acquired during, before, and/or after the impact to determine whether the user has fallen and, if so, the severity of the injury suffered by the user and/or whether the user may need assistance.
In some cases, the system 100 can also make these determinations in a manner that eliminates or otherwise reduces the incidence of false positives (e.g., incorrect determinations that the user has fallen and/or needs help when the user has not actually fallen and/or does not actually need help). In some implementations, these determinations can be made on a user-specific basis. For example, these determinations may be made for a particular user based on previously collected historical sensor data about the user, historical information about the user's previous activities, personal characteristics of the user (e.g., physical characteristics, age, demographics, etc.), and/or other information specific to the user.
Implementations described herein enable the system 100 to more accurately determine whether a user has fallen and/or whether the user may need help, so that resources may be used more efficiently. For example, the system 100 may determine whether the user has fallen and/or whether the user may need help with fewer false positives. Thus, when the user does not need help, the system 100 is less likely to use computing resources and/or network resources to generate notifications and transmit the notifications to others. In addition, medical and logistical resources can be deployed to assist users with greater confidence that those resources are actually needed, thereby reducing the likelihood of waste. Accordingly, resources may be used more efficiently and in a manner that increases the effective response capabilities of one or more systems (e.g., computer systems, communication systems, and/or emergency response systems).
The mobile device 102 may be any portable electronic device for receiving, processing, and/or transmitting data, including but not limited to a cellular phone, a smartphone, a tablet, a wearable computer (e.g., a smart watch), and so forth. The mobile device 102 is communicatively connected to the server computer system 104 and/or the communication device 106 using the network 108.
The server computer system 104 is communicatively connected to the mobile device 102 and/or the communication device 106 using the network 108. The server computer system 104 is illustrated as a single component. However, in practice, it may be implemented on one or more computing devices (e.g., each computing device including at least one processor such as a microprocessor or microcontroller). The server computer system 104 may be, for example, a single computing device connected to the network 108. In some implementations, the server computer system 104 may include multiple computing devices connected to the network 108. In some implementations, the server computer system 104 need not be located locally with respect to the rest of the system 100, and portions of the server computer system 104 may be located in one or more remote physical locations.
The communication device 106 may be any device for transmitting and/or receiving information transmitted over the network 108. Examples of communication devices 106 include computers (such as desktop computers, notebook computers, server systems, etc.), mobile devices (such as cellular phones, smart phones, tablets, personal data assistants, notebook computers with networking capabilities), telephones, faxes, and other devices capable of transmitting and receiving data from network 108. The communication devices 106 may include devices that operate using one or more operating systems (e.g., Microsoft Windows, Apple OS X, Linux, Unix, Android, Apple iOS, etc.) and/or architectures (e.g., X86, PowerPC, ARM, etc.). In some implementations, one or more of the communication devices 106 need not be located locally with respect to the rest of the system 100, and one or more of the communication devices 106 may be located at one or more remote physical locations.
The network 108 may be any communication network over which data may be transmitted and shared. For example, the network 108 may be a Local Area Network (LAN) or a Wide Area Network (WAN), such as the Internet. As another example, the network 108 may be a telephone or cellular communication network. The network 108 may be implemented using various network interfaces, such as a wireless network interface (such as Wi-Fi, Bluetooth, or infrared) or a wired network interface (such as Ethernet or a serial connection). The network 108 may also include a combination of more than one network and may be implemented using one or more network interfaces.
As described above, the user 110 may position the mobile device 102 on his or her body and move around in his or her daily life. For example, as shown in fig. 2A, the mobile device 102 may be a wearable electronic device or a wearable computer (e.g., a smart watch) secured to the wrist 202 of the user 110. The mobile device 102 may be secured to the user 110, for example, by a band or strap 204 that wraps around the wrist 202. In addition, the orientation of the mobile device 102 may vary depending on where on the user's body it is placed and how it is secured there. For example, the orientation 206 of the mobile device 102 is shown in fig. 2A. Orientation 206 may, for example, refer to a vector projecting from a front edge of the mobile device 102 (e.g., along the y-axis shown in fig. 2B).
While an example mobile device 102 and example locations of the mobile device 102 are shown, it should be understood that these are merely illustrative examples. In practice, the mobile device 102 may be any portable electronic device for receiving, processing, and/or transmitting data, including but not limited to a cellular phone, a smart phone, a tablet, a wearable computer (e.g., a smart watch), and so forth. For example, the mobile device 102 may be implemented in accordance with the architecture 2500 shown and described with respect to fig. 25. Additionally, in practice, the mobile device 102 may be positioned on other locations of the user's body (e.g., arms, shoulders, legs, hips, head, abdomen, hands, feet, or any other location).
As the user 110 goes about his daily life with the mobile device 102 on his body (e.g., walking, running, sitting, lying down, engaging in athletic or physical activity, or any other physical activity), the mobile device 102 collects sensor data regarding the motion of the user 110. For example, using the motion sensor 2510 (e.g., one or more accelerometers) shown in fig. 25, the mobile device 102 may measure the acceleration experienced by the motion sensor 2510 and, accordingly, the acceleration experienced by the mobile device 102. Additionally, using a motion sensor 2510 (e.g., one or more compasses or gyroscopes), the mobile device 102 may measure the orientation of the motion sensor 2510 and, accordingly, the orientation of the mobile device 102. In some cases, the motion sensor 2510 may collect data continuously or periodically over a period of time, or in response to a triggering event. In some cases, the motion sensor 2510 may collect motion data with respect to one or more particular directions relative to the orientation of the mobile device 102. For example, the motion sensor 2510 can collect sensor data regarding the acceleration of the mobile device 102 relative to an x-axis (e.g., a vector projecting from a lateral edge of the mobile device 102, as shown in fig. 2B), a y-axis (e.g., a vector projecting from a front edge of the mobile device 102, as shown in fig. 2B), and/or a z-axis (e.g., a vector projecting from a top surface or screen of the mobile device 102, as shown in fig. 2B), where the x-axis, y-axis, and z-axis refer to Cartesian coordinates in a reference frame fixed to the mobile device 102 (e.g., a "body" reference frame).
For example, as shown in fig. 3, as the user 110 moves, the mobile device 102 may use the motion sensor 2510 to continuously or periodically collect sensor data regarding the acceleration experienced by the motion sensor 2510 with respect to the y-axis over a period of time. The resulting sensor data may be presented in the form of a time-varying acceleration signal 300. In some cases, the acceleration signal 300 may be acquired using a motion sensor 2510 having a sampling bandwidth of 200 Hz, with acceleration samples acquired at a sampling frequency of 800 Hz. In practice, other sampling frequencies and/or sampling bandwidths are possible.
In the above example, the acceleration signal 300 is indicative of the acceleration experienced by the mobile device 102 relative to the y-axis of the mobile device. In some cases, the acceleration signal 300 may also be indicative of accelerations experienced by the mobile device 102 with respect to a plurality of different directions. For example, the acceleration signal 300 may include an x-component, a y-component, and a z-component, which refer to accelerations experienced by the mobile device 102 relative to the x-axis, y-axis, and z-axis, respectively, of the mobile device 102. Each component may also be referred to as a channel of the acceleration signal (e.g., "x-channel," "y-channel," and "z-channel").
The mobile device 102 can analyze the acceleration signal 300 to determine whether the user has fallen. For example, if the user has fallen, the mobile device 102 may experience a relatively strong impact (e.g., when the user's body impacts the ground). Such impacts may be identified based on the magnitude of the acceleration experienced by the mobile device 102 (e.g., the rate of change of the velocity of the mobile device), the magnitude of the jerk experienced by the mobile device (e.g., the rate of change of the acceleration of the mobile device), and the oscillating behavior of the acceleration experienced by the mobile device 102. Each of these parameters may be determined using the acceleration signal 300.
For example, for each channel of the acceleration signal, the following relationship may be used to determine the magnitude of the acceleration experienced by the mobile device 102:

mag = max(abs(a(n))),

where mag is the magnitude of the acceleration for the channel, a(n) is the nth sample of the acceleration signal 300 for the channel, and "max" is the maximum value calculated over a sliding window of n_window samples of the acceleration signal 300. In some cases, n_window may correspond to the number of samples spanning an interval of 0.2 seconds or about 0.2 seconds. For example, if the sampling frequency of the acceleration signal 300 is 800 Hz, n_window may be 160. In practice, other values of n_window are also possible.
Alternatively, for each channel of the acceleration signal, the following relationship may be used to determine the magnitude of the acceleration experienced by the mobile device 102:

mag = max(a(n)) - min(a(n)),

where mag is the magnitude of the acceleration for the channel, a(n) is the nth sample of the acceleration signal 300 for the channel, "max" is the maximum value calculated over a sliding window of n_window samples, and "min" is the minimum value calculated over the same window of n_window samples of the acceleration signal 300. As noted above, in some cases n_window may correspond to the number of samples spanning a time interval of 0.2 seconds or about 0.2 seconds, but in practice other values of n_window are also possible.
If the acceleration signal 300 includes acceleration measurements relative to a single direction (e.g., having a single channel, such as a y-channel), the above-described relationship may be used to determine the magnitude of the acceleration relative to that direction. The resulting value represents the magnitude of the acceleration signal 300. Alternatively, the total energy across all three channels within the window of interest (e.g., n_window) can be used as the total magnitude of the acceleration. For example, one notion of total energy may be calculated as:

mag = sqrt(max(|x|)^2 + max(|y|)^2 + max(|z|)^2).
if the acceleration signal 300 includes acceleration measurements relative to multiple directions (e.g., having multiple channels, such as an x-channel, a y-channel, and a z-channel), the magnitude of the acceleration relative to each direction may be determined one by one using the above relationships, resulting in three individual magnitudes (corresponding to three channels, respectively). The largest magnitude value may be selected to represent the magnitude of the acceleration signal 300.
As another example, for each channel of the acceleration signal, the following relationship may be used to determine the magnitude of the jerk experienced by the mobile device 102:

jerk = max(abs(a(n) - a(n-1)) / ΔT),

where jerk is the magnitude of the jerk for the channel, a(n) is the nth sample of the acceleration signal 300 for the channel, and ΔT is the sampling interval (e.g., the inverse of the sampling frequency) of the acceleration signal 300.
If the acceleration signal 300 includes acceleration measurements relative to a single direction (e.g., having a single channel, such as a y-channel), the relationship described above can be used to determine the magnitude of the jerk relative to that direction. The resulting value represents the magnitude of the jerk of the acceleration signal 300.

If the acceleration signal 300 includes acceleration measurements relative to multiple directions (e.g., having multiple channels, such as an x-channel, a y-channel, and a z-channel), the magnitude of the jerk relative to each direction may be determined individually using the above relationship, resulting in three individual magnitudes (corresponding to the three channels, respectively). The largest magnitude value may be selected to represent the magnitude of the jerk of the acceleration signal 300.
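A corresponding sketch for the jerk computation is shown below. It is illustrative only; the first-difference form is an assumption that follows from defining jerk as the rate of change of acceleration, with ΔT as the sampling interval:

```python
import numpy as np

DELTA_T = 1.0 / 800.0   # sampling interval: inverse of the 800 Hz example rate
N_WINDOW = 160          # sliding window, e.g., 0.2 s of samples

def channel_jerk(a):
    """Largest absolute first difference of acceleration, divided by dT."""
    w = np.asarray(a[-N_WINDOW:])
    return float(np.max(np.abs(np.diff(w))) / DELTA_T)

def signal_jerk(channels):
    """The largest per-channel jerk represents the jerk of the whole signal."""
    return max(channel_jerk(a) for a in channels)
```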
As another example, the oscillating behavior of the acceleration experienced by the mobile device 102 may be determined by identifying a "third zero crossing" of the acceleration signal 300. The third zero crossing refers to the point in time at which the acceleration signal 300 changes from a negative value to a positive value, or vice versa, for the third time within a given oscillation period (e.g., the third "pass" through zero). For example, within the window shown in fig. 3, a third zero crossing of the acceleration signal 300 occurs at point 302. The time between the maximum value of the acceleration signal 300 (e.g., point 304) and the third zero crossing 302 after the maximum value is referred to as the time to the third zero crossing 306. This value can be used as an estimate of the oscillation period of the acceleration signal 300 and, in turn, to estimate the oscillation frequency of the acceleration signal 300. This may be useful, for example, because the impact response of a component of the motion sensor (e.g., a microelectromechanical system [MEMS] component) may be similar to the impact response of a second-order underdamped system. Thus, the period or frequency of the oscillation may be used as an approximation of that response.
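The time to the third zero crossing can be estimated directly from the samples. A rough sketch (hypothetical names; a sign-change test is one simple way to detect a zero crossing):

```python
import numpy as np

def time_to_third_zero_crossing(a, fs=800.0):
    """Seconds from the signal's maximum to the third sign change after it.

    Returns None if fewer than three zero crossings follow the maximum.
    """
    a = np.asarray(a, dtype=float)
    i_max = int(np.argmax(a))
    tail = a[i_max:]
    # A zero crossing occurs where consecutive samples differ in sign.
    crossings = np.flatnonzero(np.sign(tail[1:]) != np.sign(tail[:-1])) + 1
    if crossings.size < 3:
        return None
    return float(crossings[2] / fs)
```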
The magnitude of the acceleration experienced by the mobile device 102, the magnitude of the jerk experienced by the mobile device 102, and the oscillating behavior of the acceleration experienced by the mobile device 102 (e.g., the third zero crossing of the acceleration signal 300) may be used in combination to identify an impact that may indicate that the user has fallen. In some cases, such a determination may be made using a statistical model or a probabilistic model (e.g., a bayesian model).
For example, fig. 4 shows a heat map 400 of motion data collected from a sample population of users as they performed daily activities (e.g., "activities of daily living"). The motion data does not include any motion data resulting from a fall of a user. The heat map 400 compares (i) the magnitude of the acceleration experienced by the mobile device (x-axis) and (ii) the magnitude of the jerk experienced by the mobile device (y-axis) for the sample population. In addition, the distribution of the motion data relative to each axis of the heat map 400 is illustrated by the corresponding cumulative distribution function 402a or 402b. As shown in fig. 4, the motion data is primarily concentrated in a region corresponding to accelerations of relatively low magnitude combined with jerk of relatively low magnitude.
Fig. 4 also shows a plurality of points 404a and 404b representing motion data collected from a sample population of users who have fallen. The circular points 404a represent "radially dominant falls," and the square points 404b represent "tangentially dominant falls." These terms refer to the primary direction of movement of the mobile device 102 during a fall (e.g., the movement of the user's arm and wrist as the user falls) relative to the oscillating movement of the mobile device 102 when the user is walking (e.g., the swinging movement of the user's arm and wrist as the user walks). As shown in fig. 4, each of these points 404a and 404b is located in a region corresponding to acceleration of relatively high magnitude combined with jerk of relatively high magnitude. Thus, an impact that may indicate a fall of the user may be identified, at least in part, by identifying motion data having a sufficiently high magnitude of acceleration (e.g., greater than a particular threshold acceleration value) combined with a sufficiently high magnitude of jerk (e.g., greater than a particular threshold jerk value). In practice, the threshold acceleration value and/or the threshold jerk value may vary (e.g., based on empirically collected sample data).
As another example, fig. 5 illustrates a heat map 500 of motion data collected from a sample population of users as they performed daily activities (e.g., "activities of daily living"). As with the heat map 400 of fig. 4, this motion data also does not include any motion data resulting from a fall of a user. The heat map 500 compares (i) the magnitude of the acceleration experienced by the mobile device (x-axis) and (ii) the time to the third zero crossing of the acceleration signal acquired by the mobile device (y-axis) for the sample population. In addition, the distribution of the motion data relative to each axis of the heat map 500 is illustrated by the corresponding cumulative distribution function 502a or 502b. As shown in fig. 5, the motion data is primarily concentrated in a region corresponding to accelerations of relatively low magnitude combined with a relatively long time to the third zero crossing of the acceleration signal.
Fig. 5 also shows a plurality of points 504a and 504b representing motion data collected from a sample population of users who have fallen. In a manner similar to fig. 4, the circular points 504a represent "radially dominant falls" and the square points 504b represent "tangentially dominant falls." As shown in fig. 5, each of these points 504a and 504b is located in a region corresponding to acceleration of relatively high magnitude combined with a relatively short time to the third zero crossing of the acceleration signal. Thus, an impact that may indicate a fall of the user may be identified, at least in part, by identifying motion data having a sufficiently high magnitude of acceleration (e.g., greater than a particular threshold acceleration value) combined with a sufficiently short time to the third zero crossing of the acceleration signal (e.g., less than a particular threshold time value). In practice, the threshold acceleration value and/or the threshold time value may vary (e.g., based on empirically collected sample data).
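Taken together, the three features above suggest a simple gating check for candidate impacts. A sketch, with all thresholds left as tuning parameters to be derived empirically (as the text notes, the actual values may vary):

```python
def looks_like_impact(mag, jerk, t3zc,
                      mag_thresh, jerk_thresh, t3zc_thresh):
    """Candidate impact: high acceleration magnitude AND high jerk AND a
    short time to the third zero crossing of the acceleration signal."""
    if t3zc is None:  # no oscillation detected after the peak
        return False
    return (mag > mag_thresh
            and jerk > jerk_thresh
            and t3zc < t3zc_thresh)
```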
As described herein, the mobile device 102 may collect motion data and use the motion data to identify a point in time at which the user experienced an impact. Upon identifying the time of impact, the mobile device 102 can analyze the motion data acquired during, before, and/or after the impact to determine whether the user has fallen and, if so, whether the user may need assistance. In some cases, the mobile device 102 may facilitate such analysis by continuously collecting and buffering motion data. The motion data may include acceleration signals (e.g., acceleration signal 300) describing the accelerations experienced by the mobile device 102 and/or orientation signals describing the orientation of the mobile device 102 (e.g., signals representing orientation measurements acquired using a motion sensor 2510, such as a gyroscope or compass). For example, the mobile device 102 may maintain, on an ongoing basis, portions of the acceleration signal and portions of the orientation signal, each portion corresponding to a sliding window of time t. Upon identifying an impact (e.g., using the acceleration signal as described above), the mobile device 102 may retrieve the portions of the buffered motion data corresponding to measurements acquired during, before, and/or after the impact, and analyze those portions to determine the condition of the user.
For example, the mobile device 102 may analyze portions of the acceleration signal and portions of the orientation signal corresponding to measurements acquired prior to the identified impact ("pre-impact" information). Based on this analysis, the mobile device 102 can determine whether the impact is likely to correspond to a user fall. In some cases, a statistical model or a probabilistic model may be used to make this determination.
For example, certain types of activities of the user may be more likely to trigger a fall (e.g., walking, running, going up and down stairs, or other such activities). Thus, the activity classifier may be used to identify whether the user is performing one of these types of activities prior to the impact. If so, the likelihood of the user falling may be higher. Otherwise, the likelihood of the user falling may be lower.
For example, the activity classifier may be implemented by collecting motion data (e.g., acceleration signals and/or orientation signals describing the motion of each individual of the sample population) from the sample population. Additionally, information about the physical activity being performed by the group of samples (e.g., an indication of the physical activity each individual is performing while measuring the athletic data, such as walking, running, cycling, golfing, boxing, skiing, basketball, or any other activity) may be collected. This information may be obtained, for example, from server computer system 104 (e.g., a server computer system used to collect and aggregate data from a plurality of different mobile devices). Based on this information, a particular characteristic or pattern may be determined for each different type of physical activity. Thus, when athletic data is collected from an additional user, the athletic data may be compared to previously collected athletic data to classify the user's activities.
Additionally, activity classifiers may be implemented using various techniques. For example, in some cases, activity classifiers can be implemented using "machine learning" techniques, such as decision tree learning, association rule learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, clustering, bayesian networks, reinforcement learning, representation learning, similarity and metric learning, sparse dictionary learning, genetic algorithms, rule-based machine learning, learning classifier systems, and so forth.
As another example, the user's historical activity level may indicate whether the user is more likely to fall and need help. For example, a historically highly active user (e.g., one who historically exhibits frequent and vigorous movement) may be at a lower risk of falling and needing assistance. However, a frail user who has historically been inactive (e.g., one who historically exhibits infrequent and slight movement) may be at a higher risk of falling and needing help. Thus, an activity level classifier may be used to identify a historical activity level of the user, and the likelihood that the user has fallen and needs help may be adjusted based on the classification.
For example, the activity level classifier may also be implemented by collecting motion data (e.g., acceleration signals and/or orientation signals describing the motion of each individual of the sample population) from the sample population. Additionally, information regarding activity levels of the sample population (e.g., indications of each individual's exercise, physical health, and/or activity ability when measuring exercise data) may be collected. This information may be obtained, for example, from server computer system 104 (e.g., a server computer system used to collect and aggregate data from a plurality of different mobile devices). Based on this information, a particular characteristic or pattern may be determined for each different type of activity level. Thus, when athletic data is collected from an additional user, the athletic data may be compared to previously collected athletic data to classify the activity level of the user.
In addition, activity level classifiers may also be implemented using various techniques. For example, in some cases, activity level classifiers can be implemented using "machine learning" techniques, such as decision tree learning, association rule learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, clustering, bayesian networks, reinforcement learning, representation learning, similarity and metric learning, sparse dictionary learning, genetic algorithms, rule-based machine learning, learning classifier systems, and so forth.
In some cases, the statistical or probabilistic model may consider additional information about the user, such as the user's age, the user's walking speed, the number of steps the user takes per day, the number of calories the user burns per day, the user's maximum effort during the day, the user's gender, the user's vascular health (e.g., quantified using a measure such as the user's maximal oxygen uptake, VO2max), or other characteristics of the user. The likelihood of the user falling may increase or decrease based on each of these characteristics. For example, older users may be at a higher risk of falling and needing assistance, while younger users may be at a lower risk of falling and needing assistance. In addition, a user with a lower maximum walking speed may be at a higher risk of falling and needing help, while a user with a higher maximum walking speed may be at a lower risk of falling and needing help. Likewise, a user with lower vascular health (e.g., a lower VO2max) may be at a higher risk of falling and requiring assistance, while a user with higher vascular health (e.g., a higher VO2max) may be at a lower risk of falling and requiring assistance. VO2max may be measured, for example, by measuring the user's ventilation while the user performs physical activity (e.g., walking on a graded surface or treadmill, exercising on an ergometer, etc.). In some implementations, at least some of these measurements can be acquired using one or more sensors included in and/or communicatively coupled to the mobile device.
As another example, the user's gender may correspond to a higher or lower risk of falling and needing assistance.
In addition, a plurality of different characteristics may be used in combination to determine a fall risk for a user. For example, the statistical or probabilistic model may include a multi-dimensional risk assessment model (e.g., a multi-dimensional risk matrix) where each dimension corresponds to a different characteristic of the user and its contribution to the overall risk of the user. Information about each user may be collected from server computer system 104 (e.g., a server computer system used to collect and aggregate data from multiple different mobile devices).
In some cases, the mobile device 102 may account for other types of movement performed by the user prior to the identified impact. For example, when a user falls, the user often extends his arms to support himself against the ground. Thus, the mobile device 102 may determine whether the user was performing a supporting motion with his arm prior to the identified impact. If so, the likelihood of the user having fallen may be higher. Otherwise, the likelihood of the user having fallen may be lower.
When performing the support motion, the user's hand, and correspondingly the mobile device 102, may accelerate towards the ground for a period of time. Thus, the mobile device 102 may observe a positive acceleration in the inertial z-direction (e.g., a direction perpendicular to the ground or a direction of gravity). By measuring the acceleration in the inertial z-direction, the mobile device 102 can determine the duration of the fall. For example, the duration of a fall may be estimated as the period of time during which the acceleration signal in the inertial z-direction is above a certain threshold.
As another example, when a user falls, he often swings his arms. For example, if a user slips, stumbles, or trips while walking, the user may move his arms irregularly or chaotically before hitting the ground in an attempt to regain his balance. Thus, the mobile device 102 may determine whether the user was swinging his arms prior to the identified impact. If so, the likelihood of the user having fallen may be higher. Otherwise, the likelihood of the user having fallen may be lower.
The swinging motion may be detected, in part, by estimating a "pose angle," or orientation of the mobile device relative to an inertial frame (e.g., the Earth frame). This may be determined, for example, using both an acceleration signal (e.g., to identify the direction of gravity, or inertial z-direction) and an orientation signal (e.g., to determine the orientation of the device relative to the direction of gravity). Using this information, the change in the pose angle of the mobile device 102 over time may also be determined (e.g., the largest difference in pose angle within a sample window, such as n_window). A relatively large change in pose angle over a period of time (e.g., a large change in the orientation of the mobile device 102 and, correspondingly, the orientation of the user's wrist and arm) may indicate that the user is more likely to be performing a swinging motion.
In some cases, the amount of "clutter" (irregularity) in the user's motion may be determined by acquiring an acceleration signal corresponding to the user's motion during a period of time and determining the path length of the acceleration signal. When the user moves in an irregular manner (e.g., performs a swinging motion), the acceleration signal will exhibit a higher degree of variation during that period of time, and the path length of the acceleration signal will therefore be longer. In contrast, when the user's movement is less irregular, the acceleration signal will exhibit a smaller degree of variation during that period of time, and the path length of the acceleration signal will therefore be shorter.
In some cases, the path length of the acceleration signal may be calculated using the following equation:

(path length of acceleration signal) = Σ |a(n) - a(n-1)|,

where a(n) is the nth sample in the acceleration signal. The path length may be determined by summing over the samples acquired within a sliding window (e.g., a 1.4 second window around the fall or impact). In some cases, other techniques may be used to determine the clutter (e.g., by measuring the entropy of the acceleration signal).
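A sketch of the path-length computation over such a window (illustrative only; the 1.4 second window follows the example above, and an 800 Hz sampling rate is assumed):

```python
import numpy as np

def path_length(a, fs=800.0, window_s=1.4):
    """Sum of absolute sample-to-sample changes within a window around the
    impact; longer path lengths indicate more irregular ("cluttered") motion."""
    n = int(window_s * fs)
    w = np.asarray(a[-n:], dtype=float)
    return float(np.sum(np.abs(np.diff(w))))
```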
This "pre-impact" information can be used in conjunction to determine whether the user may have fallen. For example, fig. 6 shows a heat map 600 of athletic data collected from a sample population of users as they perform daily activities (e.g., "active daily living"). The movement data does not include any movement data caused by a fall of the user. The heat map 600 compares (i) changes in the pose angle of the mobile device (x-axis), and (ii) the fall duration of the mobile device for the sample population (y-axis). In addition, the distribution of motion data relative to each axis in the heat map 600 is illustrated by the corresponding cumulative data function 602a or 602 b. As shown in fig. 6, the motion data is primarily concentrated in areas corresponding to relatively low attitude angle changes in combination with relatively low fall durations.
Fig. 6 also shows a plurality of points 604 representing motion data collected from a sample population of users who have fallen, where the motion data was collected immediately before each user fell. As shown in fig. 6, each of these points 604 is located in a region corresponding to a relatively large change in pose angle combined with a relatively long fall duration. Thus, a user's fall may be identified, at least in part, by identifying motion data (collected immediately prior to an identified impact) having a sufficiently large change in pose angle (e.g., greater than a particular threshold angle) combined with a sufficiently long fall duration (e.g., greater than a particular threshold amount of time). In practice, the threshold angle and/or the threshold amount of time may vary (e.g., based on empirically collected sample data).
The mobile device 102 may also analyze portions of the acceleration signal and portions of the orientation signal corresponding to measurements acquired after the identified impact ("post-impact" information). Based on the analysis, the mobile device 102 may determine whether the user may need assistance. In some cases, a statistical model or a probabilistic model may be used to make this determination.
For example, if a user falls and is injured or in distress as a result of the fall, the user may exhibit signs of trauma and/or motor impairment. Signs of trauma and/or motor impairment may be identified based on the acceleration signal and/or the orientation signal. For example, the mobile device 102 may analyze portions of the acceleration signal and portions of the orientation signal corresponding to measurements acquired after the identified impact. Based on the analysis, the mobile device 102 may determine whether the impact may correspond to signs of trauma and/or motor impairment.
The analysis may be performed, for example, using an activity classifier. For example, an activity classifier can be implemented by collecting motion data from a sample population after the individuals of the sample population have fallen (e.g., acceleration signals and/or orientation signals describing the motion of each individual of the sample population after a fall). Additionally, information about the condition of each individual after the fall may be collected (e.g., indications that certain individuals exhibited signs of trauma after a fall, indications that certain individuals exhibited motor impairment after a fall, etc.). This information may be obtained, for example, from the server computer system 104 (e.g., a server computer system used to collect and aggregate data from a plurality of different mobile devices). Based on this information, a particular characteristic or pattern may be determined for each of the different types of conditions (e.g., signs of trauma, no signs of trauma, motor impairment, no motor impairment, etc.). Thus, when motion data is collected from an additional user, the motion data may be compared to the previously collected motion data to classify the user's condition after a fall.
In a manner similar to that described above, the activity classifier can be implemented using various techniques (e.g., decision tree learning, association rule learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, clustering, bayesian networks, reinforcement learning, representation learning, similarity and metric learning, sparse dictionary learning, genetic algorithms, rule-based machine learning, learning classifier systems, and so forth).
The mobile device 102 can also determine the condition of the user after a fall by identifying one or more motion-based metrics. For example, based on the acceleration signal and/or the orientation signal, the mobile device 102 can determine whether the user has taken steps after a fall (e.g., by identifying trends or patterns in the signal that are indicative of the user walking, and/or using sensors such as pedometers). If so, the likelihood that the user will need help is low. Otherwise, the likelihood of the user needing assistance is high.
As another example, based on the acceleration signal and/or the orientation signal, the mobile device 102 can determine whether the user has stood after falling (e.g., by identifying trends or patterns in the signal that represent the user rising from the ground and standing). If so, the likelihood that the user will need help is low. Otherwise, the likelihood of the user needing assistance is high.
As another example, the mobile device 102 can determine the standard deviation of the acceleration vector and the duration of time for which its magnitude is below a threshold. If the magnitude of the acceleration vector is below the threshold for a certain period of time, this may indicate that the user is stationary (e.g., not moving his body). Additionally, if the user is stationary after his fall, this may indicate that the user is unconscious or injured; the likelihood that the user needs help is therefore higher. However, if the standard deviation and the magnitude of the acceleration vector exceed the thresholds during that time period, this may indicate that the user moved his body after the fall (e.g., the user was not stationary); the likelihood that the user needs help is therefore lower.
For example, as shown in fig. 7, the mobile device 102 may acquire a "raw" acceleration signal 700 (e.g., in a manner similar to that described above with respect to the acceleration signal 300). In the example shown in fig. 7, the acceleration signal 700 contains three channels, each referenced to accelerations measured according to different directions (e.g., x-channel, y-channel, and z-channel) relative to the mobile device 102. Additionally, the acceleration signal 700 exhibits a sharp increase 702 indicative of an impact.
Additionally, the mobile device 102 can calculate the vector magnitude (VM) of the acceleration signal 700 to obtain a vector magnitude signal 704. In practice, when the mobile device 102 is static (e.g., not moving), the vector magnitude signal 704 will be 1 (or about 1). As the mobile device 102 moves, the vector magnitude signal 704 will deviate from 1. Thus, the quantity |VM - 1| can be used as a measure of stillness. Similarly, the standard deviation of the acceleration signal can also be used as a measure of stillness.
Additionally, the mobile device 102 can identify time periods 706 after the impact during which the magnitude of the normalized acceleration signal 704 (e.g., |VM - 1|) is less than a particular threshold (indicated by the black rectangles). In some cases, the threshold may be 0.025 g. However, in practice, this threshold may vary from implementation to implementation.
Additionally, the mobile device 102 can sum each of the time periods 706 to determine a cumulative duration 708 for which the magnitude of the normalized acceleration signal 704 is below the threshold. If the cumulative duration 708 is relatively large, the likelihood that the user will need assistance is high. However, if the cumulative duration 708 is relatively small, the likelihood that the user will need assistance is low.
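A sketch of this stillness measure, using the |VM - 1| form and the example 0.025 g threshold (illustrative only; it assumes acceleration channels expressed in g):

```python
import numpy as np

def stillness_duration(x, y, z, fs=800.0, thresh=0.025):
    """Cumulative time (s) during which |VM - 1| stays below the threshold,
    where VM is the vector magnitude of the acceleration in g."""
    vm = np.sqrt(np.asarray(x) ** 2 + np.asarray(y) ** 2 + np.asarray(z) ** 2)
    still = np.abs(vm - 1.0) < thresh
    return float(np.count_nonzero(still)) / fs
```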
As another example, based on the acceleration signal and/or the orientation signal, mobile device 102 may determine a pose angle of mobile device 102. The pose angle may be determined, for example, based on an acceleration signal (e.g., a filtered acceleration signal) and/or a motion orientation of the mobile device 102 (e.g., information derived from a fusion of acceleration data and gyroscope data). The filtered acceleration signal may be, for example, an acceleration signal having one or more channels (e.g., x-channel, y-channel, and/or z-channel) in which high frequency content is removed (e.g., greater than a particular threshold frequency). In some cases, the removed high frequency content may correspond to machine-induced motion (e.g., bus, train, etc.). Based on these information sources, the mobile device 102 may determine a direction in which the wearer's forearm is pointing (e.g., approximately the pose angle of the mobile device 102 in the x-direction).
If the wearer's forearm is pointed toward the horizon for a longer period of time (e.g., greater than a threshold amount of time) and there is no movement below or above the horizon, the user may be more likely to be injured. For example, this may indicate that the user is lying on the floor and not moving his arms up or down. However, if the user's forearms are repeatedly moved above and below the horizontal line in relatively large motions (e.g., 45 °), this may indicate that the user is less injured, especially if the user is determined to be walking up and down stairs.
As another example, based on the acceleration signal and/or the orientation signal, the mobile device 102 can determine the number of times the acceleration signal (e.g., the filtered acceleration signal) crosses a given threshold T. This metric may be referred to as the number of threshold crossings. As mentioned above, the filtered acceleration signal may be, for example, an acceleration signal having one or more channels from which high frequency content has been removed. A symmetric crossing requires that the acceleration signal both rise above T and fall below -T (or vice versa), while an asymmetric crossing counts each time the signal passes above +T or below -T, regardless of whether it subsequently crosses the opposite threshold.
These threshold crossings are representative of the person's movement. For example, a walking user will typically generate symmetric threshold crossings, while a user touching something or rotating the wrist will typically produce asymmetric crossings, and so on. By counting the number of threshold crossings, the mobile device 102 can determine whether the individual is likely to be injured. For example, the greater the number of threshold crossings, the less likely the user is to be injured.
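One possible way to count symmetric and asymmetric threshold crossings of a filtered signal is sketched below (illustrative; the exact counting convention in a real implementation may differ):

```python
import numpy as np

def count_threshold_crossings(a, t):
    """Return (symmetric, asymmetric) crossing counts for threshold t > 0.

    Asymmetric: every pass above +t or below -t. Symmetric: transitions
    between the +t region and the -t region (the signal reaches both).
    """
    a = np.asarray(a, dtype=float)
    above = a > t
    below = a < -t
    passes_up = int(np.count_nonzero(above[1:] & ~above[:-1]))
    passes_down = int(np.count_nonzero(below[1:] & ~below[:-1]))
    asymmetric = passes_up + passes_down
    # Sequence of visited threshold regions: +1 (above +t) or -1 (below -t).
    regions = np.where(above, 1, np.where(below, -1, 0))
    regions = regions[regions != 0]
    symmetric = int(np.count_nonzero(np.diff(regions) != 0))
    return symmetric, asymmetric
```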
In some cases, this "post-impact" information may be used in conjunction to determine whether the user may need help. For example, the mobile device 102 may determine the condition of the user after his or her fall based on sample data collected from a group of samples. For example, the mobile device may collect motion data (e.g., acceleration signals and/or orientation signals) from a group of samples. Additionally, information about the condition of each individual may be collected (e.g., an indication that certain individuals exhibit signs of trauma when measuring athletic data, an indication that certain individuals exhibit athletic impairment when measuring athletic data, etc.). This information may be obtained, for example, from server computer system 104 (e.g., a server computer system used to collect and aggregate data from a plurality of different mobile devices).
Using this information, one or more correlations may be determined between characteristics of a user's movement and the user's condition. For example, based on the sample data collected from the sample population, a correlation may be determined between one or more particular characteristics of a user's movement and the user exhibiting signs of trauma. Thus, if the mobile device 102 determines that the user's motion exhibits similar characteristics, the mobile device 102 may determine that the user also exhibits signs of trauma. As another example, based on the sample data collected from the sample population, a correlation may be determined between one or more particular characteristics of a user's movement and the user exhibiting motor impairment. Thus, if the mobile device 102 determines that the user's motion exhibits similar characteristics, the mobile device 102 may determine that the user also exhibits motor impairment.
These correlations may be determined using various techniques. For example, in some cases, these correlations may be determined using "machine learning" techniques, such as decision tree learning, association rule learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, clustering, bayesian networks, reinforcement learning, representation learning, similarity and metric learning, sparse dictionary learning, genetic algorithms, rule-based machine learning, learning classifier systems, and the like.
For example, fig. 8 illustrates a decision tree 800 generated using sample data collected from a sample population (e.g., "trained" using the sample data). In this example, the sample data includes "positive control" information collected from sleeping users (e.g., simulating the behavior of individuals exhibiting signs of trauma or motor impairment). The sample data also includes "negative control" information collected from unimpaired users performing daily physical activities.
As shown in fig. 8, certain combinations of characteristics indicate that the user exhibits signs of trauma and/or motor impairment. For example, if (i) the duration of time during which the magnitude of the acceleration signal is very low (parameter "duration_low_vm") is between 48.1 seconds and 51.2 seconds, and (ii) the number of steps taken by the user is less than 0.5 steps (parameter "steps"), it may be determined that the user is exhibiting signs of trauma and/or motor impairment.
As another example, if (i) the duration of time during which the magnitude of the acceleration signal is low (parameter "duration_low_vm") is between 51.2 seconds and 58.9 seconds, (ii) the user takes fewer than 0.5 steps (parameter "steps"), and (iii) the user stands for less than 34.7 seconds (parameter "standing"), it may be determined that the user is exhibiting signs of trauma and/or motor impairment. In the example shown in fig. 8, the duration of time the user stands during the post-fall period (the parameter "standing") is indicated by an integer between 0 and 600, which refers to the duration (in tenths of a second) within a total sample window of 60 seconds. For example, if the "standing" value is 347, this indicates that the user stood for 34.7 seconds after falling within the 60 second sample window. In this example, this particular branch indicates possible "sleep" (positive control) if the user stands for between 0 and 34.7 seconds (e.g., the user is standing for less than or equal to about half of the post-fall period of time). In practice, this reflects the expectation that the user will not stand after a fall.
As another example, if (i) the duration of time during which the magnitude of the acceleration signal is low (parameter "duration_low_vm") is between 51.2 seconds and 58.9 seconds, and (ii) the number of steps taken by the user is greater than 0.5 steps (parameter "steps"), it may be determined that the user does not exhibit signs of trauma and/or motor impairment. Although example decision branches and parameter values are shown in fig. 8, these are merely illustrative examples. In practice, the decision branches and/or parameter values of the decision tree may vary depending on the implementation.
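For illustration, the example branches above can be written out directly. This is merely a hand-coded rendering of the figure's illustrative values, not the learned tree itself; branches the example does not specify return None here:

```python
def shows_trauma_or_impairment(duration_low_vm, steps, standing):
    """Example fig. 8 branches: durations in seconds, steps as a count.

    True  -> signs of trauma and/or motor impairment (positive control)
    False -> no such signs
    None  -> outcome not specified by the example branches
    """
    if 48.1 <= duration_low_vm <= 51.2 and steps < 0.5:
        return True
    if 51.2 < duration_low_vm <= 58.9:
        if steps < 0.5:
            return True if standing < 34.7 else None
        return False
    return None
```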
In the above examples, the mobile device 102 determines whether the user may need assistance based on the acceleration signal and the orientation signal. However, the mobile device 102 is not limited to using only these types of information. For example, in some cases, the mobile device 102 may consider additional information, such as a signal describing the heart rate of the user. For example, if the user's heart rate deviates from a particular range (e.g., a "normal" or "healthy" range), the mobile device 102 may determine that the user is more likely to need assistance.
In some cases, "pre-impact" information, impact information, and "post-impact" information may be used in conjunction to determine whether the user has fallen and may need help. In some cases, a decision tree may be used to make this determination.
For example, fig. 9 shows a decision tree 900 for determining whether (i) the user has fallen and may need help (indicated as "fallen" in the decision tree 900) or (ii) the user has not fallen, or has fallen but does not need help (indicated as "not fallen" in the decision tree 900). As shown in fig. 9, certain combinations of characteristics indicate each outcome.
For example, if the user experienced a high impact (e.g., greater than a first threshold) and exhibited signs of trauma and/or motor impairment after the impact, it may be determined that the user has fallen and may need help. As another example, if the user experienced a moderate impact (e.g., less than the first threshold, but greater than a second threshold that is less than the first threshold), was performing some activity (e.g., walking, climbing stairs, etc.) prior to the impact, was performing a swinging or supporting motion prior to the impact, and exhibited signs of trauma and/or motor impairment after the impact, it may be determined that the user has fallen and may need help. Otherwise, it may be determined that the user did not fall, or did fall but does not require help. Although example decision branches are shown in fig. 9, these are merely illustrative examples. In practice, the decision branches of the decision tree may vary depending on the implementation.
In some cases, the fall detector can be a bayesian classifier. For example, the posterior probability of a fall, given a set of features, can be calculated as:

p(fall | f1, f2, f3, ...) = p(f1 | fall) p(f2 | fall) p(f3 | fall) ... p(fall) / p(f1, f2, f3, ...),

where f1, f2, f3, ... are features calculated from the acceleration signal and the orientation signal (e.g., the magnitude of the impact, the jerk, the time to the third zero crossing, the pre-impact activity, the presence of supporting or swinging motion, and/or post-impact characteristics determined from the sample population), and p(fall) is the prior probability of a fall, which may be determined based on age, gender, and/or other biometric information from the sample population. Similarly, the posterior probability of an impact that is not a fall can be calculated as:

p(ADL | f1, f2, f3, ...) = p(f1 | ADL) p(f2 | ADL) p(f3 | ADL) ... p(ADL) / p(f1, f2, f3, ...),

where ADL denotes activities of daily living, which do not include any instances of a user falling. When the ratio p(fall | f1, f2, f3, ...) / p(ADL | f1, f2, f3, ...) is greater than a threshold, the mobile device 102 may determine that a fall has occurred. In practice, the threshold may vary depending on the implementation.
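Under the naive independence assumption in the equations above, the shared evidence term p(f1, f2, f3, ...) cancels when the two posteriors are compared, so only the likelihoods and priors are needed. A sketch with a hypothetical interface (per-feature likelihood functions would be estimated from the sample population):

```python
import math

def fall_vs_adl_ratio(features, lik_fall, lik_adl, p_fall, p_adl):
    """Return p(fall | f1..fn) / p(ADL | f1..fn) for a naive bayesian
    classifier. lik_fall and lik_adl are lists of callables returning
    p(fi | fall) and p(fi | ADL) for each feature fi."""
    log_ratio = math.log(p_fall) - math.log(p_adl)
    for f, lf, la in zip(features, lik_fall, lik_adl):
        log_ratio += math.log(lf(f)) - math.log(la(f))
    return math.exp(log_ratio)

def detect_fall(features, lik_fall, lik_adl, p_fall, p_adl, threshold):
    """Declare a fall when the posterior ratio exceeds the threshold."""
    return fall_vs_adl_ratio(features, lik_fall, lik_adl,
                             p_fall, p_adl) > threshold
```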
Other types of information may also be used to determine whether a user has fallen. For example, as discussed above, erratic motion over a period of time is more indicative of a fall (e.g., a user may swing his arms as he falls). However, periodic motion over a period of time is more indicative of intentional movement by the user (e.g., periodic movement of an arm while waving, chopping food, etc.). Thus, the periodicity of the user's motion can be used to determine whether the user has fallen.
In some cases, the periodicity of the user's motion may be determined based on a first periodicity metric corresponding to the coherent energy of a fast Fourier transform (FFT) of the acceleration signal within a time window (e.g., 10 seconds) after the detected impact. The coherent energy is the sum of the peaks in the spectrum whose quality is above a certain threshold. A larger first periodicity metric may indicate greater periodicity of movement, which may correspond to a lower likelihood that the user has fallen. A smaller first periodicity metric may indicate lesser periodicity of movement, which may correspond to a higher likelihood that the user has fallen.
In some cases, the periodicity of the user's motion may be determined based on a second periodicity metric corresponding to the interquartile range (IQR) of the peak-to-peak intervals of the acceleration signal within a time window (e.g., 10 seconds) after the detected impact. This may be calculated by identifying all peaks around the detected impact that are greater than a certain threshold, and then calculating the IQR of the intervals between adjacent peaks. If the repeated peak intervals are equal (or substantially equal), the IQR will be small, indicating periodic movement. Thus, a smaller second periodicity metric may indicate greater periodicity of movement, which may correspond to a lower likelihood that the user has fallen. In contrast, a larger second periodicity metric may indicate lesser periodicity of movement, which may correspond to a higher likelihood that the user has fallen.
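A sketch of both periodicity metrics follows. It is illustrative only: simple local-maximum peak picking is used, and peak height stands in for the unspecified "quality" measure of a spectral peak.

```python
import numpy as np

def coherent_energy(a, peak_thresh):
    """First periodicity metric: sum of FFT-spectrum peaks above a threshold
    (peak height is used as a stand-in for peak "quality" here)."""
    a = np.asarray(a, dtype=float)
    spectrum = np.abs(np.fft.rfft(a - a.mean()))
    total = 0.0
    for i in range(1, len(spectrum) - 1):
        if (spectrum[i - 1] < spectrum[i] > spectrum[i + 1]
                and spectrum[i] > peak_thresh):
            total += spectrum[i]
    return total

def peak_interval_iqr(a, peak_thresh):
    """Second periodicity metric: IQR of intervals between adjacent peaks
    above a threshold; a small IQR indicates evenly spaced (periodic) peaks."""
    a = np.asarray(a, dtype=float)
    idx = [i for i in range(1, len(a) - 1)
           if a[i - 1] < a[i] > a[i + 1] and a[i] > peak_thresh]
    gaps = np.diff(idx)
    if gaps.size == 0:
        return float("inf")
    q75, q25 = np.percentile(gaps, [75, 25])
    return float(q75 - q25)
```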
In some cases, the mobile device 102 can be used to distinguish between different types of falls. For example, the mobile device 102 can distinguish between trips (e.g., a user falling forward, such as when the user's foot catches on an obstacle), slips (e.g., a user falling backward, such as when the user loses balance on a slippery surface), tumbles (e.g., a user rotating or rolling about an axis of rotation, such as when the user rolls down a hill or a flight of stairs), and/or other types of falls. This may be useful, for example, because different types of falls may have different effects on the user (e.g., some types may be more harmful to the user, while other types may be less harmful). Thus, the mobile device 102 may take more specific actions in response to the particular nature of the user's fall.
As described herein, the mobile device can determine whether the user has fallen based at least in part on a "pose angle" or an orientation of the mobile device relative to an inertial frame (e.g., the earth frame). In some cases, this may be determined using information acquired by an accelerometer (e.g., determined by using an acceleration signal generated by the accelerometer to identify a direction of gravity or an inertial z-direction) and/or information from an orientation sensor (such as a gyroscope) (e.g., determined by using an orientation signal generated by the gyroscope to determine an orientation of the device relative to the direction of gravity).
In some cases, the mobile device can distinguish between a trip and a slip based on the change in the pose angle of the mobile device before and/or after the impact, and by determining whether the measured change in pose angle indicates a trip or a slip. For example, when a user trips, he often moves his arms downward to brace against the impact. Thus, if the mobile device is attached to the user's wrist, it may experience a negative change in pose angle prior to the impact. Additionally, the pose angle may be directed toward the ground during the fall. As another example, when a user slips, he often swings his arms upward in an attempt to regain his balance. Thus, if the mobile device is attached to the user's wrist, it may experience a positive change in pose angle prior to the impact.
These characteristics can be used to distinguish between trips and slips. For example, fig. 10 shows a scatter plot 1000 of pose angle data collected, using a mobile device attached to each user's wrist, from a sample population of users who experienced different types of falls. Each point 1002 indicates a particular type of fall (e.g., a trip, a slip, or another type of fall), a corresponding change in the pose angle (e.g., in degrees) of the mobile device prior to the impact (shown on the x-axis), and a corresponding change in the pose angle of the mobile device after the impact (shown on the y-axis). The sign of the change in pose angle may indicate the direction of motion. For example, a positive change in pose angle may indicate that the final pose angle of the mobile device is higher than the initial pose angle during the time window under consideration. Likewise, a negative change in pose angle may indicate that the final pose angle of the mobile device is lower than the initial pose angle. As shown in fig. 10, when a user experiences a trip, the mobile device moves in a downward direction prior to the impact, because the user is preparing to brace himself, and then moves upward after impacting the surface. Thus, trips often show a negative change in pose angle before the impact and a positive change in pose angle after the impact. In contrast, when a user experiences a slip, the mobile device moves in an upward direction before the impact, because the user swings his arms, and then moves downward after the impact. Thus, slips often show a positive change in pose angle before the impact and a negative change in pose angle after the impact. The mobile device can therefore distinguish between a trip and a slip based on the change in the pose angle of the mobile device before and/or after the impact, and by determining whether the measured change in pose angle indicates a trip or a slip. In addition, if the magnitude of the change in pose angle in either direction is not sufficient to convincingly indicate trip-like or slip-like fall behavior, the fall may be classified in a catch-all category of "other" falls. This may be the case, for example, for a fall in which the user faints due to dehydration; in that case, there may be no apparent swinging or supporting motion prior to the impact, because the user is unconscious.
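A sign-based heuristic corresponding to fig. 10 could look like the sketch below. It is illustrative only: the minimum-change cutoff is a hypothetical tuning parameter, and a real classifier would be fit to the sample data.

```python
def classify_fall_type(pose_change_before, pose_change_after, min_change=10.0):
    """Pose-angle changes in degrees; sign conventions follow fig. 10.

    Trip: arm moves down to brace (negative before), up after the impact.
    Slip: arm swings up (positive before), down after the impact.
    Anything inconclusive is classified as "other".
    """
    if pose_change_before <= -min_change and pose_change_after >= min_change:
        return "trip"
    if pose_change_before >= min_change and pose_change_after <= -min_change:
        return "slip"
    return "other"
```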
In some cases, the system may determine whether the user has tumbled (e.g., fallen in such a way that the user rotates or rolls about an axis of rotation). This may be beneficial, for example, because a tumble may be particularly harmful to the user and may correspond to a higher likelihood that the user needs help. Example tumbles include a user falling and rolling down a staircase or hill, a user falling head over heels (e.g., partially or fully somersaulting), a user rolling over, or another fall involving some degree of rolling or rotation.
As described above, the user 110 may position the mobile device 102 on his body and move around in his daily life (e.g., walk, run, sit, lie, engage in athletic or physical activities, or any other physical activity). During this time, the mobile device 102 collects sensor data regarding the motion of the user 110. For example, using the motion sensor 2510 (e.g., one or more accelerometers) shown in fig. 25, the mobile device 102 may measure the acceleration experienced by the motion sensor 2510 and, accordingly, the acceleration experienced by the mobile device 102. Additionally, using a motion sensor 2510 (e.g., one or more compasses or gyroscopes), the mobile device 102 may measure an orientation of the mobile device 102. In some cases, the motion sensor 2510 may collect data continuously or periodically over a period of time or in response to a triggering event. In addition, the mobile device 102 can determine an acceleration and/or orientation relative to a reference frame (e.g., a body frame) fixed to the mobile device 102 and/or relative to an inertial frame (e.g., an earth frame).
The mobile device 102 can continuously measure the acceleration and orientation of the mobile device within a sliding sample window (e.g., to generate a continuous sample buffer). The acceleration signal may be used to identify the direction of gravity (or inertial z-direction), and the orientation signal may be used to determine the orientation of the mobile device 102 relative to the direction of gravity. Using this information, the mobile device 102 can determine a pose angle of the mobile device 102 (which approximates the orientation of the user 110). Additionally, using this information, the rate of rotation of the mobile device 102 (which approximates the rate of rotation of the user 110) can be determined relative to the body frame and the inertial frame.
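As a rough sketch of how a pose angle might be derived from the signals just described, the snippet below low-pass filters the acceleration signal to estimate the "up" direction and then computes the elevation of a device axis relative to the horizontal plane. The smoothing constant, the axis choice, and the sign convention are assumptions, not details taken from the document.

```python
import numpy as np

def estimate_gravity(accel_samples, alpha=0.05):
    """Estimate the 'up' direction in the body frame by exponentially
    smoothing the acceleration signal (accel_samples: N x 3 array, in g)."""
    g = accel_samples[0].astype(float)
    for a in accel_samples[1:]:
        g = (1.0 - alpha) * g + alpha * a
    return g / np.linalg.norm(g)

def pose_angle_deg(device_axis, up):
    """Elevation of a device axis relative to the horizontal plane, in
    degrees: positive when the axis points up, negative when it points
    toward the ground."""
    return float(np.degrees(np.arcsin(np.dot(device_axis, up))))

# Synthetic example: a device held with its z-axis tilted toward the ground.
accel = np.tile([0.02, -0.71, -0.70], (50, 1))
print(pose_angle_deg(np.array([0.0, 0.0, 1.0]), estimate_gravity(accel)))
```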
For example, graph 1100 in FIG. 11A shows an acceleration signal 1102 acquired within a sliding sample window spanning approximately 2.5 seconds to 7.5 seconds, and graph 1104 in FIG. 11B shows a corresponding rotation rate signal 1106 relative to the inertial frame within the same sliding sample window. In this example, the user fell down a flight of stairs at approximately the 3-second mark, contacted the stairs and/or railing between approximately the 3-second and 6.5-second marks, and began rolling down the stairs at the 6.5-second mark. Graph 1100 includes three different acceleration signals 1102, each corresponding to a different direction (e.g., the x-direction, y-direction, and z-direction in a cartesian coordinate system in a reference or body frame fixed to the mobile device 102). Similarly, graph 1104 includes three different rotation rate signals 1106, each corresponding to a different direction (e.g., the x-direction, y-direction, and z-direction in a cartesian coordinate system fixed to the inertial frame). Although a sliding sample window having an example length is shown, in practice the sliding sample window may have any length (e.g., 1 second, 2 seconds, 3 seconds, 4 seconds, or any other length of time). In addition, reference systems other than a cartesian coordinate system may be used. For example, in some cases, a quaternion representation may be used.
Additionally, using this information, mobile device 102 can determine one or more instantaneous axes of rotation of mobile device 102 within the sliding sample window, an average axis of rotation of the mobile device within the sliding sample window, and a degree of uncertainty (e.g., a variance value, a standard deviation value, or other uncertainty metric) associated with the average axis of rotation. For example, graph 1108 in fig. 11C shows a rotation axis signal 1110 that indicates the instantaneous rotation axis of the mobile device 102 at any given point in time in the sliding sample window relative to the inertial frame. An angle of 0 ° indicates that the mobile device 102 is rotating along an axis parallel to the direction of gravity at a particular point in time, while an angle of 90 ° indicates that the mobile device is rotating along an axis perpendicular to the direction of gravity at a particular point in time. The signal can be used to determine a mean axis of rotation of the mobile device within the sliding sample window and a degree of uncertainty associated with the mean axis of rotation (e.g., by averaging values of the signal and determining a variance, standard deviation, or other uncertainty metric based on the values of the signal).
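A minimal sketch of this step, assuming the rotation rate signal has already been resolved into the inertial frame: each sample's instantaneous axis is the normalized rotation-rate vector, and its angle to the direction of gravity is summarized by a mean and a standard deviation. Averaging the per-sample angles (rather than the axes themselves) is a simplification made here for brevity.

```python
import numpy as np

def axis_angle_stats(gyro_inertial, up):
    """gyro_inertial: N x 3 rotation-rate samples (rad/s) in the inertial
    frame; up: unit vector along gravity. Returns the mean angle (deg)
    between the instantaneous rotation axis and the gravity direction,
    plus its standard deviation as an uncertainty measure."""
    norms = np.linalg.norm(gyro_inertial, axis=1, keepdims=True)
    axes = gyro_inertial / np.maximum(norms, 1e-9)   # unit rotation axes
    cos = np.clip(np.abs(axes @ up), 0.0, 1.0)       # sign of axis ignored
    angles = np.degrees(np.arccos(cos))              # 0 deg = vertical axis
    return angles.mean(), angles.std()

# Example: rotation mostly about a horizontal axis (angles near 90 deg).
gyro = np.array([[3.0, 0.1, 0.05], [2.8, -0.1, 0.0], [3.1, 0.0, 0.1]])
print(axis_angle_stats(gyro, np.array([0.0, 0.0, 1.0])))
```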
In addition, mobile device 102 can determine whether it is rotating about a consistent axis of rotation within the sliding sample window. For example, if the variation or deviation between the instantaneous axes of rotation of mobile device 102 and the average axis of rotation during the sliding sample window is low, mobile device 102 may determine that it is rotating consistently about a particular axis of rotation. However, if that variation or deviation is high, mobile device 102 may determine that it is rotating less consistently about a particular axis of rotation. In some cases, mobile device 102 may determine that it is rotating about a consistent axis of rotation within the sliding sample window if the variation or deviation between the instantaneous axes of rotation and the average axis of rotation is below a particular threshold.
If mobile device 102 determines that it is rotating about a consistent axis of rotation within the sliding sample window, mobile device 102 can determine the angular displacement of mobile device 102 over the sliding sample window. This may be performed, for example, by integrating the rotation rate signal relative to the inertial frame over the sliding sample window. For example, graph 1112 in fig. 11D shows a total angular displacement signal 1114 corresponding to the total angular displacement of mobile device 102 relative to the inertial frame over the sliding time period. The total angular displacement signal 1114 can be obtained, for example, by integrating one or more of the rotation rate signals 1106 shown in FIG. 11B over the sliding sample window.
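A sketch of the integration step, assuming evenly spaced samples and a simple rectangle rule; a production implementation might instead integrate the full orientation.

```python
import numpy as np

def total_angular_displacement_deg(gyro_inertial, dt):
    """Integrate the magnitude of the inertial-frame rotation rate
    (N x 3, rad/s) over the window to estimate the total angular
    displacement, in degrees."""
    rates = np.linalg.norm(gyro_inertial, axis=1)   # rad/s at each sample
    return float(np.degrees(np.sum(rates) * dt))    # rectangle rule

# Example: ~3 rad/s sustained for 1 s (100 samples at 100 Hz) ~= 172 deg.
gyro = np.tile([3.0, 0.0, 0.0], (100, 1))
print(total_angular_displacement_deg(gyro, dt=0.01))
```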
The mobile device 102 can determine whether the user has tumbled based on the total angular displacement and the instantaneous axis of rotation of the mobile device 102. For example, as shown in fig. 11C, the instantaneous axis of rotation of the mobile device 102 is relatively stable from about the 6.5 second mark (corresponding to the time the user begins to roll down the stairs). In addition, during this time, the axis of rotation remains at about 90°, which indicates that the user is rolling about an axis of rotation that is substantially perpendicular to the direction of gravity. In addition, as shown in FIG. 11D, the mobile device 102 experiences a relatively large angular displacement over approximately the same period of time. The combination of these characteristics may indicate a tumble.
In some cases, mobile device 102 may determine that the user has tumbled if one or more instantaneous axes of rotation are greater than a threshold angular value and/or if the angular displacement is greater than a threshold displacement value. In some cases, mobile device 102 may determine that the user has tumbled if one or more instantaneous axes of rotation are relatively large (e.g., greater than a threshold angular value) and/or remain consistent for a threshold period of time (e.g., approximately 90° for 1 second, 2 seconds, 3 seconds, or some other period of time).
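The thresholding just described might look like the following sketch; all of the threshold values are illustrative assumptions.

```python
def is_tumble(mean_axis_angle_deg, axis_angle_std_deg, displacement_deg,
              min_axis_angle=70.0, max_std=15.0, min_displacement=180.0):
    """Hypothetical tumble test: a consistent (low-variance), roughly
    horizontal rotation axis combined with a large total angular
    displacement over the sliding window."""
    return (mean_axis_angle_deg >= min_axis_angle
            and axis_angle_std_deg <= max_std
            and displacement_deg >= min_displacement)

print(is_tumble(88.0, 6.0, 240.0))   # -> True
```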
In some cases, the mobile device 102 may also identify different types of tumbles. For example, if one or more instantaneous axes of rotation are about 90° relative to the direction of gravity, this may indicate that the user has fallen head over heels or rolled sideways (e.g., a somersault or a sideways roll). As another example, if one or more instantaneous axes of rotation are between 0° and 90° with respect to the direction of gravity, this may represent a fall in which the user twists while falling.
In a similar manner as described above, upon identifying that the user has fallen and may need help, mobile device 102 may automatically take appropriate action in response. For example, the mobile device 102 can automatically notify the user or others (e.g., an emergency response system, emergency contacts, or others) of the fall. Similarly, in some cases, upon determining that the user has fallen and may need help, the mobile device 102 may first notify the user before transmitting a message to others (e.g., before transmitting a notification to an emergency response system, emergency contacts, or others).
The implementations described herein are not limited to detecting a user's fall. In some cases, one or more implementations may be used to detect an intentional tumble or spin by the user. For example, the device may use the rotation data to determine whether the user has somersaulted or rolled (e.g., as part of an athletic activity such as gymnastics). As another example, the device may use the rotation data to determine whether the user has performed a flip turn while swimming (e.g., to count the number of laps the user has swum in a pool). Other applications are also possible, depending on the implementation.
Upon recognizing that the user has fallen and may need help, the mobile device 102 may automatically take appropriate action in response. For example, the mobile device 102 can determine that the user has fallen and may need help and, in response, automatically notify the user or others of the situation. For example, the mobile device 102 may display a notification to the user to inform the user that he has fallen and may need help. Additionally, the mobile device 102 can transmit a notification to a remote device (e.g., one or more of the server computer system 104 and/or the communication device 106) to notify others of the user's condition. This may include, for example, notifying an emergency response system, a computer system associated with medical personnel, a computer system associated with the user's caregiver, bystanders, etc. The notification may include, for example, audible information (e.g., sound), textual information, graphical information (e.g., images, colors, patterns, etc.), and/or tactile information (e.g., vibrations). In some cases, the notification may be transmitted in the form of an email, an instant chat message, a text message (e.g., a short message service [ SMS ] message), a telephone message, a facsimile message, a radio message, an audio message, a video message, a tactile message (e.g., one or more bumps or vibrations), or another message used to convey information.
In some cases, the mobile device 102 may transmit the message to another system using predetermined contact information. For example, a user of the mobile device 102 may provide contact information about an emergency contact, such as a phone number, an email address, a username in an instant chat service, or some other contact information. The mobile device 102 may generate a message having a compatible data format (e.g., an audio telephony message, a video telephony message, a text message, an email message, a chat message, or some other message) and transmit the message to the emergency contact using the provided contact information (e.g., using one or more of the server computer system 104 and/or the communication device 106).
In some cases, upon determining that the user has fallen and may need help, the mobile device 102 may first notify the user before transmitting a message to others (e.g., before transmitting a notification to an emergency response system, emergency contacts, or others). In response, the user may instruct the mobile device 102 not to notify others (e.g., if the user does not need help). Alternatively, the user may explicitly instruct the mobile device 102 to notify others (e.g., if the user needs help). In some cases, if the user does not respond to the notification, the mobile device 102 may automatically notify others (e.g., if the user is incapacitated and unable to respond). In some cases, the mobile device 102 may notify the user a number of different times, and if no response is received from the user after a time period (e.g., 25 seconds or some other time period), automatically notify others to obtain assistance.
In some cases, mobile device 102 can use a state machine to determine whether to generate a notification and transmit it to others. The state machine may specify that the mobile device send a fall alert when a brief period of inactivity (a behavior that typically occurs after a fall) is observed. The mobile device then detects incapacitation by checking for a "long lie" by the user (e.g., a period of time in which the user is not moving or getting up). If incapacitation is detected, the mobile device may automatically transmit a distress call to a third party and/or instruct another device to transmit the distress call. The user may also cancel the distress call (e.g., if the user decides he does not need help).
An example state machine 1200 is shown in fig. 12. In this example, the mobile device begins in a "normal" state 1202 (e.g., a low alert state of the mobile device). The state machine also includes a "possible fall" state 1204 (e.g., an elevated alert state entered when a possible fall by the user is detected), "wait" states 1206a and 1206b (e.g., states in which the mobile device waits for additional information, such as possible input by the user or movement of the user), "alert" states 1208a and 1208b (e.g., states in which the mobile device alerts the user that a fall may have occurred and that a distress call may be sent to a third party), a "cancel" state 1210 (e.g., a state in which an impending distress call is cancelled), and an "SOS" state 1212 (e.g., a state in which a distress call is transmitted). The mobile device may transition between the states based on certain signs of a detected fall (e.g., as described herein), a detected period of inactivity (e.g., no movement by the user), and/or input by the user.
For example, the mobile device begins in the "normal" state 1202. Upon detecting signs of a fall (e.g., a combination of sensor measurements and other information indicative of a fall), the mobile device transitions to the "possible fall" state 1204. Upon detecting a quiescent period T_Q after the fall, the mobile device transitions to the "alert" state 1208b, alerts the user that a fall was detected, and notifies the user that a distress call may be sent to a third party (e.g., an emergency responder). The mobile device transitions to the "wait" state 1206b and waits for possible movement by the user. If no user movement is detected during a "timeout" period (e.g., 30 seconds) after transmission of the fall alert, and a continuous quiescent period (e.g., T_LL = 10 seconds) is detected during that time, the mobile device transitions to the "SOS" state 1212 and transmits the distress call. The mobile device then returns to the "normal" state 1202. This may be useful, for example, if the user has fallen and is incapacitated for a long time. The mobile device may automatically seek help for the user even without input from the user.
As another example, the mobile device begins in the "normal" state 1202. Upon detecting signs of a fall, the mobile device transitions to the "possible fall" state 1204. Upon detecting a quiescent period T_Q after the fall, the mobile device transitions to the "alert" state 1208b, alerts the user that a fall has been detected, and informs the user that a distress call may be sent to a third party. The mobile device transitions to the "wait" state 1206b and waits for possible movement by the user. If user movement is detected before a "timeout" period (e.g., 30 seconds) has elapsed after transmission of the fall alert, the mobile device transitions to the "cancel" state 1210 and cancels the distress call. The mobile device then returns to the "normal" state 1202. This may be useful, for example, if the user has fallen and has indicated that he or she does not need help. In this case, the mobile device does not automatically seek help for the user.
As another example, the mobile device begins in the "normal" state 1202. Upon detecting signs of a fall, the mobile device transitions to the "possible fall" state 1204. Upon detecting some type of movement after the fall (e.g., a walking movement, a standing movement, a highly dynamic movement, or any other movement that indicates recovery of the user), the mobile device transitions to the "wait" state 1206a. Upon detecting a quiescent period T_Q after the user has stopped moving, the mobile device transitions to the "alert" state 1208a and alerts the user to the detected fall. The mobile device then transitions to the "cancel" state 1210, and no distress call is transmitted. The mobile device then returns to the "normal" state 1202. This may be useful, for example, if the user has fallen but shows signs of recovery. The mobile device may alert the user to the fall, but in this case help is not automatically sought for the user.
As another example, the mobile device begins in the "normal" state 1202. Upon detecting signs of a fall, the mobile device transitions to the "possible fall" state 1204. Upon detecting some type of movement after the fall (e.g., a walking movement, a standing movement, a highly dynamic movement, or any other movement that indicates recovery of the user), the mobile device transitions to the "wait" state 1206a. If the quiescent period T_Q is not detected after the lapse of 25 seconds, the mobile device returns to the "normal" state 1202. This may be useful, for example, if the user has fallen but shows signs of recovery and continues to move for a long time after falling. In this case, the mobile device may avoid alerting the user or automatically seeking help for the user.
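The transitions walked through above can be summarized in a compact sketch. This simplification collapses the duplicated "wait"/"alert" states of fig. 12 into one of each, and the event names ("fall_sign", "quiescent", "movement", "long_lie") are assumed labels for the upstream detector outputs rather than anything defined in the figure.

```python
from enum import Enum, auto

class State(Enum):
    NORMAL = auto()
    POSSIBLE_FALL = auto()
    WAIT = auto()
    ALERT = auto()
    CANCEL = auto()
    SOS = auto()

class FallAlertStateMachine:
    """Simplified sketch of the fig. 12 logic. Events:
      'fall_sign' - sensor signature of a possible fall
      'quiescent' - no movement for T_Q
      'movement'  - recovery-like movement detected
      'long_lie'  - no movement for T_LL after the alert
    """
    def __init__(self):
        self.state = State.NORMAL

    def on_event(self, event):
        s, e = self.state, event
        if s == State.NORMAL and e == "fall_sign":
            self.state = State.POSSIBLE_FALL
        elif s == State.POSSIBLE_FALL and e == "quiescent":
            self.state = State.ALERT        # alert user, arm the SOS timer
        elif s == State.POSSIBLE_FALL and e == "movement":
            self.state = State.WAIT         # signs of recovery; keep watching
        elif s == State.WAIT and e == "quiescent":
            self.state = State.ALERT
        elif s == State.ALERT and e == "movement":
            self.state = State.CANCEL       # user responded; no distress call
        elif s == State.ALERT and e == "long_lie":
            self.state = State.SOS          # incapacitated; place the call
        elif s in (State.CANCEL, State.SOS):
            self.state = State.NORMAL       # done; return to low-alert state
        return self.state

sm = FallAlertStateMachine()
for ev in ["fall_sign", "quiescent", "long_lie"]:
    print(sm.on_event(ev))                  # POSSIBLE_FALL, ALERT, SOS
```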
Although time values are shown in fig. 12, these are merely illustrative examples. In practice, one or more of the time values may vary depending on the implementation. In some cases, one or more of the time values may be adjustable parameters (e.g., parameters empirically selected to distinguish between different types of events or conditions).
In some cases, the response sensitivity of the mobile device 102 may vary depending on the characteristics of the user. For example, the mobile device 102 can determine a probability that the user has fallen and may need help. If the probability is greater than a threshold level, the mobile device can automatically notify the user and/or others that the user has fallen. The threshold level may vary on a per user basis. For example, if the user is at a higher risk of falling, the threshold level may be lower (e.g., so that the mobile device 102 is more likely to notify the user and/or others of the fall). However, if the user is at a lower risk of falling, the threshold level may be higher (e.g., so that the mobile device 102 is less likely to notify the user and/or others of the fall). The threshold level for each user may be varied based on one or more behavioral and/or demographic characteristics of the user, such as the user's age, activity level, walking speed (e.g., maximum walking speed observed over a period of time), or other factors.
For example, fig. 13 shows a schematic representation 1300 of a fall detection technique based on a user-specific sensitivity. The mobile device receives motion data 1302 and calculates behavioral characteristics of the user's motion (e.g., a swinging motion, a supporting motion, a chaotic motion, a periodic motion, or other characteristics as described herein) (1304). Additionally, the mobile device determines a sensitivity threshold based on the behavior and/or demographic characteristics of the user (1306). The mobile device determines whether the user has fallen and/or whether a distress call is transmitted based on the determined features and thresholds (e.g., using the fall detector 1308 to perform one or more of the techniques described herein).
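As an illustration of a user-specific sensitivity threshold, the sketch below lowers the notification threshold for users at higher fall risk. The specific features, cut-off values, and weights are invented for illustration and are not taken from the document.

```python
def notification_threshold(age, max_walking_speed_mps, base=0.8):
    """Hypothetical per-user sensitivity: users at higher fall risk get a
    lower threshold, so a fall notification is triggered more readily."""
    risk = 0.0
    if age >= 65:
        risk += 0.2                      # older users: higher fall risk
    if max_walking_speed_mps < 0.8:
        risk += 0.1                      # slow gait: higher fall risk
    return max(0.3, base - risk)

print(notification_threshold(age=72, max_walking_speed_mps=0.7))  # -> 0.5
```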
As described herein, the mobile device can determine whether the user has fallen based at least in part on the "attitude angle" or orientation of the mobile device relative to the inertial frame. In some cases, this may be determined using information acquired by an accelerometer and/or information from an orientation sensor (such as a gyroscope).
In practice, the dynamic range of the accelerometer may vary depending on the implementation. For example, the accelerometer may have a dynamic range of 16 g. As another example, an accelerometer may have a dynamic range of 32g (e.g., to detect a greater range of acceleration).
In some cases, the accelerometer and gyroscope may each acquire measurements according to the same sampling rate (e.g., 200Hz, 400Hz, 800Hz, or some other frequency). In some cases, the accelerometer and gyroscope may each acquire measurements according to different sampling rates. For example, an accelerometer may obtain measurements according to a higher sampling rate (e.g., 800Hz), while a gyroscope may obtain measurements according to a lower sampling rate (e.g., 200 Hz). This may be useful, for example, when selectively reducing the power consumption of one sensor (e.g., a sensor that consumes more power during operation) relative to other sensors to improve the power efficiency of the mobile device. In some cases, the sampling rate of the accelerometer and/or gyroscope may be dynamically adjusted during operation. For example, the sampling rate of the accelerometer and/or gyroscope may be selectively increased during certain time periods and/or in response to certain conditions (e.g., greater user motion) and decreased during certain other time periods and/or in response to certain other conditions (e.g., less user motion).
In some cases, one of the accelerometer or gyroscope may be used to acquire measurements while the other sensor is disabled (e.g., so that it does not collect measurements). The disabled sensors may be selectively activated based on measurements obtained from the active sensors. This may be useful, for example, when reducing power consumption of the mobile device (e.g., by operating only one of the accelerometer or gyroscope during certain time periods, and selectively operating both the accelerometer or gyroscope in response to certain conditions).
For example, the mobile device may disable a gyroscope and use an accelerometer to acquire acceleration measurements over a period of time. If the measured acceleration is greater than a certain threshold level (e.g., the root mean square [ RMS ] of the energy of the accelerometer's acceleration signal is above a threshold energy level), the mobile device may activate the gyroscope and collect orientation information. Thus, the gyroscope is selectively activated in response to the accelerometer detecting "significant" motion.
In some cases, the gyroscope may be disabled if significant motion is no longer detected. For example, if the RMS energy of the accelerometer's acceleration signal remains below a threshold energy level for a particular period of time (e.g., a predefined time interval), the mobile device may disable the gyroscope and continue to operate the accelerometer. In some cases, the gyroscope may be disabled a particular period of time after the gyroscope is activated (e.g., after a predefined time interval has elapsed since the gyroscope was turned on). In some cases, the gyroscope may be disabled if the RMS energy of the accelerometer's acceleration signal is less than a threshold energy level for a first time interval, or if a second time interval has elapsed since the gyroscope was activated, whichever occurs first. In practice, these time intervals may vary depending on the implementation.
In some cases, the gyroscope may be selectively enabled and disabled according to a state machine. Fig. 14 shows an example state machine that includes a "gyroscope off" state 1402 (corresponding to a disabled state of the gyroscope), a "gyroscope on" state 1404 (corresponding to an enabled state of the gyroscope), and a "wait" state 1406 (corresponding to a state in which the mobile device waits for further information before adjusting operation of the gyroscope).
The mobile device begins in the "gyroscope off" state 1402, in which the gyroscope is disabled. Upon detecting a non-quiescent period based on the acceleration signal acquired from the accelerometer, the mobile device transitions to the "gyroscope on" state 1404 and enables the gyroscope. Upon detecting stillness and low rotation based on the acceleration signal and the orientation signal from the gyroscope, the mobile device transitions to the "wait" state 1406. If the stillness and low rotation continue, the mobile device periodically increments a counter over time. If the counter exceeds a threshold, the mobile device returns to the "gyroscope off" state 1402, disables the gyroscope, and resets the counter. However, if non-stillness and/or a sufficiently high degree of rotation is detected, the mobile device returns to the "gyroscope on" state 1404, keeps the gyroscope in an enabled state, and resets the counter. In this manner, the mobile device selectively enables the gyroscope in response to detecting movement, and disables it after a period of stillness and low rotation.
In some cases, the "still" condition shown in FIG. 14 may be a Boolean value that is true when the following formula is satisfied:

thr1 ≤ k1*VM + k2*dVM < thr2

where VM is the magnitude of the acceleration signal, dVM is the rate of change of the magnitude of the acceleration signal, and thr1 and thr2 are adjustable thresholds.
In some cases, the "still and low rotation" condition shown in FIG. 14 may be a Boolean value that is true when the following formula is satisfied:

(thr1 + δ ≤ k1*VM + k2*dVM < thr2 - δ) AND (rot < thr3)

where VM is the magnitude of the acceleration signal, dVM is the rate of change of the magnitude of the acceleration signal, rot is the rate of rotation (e.g., determined based on the gyroscope), δ is an adjustable offset value, and thr1, thr2, and thr3 are adjustable thresholds.
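Translated directly into code, the two conditions might look like the following sketch; the function names are mine, and the coefficients k1 and k2, the thresholds, and the offset δ are the adjustable parameters named above.

```python
def is_still(vm, dvm, k1, k2, thr1, thr2):
    """'Still' condition: thr1 <= k1*VM + k2*dVM < thr2, where VM is the
    acceleration magnitude and dVM its rate of change."""
    score = k1 * vm + k2 * dvm
    return thr1 <= score < thr2

def is_still_low_rotation(vm, dvm, rot, k1, k2, thr1, thr2, thr3, delta):
    """'Still and low rotation' condition: the stillness band is narrowed
    by an offset delta, and the gyroscope rotation rate must stay below
    thr3."""
    score = k1 * vm + k2 * dvm
    return (thr1 + delta <= score < thr2 - delta) and (rot < thr3)

# Example with made-up parameter values (all of these are adjustable).
print(is_still(vm=1.0, dvm=0.02, k1=1.0, k2=5.0, thr1=0.9, thr2=1.2))  # True
```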
In some cases, falls may be detected based on a combination or "fusion" of a plurality of different types of sensor measurements. For example, a fall may be detected based on acceleration measurements (e.g., acquired by an accelerometer), orientation measurements (e.g., acquired by a gyroscope), air pressure measurements (e.g., acquired by a pressure sensor or barometer), altitude measurements (e.g., acquired by an altimeter, a pressure sensor, an accelerometer, or other sensor), heart rate measurements (e.g., acquired by a heart rate sensor), and/or other types of measurements.
For example, fig. 15 shows a schematic representation 1500 of a fall detection technique based on multiple types of sensor measurements. An accelerometer is used to detect a severe impact (step 1502). For example, a severe impact may be detected based on the acceleration magnitude a_mag and the jitter magnitude j_mag measured by the accelerometer within a sliding window. The magnitude of the acceleration within the sliding window may be calculated using the following formula:
a_mag = sqrt(max(|x|)^2 + max(|y|)^2 + max(|z|)^2)
where x, y and z are the x, y and z components of the acceleration signal, respectively, and max is taken within a window of 0.2 seconds.
The magnitude of jitter within the sliding window may be calculated using the following equation:
j_mag = max(sqrt(dx^2 + dy^2 + dz^2))
where dx, dy, and dz are the derivatives of the x, y, and z components of the acceleration signal, respectively, and max is taken over a window of 0.2 seconds.
If a_mag is greater than a threshold thr1 and j_mag is greater than a threshold thr2, the mobile device obtains gyroscope measurements (step 1504), altitude or height measurements (step 1506), and heart rate information (step 1508), and determines whether a fall has occurred based on the measurements (step 1510).
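The impact test of step 1502 can be sketched as below, applying the a_mag and j_mag formulas above over a 0.2-second window; the buffer layout and the threshold values are assumptions.

```python
import numpy as np

def severe_impact(accel, dt, thr1, thr2, window_s=0.2):
    """Apply the a_mag / j_mag test over the last `window_s` seconds of a
    body-frame acceleration buffer (accel: N x 3 array, in g)."""
    n = max(2, int(round(window_s / dt)))
    window = accel[-n:]
    # a_mag: combine per-axis maxima of |acceleration| within the window.
    a_mag = float(np.sqrt(np.sum(np.max(np.abs(window), axis=0) ** 2)))
    # j_mag: maximum magnitude of the per-axis derivative (jerk).
    jerk = np.diff(window, axis=0) / dt
    j_mag = float(np.max(np.sqrt(np.sum(jerk ** 2, axis=1))))
    return a_mag > thr1 and j_mag > thr2

# Example: a quiet buffer ending in a sharp spike (thresholds invented).
accel = np.vstack([np.tile([0.0, 0.0, 1.0], (38, 1)),
                   [[0.5, 0.5, 3.0], [-0.5, -0.5, 2.0]]])
print(severe_impact(accel, dt=0.005, thr1=2.0, thr2=100.0))  # -> True
```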
For example, accelerometer and gyroscope measurements can be used to determine the direction of impact and attitude angle of the mobile device before, during, and/or after the impact (e.g., as described herein). In some cases, accelerometer measurements may be used to estimate the altitude or height of the mobile device (e.g., when the mobile device is stationary). In some cases, accelerometer and gyroscope measurements may be combined to estimate an altitude or height of the mobile device (e.g., when the mobile device is in motion).
As another example, a pressure sensor may be used to detect a multi-level fall (e.g., a user falling from a ladder). As another example, a heart rate sensor may be used to detect changes in heart rate, such as an increase in heart rate (e.g., due to a fight-or-flight response) or a heart rate decay curve (e.g., the decay in a person's heart rate after a fall may have different characteristics, such as a smaller time constant, than the decay in heart rate after the end of physical exercise). As another example, an accelerometer may be used to detect altitude or height (e.g., when the device is stationary).
FIG. 16 illustrates an example of using an accelerometer and a gyroscope in conjunction to determine information about user motion (e.g., as part of step 1504 illustrated in FIG. 15). The accelerometer and gyroscope generate accelerometer and gyroscope measurements (step 1602). Based on this information, a gyroscope controller may selectively turn off the gyroscope during certain periods of time (e.g., as described with respect to fig. 14) (step 1604). The accelerometer and gyroscope measurements are used in combination (e.g., "fused") to obtain information about the device, such as the height or altitude of the device (step 1606). In addition, this information may be used to determine other information about the device, such as the attitude angle and the direction in which the mobile device experienced an impact (step 1608). The mobile device determines whether a fall has occurred based on the measurements (step 1610).
Fig. 17 shows an example fall classifier 1700. The fall classifier 1700 can be used to determine whether a user has fallen and, if so, the type or nature of the fall. The fall classifier 1700 can be implemented, for example, using one or more of the components of the system 100 shown in fig. 1 (e.g., the mobile device 102).
The fall classifier 1700 receives input indicative of the measured movement of the user and outputs information indicative of whether the user has fallen and, if so, the type or nature of the fall. For example, as shown in diagram 1700, the fall classifier receives acceleration data 1702 indicative of accelerations experienced by a mobile device worn by a user (e.g., measured using an accelerometer), and gyroscope data 1704 indicative of the orientation of the mobile device (e.g., measured using a gyroscope).
The acceleration data 1702 and the gyroscope data 1704 are combined or "fused" together by a sensor fusion module 1706 (e.g., using one or more of the techniques described herein) and considered together by the fall classifier 1700. In some cases, the acceleration data 1702 and gyroscope data 1704 can be combined with respect to one or more (e.g., six) spatial axes.
The acceleration data 1702 and gyroscope data 1704 may be used in conjunction to determine information about the characteristics of the user's motion. This data may be used to determine the altitude 1708 of the mobile device, for example.
Additionally, the acceleration data 1702 and the gyroscope data 1704 may be input into a feature extraction module 1710 that identifies one or more features or characteristics of the acceleration data 1702 and the gyroscope data 1704. The feature extraction module 1710 may perform one or more of the techniques described herein. For example, the feature extraction module 1710 may determine a wrist angle of the user (e.g., by determining the pose angle of the mobile device when worn on the user's wrist).
Additionally, a behavior modeling module 1712 may be used to determine the behavior of the user. The behavior of the user may be modeled using one or more of the techniques described herein. For example, based on changes in the pose angle of the mobile device 102, the behavior modeling module 1712 may determine behavior information 1714, such as whether the user is performing a support motion (e.g., extending their arm to stop a forward fall), a balance motion (e.g., extending their arm to restore balance), a swing motion (e.g., swinging their arm during and after an impact), or another motion. In some cases, a support motion may be detected based on features such as the wrist traversing a negative arc length prior to impact and the wrist pointing toward the ground at the moment of impact. In some cases, a balance motion may be detected based on features such as the wrist traversing a positive arc length as the user attempts to restore balance. In some cases, a swing motion may be detected based on features such as the wrist making one or more quick reversals in motion (e.g., as part of a grip reflex, or due to repeated secondary impacts with the ground). The behavior information 1714 can be input into a classification module 1716 to aid in fall detection and classification.
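A heuristic sketch of these behavior labels, assuming upstream code has already computed the pre-impact wrist arc length, the wrist angle at impact, and a count of quick motion reversals; the thresholds are illustrative, not taken from the document.

```python
def classify_arm_motion(pre_impact_arc_deg, wrist_angle_at_impact_deg,
                        quick_reversals):
    """Heuristic labels from the features described above."""
    if quick_reversals >= 2:
        return "swing"      # rapid reversals during/after the impact
    if pre_impact_arc_deg <= -15.0 and wrist_angle_at_impact_deg < 0.0:
        return "support"    # arm driven down, wrist toward the ground
    if pre_impact_arc_deg >= 15.0:
        return "balance"    # arm swung upward to restore balance
    return "other"

print(classify_arm_motion(-30.0, -40.0, 0))   # -> "support"
```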
The fall classifier 1700 can also analyze various aspects of the acceleration data 1702 independently of the gyroscope data 1704. For example, the acceleration data 1702 may be input into a feature extraction module 1718 that identifies one or more features or characteristics of the acceleration data 1702. The feature extraction module 1718 may perform one or more of the techniques described herein. For example, the feature extraction module 1718 may determine impact information 1720, such as the magnitude of the impact experienced by the user, the movement made by the user before the impact, and the movement made by the user after the impact. As another example, the feature extraction module 1718 may determine the degree of chaos in the user's motion over a period of time.
Information from the impact detector 1722 and the behavior modeling module 1712 can be used to determine whether the user has fallen and, if so, the type or nature of the fall. For example, based on input from the impact detector 1722 and the behavior modeling module 1712, the classification module 1716 may determine that the user has slipped, tripped, tumbled, or experienced some other type of fall. As another example, based on input from the impact detector 1722 and the behavior modeling module 1712, the classification module 1716 may determine that the user has experienced an impact but has not fallen. As another example, based on input from the impact detector 1722 and the behavior modeling module 1712, the classification module 1716 may determine that the user has fallen but has recovered. The classification module 1716 outputs fall information 1726 indicating whether the user has fallen and, if so, the type or nature of the fall.
As described above, multiple types of sensor measurements may be used in combination to determine a user's motion characteristics. For example, fig. 18 shows a fall sensor fusion module 1800 for determining whether a user has fallen, and if so, the type or nature of the fall. The sensor fusion module 1800 may be implemented, for example, using one or more of the components of the system 100 shown in fig. 1 (e.g., the mobile device 102).
The fall sensor fusion module 1800 receives inputs from a plurality of different sensors. For example, the fall sensor fusion module 1800 receives acceleration data 1802a indicative of accelerations experienced by a mobile device worn by the user (e.g., measured using an accelerometer), and gyroscope data 1802b indicative of the orientation of the mobile device (e.g., measured using a gyroscope). As another example, the sensor fusion module 1800 receives location data 1802c indicative of a location of the mobile device (e.g., measured using a global navigation satellite system receiver such as a global positioning system receiver and/or a wireless transceiver such as a Wi-Fi radio). As another example, the sensor fusion module 1800 receives height data 1802d indicative of the height or altitude of the device (e.g., measured using an altimeter, barometer, or other height sensor). As another example, the sensor fusion module 1800 receives heart rate data 1802e indicative of the heart rate of the user wearing the mobile device (e.g., measured using a heart rate sensor).
As described herein, the acceleration data 1802a and the gyroscope data 1802b may be used to determine whether a user has fallen. For example, the acceleration data 1802a and the gyroscope data 1802b may be input into the fall classifier 1804. In general, the fall classifier 1804 may operate in a similar manner as described with respect to fig. 17. For example, the fall classifier 1804 may determine one or more features based on the acceleration data 1802a and determine whether the user has experienced an impact based on the features. In addition, the fall classifier 1804 may determine one or more features based on both the acceleration data 1802a and the gyroscope data 1802b, and model the behavior of the user based on the features. In addition, the fall classifier 1804 may determine whether the user has fallen based on the detected impact and/or the modeled behavior.
In addition, the fall classifier 1804 may determine whether the user has fallen based at least in part on the location data 1802c. For example, the location data 1802c may be input into a threshold module 1806. The threshold module 1806 determines information regarding the location of the mobile device. For example, the threshold module 1806 may determine whether the mobile device is in the user's home, at the user's workplace, in a public area (e.g., a store, gym, swimming pool, etc.), or at some other location. As another example, the threshold module 1806 may determine whether the mobile device is being worn by the user while the user is driving, cycling, skating, skateboarding, or traveling using some other mode of transportation. This information can be input into the fall classifier 1804 to improve the detection of falls. For example, a user may be more likely to fall while at home than while traveling in a car. Thus, the fall classifier 1804 may increase the sensitivity with which it determines that the user has fallen when the user is at home, as compared to when the user is driving a car. As another example, a user may be more likely to fall while outdoors when it is snowing or raining than when it is not. Thus, the fall classifier 1804 may increase the sensitivity of detecting falls when the user is determined to be outdoors and it is determined to be raining or snowing at the user's location (e.g., based on information obtained from a weather service), as compared to when it is not raining or snowing at that location.
The fall classifier 1804 outputs fall data 1808, which indicates whether the user has experienced a fall and, if so, the type or nature of the fall. The fall data 1808 may be input into a fusion module 1810 that rejects false positives from the fall classifier 1804. For example, the fusion module 1810 can receive fall data 1808 indicating that a fall occurred. However, based on additional information received by the fusion module 1810, the fusion module 1810 can override the fall data 1808 and determine that a fall has not occurred. The fusion module 1810 outputs confirmation data 1812, which confirms whether the user has experienced a fall and, if so, the type or nature of the fall.
In some cases, the fusion module 1810 can determine whether the fall data 1808 is a false positive based on the height data 1802 d. For example, the height data 1802d may be input into the filter module 1814. The filter module 1814 may be used to isolate a particular component (e.g., a particular frequency or range of frequencies) of the height data 1802 d. The filtered height data 1802d is input into a feature extraction module 1816, which determines feature data 1818 that indicates one or more features of the height of the mobile device. For example, feature extraction module 1816 may determine a change in the height or altitude of the mobile device over a period of time. The feature data 1818 is input into the fusion module 1810 and can be used to identify potential false positives. For example, if the mobile device experiences significant changes in altitude (e.g., a few feet, or several levels or floors), the fusion module 1810 can determine that false positives are unlikely. As another example, if the mobile device has not experienced any altitude change, the fusion module 1810 may determine that false positives are more likely.
In some cases, the fusion module 1810 can determine whether the fall data 1808 is a false positive based on the heart rate data 1802e. For example, the heart rate data 1802e may be input into a filter module 1820. The filter module 1820 may be used to isolate a particular component (e.g., a particular frequency or range of frequencies) of the heart rate data 1802e. The filtered heart rate data 1802e is input into a feature extraction module 1822 that determines feature data 1824 indicative of one or more features of the heart rate of the user wearing the mobile device. For example, the feature extraction module 1822 can determine an increase or decrease in the user's heart rate after a fall. As another example, the feature extraction module 1822 may determine a subsequent decay or recovery of the heart rate (e.g., when the heart rate returns to normal). As another example, the feature extraction module 1822 may determine a decay time constant (e.g., a time constant indicative of the decay rate after a rise) associated with the decay or recovery of the heart rate. The feature data 1824 is input into the fusion module 1810 and can be used to identify potential false positives. For example, if the user's heart rate increases (e.g., due to a fight-or-flight response), the fusion module 1810 may determine that a false positive is unlikely. As another example, the heart rate of a user typically decays more quickly after a fall than after a period of exercise. Thus, the mobile device may compare the user's decay time constant to a decay time constant sampled after the user exercises. If the user's decay time constant is less than the exercise-related decay time constant, the fusion module 1810 may determine that a false positive is unlikely.
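As one way to realize the decay-time-constant comparison, the sketch below fits an exponential to a baseline-subtracted heart-rate series via a log-linear least-squares fit. The fitting approach and the requirement that samples be positive are assumptions, not details from the document.

```python
import numpy as np

def decay_time_constant_s(hr_above_baseline, dt):
    """Fit an exponential decay to a baseline-subtracted heart-rate
    series (all samples must be > 0) and return the time constant tau in
    seconds, via a log-linear least-squares fit."""
    t = np.arange(len(hr_above_baseline)) * dt
    slope, _ = np.polyfit(t, np.log(hr_above_baseline), 1)
    return -1.0 / slope

def fall_consistent(post_event_tau, post_exercise_tau):
    """Heart rate tends to decay faster after a fall than after exercise,
    so a smaller time constant supports the fall hypothesis."""
    return post_event_tau < post_exercise_tau

# Example: synthetic decay with tau = 20 s, sampled once per second.
hr = 40.0 * np.exp(-np.arange(60) / 20.0)
tau = decay_time_constant_s(hr, dt=1.0)
print(round(tau, 1), fall_consistent(tau, post_exercise_tau=45.0))
```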
As described above, the mobile device can automatically transmit a distress call to a third party (e.g., an emergency responder) in response to detecting a fall and determining that the user may need assistance. In a similar manner as described above (e.g., with respect to fig. 12), the mobile device may first alert the user that a distress call may be transmitted (e.g., after a fall), and allow the user to confirm whether to proceed with the call. The user may manually initiate the distress call in response to the alert (e.g., by inputting a command into the mobile device). However, if the user does not respond to the alert, the mobile device may determine whether the call should proceed based on the user's behavior after the alert. For example, if the user does not move after the fall (e.g., indicating that the user is injured or unresponsive), the mobile device may proceed with the call. However, if the user exhibits activity (e.g., indicating that the user has recovered), the mobile device may cancel the call. Such progressive escalation may be beneficial, for example, in reducing the number of false positives with respect to falls, as well as reducing the likelihood of calling a third party unnecessarily.
In some cases, the mobile device may determine whether to transmit the distress call based on measurements taken by a plurality of different sensors. For example, fig. 19 shows a distress call module 1900 for determining whether to transmit a distress call to a third party. The distress call module 1900 may be implemented, for example, using one or more of the components of the system 100 (e.g., the mobile device 102) shown in fig. 1.
The distress call module 1900 includes a fusion module 1902 for determining whether the user has fallen, and if so, the type or nature of the fall. The fusion module 1902 can operate in a similar manner as described with respect to fig. 18. For example, the fusion module 1902 may receive several types of sensor data 1904, such as acceleration data, gyroscope data, position data, altitude data, and/or heart rate data. Based on this information, the fusion module 1902 can determine whether the user wearing the mobile device has fallen and identify potential false positives. The fusion module 1902 outputs confirmation data 1906, which confirms whether the user has experienced a fall and, if so, the type or nature of the fall.
Additionally, the distress call module 1900 determines information about the motion of the user wearing the mobile device. For example, the acceleration data and gyroscope data may be input into a feature extraction module 1908. The feature extraction module 1908 determines one or more features of the user's motion, such as whether the user moved for a period of time after a fall, whether the user was able to take steps after a fall, whether the user stood after a fall, or other features. The feature extraction module 1908 outputs feature data 1910 indicative of each of the extracted features.
The confirmation data 1906 and the feature data 1910 can be input into a fall state machine 1912. The fall state machine 1912 determines whether to transmit a distress call to a third party based on these inputs. An example fall state machine 1912 is shown in fig. 20.
In this example, the mobile device begins in a "normal" state 2002 (e.g., a low alert state). The state machine also includes a "confirmed fall" state 2004 (e.g., an elevated alert state entered when the fusion module 1902 detects a fall), an "alert" state 2006 (e.g., a state in which the mobile device alerts the user that a fall was detected and that a distress call may be sent to a third party), a "wait" state 2008 (e.g., a state in which the mobile device waits for additional information, such as potential input by the user), a "cancel" state 2010 (e.g., a state in which an impending distress call is cancelled), and an "SOS" state 2012 (e.g., a state in which a distress call is made). The mobile device may transition between the states based on certain signs of a detected fall (e.g., as described herein), a detected period of inactivity (e.g., no movement by the user), and/or input by the user.
For example, the mobile device begins in the "normal" state 2002. Upon detecting a confirmed fall (e.g., by the fusion module 1902), the mobile device transitions to the "confirmed fall" state 2004. Upon detecting a quiescent period T_Q after the fall, the mobile device transitions to the "alert" state 2006, alerts the user that a fall has been detected, and informs the user that a distress call may be sent to a third party (e.g., an emergency responder). The mobile device transitions to the "wait" state 2008 and waits for input by the user in response to the alert. If no user movement is detected during a long-lie period T_LL (e.g., 30 seconds) after transmission of the fall alert, the mobile device transitions to the "SOS" state 2012 and transmits a distress call. The mobile device then returns to the "normal" state 2002. This may be useful, for example, if the user has fallen and is incapacitated for a long time. The mobile device may automatically seek help for the user even without input from the user.
As another example, the mobile device begins in the "normal" state 2002. Upon detecting a confirmed fall, the mobile device transitions to the "confirmed fall" state 2004. If, during the quiescent period T_Q after the fall, certain types of movement are detected (e.g., a walking movement, a standing movement, a highly dynamic movement, or any other movement that indicates recovery of the user), the mobile device returns to the "normal" state 2002. This may be useful, for example, if the user has fallen but shows signs of movement and recovery after falling. In this case, the mobile device may avoid alerting the user or automatically seeking help for the user.
As another example, the mobile device begins in the "normal" state 2002. Upon detecting a confirmed fall (e.g., by the fusion module 1902), the mobile device transitions to the "confirmed fall" state 2004. Upon detecting a quiescent period T_Q after the fall, the mobile device transitions to the "alert" state 2006, alerts the user that a fall was detected, and informs the user that a distress call may be sent to a third party (e.g., an emergency responder). The mobile device transitions to the "wait" state 2008 and waits for input by the user in response to the alert. If certain types of movement (e.g., a walking movement, a standing movement, a highly dynamic movement, or any other movement indicating user recovery) are detected during the long-lie period T_LL after transmission of the fall alert, the mobile device transitions to the "cancel" state 2010, and no distress call is transmitted. The mobile device then returns to the "normal" state 2002. This may be useful, for example, if the user has fallen but shows signs of recovery. The mobile device may alert the user to the fall, but in this case help is not automatically sought for the user.
While time values are described with respect to fig. 20, these are merely illustrative examples. In practice, one or more of the time values may vary depending on the implementation. In some cases, one or more of the time values may be adjustable parameters (e.g., parameters empirically selected to distinguish between different types of events or conditions).
In some cases, the system 100 may determine that the user 110 has fallen, determine the severity of the injury suffered by the user 110 as a result of the fall, and in response perform one or more actions (or refrain from performing one or more actions). For example, the system 100 may determine that the user 110 has fallen but has not suffered an injury or has only minor injury, so that he can recover without the assistance of others. In response, the system 100 may refrain from generating a notification and transmitting the notification to others. As another example, the system 100 may determine that the user 110 has fallen and has suffered a more serious injury, such that he may need help. In response, the system 100 may generate a notification and transmit the notification to others to assist the user.
As an illustrative example, fig. 21 shows a diagram 2100 of classifying different types of falls that a user may experience. In this example, the set of all types of falls that the user may experience is represented by boundary 2102. Further, the subset of falls that cause injury to the user is represented by boundary 2104 within boundary 2102. Further, the subset of falls that cause injury to the user and render the user unresponsive is represented by boundary 2106 within boundary 2104.
In some implementations, the system 100 can distinguish between each of these types of falls and perform different actions in response. For example, if the user has suffered a fall that injured him and rendered him unresponsive, the system 100 can generate and transmit a notification to others to assist the user, even if the user does not explicitly instruct the system 100 to do so after the fall. As another example, if the user has suffered a fall that injured him but did not render him unresponsive, the system 100 can generate a notification and transmit the notification to others to assist the user (e.g., after receiving confirmation from the user after the fall). As another example, if the user has suffered a fall that left him neither injured nor unresponsive, the system 100 may refrain from generating a notification and transmitting the notification to others.
Although fig. 21 shows example categories or types of falls, these are merely illustrative examples. In practice, falls may be classified according to other categories or types, instead of or in addition to those shown in fig. 21.
As described herein, in some implementations, the system 100 can determine whether the user has fallen based on motion data acquired before, during, and/or after the impact experienced by the user 110. In some implementations, the system 100 can make this determination by detecting a motion characteristic of the user and determining whether the motion characteristic indicates that the user experienced an impact, expected a fall (e.g., performed a support motion), and/or responded to a fall. Based on this information, the system 100 can determine the probability that the user experienced a fall.
For example, fig. 22A shows a fall detection module 2200, and fig. 22B shows a process 2250 that may be performed at least in part by the fall detection module 2200. The fall detection module 2200 can be implemented as one or more of the components of the system 100 (e.g., the mobile device 102), such as shown in fig. 1.
As shown in fig. 22A, the fall detection module 2200 includes an arm motion tracker module 2202, an arm motion classifier module 2204, an impact detector module 2206, an impact classifier module 2208, and a fall event classifier module 2210.
During operation of the fall detection module 2200, the arm motion tracker module 2202 receives sensor measurements from one or more sensors worn by the user on their arm. For example, the arm motion tracker module 2202 may receive acceleration data acquired from one or more accelerometers worn by the user on their arm, and orientation data acquired from one or more gyroscopes worn by the user on their arm. Based on these sensor measurements, the arm motion tracker module 2202 determines the motion of the user's arm over a period of time. For example, arm motion tracker module 2202 may determine that a user has rotated along one or more axes and/or has translated along one or more axes.
The arm motion tracker module 2202 provides information about the motion of the user's arm to the arm motion classifier module 2204. Based on this information, the arm motion classifier module 2204 determines the type of motion performed by the user's arm. For example, the arm motion classifier module 2204 may determine that the user has performed a support motion (e.g., extending their arm to stop forward thrust), a balance motion (e.g., extending their arm to restore balance), a swing motion (e.g., swinging their arm during and after an impact), or another motion using their arm. An example technique for classifying the motion of a user's arm is described, for example, with reference to fig. 6. The arm motion classifier module 2204 provides information about the type of motion performed by the user's arm to the fall event classifier module 2210 for further processing.
During operation of the fall detection module 2200, the impact detector module 2206 also receives sensor measurements from one or more sensors worn by the user on their arm. For example, the impact detector module 2206 may receive acceleration data acquired from one or more accelerometers worn by the user on their arm. Based on these sensor measurements, the impact detector module 2206 determines one or more features or characteristics that may be indicative of an impact experienced by the user over a period of time. For example, the impact detector module 2206 may determine the magnitude of acceleration measured with respect to one or more dimensions, the jitter measured with respect to one or more dimensions, a time associated with each of the measurements, and/or any other property indicative of an impact.
The impact detector module 2206 provides information about the determined features or characteristics to the impact classifier module 2208. Based on this information, the impact classifier module 2208 determines a likelihood that the user has experienced an impact. The impact classifier module 2208 may also use additional sensor information to make this determination. For example, it may make this determination based on acceleration data acquired from one or more accelerometers worn by the user on their arm and orientation data acquired from one or more gyroscopes worn by the user on their arm. The impact classifier module 2208 provides information about the likelihood that the user has experienced an impact to the fall event classifier module 2210 for further processing.
The fall event classifier module 2210 determines the likelihood that the user experiences a fall based on the information received from the arm motion classifier module 2204 and the impact classifier module 2208. For example, if the user has experienced an impact, the fall event classifier module 2210 may determine that the user is more likely to have experienced a fall. As another example, if the user's arm moves in a particular manner prior to the impact (e.g., the user stretches his arm forward, swings his arm, etc.), the fall event classifier module 2210 may determine that the user is more likely to have experienced a fall.
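To make the data flow in fig. 22A concrete, the following sketch wires toy versions of the three classifier stages together. It is a minimal illustration, not the patent's implementation: every function name, feature, weight, and threshold below is an assumption chosen for readability.

```python
# Illustrative sketch of the fig. 22A pipeline. All names, features,
# weights, and thresholds are assumptions, not values from the patent.

def classify_arm_motion(rotation_deg: float, forward_extension_g: float) -> str:
    """Toy arm motion classifier (module 2204): maps tracked arm motion
    features to one of the motion types named in the text."""
    if forward_extension_g > 1.5:
        return "support"   # arm thrust forward to brace against a fall
    if rotation_deg > 90:
        return "swing"
    if rotation_deg > 30:
        return "balance"
    return "other"

def impact_likelihood(peak_accel_g: float, jitter_g_per_s: float) -> float:
    """Toy impact classifier (module 2208): larger, sharper acceleration
    changes are treated as more impact-like."""
    return 0.5 * min(peak_accel_g / 3.0, 1.0) + 0.5 * min(jitter_g_per_s / 50.0, 1.0)

def fall_likelihood(arm_motion: str, impact_score: float) -> float:
    """Toy fall event classifier (module 2210): combines the arm motion
    class with the impact likelihood into a single fall likelihood."""
    prior = {"support": 0.4, "balance": 0.3, "swing": 0.2, "other": 0.05}[arm_motion]
    return min(1.0, prior + 0.6 * impact_score)

motion = classify_arm_motion(rotation_deg=20.0, forward_extension_g=2.1)
print(motion, fall_likelihood(motion, impact_likelihood(3.2, 60.0)))  # support 1.0
```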
In some implementations, the fall detection module 2200 can use the process 2250 shown in fig. 22B to determine the likelihood that the user has fallen. According to the process 2250, the fall detection module 2200 tracks the motion of the user's arm (e.g., using sensor data from one or more sensors worn by the user on his arm, and the arm motion tracker module 2202) (step 2252). Based on the sensor measurements, the fall detection module 2200 determines the type of motion performed by the user's arm (e.g., using the arm motion classifier module 2204) (step 2254). For example, the arm motion classifier module 2204 may determine that the user has performed a support motion, a balance motion, a swing motion, or another motion using their arm.
While tracking the motion of the user's arm and determining the type of motion performed by the user's arm, the fall detection module 2200 collects sensor data that may indicate that the user is experiencing an impact (e.g., using the impact detector module 2206) (step 2256). Based on the sensor data, the fall detection module 2200 determines the likelihood that the user has experienced an impact (e.g., using the impact classifier module 2208) (step 2258). If the fall detection module 2200 determines that it is unlikely that the user has experienced an impact, the fall detection module 2200 continues to collect sensor data and awaits the occurrence of an impact.
If the fall detection module 2200 determines that the user may have experienced an impact, it determines the likelihood that the user has experienced a fall (e.g., using the fall event classifier module 2210) (step 2260). If the fall detection module 2200 determines that it is unlikely that the user has experienced a fall, it continues to collect sensor data, awaits the occurrence of another impact, and tracks the movement of the user's arm.
If the fall detection module 2200 instead determines that the user is likely to have experienced a fall, it can provide this information to the system 100. Based on this information, system 100 may determine whether to perform one or more actions (or refrain from performing one or more actions) in response. For example, the system 100 may generate a notification and transmit the notification to others to assist the user (or refrain from doing so), such as using one or more of the techniques described herein.
As described herein, in some implementations, the system 100 can determine whether the user has fallen and needs help based on the activity level of the user. For example, a highly active user (e.g., one historically exhibiting frequent and vigorous movements) may be at a lower risk of falling and needing assistance. However, an inactive, infirm user (e.g., one historically exhibiting infrequent and slight movements) may be at a higher risk of falling and needing assistance.
For example, fig. 23 shows an example line graph of the level of physical activity of the user (horizontal axis) and the expected severity of a fall experienced by the user (vertical axis). A user with a relatively high level of physical activity (e.g., corresponding to the right portion of the line graph) may be expected to experience a less severe fall (e.g., corresponding to the lower portion of the line graph). Thus, for these users, the system 100 may be configured to make it less likely to generate and transmit notifications to other users (e.g., to reduce false positives).
In contrast, a user with a relatively lower level of physical activity (e.g., corresponding to the left portion of the line graph) may be expected to experience a more severe fall (e.g., corresponding to the upper portion of the line graph). Thus, for these users, the system 100 may be configured to make it more likely to generate and transmit notifications to other users (e.g., to reduce false negatives).
In some implementations, the activity level of the user may be determined based on factors such as the age of the user and/or historical information about the user's physical activity over time (e.g., the average number of steps the user takes per day).
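The inverse relationship of fig. 23 can be summarized as an activity-dependent notification threshold. The sketch below is a hypothetical rendering of that idea; the step counts, age cutoff, and numeric weights are invented for illustration.

```python
def notification_threshold(avg_daily_steps: float, age: int) -> float:
    """Hypothetical rendering of fig. 23: more active users are expected
    to experience less severe falls, so a higher evidence threshold is
    required before notifying others (fewer false positives), while less
    active users get a lower threshold. All numbers are illustrative."""
    base = 0.5
    activity_bonus = 0.3 * min(avg_daily_steps / 10000.0, 1.0)  # active -> raise the bar
    age_penalty = 0.2 if age >= 65 else 0.0                     # frail -> lower the bar
    return max(0.1, min(0.95, base + activity_bonus - age_penalty))

print(notification_threshold(avg_daily_steps=12000, age=30))  # 0.8: notify sparingly
print(notification_threshold(avg_daily_steps=1500, age=80))   # 0.345: notify readily
```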
As described herein, in some implementations, the system 100 can determine that the user may have fallen and selectively perform certain actions (e.g., generate a notification and transmit the notification to others) according to the user's motion characteristics before and after the fall. For example, in some implementations, the system 100 can compare user activity before a fall to user activity after the fall. If after a fall, the user is unable to resume his pre-fall activity, the system 100 may determine that the user may have suffered an injury and perform one or more actions in response (e.g., generate a notification and transmit the notification to others to assist the user).
For example, fig. 24 shows example acceleration data 2400 collected by an accelerometer positioned on the user's body before a fall (the time interval t0 to t1), at the time of the fall (time t1), and after the fall (the time interval after t1). In this example, the user performs a physical activity before falling (corresponding to the changes in the acceleration data collected during the time interval t0 to t1) and experiences a fall at time t1 (corresponding to the large peak in the accelerometer data collected at time t1). After the fall, the user does not move during a "pause" period (corresponding to the acceleration data, exhibiting little to no change, collected up to a certain time t2 after the fall). For example, the user may have fallen down and/or be recovering from the fall. After the pause period, the user begins moving again (corresponding to the changes in the acceleration data collected after time t2).
The system 100 can determine whether to perform certain actions (e.g., generate a notification and transmit the notification to others) based on the user's motion characteristics before and after the fall. For example, fig. 25 shows an example process 2500 for generating and transmitting a notification. Process 2500 may be implemented, for example, in one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102).
According to process 2500, a device (e.g., mobile device 102 and/or one or more other of the components of system 100) collects sensor data from one or more sensors positioned on the user's body and uses the sensor data to determine whether the user has fallen (step 2502). Example techniques for determining whether a user has fallen are described herein.
Upon determining that the user has fallen, the device waits for a "pause" period of time after the fall (step 2504). During this time, the device neither generates a notification nor transmits one to others. However, the device continues to acquire sensor data to track the user's motion. In practice, the length of the pause period may vary depending on the implementation. In some implementations, the pause period may be selected empirically.
After the pause period has elapsed, the device determines the type of activity being performed by the user (step 2506). For example, the device may determine that the user is performing some type of activity, such as walking, running, swimming, playing a sport, and so on. The determination may be made, for example, using an activity classifier based on the sensor data (e.g., as described herein).
Further, the device compares the "post-fall" activity of the user to the "pre-fall" activity of the user (step 2508). The pre-fall activity of the user may be determined in a similar manner to the post-fall activity of the user. For example, the type of activity the user was performing before the fall may be determined by the activity classifier based on sensor data collected before the fall.
If the post-fall activity of the user is different from (or substantially dissimilar to) the pre-fall activity (e.g., pre-fall activity ≠ post-fall activity), the device can responsively perform one or more actions. For example, the device may generate a notification and transmit the notification to others to assist the user (step 2508). For example, if the user was running before the fall, but is no longer running during the "post-fall" period, the device may determine that the user is likely to be injured and may seek help for the user. Similarly, if the user was swimming before the fall, but is no longer swimming during the "post-fall" period, the device may determine that the user may be injured and may seek help for the user.
If the post-fall activity of the user is the same as (or substantially similar to) the pre-fall activity (e.g., pre-fall activity = post-fall activity), the device may refrain from generating and transmitting a notification to others, but may continue to acquire sensor data and monitor for falls by the user. For example, if the user was running before the fall and is also running during the "post-fall" period, the device may determine that the user is less likely to be injured and may refrain from seeking help for the user. A minimal sketch of this comparison follows.
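The sketch below is a compact, hedged rendering of the pre-fall/post-fall comparison of process 2500. The `FallContext` structure, the 10-second default pause, and the exact-match comparison are assumptions; the patent leaves the pause length and the similarity test to the implementation.

```python
from dataclasses import dataclass

@dataclass
class FallContext:
    pre_fall_activity: str   # e.g., output of an activity classifier
    post_fall_activity: str
    pause_seconds: float     # time elapsed since the detected impact

def should_notify(ctx: FallContext, min_pause: float = 10.0) -> bool:
    """Sketch of process 2500: wait out the pause period, then notify
    only if the user has not resumed the pre-fall activity. The pause
    length is empirical; the 10 s default is an assumption."""
    if ctx.pause_seconds < min_pause:
        return False  # still within the pause period; keep monitoring
    return ctx.post_fall_activity != ctx.pre_fall_activity

print(should_notify(FallContext("running", "stationary", 12.0)))  # True: seek help
print(should_notify(FallContext("running", "running", 12.0)))     # False: resumed
```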
As described herein, the system 100 can use a state machine to determine whether to generate a notification and transmit the notification to others. FIG. 26 illustrates an example state machine. In this example, the device (e.g., mobile device 102) begins in a "no fall" state 2602 (e.g., corresponding to no fall having been detected for the user). If the device determines that the user has fallen (e.g., at a time t0), the device transitions to the "fall" state 2604 and a timer is reset. After the reset, the timer increases over time.
After transitioning to the "fall" state, the device waits a first period of time (e.g., until a time t1 after time t0). This may be advantageous, for example, because it enables the user's movement to "stabilize" after a fall before the device analyzes the nature of the fall and decides whether to take any action. In practice, the length of this first time period may vary depending on the implementation. For example, the length of the first time period may be one second, two seconds, three seconds, or any other time period. In some implementations, the length of the first time period can be selected empirically.
After waiting for the first period of time, the device determines whether the user has moved during a second period of time (e.g., between time t1 and a time t2, where time t2 is after time t1) (state 2606). If the device determines that the user has moved during the second time period (e.g., between time t1 and time t2), the device returns to the "no fall" state 2602 and refrains from generating and transmitting notifications about the detected fall to others. In practice, the length of this second time period may vary depending on the implementation. For example, the length of the second time period may be one second, two seconds, three seconds, or any other time period. In some implementations, the length of the second time period can be selected empirically.
The device may determine whether the user has moved based on measurements taken by one or more sensors worn by the user. For example, the device may receive acceleration measurements taken by one or more acceleration sensors worn by the user. If the acceleration measurements change to a sufficient extent (e.g., change in acceleration above a certain threshold level) during a second time period after the detected fall, the device can determine that the user has moved after the fall. As another example, the device may receive orientation measurements acquired by one or more orientation sensors (e.g., gyroscopes) worn by the user. If the orientation measurements change to a sufficient extent (e.g., change in orientation above a certain threshold level) during a second time period after the detected fall, the device can determine that the user has moved after the fall. In some implementations, the threshold level and/or the length of the second time period may be selected empirically (e.g., based on experimental studies identifying differences between instances in which the user needs assistance and instances in which the user does not need assistance).
As another example, in some implementations, the device can determine whether the user has moved by determining whether the user is walking or has been standing within a second time period after the detected fall. For example, the device may acquire acceleration measurements (e.g., acquired by one or more acceleration sensors), orientation measurements (e.g., acquired by one or more orientation sensors such as a gyroscope), altitude measurements (e.g., acquired by one or more altitude sensors such as a barometer), and/or other measurements acquired by one or more sensors worn by the user. The device may determine that the user has moved if the device determines that the measurements indicate that the user is walking and/or standing during a second time period after the detected fall. In some implementations, the device may make this determination by determining whether the measurement values match or are sufficiently similar to a predetermined measurement value pattern known to indicate that the user is walking or standing. In some implementations, the time intervals and/or predetermined patterns of measurements may be selected empirically (e.g., based on experimental studies identifying distinguishing characteristics of measurements of the user walking and/or standing relative to measurements of the user performing other types of activities).
If the device determines that the user has not moved during the second time period, the device transitions to an "alert" state 2608 and prepares to seek assistance for the user. For example, at the end of the second time period (e.g., at time t2), the device may present a notification (e.g., a visual, audible, and/or tactile notification) to the user indicating that a request for assistance will be transmitted. If the user does not instruct the device to cancel the transmission by the end of a third time period (e.g., between time t2 and a time t3, where time t3 is after time t2), the device transitions to an "auto dial" state and automatically transmits a notification to an emergency responder (state 2610). For example, the device may automatically dial an emergency response system, such as "911," or initiate some other communication session. This may be advantageous, for example, because it provides the user with an opportunity to cancel the transmission manually if assistance is not required. However, if the user does not respond (or cannot respond), the device proceeds with the transmission. In practice, the length of the third time period may vary depending on the implementation. For example, the length of the third time period may be one second, two seconds, three seconds, or any other time period. In some implementations, the length of the third time period can be selected empirically.
In addition, the device may provide information to assist emergency responders in locating and treating the user. For example, the device may transmit the location of the user (e.g., as determined by a location-determining subsystem of the device), the nature of the emergency, or other information.
When the device is in the auto-dial state 2610, the user may also instruct the device to cancel the automatic notification and/or terminate the communication session. For example, before time t3, the user may instruct the device to cancel the automatic notification. In response, the device transitions back to the "no fall" state 2602 and refrains from generating and transmitting a notification to others. As another example, after time t3, the user may instruct the device to cancel the automatic notification. In response, the device transitions back to the "no fall" state 2602 and terminates the transmission.
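The fig. 26 state machine might be expressed as follows. This is a schematic sketch under assumed wait durations (the patent notes these are implementation-dependent and empirically chosen); the state names track the description above.

```python
import enum

class State(enum.Enum):
    NO_FALL = "no fall"       # state 2602
    FALL = "fall"             # state 2604
    CHECK_MOVE = "movement"   # state 2606
    ALERT = "alert"           # state 2608
    AUTO_DIAL = "auto dial"   # state 2610

class FallStateMachine:
    """Sketch of the fig. 26 state machine. The wait durations below
    (t0->t1, t0->t2, t0->t3, in seconds) are placeholder assumptions."""
    T1, T2, T3 = 2.0, 10.0, 30.0

    def __init__(self):
        self.state = State.NO_FALL
        self.timer = 0.0

    def on_fall_detected(self):
        self.state, self.timer = State.FALL, 0.0  # timer reset at t0

    def tick(self, dt: float, user_moved=False, user_cancelled=False):
        self.timer += dt
        if self.state is State.FALL and self.timer >= self.T1:
            self.state = State.CHECK_MOVE      # let movement "stabilize" first
        elif self.state is State.CHECK_MOVE:
            if user_moved:
                self.state = State.NO_FALL     # user recovered; no notification
            elif self.timer >= self.T2:
                self.state = State.ALERT       # warn that help will be requested
        elif self.state is State.ALERT:
            if user_cancelled:
                self.state = State.NO_FALL     # user declined assistance
            elif self.timer >= self.T3:
                self.state = State.AUTO_DIAL   # e.g., dial an emergency responder
        elif self.state is State.AUTO_DIAL and user_cancelled:
            self.state = State.NO_FALL         # terminate the transmission

sm = FallStateMachine()
sm.on_fall_detected()
for _ in range(12):
    sm.tick(dt=1.0)          # no movement, no cancellation
print(sm.state)              # State.ALERT
```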
In some implementations, the system 100 can determine the severity of the injury suffered by the user 110 due to a fall based on multiple types of sensor data. For example, the system 100 may make this determination based on accelerometer data indicative of the user's movements, gyroscope data indicative of changes in the orientation of the user's body, and altimeter data indicative of changes in the user's altitude.
For example, fig. 27 shows a fall detection module 2700 for determining the severity of a user injury due to a fall. The fall detection module 2700 can be implemented as one or more of the components of the system 100 shown in fig. 1 (e.g., the mobile device 102), for example.
As shown in fig. 27, the fall detection module 2700 includes a fall height classifier module 2702, an impact detector module 2704, an impact classifier module 2706, and an injury severity classifier module 2708.
During operation of the fall detection module 2700, the fall height classifier module 2702 receives sensor measurements from one or more sensors worn by the user on their body. For example, the fall height classifier module 2702 can receive height data obtained from one or more altimeters worn by the user on their body. Based on these sensor measurements, the fall height classifier module 2702 determines the height of the user over a period of time.
Further, the fall height classifier module 2702 can classify changes in the height of the user over time. For example, based on the sensor data, the fall height classifier module 2702 can determine whether the height of the user changed in a manner indicating that the user fell a relatively short distance (e.g., a "single-level" fall, such as down a flight of stairs) or a relatively long distance (e.g., a "multi-level" fall, such as from a ladder or balcony). In some implementations, the fall height classifier module 2702 can make this determination by determining the change in the height of the user over a particular period of time, and determining which of several ranges that change falls within (e.g., a relatively shorter range of distances for a single-level fall, or a relatively longer range of distances for a multi-level fall). The ranges may be determined empirically (e.g., based on observations of subjects falling under various conditions).
Although two example classifications are described above, these are merely illustrative examples. In practice, the fall height classifier module 2702 can classify height changes of a user according to any number of different categories or types, instead of or in addition to those described above.
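As a minimal sketch, the single-level/multi-level distinction could reduce to bucketing the altimeter-derived height change, as below. The 3-meter boundary is purely an assumed placeholder for the empirically determined ranges the text describes.

```python
def classify_fall_height(height_change_m: float) -> str:
    """Sketch of the fall height classifier (module 2702): buckets the
    altimeter-derived height change into the two categories named in the
    text. The 3 m boundary is an assumed placeholder."""
    if abs(height_change_m) < 3.0:
        return "single-level"  # e.g., a fall from standing or down a few stairs
    return "multi-level"       # e.g., a fall from a ladder or balcony

print(classify_fall_height(-0.8))  # single-level
print(classify_fall_height(-4.5))  # multi-level
```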
During operation of the fall detection module 2700, the impact detector module 2704 also receives sensor measurements from one or more sensors worn by the user on their arm. For example, the impact detector module 2704 may receive acceleration data acquired from one or more accelerometers worn by the user on their arm. Based on these sensor measurements, the impact detector module 2704 determines one or more features or characteristics that may be indicative of an impact experienced by the user over a period of time. For example, the impact detector module 2704 may determine the magnitude of acceleration measured with respect to one or more dimensions, the jitter measured with respect to one or more dimensions, a time associated with each of the measurements, and/or any other property that may indicate an impact.
The impact detector module 2704 provides information about the determined features or characteristics to the impact classifier module 2706. Based on this information, the impact classifier module 2706 determines the severity of the impact experienced by the user. The impact classifier module 2706 may also use additional sensor information to make this determination. For example, it may make this determination based on acceleration data acquired from one or more accelerometers worn by the user on their arm and orientation data acquired from one or more gyroscopes worn by the user on their arm. For example, if the magnitude of the change in sensor data is high (e.g., the change in acceleration, jitter, and/or orientation is high), the impact classifier module 2706 may classify the impact as more severe. As another example, if the magnitude of the change in sensor data is low (e.g., the change in acceleration, jitter, and/or orientation is low), the impact classifier module 2706 may classify the impact as less severe. The impact classifier module 2706 provides information regarding the severity of the impact experienced by the user to the injury severity classifier module 2708 for further processing.
The injury severity classifier module 2708 determines the severity of the injury suffered by the user based on information received from the fall height classifier module 2702 and the impact classifier module 2706. For example, if the user has experienced a severe impact and/or has fallen a long distance, the injury severity classifier module 2708 may determine that the user is more likely to have experienced a severe injury. As another example, if the user experienced a less severe impact and/or has fallen a shorter distance, the injury severity classifier module 2708 may determine that the user is more likely to have experienced a less severe injury.
As another example, if the user has experienced multiple impacts over time, injury severity classifier module 2708 may determine that the user is more likely to have experienced severe injury. As another example, if the user has experienced fewer impacts over time, the injury severity classifier module 2708 may determine that the user is more likely to have experienced a less severe injury.
As another example, if the user has experienced multiple impacts over time, where at least some of the impacts are different from one another, the injury severity classifier module 2708 may determine that the user is more likely to have experienced a severe injury. This may be advantageous, for example, because tumbling falls (in which the user is more likely to experience a sequence of multiple varying impacts) may be more likely to cause serious injury.
For example, the injury severity classifier module 2708 may determine, for each impact experienced by the user, a set of characteristics associated with that impact. The characteristics may include, for example, the magnitude of the impact, the distance the user fell, one or more axes of rotation of the user, the acceleration of the user, the jitter of the user, the orientation of the user, any changes thereto, and/or any other characteristics of the impact. If the user experiences impacts that are relatively different from one another, the injury severity classifier module 2708 may determine that the user is more likely to have experienced a severe injury. Conversely, if the user experiences impacts that are relatively similar to one another, the injury severity classifier module 2708 may determine that the user is less likely to have experienced a severe injury. Dissimilarity between impacts may be determined, for example, based on an arithmetic difference between the characteristics of the impacts and/or by determining a percentage change between the characteristics of the impacts. In some implementations, impacts may be considered similar to each other if the difference between their respective characteristics is less than a particular threshold and/or the percentage change between their respective characteristics is less than a threshold percentage. In some implementations, impacts may be considered different from each other if the difference between their respective characteristics is greater than a particular threshold and/or the percentage change between their respective characteristics is greater than a threshold percentage.
In some implementations, the injury severity classifier module 2708 may determine the severity of the injury suffered by the user based on a combination of one or more of these factors.
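One hedged way to combine these factors is an additive score over the fall height class, the worst impact, the impact count, and the dissimilarity between impacts, as sketched below. All weights and the 20% dissimilarity cutoff are illustrative assumptions, not values from the patent.

```python
def injury_severity(fall_height_class: str, impact_severities: list) -> float:
    """Sketch of the injury severity classifier (module 2708): combines
    the fall height class, the worst impact, the impact count, and the
    dissimilarity between impacts. Weights are assumptions."""
    score = max(impact_severities)                   # worst single impact
    if fall_height_class == "multi-level":
        score += 0.3                                 # longer falls are riskier
    if len(impact_severities) > 1:
        score += 0.1 * (len(impact_severities) - 1)  # repeated impacts
        spread = max(impact_severities) - min(impact_severities)
        if spread / max(impact_severities) > 0.2:    # varying impacts: tumbling
            score += 0.2
    return min(score, 1.0)

print(injury_severity("multi-level", [0.5, 0.9, 0.4]))  # 1.0: likely severe
```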
As described herein, the system 100 can determine whether a user has fallen in a manner that eliminates or otherwise reduces the incidence of false positives (e.g., incorrectly determining that a user has fallen and/or needs assistance when the user has not actually fallen and/or does not actually need assistance). In some implementations, these determinations can be made on a user-specific basis. For example, these determinations may be made for a particular user based on previously collected historical sensor data about the user, historical information about the user's previous activities, personal characteristics of the user (e.g., physical characteristics, age, demographics, etc.), and/or other information specific to the user.
In some implementations, the system 100 can generate and maintain a database of information specific to each user. The database includes one or more data records, each data record including information about an impact or "event" previously experienced by a user. For example, the data record may include an indication (e.g., a unique identifier) of an impact previously experienced by the user. Further, the data records may include sensor data (e.g., accelerometer data, gyroscope data, altimeter data, or any other sensor data) collected by sensors worn on the user's body before, during, and/or after the impact. Further, the data record may include an indication of the type of activity (e.g., walking, running, swimming, doing sports, cycling, etc.) the user was performing at the time the event occurred as determined by the activity classifier module. Further, the data record may include metadata about the impact, such as the time the impact occurred, the date the impact occurred, the day of the week the impact occurred, and/or the location where the impact occurred. Further, the data record may include an indication of whether the impact has been previously classified as a fall. The classification may be determined automatically (e.g., based on one or more of the techniques described herein) or manually by the user (e.g., based on input or feedback provided by the user after the impact).
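A data record of this kind might be modeled as follows; the field names and types are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ImpactRecord:
    """Sketch of one data record in the per-user impact database, with
    the fields enumerated above. Field names are illustrative."""
    event_id: str                        # unique identifier for the impact
    sensor_data: List[float]             # e.g., acceleration samples around the impact
    activity_type: str                   # output of the activity classifier
    timestamp: float                     # when the impact occurred
    day_of_week: int                     # 0 = Monday ... 6 = Sunday
    location: Optional[Tuple[float, float]] = None  # (latitude, longitude)
    classified_as_fall: Optional[bool] = None       # automatic or user-provided label
```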
The database may be used to determine user-specific characteristics that may indicate a fall. For example, the system 100 may cluster the data records along one or more information dimensions (e.g., time, day of week, sensor data, type of activity, whether the user fell, etc.) into two or more clusters. At least one of the clusters may correspond to instances in which the user previously experienced an impact and fell. Further, at least one of the clusters may correspond to instances in which the user previously experienced an impact without falling. In some implementations, the data records may be clustered using clustering techniques, such as K-means clustering.
The clustered data records may be used as "templates" for interpreting newly collected sensor data to determine whether the user has fallen. For example, the system 100 may collect sensor data throughout the user's daily life. When the user experiences an impact, the system 100 may collect information about the impact, such as sensor data collected from the user before, during, and/or after the impact, the time of day of the impact, the day of the week of the impact, the geographic location of the impact, the type of activity the user was performing, and so forth. Further, the system 100 may compare the collected information to the information contained in the clustered data records. If the collected information is similar to the information of one of the clusters, the system 100 can determine whether the user has fallen based on whether that cluster corresponds to instances in which the user fell or instances in which the user did not fall.
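The template-matching step might look like the sketch below, which compares a new impact's feature vector against per-cluster centroids. A real implementation could form the clusters with K-means, as the text suggests; the feature choice and Euclidean distance here are assumptions.

```python
import math

def centroid(points):
    return [sum(coords) / len(points) for coords in zip(*points)]

def nearest_cluster(features, clusters):
    """Template-matching sketch: compare a new impact's feature vector
    (here: hour of day, weekday, activity code, peak acceleration) to
    each cluster's centroid and return the closest cluster's label."""
    best_label, best_dist = None, math.inf
    for label, points in clusters.items():
        d = math.dist(features, centroid(points))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

clusters = {
    "non-fall (football)": [[18, 5, 1, 2.5], [18, 5, 1, 2.8]],
    "fall": [[21, 2, 0, 3.6]],
}
print(nearest_cluster([18, 5, 1, 2.6], clusters))  # non-fall (football)
```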
For example, if a user regularly participates in a scheduled contact sport (e.g., football), he may experience impacts on a repeating schedule (e.g., at the same time of day, on the same day of the week, etc.) and/or at the same geographic location (e.g., at certain football fields). Further, the type of activity of the user may be similar during each of those impacts (e.g., playing football), and the motion characteristics of the user may be similar during each of those impacts (e.g., corresponding to the user being tackled by others). Further, after those impacts, he may provide feedback to the system 100 indicating that each of those impacts was associated with the sport and was not a fall for which he needed medical assistance. The system 100 may generate data records for these impacts or "events" and then cluster them together into one or more clusters (e.g., based on their similarities in time, location, activity type, and sensor data). Further, the system 100 may indicate that the one or more clusters are not associated with falls.
In the future, the user may play football again according to his repeating schedule and at his typical location. During this time, the system 100 may collect sensor data to detect whether the user has experienced an impact (e.g., using one or more of the techniques described herein). Upon detecting an impact, the system 100 may determine whether characteristics of the impact (e.g., the time, location, type of activity, and/or sensor data regarding the impact) are similar to the characteristics of the clustered data records. In this example, the user played in a similar manner, at a similar time, and at a similar location as before. Thus, the system 100 can determine that the newly detected impact is similar to the impacts in one or more clusters of data records that are not associated with a fall. Accordingly, the system 100 may refrain from generating a notification and transmitting the notification to others to assist the user.
After playing football, the user may return to his home and fall down a flight of stairs. During this time, the system 100 may collect sensor data to detect whether the user has experienced an impact. Upon detecting an impact, the system 100 may determine whether the characteristics of the impact are similar to the characteristics of the clustered data records. In this example, the user was in a different location than in the "non-fall" clusters of data records, experienced an impact at a time of day different from his usual football schedule, was performing a different type of activity, and experienced an impact with motion characteristics different from those associated with playing football. Thus, the system 100 can determine that the newly detected impact is sufficiently different from the impacts in the one or more clusters of data records that are not associated with a fall. In response, the system 100 may generate a notification and transmit the notification to others to assist the user.
In this way, the system 100 may adapt its fall detection behavior to the specific behavior and characteristics of the user over time. Further, the system 100 may continuously "learn" and improve its detection over time. For example, as the system 100 collects more information about the user over the course of several impacts, the system 100 may update its database with new data records and re-cluster the data records to find new associations and trends in the collected information. Based on the re-clustered data records, the system 100 may determine whether a new impact experienced by the user is likely to be a fall.
In some implementations, the system 100 may discard, ignore, and/or give lower weight to certain data records based on their age. For example, if a data record is older than a particular threshold age, the system 100 may discard or ignore the data record such that it is not included in the clustering (e.g., such that the impact associated with the record is no longer considered). As another example, the system 100 may give the data record a lower weight in the clustering (e.g., such that the associated impact has less influence in determining whether the user has fallen). This may be advantageous, for example, because it enables the system 100 to adapt to changes in the user's behavior over time.
As described above, in some implementations, the user may manually provide input or feedback as to whether an impact was a fall (e.g., he needed medical assistance) or a non-fall (e.g., he did not need medical assistance). For example, after the system 100 detects an impact, the system 100 may prompt the user to classify the impact as a fall or a non-fall. The system 100 may include the input from the user in the data record for the impact and may use the input to cluster the data records in the database.
In some implementations, the system 100 can apply user input with respect to one impact to other impacts having similar characteristics. This may be advantageous, for example, in reducing the number of times the system 100 requires manual input by a user. For example, if a user experiences multiple similar impacts, the system 100 may cluster the data records associated with these impacts into a common cluster. If the user indicates that one of those impacts does not correspond to a fall, the system 100 may update each of the data records in the cluster to indicate that it is not associated with a fall. In this way, the number of false positives can be reduced or eliminated even if the user has previously incorrectly indicated that other impacts within the same cluster have corresponded to a fall.
In some implementations, the system 100 can cluster the data records into multiple types of clusters according to the likelihood that each cluster corresponds to a fall. For example, the system 100 may cluster the data records into at least three types of clusters: "fall" clusters, "non-fall" clusters, and "near-fall" clusters. A "fall" cluster is more likely to be associated with a fall (e.g., the information contained therein exhibits a stronger correlation or trend with falling than that of a "non-fall" cluster). A "near-fall" cluster is only moderately associated with a fall (e.g., the information contained therein exhibits a weaker correlation or trend with falling than that of a "fall" cluster). In some implementations, the likelihood that a cluster corresponds to a fall can be expressed as a posterior ratio or probability.
When determining whether a newly detected impact corresponds to a fall, the system 100 can ignore the "near-fall" clusters (e.g., clusters that are only moderately likely to correspond to a fall) and consider only the "fall" clusters (e.g., clusters that are more likely to correspond to a fall) and the "non-fall" clusters (e.g., clusters that are less likely to correspond to a fall). This may be useful, for example, because it enables the system 100 to classify newly detected impacts based on information that is more representative or indicative of falls and non-falls (rather than information that is only weakly correlated with one or the other and may lead to ambiguous results). Thus, the system 100 may classify impacts in a more accurate and/or predictable manner.
In some implementations, the database of information specific to a particular user may be maintained exclusively on the user's personal devices (e.g., his smart watch and/or smart phone, rather than a remote server). Further, the clustering and classification processes may be performed exclusively on the user's personal devices. This may help, for example, to maintain the privacy of the user. However, in some implementations, at least some of the information may be maintained on one or more remote servers in addition to or instead of the user's personal devices. Similarly, in some implementations, at least a portion of the clustering and classification processes may be performed on one or more remote servers in addition to or instead of the user's personal devices.
Fig. 28 shows an example process 2800 for making a user-specific determination of whether a user has fallen. Process 2800 may be implemented, for example, in one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102).
According to process 2800, a device (e.g., mobile device 102) makes an initial determination of whether a user has fallen based on sensor data obtained from sensors worn by the user on their body (step 2802). For example, the device may acquire sensor data regarding the impact (e.g., acceleration data and/or jitter data), a pose angle or orientation of the device and/or of the user, and/or the motion of the device and/or the user (e.g., whether the user is stationary or active). Example techniques for determining whether a user has fallen based on sensor data are described herein.
In some implementations, the device can determine whether the user has fallen by calculating a probability metric (e.g., a posterior ratio). If the metric is above the first threshold, the device may determine that the user has fallen. If the metric is below the first threshold, the device may determine that the user has not fallen. In practice, the first threshold may be selected empirically.
If the device determines that the user has fallen, the device determines whether the detected fall is similar to an event previously recorded in the database (step 2804).
For example, if the newly detected fall has characteristics similar (e.g., similar time, day, activity type, sensor data, etc.) to those in one or more data records (or clusters of data records) stored in the user-specific database, the device can determine that the newly detected fall is similar to one or more falls previously detected for the user. In response, the device may update the database to include information about the newly detected fall (step 2806). For example, the device may generate a data record for the newly detected fall and include information about the fall. Example information includes sensor data collected by sensors worn on the user's body before, during, and/or after the impact, an indication of the type of activity the user was performing at the time of the event, and metadata about the impact (e.g., the time the fall occurred, the date the fall occurred, the day of the week the fall occurred, and/or the location where the fall occurred). Further, the new data record may be clustered with one or more data records to which it is similar (e.g., added to an existing cluster).
If the newly detected fall does not have characteristics similar to those in any data record stored in the database, the device can determine that the newly detected fall is not similar to one or more of the previously detected falls by the user. In response, the device may generate an alert to the user (e.g., display a pop-up alert on the device) asking for further input by the user (step 2808). In response, the user may provide additional information about the fall. For example, the user may provide feedback on whether he has actually fallen, the type of activity he is performing, or any other information about the event.
Similarly, the device may update the database to include information about the newly detected fall (step 2806). For example, the device may generate a data record for the newly detected fall and include information about the fall. Example information includes sensor data collected by sensors worn on the user's body before, during, and/or after the impact, an indication of the type of activity the user was performing at the time of the event, and metadata about the impact (e.g., the time the fall occurred, the date the fall occurred, the day of the week the fall occurred, and/or the location where the fall occurred). Furthermore, the new data record may include the information provided by the user (e.g., whether he actually fell, the type of activity he was performing, or any other information about the event).
Referring again to step 2802, if the device instead determines that the user has not fallen, the device determines whether the probability metric associated with the newly detected event is close to a threshold level (step 2804). For example, the device may determine whether a probability metric associated with the newly detected event is above a second threshold level that is lower than the first threshold level. If the probability metric is below the second threshold (e.g., it is not "close" to the first threshold level used to determine whether the user has fallen), the device may refrain from updating the database and continue to collect sensor information about the user.
If the probability metric is above the second threshold level (e.g., it is "close" to the first threshold level used to determine whether the user has fallen), the device may update the database to include information about the new event (step 2806). For example, the device may generate a data record for the newly detected event and include information about the event. Example information includes sensor data collected by sensors worn on the user's body before, during, and/or after the event, an indication of the type of activity the user was performing at the time of the event, and metadata about the event (e.g., the time of the event, the date of the event, the day of the week of the event, and/or the location of the event).
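The two-threshold branching of process 2800 can be summarized as in the sketch below; both threshold values are assumed placeholders, and the returned strings merely name the branches described above.

```python
def handle_event(posterior: float, similar_to_known: bool,
                 first_thr: float = 0.8, second_thr: float = 0.6) -> str:
    """Sketch of the process 2800 branching; thresholds are empirical
    placeholders, and the strings name the branches described above."""
    if posterior >= first_thr:            # initial determination: fall
        if similar_to_known:
            return "update database with new fall record"
        return "prompt user for feedback, then update database"
    if posterior >= second_thr:           # a "near miss" of the fall threshold
        return "record event for future clustering"
    return "ignore; keep collecting sensor data"

print(handle_event(0.85, True))   # update database with new fall record
print(handle_event(0.70, False))  # record event for future clustering
```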
During this process, the device may also update its database based on newly collected information to "learn" and improve its detection over time (step 2812). For example, the device may re-cluster the data records to find new associations and trends in the collected information. In some implementations, the device may continuously re-cluster the data records over time. In some implementations, the device may periodically re-cluster the data records over time (e.g., every M days). In some implementations, the device may periodically re-cluster the data records based on the number of events it has detected (e.g., every N detected events or impacts). Further, in some implementations, the device may discard data records based on their age (e.g., discard data records older than T days). In practice, each of M, N, and T may vary depending on the implementation.
As described herein, the resulting database can be used to determine whether the user has fallen (e.g., in a manner that eliminates or otherwise reduces the incidence of false positives).
In some implementations, the device can also identify false positives by measuring impacts experienced by the user over a period of time and identifying one or more patterns that can indicate false positives.
For example, the device may determine, based on accelerometer measurements, that the user has experienced multiple impacts within a time period according to a periodic or near-periodic sequence. For example, the device may determine that the user has experienced multiple impacts that repeat at a particular frequency, or that repeat at a particular frequency to within a particular tolerance (e.g., ±5%, ±10%, ±15%, or some other tolerance range). In response, the device may determine that the user is unlikely to have experienced a fall. This can be used, for example, to distinguish falls from repetitive physical activities (e.g., golf, tennis, racquetball, badminton, baseball or softball, boxing, etc.). In some implementations, the time period and/or tolerance range can be selected empirically (e.g., based on experimental studies identifying differences between falls and other types of activities).
As another example, the device may determine, based on accelerometer measurements, that the user has experienced multiple similar impacts over a period of time. For example, the device may determine that the user experienced multiple impacts with intensities, magnitudes, and/or power equal to each other or within a certain tolerance of each other (e.g., ±5%, ±10%, ±15%, or some other tolerance range). In response, the device may determine that the user is unlikely to have experienced a fall. This can be used, for example, to distinguish falls from repetitive physical activities (e.g., golf, tennis, racquetball, badminton, baseball or softball, boxing, etc.). In some implementations, the time period and/or tolerance range can be selected empirically (e.g., based on experimental studies identifying differences between falls and other types of activities).
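A sketch of the periodicity test follows: it checks whether the gaps between successive impact timestamps are nearly constant. The three-impact minimum and the ±10% tolerance are assumptions standing in for the empirically selected values the text mentions.

```python
def looks_periodic(impact_times, tolerance: float = 0.10) -> bool:
    """False positive filter sketch: if successive impacts repeat at a
    near-constant interval (within an assumed +/-10% tolerance), treat
    them as repetitive physical activity rather than a fall."""
    if len(impact_times) < 3:
        return False
    gaps = [b - a for a, b in zip(impact_times, impact_times[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return all(abs(g - mean_gap) <= tolerance * mean_gap for g in gaps)

print(looks_periodic([0.0, 1.1, 2.1, 3.2]))  # True: e.g., a practice-swing rhythm
print(looks_periodic([0.0, 0.4, 3.9]))       # False: irregular impacts
```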
In some implementations, the device may also identify false positives by determining the smoothness of the user's movement over a period of time (e.g., leading up to the impact and/or after the user has experienced the impact). For example, if the user's movement is relatively smooth over the time period, the device may determine that the user is unlikely to have experienced a fall. Conversely, if the user's movement is relatively less smooth over the time period, the device may determine that the user is more likely to have experienced a fall. This can be used, for example, to distinguish between falls (e.g., in some implementations, characterized by relatively erratic movement) and physical activities characterized by relatively smooth movement. In some implementations, the time period can be selected empirically (e.g., based on experimental studies that identify differences between falls and other types of activities).
Smoothness of user movement may be determined, for example, by acquiring accelerometer measurements as the user moves, and determining the intensity of the changes (e.g., "jitter") in the accelerometer measurements and/or the frequency of those changes. For example, relatively smoother movement may correspond to less drastic and/or less frequent changes in the acceleration measurements over a particular time period, and relatively less smooth movement may correspond to more drastic and/or more frequent changes in the acceleration measurements over a particular time period.
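Such a metric might be computed as the mean absolute rate of change of the acceleration signal, as sketched below; the threshold separating "smooth" from "not smooth" would be empirical and is not shown.

```python
def movement_smoothness(accel_samples, dt: float) -> float:
    """Smoothness sketch: the mean absolute rate of change ("jitter") of
    the acceleration signal. Larger values mean less smooth movement
    and, per the text, a higher likelihood of a fall."""
    diffs = [abs(b - a) / dt for a, b in zip(accel_samples, accel_samples[1:])]
    return sum(diffs) / len(diffs)

smooth = [1.00, 1.02, 1.01, 1.03, 1.02]      # g; steady, walking-like signal
erratic = [1.00, 3.50, 0.20, 2.80, 0.10]     # g; erratic, post-impact-like signal
print(movement_smoothness(smooth, dt=0.01))  # small value -> smooth
print(movement_smoothness(erratic, dt=0.01)) # large value -> not smooth
```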
In some implementations, the device can also identify false positives by determining the acceleration of the user over a period of time with respect to a number of different directions or axes. Higher acceleration along certain axes relative to other axes may be less likely to indicate a fall. For example, the device may acquire accelerometer measurements as the user moves and decompose the acceleration into a plurality of different components according to an inertial frame of reference. For example, a first component may correspond to acceleration in a first direction parallel to the direction of gravity (the "vertical" direction), and a second component may correspond to acceleration in a second direction perpendicular to the direction of gravity (the "horizontal" direction). If the acceleration in the horizontal direction is sufficiently high (e.g., above a first threshold level) and the acceleration in the vertical direction is sufficiently low (e.g., below a second threshold level), the device may determine that the user is unlikely to have experienced a fall. This can be used, for example, to distinguish between a fall (e.g., in some implementations, characterized by acceleration in the vertical direction that is higher than the acceleration in the horizontal direction) and physical activity in which the user experiences acceleration primarily in the horizontal direction rather than the vertical direction. In some implementations, the time period, the first threshold, and/or the second threshold can be selected empirically (e.g., based on experimental studies identifying differences between falls and other types of activities).
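The vertical/horizontal decomposition can be sketched as a projection onto the gravity direction, as below. The gravity vector is taken as known here (in practice it might be estimated, e.g., by low-pass filtering the accelerometer), and the final comparison is a simplification of the two-threshold test described above.

```python
def split_by_gravity(accel, gravity_dir=(0.0, 0.0, 1.0)):
    """Decomposition sketch: project a 3-axis acceleration sample onto
    the gravity direction ("vertical") and take the remainder as
    "horizontal". The gravity direction is assumed known here."""
    dot = sum(a * g for a, g in zip(accel, gravity_dir))
    vertical = [dot * g for g in gravity_dir]
    horizontal = [a - v for a, v in zip(accel, vertical)]
    v_mag = abs(dot)
    h_mag = sum(h * h for h in horizontal) ** 0.5
    return v_mag, h_mag

v, h = split_by_gravity((2.4, 0.3, 0.2))  # mostly horizontal, e.g., a racket swing
print(f"vertical={v:.2f} g, horizontal={h:.2f} g, fall-like={v > h}")
```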
Although the foregoing examples are described with respect to one mobile device (e.g., mobile device 102) at a time, in some implementations, the techniques described herein can be performed using multiple devices in conjunction. For example, multiple devices (e.g., a smart watch, a smart phone, and/or a remote server) may be communicatively coupled to each other (e.g., over one or more wireless communication links), and may exchange sensor data and/or other information collected by each device. In some implementations, one device may collect sensor data and provide the sensor data to another device for processing according to the techniques described herein. For example, a smart watch may collect sensor data (e.g., because the smart watch may be more likely to be consistently positioned on a user's body than a smart phone), and provide the sensor data to the smart phone and/or a remote server for processing (e.g., because the smart phone and/or the remote server may have greater access to computing resources than the smart watch). In some implementations, multiple devices may collect sensor data simultaneously and provide the sensor data to other devices for processing according to these techniques. For example, a smart watch and a smart phone may collect sensor data simultaneously and provide the sensor data to each other and/or a remote server for processing.
Exemplary procedure
An example process 2900 for determining whether a user has fallen and/or may need help using a mobile device is shown in fig. 29. Process 2900 may be performed, for example, by one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102). In some cases, some or all of process 2900 may be performed by a co-processor of the mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 2900, a mobile device (e.g., mobile device 102 and/or one or more other components of system 100) acquires motion data indicative of motion measured by a motion sensor over a period of time (step 2902). The sensor is worn by a user. For example, as described with respect to fig. 1 and 2A, a user may attach a mobile device, such as a smart watch, to his arm or wrist and move around in daily life. This may include, for example, walking, running, sitting, lying down, participating in sports or physical activities, or any other physical activity. During this time, the mobile device uses a motion sensor (e.g., an accelerometer) in the mobile device to measure the acceleration experienced by the sensor over a period of time. The sensor data may be presented in the form of a time-varying acceleration signal (e.g., as shown in fig. 3).
The mobile device determines an impact experienced by the user based on the motion data (step 2904), the impact occurring during a first interval of the time period. Example techniques for determining impacts are described above (e.g., with respect to fig. 3-5).
The mobile device determines one or more first motion characteristics of the user during a second interval of the time period based on the motion data (step 2906), the second interval occurring before the first interval. The second interval may be, for example, a "pre-impact" time period. Determining the first motion characteristic may include, for example, determining that the user is walking during the second interval, determining that the user is ascending or descending stairs during the second interval, and/or determining that the user is moving the body part according to a swinging motion or a supporting motion during the second interval. Example techniques for determining motion characteristics during a "pre-impact" time period are described above (e.g., with respect to fig. 6).
The mobile device determines one or more second motion characteristics of the user during a third interval of the time period based on the motion data (step 2908), the third interval occurring after the first interval. The third interval may be, for example, a "post-impact" time period. Determining the second motion characteristic may include, for example, determining that the user is walking during the third interval, determining that the user is standing during the third interval, and/or determining that the orientation of the body part of the user has changed N or more times during the third interval. Example techniques for determining motion characteristics during a "post-impact" time period are described above (e.g., with respect to figs. 7 and 8).
The mobile device determines that the user has fallen based on the impact, one or more first motion characteristics of the user, and one or more second motion characteristics of the user (step 2910). The mobile device can also determine whether the user may need help (e.g., as a result of a fall). Example techniques for determining whether a user has fallen and may need assistance are described above (e.g., with respect to fig. 9).
For example, the mobile device may determine that the impact is greater than the first threshold based on the motion data and determine that the motion of the user is impaired during the third interval based on the motion data. Based on these determinations, the mobile device may determine that the user has fallen and may need help.
As another example, the mobile device may determine, based on the motion data, that the impact is less than the first threshold and greater than the second threshold. Additionally, the mobile device may determine, based on the motion data, whether the user is at least one of walking during the second interval, ascending stairs during the second interval, or descending stairs during the second interval. Additionally, the mobile device may determine, based on the motion data, that the user is moving the body part according to a waving motion or a supporting motion during the second interval. Additionally, the mobile device may determine, based on the motion data, that the motion of the user is impaired during the third interval. Based on these determinations, the mobile device may determine that the user has fallen and may need help.
In some cases, the mobile device may determine that the user has fallen based on a statistical model (e.g., a bayesian statistical model). For example, a statistical model may be generated based on the impact of the one or more samples, the first motion characteristics of the one or more samples, and the second motion characteristics of the one or more samples. The impact of the one or more samples, the first motion characteristic of the one or more samples, and the second motion characteristic of the one or more samples may be determined based on sample motion data collected from the group of samples. The sample motion data may be indicative of motion measured by one or more additional motion sensors over one or more additional time periods, wherein each additional motion sensor is worn by a respective user of the sample population. Example techniques for generating and using statistical models are described above. In some implementations, the one or more sampled first motion characteristics may include an indication of a type of activity performed by a particular additional user relative to the sample motion data, an indication of a level of activity of the particular additional user relative to the sample motion data, and/or an indication of a walking speed of the particular additional user relative to the sample motion data.
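One hedged reading of such a statistical model is a naive Bayes-style log posterior ratio over features derived from the impact and the first and second motion characteristics, as sketched below. The Gaussian feature models and all numbers are assumptions for illustration, not parameters from the patent.

```python
import math

def log_posterior_ratio(features, fall_stats, nonfall_stats):
    """Naive Bayes-style sketch: sum the Gaussian log-likelihood ratios
    of features derived from the impact and the first and second motion
    characteristics. The Gaussian form is an assumption."""
    def log_gauss(x, mu, sigma):
        return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma)
    ratio = 0.0
    for name, x in features.items():
        mu_f, sd_f = fall_stats[name]
        mu_n, sd_n = nonfall_stats[name]
        ratio += log_gauss(x, mu_f, sd_f) - log_gauss(x, mu_n, sd_n)
    return ratio  # > 0 favors "fall"

features = {"impact_g": 3.1, "pre_walk_speed": 1.2, "post_motion": 0.1}
fall = {"impact_g": (3.0, 0.5), "pre_walk_speed": (1.0, 0.4), "post_motion": (0.2, 0.2)}
nonfall = {"impact_g": (1.5, 0.5), "pre_walk_speed": (1.0, 0.4), "post_motion": (1.0, 0.3)}
print(log_posterior_ratio(features, fall, nonfall))  # positive -> fall more likely
```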
In response to determining that the user has fallen, the mobile device generates a notification indicating that the user has fallen (step 2912). For example, the mobile device may present an indication on a display device and/or an audio device of the mobile device that the user has fallen. As another example, the mobile device can transmit data to a communication device remote from the mobile device indicating that the user has fallen. This may include, for example, an email, an instant chat message, a text message, a telephone message, a facsimile message, a radio message, an audio message, a video message, a haptic message, or another message for communicating information. Example techniques for generating notifications are described above.
Fig. 30 shows another example process 3000 of using a mobile device to determine whether a user has fallen and/or may require assistance. Process 3000 may be performed, for example, using one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102). In some cases, some or all of process 3000 may be performed by a co-processor of a mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 3000, a mobile device (e.g., one or more other components of mobile device 102 and/or components of system 100) acquires a first signal indicative of acceleration measured by an accelerometer over a period of time (step 3002), and a second signal indicative of orientation measured by an orientation sensor over the period of time (step 3004). The accelerometer and the orientation sensor are physically coupled to the user. For example, as described with respect to fig. 1 and 2A, a user may attach a mobile device, such as a smart watch, to his arm or wrist and move around in daily life. This may include, for example, walking, running, sitting, lying down, participating in sports or physical activities, or any other physical activity. During this time, the mobile device uses sensors (e.g., accelerometers and orientation sensors such as gyroscopes) in the mobile device to measure the acceleration experienced by the sensors over a period of time and the orientation of the sensors over the period of time. The sensor data may be presented in the form of a time-varying signal (e.g., as shown in fig. 11A).
The mobile device determines rotation data regarding an amount of rotation experienced by the user during the time period (step 3006). The rotation data may include a third signal corresponding to a rate of rotation of the user during a time period, an indication of one or more axes of rotation (e.g., one or more instantaneous axes of rotation) of the user in the reference coordinate system during the time period, and/or an indication of an average axis of rotation of the user during the time period. Example rotation data is shown and described in fig. 11A-11D, for example.
The mobile device determines whether the user has tumbled based on the rotation data (step 3008). In some cases, this may be performed by determining a change between one or more axes of rotation of the user during the time period and an average axis of rotation of the user's rotation during the time period. Additionally, the mobile device can determine that the change is less than the first threshold. In response to determining that the change is less than the first threshold, the mobile device may determine, based on the third signal, a fourth signal corresponding to an angular displacement of the user during the time period (e.g., by integrating the third signal with respect to the time period).
In addition, the mobile device may determine that an angular displacement of the user during the time period is greater than a second threshold and determine that at least one of the one or more axes of rotation of the user during the time period is greater than a third threshold. In response to determining that the angular displacement of the user during the time period is greater than the second threshold and that at least one of the one or more axes of rotation of the user during the time period is greater than the third threshold, the mobile device can determine that the user has fallen. Otherwise, the mobile device may determine that the user has not fallen.
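A minimal sketch of this rotation test follows: the instantaneous rotation axes are compared against the average axis, the rotation rate is integrated to obtain angular displacement, and both the displacement and the rotation rate must clear their thresholds. The threshold values, the fixed sampling interval, and the reading of the third test as a peak rotation-rate check are assumptions made for illustration:

```swift
import Foundation
import simd

// rotationRates: gyroscope samples (rad/s) taken at a fixed sampling interval dt (s).
func detectTumble(rotationRates: [simd_double3],
                  dt: Double,
                  axisDeviationThreshold: Double = 0.3,  // radians, placeholder
                  displacementThreshold: Double = .pi,   // radians, placeholder
                  rateThreshold: Double = 4.0            // rad/s, placeholder
) -> Bool {
    guard !rotationRates.isEmpty else { return false }

    // Instantaneous rotation axes (unit vectors) and the average axis.
    let axes = rotationRates.filter { simd_length($0) > 0 }.map(simd_normalize)
    guard !axes.isEmpty else { return false }
    let avgAxis = simd_normalize(axes.reduce(simd_double3(), +))

    // Step 1: change between instantaneous axes and the average axis.
    let maxDeviation = axes.map { acos(min(1, max(-1, simd_dot($0, avgAxis)))) }.max()!
    guard maxDeviation < axisDeviationThreshold else { return false }

    // Step 2: angular displacement by integrating the rotation-rate magnitude.
    let displacement = rotationRates.map(simd_length).reduce(0, +) * dt
    // Step 3 (as interpreted here): peak rotation rate about any axis.
    let peakRate = rotationRates.map(simd_length).max()!

    return displacement > displacementThreshold && peakRate > rateThreshold
}
```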
In response to determining that the user has fallen, the mobile device generates a notification indicating that the user has fallen (step 3010). Generating the notification may include presenting an indication on at least one of a display device or an audio device of the mobile device that the user has fallen, and/or transmitting data to a communication device remote from the mobile device. This may include, for example, an email, an instant chat message, a text message, a telephone message, a facsimile message, a radio message, an audio message, a video message, a haptic message, or another message for communicating information. The data may include an indication that the user has fallen. Example techniques for generating notifications are described above.
Fig. 31 shows another example process 3100 for using a mobile device to determine whether a user has fallen and/or may require assistance. Process 3100 can be performed, for example, using one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102). In some cases, some or all of process 3100 may be performed by a co-processor of a mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 3100, a mobile device (e.g., mobile device 102 and/or one or more other components of system 100) acquires motion data indicative of motion measured by one or more motion sensors over a first time period (step 3102). One or more motion sensors are worn by the user. The one or more motion sensors may include an accelerometer and/or a gyroscope. The mobile device may be a wearable mobile device. Example techniques for acquiring motion data are described above.
The mobile device determines that the user has fallen based on the motion data (step 3104). In some implementations, the mobile device can determine that the user has fallen by determining, based on the motion data, that the user experienced an impact. In some implementations, the mobile device can determine that the user has fallen by determining the behavior of the user during the first time period. Example techniques for determining whether a user has fallen are described above.
In response to determining that the user has fallen, the mobile device generates one or more notifications indicating that the user has fallen (step 3106). In some implementations, generating the one or more notifications can include presenting a first notification to the user indicating that the user has fallen. The first notification may comprise at least one of a visual message, an audio message, or a tactile message. Example techniques for generating notifications are described above.
In some implementations, the mobile device can receive an input from the user (e.g., an input indicating a request for assistance by the user) in response to the first notification. In response to receiving the input, the mobile device may transmit a second notification to the communication device remote from the mobile device indicating the request for assistance. The communication device may be an emergency response system. Additionally, the second notification may indicate a location of the mobile device.
In some implementations, the mobile device can determine that the user did not move during a second time period after the user fell (e.g., indicating that the user was injured or incapacitated). In response to determining that the user has not moved during the second time period, the mobile device may transmit a second notification to the communication device remote from the mobile device indicating the request for assistance.
In some implementations, the mobile device can determine that the user moved (e.g., walked, stood, or perhaps some other type of movement) during a second time period after the user fell. In response to determining that the user has moved during the second time period, the mobile device may refrain from transmitting a second notification to the communication device remote from the mobile device indicating the request for assistance.
In some implementations, one or more notifications can be generated according to a state machine. Example state machines are shown in fig. 12 and 20.
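A hypothetical reduction of such a state machine is sketched below; the state names, events, and transitions are invented for illustration and are not the machines of figs. 12 and 20:

```swift
// Hypothetical notification state machine; states and transitions are illustrative.
enum FallNotificationState {
    case idle                 // no fall detected
    case alerting             // local visual/audio/haptic notification shown
    case waitingForResponse   // user may dismiss or request help
    case escalated            // remote notification sent (e.g., emergency services)
}

enum FallEvent {
    case fallDetected, userDismissed, userRequestedHelp, noMovementTimeout, movementObserved
}

func nextState(_ state: FallNotificationState, _ event: FallEvent) -> FallNotificationState {
    switch (state, event) {
    case (.idle, .fallDetected):                    return .alerting
    case (.alerting, _):                            return .waitingForResponse
    case (.waitingForResponse, .userDismissed):     return .idle
    case (.waitingForResponse, .movementObserved):  return .idle
    case (.waitingForResponse, .userRequestedHelp): return .escalated
    case (.waitingForResponse, .noMovementTimeout): return .escalated
    default:                                        return state
    }
}
```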
Fig. 32 shows another example process 3200 for determining whether a user has fallen and/or may require assistance using a mobile device. Process 3200 may be performed, for example, using one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102). In some cases, some or all of process 3200 may be performed by a co-processor of the mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 3200, a mobile device (e.g., mobile device 102 and/or one or more other components of system 100) acquires sample data generated by a plurality of sensors over a period of time (step 3202). The plurality of sensors are worn by the user. The sample data includes motion data indicative of motion of the user acquired from one or more motion sensors of the plurality of sensors. The sample data also includes at least one of: location data acquired from one or more location sensors of the plurality of sensors indicating a location of the mobile device; altitude data acquired from one or more altitude sensors of the plurality of sensors indicating an altitude of the mobile device; or heart rate data acquired from one or more heart rate sensors of the plurality of sensors indicating a heart rate of the user. The mobile device may be a wearable mobile device. Example techniques for obtaining sample data are described above.
In some implementations, the one or more motion sensors may include an accelerometer and/or a gyroscope. In some implementations, the accelerometer and gyroscope are independently operable to acquire motion data. For example, acceleration data may be acquired using an accelerometer during a first time interval during the time period. The gyroscope may be disabled during a first time interval. Additionally, based on the acceleration data acquired during the first time interval, the mobile device may determine that the movement of the user during the first time interval exceeds a threshold level. In response to determining that the movement of the user exceeds the threshold level during the first time interval, the mobile device may acquire acceleration data using the accelerometer and gyroscope data using the gyroscope during a second time interval subsequent to the first time interval. In some cases, the accelerometer and gyroscope may be operated according to a state machine. An example state machine is shown in fig. 14.
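The duty-cycling described above might be reduced to a small gate like the following sketch, in which the always-on accelerometer decides when the more power-hungry gyroscope runs. The movement metric and the threshold are invented placeholders:

```swift
import Foundation

// Hypothetical duty-cycling: the accelerometer runs continuously; the
// gyroscope is enabled only after the accelerometer observes movement
// above a threshold (cf. the state machine of fig. 14).
final class MotionSensorGate {
    private(set) var gyroscopeEnabled = false
    private let movementThreshold: Double  // in g, placeholder units

    init(movementThreshold: Double = 0.25) {
        self.movementThreshold = movementThreshold
    }

    // Called once per accelerometer interval with the samples from that interval.
    func update(accelerationMagnitudes: [Double]) {
        // Simple movement metric: peak deviation from 1 g (gravity at rest).
        let movement = accelerationMagnitudes.map { abs($0 - 1.0) }.max() ?? 0
        gyroscopeEnabled = movement > movementThreshold
    }
}
```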
In some implementations, the one or more altitude sensors may include at least one of an altimeter or a barometer. Altitude sensors may be used, for example, to measure changes in height characteristic of a fall (e.g., a drop from a ladder or other structure).
In some implementations, the one or more location sensors can include at least one of a wireless transceiver (e.g., a Wi-Fi radio or a cellular radio) or a global navigation satellite system receiver (e.g., a GPS receiver).
The mobile device determines that the user has fallen based on the sample data (step 3204). Example techniques for determining whether a user has fallen are described above.
In some implementations, the mobile device can determine that the user has fallen by determining a change in orientation (e.g., a stance angle) of the mobile device during the time period based on the motion data and determining that the user has fallen based on the change in orientation.
In some implementations, the mobile device can determine that the user has fallen by determining, based on the motion data, an impact that the user experienced during the time period, and determining, based on the impact, that the user has fallen.
In some implementations, the mobile device can determine that the user has fallen by determining a change in the height of the mobile device during the time period based on the height data and determining that the user has fallen based on the change in height.
In some implementations, the mobile device can determine that the user has fallen by determining a change in the user's heart rate during the time period based on the heart rate data and determining that the user has fallen based on the change in the heart rate. Determining a change in the heart rate of the user over the time period may include determining a decay rate (e.g., a time constant associated with the decay rate) of the heart rate of the user over the time period.
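One plausible way to estimate such a decay rate is a log-linear least-squares fit of an exponential recovery model, as sketched below. The model form HR(t) ≈ baseline + A·exp(−t/τ) and the baseline parameter are assumptions for illustration:

```swift
import Foundation

// Estimate a heart-rate decay time constant τ by a log-linear least-squares
// fit of HR(t) ≈ baseline + A·exp(−t/τ). Samples are (seconds since impact, bpm).
func decayTimeConstant(samples: [(t: Double, bpm: Double)],
                       baseline: Double) -> Double? {
    // Keep only samples above the baseline so the logarithm is defined.
    let pts = samples.compactMap { s -> (Double, Double)? in
        let excess = s.bpm - baseline
        return excess > 0 ? (s.t, log(excess)) : nil
    }
    guard pts.count >= 2 else { return nil }

    // Ordinary least squares on log(HR − baseline) = log A − t/τ.
    let n = Double(pts.count)
    let sumT = pts.reduce(0) { $0 + $1.0 }
    let sumY = pts.reduce(0) { $0 + $1.1 }
    let sumTT = pts.reduce(0) { $0 + $1.0 * $1.0 }
    let sumTY = pts.reduce(0) { $0 + $1.0 * $1.1 }
    let denom = n * sumTT - sumT * sumT
    guard abs(denom) > .ulpOfOne else { return nil }
    let slope = (n * sumTY - sumT * sumY) / denom
    return slope < 0 ? -1 / slope : nil  // τ in seconds; nil if HR is not decaying
}
```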
In some implementations, the mobile device can determine that the user has fallen by determining an environmental condition at the location of the mobile device based on the location data and determining that the user has fallen based on the environmental condition. The environmental condition may be the weather at the location (e.g., rain, snow, etc.).
In some implementations, the mobile device can determine that the user has fallen based on (e.g., in combination with) the motion data, the location data, the altitude data, and the heart rate data.
In response to determining that the user has fallen, the mobile device generates one or more notifications indicating that the user has fallen (step 3206). Generating the one or more notifications may include transmitting the notifications to a communication device remote from the mobile device. The communication device may be an emergency response system. Example techniques for generating notifications are described above.
Fig. 33 shows another example process 3300 of using a mobile device to determine whether a user has fallen and/or may require assistance. Process 3300 may be performed, for example, using one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102). In some cases, some or all of process 3300 may be performed by a co-processor of the mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 3300, a mobile device (e.g., mobile device 102 and/or one or more other components of system 100) receives motion data acquired by one or more sensors over a first time period (step 3302). The one or more sensors are worn by the user. In some implementations, the one or more sensors may include one or more accelerometers, gyroscopes, and/or altimeters or barometers.
The mobile device may determine, based on the motion data, an impact experienced by the user during the time period (step 3304).
The mobile device determines one or more characteristics of the user (step 3306).
The mobile device determines a likelihood that the user needs assistance after the impact based on the motion data and one or more characteristics of the user (step 3308).
In some implementations, the one or more characteristics of the user may include an age of the user. Further, the likelihood may increase as the user ages.
In some implementations, the one or more characteristics of the user may include a gender of the user. Further, the likelihood may depend on the gender of the user.
In some implementations, the one or more characteristics of the user may include a historical physical activity level of the user. Further, the likelihood may increase as the user's historical physical activity level decreases. In some implementations, the historical physical activity level may indicate a frequency of movement of the user prior to the impact. In some implementations, the historical physical activity level may indicate the intensity of movement of the user prior to the impact.
In some implementations, the one or more characteristics of the user may include a vascular health of the user. Further, the likelihood may increase as the user's vascular health decreases. The vascular health of the user may be determined based on the user's maximum oxygen uptake (VO2 max).
In some implementations, the one or more characteristics of the user may include a historical walking speed of the user. The likelihood may increase as the user's historical walking speed decreases.
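One hypothetical way to combine these characteristics while respecting the stated monotonic relationships is a logistic score, as sketched below. The weights, units, and the logistic form itself are invented for illustration:

```swift
import Foundation

// Hypothetical user profile; field names and units are assumptions.
struct UserProfile {
    let age: Double                      // years
    let historicalActivityLevel: Double  // arbitrary index, higher = more active
    let vo2Max: Double                   // ml/kg/min, proxy for vascular health
    let historicalWalkingSpeed: Double   // m/s
}

// One possible logistic combination: the likelihood rises with age and falls
// with activity level, VO2 max, and walking speed. All weights are placeholders.
func assistanceLikelihood(impactSeverity: Double, user: UserProfile) -> Double {
    let z = -4.0
        + 1.5 * impactSeverity
        + 0.04 * user.age
        - 0.30 * user.historicalActivityLevel
        - 0.05 * user.vo2Max
        - 0.80 * user.historicalWalkingSpeed
    return 1.0 / (1.0 + exp(-z))  // in (0, 1)
}
```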
The mobile device generates one or more notifications based on the likelihood (step 3310).
In some implementations, generating the one or more notifications can include determining that the likelihood exceeds a threshold level, and in response to determining that the likelihood exceeds the threshold level, generating the one or more notifications.
In some implementations, generating the one or more notifications can include transmitting the first notification to a communication device remote from the mobile device. The first notification may include an indication that the user has fallen. The communication device may be an emergency response system.
In some implementations, the mobile device is a wearable mobile device (e.g., a smart watch). In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device. In some implementations, at least some of the one or more sensors can be remote from the mobile device.
Additional details regarding process 3300 are described above (e.g., with reference to fig. 23).
Fig. 34 shows another example process 3400 for using a mobile device to determine whether a user has fallen and/or may require assistance. The process 3400 may be performed, for example, using one or more of the components of the system 100 shown in fig. 1 (e.g., the mobile device 102). In some cases, some or all of process 3400 may be performed by a co-processor of the mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 3400, a mobile device (e.g., mobile device 102 and/or one or more other components of system 100) obtains a database that includes a plurality of data records (step 3402). Each data record includes an indication of a respective impact previously experienced by a user of the mobile device, and sensor data generated by one or more first sensors worn by the user during the impact. In some implementations, the one or more first sensors may include one or more accelerometers, gyroscopes, and/or altimeters or barometers.
The mobile device acquires additional sensor data generated by one or more second sensors worn by the user over a period of time (step 3404).
In some implementations, the one or more first sensors and the one or more second sensors can include at least one of an accelerometer or an orientation sensor. Further, for each data record, the sensor data may include one or more first signals indicative of acceleration measured by the accelerometer during the impact associated with the data record and one or more second signals indicative of orientation measured by the orientation sensor during the impact associated with the data record.
In some implementations, the additional sensor data can include one or more additional first signals indicative of acceleration measured by the accelerometer during the time period, and one or more additional second signals indicative of orientation measured by the orientation sensor during the time period.
In some implementations, each data record can include metadata about the impact associated with the data record. The metadata may include at least one of an indication of a respective time of impact associated with the data record or an indication of a respective day of the week of impact associated with the data record.
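One plausible shape for such a data record, including the per-impact sensor signals and the time-of-day and day-of-week metadata described above, is sketched below; all field names are hypothetical:

```swift
import Foundation

// Hypothetical layout for one impact data record; field names are invented.
struct ImpactRecord: Codable {
    let recordID: UUID
    // Sensor data captured around the impact.
    let accelerationSignals: [[Double]]  // one array per accelerometer axis
    let orientationSignals: [[Double]]   // one array per orientation-sensor axis
    // Metadata about the impact (e.g., for clustering by routine activity).
    let timeOfDay: Date
    let dayOfWeek: Int                   // 1 = Sunday ... 7 = Saturday
}
```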
The mobile device determines whether the user has fallen during the time period based on the database and the additional sensor data (step 3406).
In some implementations, determining whether the user has fallen may include determining that the user has experienced an impact based on the additional sensor data. Further, in response to determining that the user has experienced an impact, a likelihood that the impact corresponds to a fall of the user may be determined based on the additional sensor data.
In some implementations, determining whether the user has fallen may include determining, based on the determined likelihood, that the user has fallen. Further, in response to determining that the user has fallen, a similarity metric may be determined. The similarity metric may indicate a similarity between the additional sensor data and the sensor data of one or more clusters of data records in the plurality of data records.
In some implementations, determining whether the user has fallen may include determining that the user has experienced multiple impacts during the time period based on the additional sensor data, and determining that the multiple impacts are similar to each other based on the additional sensor data. In response to these determinations, it may be determined that the user is unlikely to have fallen during the time period.
In some implementations, determining whether the user has fallen may include determining, based on the additional sensor data, that the user has experienced a periodic sequence of multiple impacts during the time period, and in response, determining that the user is unlikely to have fallen during the time period.
In some implementations, determining whether the user has fallen may include determining smoothness of movement of the user during the time period based on the additional sensor data, and determining whether the user has fallen during the time period based on the smoothness of movement of the user.
In some implementations, determining whether the user has fallen may include determining, based on the additional sensor data, an acceleration of the user during the time period relative to a first direction and a second direction orthogonal to the first direction, and determining that the acceleration in the first direction is greater than a first threshold and the acceleration in the second direction is less than a second threshold. In response to these determinations, it may be determined that the user is unlikely to have fallen during the time period. In some implementations, the first direction can be orthogonal to the direction of gravity, and the second direction can be parallel to the direction of gravity.
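A minimal sketch of this directional test follows: a large acceleration component orthogonal to gravity paired with a small component parallel to gravity suggests a lateral event (e.g., a swing or collision) rather than a fall. The thresholds and the world-frame assumption are placeholders:

```swift
import Foundation
import simd

// accelerations: world-frame samples (g); gravityDir: unit vector along gravity.
// Returns true if the pattern suggests the event was probably not a fall.
func looksLikeNonFall(accelerations: [simd_double3],
                      gravityDir: simd_double3,
                      horizontalThreshold: Double = 2.0,  // placeholder (g)
                      verticalThreshold: Double = 0.5     // placeholder (g)
) -> Bool {
    let g = simd_normalize(gravityDir)
    // Peak acceleration parallel and orthogonal to gravity.
    var peakVertical = 0.0
    var peakHorizontal = 0.0
    for a in accelerations {
        let vertical = simd_dot(a, g)
        let horizontal = simd_length(a - vertical * g)
        peakVertical = max(peakVertical, abs(vertical))
        peakHorizontal = max(peakHorizontal, horizontal)
    }
    // High lateral acceleration with little vertical component: unlikely a fall.
    return peakHorizontal > horizontalThreshold && peakVertical < verticalThreshold
}
```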
The mobile device generates one or more notifications based on determining whether the user has fallen during the time period (step 3408).
In some implementations, generating one or more notifications can include determining that the similarity metric is below a threshold level, and in response, generating a first notification to the user to confirm whether the user has fallen.
In some implementations, generating the one or more notifications can include receiving an input from the user indicating that the user has fallen, and in response, transmitting a second notification to a communication device remote from the mobile device. The second notification may include an indication that the user has fallen. In some implementations, the communication device may be an emergency response system.
In some implementations, the process 3400 may further include generating an additional data record based on the additional sensor data and including the additional data record in a database.
In some implementations, the database can be stored on a storage device of the mobile device.
In some implementations, the process 3400 may further include generating, by the mobile device, one or more clusters of data records based on similarities between the data records. One or more clusters may be generated using K-means clustering.
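For concreteness, a compact K-means routine over fixed-length feature vectors (e.g., peak acceleration, impact duration, and orientation change per record) is sketched below. The feature extraction, the choice of k, and the random initialization are assumptions for illustration:

```swift
import Foundation

// Minimal K-means over fixed-length feature vectors extracted from impact
// records; the feature extraction and the value of k are assumptions.
func kMeans(_ points: [[Double]], k: Int, iterations: Int = 50) -> [Int] {
    guard k > 0, points.count >= k, let dim = points.first?.count else {
        return [Int](repeating: 0, count: points.count)
    }
    func dist2(_ a: [Double], _ b: [Double]) -> Double {
        zip(a, b).reduce(0) { $0 + ($1.0 - $1.1) * ($1.0 - $1.1) }
    }
    var centroids = Array(points.shuffled().prefix(k))
    var assignment = [Int](repeating: 0, count: points.count)
    for _ in 0..<iterations {
        // Assignment step: each record joins its nearest centroid's cluster.
        for (i, p) in points.enumerated() {
            assignment[i] = (0..<k).min {
                dist2(p, centroids[$0]) < dist2(p, centroids[$1])
            }!
        }
        // Update step: each centroid becomes the mean of its members.
        for c in 0..<k {
            let members = points.indices.filter { assignment[$0] == c }
            guard !members.isEmpty else { continue }
            var mean = [Double](repeating: 0, count: dim)
            for i in members {
                for d in 0..<dim { mean[d] += points[i][d] }
            }
            centroids[c] = mean.map { $0 / Double(members.count) }
        }
    }
    return assignment
}
```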
In some implementations, at least some of the one or more first sensors or the one or more second sensors can be disposed on or in the mobile device.
In some implementations, at least some of the one or more first sensors or the one or more second sensors can be remote from the mobile device.
In some implementations, at least some of the one or more first sensors may be identical to at least some of the one or more second sensors.
Additional details regarding the process 3400 are described above (e.g., with reference to FIG. 28).
Fig. 35 shows another example process 3500 for determining whether a user has fallen and/or may require assistance using a mobile device. Process 3500 may be performed, for example, using one or more of the components of system 100 shown in fig. 1 (e.g., mobile device 102). In some cases, some or all of process 3500 may be performed by a co-processor of the mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 3500, a mobile device (e.g., mobile device 102 and/or one or more other components of system 100) receives motion data acquired by one or more sensors (step 3502). The one or more sensors are worn by the user.
The mobile device determines that the user has fallen at the first time based on the motion data (step 3504).
The mobile device determines whether the user has moved between the second time and a third time after the first time based on the motion data (step 3506).
In some implementations, the one or more sensors include one or more accelerometers. Further, the motion data may include one or more acceleration signals acquired by the one or more accelerometers. Further, determining whether the user has moved between the second time and the third time may include determining a change in the one or more acceleration signals between the second time and the third time.
In some implementations, the one or more sensors include one or more orientation sensors. Further, the motion data may include one or more orientation signals acquired by the one or more orientation sensors. Further, determining whether the user has moved between the second time and the third time may include determining a change in the one or more orientation signals between the second time and the third time.
In some implementations, determining whether the user has moved between the second time and the third time may include determining whether the user is walking between the second time and the third time.
In some implementations, determining whether the user has moved between the second time and the third time may include determining whether the user is standing between the second time and the third time.
Upon determining that the user has not moved between the second time and the third time, the mobile device initiates a communication to an emergency response service at a fourth time after the third time (step 3508). The communication includes an indication that the user has fallen and a location of the user.
In some implementations, the mobile device can refrain from initiating the communication to the emergency response service upon determining that the user has moved between the second time and the third time.
In some implementations, after initiating the communication to the emergency response service, the mobile device can receive a command from the user to terminate the communication. In response, the mobile device may terminate the communication.
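Taken together, steps 3504 through 3508 amount to a timed escalation with a user cancel, as in the hypothetical sketch below. The observation window, the movement check, and the `EmergencyCaller` hook are invented placeholders, not a real system API:

```swift
import Foundation

// Hypothetical escalation flow for steps 3504-3508; the movement check and
// the emergency-call hook are placeholders, not a real system API.
protocol EmergencyCaller {
    func initiateCall(message: String, location: String)
    func terminateCall()
}

final class FallResponder {
    private let caller: EmergencyCaller
    private var escalationTimer: Timer?

    init(caller: EmergencyCaller) { self.caller = caller }

    // Called when a fall is detected (the "first time"). The closure reports
    // whether movement was observed between the second and third times.
    func fallDetected(location: String, userMoved: @escaping () -> Bool) {
        escalationTimer = Timer.scheduledTimer(withTimeInterval: 60,
                                               repeats: false) { [weak self] _ in
            guard let self = self else { return }
            if userMoved() { return }  // user walked or stood: refrain from calling
            // Fourth time: no movement observed, so request emergency response.
            self.caller.initiateCall(message: "Fall detected; user has not moved",
                                     location: location)
        }
    }

    // User command to terminate the communication after it has been initiated.
    func userRequestedTermination() {
        escalationTimer?.invalidate()
        caller.terminateCall()
    }
}
```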
In some implementations, the mobile device can be a wearable mobile device. In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device. In some implementations, at least some of the one or more sensors can be remote from the mobile device.
Additional details regarding process 3500 are described above (e.g., with reference to fig. 26).
Fig. 36 shows another example process 3600 for determining whether a user has fallen and/or may require assistance using a mobile device. The process 3600 may be performed, for example, using one or more of the components of the system 100 shown in fig. 1 (e.g., the mobile device 102). In some cases, some or all of process 3600 may be performed by a coprocessor of a mobile device. The co-processor may be configured to receive motion data acquired from the one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
In process 3600, a mobile device (e.g., mobile device 102 and/or one or more other components of system 100) acquires sample data generated by one or more sensors over a period of time (step 3602). The one or more sensors are worn by the user.
The mobile device determines that the user has fallen based on the sample data (step 3604).
The mobile device determines the severity of the injury suffered by the user based on the sample data (step 3606).
The mobile device generates one or more notifications based on determining that the user has fallen and the determined severity of the injury (step 3608). In some implementations, generating the one or more notifications can include transmitting the first notification to a communication device remote from the mobile device. The first notification may include an indication that the user has fallen and an indication of the severity of the determined injury suffered by the user. The communication device may be an emergency response system.
In some implementations, the one or more sensors can include at least one of an accelerometer, an orientation sensor, or an altimeter. Further, the sample data may include motion data indicative of motion of the user over a period of time. Further, determining that the user has fallen may include determining, based on the motion data, a first impact experienced by the user during the time period, and determining, based on the motion data, a change in orientation of a portion of the user's body during the time period. The portion of the user's body may be (or may include) the user's wrist.
In some implementations, the motion data may include a first signal indicative of an acceleration measured by the accelerometer over a time period and a second signal indicative of an orientation measured by the orientation sensor over the time period. Further, determining the severity of the injury suffered by the user may include determining the severity of the first impact experienced by the user based on the first signal and the second signal.
In some implementations, determining the severity of the injury suffered by the user may include determining, based on the first signal and the second signal, that the user has experienced multiple impacts, including a first impact, during the time period.
In some implementations, the severity of the injury suffered by the user can be determined based on the similarity between the impacts experienced by the user. For example, a first set of characteristics associated with a first impact may be determined based on the first signal and the second signal. Further, a second set of characteristics associated with a second impact experienced by the user during the time period may be determined based on the first signal and the second signal. Further, a similarity between the first set of characteristics and the second set of characteristics may be determined. In some implementations, a greater degree of similarity between the characteristics of the first impact and the second impact may correspond to a lesser severity of injury. In some implementations, a lesser degree of similarity between the characteristics of the first and second impacts may correspond to a greater severity of injury.
In some implementations, the motion data may include a third signal indicative of the altitude measured by the altimeter over the period of time. Determining the severity of the injury suffered by the user may also include determining a distance over which the user has fallen within the time period based on the third signal. The severity of the injury may be determined based on the determined severity of the impact experienced by the user and the determined distance the user has fallen.
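The severity inputs named above (impact severity, inter-impact similarity, and fall distance) could be folded into a single score, as in the sketch below; the weights, saturation points, and clamping are invented for illustration:

```swift
import Foundation

// Hypothetical severity score combining the inputs described above.
// All weights and normalizations are placeholders for illustration.
func injurySeverityScore(peakImpactG: Double,        // from acceleration/orientation signals
                         impactSimilarity: Double,   // 0...1; 1 = repeated impacts look alike
                         fallDistanceMeters: Double  // from the altimeter signal
) -> Double {
    let impactTerm = min(peakImpactG / 10.0, 1.0)          // saturate at ~10 g
    let distanceTerm = min(fallDistanceMeters / 5.0, 1.0)  // saturate at ~5 m
    // Greater similarity between impacts suggests lesser severity (see above),
    // so similarity enters with a negative weight.
    let score = 0.5 * impactTerm + 0.4 * distanceTerm - 0.2 * impactSimilarity
    return min(max(score, 0), 1)  // clamp to [0, 1]
}
```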
In some implementations, the mobile device can be a wearable mobile device. In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device. In some implementations, at least some of the one or more sensors can be remote from the mobile device.
Additional details regarding process 3600 are described above (e.g., with reference to fig. 27).
Example Mobile Device
Fig. 37 is a block diagram of an example device architecture 3700 for implementing the features and processes described with reference to fig. 1-36. For example, the architecture 3700 may be used to implement one or more of the mobile device 102, the server computer system 104, and/or the communication device 106. Architecture 3700 may be implemented in any device for generating the features described with reference to fig. 1-36, including, but not limited to, a desktop computer, a server computer, a portable computer, a smartphone, a tablet, a gaming console, a wearable computer, a set-top box, a media player, a smart television, and so forth.
The architecture 3700 may include a memory interface 3702, one or more data processors 3704, one or more data coprocessors 3774, and a peripheral interface 3706. The memory interface 3702, the one or more processors 3704, the one or more coprocessors 3774, and/or the peripheral interface 3706 may be separate components or may be integrated into one or more integrated circuits. One or more communication buses or signal lines may couple the various components.
The one or more processors 3704 and/or the one or more coprocessors 3774 may cooperate to perform the operations described herein. For example, the one or more processors 3704 can include one or more Central Processing Units (CPUs) configured to act as the main computer processor of the architecture 3700 and to perform its generalized data processing tasks. Additionally, at least some of the data processing tasks may be offloaded to the one or more coprocessors 3774. For example, specialized data processing tasks (such as processing motion data, processing image data, encrypting data, and/or performing certain types of arithmetic operations) may be offloaded to one or more dedicated coprocessors 3774. In some cases, the one or more processors 3704 may be relatively more powerful than the one or more coprocessors 3774 and/or may consume more power than the one or more coprocessors 3774. This may be useful, for example, because it enables the one or more processors 3704 to quickly process generalized tasks while also offloading certain other tasks to the one or more coprocessors 3774 that may perform those tasks more efficiently and/or more effectively. In some cases, one or more coprocessors may include one or more sensors or other components (e.g., as described herein), and may be configured to process data acquired using these sensors or components and provide the processed data to the one or more processors 3704 for further analysis.
Sensors, devices, and subsystems can be coupled to peripherals interface 3706 to facilitate multiple functions. For example, motion sensor 3710, light sensor 3712, and proximity sensor 3714 may be coupled to the peripheral interface 3706 to facilitate orientation, illumination, and proximity functions of the architecture 3700. For example, in some implementations, the light sensor 3712 can be utilized to help adjust the brightness of the touch surface 3746. In some implementations, the motion sensor 3710 can be used to detect movement and orientation of the device. For example, the motion sensor 3710 may include one or more accelerometers (e.g., to measure acceleration experienced by the motion sensor 3710 and/or the architecture 3700 over a period of time) and/or one or more compasses or gyroscopes (e.g., to measure orientation of the motion sensor 3710 and/or the mobile device). In some cases, the measurement information acquired by the motion sensor 3710 may be in the form of one or more time-varying signals (e.g., time-varying graphs of acceleration and/or orientation over a period of time). Additionally, display objects or media may be presented according to the detected orientation (e.g., according to a "portrait" orientation or a "landscape" orientation). In some cases, the motion sensor 3710 may be integrated directly into the co-processor 3774 configured to process measurements acquired by the motion sensor 3710. For example, the co-processor 3774 may include one or more accelerometers, compasses, and/or gyroscopes, and may be configured to acquire sensor data from each of these sensors, process the sensor data, and transmit the processed data to the one or more processors 3704 for further analysis.
Other sensors may also be connected to the peripheral interface 3706, such as temperature sensors, biometric sensors, or other sensing devices to facilitate related functions. For example, as shown in fig. 37, the architecture 3700 may include a heart rate sensor 3732 that measures the heart beat of the user. Similarly, these other sensors may also be integrated directly into one or more coprocessors 3774 configured to process measurements taken from those sensors.
A location processor 3715 (e.g., a GNSS receiver chip) may be connected to the peripherals interface 3706 to provide a geo-reference. Electronic magnetometer 3716 (e.g., an integrated circuit chip) can also be connected to peripherals interface 3706 to provide data that can be used to determine the direction of magnetic north. Thus, the electronic magnetometer 3716 can be used as an electronic compass.
Communication functions can be facilitated through one or more communication subsystems 3724. The communication subsystem 3724 may include one or more wireless and/or wired communication subsystems. For example, the wireless communication subsystem may include a radio frequency receiver and transmitter and/or an optical (e.g., infrared) receiver and transmitter. As another example, a wired communication system may include a port device (e.g., a Universal Serial Bus (USB) port) or some other wired port connection that may be used to establish a wired connection to other computing devices, such as other communication devices, network access devices, personal computers, printers, display screens, or other processing devices capable of receiving or transmitting data.
The specific design and implementation of the communication subsystem 3724 may depend on the one or more communication networks or the one or more media over which the architecture 3700 is intended to operate. For example, the architecture 3700 can include wireless communication subsystems designed to operate over a global system for mobile communications (GSM) network, a GPRS network, an Enhanced Data GSM Environment (EDGE) network, 802.x communication networks (e.g., Wi-Fi, Wi-Max), Code Division Multiple Access (CDMA) networks, NFC, and Bluetooth™ networks. The wireless communication subsystem may also include a host protocol such that the architecture 3700 may be configured as a base station for other wireless devices. As another example, the communication subsystem may use one or more protocols, such as the TCP/IP protocol, the HTTP protocol, the UDP protocol, and any other known protocol, to allow the architecture 3700 to synchronize with a host device.
An audio subsystem 3726 may be coupled to the speaker 3728 and one or more microphones 3730 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
I/O subsystem 3740 can include touch controller 3742 and/or one or more other input controllers 3744. The touch controller 3742 can be coupled to the touch surface 3746. Touch surface 3746 and touch controller 3742 can detect contact and movement or breaks thereof, for example, using any of a variety of touch sensitive technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch surface 3746. In one implementation, the touch surface 3746 may display virtual buttons or soft buttons and a virtual keyboard, which may be used by a user as input/output devices.
In some implementations, the architecture 3700 may present recorded audio and/or video files, such as MP3, AAC, and MPEG video files. In some implementations, the architecture 3700 may include the functionality of an MP3 player and may include pin connectors for connecting to other devices. Other input/output devices and control devices may be used.
The memory interface 3702 can be coupled to the memory 3750. Memory 3750 may include high-speed random access memory or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, or flash memory (e.g., NAND, NOR). The memory 3750 may store an operating system 3752, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system (such as VxWorks). Operating system 3752 may include instructions for handling basic system services and for performing hardware related tasks. In some implementations, the operating system 3752 can include a kernel (e.g., UNIX kernel).
Each of the instructions and applications identified above may correspond to a set of instructions for performing one or more functions described herein. The instructions need not be implemented as separate software programs, procedures or modules. Memory 3750 may include additional instructions or fewer instructions. Further, various functions of the device may be performed in hardware and/or software, including in one or more signal processing and/or Application Specific Integrated Circuits (ASICs).
The features may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. The features may be implemented in a computer program product tangibly embodied in an information carrier, e.g., in a machine-readable storage device, for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions of the described implementations by operating on input data and generating output.
The described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one input device, at least one output device, and at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system. A computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program can be written in any form of programming language, including compiled and interpreted languages (e.g., Objective-C, Java), and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores of any kind of computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data. Generally, a computer may communicate with a mass storage device to store data files. These mass storage devices may include magnetic disks, such as internal hard disks and removable magnetic disks; magneto-optical disks; and an optical disc. Storage devices suitable for tangibly embodying computer program instructions and data include: all forms of non-volatile memory including, for example, semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and memory may be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
To provide for interaction with a user, these features can be implemented on a computer having a display device, such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the author, and a keyboard and a pointing device, such as a mouse or a trackball, by which the author can provide input to the computer.
The features can be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them. The components of the system can be connected by any form or medium of digital data communication, such as a communication network. Examples of communication networks include a LAN, a WAN, and computers and networks forming the internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
One or more features or steps of the disclosed embodiments may be implemented using an Application Programming Interface (API). An API may define one or more parameters that are passed between a calling application and other software code (e.g., operating system, inventory program, functions) that provides a service, provides data, or performs an operation or computation.
An API may be implemented as one or more calls in program code that send or receive one or more parameters through a parameter list or other structure based on a calling convention defined in an API specification document. A parameter may be a constant, a key, a data structure, a target class, a variable, a data type, a pointer, an array, a list, or another call. The API calls and parameters may be implemented in any programming language. The programming language may define the vocabulary and calling conventions that a programmer will use to access the functions that support the API.
In some implementations, the API call can report to the application the device's capabilities to run the application, such as input capabilities, output capabilities, processing capabilities, power capabilities, communication capabilities, and the like.
As noted above, some aspects of the subject matter of this specification include the collection and use of data from various sources to improve the services that a mobile device can provide to a user. The present disclosure contemplates that, in some cases, the collected data may identify a particular location or address based on device usage. Such personal information data may include location-based data, addresses, subscriber account identifiers, or other identifying information.
The present disclosure also contemplates that entities responsible for the collection, analysis, disclosure, transmission, storage, or other use of such personal information data will comply with established privacy policies and/or privacy practices. In particular, such entities should enforce and adhere to privacy policies and practices that are recognized as meeting or exceeding industry or government requirements for maintaining the privacy and security of personal information data. For example, personal information from a user should be collected for legitimate and reasonable uses by an entity and not shared or sold outside of these legitimate uses. In addition, such collection should occur only after receiving the informed consent of the user. In addition, such entities should take any required steps to secure and protect access to such personal information data, and to ensure that others who are able to access the personal information data comply with their privacy policies and procedures. In addition, such entities may subject themselves to third party evaluations to prove compliance with widely accepted privacy policies and practices.
In the context of an ad delivery service, the present disclosure also contemplates embodiments in which a user selectively prevents use or access to personal information data. That is, the present disclosure contemplates that hardware elements and/or software elements may be provided to prevent or block access to such personal information data. For example, in the case of an ad delivery service, the techniques of the present invention may be configured to allow a user to opt-in or opt-out of participating in the collection of personal information data during registration with the service.
Thus, while the present disclosure broadly covers the use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that various embodiments may be implemented without the need to access such personal information data. That is, various embodiments of the present technology are not rendered inoperable by the lack of all or a portion of such personal information data. For example, content may be selected and delivered to a user by inferring preferences based on non-personal information data or an absolute minimum amount of personal information, such as content requested by a device associated with the user, other non-personal information available to a content delivery service, or publicly available information.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. Elements of one or more implementations may be combined, deleted, modified, or supplemented to form additional implementations. As another example, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided or steps may be eliminated from the described flows, and other components may be added to or removed from the described systems. Accordingly, other embodiments are within the scope of the following claims.
Claims (20)
1. A method, comprising:
receiving, by a mobile device, motion data acquired by one or more sensors over a period of time, wherein the one or more sensors are worn by a user;
determining, by the mobile device, an impact experienced by the user during the time period based on the motion data;
determining, by the mobile device, one or more characteristics of the user, wherein the one or more characteristics of the user include at least one of an age of the user, a gender of the user, or a vascular health of the user;
determining, by the mobile device, a fall risk for the user based on at least one of the age of the user, the gender of the user, or the vascular health of the user;
determining, by the mobile device, a likelihood that the user needs assistance after the impact based on the motion data, the one or more characteristics of the user, and the determined fall risk; and
generating, by the mobile device, one or more notifications based on the likelihood.
2. The method of claim 1, wherein the determined fall risk increases as the age of the user increases.
3. The method of claim 1, wherein the one or more characteristics of the user comprise a historical physical activity level of the user, and wherein the determined likelihood increases as the historical physical activity level of the user decreases.
4. The method of claim 3, wherein the historical physical activity level is indicative of at least one of:
the frequency of movement of the user prior to the impact, or
A movement intensity of the user prior to the impact.
5. The method of claim 1, wherein the determined likelihood increases as the vascular health of the user decreases.
6. The method of claim 1, wherein the vascular health of the user is determined based on a maximum oxygen uptake (VO2 max) of the user.
7. The method of claim 1, wherein the one or more characteristics of the user include a historical walking speed of the user, and wherein the determined likelihood increases as the historical walking speed of the user decreases.
8. The method of claim 1, wherein generating the one or more notifications comprises:
determining that the likelihood exceeds a threshold level; and
in response to determining that the likelihood exceeds the threshold level, generating the one or more notifications.
9. The method of claim 1, wherein generating the one or more notifications comprises:
transmitting a first notification to a communication device remote from the mobile device, the first notification including an indication that the user has fallen.
10. The method of claim 9, wherein the communication device is an emergency response system.
11. A system, comprising:
one or more processors; and
one or more non-transitory computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising:
receiving, by a mobile device, motion data acquired by one or more sensors over a period of time, wherein the one or more sensors are worn by a user;
determining, by the mobile device, an impact experienced by the user during the time period based on the motion data;
determining, by the mobile device, one or more characteristics of the user, wherein the one or more characteristics of the user include at least one of an age of the user, a gender of the user, or a vascular health of the user;
determining, by the mobile device, a fall risk for the user based on at least one of the age of the user, the gender of the user, or the vascular health of the user;
determining, by the mobile device, a likelihood that the user needs assistance after the impact based on the motion data, the one or more characteristics of the user, and the determined fall risk; and
generating, by the mobile device, one or more notifications based on the likelihood.
12. The system of claim 11, wherein the determined likelihood increases as the age of the user increases.
13. The system of claim 11, wherein the one or more characteristics of the user include a historical physical activity level of the user, and wherein the determined likelihood increases as the historical physical activity level of the user decreases.
14. The system of claim 13, wherein the historical physical activity level indicates a frequency of movement of the user prior to the impact, and wherein the historical physical activity level indicates an intensity of movement of the user prior to the impact.
15. The system of claim 11, wherein the determined likelihood increases as the vascular health of the user decreases.
16. The system of claim 11, wherein the vascular health of the user is determined based on a maximum oxygen uptake (VO2 max) of the user.
17. The system of claim 11, wherein the one or more characteristics of the user include a historical walking speed of the user, and wherein the determined likelihood increases as the historical walking speed of the user decreases.
18. The system of claim 11, wherein generating the one or more notifications comprises:
determining that the likelihood exceeds a threshold level; and
in response to determining that the likelihood exceeds the threshold level, generating the one or more notifications.
19. The system of claim 11, wherein generating the one or more notifications comprises:
transmitting a first notification to a communication device remote from the mobile device, the first notification including an indication that the user has fallen.
20. The system of claim 11, wherein the mobile device is a wearable mobile device, and wherein at least some of the one or more sensors are disposed on or in the wearable mobile device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/929,010 | 2020-07-14 | ||
US16/929,010 US11282361B2 (en) | 2017-09-29 | 2020-07-14 | Detecting falls using a mobile device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113936421A (en) | 2022-01-14
Family
ID=79274382
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110792593.XA Pending CN113936421A (en) | 2020-07-14 | 2021-07-14 | Detecting falls using a mobile device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113936421A (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9179864B2 (en) * | 2007-08-15 | 2015-11-10 | Integrity Tracking, Llc | Wearable health monitoring device and methods for fall detection |
CN105530865A (en) * | 2013-09-11 | 2016-04-27 | Koninklijke Philips N.V. | Fall detection system and method
CN107205679A (en) * | 2014-10-31 | 2017-09-26 | iRhythm Technologies, Inc. | Wireless physiological monitoring device and systems
US10147296B2 (en) * | 2016-01-12 | 2018-12-04 | Fallcall Solutions, Llc | System for detecting falls and discriminating the severity of falls
CN105769205A (en) * | 2016-02-23 | 2016-07-20 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Body information detection device and fall detection system
CN107123239A (en) * | 2017-05-10 | 2017-09-01 | Hangzhou Bozhong Information Technology Co., Ltd. | Intelligent shoe fall detection system and method
CN111132603A (en) * | 2017-09-29 | 2020-05-08 | Apple Inc. | Detecting falls using a mobile device
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111132603B (en) | Detecting falls using a mobile device | |
US12027027B2 (en) | Detecting falls using a mobile device | |
US11282362B2 (en) | Detecting falls using a mobile device | |
US11282361B2 (en) | Detecting falls using a mobile device | |
US11282363B2 (en) | Detecting falls using a mobile device | |
US20240315601A1 (en) | Monitoring user health using gait analysis | |
Zhao et al. | Recognition of human fall events based on single tri-axial gyroscope | |
Kepski et al. | Event‐driven system for fall detection using body‐worn accelerometer and depth sensor | |
CN113936422A (en) | Detecting falls using a mobile device | |
CN113936420B (en) | Detecting falls using a mobile device | |
CN113936421A (en) | Detecting falls using a mobile device | |
KR102719588B1 (en) | Detecting falls using a mobile device | |
Weng et al. | Fall detection based on tilt angle and acceleration variations | |
KR20240151879A (en) | Detecting falls using a mobile device | |
Carrington et al. | SpokeSense: developing a real-time sensing platform for wheelchair sports | |
Wang et al. | A wearable action recognition system based on acceleration and attitude angles using real-time detection algorithm | |
US20240041354A1 (en) | Tracking caloric expenditure using a camera | |
CN116386271A (en) | Fall early warning method, device, system and storage medium | |
Del Rosario | Convergence of smartphone technology and algorithms that estimate physical activity for cardiac rehabilitation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||