US20200100684A1 - Electronic Device that Computes Health Data - Google Patents
- Publication number: US20200100684A1
- Authority
- US
- United States
- Prior art keywords
- light
- user
- body part
- wearable device
- health data
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/0402
- A61B5/14551—Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters, for measuring blood gases
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
- A61B5/70—Means for positioning the patient in relation to the detecting, measuring or recording means
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes, for noise prevention, reduction or removal
- A61B5/7405—Details of notification to user or communication with user or patient; user input means, using sound
- A61B5/742—Details of notification to user or communication with user or patient; user input means, using visual displays
- A61B5/7455—Details of notification to user or communication with user or patient; user input means, characterised by tactile indication, e.g. vibration or electrical stimulation
- A61B2562/0257—Proximity sensors
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
- A61B5/021—Measuring pressure in heart or blood vessels
- A61B5/02416—Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
- A61B5/02438—Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
- A61B5/0245—Detecting, measuring or recording pulse rate or heart rate by using sensing means generating electric signals, i.e. ECG signals
- A61B5/0261—Measuring blood flow using optical means, e.g. infrared light
- A61B5/0537—Measuring body composition by impedance, e.g. tissue hydration or fat content
Definitions
- This disclosure relates generally to health data, and more specifically to an electronic device that computes health data.
- Health data may indicate emergency conditions or enable the user to maximize fitness or wellness activities.
- Typically, health data is provided to users by health care professionals. However, it may be beneficial for users to have more access to health data.
- An electronic device may include a camera, an ambient light sensor, and a proximity sensor.
- The electronic device may use one or more of the camera and the proximity sensor to emit light into a body part of a user touching a surface of the electronic device, and one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user.
- The electronic device may compute health data of the user based upon sensor data regarding the received light.
- The electronic device may also include one or more electrical contacts that contact one or more body parts of the user. In such implementations, the health data may be further computed based on an electrical measurement obtained using the electrical contacts.
- The electronic device may utilize the camera to determine that the user's body part is misaligned with the camera, the ambient light sensor, and the proximity sensor for purposes of detecting information about the body part of the user. In such implementations, the electronic device may provide guidance to correct the misalignment.
- A mobile personal computing device may include a camera, an ambient light sensor, a proximity sensor, and a processing unit communicably coupled to the camera, the ambient light sensor, and the proximity sensor.
- The processing unit may be configured to: use at least one of the camera and the proximity sensor to emit light into a body part of a user touching a surface of the mobile personal computing device; use at least one of the camera, the ambient light sensor, or the proximity sensor to receive at least part of the emitted light reflected by the body part of the user and generate sensor data; and compute health data of the user using at least the sensor data regarding the received light.
- A method for using a mobile personal computing device to obtain health data may include: using at least one of a camera and a proximity sensor to emit light into a body part of a user touching a surface of the device; using at least one of the camera, an ambient light sensor, or the proximity sensor to receive at least part of the emitted light reflected by the body part of the user and generate sensor data; and computing health data of the user, utilizing a processing unit, using at least the sensor data regarding the received light.
- A method for guiding use of a mobile personal computing device to obtain health data may include: detecting, utilizing a camera, a profile of a body part of a user contacting the camera; determining, using the profile, whether the body part is misaligned with a combination of the camera, an ambient light sensor, and a proximity sensor for purposes of obtaining health data for the user; and providing guidance to correct the misalignment.
- A computer program product including a non-transitory storage medium may include: a first set of instructions, stored in the non-transitory storage medium, executable by at least one processing unit to use at least one of a camera and a proximity sensor to emit light into a body part of a user touching a surface of a mobile personal computing device; a second set of instructions, stored in the non-transitory storage medium, executable by the at least one processing unit to use at least one of the camera, an ambient light sensor, or the proximity sensor to receive at least part of the emitted light reflected by the body part of the user and generate sensor data; and a third set of instructions, stored in the non-transitory storage medium, executable by the at least one processing unit to compute health data of the user using at least the sensor data regarding the received light.
- FIG. 1 is an isometric view of an example system for obtaining health data utilizing an electronic device.
- FIG. 2 illustrates the view of FIG. 1 while the example system is being utilized to obtain health data.
- FIG. 3 illustrates the view of FIG. 2 while the example system is providing guidance to obtain health data.
- FIG. 4 illustrates the view of FIG. 2 while the example system is providing the obtained health data.
- FIG. 5 is a flow chart illustrating an example method for using an electronic device to obtain health data. This method may be performed by the system of FIG. 1.
- FIG. 6 is a flow chart illustrating an example method for guiding use of an electronic device to obtain health data. This method may be performed by the system of FIG. 1.
- FIG. 7 is a block diagram illustrating functional relationships among components of the example system of FIG. 1.
- An electronic device (such as a smart phone, tablet computer, mobile computer, digital media player, wearable device, or other electronic device) may include a camera, an ambient light sensor, and a proximity sensor.
- The electronic device may use one or more of the camera and the proximity sensor to emit light into a body part of a user (such as a finger, an ear, and so on) touching a surface of the electronic device.
- The electronic device may use one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user.
- The electronic device may compute health data of the user based upon sensor data regarding the received light. In this way, the health data of the user may be detected utilizing an electronic device including a camera, ambient light sensor, and proximity sensor without requiring the user to obtain access to a dedicated fitness and/or wellness device.
- The camera, ambient light sensor, and proximity sensor may be positioned such that they are all at least partially covered (and/or contacted) by the user's body part at the same time, such as when the health data is computed.
- The electronic device may also include electrical contacts.
- The health data of the user may also be computed using an electrical measurement obtained from the electrical contacts.
- The electrical contacts may be positioned to contact the body part of the user and an additional body part such that the electrical measurement represents the electrical properties of organs or portions of the body located between the two contacting body parts.
- For example, the two body parts may be the user's left and right hands, and the electrical measurement may correspond to an electrical property that is measured across the user's chest.
- The electronic device may utilize the camera to determine that the user's body part is misaligned with the camera, the ambient light sensor, and the proximity sensor for purposes of detecting information about the body part of the user.
- The electronic device may provide guidance (such as visual, audio, haptic, and/or other guidance) to correct the misalignment.
- The information from the camera may be utilized to detect this misalignment even in implementations where the camera is configured with a focal distance greater than the distance between the camera and the user's body part when the user's body part is touching the surface of the electronic device.
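The patent describes the misalignment detection and guidance only in functional terms. A minimal illustrative sketch (not taken from the patent) could locate the centroid of the region the body part covers in a camera frame (even a defocused one), compare it to where the sensor cluster expects the body part, and emit a directional hint. The frame format, threshold, and target position below are all assumptions for illustration.

```python
def alignment_guidance(frame, target, tolerance=2.0, threshold=0.5):
    """Return a guidance string, or None if the body part is aligned.

    `frame` is a 2D list of brightness values in [0, 1]; pixels the
    body part covers are assumed bright (illuminated by the emitter).
    `target` is the (row, col) where the body part should sit.
    """
    covered = [
        (r, c)
        for r, row in enumerate(frame)
        for c, value in enumerate(row)
        if value >= threshold
    ]
    if not covered:
        return "place finger over the sensors"
    # Centroid of the covered region.
    row = sum(r for r, _ in covered) / len(covered)
    col = sum(c for _, c in covered) / len(covered)
    dr, dc = target[0] - row, target[1] - col
    if abs(dr) <= tolerance and abs(dc) <= tolerance:
        return None  # aligned; measurement can proceed
    vertical = "down" if dr > 0 else "up"
    horizontal = "right" if dc > 0 else "left"
    return f"move finger {vertical if abs(dr) >= abs(dc) else horizontal}"
```

In a device, the returned string would drive the visual, audio, or haptic guidance mentioned above rather than be shown verbatim.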
- The proximity sensor may be a multiple light wavelength sensor (such as a sensor that utilizes infrared and visible light, infrared and red light, and so on).
- The ambient light sensor may be a silicon ambient light sensor, an indium gallium arsenide ambient light sensor, or another kind of ambient light sensor.
- The camera may be both an infrared and visible light camera.
- The health data may include one or more of a variety of different wellness, fitness, and/or other parameters relating to the health of a user.
- For example, the health data may include: a blood pressure index, a blood hydration, a body fat content, an oxygen saturation, a pulse rate, a perfusion index, an electrocardiogram, a photoplethysmogram, and/or any other such health data.
- The electronic device may provide the computed health data to the user.
- FIG. 1 is an isometric view of an example system 100 for obtaining health data utilizing an electronic device.
- The system may include an electronic device 101.
- The electronic device is shown as a smart phone. However, it is understood that this is an example.
- The electronic device may be any kind of electronic device, such as any kind of mobile personal computing device (such as a smart phone, a tablet computer, a mobile computer, a digital media player, a cellular telephone, a laptop computer, a wearable device, and so on), a desktop computer, a display, and/or any other electronic device.
- The electronic device 101 may include a housing 102 with a surface 103 where a camera 104, an ambient light sensor 105, and a proximity sensor 106 are positioned.
- The camera, ambient light sensor, and proximity sensor may be positioned such that they are partially or entirely covered (and/or contacted) by the body part 202 of a user (illustrated as a finger, though such a body part may be an ear, a palm, and/or other body part of the user) at the same time.
- The electronic device may compute health data for the user.
- A camera may capture images using a visible light imaging sensor and a lens focused at a focal distance away from the lens.
- An ambient light sensor may use a broad range photodiode or similar non-imaging light detector to determine ambient light conditions.
- A proximity sensor may use a limited range light source (such as an infrared light emitting diode, or "LED") to emit limited range light, and a limited range non-imaging light detector to detect whether the emitted limited range light is reflected by one or more objects in order to determine whether such an object is proximate to the proximity sensor.
- The electronic device 101 may use the camera 104, the ambient light sensor 105, and the proximity sensor 106 in non-traditional ways to detect information about the body part 202.
- The electronic device 101 may use one or more of the camera 104 and the proximity sensor 106 to emit light into a body part 202 of a user touching a surface 103 of the electronic device.
- The electronic device may use one or more of the camera, the ambient light sensor 105, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user.
- The electronic device may compute health data of the user based upon sensor (such as the camera, the ambient light sensor, and/or the proximity sensor) data regarding the received light.
- One or more of the camera 104, the ambient light sensor 105, and the proximity sensor 106 may receive light reflected off of the body part 202 of the user.
- Such light may originate from one or more of the camera (in implementations where the camera includes a light source such as an LED used as a flash), the ambient light sensor (which may be a non-imaging photodiode in some implementations), the proximity sensor (such as in implementations where the proximity sensor includes a non-imaging photodiode and one or more LEDs that determine proximity by measuring the time between transmission of light by the LED and receipt of the light by the non-imaging photodiode after reflection off of an object such as the body part 202 of the user), and/or another light source.
- The electronic device 101 may analyze sensor data regarding the received light and compute information such as the light absorption of the body part.
- Various health data for the user may be computed from the computed light absorption of the body part.
- Sensor data regarding the received light may also be used to estimate changes in the volume of the body part 202 of the user.
- Some light is reflected, some light is scattered, and some light is absorbed, depending on what the light encounters.
- Blood may absorb light more than surrounding tissue, so less reflected light may be sensed when more blood is present.
- The user's blood volume generally increases and decreases with each heartbeat.
- Analysis of sensor data regarding the reflected light may therefore reveal changes in blood volume and thus allow health data such as oxygen saturation, pulse rate, perfusion index, and the like to be computed.
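The blood-volume reasoning above is the basis of photoplethysmography (PPG). The patent does not give an algorithm, but a toy sketch of a pulse-rate computation from a reflected-light signal might smooth the signal and count the dips caused by each heartbeat; the sampling rate and signal here are hypothetical.

```python
def estimate_pulse_rate(samples, sample_rate_hz):
    """Estimate pulse rate (BPM) from a reflected-light signal.

    Each heartbeat momentarily raises blood volume, which absorbs
    more light, so reflected-light intensity dips once per beat.
    This toy version counts local minima of a smoothed signal.
    """
    # Trailing moving-average smoothing to suppress sensor noise.
    window = max(1, sample_rate_hz // 10)
    smoothed = [
        sum(samples[max(0, i - window):i + 1])
        / len(samples[max(0, i - window):i + 1])
        for i in range(len(samples))
    ]
    # Count local minima: one dip per heartbeat.
    beats = sum(
        1
        for i in range(1, len(smoothed) - 1)
        if smoothed[i] < smoothed[i - 1] and smoothed[i] <= smoothed[i + 1]
    )
    duration_s = len(samples) / sample_rate_hz
    return beats * 60.0 / duration_s
```

A production implementation would add band-pass filtering and motion-artifact rejection; this sketch only shows the dip-counting idea.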
- One or more images of the body part 202 of the user captured by the camera 104 may be analyzed to compute various health data for the user.
- The camera may be an infrared camera and/or a combined visible light and infrared camera.
- Infrared data in the image may be analyzed to compute the temperature of the body part, changing blood flow in the body part, and so on.
- The ambient light sensor and/or proximity sensor may also be utilized to obtain such infrared data regarding the body part.
- Various information may be obtained regarding the body part 202 utilizing data from the camera 104, the ambient light sensor 105, and the proximity sensor 106.
- Such information may be utilized in a variety of different ways.
- Each of the camera, the ambient light sensor, and the proximity sensor may capture sensor data regarding light absorption of the body part 202.
- The light absorption represented by the light received by each may be different based on the particular sensor strengths and/or weaknesses of the respective device.
- The sensor data related to light absorption from each may be compared to the others and/or combined in order to obtain a more accurate, single light absorption measurement.
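One simple way to realize the comparison and combination described above is an inverse-variance weighted average of the three per-sensor absorption estimates. The weighting scheme, sensor names, and numbers below are illustrative assumptions, not taken from the patent.

```python
def combine_absorption(readings):
    """Combine per-sensor light-absorption estimates into one value.

    `readings` maps a sensor name to (estimate, variance), where the
    variance stands in for that sensor's particular strengths and
    weaknesses. Lower-variance (more trusted) sensors get more weight.
    """
    total_weight = sum(1.0 / var for _, var in readings.values())
    return sum(est / var for est, var in readings.values()) / total_weight
```

For example, `combine_absorption({"camera": (0.40, 0.04), "ambient_light": (0.50, 0.01), "proximity": (0.46, 0.02)})` weights the ambient light sensor most heavily because it is (hypothetically) the least noisy.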
- Sensor data from one or more of the camera 104, the ambient light sensor 105, and the proximity sensor 106 may be used to adjust information from one or more others of the camera, the ambient light sensor, and the proximity sensor.
- For example, the proximity sensor may be utilized to obtain sensor data related to light absorption of the body part 202, and the camera may be utilized to determine the specific area of the body part the information relates to. Light absorption may be interpreted differently in computing health data for different areas of the body part (such as where the area of the body part is hairless versus containing hair, or where the area is highly callused as opposed to non-callused).
- The sensor data from the camera regarding the specific area of the body part being analyzed may be utilized to adjust the sensor data related to light absorption obtained from the proximity sensor, accounting for the specific characteristics of the area that may influence interpretation of light absorption for computing health data for the user.
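As a sketch of the adjustment described above, a camera-derived label for the skin area could select a correction factor applied to the proximity sensor's raw absorption reading. The area labels and factor values below are hypothetical, purely to illustrate the cross-sensor adjustment.

```python
# Hypothetical correction factors: hair and callus alter how much
# light the area absorbs, so raw readings over those areas are
# rescaled before being interpreted as health data.
AREA_CORRECTION = {
    "hairless": 1.00,
    "hairy": 0.85,
    "callused": 0.80,
}

def adjust_absorption(raw_absorption, area_label):
    """Adjust a proximity-sensor absorption reading for the skin
    area identified by the camera (e.g. hairy vs. hairless);
    unrecognized areas are left unadjusted."""
    return raw_absorption * AREA_CORRECTION.get(area_label, 1.0)
```

A real device would derive such factors from calibration data rather than a fixed table.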
- The electronic device 101 may also include electrical contacts, such as electrical contacts 107a and 107b disposed on an exterior surface of the electronic device.
- The electronic device 101 may compute health data of the user based upon sensor data obtained from the camera 104, the ambient light sensor 105, and the proximity sensor 106, as well as an electrical measurement obtained using the electrical contacts.
- The electrical contacts 107a and 107b may be positioned to contact the body part 202 of the user (such as during the time when the information is being detected) and/or an additional body part 201 of the user. For example, as shown, a finger of the user may contact a top electrical contact 107a while a palm of the user contacts a bottom electrical contact 107b.
- The electrical contacts may be configured to contact other body parts of the user (such as an ear, a cheek, and so on) without departing from the scope of the present disclosure.
- The electrical contacts 107a and 107b may be positioned to contact the body part 202 of the user and an additional body part of the user such that the electrical measurement obtained using the electrical contacts corresponds to an electrical characteristic across the user's chest. For example, as shown, a finger of the user's left hand may contact the top electrical contact 107a while the right palm of the user (connected to the finger through the user's chest) contacts the bottom electrical contact 107b. Positioning the electrical contacts in this way may enable information related to health data (such as an electrocardiogram) to be obtained that might not otherwise be possible absent such positioning.
- Electrical measurements may be taken via the electrical contacts 107a and 107b (which may respectively be configured as positive and negative terminals) and used to detect electrical activity of the user's body.
- Such electrical measurements may be used (in some cases along with analysis of the received light) to measure heart function, compute an electrocardiogram, compute a galvanic skin response that may be indicative of emotional state and/or other physiological conditions, and/or compute other health data such as body fat or blood pressure.
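For instance, electrical samples from the contacts could be turned into a heart-rate estimate by detecting R-peaks, the dominant voltage spike of each heartbeat in an ECG-like signal. The fixed-fraction threshold rule below is an illustrative assumption; the patent specifies no particular detector.

```python
def heart_rate_from_ecg(samples, sample_rate_hz):
    """Estimate heart rate (BPM) from an ECG-like voltage signal
    measured between two electrical contacts.

    R-peaks are taken to be local maxima above half the signal's
    maximum value (a deliberately simple rule).
    """
    threshold = 0.5 * max(samples)
    peaks = [
        i
        for i in range(1, len(samples) - 1)
        if samples[i] > threshold
        and samples[i] >= samples[i - 1]
        and samples[i] > samples[i + 1]
    ]
    if len(peaks) < 2:
        return None  # not enough beats to form an interval
    # Average interval between successive R-peaks, in seconds.
    intervals = [(b - a) / sample_rate_hz for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```

Clinical-grade ECG processing would also need baseline-wander removal and adaptive thresholds; this sketch shows only the interval arithmetic.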
- FIG. 1 illustrates a specific configuration including the camera 104 , the ambient light sensor 105 , the proximity sensor 106 , and the electrical contacts 107 a and 107 b , it is understood that this in an example. In various implementations other configurations are possible and contemplated without departing from the scope of the present disclosure.
- the ambient light sensor 105 and the proximity sensor 106 are illustrated and described as separated sensors. However, in some implementations the ambient light sensor and the proximity sensor may be incorporated into a single, unified sensor that may detect both ambient light and proximity without departing from the scope of the present disclosure.
- the proximity sensor 106 may operate utilizing a single wavelength of light, such as the infrared portion of the light spectrum.
- the proximity sensor (and/or the camera 104 and/or the ambient light sensor 105 ) may be a multiple wavelength proximity sensor that operates utilizing multiple wavelengths of light.
- the proximity sensor 106 may operate utilizing infrared and visible light (such as red light).
- the proximity sensor may include an infrared LED for producing infrared light and a red LED for producing red light.
- Sensor data obtained utilizing different wavelengths of light may be different based on the particular detection strengths and/or weaknesses of the respective wavelength.
- the information detected utilizing the various wavelengths may be combined and/or utilized to adjust each other in order to obtain greater accuracy.
- dark and light hairs may have different light absorption due to their different pigmentation regardless of their other physical characteristics.
- a more accurate light absorption measurement that accounts for such color differences may be possible, such that detecting light absorption of different colored hairs does not result in inaccurate measurements.
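A minimal sketch of how readings at two wavelengths might be combined so that a wavelength-independent factor (such as overall pigmentation darkness) cancels out; the function and values are illustrative assumptions, not part of the disclosure:

```python
def normalized_absorption(ir_reading, red_reading):
    """Form a ratio of readings at two wavelengths. Any attenuation that
    scales both channels equally (e.g. overall pigmentation darkness)
    divides out, leaving a wavelength-dependent quantity."""
    if ir_reading == 0:
        raise ValueError("infrared reading must be nonzero")
    return red_reading / ir_reading

# A common scaling factor k applied to both channels cancels:
k = 0.5
assert normalized_absorption(0.8, 0.4) == normalized_absorption(0.8 * k, 0.4 * k)
```

This is the basic reason multi-wavelength sensing can be more robust than a single-wavelength reading: per-subject factors that affect all wavelengths similarly drop out of the ratio.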
- the ambient light sensor 105 may be a silicon ambient light sensor, such as a silicon non-imaging photodiode. In other implementations, the ambient light sensor 105 may be an indium gallium arsenide ambient light sensor, such as an indium gallium arsenide non-imaging photodiode. In various implementations, use of an indium gallium arsenide non-imaging photodiode may allow for detection of a larger spectrum of light than use of a silicon non-imaging photodiode.
- An indium gallium arsenide non-imaging photodiode may not typically be used as an ambient light sensor, as it may be more expensive than a silicon non-imaging photodiode that may adequately determine ambient light conditions by detecting a more limited spectrum of light.
- a variety of different health data for the user may be computed based at least thereon.
- the health data may include one or more of a variety of different wellness, fitness, and/or other parameters relating to the health of a user such as: a blood pressure index, a blood hydration, a body fat content, an oxygen saturation, a pulse rate, a perfusion index, an electrocardiogram, a photoplethysmogram, and/or any other such health data.
- FIG. 7 is a block diagram illustrating functional relationships among components of the example system 100 of FIG. 1 .
- the electronic device 101 may include one or more processing units 701 , one or more non-transitory storage media 702 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication components 703 (such as a Wi-Fi or other antenna that may be utilized to transmit computed health data for the user), one or more input/output components 704 , a display 108 (which may be utilized to present computed health data for the user), the camera 104 , the ambient light sensor 105 , the proximity sensor 106 , and/or the electrical contacts 107 a and 107 b .
- the electronic device 101 may omit one or more of these components and/or utilize one or more additional components not shown.
- the electronic device 101 may provide guidance to the user for aligning the user's body part 202 with the camera 104 , the ambient light sensor 105 , the proximity sensor 106 , and/or the electrical contacts 107 a and 107 b .
- Such correct alignment may aid in utilizing the camera, the ambient light sensor, the proximity sensor, and/or the electrical contacts in detecting the information regarding the body part of the user.
- misalignment of the user's body part with the camera, the ambient light sensor, the proximity sensor, and/or the electrical contacts for purposes of obtaining the information may reduce the accuracy of the information and/or prevent detection of the information.
- the guidance may aid in the detection of the information and/or the computing of the health data.
- FIG. 3 illustrates the view of FIG. 2 while the example system 100 is providing guidance to obtain health data.
- the electronic device 101 provides a current body part position indicator 301 and a goal position indicator 302 .
- a user may compare the visual positions of the current body part position indicator and the goal position indicator to determine how to move the user's body part 202 into correct alignment.
- the user may move the user's body part down and to the right, aligning the current body part position indicator with the goal position indicator 302 , to move the user's body part into correct alignment.
- the electronic device 101 may also provide a status indicator 303 that indicates a progress 304 of obtaining the information. In this way, the user may be alerted to how long the user should stay in position once the user aligns the user's body part so that the information may be detected.
- the camera 104 may be utilized to detect the position of the user's body part for purposes of determining alignment/misalignment.
- the camera may be configured to detect this information even in implementations where the camera is configured with a focal distance greater than the distance from the camera to the user's body part 202 shown, as less than fully focused image quality may be adequate for determining alignment/misalignment.
- the ambient light sensor 105 , the proximity sensor 106 , the electrical contacts 107 a and 107 b , and/or other components may be utilized instead of and/or in addition to the camera for determining alignment/misalignment of the user's body part.
- FIG. 3 illustrates the electronic device 101 providing guidance output graphically using a visual output component
- this is an example.
- output may be provided in one or more of a variety of different ways.
- audio guidance instructions may be provided utilizing an audio output component and/or vibration guidance instructions may be provided utilizing a haptic output component without departing from the scope of the present disclosure.
- FIG. 4 illustrates the view of FIG. 2 while the example system 100 is providing the obtained health data.
- a variety of different health data may be presented.
- FIG. 4 illustrates the electronic device 101 providing the health data graphically using a visual output component, it is understood that this is an example.
- such health data may be provided in one or more of a variety of different ways, such as audibly utilizing an audio output component, without departing from the scope of the present disclosure.
- the health data may be communicated to another electronic device (such as a health data database maintained by a doctor and/or other medical or health provider) utilizing a communication component.
- FIG. 5 is a flow chart illustrating an example method 500 for using an electronic device to obtain health data. This method may be performed by the system of FIG. 1 .
- the flow may begin at block 501 where at least one of a camera and a proximity sensor may be used to emit light into a body part of a user touching a surface of the electronic device.
- the flow may proceed to block 502 where at least one of the camera, an ambient light sensor, or the proximity sensor may be used to receive at least part of the emitted light reflected by the body part of the user and generate sensor data.
- the flow may then proceed to block 503 where health data of the user may be computed using at least the sensor data regarding the received light.
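One simple form of such a computation (a hedged sketch; the sampling rate and minima-counting approach are assumptions, not from the disclosure) estimates a pulse rate from the reflected-light signal, in which each heartbeat appears as a dip in reflected intensity because blood absorbs more light than surrounding tissue:

```python
import math

def pulse_rate_bpm(samples, sample_rate_hz):
    """Estimate pulse rate from a reflected-light signal: count strict
    local minima (dips caused by each heartbeat's blood-volume peak)
    and convert the count to beats per minute. A real implementation
    would filter noise before counting."""
    beats = 0
    for i in range(1, len(samples) - 1):
        if samples[i] < samples[i - 1] and samples[i] < samples[i + 1]:
            beats += 1
    duration_s = len(samples) / sample_rate_hz
    return 60.0 * beats / duration_s

# Synthetic 1.2 Hz pulse sampled at 50 Hz for 10 seconds -> 72 bpm
signal = [math.sin(2 * math.pi * 1.2 * t / 50) for t in range(500)]
print(pulse_rate_bpm(signal, 50))  # -> 72.0
```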
- the computed health data for the user may be provided.
- the computed health data for the user may be provided to the user.
- Such providing may be performed using one or more visual output components such as a display, audio output components such as a speaker, haptic output components, and so on.
- the proximity sensor may be used to emit light into the user's body part
- the ambient light sensor and the camera may be used to receive at least part of the emitted light reflected by the user's body part
- electrical contacts may be used to obtain electrical measurements from the skin of the user's body part.
- a blood pressure index, a body fat content, and an electrocardiogram may be computed using data from the ambient light sensor, the camera, and the electrical contacts.
- the proximity sensor may be a multiple light wavelength proximity sensor that utilizes infrared and visible light and the ambient light sensor may be an indium gallium arsenide ambient light sensor.
- the proximity sensor may be used to emit light into the user's body part, the ambient light sensor and the camera may be used to receive at least part of the emitted light reflected by the user's body part, and electrical contacts may be used to obtain electrical measurements from the skin of the user's body part.
- a blood hydration may be computed using data from the ambient light sensor, the camera, and the electrical contacts.
- the proximity sensor may be used to emit light into the user's body part and the ambient light sensor and the camera may receive at least part of the emitted light reflected by the user's body part.
- an oxygen saturation, a pulse rate, a perfusion index, and a photoplethysmogram may be computed using data from the ambient light sensor and the camera.
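One common way an oxygen saturation could be computed from two-wavelength data is the classic pulse-oximetry "ratio of ratios"; this is a hedged sketch, and the calibration constants below are widely cited placeholders rather than values from the disclosure:

```python
def spo2_ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
    """Compare the pulsatile (AC) and steady (DC) components of reflected
    red and infrared light, then map the resulting ratio to an oxygen
    saturation percentage via an empirical linear fit. The constants
    110 and 25 are textbook placeholders; real devices calibrate them
    per sensor."""
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    return 110.0 - 25.0 * r

# Equal modulation at both wavelengths (r = 1) maps to 85% here:
print(spo2_ratio_of_ratios(0.02, 1.0, 0.02, 1.0))  # -> 85.0
```

The AC components would come from the heartbeat-driven variation in the received light, and the DC components from its slowly varying baseline.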
- example method 500 is illustrated and described above as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
- block 503 is illustrated and described as providing the computed health data for the user. However, in various implementations this operation may be omitted. In some examples of such an implementation, the computed health data for the user may be stored for later use as opposed to being provided to the user.
- FIG. 6 is a flow chart illustrating an example method 600 for guiding use of an electronic device to obtain health data. This method may be performed by the system of FIG. 1 .
- the flow may begin at block 601 where at least a profile of a body part of a user (such as the outline, location, or orientation) contacting the camera may be detected using the camera.
- the flow may proceed to block 602 where it is determined based on the detection that the user's body part is misaligned with a combination of the camera, an ambient light sensor, and a proximity sensor for purposes of obtaining health data for the user.
- Detection of the user's body part may include comparing the profile to data representing a correct alignment. For example, an image of the profile of the user's body part may be captured and compared to a sample image representing what the image of the profile of the user's body part should look like if the user's body part is correctly aligned. A mismatch may indicate that the user's body part is misaligned.
- guidance to correct the misalignment may be provided.
- the differences between the two images may be utilized to determine guidance to provide.
- if the image of the profile of the user's body part shows the body part further to the left than the sample image, it may be determined that the user should move the user's body part to the right.
- Such guidance may be provided using one or more visual output components such as a display, audio output components such as a speaker, haptic output components such as a vibrator, and so on.
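As a hedged sketch of the comparison step (the coordinate convention and tolerance are assumptions, not specified in the disclosure), the offset between the detected profile's centroid and the sample's centroid can be translated into a movement instruction:

```python
def guidance_from_centroids(current_xy, goal_xy, tolerance=2.0):
    """Compare the centroid of the detected body-part profile with the
    centroid of the correctly aligned sample and translate the offset
    into a movement instruction. Uses screen coordinates where y
    increases downward."""
    dx = goal_xy[0] - current_xy[0]
    dy = goal_xy[1] - current_xy[1]
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return "aligned"
    parts = []
    if abs(dx) > tolerance:
        parts.append("right" if dx > 0 else "left")
    if abs(dy) > tolerance:
        parts.append("down" if dy > 0 else "up")
    return "move " + " and ".join(parts)

# Current profile up and to the left of the goal -> move right and down
print(guidance_from_centroids((10, 5), (30, 25)))  # -> "move right and down"
```

The same offset could equally drive an audio or haptic output component instead of a displayed instruction.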
- a user may place his finger on the camera.
- An image may be taken of the profile of the user's finger and compared to a sample image of what the profile of the user's finger should look like if correctly aligned with a combination of the camera, an ambient light sensor, and a proximity sensor for purposes of obtaining health data for the user. Comparison of the two images may indicate that the two images do not match and the user's finger is not correctly aligned.
- the image of the profile of the user's finger may be further up and to the right of the sample image.
- a correct placement indicator and a current placement indicator may be displayed to the user where the current placement indicator is displayed further up and to the right of the correct placement indicator. In this way, the user can see that to correctly align the user's finger the user should move the user's finger down and to the left.
- the user may move the user's finger based on the provided guidance.
- a new image may be taken of the current profile of the user's finger and compared to the sample image. Comparison of the two images may indicate that the two images, though closer, still do not match and the user's finger is still not correctly aligned.
- the image of the profile of the user's finger may be less but still further up and to the right of the sample image.
- the current placement indicator may be displayed moved closer but still further up and to the right of the correct placement indicator. In this way, the user can see that to correctly align the user's finger the user should move the user's finger still further down and to the left.
- the process in this example may be repeated until an image of the profile of the user's finger matches the sample image.
- the current placement indicator may then be displayed over the correct placement indicator to indicate to the user that the user's finger is correctly aligned and to not move further until health data is obtained.
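The repeated capture-compare-guide cycle described above can be modeled as a simple feedback loop; the step size and tolerance below are illustrative assumptions, not parameters from the disclosure:

```python
def align_loop(start, goal, step=0.5, tolerance=1.0, max_iters=50):
    """Repeatedly compare the current profile position to the goal and
    nudge it a fraction of the remaining offset, modeling a user who
    follows the on-screen indicators until alignment is achieved."""
    x, y = start
    for _ in range(max_iters):
        dx, dy = goal[0] - x, goal[1] - y
        if abs(dx) <= tolerance and abs(dy) <= tolerance:
            return True, (x, y)  # indicators overlap; hold still
        x += step * dx  # user moves partway toward the goal each cycle
        y += step * dy
    return False, (x, y)  # gave up before converging

aligned, final = align_loop((0.0, 0.0), (20.0, 20.0))
print(aligned)  # -> True
```

Because each cycle removes a fixed fraction of the remaining offset, the loop converges geometrically for any reasonable starting position.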
- example method 600 is illustrated and described above as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
- blocks 601 - 603 are described as a series of linear operations that are performed a single time, it is understood that this is an example. In various implementations, one or more of blocks 601 - 603 may be repeated until the user's body part is no longer misaligned without departing from the scope of the present disclosure.
- An electronic device (such as a smart phone, tablet computer, mobile computer, digital media player, wearable device, or other electronic device) may include a camera, an ambient light sensor, and a proximity sensor.
- the electronic device may use one or more of the camera and the proximity sensor to emit light into a body part of a user (such as a finger, an ear, and so on) touching a surface of the electronic device.
- the electronic device may use one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user.
- the electronic device may compute health data of the user based upon sensor data regarding the received light. In this way, the health data of the user may be detected utilizing an electronic device including a camera, ambient light sensor, and proximity sensor without making the user obtain access to a dedicated fitness and/or wellness device.
- the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed are examples of sample approaches. In other embodiments, the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter.
- the accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
- a non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer).
- the non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.
Abstract
An electronic device includes a camera, an ambient light sensor, and a proximity sensor. The electronic device uses one or more of the camera and the proximity sensor to emit light into a body part of a user touching a surface of the electronic device and one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user. The electronic device computes health data of the user based upon sensor data regarding the received light. In some implementations, the electronic device may also include one or more electrical contacts that contact one or more body parts of the user. In such implementations, the health data may be further computed based on an electrical measurement obtained using the electrical contacts.
Description
- This application is a continuation of U.S. patent application Ser. No. 15/667,832, filed Aug. 3, 2017, which is a continuation of U.S. patent application Ser. No. 14/617,422, filed Feb. 9, 2015, which claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/056,299, filed on Sep. 26, 2014, the contents of which are incorporated by reference as if fully disclosed herein.
- This disclosure relates generally to health data, and more specifically to an electronic device that computes health data.
- It may be beneficial for a user to have information about his or her health data, including fitness data and wellness data. For example, health data may indicate emergency conditions or enable the user to maximize fitness or wellness activities. Traditionally, health data is provided to users by health care professionals. However, it may be beneficial for users to have more access to health data.
- The present disclosure discloses systems, apparatuses, and methods related to an electronic device that computes health data. An electronic device may include a camera, an ambient light sensor, and a proximity sensor. The electronic device may use one or more of the camera and the proximity sensor to emit light into a body part of a user touching a surface of the electronic device and one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user. The electronic device may compute health data of the user based upon sensor data regarding the received light. In some implementations, the electronic device may also include one or more electrical contacts that contact one or more body parts of the user. In such implementations, the health data may be further computed based on an electrical measurement obtained using the electrical contacts.
- In some implementations, the electronic device may utilize the camera to determine the user's body part is misaligned with the camera, the ambient light sensor, and the proximity sensor for purposes of detecting the information about the body part of the user. In such implementations, the electronic device may provide guidance to correct the misalignment.
- In various embodiments, a mobile personal computing device may include a camera, an ambient light sensor, a proximity sensor, and a processing unit communicably coupled to the camera, the ambient light sensor, and the proximity sensor. The processing unit may be configured to: use at least one of the camera and the proximity sensor to emit light into a body part of a user touching a surface of the mobile personal computing device; use at least one of the camera, the ambient light sensor, or the proximity sensor to receive at least part of the emitted light reflected by the body part of the user and generate sensor data; and compute health data of the user using at least the sensor data regarding the received light.
- In some embodiments, a method for using a mobile personal computing device to obtain health data may include: using at least one of a camera and a proximity sensor to emit light into a body part of a user touching a surface of the device; using at least one of the camera, an ambient light sensor, or the proximity sensor to receive at least part of the emitted light reflected by the body part of the user and generate sensor data; and computing health data of the user, utilizing a processing unit, using at least the sensor data regarding the received light.
- In one or more embodiments, a method for guiding use of a mobile personal computing device to obtain health data may include: detecting, utilizing a camera, a profile of a body part of a user contacting the camera; determining, using the profile, if the body part is misaligned with a combination of the camera, an ambient light sensor, and a proximity sensor for purposes of obtaining health data for the user; and providing guidance to correct the misalignment.
- In various embodiments, a computer program product including a non-transitory storage medium may include a first set of instructions, stored in the non-transitory storage medium, executable by at least one processing unit to use at least one of a camera and a proximity sensor to emit light into a body part of a user touching a surface of a mobile personal computing device; a second set of instructions, stored in the non-transitory storage medium, executable by the at least one processing unit to use at least one of the camera, an ambient light sensor, or the proximity sensor to receive at least part of the emitted light reflected by the body part of the user and generate sensor data; and a third set of instructions, stored in the non-transitory storage medium, executable by the at least one processing unit to compute health data of the user using at least the sensor data regarding the received light.
- It is to be understood that both the foregoing general description and the following detailed description are for purposes of example and explanation and do not necessarily limit the present disclosure. The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate subject matter of the disclosure. Together, the descriptions and the drawings serve to explain the principles of the disclosure.
- FIG. 1 is an isometric view of an example system for obtaining health data utilizing an electronic device.
- FIG. 2 illustrates the view of FIG. 1 while the example system is being utilized to obtain health data.
- FIG. 3 illustrates the view of FIG. 2 while the example system is providing guidance to obtain health data.
- FIG. 4 illustrates the view of FIG. 2 while the example system is providing the obtained health data.
- FIG. 5 is a flow chart illustrating an example method for using an electronic device to obtain health data. This method may be performed by the system of FIG. 1.
- FIG. 6 is a flow chart illustrating an example method for guiding use of an electronic device to obtain health data. This method may be performed by the system of FIG. 1.
- FIG. 7 is a block diagram illustrating functional relationships among components of the example system of FIG. 1.
- The description that follows includes sample systems, apparatuses, and methods that embody various elements of the present disclosure. However, it should be understood that the described disclosure may be practiced in a variety of forms in addition to those described herein.
- The present disclosure details systems, apparatuses, and methods related to an electronic device that computes health data. An electronic device (such as a smart phone, tablet computer, mobile computer, digital media player, wearable device, or other electronic device) may include a camera, an ambient light sensor, and a proximity sensor. The electronic device may use one or more of the camera and the proximity sensor to emit light into a body part of a user (such as a finger, an ear, and so on) touching a surface of the electronic device. The electronic device may use one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user. The electronic device may compute health data of the user based upon sensor data regarding the received light. In this way, the health data of the user may be detected utilizing an electronic device including a camera, ambient light sensor, and proximity sensor without making the user obtain access to a dedicated fitness and/or wellness device.
- In various implementations, the camera, ambient light sensor, and proximity sensor may be positioned such that they are all at least partially covered (and/or contacted) by the user's body part at the same time, such as when the health data is computed. In one or more implementations, the electronic device may also include electrical contacts. The health data of the user may also be computed using an electrical measurement obtained from the electrical contacts. In some examples of such implementations, the electrical contacts may be positioned to contact the body part of the user and an additional body part such that the electrical measurement represents the electrical properties of organs or portions of the body located between the two contacting body parts. In some embodiments, the two body parts are the user's left and right hands and the electrical measurement corresponds to an electrical property that is measured across the user's chest.
- In some implementations, the electronic device may utilize the camera to determine the user's body part is misaligned with the camera, the ambient light sensor, and the proximity sensor for purposes of detecting the information about the body part of the user. In such implementations, the electronic device may provide guidance (such as visual, audio, haptic, and/or other guidance) to correct the misalignment. The information from the camera may be utilized to detect this misalignment even in implementations where the camera is configured with a focal distance greater than a distance between the camera and the user's body part when the user's body part is touching the surface of the electronic device.
- In various implementations, the proximity sensor may be a multiple light wavelength sensor (such as a sensor that utilizes infrared and visible light, infrared and red light, and so on). In some implementations, the ambient light sensor may be a silicon ambient light sensor, an indium gallium arsenide ambient light sensor, and/or other kind of ambient light sensor. In various implementations, the camera may be both an infrared and visible light camera.
- The health data may include one or more of a variety of different wellness, fitness, and/or other parameters relating to the health of a user. For example, in various implementations the health data may include: a blood pressure index, a blood hydration, a body fat content, an oxygen saturation, a pulse rate, a perfusion index, an electrocardiogram, a photoplethysmogram, and/or any other such health data. In some implementations, the electronic device may provide the computed health data to the user.
- FIG. 1 is an isometric view of an example system 100 for obtaining health data utilizing an electronic device. As illustrated, the system may include an electronic device 101. The electronic device is shown as a smart phone. However, it is understood that this is an example. In various implementations, the electronic device may be any kind of electronic device such as any kind of mobile personal computing device (such as a smart phone, a tablet computer, a mobile computer, a digital media player, a cellular telephone, a laptop computer, a wearable device, and so on), a desktop computer, a display, and/or any other electronic device. - As also illustrated, the
electronic device 101 may include a housing 102 with a surface 103 where a camera 104, an ambient light sensor 105, and a proximity sensor 106 are positioned. As illustrated in FIG. 2, the camera, ambient light sensor, and proximity sensor may be positioned such that they are partially or entirely covered (and/or contacted) by the body part 202 of a user (illustrated as a finger though such a body part may be an ear, a palm, and/or other body part of the user) at the same time. At such a time, the electronic device may compute health data for the user. - Traditionally, a camera may capture images using a visible light imaging sensor and a lens focused at a focal distance away from the lens, an ambient light sensor may use a broad range photodiode or similar non-imaging light detector to determine ambient light conditions, and a proximity sensor may use a limited range light source (such as an infrared light emitting diode or "LED") to emit limited range light and a limited range non-imaging light detector to detect if the emitted limited range light is reflected by one or more objects to determine whether or not such an object is proximate to the proximity sensor. However, the
electronic device 101 may use the camera 104, the ambient light sensor 105, and the proximity sensor 106 in non-traditional ways to detect information about the body part 202. - The
electronic device 101 may use one or more of the camera 104 and the proximity sensor 106 to emit light into a body part 202 of a user touching a surface 103 of the electronic device. The electronic device may use one or more of the camera, the ambient light sensor 105, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user. The electronic device may compute health data of the user based upon sensor (such as the camera, the ambient light sensor, and/or the proximity sensor) data regarding the received light. - For example, one or more of the
camera 104, the ambientlight sensor 105, and theproximity sensor 106 may receive light reflected off of thebody part 202 of the user. Such light may originate from one or more of the camera (in implementations where the camera includes a light source such as a LED used as a flash), the ambient light sensor (which may be a non-imaging photodiode in some implementations), the proximity sensor (such as in implementations where the proximity sensor is a non-imaging photodiode and one or more LEDs that determine proximity by measuring the time between transmission of light by the LED and receipt of the light by the non-imaging photodiode after reflection off of an object such as thebody part 202 of the user), and/or other light source. Theelectronic device 101 may analyze sensor data regarding the received light and compute information such as the light absorption of the body part. Various health data for the user may be computed from the computed light absorption of the body part. - By way of illustration, sensor data regarding the received light may be used to estimate changes in the volume of the
body part 202 of the user. In general, as light passes through the user's skin and into the underlying tissue, some light is reflected, some light is scattered, and some light is absorbed, depending on what the light encounters. In some instances, blood may absorb light more than surrounding tissue, so less reflected light may be sensed when more blood is present. The user's blood volume generally increases and decreases with each heartbeat. Thus, analysis of sensor data regarding the reflected light may reveal changes in blood volume and thus allow health data such as oxygen saturation, pulse rate, perfusion index, and the like to be computed. - By way of another example, one or more images of the
body part 202 of the user captured by the camera 104 may be analyzed to compute various health data for the user. In some implementations, the camera may be an infrared camera and/or a combined visible light and infrared camera. In such implementations, infrared data in the image may be analyzed to compute temperature of the body part, changing blood flow in the body part, and so on. In various implementations, the ambient light sensor and/or proximity sensor may be utilized to obtain such infrared data regarding the body part. - In various implementations, various information may be obtained regarding the
body part 202 utilizing data from the camera 104, the ambient light sensor 105, and the proximity sensor 106. Such information may be utilized in a variety of different ways. For example, in some implementations each of the camera, the ambient light sensor, and the proximity sensor may capture sensor data regarding light absorption of the body part 202. However, the light absorption represented by the light received by each may be different based on the particular sensor strengths and/or weaknesses of the respective device. In such an implementation, the sensor data related to light absorption from each may be compared to the others and/or combined in order to obtain a single, more accurate light absorption measurement. - By way of another example, in some implementations sensor data from one or more of the
camera 104, the ambient light sensor 105, and the proximity sensor 106 may be used to adjust information from one or more others of the camera, the ambient light sensor, and the proximity sensor. For example, in various implementations the proximity sensor may be utilized to obtain sensor data related to light absorption of the body part 202 and the camera may be utilized to determine the specific area of the body part the information relates to. Light absorption may be interpreted differently in computing health data for different areas of the body part (such as where the area of the body part is hairless versus containing hair, where the area is a highly callused area as opposed to a non-callused area, and so on). As such, the sensor data from the camera regarding the specific area of the body part being analyzed may be utilized to adjust the sensor data related to light absorption obtained from the proximity sensor to account for the specific characteristics of the area of the body part that may influence interpretation of light absorption for computing health data for the user. - As also illustrated in
FIGS. 1 and 2, the electronic device 101 may also include electrical contacts, such as electrical contacts 107 a and 107 b. The electronic device 101 may compute health data of the user based upon sensor data obtained from the camera 104, the ambient light sensor 105, and the proximity sensor 106 as well as an electrical measurement obtained using the electrical contacts. - As illustrated in
FIG. 2, the electrical contacts 107 a and 107 b may be positioned to contact the body part 202 of the user (such as during the time when the information is being detected) and/or an additional body part 201 of the user. For example, as shown, a finger of the user may contact a top electrical contact 107 a while a palm of the user contacts a bottom electrical contact 107 b. However, it is understood that this is an example and the electrical contacts may be configured to contact other body parts of the user (such as an ear, a cheek, and so on) without departing from the scope of the present disclosure. - In some implementations, the
electrical contacts 107 a and 107 b may be positioned to contact the body part 202 of the user and an additional body part of the user such that the electrical measurement obtained using the electrical contacts corresponds to an electrical characteristic across the user's chest. For example, as shown, a finger of the user's left hand may contact a top electrical contact 107 a while a right palm of the user (connected to the finger through the user's chest) contacts a bottom electrical contact 107 b. Such a measurement may enable information related to health data (such as an electrocardiogram) to be obtained that might not otherwise be possible absent such positioning. - By way of illustration, electrical measurements may be taken via the
electrical contacts 107 a and 107 b. - Although
FIG. 1 illustrates a specific configuration including the camera 104, the ambient light sensor 105, the proximity sensor 106, and the electrical contacts 107 a and 107 b, it is understood that this is an example and that other configurations may be utilized without departing from the scope of the present disclosure. - For example, the ambient
light sensor 105 and the proximity sensor 106 are illustrated and described as separate sensors. However, in some implementations the ambient light sensor and the proximity sensor may be incorporated into a single, unified sensor that may detect both ambient light and proximity without departing from the scope of the present disclosure. - In some implementations, the
proximity sensor 106 may operate utilizing a single wavelength of light, such as light in the infrared portion of the spectrum. However, in other implementations the proximity sensor (and/or the camera 104 and/or the ambient light sensor 105) may be a multiple wavelength proximity sensor that operates utilizing multiple wavelengths of light. - For example, in various implementations the
proximity sensor 106 may operate utilizing infrared and visible light (such as red light). In some embodiments of such an implementation, the proximity sensor may include an infrared LED for producing infrared light and a red LED for producing red light. - Sensor data obtained utilizing different wavelengths of light may be different based on the particular detection strengths and/or weaknesses of the respective wavelength. By utilizing multiple wavelengths, the information detected utilizing the various wavelengths may be combined and/or utilized to adjust each other in order to obtain greater accuracy.
- For example, dark and light hairs may have different light absorption due to their different pigmentation regardless of their other physical characteristics. By averaging light absorption detected utilizing both infrared and red light, a more accurate light absorption that accounts for such color difference may be possible such that detecting light absorption of different colored hairs does not result in inaccurate measurements.
- In some implementations, the ambient
light sensor 105 may be a silicon ambient light sensor, such as a silicon non-imaging photodiode. In other implementations, the ambient light sensor 105 may be an indium gallium arsenide ambient light sensor, such as an indium gallium arsenide non-imaging photodiode. In various implementations, use of an indium gallium arsenide non-imaging photodiode may allow for detection of a larger spectrum of light than use of a silicon non-imaging photodiode. An indium gallium arsenide non-imaging photodiode is not typically used as an ambient light sensor, however, because it may be more expensive than a silicon non-imaging photodiode, which can adequately determine ambient light conditions by detecting a more limited spectrum of light. - In various implementations, a variety of different health data for the user may be computed based at least on such sensor data. For example, in one or more implementations the health data may include one or more of a variety of different wellness, fitness, and/or other parameters relating to the health of a user, such as: a blood pressure index, a blood hydration, a body fat content, an oxygen saturation, a pulse rate, a perfusion index, an electrocardiogram, a photoplethysmogram, and/or any other such health data.
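- One of the listed parameters, the pulse rate, can be illustrated with a short sketch. The mean-crossing rule and the synthetic waveform below are assumptions made for demonstration (the disclosure does not specify an algorithm); the waveform stands in for a real reflected-light signal:

```python
import math

def pulse_rate_bpm(samples, sample_rate_hz):
    # Count rising crossings of the signal mean: each heartbeat adds
    # one pulse of light absorption, hence one upward crossing.
    mean = sum(samples) / len(samples)
    rising = sum(1 for a, b in zip(samples, samples[1:]) if a < mean <= b)
    return 60.0 * rising * sample_rate_hz / len(samples)

# Synthetic 75-beat-per-minute (1.25 Hz) waveform sampled at 50 Hz
# for 8 seconds, standing in for real sensor data.
fs = 50
wave = [-math.cos(2 * math.pi * 1.25 * n / fs) for n in range(fs * 8)]
print(pulse_rate_bpm(wave, fs))  # → 75.0
```

A real implementation would first filter the signal; the counting step, however, is the essence of turning a photoplethysmogram into a pulse rate.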
-
FIG. 7 is a block diagram illustrating functional relationships among components of the example system 100 of FIG. 1. As shown, the electronic device 101 may include one or more processing units 701, one or more non-transitory storage media 702 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication components 703 (such as a Wi-Fi or other antenna that may be utilized to transmit computed health data for the user), one or more input/output components 704, a display 108 (which may be utilized to present computed health data for the user), the camera 104, the ambient light sensor 105, the proximity sensor 106, and/or the electrical contacts 107 a and 107 b. However, the electronic device 101 may omit one or more of these components and/or utilize one or more additional components not shown. - Returning to
FIG. 2, in various implementations the electronic device 101 may provide guidance to the user for aligning the user's body part 202 with the camera 104, the ambient light sensor 105, the proximity sensor 106, and/or the electrical contacts 107 a and 107 b. - For example,
FIG. 3 illustrates the view of FIG. 2 while the example system 100 is providing guidance to obtain health data. As illustrated in this example, the electronic device 101 provides a current body part position indicator 301 and a goal position indicator 302. A user may compare the visual positions of the current body part position indicator and the goal position indicator to determine how to move the user's body part 202 into correct alignment. As shown, the user may move the user's body part down and to the right, aligning the current body part position indicator with the goal position indicator 302 and thereby moving the user's body part into correct alignment. - Further, the
electronic device 101 may also provide a status indicator 303 that indicates a progress 304 of obtaining the information. In this way, the user may be alerted to how long the user should stay in position once the user aligns the user's body part so that the information may be detected. - In some implementations, the
camera 104 may be utilized to detect the position of the user's body part for purposes of determining alignment/misalignment. The camera may be configured to detect this information even in implementations where the camera is configured with a focal distance greater than the distance from the camera to the user's body part 202 as shown, because less than fully focused image quality may be adequate for determining alignment/misalignment. In other implementations, the ambient light sensor 105, the proximity sensor 106, the electrical contacts 107 a and 107 b, and/or other components may instead be utilized to detect the position of the user's body part. - Although
FIG. 3 illustrates the electronic device 101 providing guidance output graphically using a visual output component, it is understood that this is an example. In various implementations, such output may be provided in one or more of a variety of different ways. For example, audio guidance instructions may be provided utilizing an audio output component and/or vibration guidance instructions may be provided utilizing a haptic output component without departing from the scope of the present disclosure. -
FIG. 4 illustrates the view of FIG. 2 while the example system 100 is providing the obtained health data. As illustrated, a variety of different health data may be presented. Although FIG. 4 illustrates the electronic device 101 providing the health data graphically using a visual output component, it is understood that this is an example. In various implementations, such health data may be provided in one or more of a variety of different ways, such as audibly utilizing an audio output component, without departing from the scope of the present disclosure. In other implementations, the health data may be communicated to another electronic device (such as a health data database maintained by a doctor and/or other medical or health provider) utilizing a communication component. -
FIG. 5 is a flow chart illustrating an example method 500 for using an electronic device to obtain health data. This method may be performed by the system of FIG. 1. - The flow may begin at
block 501 where at least one of a camera and a proximity sensor may be used to emit light into a body part of a user touching a surface of the electronic device. The flow may proceed to block 502 where at least one of the camera, an ambient light sensor, or the proximity sensor may be used to receive at least part of the emitted light reflected by the body part of the user and to generate sensor data. The flow may then proceed to block 503 where health data of the user may be computed using at least the sensor data regarding the received light. - At
block 504, the computed health data for the user may be provided. In some implementations, the computed health data for the user may be provided to the user. Such providing may be performed using one or more visual output components such as a display, audio output components such as a speaker, haptic output components, and so on. - In one example, the proximity sensor may be used to emit light into the user's body part, the ambient light sensor and the camera may be used to receive at least part of the emitted light reflected by the user's body part, and electrical contacts may be used to obtain electrical measurements from the skin of the user's body part. In such an example, a blood pressure index, a body fat content, and an electrocardiogram may be computed using data from the ambient light sensor, the camera, and the electrical contacts.
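- The flow of blocks 501-504 can be sketched as follows; the classes and names are hypothetical stand-ins for real device drivers and are not part of the disclosure:

```python
class Led:
    """Hypothetical stand-in for a camera flash or proximity LED."""
    def __init__(self):
        self.emitted = False

    def emit(self):
        self.emitted = True

class Photodiode:
    """Hypothetical stand-in for a non-imaging light receiver."""
    def __init__(self, reading):
        self.reading = reading

    def read(self):
        return self.reading

def obtain_health_data(emitters, receivers, compute, provide):
    for emitter in emitters:                     # block 501: emit light
        emitter.emit()
    sensor_data = [r.read() for r in receivers]  # block 502: receive light
    health_data = compute(sensor_data)           # block 503: compute
    provide(health_data)                         # block 504: provide
    return health_data

led, diode = Led(), Photodiode(0.42)
result = obtain_health_data(
    [led], [diode],
    compute=lambda data: {"light_absorption": sum(data)},
    provide=print,  # stand-in for a display, speaker, or radio
)
```

Swapping the `provide` callable for a storage routine models the variation, noted below, in which the computed health data is stored rather than presented.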
- In another example, the proximity sensor may be a multiple light wavelength proximity sensor that utilizes infrared and visible light and the ambient light sensor may be a indium gallium arsenide ambient light sensor. The proximity sensor may be used to emit light into the user's body part, the ambient light sensor and the camera may be used to receive at least part of the emitted light reflected by the user's body part, and electrical contacts may be used to obtain electrical measurements from the skin of the user's body part. In such an example, a blood hydration may be computed using data from the ambient light sensor, the camera, and the electrical contacts.
- In yet another example, the proximity sensor may be used to emit light into the user's body part and the ambient light sensor and the camera receive at least part of the emitted light reflected by the user's body part. In such an example, an oxygen saturation, a pulse rate, a perfusion index and a photoplethysmogram may be computed using data from the ambient light sensor and the camera.
- Although the
example method 500 is illustrated and described above as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure. - For example, block 503 is illustrated and described as providing the computed health data for the user. However, in various implementations this operation may be omitted. In some examples of such an implementation, the computed health data for the user may be stored for later use as opposed to being provided to the user.
-
FIG. 6 is a flow chart illustrating an example method 600 for guiding use of an electronic device to obtain health data. This method may be performed by the system of FIG. 1. - The flow may begin at
block 601 where at least a profile of a body part of a user (such as the outline, location, or orientation) contacting a camera may be detected using the camera. The flow may proceed to block 602 where it is determined, based on the detection, that the user's body part is misaligned with a combination of the camera, an ambient light sensor, and a proximity sensor for purposes of obtaining health data for the user.
- At
block 603, guidance to correct the misalignment may be provided. In the example discussed above where a mismatch between the image of the profile of the user's body part and the sample image indicated that the user's body part was misaligned, the differences between the two images may be utilized to determine the guidance to provide. By way of illustration, if the image of the profile of the user's body part has the user's body part further to the left than the sample image, then it may be determined that the user should move the user's body part to the right. Such guidance may be provided using one or more visual output components such as a display, audio output components such as a speaker, haptic output components such as a vibrator, and so on.
- To continue with this example, the user may move the user's finger based on the provided guidance. A new image may be taken of the current profile of the user's finger and compared to the sample image. Comparison of the two images may indicate that the two images, though closer, still do not match and the user's finger is not still correctly aligned. In this example, the image of the profile of the user's finger may be less but still further up and to the right of the sample image. As such, the current placement indicator may be displayed moved closer but still further up and to the right of the correct placement indicator. In this way, the user can see that to correctly align the user's finger the user should move the user's finger still further down and to the left.
- The process in this example may be repeated until comparison of an image of profile of the user's finger matches the sample image. The current placement indicator may then be displayed over the correct placement indicator to indicate to the user that the user's finger is correctly aligned and to not move further until health data is obtained.
- Although the
example method 600 is illustrated and described above as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure. - For example, though blocks 601-603 are described as a series of linear operations that are performed a single time, it is understood that this is an example. In various implementations, one or more of blocks 601-603 may be repeated until the user's body part is no longer misaligned without departing from the scope of the present disclosure.
- As discussed above and illustrated in the accompanying figures, the present disclosure details systems, apparatuses, and methods related to an electric device that computes health data. An electronic device (such as a smart phone, tablet computer, mobile computer, digital media player, wearable device, or other electronic device) may include a camera, an ambient light sensor, and a proximity sensor. The electronic device use one or more of the camera and the proximity sensor to emit light into a body part of a user (such as a finger, and ear, and so on) touching a surface of the electronic device. The electronic device may use one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user. The electronic device may compute health data of the user based upon sensor data regarding the received light. In this way, the health data of the user may be detected utilizing an electronic device including a camera, ambient light sensor, and proximity sensor without making the user obtain access to a dedicated fitness and/or wellness device.
- In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed are examples of sample approaches. In other embodiments, the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
- Techniques detailed in the described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.
- It is believed that the present disclosure and many of its attendant advantages will be understood by the foregoing description, and it will be apparent that various changes may be made in the form, construction and arrangement of the components without departing from the disclosed subject matter or without sacrificing all of its material advantages. The form described is merely explanatory, and it is the intention of the following claims to encompass and include such changes.
- While the present disclosure has been described with reference to various embodiments, it will be understood that these embodiments are illustrative and that the scope of the disclosure is not limited to them. Many variations, modifications, additions, and improvements are possible. More generally, embodiments in accordance with the present disclosure have been described in the context or particular embodiments. Functionality may be separated or combined in blocks differently in various embodiments of the disclosure or described with different terminology. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosure as defined in the claims that follow.
Claims (20)
1. A wearable device, comprising:
a first light source;
a second light source that operates at a different wavelength than the first light source;
at least one light receiver; and
a processing unit that is configured to:
use the first light source and the at least one light receiver to detect a proximity of a body part of a user;
determine if the body part is proximate to the wearable device; and
when the body part is determined to be proximate to the wearable device, use the second light source and the at least one light receiver to determine health data for the user.
2. The wearable device of claim 1 , wherein the health data comprises a pulse rate of the user.
3. The wearable device of claim 1 , wherein the first light source and the second light source comprise light emitting diodes.
4. The wearable device of claim 1 , wherein the at least one light receiver comprises a proximity sensor.
5. The wearable device of claim 1 , wherein the at least one light receiver comprises a camera.
6. The wearable device of claim 1 , further comprising an electrical contact, wherein the processing unit is further configured to use the electrical contact to determine the health data.
7. The wearable device of claim 1 , wherein:
the first light source is configured for a first color; and
the second light source is configured for a second color.
8. A wearable device, comprising:
a first light source configured to emit a first colored light;
a second light source configured to emit a second colored light;
at least one light receiver; and
a processing unit that is configured to:
use the first colored light received by the at least one light receiver to detect a body part of a user; and
upon detection of the body part of the user, use the second colored light received by the at least one light receiver to determine health data for the user.
9. The wearable device of claim 8 , wherein the at least one light receiver comprises a first light receiver and a second light receiver.
10. The wearable device of claim 9 , wherein the processing unit:
uses the first light receiver to receive the first colored light; and
uses the second light receiver to receive the second colored light.
11. The wearable device of claim 9 , wherein:
the first light receiver is configured to receive light of a first wavelength; and
the second light receiver is configured to receive light of a second wavelength.
12. The wearable device of claim 8 , wherein:
the first colored light is red light; and
the second colored light is green light.
13. The wearable device of claim 8 , wherein:
the first colored light is green light; and
the second colored light is red light.
14. The wearable device of claim 8 , wherein the at least one light receiver is a single light receiver that receives both the first colored light and the second colored light.
15. A wearable device, comprising:
a first light source that operates at a first wavelength;
a second light source that operates at a second wavelength;
at least one light receiver; and
a processing unit that is configured to, upon detection of an object using light of the first wavelength received by the at least one light receiver, use light of the second wavelength received by the at least one light receiver to determine health data for a user.
16. The wearable device of claim 15 , wherein the processing unit causes the first light source to emit the light of the first wavelength when attempting to detect the object.
17. The wearable device of claim 15 , wherein the processing unit causes the second light source to emit the light of the second wavelength when attempting to determine the health data for the user.
18. The wearable device of claim 17 , wherein the processing unit causes the second light source to emit the light of the second wavelength into the object.
19. The wearable device of claim 15 , wherein the processing unit is configured to use the first light source as part of attempting to detect the object prior to attempting to determine the health data for the user.
20. The wearable device of claim 15 , wherein the first wavelength overlaps the second wavelength.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/700,710 US20200100684A1 (en) | 2014-09-26 | 2019-12-02 | Electronic Device that Computes Health Data |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462056299P | 2014-09-26 | 2014-09-26 | |
US14/617,422 US9723997B1 (en) | 2014-09-26 | 2015-02-09 | Electronic device that computes health data |
US15/667,832 US10524671B2 (en) | 2014-09-26 | 2017-08-03 | Electronic device that computes health data |
US16/700,710 US20200100684A1 (en) | 2014-09-26 | 2019-12-02 | Electronic Device that Computes Health Data |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/667,832 Continuation US10524671B2 (en) | 2014-09-26 | 2017-08-03 | Electronic device that computes health data |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200100684A1 true US20200100684A1 (en) | 2020-04-02 |
Family
ID=59411334
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/617,422 Active 2035-03-25 US9723997B1 (en) | 2014-09-26 | 2015-02-09 | Electronic device that computes health data |
US15/667,832 Active 2035-08-13 US10524671B2 (en) | 2014-09-26 | 2017-08-03 | Electronic device that computes health data |
US16/700,710 Abandoned US20200100684A1 (en) | 2014-09-26 | 2019-12-02 | Electronic Device that Computes Health Data |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/617,422 Active 2035-03-25 US9723997B1 (en) | 2014-09-26 | 2015-02-09 | Electronic device that computes health data |
US15/667,832 Active 2035-08-13 US10524671B2 (en) | 2014-09-26 | 2017-08-03 | Electronic device that computes health data |
Country Status (1)
Country | Link |
---|---|
US (3) | US9723997B1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10987054B2 (en) | 2017-09-05 | 2021-04-27 | Apple Inc. | Wearable electronic device with electrodes for sensing biological parameters |
US11166104B2 (en) | 2014-02-11 | 2021-11-02 | Apple Inc. | Detecting use of a wearable device |
US11281262B2 (en) | 2014-02-11 | 2022-03-22 | Apple Inc. | Detecting a gesture made by a person wearing a wearable electronic device |
WO2022071773A1 (en) * | 2020-09-29 | 2022-04-07 | 삼성전자주식회사 | Mobile device, control method therefor, and computer program stored in recording medium |
US11504057B2 (en) | 2017-09-26 | 2022-11-22 | Apple Inc. | Optical sensor subsystem adjacent a cover of an electronic device housing |
US11534088B2 (en) * | 2017-08-31 | 2022-12-27 | Fujifilm Business Innovation Corp. | Optical measuring apparatus and non-transitory computer readable medium |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6697658B2 (en) | 2001-07-02 | 2004-02-24 | Masimo Corporation | Low power pulse oximeter |
JP2008531217A (en) | 2005-03-01 | 2008-08-14 | マシモ・ラボラトリーズ・インコーポレーテッド | Multi-wavelength sensor driver |
US8570186B2 (en) * | 2011-04-25 | 2013-10-29 | Endotronix, Inc. | Wireless sensor reader |
US20100030040A1 (en) | 2008-08-04 | 2010-02-04 | Masimo Laboratories, Inc. | Multi-stream data collection system for noninvasive measurement of blood constituents |
US20100004518A1 (en) | 2008-07-03 | 2010-01-07 | Masimo Laboratories, Inc. | Heat sink for noninvasive medical sensor |
US20110082711A1 (en) | 2009-10-06 | 2011-04-07 | Masimo Laboratories, Inc. | Personal digital assistant or organizer for monitoring glucose levels |
US9408542B1 (en) | 2010-07-22 | 2016-08-09 | Masimo Corporation | Non-invasive blood pressure measurement system |
US9753436B2 (en) | 2013-06-11 | 2017-09-05 | Apple Inc. | Rotary input mechanism for an electronic device |
KR102035445B1 (en) | 2013-08-09 | 2019-10-22 | Apple Inc. | Electronic watch |
WO2015122885A1 (en) | 2014-02-12 | 2015-08-20 | Bodhi Technology Ventures Llc | Rejection of false turns of rotary inputs for electronic devices |
WO2015137788A1 (en) * | 2014-03-14 | 2015-09-17 | Samsung Electronics Co., Ltd. | Electronic apparatus for providing health status information, method of controlling the same, and computer-readable storage medium |
CN205121417U (en) | 2014-09-02 | 2016-03-30 | Apple Inc. | Wearable electronic device |
US9723997B1 (en) | 2014-09-26 | 2017-08-08 | Apple Inc. | Electronic device that computes health data |
KR101993073B1 (en) | 2015-03-08 | 2019-06-25 | Apple Inc. | A compressible seal for rotatable and translatable input mechanisms |
US10229754B2 (en) * | 2015-03-09 | 2019-03-12 | Koninklijke Philips N.V. | Wearable device obtaining audio data for diagnosis |
US10448871B2 (en) | 2015-07-02 | 2019-10-22 | Masimo Corporation | Advanced pulse oximetry sensor |
US10244983B2 (en) * | 2015-07-20 | 2019-04-02 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US11288979B1 (en) * | 2016-01-11 | 2022-03-29 | John J. Blackburn | Stethoscope training devices and methods |
US10551798B1 (en) | 2016-05-17 | 2020-02-04 | Apple Inc. | Rotatable crown for an electronic device |
US10061399B2 (en) | 2016-07-15 | 2018-08-28 | Apple Inc. | Capacitive gap sensor ring for an input device |
US10019097B2 (en) | 2016-07-25 | 2018-07-10 | Apple Inc. | Force-detecting input structure |
CN110267589A (en) * | 2016-08-26 | 2019-09-20 | AMI Research & Development, LLC | Vital sign monitoring via touch screen using bioelectrical impedance |
US11615257B2 (en) | 2017-02-24 | 2023-03-28 | Endotronix, Inc. | Method for communicating with implant devices |
AU2018224198B2 (en) | 2017-02-24 | 2023-06-29 | Endotronix, Inc. | Wireless sensor reader assembly |
US10962935B1 (en) | 2017-07-18 | 2021-03-30 | Apple Inc. | Tri-axis force sensor |
AU2018304316A1 (en) | 2017-07-19 | 2020-01-30 | Endotronix, Inc. | Physiological monitoring system |
CN111491561A (en) * | 2017-12-20 | 2020-08-04 | Medical Photonics Co., Ltd. | Lipid measuring device and method thereof |
US10674967B2 (en) | 2018-02-05 | 2020-06-09 | Samsung Electronics Co., Ltd. | Estimating body composition on a mobile device |
EP3761866A1 (en) * | 2018-03-08 | 2021-01-13 | Biostealth Limited | Cardiovascular health monitoring |
CN109009066A (en) * | 2018-06-16 | 2018-12-18 | Guangzhou Baifu Enterprise Management Co., Ltd. | Electronic health care testing equipment and detection system |
US11360440B2 (en) | 2018-06-25 | 2022-06-14 | Apple Inc. | Crown for an electronic watch |
EP3813653A4 (en) * | 2018-06-28 | 2022-04-13 | Board of Trustees of Michigan State University | Mobile device applications to measure blood pressure |
US11561515B2 (en) | 2018-08-02 | 2023-01-24 | Apple Inc. | Crown for an electronic watch |
CN209560398U (en) | 2018-08-24 | 2019-10-29 | Apple Inc. | Electronic watch |
CN209625187U (en) | 2018-08-30 | 2019-11-12 | Apple Inc. | Electronic watch and electronic equipment |
US11194298B2 (en) * | 2018-08-30 | 2021-12-07 | Apple Inc. | Crown assembly for an electronic watch |
US11194299B1 (en) | 2019-02-12 | 2021-12-07 | Apple Inc. | Variable frictional feedback device for a digital crown of an electronic watch |
CN110742596A (en) * | 2019-10-17 | 2020-02-04 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Electronic equipment for photographing and biological information measurement |
WO2021146333A1 (en) | 2020-01-13 | 2021-07-22 | Masimo Corporation | Wearable device with physiological parameters monitoring |
US11550268B2 (en) | 2020-06-02 | 2023-01-10 | Apple Inc. | Switch module for electronic crown assembly |
US12092996B2 (en) | 2021-07-16 | 2024-09-17 | Apple Inc. | Laser-based rotation sensor for a crown of an electronic watch |
Family Cites Families (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3899744B2 (en) | 1999-09-10 | 2007-03-28 | Casio Computer Co., Ltd. | Portable body fat measurement device |
CN1875370B (en) | 2003-09-05 | 2010-04-14 | AuthenTec, Inc. | Multi-biometric finger sensor using different biometrics having different selectivities and associated methods |
US7729748B2 (en) | 2004-02-17 | 2010-06-01 | Joseph Florian | Optical in-vivo monitoring systems |
US7957762B2 (en) * | 2007-01-07 | 2011-06-07 | Apple Inc. | Using ambient light sensor to augment proximity sensor output |
US7486386B1 (en) * | 2007-09-21 | 2009-02-03 | Silicon Laboratories Inc. | Optical reflectance proximity sensor |
US7822469B2 (en) | 2008-06-13 | 2010-10-26 | Salutron, Inc. | Electrostatic discharge protection for analog component of wrist-worn device |
US20110015496A1 (en) | 2009-07-14 | 2011-01-20 | Sherman Lawrence M | Portable medical device |
EP2290478A1 (en) | 2009-09-01 | 2011-03-02 | ETA SA Manufacture Horlogère Suisse | Covering element for a wristwatch |
WO2013016007A2 (en) * | 2011-07-25 | 2013-01-31 | Valencell, Inc. | Apparatus and methods for estimating time-state physiological parameters |
US9485345B2 (en) | 2011-09-21 | 2016-11-01 | University Of North Texas | 911 services and vital sign measurement utilizing mobile phone sensors and applications |
US8988372B2 (en) * | 2012-02-22 | 2015-03-24 | Avolonte Health LLC | Obtaining physiological measurements using a portable device |
US10172562B2 (en) * | 2012-05-21 | 2019-01-08 | Lg Electronics Inc. | Mobile terminal with health care function and method of controlling the mobile terminal |
US8954135B2 (en) | 2012-06-22 | 2015-02-10 | Fitbit, Inc. | Portable biometric monitoring devices and methods of operating same |
US9042971B2 (en) * | 2012-06-22 | 2015-05-26 | Fitbit, Inc. | Biometric monitoring device with heart rate measurement activated by a single user-gesture |
US9100579B2 (en) * | 2012-12-10 | 2015-08-04 | Cisco Technology, Inc. | Modification of a video signal of an object exposed to ambient light and light emitted from a display screen |
EP2967377A1 (en) * | 2013-03-14 | 2016-01-20 | Koninklijke Philips N.V. | Device and method for obtaining vital sign information of a subject |
CN104050444B (en) | 2013-03-15 | 2018-06-05 | Fitbit, Inc. | Wearable biometric monitoring device, interchangeable parts, and integrated clasp permitting wear |
KR102040426B1 (en) | 2013-06-11 | 2019-11-04 | Apple Inc. | Rotary input mechanism for an electronic device |
KR102035445B1 (en) | 2013-08-09 | 2019-10-22 | Apple Inc. | Electronic watch |
WO2015030712A1 (en) | 2013-08-26 | 2015-03-05 | Bodhi Technology Ventures Llc | Method of detecting the wearing limb of a wearable electronic device |
US10254804B2 (en) | 2014-02-11 | 2019-04-09 | Apple Inc. | Detecting the limb wearing a wearable electronic device |
WO2015126095A1 (en) | 2014-02-21 | 2015-08-27 | Samsung Electronics Co., Ltd. | Electronic device |
CN203732900U (en) | 2014-05-26 | 2014-07-23 | Qu Weibing | Intelligent Bluetooth watch for detecting heart rate |
US9848823B2 (en) | 2014-05-29 | 2017-12-26 | Apple Inc. | Context-aware heart rate estimation |
US9874457B2 (en) | 2014-05-30 | 2018-01-23 | Microsoft Technology Licensing, Llc | Adaptive lifestyle metric estimation |
US10123710B2 (en) | 2014-05-30 | 2018-11-13 | Microsoft Technology Licensing, Llc | Optical pulse-rate sensor pillow assembly |
US9348322B2 (en) | 2014-06-05 | 2016-05-24 | Google Technology Holdings LLC | Smart device including biometric sensor |
JP2016047154A (en) | 2014-08-27 | 2016-04-07 | セイコーエプソン株式会社 | Biological information measurement device |
US10092197B2 (en) | 2014-08-27 | 2018-10-09 | Apple Inc. | Reflective surfaces for PPG signal detection |
CN205121417U (en) | 2014-09-02 | 2016-03-30 | Apple Inc. | Wearable electronic device |
WO2016040392A1 (en) | 2014-09-08 | 2016-03-17 | Aliphcom | Forming wearable devices that include metalized interfaces and strap-integrated sensor electrodes |
KR20160030821A (en) | 2014-09-11 | 2016-03-21 | 삼성전자주식회사 | Wearable device |
TWI541621B (en) | 2014-09-15 | 2016-07-11 | MiTAC Computer Co., Ltd. | Watch and method for automatically turning on a backlight |
US9723997B1 (en) | 2014-09-26 | 2017-08-08 | Apple Inc. | Electronic device that computes health data |
US20160198966A1 (en) | 2015-01-13 | 2016-07-14 | Seiko Epson Corporation | Biological information measuring module, biological information measuring apparatus, light detecting apparatus, light detecting module, and electronic apparatus |
US20160242659A1 (en) | 2015-02-20 | 2016-08-25 | Seiko Epson Corporation | Pulse-wave measuring module, biological-information measuring module, and electronic device |
KR20160114930A (en) | 2015-03-25 | 2016-10-06 | 삼성전자주식회사 | Module recognition method and electronic device performing thereof |
JP2016214641A (en) | 2015-05-22 | 2016-12-22 | セイコーエプソン株式会社 | Biological information measurement device |
US20160338642A1 (en) | 2015-05-23 | 2016-11-24 | Andrew Parara | Wearable Care Security Smart Watch Device |
KR20160145284A (en) | 2015-06-10 | 2016-12-20 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR102424360B1 (en) | 2015-06-19 | 2022-07-25 | 삼성전자주식회사 | Electronic apparatus for measuring information about a body of user and operating method thereof |
US9557716B1 (en) | 2015-09-20 | 2017-01-31 | Qualcomm Incorporated | Multipurpose magnetic crown on wearable device and adapter for power supply and audio, video and data access |
US11036318B2 (en) | 2015-09-30 | 2021-06-15 | Apple Inc. | Capacitive touch or proximity detection for crown |
US10716478B2 (en) | 2015-12-29 | 2020-07-21 | Motorola Mobility Llc | Wearable device heart monitor systems |
ES2963483T3 (en) | 2017-09-05 | 2024-03-27 | Apple Inc | Wearable electronic device with electrodes to detect biological parameters |
EP3459447B1 (en) | 2017-09-26 | 2024-10-16 | Apple Inc. | Optical sensor subsystem adjacent a cover of an electronic device housing |
- 2015-02-09: US application US14/617,422, granted as US9723997B1 (Active)
- 2017-08-03: US application US15/667,832, granted as US10524671B2 (Active)
- 2019-12-02: US application US16/700,710, published as US20200100684A1 (Abandoned)
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11166104B2 (en) | 2014-02-11 | 2021-11-02 | Apple Inc. | Detecting use of a wearable device |
US11281262B2 (en) | 2014-02-11 | 2022-03-22 | Apple Inc. | Detecting a gesture made by a person wearing a wearable electronic device |
US11534088B2 (en) * | 2017-08-31 | 2022-12-27 | Fujifilm Business Innovation Corp. | Optical measuring apparatus and non-transitory computer readable medium |
US10987054B2 (en) | 2017-09-05 | 2021-04-27 | Apple Inc. | Wearable electronic device with electrodes for sensing biological parameters |
US11432766B2 (en) | 2017-09-05 | 2022-09-06 | Apple Inc. | Wearable electronic device with electrodes for sensing biological parameters |
US11504057B2 (en) | 2017-09-26 | 2022-11-22 | Apple Inc. | Optical sensor subsystem adjacent a cover of an electronic device housing |
WO2022071773A1 (en) * | 2020-09-29 | 2022-04-07 | Samsung Electronics Co., Ltd. | Mobile device, control method therefor, and computer program stored in recording medium |
Also Published As
Publication number | Publication date |
---|---|
US9723997B1 (en) | 2017-08-08 |
US10524671B2 (en) | 2020-01-07 |
US20170354332A1 (en) | 2017-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10524671B2 (en) | Electronic device that computes health data | |
CN109645974B (en) | Apparatus and method for measuring biological information and wearable device | |
US11642086B2 (en) | Apparatus and method for correcting error of bio-information sensor, and apparatus and method for estimating bio-information | |
De Greef et al. | Bilicam: using mobile phones to monitor newborn jaundice | |
US11000192B2 (en) | Bio-information measuring apparatus, bio-information measuring method, and case apparatus for the bio-information measuring apparatus | |
US10398328B2 (en) | Device and system for monitoring of pulse-related information of a subject | |
US20170202505A1 (en) | Unobtrusive skin tissue hydration determining device and related method | |
KR20200014523A (en) | Apparatus and method for measuring bio-information | |
KR20160115017A (en) | Apparatus and method for sensing information of the living body | |
US20200187835A1 (en) | Apparatus and method for estimating blood glucose | |
US20190008392A1 (en) | Devices and methods for predicting hemoglobin levels using electronic devices such as mobile phones | |
US20180333088A1 (en) | Pulse Oximetry Capturing Technique | |
EP3639732B1 (en) | Apparatus and method for estimating bio-information | |
CN107920786A (en) | Pulse oximetry | |
KR20200097143A (en) | Apparatus and method for estimating bio-information | |
EP4179963A1 (en) | Electronic device and method of estimating bio-information using the same | |
KR20220030089A (en) | Apparatus and method for estimating bio-information | |
US20200113453A1 (en) | Apparatus and method for estimating blood pressure | |
US20210068668A1 (en) | Electronic device and method for obtaining vital sign | |
EP4186415B1 (en) | Electronic device and method of estimating bio-information using the same | |
US20210298678A1 (en) | Wearable Apparatus, And Accessory For Terminal Device | |
KR20220012582A (en) | Apparatus and method for estimating bio-information | |
CN112617746B (en) | Non-contact physiological signal detection device | |
US11974834B2 (en) | Apparatus and method for estimating bio-information | |
US20220233149A1 (en) | Apparatus and method for estimating body component |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |