
US20170231545A1 - Apparatus and methods for monitoring a subject - Google Patents

Apparatus and methods for monitoring a subject

Info

Publication number
US20170231545A1
US20170231545A1 (application US15/431,842)
Authority
US
United States
Prior art keywords
sensor
subject
motion
computer processor
applications
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/431,842
Inventor
Zvika Shinar
Liat Tsoref
Guy Meger
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
EarlySense Ltd
Original Assignee
EarlySense Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by EarlySense Ltd filed Critical EarlySense Ltd
Priority to US15/431,842
Assigned to EARLYSENSE LTD. Assignment of assignors interest (see document for details). Assignors: MEGER, GUY; SHINAR, ZVIKA; TSOREF, LIAT
Publication of US20170231545A1
Assigned to KREOS CAPITAL V (EXPERT FUND) L.P. Security interest (see document for details). Assignor: EARLYSENSE LTD.
Assigned to EARLYSENSE LTD. Release by secured party (see document for details). Assignor: KREOS CAPITAL V (EXPERT FUND) L.P.
Assigned to KREOS CAPITAL VI (EXPERT FUND) L.P. Security interest (see document for details). Assignor: EARLYSENSE LTD.
Priority to US16/877,543 (published as US20200275876A1)
Assigned to EARLYSENSE LTD. Assignment of assignors interest (see document for details). Assignors: HALPERIN, AVNER; KARASIK, ROMAN; KATZ, Yaniv; MEGER, GUY; SHINAR, ZVIKA; TSOREF, LIAT; YIZRAELI DAVIDOVICH, Maayan Lia
Priority to US17/103,826 (published as US11547336B2)
Priority to US18/152,457 (published as US20230157602A1)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6891 Furniture
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893 Cars
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60N SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2/00 Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
    • B60N2/002 Seats provided with an occupancy detection means mounted therein or thereon
    • B60N2/0021 Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
    • B60N2/0022 Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for sensing anthropometric parameters, e.g. heart rate or body temperature
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60N SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2/00 Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
    • B60N2/002 Seats provided with an occupancy detection means mounted therein or thereon
    • B60N2/0021 Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
    • B60N2/0023 Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for detection of driver fatigue
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60N SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2/00 Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
    • B60N2/002 Seats provided with an occupancy detection means mounted therein or thereon
    • B60N2/0021 Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
    • B60N2/003 Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement characterised by the sensor mounting location in or on the seat
    • B60N2/44
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60N SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2/00 Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
    • B60N2/90 Details or parts not otherwise provided for
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60N SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2210/00 Sensor types, e.g. for passenger detection systems or for controlling seats
    • B60N2210/40 Force or pressure sensors
    • B60N2210/48 Piezoelectric
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60N SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2210/00 Sensor types, e.g. for passenger detection systems or for controlling seats
    • B60N2210/50 Inertial sensors

Definitions

  • the present invention relates generally to monitoring a subject. Specifically, some applications of the present invention relate to monitoring a subject, while the subject is in a vehicle.
  • a sensor unit is disposed under a seat of a vehicle.
  • the sensor unit is configured to monitor physiological parameters of a subject who is sitting on the seat, and to generate a sensor signal in response thereto.
  • the subject is an operator of the vehicle (e.g., the driver of a car, the pilot of an airplane, the driver of a train, etc.).
  • a computer processor is configured to receive and analyze the sensor signal for any one of a number of reasons.
  • the computer processor derives vital signs of the subject (such as heart rate, respiratory rate, and/or heart-rate variability) from the sensor signal.
  • the computer processor compares the subject's vital signs to a baseline of the subject that was derived during occasions when the subject previously operated the vehicle.
  • the computer processor may determine that the subject's vital signs have changed substantially from the baseline, indicating that the subject is unwell, drowsy, asleep, and/or under the influence of drugs or alcohol. In response thereto, the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
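The baseline comparison described above can be sketched as a simple per-feature deviation test. This is an illustrative outline only: the patent does not disclose a specific algorithm, and the feature names, baseline values, and 2.5-sigma threshold below are all assumptions.

```python
# Hypothetical sketch of comparing current vital signs against a per-driver
# baseline; all values and the 2.5-sigma threshold are illustrative assumptions.

BASELINE = {  # statistics derived from previous trips (illustrative values)
    "heart_rate":       {"mean": 68.0, "sd": 4.0},   # beats/min
    "respiratory_rate": {"mean": 14.0, "sd": 1.5},   # breaths/min
}

def check_vitals(current: dict, baseline: dict = BASELINE, n_sigma: float = 2.5):
    """Return the names of vital signs that deviate substantially from baseline."""
    deviations = []
    for name, value in current.items():
        stats = baseline.get(name)
        if stats is None:
            continue
        z = abs(value - stats["mean"]) / stats["sd"]
        if z > n_sigma:
            deviations.append(name)
    return deviations

def respond(deviations):
    """Generate an alert message for the driver or a remote location."""
    if deviations:
        return "ALERT: abnormal " + ", ".join(deviations)
    return "OK"
```

In a real system the response could also include the remote notification or vehicle-disable actions described above; here only the alert string is produced.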
  • the sensor unit is configured to be placed underneath the seat and to detect motion of the subject who is sitting on the seat during motion of the vehicle.
  • the sensor unit typically includes a housing, at least one first motion sensor disposed within the housing, such that the first motion sensor generates a first sensor signal that is indicative of the motion of the vehicle, and at least one second motion sensor disposed within the housing, such that the second motion sensor generates a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle.
  • the computer processor is typically configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
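One simple way to picture the two-sensor separation described above is reference cancellation: scale the vehicle-only signal to best fit the combined signal (least squares) and subtract it, leaving an estimate of the subject's motion. This sketch is an assumption for illustration; the patent does not disclose a specific algorithm, and a practical system would likely use adaptive filtering.

```python
# Illustrative reference-cancellation sketch (not the patented algorithm):
# the first sensor sees vehicle motion only; the second sees vehicle + subject.
import numpy as np

def cancel_vehicle_motion(ref: np.ndarray, combined: np.ndarray) -> np.ndarray:
    """Estimate subject motion by removing the best least-squares fit of the
    vehicle-only reference signal from the combined signal."""
    gain = np.dot(ref, combined) / np.dot(ref, ref)  # least-squares scale factor
    return combined - gain * ref

# Synthetic check: the residual should recover the subject component.
t = np.linspace(0, 10, 1000)
vehicle = np.sin(2 * np.pi * 5.0 * t)           # road/engine vibration (5 Hz)
subject = 0.1 * np.sin(2 * np.pi * 1.1 * t)     # ~66 bpm cardiac motion
estimate = cancel_vehicle_motion(ref=vehicle, combined=0.8 * vehicle + subject)
```

The single gain works only when the vehicle vibration reaches both sensors with the same waveform; with different transfer paths an adaptive filter (e.g., LMS) would replace the scalar gain.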
  • a temperature control device (such as an electric blanket, or an electric mattress) includes at least first and second sections corresponding to respective portions of a body of a single subject.
  • a blanket may include three types of sections: a trunk section corresponding to the subject's trunk, leg sections corresponding to the subject's legs, and arm sections corresponding to the subject's arms.
  • a temperature-regulation unit regulates respective portions of the subject's body to be at respective temperatures by, simultaneously, setting a temperature of the first section of the temperature control device to a first temperature, and setting a temperature of the second section of the temperature control device to a second temperature that is different from the first temperature.
  • the temperature-regulation unit sets the temperature of additional sections of the temperature control device to further respective temperatures.
  • thermoregulation during sleep affects sleep quality.
  • selective vasodilation of distal skin regions may promote the onset of sleep.
  • a computer processor drives the temperature-regulation unit to regulate respective portions of the subject's body to be at respective temperatures in the manner described herein, such as to improve sleep quality, shorten sleep latency, and/or better maintain sleep continuity.
  • the computer processor may drive the temperature-regulation unit to regulate the temperature of the subject's legs and/or arms to be at a greater temperature than the subject's trunk (e.g., by heating the legs and/or arms by more than the trunk, or by cooling the trunk by less than the legs and/or arms).
  • the computer processor drives the temperature-regulation unit to regulate respective portions of the subject's body to be at respective temperatures, in response to the subject's sleep stage, which is detected automatically by analyzing a sensor signal from a sensor (such as motion sensor) that is configured to monitor the subject.
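The multi-section scheme above (trunk, legs, and arms held at different temperatures simultaneously, chosen per sleep stage) can be sketched as a lookup from detected stage to per-section setpoints. The stage names follow the description; the temperature values are assumptions for illustration, not values specified in the text.

```python
# Illustrative per-section setpoints keyed by sleep stage; the temperatures
# (deg C) are assumed values, not figures taken from the patent.
SETPOINTS = {
    # stage:          (trunk_c, distal_c)   distal = arms and legs
    "falling_asleep": (31.0, 34.0),  # warmer distal skin to promote sleep onset
    "slow_wave":      (32.0, 33.0),
    "rem":            (32.0, 32.0),
    "awakening":      (35.0, 32.0),  # heat the trunk to help wake the subject
}

def drive_sections(stage: str) -> dict:
    """Return the simultaneous setpoints for each blanket/mattress section."""
    trunk_c, distal_c = SETPOINTS[stage]
    return {"trunk": trunk_c, "arms": distal_c, "legs": distal_c}
```

Note that during the falling-asleep stage the distal sections are set warmer than the trunk, matching the selective distal vasodilation rationale above.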
  • apparatus for use with a seat of a vehicle comprising:
  • a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
  • a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • the first motion sensor is disposed within the housing such that it is isolated from the motion of the subject, and therefore detects only motion that is due to motion of the vehicle.
  • the computer processor is configured to:
  • the housing is flexible
  • the apparatus further comprises a fluid compartment disposed on an inner surface of the housing,
  • the at least one first motion sensor is disposed on a surface of the fluid compartment
  • At least one second motion sensor is disposed on at least one inner surface of the flexible portion of the housing.
  • the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • the housing comprising flexible and rigid portions
  • the at least one first motion sensor is disposed on at least one inner surface of the rigid portion of the housing;
  • the at least one second motion sensor is disposed on at least one inner surface of the flexible portion of the housing, and configured to generate a second sensor signal.
  • the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the at least one first motion sensor comprises two or more first motion sensors disposed on respective inner surfaces of the rigid portion of the housing.
  • the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • apparatus for use with a seat of a vehicle including:
  • a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
  • a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • the first motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the second motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the at least one second motion sensor includes two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • apparatus for use with a seat of a vehicle including:
  • a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
  • a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • the first motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the second motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • the at least one first motion sensor includes two or more first motion sensors disposed on respective inner surfaces of the rigid portion of the housing.
  • the at least one second motion sensor includes two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • apparatus including:
  • a temperature-control device comprising at least first and second sections corresponding to respective portions of a body of a single subject
  • a temperature-regulation unit configured to regulate temperatures of the respective portions of the subject's body to be at respective temperatures by, simultaneously, setting a temperature of the first section of the temperature-control device to a first temperature, and setting a temperature of the second section of the temperature control device to a second temperature that is different from the first temperature.
  • the temperature control device includes a device selected from the group consisting of: a blanket and a mattress, and the selected device has a length of less than 250 cm, and a width of less than 130 cm.
  • the temperature control device includes a blanket configured to be placed above the subject, and the first and second sections include first and second sections that are configured to be placed over respective portions of the subject's body.
  • the temperature control device includes a blanket configured to be disposed underneath the subject, and the first and second sections include first and second sections that are configured to be disposed underneath respective portions of the subject's body.
  • the temperature control device includes a mattress configured to be disposed underneath the subject, and the first and second sections include first and second sections that are configured to be disposed underneath respective portions of the subject's body.
  • the first section corresponds to a trunk of the subject
  • the second section corresponds to a distal portion of the subject's body selected from the group consisting of: at least one arm of the subject, and at least one leg of the subject.
  • the apparatus further includes:
  • a sensor configured to monitor the subject and generate a sensor signal in response thereto
  • a computer processor configured to:
  • the computer processor is configured to:
  • a falling-asleep stage, a beginning-sleep stage, a mid-sleep stage, a premature-awakening stage, an awakening stage, a light sleep stage, a slow-wave sleep stage, and a rapid-eye-movement sleep stage, and
  • in response to the differentially identified sleep stages, drive the temperature-regulation unit to regulate the temperatures of the respective portions of the subject's body to be at the respective temperatures.
  • the sensor is configured to monitor the subject without contacting or viewing the subject, and without contacting or viewing clothes the subject is wearing.
  • the first section corresponds to a trunk of the subject
  • the second section corresponds to at least one distal portion of the subject's body selected from the group consisting of: at least one arm of the subject, and at least one leg of the subject.
  • the computer processor is configured, in response to detecting that the subject is trying to fall asleep, to drive the temperature-regulation unit to regulate the subject's trunk to be at a first temperature, and to regulate at least the selected distal portion of the subject's body to be at a second temperature that is greater than the first temperature.
  • the computer processor is configured, in response to detecting that the subject is at a sleep stage at which it is suitable to wake up the subject, to drive the temperature-regulation unit to heat the subject's trunk.
  • the sensor includes a motion sensor configured to sense motion of the subject.
  • the sensor is configured to monitor the subject without contacting or viewing the subject, and without contacting or viewing clothes the subject is wearing.
  • the apparatus is for use with a room-climate regulation device, and, in response to the identified sleep stage, the computer processor is further configured to adjust a parameter of the room-climate regulation device.
  • the room-climate regulation device includes an air-conditioning unit, and, in response to the identified sleep stage, the computer processor is configured to adjust a parameter of the air-conditioning unit.
  • apparatus for use with an output device including:
  • a sensor configured to monitor a subject, during a sleeping session of the subject, and to generate a sensor signal in response to the monitoring;
  • a computer processor configured to:
  • apparatus for use with a female subject including:
  • a sensor configured to monitor the subject, prior to the subject becoming pregnant and during a pregnancy of the subject, and to generate a sensor signal in response to the monitoring;
  • a computer processor configured to:
  • apparatus for use with a stimulus-providing device that is configured to provide a stimulus to a subject selected from the group consisting of: an audio stimulus, a visual stimulus, and a tactile stimulus, the apparatus including:
  • a sensor configured to monitor a subject and to generate a sensor signal in response thereto
  • control unit configured to:
  • apparatus for monitoring a subject comprising:
  • a sensor configured to monitor the subject without contacting the subject or clothes the subject is wearing, and without viewing the subject or clothes the subject is wearing, and to generate a sensor signal in response to the monitoring;
  • a computer processor configured to:
  • apparatus for monitoring a subject comprising:
  • a sensor configured to monitor the subject and to generate a sensor signal in response thereto
  • a plurality of filters configured to filter the sensor signal using respective filter parameters
  • a computer processor configured to:
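The "plurality of filters configured to filter the sensor signal using respective filter parameters" can be pictured as a band-pass filter bank that splits the raw motion signal into cardiac and respiratory components. The pass-bands below are typical resting-physiology assumptions, and FFT masking is just one simple filter realization, not the filtering disclosed in the patent.

```python
# Illustrative filter bank: split one sensor signal into physiological
# components using crude FFT-mask band-pass filters (assumed pass-bands).
import numpy as np

def bandpass(signal: np.ndarray, fs: float, lo_hz: float, hi_hz: float) -> np.ndarray:
    """Zero out FFT bins outside [lo_hz, hi_hz] and transform back."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < lo_hz) | (freqs > hi_hz)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Respective filter parameters for each component (assumed, typical at rest).
FILTER_BANK = {
    "respiration": (0.1, 0.5),  # ~6-30 breaths/min
    "cardiac":     (0.8, 3.0),  # ~48-180 beats/min
}

def split_components(signal: np.ndarray, fs: float) -> dict:
    """Apply every filter in the bank to the same sensor signal."""
    return {name: bandpass(signal, fs, lo, hi)
            for name, (lo, hi) in FILTER_BANK.items()}
```

Feeding the bank a signal containing a 0.25 Hz breathing component and a 1.2 Hz cardiac component returns each component in its own output channel.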
  • FIG. 1 is a schematic illustration of apparatus for monitoring a subject, in accordance with some applications of the present invention.
  • FIG. 2 is a schematic illustration of a blanket, in accordance with some applications of the present invention.
  • FIG. 3 is a flowchart showing steps that are performed by a computer processor in order to control a subject's body temperature, in accordance with some applications of the present invention.
  • FIG. 4 is a flowchart showing steps that are performed by a computer processor in order to monitor sleep apnea of a subject, in accordance with some applications of the present invention.
  • FIG. 5 is a schematic illustration of a sensor unit disposed under the seat of a vehicle, in accordance with some applications of the present invention.
  • FIGS. 6A-C are schematic illustrations of a sensor unit as shown in FIG. 5, in accordance with respective applications of the present invention.
  • FIGS. 7A-B are schematic illustrations of subject-monitoring apparatus, in accordance with some applications of the present invention.
  • FIG. 8 is a flowchart showing steps that are performed by a computer processor in order to monitor a subject who is pregnant, in accordance with some applications of the present invention.
  • FIGS. 9A-C show histograms of patients' cardiac interbeat intervals that were recorded in accordance with some applications of the present invention.
  • FIG. 10 shows components of a subject's cardiac cycle that were detected in accordance with some applications of the present invention.
  • FIG. 1 is a schematic illustration of subject-monitoring apparatus 20, in accordance with some applications of the present invention.
  • Apparatus 20 is generally used to monitor a subject 24, while he or she is in his or her bed in a home setting.
  • the subject-monitoring apparatus is used in a hospital setting.
  • Subject-monitoring apparatus 20 comprises a sensor 22 (e.g., a motion sensor) that is configured to monitor subject 24 .
  • Sensor 22 may be a motion sensor that is similar to sensors described in U.S. Pat. No. 8,882,684 to Halperin, which is incorporated herein by reference.
  • the term “motion sensor” refers to a sensor that senses the subject's motion (e.g., motion due to the subject's cardiac cycle, respiratory cycle, or large-body motion of the subject), while the term “sensor” refers more generally to any type of sensor, e.g., a sensor that includes an electromyographic sensor and/or an imaging sensor.
  • sensor 22 includes a sensor that performs monitoring of the subject without contacting the subject or clothes the subject is wearing, and/or without viewing the subject or clothes the subject is wearing.
  • the sensor may perform the monitoring without having a direct line of sight of the subject's body, or the clothes that the subject is wearing, and/or without any visual observation of the subject's body, or the clothes that the subject is wearing.
  • the sensor performs monitoring of the subject without requiring subject compliance (i.e., without the subject needing to perform an action to facilitate the monitoring that would not have otherwise been performed).
  • sensor 22 is disposed on or within the subject's bed, and configured to monitor the subject automatically, while the subject is in their bed.
  • sensor 22 may be disposed underneath the subject's mattress 26, such that the subject is monitored while she is lying upon the mattress, and while carrying out her normal sleeping routine, without the subject needing to perform an action to facilitate the monitoring that would not have otherwise been performed.
  • a computer processor 28, which acts as a control unit that performs the algorithms described herein, analyzes the signal from sensor 22.
  • computer processor 28 communicates with a memory 29.
  • computer processor 28 is embodied in a desktop computer 30, a laptop computer 32, a tablet device 34, a smartphone 36, and/or a similar device that is programmed to perform the techniques described herein (e.g., by downloading a dedicated application or program to the device), such that the computer processor acts as a special-purpose computer processor.
  • computer processor 28 is a dedicated computer processor that receives (and optionally analyzes) data from sensor 22 , and communicates with computer processors of one or more of the aforementioned devices, which act as external devices.
  • the subject communicates with (e.g., sends data to and/or receives data from) computer processor 28 via a user interface device 35.
  • computer processor is embodied in a desktop computer 30, a laptop computer 32, a tablet device 34, a smartphone 36, and/or a similar device that is programmed to perform the techniques described herein.
  • components of the device (e.g., the touchscreen, the mouse, the keyboard, the speakers, the screen)
  • computer processor 28 is a dedicated computer processor that receives (and optionally analyzes) data from sensor 22.
  • the dedicated computer processor communicates with computer processors of one or more of the aforementioned external devices (e.g., via a network), and the user interfaces of the external devices (e.g., the touchscreen, the mouse, the keyboard, the speakers, the screen) are used by the subject, as user interface device 35, to communicate with the dedicated computer processor and vice versa.
  • the external devices are programmed to communicate with the dedicated computer processor (e.g., by downloading a dedicated application or program to the external device).
  • user interface includes an input device such as a keyboard 38, a mouse 40, a joystick (not shown), a touchscreen device (such as smartphone 36 or tablet device 34), a touchpad (not shown), a trackball (not shown), a voice-command interface (not shown), and/or other types of user interfaces that are known in the art.
  • the user interface includes an output device such as a display (e.g., a monitor 42, a head-up display (not shown) and/or a head-mounted display (not shown)), and/or a different type of visual, text, graphics, tactile, audio, and/or video output device, e.g., speakers, headphones, smartphone 36, or tablet device 34.
  • the user interface acts as both an input device and an output device.
  • the processor generates an output on a computer-readable medium (e.g., a non-transitory computer-readable medium), such as a disk, or a portable USB drive.
  • FIG. 2 is a schematic illustration of a temperature control device, e.g., a blanket 50 (which is typically an electric blanket), in accordance with some applications of the present invention.
  • the temperature control device includes at least first and second sections corresponding to respective portions of a body of a single subject.
  • the blanket includes three types of sections: a trunk section 52 corresponding to the subject's trunk, leg sections 54 corresponding to the subject's legs, and arm sections 56 corresponding to the subject's arms.
  • a temperature-regulation unit 58 regulates respective portions of the subject's body to be at respective temperatures by, simultaneously, setting the temperature of the first section of the temperature control device to a first temperature, and setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature, and, optionally, setting the temperatures of additional sections of the temperature control device to further respective temperatures.
  • blanket 50 can be an over-blanket that is placed over the subject's body, or an under-blanket that is placed above the subject's mattress and beneath the subject (as shown).
  • the scope of the present invention includes any temperature control device that includes first and second sections corresponding to respective portions of a body of a single subject, for use with a temperature-regulation unit that regulates the respective portions of the subject's body to be at respective temperatures by, simultaneously, setting the temperature of the first section of the temperature control device to a first temperature, and setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature.
  • the temperature control device may include a mattress (e.g., an electric mattress), which includes built-in heating pads.
  • thermoregulation during sleep affects sleep quality.
  • the computer processor drives the temperature-regulation unit to regulate the temperatures of respective portions of the subject's body to be at respective temperatures, in the manner described herein, such as to improve sleep quality, shorten sleep latency, and/or better maintain sleep continuity.
  • the computer processor may drive the temperature-regulation unit to heat the subject's legs and/or arms to a greater temperature than the subject's trunk.
  • the computer processor may drive the temperature-regulation unit to cool one or more portions of the subject's body.
  • the computer processor drives the temperature-regulation unit to heat and/or cool respective portions of the subject's body to respective temperatures, in response to the subject's sleep stage, which is detected automatically by analyzing the sensor signal from sensor 22 .
  • FIG. 3 is a flowchart showing steps that are performed by a computer processor in order to control a subject's body temperature, in accordance with some applications of the present invention.
  • the computer processor receives a signal from sensor 22 , which is typically as described hereinabove.
  • the computer processor analyzes the sensor signal in order to determine the subject's current sleep stage. For example, the computer processor may determine that the subject is currently in a falling-asleep stage (prior to falling asleep), a beginning-sleep stage, a mid-sleep stage, an awakening stage, a premature awakening stage, an REM stage, or a slow-wave stage.
  • the sleep stage is detected based upon the sensor signal using techniques as described in US 2007/0118054 to Pinhas (now abandoned), which is incorporated herein by reference.
  • the computer processor (in step 64) adjusts the temperature of a first portion of the temperature-control device (e.g., the arm or leg portion of blanket 50), and/or separately (in step 66) adjusts the temperature of a second portion of the temperature-control device (e.g., the trunk portion of blanket 50).
  • the computer processor additionally adjusts the temperature of an additional room-climate regulation device, such as an air-conditioning unit (e.g., unit 44, FIG. 1), an electric heater, and/or a radiator.
  • the air-conditioning unit may be used to provide additional control of the temperature of the subject's trunk by controlling the temperature of the air that the subject inhales.
  • the computer processor drives the temperature-regulation unit to heat distal parts of the subject's body (e.g., the subject's arms and/or legs) to a higher temperature than the subject's trunk.
  • the computer processor will use different temperature profiles for different sleep states. For example, when the subject is in slow wave sleep, the computer processor may drive the temperature-regulation unit to keep temperatures lower than during other phases of the subject's sleep. Alternatively or additionally, when the subject wakes up during the night the computer processor may use a similar profile to that used when the subject is initially trying to fall asleep.
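The per-stage logic described above can be sketched as a simple profile lookup. The stage names and temperature values below are illustrative assumptions, not values specified in this application; they follow the described behavior of keeping distal sections warmer than the trunk, using lower temperatures during slow-wave sleep, and reusing the falling-asleep profile after a mid-night awakening:

```python
# Hypothetical per-sleep-stage temperature profiles (degrees Celsius).
# Distal sections (arms/legs) are kept warmer than the trunk, and
# slow-wave sleep uses lower temperatures overall.
PROFILES = {
    "falling_asleep": {"trunk": 31.0, "legs": 33.0, "arms": 33.0},
    "slow_wave":      {"trunk": 29.0, "legs": 31.0, "arms": 31.0},
    "rem":            {"trunk": 30.0, "legs": 32.0, "arms": 32.0},
}

def section_setpoints(sleep_stage, premature_awakening=False):
    """Return the per-section setpoints for the detected sleep stage.

    A premature awakening reuses the falling-asleep profile, mirroring
    the behavior described for mid-night awakenings.
    """
    if premature_awakening:
        sleep_stage = "falling_asleep"
    return PROFILES[sleep_stage]
```

In a real system the returned setpoints would be sent to the temperature-regulation unit for the corresponding blanket sections.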
  • the computer processor drives the temperature-regulation unit to warm the subject's trunk in order to gently wake up the subject.
  • the computer processor may use trunk warming to wake up the subject, based on having received an input of a desired time for the subject to wake up (e.g., via the user interface), or based on detecting that the current sleep phase of the subject is such that it would be a good time to wake up the subject.
  • a user designates temperature profiles corresponding to respective sleep stages, via a user input into the computer processor.
  • the temperature profile of any sleep stage will include respective temperatures for respective portions of the subject's body, and/or differences between the temperatures to which respective portions are heated or cooled.
  • the computer processor utilizes a machine learning algorithm, based upon which the computer processor analyzes the subject's response to different temperature profiles at different sleep stages and learns which temperature profiles at which sleep phases result in the best quality sleep for the subject.
  • the computer processor automatically designates temperature profiles to respective sleep stages.
  • the computer processor additionally adjusts the temperature of an additional room-climate regulation device, such as an air-conditioning unit, an electric heater, and/or a radiator.
  • an air-conditioning unit may be used to provide additional control of the temperature of the subject's trunk by controlling the temperature of the air that the subject inhales.
  • the temperature profiles of the respective sleep stages include a setting for the additional room-climate regulation device.
  • typically, the blanket is sized for use with a single subject, and includes separate regions whose temperatures are controlled separately from one another.
  • the length of the blanket is less than 250 cm, e.g., less than 220 cm, or less than 200 cm.
  • the width of the blanket is less than 130 cm, e.g., less than 120 cm, or less than 110 cm.
  • the mattress has similar dimensions to those described with respect to the blanket.
  • the temperature control device is a portion of a blanket or a mattress that is suitable for being used by two subjects (e.g., partners in a double bed).
  • a portion of the blanket or mattress that is configured to be placed underneath or over a single subject includes at least first and second sections (e.g., a trunk section corresponding to the subject's trunk, leg sections corresponding to the subject's legs, and/or arm sections corresponding to the subject's arms), and the temperature-regulation unit regulates the respective portions of the subject's body to be at respective temperatures by, simultaneously, setting the temperature of the first section of the temperature control device to a first temperature, and setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature, and, optionally, setting the temperature of additional sections of the temperature control device to further respective temperatures.
  • the techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference.
  • the computer processor may drive the user interface to prompt the subject to input changes to the temperature profiles corresponding to respective sleep stages, in response to a change in a relevant parameter. For example, in response to a change in season, an ambient temperature, an ambient humidity, and/or a going-to-sleep time (e.g., the subject is going to bed at an unusual time), the computer processor may drive the user interface to prompt the subject to re-enter his/her temperature profiles.
  • the computer processor may identify the change of the relevant parameter in a variety of ways, such as, for example, by receiving input from a sensor, or by checking the internet.
  • the computer processor calculates a sleep score of the subject. For example, the computer processor may calculate a score from one or more parameters such as a time to fall asleep, duration of sleep, or “sleep efficiency,” which is the percentage of in-bed time during which the subject is sleeping. For some applications, the score is calculated using one or more of the aforementioned parameters, such that a higher sleep score is indicative of a more restful sleeping session relative to a lower sleep score. The computer processor may then compare the sleep score to a baseline value, e.g., an average sleep score over a previous period of time.
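A minimal sketch of such a score is shown below. The application names the parameters (time to fall asleep, sleep duration, sleep efficiency) but does not specify a formula, so the weights and caps here are illustrative assumptions only:

```python
def sleep_efficiency(minutes_asleep, minutes_in_bed):
    """Percentage of in-bed time during which the subject was asleep."""
    return 100.0 * minutes_asleep / minutes_in_bed

def sleep_score(minutes_to_fall_asleep, minutes_asleep, minutes_in_bed):
    """Combine the parameters into a single score.

    Illustrative weights only: shorter sleep latency, longer sleep, and
    higher efficiency all raise the score.
    """
    efficiency = sleep_efficiency(minutes_asleep, minutes_in_bed)
    latency_penalty = min(minutes_to_fall_asleep, 60) / 60.0  # cap at 1 h
    duration_credit = min(minutes_asleep / 480.0, 1.0)        # cap at 8 h
    return 0.5 * efficiency + 30.0 * duration_credit - 20.0 * latency_penalty

def below_baseline(score, history):
    """Compare a session's score to the average over a previous period."""
    baseline = sum(history) / len(history)
    return score < baseline
```

A session with short latency and high efficiency then scores higher than a restless one, and `below_baseline` flags sessions that fall under the subject's historical average.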
  • the computer processor may drive the user interface to prompt the subject to re-enter new temperature profiles for respective sleep stages, since it is possible that the temperature profiles were a contributing factor in the subject's low sleep score.
  • the computer processor may drive the user interface to prompt the subject to input at least one factor that may have caused the low sleep score. The computer processor then controls the heating device in response to the input.
  • the computer processor computes a measure of relaxation, i.e., a relaxation score, for the subject, one or more times during a sleeping session. For example, a high relaxation score may be computed if the subject shows little movement, and little variation in both respiration rate and respiration amplitude. The relaxation score may be used to compute the sleep score. Alternatively or additionally, in response to a low relaxation score, the computer processor may immediately adjust the temperature of sections of the temperature control device.
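The relaxation score described above can be sketched as follows. The application states only that little movement and little variation in respiration rate and amplitude yield a high score; the coefficient-of-variation formula below is an assumed, illustrative way to express that:

```python
from statistics import mean, pstdev

def relaxation_score(movement_events, respiration_rates, respiration_amplitudes):
    """Illustrative relaxation measure: rises when the subject shows little
    movement and little variation in respiration rate and amplitude."""
    # Coefficient of variation for rate and amplitude (0 = perfectly steady).
    rate_cv = pstdev(respiration_rates) / mean(respiration_rates)
    amp_cv = pstdev(respiration_amplitudes) / mean(respiration_amplitudes)
    # Fewer movement events and lower variation yield a higher score.
    movement_term = 1.0 / (1.0 + movement_events)
    return 100.0 * movement_term * max(0.0, 1.0 - rate_cv - amp_cv)
```

A perfectly still subject with steady respiration scores 100; a restless subject with variable breathing scores much lower, which could trigger the immediate temperature adjustment described above.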
  • the computer processor in response to a low sleep score, adjusts the temperature profiles even without any input from the user, or the computer processor generates an output (e.g., via user interface device 35 ) that includes suggested temperature profiles, which the subject may edit and/or confirm via the user interface.
  • when the temperature control device is initially used by the subject, the computer processor is configured to perform a “sweep” (or “optimization routine”) over a plurality of different temperature profiles at respective sleep stages, in order to ascertain which profiles at which sleep stages are conducive to a higher sleep score, relative to other settings, e.g., which setting maximizes the sleep score. For example, over the course of several sleeping sessions, the computer processor may change the temperature profiles that are used at respective sleep stages in different ways, and in response thereto, determine the optimal temperature profiles.
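The sweep can be sketched as a simple search over candidate profiles, where each candidate is tried on a separate sleeping session and the one yielding the highest sleep score is kept. `run_session` is a hypothetical stand-in for a full night of use:

```python
def sweep_profiles(candidate_profiles, run_session):
    """Try each candidate profile on a separate sleeping session and keep
    the one that yields the highest sleep score.

    `run_session(profile)` is a stand-in for one night of use that returns
    the resulting sleep score; in a real system each call spans a night.
    """
    best_profile, best_score = None, float("-inf")
    for profile in candidate_profiles:
        score = run_session(profile)
        if score > best_score:
            best_profile, best_score = profile, score
    return best_profile, best_score
```

A more elaborate routine might re-run the sweep periodically, or per sleep stage, but the core selection step is the same.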
  • FIG. 4 is a flowchart showing steps that are performed by computer processor 28 in order to monitor sleep apnea of the subject, in accordance with some applications of the present invention.
  • sensor 22 is configured to monitor the subject during a sleeping session of the subject.
  • the computer processor receives and analyzes the sensor signal (step 70 ). Based on the analysis of the signal, the computer processor identifies the positions of the subject's body at respective times during the sleeping session (step 72 ). For example, the system may identify when during the sleeping session the subject was lying on his/her side, when during the sleeping session the subject was lying on his/her back (i.e., supine), and when during the sleeping session the subject was lying on his/her stomach.
  • the computer processor determines the positions of the subject's body by analyzing the sensor signal using analysis techniques as described in U.S. Pat. No. 8,821,418 to Meger, which is incorporated herein by reference.
  • a calibration process is performed by the processor.
  • the processor may instruct the subject to lie on his/her back, side, and stomach, each for a given time period.
  • the processor analyzes the subject's cardiac and respiratory related waveforms, and/or other signal components of the sensor signal that are recorded when the subject is lying in respective positions. Based upon this analysis, the processor correlates respective signal characteristics to respective positions of the subject. Thereafter, the processor identifies the subject's position based upon characteristics of the sensor signal.
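One simple way to realize such a calibrate-then-classify scheme is a nearest-centroid classifier over extracted signal features. This is an assumed illustration, not the waveform-analysis technique of the cited patent; the feature vectors here are hypothetical:

```python
from math import dist

def calibrate(labelled_features):
    """Average the signal features recorded while the subject lies in each
    instructed position, giving one feature centroid per position."""
    centroids = {}
    for position, samples in labelled_features.items():
        n = len(samples)
        centroids[position] = tuple(sum(s[i] for s in samples) / n
                                    for i in range(len(samples[0])))
    return centroids

def classify_position(features, centroids):
    """Report the calibrated position whose centroid is closest (Euclidean
    distance) to the current signal features."""
    return min(centroids, key=lambda p: dist(features, centroids[p]))
```

After the instructed back/side/stomach recordings produce the centroids, each new feature vector from the sensor signal is assigned to the nearest position.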
  • based upon the analysis of the sensor signal, the computer processor identifies apnea events that occur during the sleeping session (step 74). For example, the computer processor may identify apnea events by analyzing the sensor signal using analysis techniques as described in US 2007/0118054 to Pinhas (now abandoned), which is incorporated herein by reference.
  • the computer processor identifies a correspondence between positions of the subject and occurrences of apnea events of the subject during the sleeping session.
  • the computer processor typically generates an output on an output device (e.g., any one of the output devices described with reference to FIG. 1 ), in response to the identified correspondence.
  • the computer processor may generate an indication of:
  • a recommended position for the subject to assume while sleeping, e.g., “Try sleeping on your side.”
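The correspondence between positions and apnea events can be sketched as per-position bookkeeping: time spent in each position, events occurring during that time, and a resulting events-per-hour rate. The interval and event representations below are assumptions for illustration:

```python
def apnea_rate_by_position(position_intervals, apnea_times):
    """Count apnea events per hour spent in each position.

    `position_intervals` is a list of (position, start_s, end_s) tuples;
    `apnea_times` is a list of event times in seconds.
    """
    seconds, events = {}, {}
    for position, start, end in position_intervals:
        seconds[position] = seconds.get(position, 0) + (end - start)
        events.setdefault(position, 0)
        for t in apnea_times:
            if start <= t < end:
                events[position] += 1
    return {p: events[p] / (seconds[p] / 3600.0) for p in seconds}

def recommended_position(rates):
    """Suggest the position with the fewest apnea events per hour."""
    return min(rates, key=rates.get)
```

If the supine rate is markedly higher than the side rate, the output device could display the "Try sleeping on your side" recommendation mentioned above.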
  • the analysis of the sensor signal (step 70 ), the identification of subject positions (step 72 ), the identification of apnea events (step 74 ), and/or the identification of correspondence between the apnea events and the subject positions (step 76 ) are performed in real time, as the sensor signal is received by the processor. Alternatively, one or more of the aforementioned steps are performed subsequent to the sleeping session.
  • in response to detecting that the subject is lying in a given position that the processor has determined to cause the subject to undergo apnea events, the computer processor generates an alert and/or nudges the subject to change positions. For example, in response to detecting that the subject is in a supine position (and having determined that lying in this position causes the subject to undergo apnea events), the computer processor may cause the subject's bed to vibrate, or may adjust the tilt angle of the bed or a portion thereof.
  • the apparatus described herein may be used with a bed or mattress with an adjustable tilt angle, and/or an inflatable pillow which, when activated, inflates or deflates to vary the elevation of the head of the subject as desired.
  • the pillow's air pressure level is changed, and/or the tilt angle of the bed or the mattress is changed, in order to change the patient's posture and prevent an upcoming apnea event, or stop a currently-occurring apnea event.
  • a processor as described with reference to FIG. 4 is used in combination with a vibrating mechanism and/or an adjustable resting surface.
  • the vibrating mechanism may include a vibrating mechanism disposed underneath mattress 26 and/or a vibrating wristwatch.
  • the subject is more likely to snore, cough, or have an apnea episode when the subject is in a supine position.
  • the computer processor reduces the frequency of snoring, coughing, and/or apnea of subject 24 by encouraging (e.g., by “nudging”) the subject to move from a supine position to a different position.
  • the computer processor identifies the subject's sleeping position by analyzing the sensor signal from sensor 22 .
  • the computer processor drives the vibrating mechanism to vibrate, and/or adjusts a parameter (e.g., an angle) of the surface upon which the subject is lying.
  • the vibration typically nudges the subject to change his posture, while the adjustment of the parameter may nudge the subject to change his posture or actually move the subject into the new posture.
  • an inflatable pillow is used and the computer processor adjusts a level of inflation of the inflatable pillow.
  • the computer processor may drive an inflating mechanism to inflate the inflatable pillow, by communicating a signal to the inflating mechanism.
  • the computer processor is configured to identify a sleep stage of the subject.
  • the computer processor drives the vibrating mechanism to vibrate, and/or adjusts the parameter of the resting surface, further in response to the identified sleep stage.
  • the computer processor may drive the vibrating mechanism to vibrate, and/or adjust the parameter of the resting surface, in response to the identified sleep stage being within 5 minutes of an onset or an end of an REM sleep stage, since at these points in time, the “nudging” or moving is less likely to disturb the subject's sleep.
  • FIG. 5 is a schematic illustration of a sensor unit 80 disposed under a seat 82 of a vehicle, in accordance with some applications of the present invention.
  • Sensor unit 80 is configured to monitor physiological parameters of a subject who is sitting on seat 82 , and to generate a sensor signal in response thereto.
  • the subject is an operator of the vehicle (e.g., the driver of a car, the pilot of an airplane, the driver of a train, etc.).
  • a computer processor, which is typically like computer processor 28 described herein, is configured to receive and analyze the sensor signal for any one of a number of reasons.
  • the computer processor derives vital signs of the subject (such as heart rate, respiratory rate, and/or heart-rate variability) from the sensor signal. For some applications, the computer processor compares the subject's vital signs to a baseline of the subject that was derived during previous occasions when the subject operated the vehicle. In response thereto, the computer processor may determine that the subject's vital signs have changed substantially from the baseline, that the subject is unwell, drowsy, asleep, and/or under the influence of drugs or alcohol. In response thereto, the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
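The baseline comparison described above can be sketched as a simple threshold check on each vital sign. The 20% tolerance and the response labels are illustrative assumptions, not values from the application:

```python
def vitals_deviate(current, baseline, tolerance=0.2):
    """Flag when any vital sign differs from the subject's baseline by more
    than `tolerance` (as a fraction of the baseline value).

    The 20% threshold is an illustrative assumption.
    """
    return any(abs(current[k] - baseline[k]) / baseline[k] > tolerance
               for k in baseline)

def respond(current, baseline):
    """Decide what the system should do for this reading (labels assumed)."""
    if vitals_deviate(current, baseline):
        return "alert_driver_and_remote"
    return "ok"
```

A production system would also smooth the readings over time and distinguish between the possible causes (drowsiness, illness, intoxication) before escalating to disabling the vehicle.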
  • the computer processor integrates the analysis of the sensor signal from sensor unit 80 with the analysis of a sensor signal from an additional sensor, which may be disposed in the subject's bed, for example.
  • the computer processor may determine that the subject has not had enough sleep based upon the analysis of the signals from both sensors.
  • the computer processor may derive, from the combination of the sensor signals, that the subject has had enough sleep, but appears to be unwell, and/or under the influence of drugs or alcohol.
  • the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
  • sensor units 80 are disposed underneath more than one seat in the vehicle.
  • sensor units may be disposed underneath the seats of a pilot and a co-pilot in an airplane (e.g., as described in WO 16/035073 to Shinar, which is incorporated herein by reference).
  • sensor units may be disposed underneath each of the seats in an airplane or a car.
  • the computer processor may determine that a child has been left alone in a car, and may generate an alert in response thereto. For example, the alert may be generated on the driver's and/or parents' cellular phone(s). Alternatively or additionally, the computer processor may determine the number of people in the car.
  • the sensor is typically configured to distinguish between a person who is disposed upon the seat and an inanimate object (such as a suitcase or backpack) that is disposed upon the seat.
  • the computer processor may generate seatbelt alerts, for example.
  • the computer processor may automatically communicate with the billing system of a toll road for which prices are determined based upon the number of passengers in the car.
  • sensor unit 80 is configured to generate a sensor signal that is such that the computer processor is able to distinguish between artifacts from motion of the vehicle, and motion that is indicative of physiological parameters of the subject.
  • the sensor unit includes (a) a housing, (b) at least one first motion sensor disposed within the housing, such that the first motion sensor generates a first sensor signal that is indicative of the motion of the vehicle, and (c) at least one second motion sensor disposed within the housing, such that the second motion sensor generates a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle.
  • the computer processor is configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • the first motion sensor is disposed within the housing, such that the first motion sensor is isolated from the motion of the subject, and/or such that the first motion sensor only detects motion that is due to motion of the vehicle.
  • the computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal(s), and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
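One way to perform the subtraction (a least-squares sketch, not the method claimed in the application) is to fit a scale factor relating the vehicle-only signal to the combined signal, then subtract the scaled vehicle signal:

```python
def subtract_vehicle_motion(vehicle_signal, combined_signal):
    """Estimate the subject's motion from the two sensor signals.

    Fits a scale factor `a` minimizing sum((combined - a * vehicle)^2),
    then subtracts the scaled vehicle-only signal from the combined one.
    Illustrative least-squares sketch on equal-length sample lists.
    """
    num = sum(v * c for v, c in zip(vehicle_signal, combined_signal))
    den = sum(v * v for v in vehicle_signal)
    a = num / den if den else 0.0
    return [c - a * v for v, c in zip(vehicle_signal, combined_signal)]
```

In practice the coupling between the sensors is frequency dependent, so a real implementation would likely use an adaptive filter rather than a single scalar, but the principle (remove the component explained by the vehicle-only sensor) is the same.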
  • FIGS. 6A-C are schematic illustrations of sensor unit 80 , in accordance with respective applications of the present invention.
  • sensor unit 80 includes a housing 90 at least a portion 92 of which is flexible.
  • a fluid compartment 94, which is filled with a gas or a liquid, is disposed on an inner surface of the housing.
  • a first motion sensor 96 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on a surface of the fluid compartment, and is configured to generate a first sensor signal.
  • two or more first motion sensors are disposed on the surface of the fluid compartment, and each of the first motion sensors generates a respective sensor signal.
  • a second motion sensor 98 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on at least one inner surface of flexible portion 92 of housing 90 .
  • the second motion sensor is configured to generate a second sensor signal.
  • two or more motion sensors 98 are disposed on respective inner surfaces of flexible portion 92 of housing 90 , and each of motion sensors 98 generates a respective sensor signal.
  • the computer processor is configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • fluid compartment 94 isolates first motion sensor 96 from motion of the subject who is sitting on the seat, such that motion sensor 96 only detects motion that is due to motion of the vehicle.
  • the second motion sensor(s) detect(s) both motion of the vehicle and motion of the subject, the motion of the subject being conveyed to the second motion sensor(s) via the flexible portion of the housing.
  • the computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal, and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
  • sensor unit 80 includes a housing 100 that includes a flexible portion 102 and a rigid portion 104 .
  • at least one first motion sensor 106 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on at least one inner surface of the rigid portion of the housing, and is configured to generate a first sensor signal.
  • two or more first motion sensors are disposed on respective inner surfaces of the rigid portion of the housing, and each of motion sensors 106 generates a respective sensor signal.
  • At least one second motion sensor 108 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on at least one inner surface of flexible portion 102 of housing 100 .
  • the second motion sensor is configured to generate a second sensor signal.
  • two or more motion sensors 108 are disposed on respective inner surfaces of flexible portion 102 of housing 100 , and each of motion sensors 108 generates a respective sensor signal.
  • the computer processor is configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • the rigidity of the rigid portion of the housing isolates first motion sensor(s) 106 from motion of the subject who is sitting on the seat, such that first motion sensor(s) 106 only detects motion that is due to motion of the vehicle.
  • the second motion sensor(s) detect(s) both motion of the vehicle and motion of the subject, the motion of the subject being conveyed to the second motion sensor(s) via the flexible portion of the housing.
  • the computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal(s), and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
  • a sensor unit as described with reference to FIGS. 5-6C is used in an airplane, and the computer processor generates one or more of the following outputs, based upon analysis of the sensor signal:
  • An alert may be generated if, by analyzing the sensor signal, the computer processor identifies an elevated stress level of a subject, e.g., by identifying an elevated heart rate, and/or a decreased stroke volume, e.g., as described in WO 2015/008285 to Shinar, which is incorporated herein by reference. For example, in response to the pilot experiencing an elevated stress level, the computer processor may generate an alert to another member of the flight crew, and/or individuals on the ground. The computer processor may also analyze the signal of the co-pilot, and generate an alert in response to both the pilot and co-pilot experiencing an elevated stress level, since the presence of an elevated stress level in both individuals at the same time is likely to be indicative of an emergency situation. Similarly, an alert may be generated if two or more passengers experience an elevated stress level at the same time.
  • An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that it is likely that the subject is experiencing, or will soon experience, a clinical event, such as a heart attack. For example, if the pilot or one of the passengers is experiencing a heart attack, members of the flight crew, and/or a physician who is travelling on the airplane, may be alerted to the situation.
  • An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that it is at least somewhat likely that the subject is a carrier of a disease, such as severe acute respiratory syndrome (SARS). For example, if the computer processor identifies a change in the baseline heart rate of the subject without any correlation to motion of the subject, the computer processor may ascertain that the subject has likely experienced a rapid change in body temperature, which may indicate that the subject is sick. (The baseline heart rate is typically an average heart rate over a period of time, e.g., 1-2 hours.) In response, the computer processor may alert the flight crew to isolate the subject.
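The sickness heuristic described above (a heart-rate rise over the 1-2 hour baseline that is uncorrelated with subject motion) can be sketched as a small rule; the 10 bpm threshold is an illustrative assumption:

```python
def possible_fever(baseline_hr, current_hr, recent_motion, hr_rise=10):
    """Flag a possible rapid body-temperature change: heart rate rose well
    above the 1-2 hour baseline heart rate with no accompanying subject
    motion. The 10 bpm rise threshold is an illustrative assumption.
    """
    return (current_hr - baseline_hr) >= hr_rise and not recent_motion
```

When the flag is raised, the system could generate the described alert to the flight crew to isolate the subject.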
  • An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that the subject (in particular, the pilot or co-pilot) is drowsy or sleeping.
  • a sleep study may be performed.
  • the computer processor may analyze the sensor signals from various passengers, and identify which passengers were sleeping at which times.
  • the computer processor may generate an output to help the airline improve the sleeping conditions on their aircraft (e.g., by reducing lighting, or increasing leg room).
  • the computer processor may also control the lighting, temperature, or other cabin-environment parameters, in order to facilitate a more pleasant travelling experience. For example, upon detecting that a significant number of passengers are sleeping or are trying to fall asleep, the lights in the cabin may be dimmed, and/or the movie that is playing may be stopped. Alternatively or additionally, meals may be served to the passengers only if a given number of passengers are awake. To help prevent deep vein thrombosis (DVT), passengers may be prompted to stand up and take a walk, if the computer processor detects that they have been sitting in place for too long.
  • FIGS. 7A-B are schematic illustrations of subject-monitoring apparatus, in accordance with some applications of the present invention.
  • Components of subject-monitoring apparatus 20 are as described hereinabove with reference to FIG. 1 .
  • sensor 22 is disposed under a chair 111 that the subject sits upon, and is configured to monitor the subject while the subject is sitting on the chair, in the manner described hereinabove, mutatis mutandis.
  • techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference.
  • Subject-monitoring apparatus 20 comprises a sensor 22 , which is generally as described hereinabove, and is configured to monitor subject 24 .
  • Subject-monitoring apparatus 20 includes a control unit, which is typically a computer processor, such as computer processor 28 described hereinabove. As described hereinabove, the computer processor typically communicates with a memory 29.
  • the computer processor is typically a control unit that performs the algorithms described herein, including analyzing the signal from sensor 22 . It is noted that, in general, in the specification and claims of the present application, the terms “computer processor” and “control unit” are used interchangeably, since steps of the techniques described herein are typically performed by a computer processor that functions as a control unit. Therefore, the present application refers to component 28 both as a “computer processor” and a “control unit.”
  • computer processor 28 controls a property (e.g., the content, genre, volume, frequency, and/or phase-shift) of a sound signal, and drives a speaker 110 to play the sound signal.
  • the property of the sound signal is controlled such as to help the subject fall asleep or remain asleep.
  • the computer processor may select a sound signal of the “relaxing nature sounds” genre, and may further select the content of the signal to be the sound of waves hitting the seashore.
  • the computer processor may further set the frequency of the sound signal (e.g., the frequency of the waves) to an offset less than the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate.
  • the computer processor controls the offset, in response to analyzing the sensor signal; for example, as the heart rate of the subject approaches a target “relaxed” heart rate, the computer processor may reduce the offset, such that the frequency of the sound signal is very close to or identical with the subject's heart rate. As the subject begins to fall asleep, the computer processor may reduce the volume of the sound signal.
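The shrinking-offset behavior described above can be sketched as a schedule that sets the sound's repetition rate slightly below the current heart rate and closes the gap as the heart rate approaches the target. The linear schedule and the 6 bpm maximum offset are illustrative assumptions:

```python
def sound_frequency(heart_rate, target_rate, max_offset=6.0):
    """Set the sound's repetition rate (beats/min units) slightly below the
    current heart rate, shrinking the offset as the heart rate nears the
    target 'relaxed' rate. Linear schedule and 6 bpm cap are illustrative.
    """
    gap = max(heart_rate - target_rate, 0.0)
    offset = min(max_offset, gap * 0.5)  # offset vanishes as the gap closes
    return heart_rate - offset
```

Far from the target the sound runs a full 6 bpm below the heart rate; near the target the offset shrinks toward zero, so the sound frequency becomes nearly identical to the heart rate, as described above.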
  • the computer processor controls a phase-shift of the sound signal with respect to a cardiac signal and/or a respiratory signal of the subject.
  • the computer processor may cause the sound of a wave hitting the seashore to occur a given amount of time (e.g., 300 milliseconds) before or after each heartbeat of the subject, or a given amount of time (e.g., 1 second) after each expiration of the subject.
  • the computer processor ascertains that the subject is trying to fall asleep, at least in response to analyzing the sensor signal. For example, by analyzing the sensor signal, the computer processor may ascertain that the subject is awake and is exhibiting a large amount of movement indicative of restlessness in bed. Alternatively or additionally, the ascertaining is in response to one or more other factors, such as a signal from a light sensor that indicates a low level of ambient light in the room, and/or the time of day. In response to ascertaining that the subject is trying to fall asleep, the computer processor controls the property of the sound signal, as described hereinabove.
  • the computer processor, by analyzing the sensor signal, ascertains a sleep stage of the subject, and controls the property of the sound signal in response to the ascertained sleep stage. For example, in response to ascertaining that the subject has entered a slow-wave (i.e., deep) sleep stage, the volume of the sound signal may be reduced to a relatively low level (e.g., zero).
  • the computer processor may use one or more of the techniques described in (a) US 2007/0118054 to Pinhas (now abandoned), and/or (b) Shinar et al., Computers in Cardiology 2001; Vol.
  • the computer processor controls the property of the sound signal further in response to a historical physiological parameter of the subject that was exhibited in response to a historical sound signal. For example, the computer processor may “learn” the subject's typical responses to particular sound-signal properties, and control the sound signal in response thereto. Thus, for example, if the subject has historically responded well to a “relaxing nature sounds” genre, but less so to a “classical music” genre, the computer processor may select the former genre for the subject.
  • the computer processor looks at some or all of historical physiological parameters such as a quality of sleep, a time-to-fall-asleep, a heart-rate-variability, a change in heart rate, a change in respiratory rate, a change in heart-rate-variability, a change in blood pressure, a rate of change in heart rate, a rate of change in respiratory rate, a rate of change in heart-rate-variability, and a rate of change in blood pressure.
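The "learning" described above can be sketched, for example, as choosing the genre with the best historical outcome; here time-to-fall-asleep is used as the outcome measure, and all names are illustrative:

```python
def choose_genre(history):
    """history maps each genre to the list of time-to-fall-asleep values
    (in minutes) observed when that genre was played to the subject.
    Select the genre with the lowest mean historical sleep-onset time."""
    return min(history, key=lambda genre: sum(history[genre]) / len(history[genre]))
```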
  • the computer processor controls the frequency of the sound signal by synthesizing the sound signal, or by selecting a pre-recorded sound signal that has the desired frequency; in other words, the computer processor selects the content of the signal, without the user's input.
  • the computer processor selects content of the sound signal in response to a manual input, e.g., an input entered via user interface device 35 ( FIG. 1 ).
  • the subject may select a particular piece of classical music, and the computer processor may then control properties (such as the frequency, i.e., the tempo) of that particular piece. This may be done, for example, using appropriate software, such as Transcribe!™ by Seventh String Software of London, UK.
  • the computer processor controls a property of light (such as intensity, flicker frequency, or color) emitted by a light 112 in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110 , mutatis mutandis.
  • the computer processor may select a light signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep.
  • the computer processor may modulate the property of the light at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal.
  • the computer processor may ascertain a sleep stage of the subject, and modulate the property of the light in response to the ascertained sleep stage.
  • the computer processor controls the property of the light further in response to a historical physiological parameter of the subject that was exhibited in response to a historical light signal.
  • the computer processor may “learn” the subject's typical responses to particular light-signal properties, and control the light in response thereto.
  • the computer processor may control parameters of light 112 , as an alternative to, or in addition to, controlling properties of the sound that is generated by speaker 110 .
  • the computer processor controls a property of light (such as intensity, flicker frequency, or color) that is emitted by a screen 122 of a device that the subject is using in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110 , mutatis mutandis.
  • the device may be a laptop computer 32 ( FIG. 1 ), a tablet device 34 ( FIG. 1 ), a smartphone 36 ( FIG. 7B ), and/or a TV 124 ( FIG. 7B ).
  • the computer processor may select a light signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep.
  • the computer processor may modulate the property of the light at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal.
  • the computer processor may ascertain a sleep stage of the subject, and modulate the property of the light in response to the ascertained sleep stage.
  • the computer processor controls the property of the light further in response to a historical physiological parameter of the subject that was exhibited in response to a historical light signal.
  • the computer processor may “learn” the subject's typical responses to particular light-signal properties, and control the light in response thereto.
  • the computer processor may control parameters of light emitted by screen 122 , as an alternative to, or in addition to, controlling parameters of the sound that is generated by speaker 110 , and/or light that is generated by light 112 .
  • a vibrating element 126 is disposed underneath a surface of chair 111 upon which the subject sits. Alternatively (not shown), a vibrating element may be disposed underneath the surface of the bed upon which the subject lies.
  • the computer processor controls a property of the vibration (such as vibrating frequency, or a strength of vibration) that is applied to the subject by the vibrating element, in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110 , mutatis mutandis.
  • the computer processor may select a vibration signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep.
  • the computer processor may modulate the property of the vibration at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal.
  • the computer processor may ascertain a sleep stage of the subject, and modulate the property of the vibration in response to the ascertained sleep stage.
  • the computer processor controls the property of the vibration further in response to a historical physiological parameter of the subject that was exhibited in response to a historical vibration signal.
  • the computer processor may “learn” the subject's typical responses to particular vibration-signal properties, and control the vibrating element in response thereto.
  • the computer processor may control parameters of the vibration of the vibrating element, as an alternative to, or in addition to, controlling parameters of the sound that is generated by speaker 110 , and/or light that is generated by light 112 or by screen 122 .
  • in response to analysis of the signal from sensor 22 , the computer processor controls a property of a stimulus-providing device, in a manner that changes a physiological parameter of the subject, such as the subject's heart rate, respiration rate, or sleep latency period.
  • the stimulus-providing device may provide an audio stimulus (e.g., speaker 110 ), a visual stimulus (e.g., light 112 or screen 122 ), or a tactile stimulus (e.g., vibrating element 126 ).
  • the stimulus is provided to the subject in a manner that does not require any compliance by the subject, during the provision of the stimulus to the subject.
  • certain actions may need to be performed.
  • the term “without requiring subject compliance” should not be interpreted as excluding such actions.
  • the subject may perform routine activities (such as browsing the internet, or watching TV), and while the subject is performing routine activities, the computer processor automatically controls a property of the stimulus that is provided to the subject in the above-described manner.
  • the stimulus is provided to the subject in a manner that does not require the subject to consciously change the physiological parameter upon which the stimulus has an effect. Rather, the stimulus is provided to the subject such that the physiological parameter of the subject is changed without requiring the subject to consciously adjust the physiological parameter.
  • FIG. 8 is a flowchart showing steps that are performed by a computer processor in order to monitor a subject who is pregnant, in accordance with some applications of the present invention.
  • a pregnant woman's heart rate is typically expected to increase during pregnancy and be higher than the woman's heart rate prior to pregnancy.
  • a female subject is monitored using sensor 22 before pregnancy.
  • Computer processor receives the sensor signal (step 130 ), analyzes the sensor signal (step 132 ), and, based upon the analysis, determines a baseline heart rate (e.g., a baseline average daily heart rate, or a baseline heart rate at a given time period of the day, and/or at a given period of the subject's circadian cycle) for the subject (step 134 ).
  • the computer processor determines a pregnancy heart rate measure, which is indicative of what the subject's heart rate is expected to be (e.g., what the average daily heart rate, or the heart rate at a given time period of the day, and/or at a given period of the subject's circadian cycle is expected to be) during a healthy pregnancy (step 136 ).
  • the computer processor determines a range of heart rates that are considered to be healthy when the subject is pregnant, based upon the determined baseline heart rate.
  • the computer processor receives the sensor signal (step 138 ), analyzes the sensor signal (step 140 ), and based upon the analysis of the signal determines the subject's heart rate (step 142 ).
  • the computer processor compares the heart rate to the pregnancy heart rate measure that was determined based upon the baseline heart rate (step 144 ). Based on the comparison, the computer processor determines whether the subject's pregnancy is healthy. For some applications, the computer processor generates an output (e.g., an alert) on an output device (as described hereinabove), in response to the comparison (step 146 ). For some applications, in response to detecting that the subject's heart rate has returned to the pre-pregnancy baseline heart rate, the computer processor generates an output that is indicative of a recommendation to visit a physician.
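The comparison steps of FIG. 8 can be sketched as follows; the rise bounds and alert texts are illustrative placeholders, not clinical values from the specification:

```python
def pregnancy_hr_range(baseline_hr, min_rise=5.0, max_rise=20.0):
    """Expected healthy heart-rate range during pregnancy, derived from the
    subject's pre-pregnancy baseline heart rate."""
    return (baseline_hr + min_rise, baseline_hr + max_rise)

def check_pregnancy_hr(current_hr, baseline_hr):
    """Compare the current heart rate to the pregnancy heart rate measure
    and return a status (an alert would be routed to an output device)."""
    lo, hi = pregnancy_hr_range(baseline_hr)
    if current_hr < lo:
        return "alert: heart rate near pre-pregnancy baseline"
    if current_hr > hi:
        return "alert: heart rate above expected pregnancy range"
    return "ok"
```

A return toward the pre-pregnancy baseline would correspond, in the scheme above, to the output that recommends visiting a physician.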
  • FIGS. 9A-C show histograms of patients' cardiac interbeat intervals that were recorded in accordance with some applications of the present invention.
  • sensor 22 performs monitoring of the subject without contacting the subject or clothes the subject is wearing, and/or without viewing the subject or clothes the subject is wearing.
  • the sensor is configured to detect the subject's cardiac cycle, using techniques as described herein. In some cases, typically due to the non-contact nature of the sensing, some of the subject's heartbeats are not reliably detected.
  • the computer processor determines a quality indicator that indicates the quality of the sensed heartbeat. For example, the computer processor may determine the signal-to-noise ratio of the signal, and compare the signal-to-noise ratio to a threshold.
  • the computer processor selects a subset of heartbeats, based upon the qualities of the heartbeats, and some steps of the subsequent analysis (as described herein) are performed only with respect to the subset of heartbeats. For some applications, only in cases in which two consecutive heart beats have a quality indicator that exceeds a threshold, the interbeat interval is calculated and/or is selected for use in subsequent analysis. For some applications, the computer processor builds a histogram of the selected interbeat intervals. The computer processor analyzes the selected interbeat intervals over a period of time, and in response thereto, the computer processor determines whether the subject is healthy or is suffering from arrhythmia, which type of arrhythmia the subject is suffering from, and/or identifies or predicts arrhythmia episodes. For example, the computer processor may build a histogram of the selected interbeat intervals and may perform the above-described steps by analyzing the histogram.
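The quality-based selection of interbeat intervals described above can be sketched as follows (names are illustrative):

```python
def select_interbeat_intervals(beat_times_ms, qualities, quality_threshold):
    """Compute an interbeat interval only where BOTH consecutive heartbeats
    have a quality indicator exceeding the threshold; beats sensed with low
    quality are excluded from the subsequent histogram analysis."""
    intervals = []
    for i in range(1, len(beat_times_ms)):
        if qualities[i - 1] > quality_threshold and qualities[i] > quality_threshold:
            intervals.append(beat_times_ms[i] - beat_times_ms[i - 1])
    return intervals
```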
  • FIGS. 9A-C show sample histograms that were constructed using the above-described technique.
  • the x-axes of the histograms measure the time at which the interbeat-interval measurement was recorded, the y-axes measure the interbeat interval, and the color legend indicates the amplitude of the histogram at each interbeat interval (with a lighter color representing greater amplitude).
  • FIG. 9A shows measurements recorded from a healthy subject, there being only one peak at approximately 900 ms.
  • FIG. 9B is the histogram of an arrhythmic subject, the histogram including two dominant peaks shown by the two light lines at approximately 450 ms and 800 ms.
  • FIG. 9C is the histogram of a subject who starts with a normal cardiac rhythm and at about an x-axis time of 5,500 sec. starts to show an arrhythmia that is manifested by the much wider distribution of the histogram.
  • an alert is generated that an arrhythmia event may be taking place.
  • the computer processor may generate an alert in response to identifying that the width of a peak of a histogram exceeds a threshold (or performing an equivalent algorithmic operation). For example, the width of the peak may be compared to a threshold that is determined based upon population averages according to the age and/or other indications of the subject (such as a level of fitness of the subject).
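One way to implement the peak-width check (an "equivalent algorithmic operation" in the sense used above) is to bin the selected interbeat intervals and measure the spread of the bins reaching at least half the peak count; the bin width and threshold handling below are illustrative:

```python
from collections import Counter

def peak_width_exceeds(intervals_ms, width_threshold_ms, bin_ms=25):
    """Histogram the interbeat intervals into fixed-width bins and measure
    the full width of the bins whose count reaches at least half of the
    dominant peak; a wide peak suggests an irregular rhythm."""
    counts = Counter(iv // bin_ms for iv in intervals_ms)
    half_max = max(counts.values()) / 2.0
    bins = [b for b, c in counts.items() if c >= half_max]
    width_ms = (max(bins) - min(bins) + 1) * bin_ms
    return width_ms > width_threshold_ms
```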
  • in response to identifying two distinct peaks in a histogram that is plotted using the techniques described herein (or performing an equivalent algorithmic operation), the computer processor performs the following steps.
  • the computer processor identifies heartbeats belonging to respective interbeat interval groups (i.e., which heartbeats had an interbeat interval that corresponds to a first one of the peaks, and which heartbeats had an interbeat interval corresponding to the second one of the peaks.)
  • the average amplitude of the signal of each of these groups is then calculated.
  • the computer processor generates an output that is indicative of the average amplitude of each of the peaks, and/or the interbeat interval of each of the peaks.
  • the computer processor automatically determines a condition of the subject. For example, the computer processor may determine which category of arrhythmia the subject is suffering from, e.g., atrial fibrillation or ventricular fibrillation.
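The two-peak grouping and per-group amplitude averaging described above can be sketched as follows (the peak positions would come from the histogram analysis; names are illustrative):

```python
def mean_amplitude_per_peak(intervals_ms, amplitudes, peak1_ms, peak2_ms):
    """Assign each heartbeat to the nearer of the two histogram peaks
    (by its interbeat interval), then report the mean signal amplitude
    of each group."""
    groups = {peak1_ms: [], peak2_ms: []}
    for interval, amplitude in zip(intervals_ms, amplitudes):
        nearer = peak1_ms if abs(interval - peak1_ms) <= abs(interval - peak2_ms) else peak2_ms
        groups[nearer].append(amplitude)
    return {peak: sum(vals) / len(vals) for peak, vals in groups.items() if vals}
```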
  • although the analysis of the interbeat intervals is described as being performed using histogram analysis, the techniques described herein may be combined with other types of analysis that would yield similar results, mutatis mutandis.
  • the computer processor may perform algorithmic steps that do not include a step of generating a histogram, but which analyze the subject's interbeat interval over time, in a similar manner to that described hereinabove.
  • FIG. 10 shows components of a subject's cardiac cycle that were detected, in accordance with some applications of the present invention.
  • sensor 22 is used to monitor a cardiac-related signal of the subject.
  • the cardiac-related signal is filtered using a bank of matched filters with varying filter parameters (e.g., varying width properties).
  • one of the filtered signals is selected by the computer processor.
  • the filter having the greatest signal-to-noise ratio may be selected, by selecting the filter that generates the highest ratio of the main lobe to the side lobes in the filtered signal.
  • the filters are designed to have a main lobe with a full-width-half-maximum value that fits a human biological beat, as recorded with the contact-free sensor under the mattress.
  • the bank of filters typically includes filters having a range of relevant full-width-half-maximum values for biological signals.
  • the filters are zero-mean, e.g., in order to remove any trends, movements or respiration.
  • the selection of which filter to use is repeated in response to certain events. For some applications, the selection of a filter is repeated if the signal quality falls below a threshold. Alternatively or additionally, the filter selection is repeated at fixed time intervals (e.g., once every 5, 10, or 15 minutes). Further alternatively or additionally, the filter selection is repeated in response to detecting motion of the subject, e.g., large body motion of the subject. For example, in response to the sensor signal indicating that the subject has undergone motion (e.g., large body motion), the computer processor may perform the filter selection.
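The filter-bank selection can be sketched as follows. The Gaussian filter shape and the peak-to-average score (used here as a rough stand-in for the main-lobe/side-lobe ratio described above) are simplifying assumptions, not the specification's exact criterion:

```python
import math

def zero_mean_filter(fwhm):
    """Gaussian-shaped matched filter with a given full-width-half-maximum,
    mean-removed so that slow trends (respiration, gross movement) do not
    leak into the filter output."""
    sigma = fwhm / 2.355  # FWHM-to-sigma conversion for a Gaussian
    half_len = 2 * fwhm
    taps = [math.exp(-0.5 * ((i - half_len) / sigma) ** 2)
            for i in range(2 * half_len + 1)]
    mean = sum(taps) / len(taps)
    return [t - mean for t in taps]

def best_filter(signal, fwhm_candidates):
    """Correlate the signal with each candidate filter and pick the FWHM
    whose output has the highest peak-to-average ratio."""
    def score(fwhm):
        taps = zero_mean_filter(fwhm)
        n = len(taps)
        if n > len(signal):
            return 0.0
        out = [abs(sum(signal[i + j] * taps[j] for j in range(n)))
               for i in range(len(signal) - n + 1)]
        avg = sum(out) / len(out)
        return max(out) / avg if avg > 0 else 0.0
    return max(fwhm_candidates, key=score)
```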
  • a signal that was detected using the above described technique is shown above the corresponding ECG signal. It may be observed that certain cardiac events, which correlate with the ECG signal, may be extracted from the sensor signal. For example, the following mechanical events can typically be extracted from the sensor signal: mitral valve closure (MC), aortic valve opening (AO), systolic ejection (SE), aortic valve closure (AC), and mitral valve opening (MO). Therefore, for some applications, a cardiac signal that is detected using techniques described herein is analyzed by the computer processor, and one or more of the events are identified. For some applications, in this manner, the computer processor monitors mechanical functioning of the heart. For example, the computer processor may use the identified events to measure the subject's left ventricular ejection time. For some applications, the computer processor analyzes the subject's cardiac cycle, by using the above-described technique in combination with ECG sensing.
  • Computer processor 28 may be embodied as a single computer processor, or as a cooperatively networked or clustered set of computer processors.
  • Computer processor 28 is typically a programmed digital computing device comprising a central processing unit (CPU), random access memory (RAM), non-volatile secondary storage, such as a hard drive or CD ROM drive, network interfaces, and/or peripheral devices.
  • Program code, including software programs, and data are loaded into the RAM for execution and processing by the CPU and results are generated for display, output, transmittal, or storage, as is known in the art.
  • computer processor 28 is connected to one or more sensors via one or more wired or wireless connections.
  • Computer processor 28 is typically configured to receive signals (e.g., motion signals) from the one or more sensors, and to process these signals as described herein.
  • the term “motion signal” is used to denote any signal that is generated by a sensor, upon the sensor sensing motion. Such motion may include, for example, respiratory motion, cardiac motion, or other body motion, e.g., large body movement.
  • the term “motion sensor” is used to denote any sensor that senses motion, including the types of motion delineated above.
  • a computer-usable or computer-readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • the computer-usable or computer readable medium is a non-transitory computer-usable or computer readable medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
  • a data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28 ) coupled directly or indirectly to memory elements (e.g., memory 29 ) through a system bus.
  • the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
  • the system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the invention.
  • Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks.
  • Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
  • Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the C programming language or similar programming languages.
  • each block of the flowcharts shown in FIGS. 3, 4, and 8 can be implemented by computer program instructions.
  • These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer (e.g., computer processor 28 ) or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowcharts and/or algorithms described in the present application.
  • These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart blocks and algorithms.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowcharts and/or algorithms described in the present application.
  • Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described with reference to FIG. 3 , computer processor 28 typically acts as a special purpose temperature control computer processor, when programmed to perform the algorithms described with reference to FIG. 4 , computer processor 28 typically acts as a special purpose apnea monitoring processor, and when programmed to perform the algorithms described with reference to FIG. 8 , computer processor 28 typically acts as a special purpose pregnancy monitoring processor. Typically, the operations described herein that are performed by computer processor 28 transform the physical state of memory 29 , which is a real physical article, to have a different magnetic polarity, electrical charge, or the like depending on the technology of the memory that is used.


Abstract

Apparatus and methods are described for use with a seat (82) of a vehicle, including a sensor unit (80) configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle. The sensor unit includes a housing (90, 100), a first motion sensor (96, 106) disposed within the housing such as to generate a first sensor signal that is indicative of the motion of the vehicle, and a second motion sensor (98, 108) disposed within the housing, such as to generate a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle. A computer processor at least partially distinguishes between motion of the subject and motion of the vehicle by analyzing the first and second sensor signals. Other applications are also described.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefit of U.S. Provisional Application 62/295,077, entitled “Apparatus and method for monitoring a subject,” filed Feb. 14, 2016.
  • The present application is related to an International Patent Application, entitled “Apparatus and methods for monitoring a subject,” being filed on even date herewith.
  • Each of the above referenced applications is incorporated herein by reference.
  • FIELD OF EMBODIMENTS OF THE INVENTION
  • The present invention relates generally to monitoring a subject. Specifically, some applications of the present invention relate to monitoring a subject, while the subject is in a vehicle.
  • BACKGROUND
  • Quality and duration of sleep plays an important role in overall physical and psychological wellbeing. Unfortunately, many subjects have difficulty falling or staying asleep. Thermoregulation during sleep affects sleep quality.
  • An article entitled “Mechanisms and functions of coupling between sleep and temperature rhythms,” by Van Someren (Prog Brain Res 2006, 153:309-324) describes heat production and heat loss as showing circadian modulation. The article states that sleep preferably occurs during the circadian phase of decreased heat production and increased heat loss, the latter due to a profound increase in skin blood flow and, consequently, skin warming.
  • An article entitled “Functional link between distal vasodilation and sleep-onset latency,” by Krauchi et al. (Am J Physiol Regul Integr Comp Physiol 2000, 278:R741-R748) describes a study in which the role of heat loss in sleep initiation was evaluated. The article states that the study provides evidence that selective vasodilation of distal skin regions (and hence heat loss) promotes the rapid onset of sleep.
  • An article entitled “Skin temperature and sleep-onset latency: Changes with age and insomnia,” by Raymann et al. (Physiology & Behavior 90 (2007) 257-266) states that changes in skin temperature may causally affect the ability to initiate and maintain sleep. The article describes findings on the relation between skin temperature and sleep-onset latency, indicating that sleep propensity can be enhanced by warming the skin to the level that normally occurs prior to, and during, sleep. The article describes a study to investigate whether different methods of foot warming could provide an applicable strategy to address sleep complaints.
  • SUMMARY OF EMBODIMENTS
  • For some applications, a sensor unit is disposed under a seat of a vehicle. The sensor unit is configured to monitor physiological parameters of a subject who is sitting on the seat, and to generate a sensor signal in response thereto. Typically, the subject is an operator of the vehicle (e.g., the driver of a car, the pilot of an airplane, the driver of a train, etc.). A computer processor is configured to receive and analyze the sensor signal for any one of a number of reasons. Typically, the computer processor derives vital signs of the subject (such as heart rate, respiratory rate, and/or heart-rate variability) from the sensor signal. For some applications, the computer processor compares the subject's vital signs to a baseline of the subject that was derived during occasions when the subject previously operated the vehicle. In response thereto, the computer processor may determine that the subject's vital signs have changed substantially from the baseline, that the subject is unwell, drowsy, asleep, and/or under the influence of drugs or alcohol. In response thereto, the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
  • For some applications, the sensor unit is configured to be placed underneath the seat and to detect motion of the subject who is sitting on the seat during motion of the vehicle. The sensor unit typically includes a housing, at least one first motion sensor disposed within the housing, such that the first motion sensor generates a first sensor signal that is indicative of the motion of the vehicle, and at least one second motion sensor disposed within the housing, such that the second motion sensor generates a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle. The computer processor is typically configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
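The two-sensor separation can be sketched, under simplifying assumptions (a single, constant coupling gain between the two sensors; in practice an adaptive filter would be used), as a least-squares subtraction of the vehicle-only reference signal; names are illustrative:

```python
def estimate_subject_motion(combined, vehicle_ref):
    """combined: second-sensor signal (subject motion plus vehicle motion);
    vehicle_ref: first-sensor signal (vehicle motion only). Estimate the
    coupling gain by least squares and subtract the scaled reference,
    leaving an estimate of the subject's motion."""
    num = sum(a * b for a, b in zip(combined, vehicle_ref))
    den = sum(b * b for b in vehicle_ref)
    gain = num / den if den else 0.0
    return [a - gain * b for a, b in zip(combined, vehicle_ref)]
```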
  • For some applications of the present invention, a temperature control device (such as an electric blanket, or an electric mattress) includes at least first and second sections corresponding to respective portions of a body of a single subject. For example, a blanket may include three types of sections: a trunk section corresponding to the subject's trunk, leg sections corresponding to the subject's legs, and arm sections corresponding to the subject's arms. A temperature-regulation unit regulates respective portions of the subject's body to be at respective temperatures by, simultaneously, setting a temperature of the first section of the temperature control device to a first temperature, and setting a temperature of the second section of the temperature control device to a second temperature that is different from the first temperature. Optionally, the temperature-regulation unit sets the temperature of additional sections of the temperature control device to further respective temperatures.
  • As described hereinabove, thermoregulation during sleep affects sleep quality. Moreover, as described in the Krauchi article for example, selective vasodilation of distal skin regions (and hence heat loss) may promote the onset of sleep. For some applications, a computer processor drives the temperature-regulation unit to regulate respective portions of the subject's body to be at respective temperatures in the manner described herein, such as to improve sleep quality, shorten sleep latency, and/or better maintain sleep continuity. For example, the computer processor may drive the temperature-regulation unit to regulate the temperature of the subject's legs and/or arms to be at a greater temperature than the subject's trunk (e.g., by heating the legs and/or arms by more than the trunk, or by cooling the trunk by less than the legs and/or arms). For some applications, the computer processor drives the temperature-regulation unit to regulate respective portions of the subject's body to be at respective temperatures, in response to the subject's sleep stage, which is detected automatically by analyzing a sensor signal from a sensor (such as motion sensor) that is configured to monitor the subject.
  • There is therefore provided, in accordance with some applications of the present invention, apparatus for use with a seat of a vehicle comprising:
  • a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
      • a housing;
      • at least one first motion sensor disposed within the housing, such that the first motion sensor generates a first sensor signal that is indicative of the motion of the vehicle;
      • at least one second motion sensor disposed within the housing, such that the second motion sensor generates a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle; and
  • a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • For some applications, the first motion sensor is disposed within the housing such that it is isolated from the motion of the subject, so that the first motion sensor detects only motion that is due to motion of the vehicle.
  • For some applications, the computer processor is configured to:
  • derive the motion of the vehicle from the first sensor signal, and
  • based upon the derived motion of the vehicle, subtract, from the second sensor signal, a portion of the second sensor signal that is generated by the motion of the vehicle.
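By way of illustration only, the subtraction described above may be sketched as follows (Python; fitting a single least-squares gain between the two channels is an assumption made for this sketch — a practical system might instead use an adaptive filter per frequency band):

```python
import numpy as np

def remove_vehicle_motion(combined, reference):
    """Estimate and remove the vehicle-motion component from the combined
    signal, leaving (approximately) the subject's motion.

    combined:  samples from the second sensor (subject motion + vehicle motion)
    reference: samples from the first sensor (vehicle motion only)
    """
    combined = np.asarray(combined, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Least-squares gain relating the reference channel to the combined one.
    gain = np.dot(reference, combined) / np.dot(reference, reference)
    return combined - gain * reference

# Example: fast vehicle vibration plus a slow "subject" component.
t = np.linspace(0, 1, 500)
vehicle = np.sin(2 * np.pi * 20 * t)           # 20 Hz road vibration
subject = 0.3 * np.sin(2 * np.pi * 1.2 * t)    # ~1.2 Hz cardio-respiratory motion
cleaned = remove_vehicle_motion(subject + 2.0 * vehicle, vehicle)
# `cleaned` closely tracks `subject`: the 20 Hz component is cancelled.
```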
  • For some applications:
  • at least a portion of the housing is flexible,
  • the apparatus further comprises a fluid compartment disposed on an inner surface of the housing,
  • the at least one first motion sensor is disposed on a surface of the fluid compartment, and
  • at least one second motion sensor is disposed on at least one inner surface of the flexible portion of the housing.
  • For some applications, the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • For some applications, the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • For some applications, the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • For some applications:
  • the housing comprises flexible and rigid portions;
  • the at least one first motion sensor is disposed on at least one inner surface of the rigid portion of the housing; and
  • the at least one second motion sensor is disposed on at least one inner surface of the flexible portion of the housing, and is configured to generate a second sensor signal.
  • For some applications, the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • For some applications, the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • For some applications, the at least one first motion sensor comprises two or more first motion sensors disposed on respective inner surfaces of the rigid portion of the housing.
  • For some applications, the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • There is further provided, in accordance with some applications of the present invention, apparatus for use with a seat of a vehicle including:
  • a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
      • a housing at least a portion of which is flexible;
      • a fluid compartment disposed on an inner surface of the housing;
      • at least one first motion sensor disposed on a surface of the fluid compartment, and configured to generate a first sensor signal;
      • at least one second motion sensor disposed on at least one inner surface of the flexible portion of the housing, the second motion sensor being configured to generate a second sensor signal; and
  • a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • In some applications, the first motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • In some applications, the second motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • In some applications, the at least one second motion sensor includes two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • There is further provided, in accordance with some applications of the present invention, apparatus for use with a seat of a vehicle including:
  • a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
      • a housing comprising flexible and rigid portions;
      • at least one first motion sensor disposed on at least one inner surface of the rigid portion of the housing, and configured to generate a first sensor signal;
      • at least one second motion sensor disposed on at least one inner surface of the flexible portion of the housing, and configured to generate a second sensor signal; and
  • a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • In some applications, the first motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • In some applications, the second motion sensor includes a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
  • In some applications, the at least one first motion sensor includes two or more first motion sensors disposed on respective inner surfaces of the rigid portion of the housing.
  • In some applications, the at least one second motion sensor includes two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
  • There is additionally provided, in accordance with some applications of the present invention, apparatus including:
  • a temperature-control device comprising at least first and second sections corresponding to respective portions of a body of a single subject; and
  • a temperature-regulation unit configured to regulate the respective portions of the subject's body to be at respective temperatures by, simultaneously, setting a temperature of the first section of the temperature-control device to a first temperature, and setting a temperature of the second section of the temperature-control device to a second temperature that is different from the first temperature.
  • In some applications, the temperature-control device includes a device selected from the group consisting of: a blanket and a mattress, and the selected device has a length of less than 250 cm, and a width of less than 130 cm.
  • In some applications, the temperature-control device includes a blanket configured to be placed above the subject, and the first and second sections are configured to be placed over respective portions of the subject's body.
  • In some applications, the temperature-control device includes a blanket configured to be disposed underneath the subject, and the first and second sections are configured to be disposed underneath respective portions of the subject's body.
  • In some applications, the temperature-control device includes a mattress configured to be disposed underneath the subject, and the first and second sections are configured to be disposed underneath respective portions of the subject's body.
  • In some applications, the first section corresponds to a trunk of the subject, and the second section corresponds to a distal portion of the subject's body selected from the group consisting of: at least one arm of the subject, and at least one leg of the subject.
  • In some applications, the apparatus further includes:
  • a sensor, configured to monitor the subject and generate a sensor signal in response thereto; and
  • a computer processor, configured to:
      • analyze the signal,
      • in response thereto, identify a sleep stage of the subject, and
      • in response to the identified sleep stage, drive the temperature-regulation unit to regulate the temperatures of the respective portions of the subject's body to be at the respective temperatures.
  • In some applications, the computer processor is configured to:
  • differentially identify at least two sleep stages selected from the group consisting of: a falling-asleep stage, a beginning-sleep stage, a mid-sleep stage, a premature-awakening stage, an awakening stage, a light sleep stage, a slow-wave sleep stage, and a rapid-eye-movement sleep stage, and
  • in response to the differentially identified sleep stages, drive the temperature-regulation unit to regulate the temperatures of the respective portions of the subject's body to be at the respective temperatures.
  • In some applications, the sensor is configured to monitor the subject without contacting or viewing the subject, and without contacting or viewing clothes the subject is wearing.
  • In some applications, the first section corresponds to a trunk of the subject, and the second section corresponds to at least one distal portion of the subject's body selected from the group consisting of: at least one arm of the subject, and at least one leg of the subject.
  • In some applications, the computer processor is configured, in response to detecting that the subject is trying to fall asleep, to drive the temperature-regulation unit to regulate the subject's trunk to be at a first temperature, and to regulate at least the selected distal portion of the subject's body to be at a second temperature that is greater than the first temperature.
  • In some applications, the computer processor is configured, in response to detecting that the subject is at a sleep stage at which it is suitable to wake up the subject, to drive the temperature-regulation unit to heat the subject's trunk.
  • In some applications, the sensor includes a motion sensor configured to sense motion of the subject.
  • In some applications, the sensor is configured to monitor the subject without contacting or viewing the subject, and without contacting or viewing clothes the subject is wearing.
  • In some applications, the apparatus is for use with a room-climate regulation device, and, in response to the identified sleep stage, the computer processor is further configured to adjust a parameter of the room-climate regulation device.
  • In some applications, the room-climate regulation device includes an air-conditioning unit, and, in response to the identified sleep stage, the computer processor is configured to adjust a parameter of the air-conditioning unit.
  • There is further provided, in accordance with some applications of the present invention, apparatus for use with an output device, the apparatus including:
  • a sensor, configured to monitor a subject, during a sleeping session of the subject, and to generate a sensor signal in response to the monitoring; and
  • a computer processor, configured to:
      • analyze the signal,
      • in response thereto, identify a correspondence between positions of the subject and occurrences of apnea events of the subject during the sleeping session, and
      • generate an output on the output device, in response to the identified correspondence.
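By way of illustration only, identifying such a correspondence may be sketched as follows (Python; the event-tuple layout and the events-per-hour metric are assumptions made for this sketch):

```python
from collections import defaultdict

def apnea_by_position(events):
    """Tally apnea events per sleeping position and report a per-position
    rate, so that a correspondence (e.g. more events while supine) can be
    presented on the output device.

    events: list of (position, apnea_count, minutes_in_position) tuples.
    """
    totals = defaultdict(lambda: [0, 0.0])  # position -> [events, minutes]
    for position, count, minutes in events:
        totals[position][0] += count
        totals[position][1] += minutes
    # Events per hour spent in each position.
    return {pos: 60.0 * ev / mins
            for pos, (ev, mins) in totals.items() if mins > 0}

# One night's sleeping session, segmented by detected position.
session = [("supine", 12, 180.0), ("left side", 2, 120.0), ("supine", 6, 60.0)]
rates = apnea_by_position(session)
# 18 events over 240 supine minutes -> 4.5 events/hour while supine,
# versus 1.0 event/hour on the left side.
```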
  • There is further provided, in accordance with some applications of the present invention, apparatus for use with a female subject, the apparatus including:
  • a sensor, configured to monitor the subject, prior to the subject becoming pregnant and during a pregnancy of the subject, and to generate a sensor signal in response to the monitoring; and
  • a computer processor, configured to:
      • analyze the sensor signal,
      • based upon the sensor signal generated by the sensor in response to the monitoring prior to the subject becoming pregnant, determine a baseline heart rate for the subject,
      • based upon the baseline heart rate, define a pregnancy heart rate measure, which is indicative of one or more heart rates that the subject is expected to have during a healthy pregnancy,
      • based upon the sensor signal generated by the sensor in response to the monitoring during the subject's pregnancy, determine a heart rate of the subject during the pregnancy,
      • compare the subject's heart rate during the pregnancy to the pregnancy heart rate measure, and
      • generate an output on the output device, in response to the comparison.
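By way of illustration only, the baseline-to-pregnancy comparison described above may be sketched as follows (Python; the offsets defining the expected range are illustrative assumptions made for this sketch, not clinical limits):

```python
def pregnancy_hr_range(baseline_hr):
    """Define an expected heart-rate range for a healthy pregnancy from the
    subject's pre-pregnancy baseline. Resting heart rate typically rises
    over the course of pregnancy; the -5/+20 bpm offsets used here are
    placeholders for a measure derived from the baseline.
    """
    return (baseline_hr - 5.0, baseline_hr + 20.0)

def check_pregnancy_hr(baseline_hr, current_hr):
    """Compare a heart rate measured during pregnancy to the expected range."""
    low, high = pregnancy_hr_range(baseline_hr)
    if current_hr < low:
        return "below expected range"
    if current_hr > high:
        return "above expected range"
    return "within expected range"

# Baseline of 68 bpm before pregnancy; 75 bpm measured mid-pregnancy.
status = check_pregnancy_hr(68.0, 75.0)
```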
  • There is further provided, in accordance with some applications of the present invention, apparatus for use with a stimulus-providing device that is configured to provide a stimulus to a subject selected from the group consisting of: an audio stimulus, a visual stimulus, and a tactile stimulus, the apparatus including:
  • a sensor configured to monitor a subject and to generate a sensor signal in response thereto; and
  • a control unit configured to:
      • analyze the sensor signal,
      • modulate a property of the stimulus that is provided to the subject by the stimulus-providing device, in response to (a) the analyzing of the sensor signal, and (b) a historical physiological parameter of the subject that was exhibited in response to a historical modulation of the property of the stimulus, and
      • drive the stimulus-providing device to provide the stimulus to the subject.
  • There is additionally provided, in accordance with some applications of the present invention, apparatus for monitoring a subject, the apparatus comprising:
  • a sensor, configured to monitor the subject without contacting the subject or clothes the subject is wearing, and without viewing the subject or clothes the subject is wearing, and to generate a sensor signal in response to the monitoring; and
  • a computer processor, configured to:
      • receive the sensor signal,
      • extract from the sensor signal a plurality of heartbeats of the subject, and, for each of the extracted heartbeats, an indication of a quality of the extracted heartbeat,
      • select a subset of heartbeats, by selecting for inclusion in the subset only heartbeats for which qualities of both the heartbeat itself, and an adjacent heartbeat to the heartbeat, exceed a threshold,
      • for only the subset of heartbeats, determine interbeat intervals between adjacent heartbeats,
      • in response thereto, determine a physiological state of the subject, and
      • generate an output in response thereto.
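By way of illustration only, the quality-gated interbeat-interval computation described above may be sketched as follows (Python; the 0.8 quality threshold and the data layout are assumptions made for this sketch):

```python
def reliable_interbeat_intervals(beat_times, qualities, threshold=0.8):
    """Compute interbeat intervals only between reliably extracted heartbeats.

    An interval between two consecutive detected beats is kept only when the
    quality of BOTH bounding beats exceeds `threshold`, so that a poorly
    detected beat cannot corrupt the interval series.
    """
    intervals = []
    for i in range(1, len(beat_times)):
        if qualities[i] > threshold and qualities[i - 1] > threshold:
            intervals.append(beat_times[i] - beat_times[i - 1])
    return intervals

# Beats at ~0.8 s spacing; the third beat was detected with low quality.
times = [0.00, 0.80, 1.62, 2.40, 3.21]
quals = [0.95, 0.90, 0.40, 0.92, 0.94]
ibis = reliable_interbeat_intervals(times, quals)
# Both intervals touching the low-quality beat are dropped,
# leaving only the 0.00-0.80 and 2.40-3.21 intervals.
```

The retained interval series could then feed, for example, a heart-rate-variability computation used to determine the subject's physiological state.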
  • There is additionally provided, in accordance with some applications of the present invention, apparatus for monitoring a subject, the apparatus comprising:
  • a sensor, configured to monitor the subject and to generate a sensor signal in response thereto;
  • a plurality of filters configured to filter the sensor signal using respective filter parameters; and
  • a computer processor, configured to:
      • receive the sensor signal,
      • filter the signal with each of two or more of the filters,
      • in response to a quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal,
      • subsequently:
        • detect that the subject has undergone motion, by analyzing the sensor signal,
        • in response thereto, filter the signal with each of two or more of the filters, and
        • in response to a quality of each of the filtered signals, select one of the plurality of filters to filter the sensor signal.
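By way of illustration only, the filter-selection scheme described above may be sketched as follows (Python; modeling each filter as a convolution kernel and judging quality by output smoothness are assumptions made for this sketch):

```python
import numpy as np

def best_filter(signal, kernels, quality):
    """Filter the signal with each candidate kernel and return the index of
    the kernel whose output scores highest under the `quality` function."""
    scores = [quality(np.convolve(signal, k, mode="same")) for k in kernels]
    return int(np.argmax(scores))

def current_filter(chunk, kernels, quality, chosen, motion_detected):
    """Re-run the selection only when subject motion is detected: motion
    changes the signal characteristics, so the previously chosen filter
    may no longer give the best-quality output."""
    return best_filter(chunk, kernels, quality) if motion_detected else chosen

# A slow sine corrupted by noise; candidates: identity vs. 5-tap smoother.
rng = np.random.default_rng(0)
sig = np.sin(np.linspace(0, 2 * np.pi, 200)) + 0.3 * rng.standard_normal(200)
kernels = [np.array([1.0]), np.ones(5) / 5.0]

def smoothness(x):
    # Higher is smoother: penalize sample-to-sample jitter.
    return -np.sum(np.diff(x) ** 2)

chosen = best_filter(sig, kernels, smoothness)  # the smoothing kernel wins
```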
  • The present invention will be more fully understood from the following detailed description of applications thereof, taken together with the drawings, in which:
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic illustration of apparatus for monitoring a subject, in accordance with some applications of the present invention;
  • FIG. 2 is a schematic illustration of a blanket, in accordance with some applications of the present invention;
  • FIG. 3 is a flowchart showing steps that are performed by a computer processor in order to control a subject's body temperature, in accordance with some applications of the present invention;
  • FIG. 4 is a flowchart showing steps that are performed by a computer processor in order to monitor sleep apnea of a subject, in accordance with some applications of the present invention;
  • FIG. 5 is a schematic illustration of a sensor unit disposed under the seat of a vehicle, in accordance with some applications of the present invention;
  • FIGS. 6A-C are schematic illustrations of a sensor unit as shown in FIG. 5, in accordance with respective applications of the present invention;
  • FIGS. 7A-B are schematic illustrations of subject-monitoring apparatus, in accordance with some applications of the present invention;
  • FIG. 8 is a flowchart showing steps that are performed by a computer processor in order to monitor a subject who is pregnant, in accordance with some applications of the present invention;
  • FIGS. 9A-C show histograms of patients' cardiac interbeat intervals that were recorded in accordance with some applications of the present invention; and
  • FIG. 10 shows components of a subject's cardiac cycle that were detected in accordance with some applications of the present invention.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Reference is made to FIG. 1, which is a schematic illustration of subject-monitoring apparatus 20, in accordance with some applications of the present invention. Apparatus 20 is generally used to monitor a subject 24, while he or she is in his or her bed in a home setting. For some applications, the subject-monitoring apparatus is used in a hospital setting.
  • Subject-monitoring apparatus 20 comprises a sensor 22 (e.g., a motion sensor) that is configured to monitor subject 24. Sensor 22 may be a motion sensor that is similar to sensors described in U.S. Pat. No. 8,882,684 to Halperin, which is incorporated herein by reference. The term “motion sensor” refers to a sensor that senses the subject's motion (e.g., motion due to the subject's cardiac cycle, respiratory cycle, or large-body motion of the subject), while the term “sensor” refers more generally to any type of sensor, e.g., a sensor that includes an electromyographic sensor and/or an imaging sensor.
  • Typically, sensor 22 includes a sensor that performs monitoring of the subject without contacting the subject or clothes the subject is wearing, and/or without viewing the subject or clothes the subject is wearing. For example, the sensor may perform the monitoring without having a direct line of sight of the subject's body, or the clothes that the subject is wearing, and/or without any visual observation of the subject's body, or the clothes that the subject is wearing. Further typically, the sensor performs monitoring of the subject without requiring subject compliance (i.e., without the subject needing to perform an action to facilitate the monitoring that would not have otherwise been performed). It is noted that, prior to the monitoring, certain actions (such as purchasing the sensor, placing the sensor under the subject's mattress, downloading software for use with the subject-monitoring apparatus, and/or configuring software for use with the subject-monitoring apparatus) may need to be performed. The term “without requiring subject compliance” should not be interpreted as excluding such actions. Rather the term “without requiring subject compliance” should be interpreted as meaning that, once the sensor has been purchased, placed in a suitable position and activated, the sensor can be used to monitor the subject (e.g., to monitor the subject during repeated monitoring sessions), without the subject needing to perform any actions to facilitate the monitoring that would not have otherwise been performed.
  • For some applications, sensor 22 is disposed on or within the subject's bed, and configured to monitor the subject automatically, while the subject is in bed. For example, sensor 22 may be disposed underneath the subject's mattress 26, such that the subject is monitored while lying upon the mattress and carrying out his or her normal sleeping routine, without needing to perform an action to facilitate the monitoring that would not have otherwise been performed.
  • A computer processor 28, which acts as a control unit that performs the algorithms described herein, analyzes the signal from sensor 22. Typically, computer processor 28 communicates with a memory 29. For some applications, computer processor 28 is embodied in a desktop computer 30, a laptop computer 32, a tablet device 34, a smartphone 36, and/or a similar device that is programmed to perform the techniques described herein (e.g., by downloading a dedicated application or program to the device), such that the computer processor acts as a special-purpose computer processor. For some applications, as shown in FIG. 1, computer processor 28 is a dedicated computer processor that receives (and optionally analyzes) data from sensor 22, and communicates with computer processors of one or more of the aforementioned devices, which act as external devices.
  • For some applications, the subject (or another person, such as a care-giver) communicates with (e.g., sends data to and/or receives data from) computer processor 28 via a user interface device 35. As described, for some applications, computer processor is embodied in a desktop computer 30, a laptop computer 32, a tablet device 34, a smartphone 36, and/or a similar device that is programmed to perform the techniques described herein. For such applications, components of the device (e.g., the touchscreen, the mouse, the keyboard, the speakers, the screen) typically act as user interface device 35. Alternatively, as shown in FIG. 1, computer processor 28 is a dedicated computer processor that receives (and optionally analyzes) data from sensor 22. For some such applications, the dedicated computer processor communicates with computer processors of one or more of the aforementioned external devices (e.g., via a network), and the user interfaces of the external devices (e.g., the touchscreen, the mouse, the keyboard, the speakers, the screen) are used by the subject, as user interface device 35, to communicate with the dedicated computer processor and vice versa. For some applications, in order to communicate with computer processor 28, the external devices are programmed to communicate with the dedicated computer processor (e.g., by downloading a dedicated application or program to the external device).
  • For some applications, the user interface includes an input device such as a keyboard 38, a mouse 40, a joystick (not shown), a touchscreen device (such as smartphone 36 or tablet device 34), a touchpad (not shown), a trackball (not shown), a voice-command interface (not shown), and/or other types of user interfaces that are known in the art. For some applications, the user interface includes an output device such as a display (e.g., a monitor 42, a head-up display (not shown) and/or a head-mounted display (not shown)), and/or a different type of visual, text, graphics, tactile, audio, and/or video output device, e.g., speakers, headphones, smartphone 36, or tablet device 34. For some applications, the user interface acts as both an input device and an output device. For some applications, the processor generates an output on a computer-readable medium (e.g., a non-transitory computer-readable medium), such as a disk, or a portable USB drive.
  • Reference is now made to FIG. 2, which is a schematic illustration of a temperature control device, e.g., a blanket 50 (which is typically an electric blanket), in accordance with some applications of the present invention. The temperature control device includes at least first and second sections corresponding to respective portions of a body of a single subject. For example, as shown the blanket includes three types of sections: a trunk section 52 corresponding to the subject's trunk, leg sections 54 corresponding to the subject's legs, and arm sections 56 corresponding to the subject's arms. A temperature-regulation unit 58 regulates respective portions of the subject's body to be at respective temperatures by, simultaneously, setting the temperature of the first section of the temperature control device to a first temperature, and setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature, and, optionally, setting the temperatures of additional sections of the temperature control device to further respective temperatures.
  • It is noted that blanket 50 can be an over-blanket that is placed over the subject's body, or an under-blanket that is placed above the subject's mattress and beneath the subject (as shown). Furthermore, the scope of the present invention includes any temperature control device that includes first and second sections corresponding to respective portions of a body of a single subject, for use with a temperature-regulation unit that regulates the respective portions of the subject's body to be at respective temperatures by, simultaneously, setting the temperature of the first section of the temperature control device to a first temperature, and setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature. For example, the temperature control device may include a mattress (e.g., an electric mattress), which includes built-in heating pads.
  • As described hereinabove, thermoregulation during sleep affects sleep quality. Moreover, as described in the Krauchi article, for example, selective vasodilation of distal skin regions (and hence heat loss) may promote the onset of sleep. For some applications, the computer processor drives the temperature-regulation unit to regulate the temperatures of respective portions of the subject's body to be at respective temperatures, in the manner described herein, such as to improve sleep quality, shorten sleep latency, and/or better maintain sleep continuity. For example, the computer processor may drive the temperature-regulation unit to heat the subject's legs and/or arms to a greater temperature than the subject's trunk. For some applications, the computer processor may drive the temperature-regulation unit to cool one or more portions of the subject's body. For some applications, the computer processor drives the temperature-regulation unit to heat and/or cool respective portions of the subject's body to respective temperatures, in response to the subject's sleep stage, which is detected automatically by analyzing the sensor signal from sensor 22.
  • Reference is now made to FIG. 3, which is a flowchart showing steps that are performed by a computer processor in order to control a subject's body temperature, in accordance with some applications of the present invention. In a first step 60, the computer processor receives a signal from sensor 22, which is typically as described hereinabove. In a second step 62, the computer processor analyzes the sensor signal in order to determine the subject's current sleep stage. For example, the computer processor may determine that the subject is currently in a falling-asleep stage (prior to falling asleep), a beginning-sleep stage, a mid-sleep stage, an awakening stage, a premature awakening stage, an REM stage, or a slow-wave stage. For some applications, the sleep stage is detected based upon the sensor signal using techniques as described in US 2007/0118054 to Pinhas (now abandoned), which is incorporated herein by reference. In response to the analysis of the sensor signal, the computer processor (in step 64) adjusts the temperature of a first portion of the temperature-control device (e.g., the arm or leg portion of blanket 50), and/or separately (in step 66) adjusts the temperature of a second portion of the temperature-control device (e.g., the trunk portion of blanket 50). For some applications (in step 68), the computer processor additionally adjusts the temperature of an additional room-climate regulation device, such as an air-conditioning unit (e.g., unit 44, FIG. 1), an electric heater, and/or a radiator. For example, the air-conditioning unit may be used to provide additional control of the temperature of the subject's trunk by controlling the temperature of the air that the subject inhales.
  • For some applications, in response to the computer processor determining that the subject is at the start of a sleeping session (e.g., in a falling-asleep stage or a beginning-sleep stage), the computer processor drives the temperature-regulation unit to heat distal parts of the subject's body (e.g., the subject's arms and/or legs) to a higher temperature than the subject's trunk. Typically, the computer processor will use different temperature profiles for different sleep states. For example, when the subject is in slow wave sleep, the computer processor may drive the temperature-regulation unit to keep temperatures lower than during other phases of the subject's sleep. Alternatively or additionally, when the subject wakes up during the night the computer processor may use a similar profile to that used when the subject is initially trying to fall asleep.
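By way of illustration only, such per-stage temperature profiles may be sketched as follows (Python; the stage names follow the description above, while the section names and temperature values are assumptions made for this sketch):

```python
# Illustrative per-stage temperature profiles (target temperatures, in
# degrees Celsius, per blanket section). The numbers are placeholders.
PROFILES = {
    # Distal sections warmer than the trunk, to promote sleep onset.
    "falling_asleep": {"trunk": 28.0, "arms": 32.0, "legs": 32.0},
    # Lower temperatures overall during slow-wave sleep.
    "slow_wave": {"trunk": 26.0, "arms": 29.0, "legs": 29.0},
    # On premature awakening, reuse the falling-asleep profile.
    "premature_awakening": {"trunk": 28.0, "arms": 32.0, "legs": 32.0},
}

def apply_profile(stage, set_section_temperature):
    """Look up the profile for the detected sleep stage and drive each
    blanket section accordingly; returns False for an unknown stage."""
    profile = PROFILES.get(stage)
    if profile is None:
        return False
    for section, temperature in profile.items():
        set_section_temperature(section, temperature)
    return True

# Record the commands that would be sent to the temperature-regulation unit.
commands = {}
applied = apply_profile("falling_asleep",
                        lambda section, temp: commands.__setitem__(section, temp))
```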
  • For some applications, the computer processor drives the temperature-regulation unit to warm the subject's trunk in order to gently wake up the subject. For example, the computer processor may use trunk warming to wake up the subject, based on having received an input of a desired time for the subject to wake up (e.g., via the user interface), or based on detecting that the current sleep phase of the subject is such that it would be a good time to wake up the subject.
  • For some applications, a user designates temperature profiles corresponding to respective sleep stages, via a user input into the computer processor. Typically, the temperature profile of any sleep stage will include respective temperatures for respective portions of the subject's body, and/or differences between the temperatures to which respective portions are heated or cooled. Alternatively or additionally, the computer processor utilizes a machine learning algorithm, based upon which the computer processor analyzes the subject's response to different temperature profiles at different sleep stages and learns which temperature profiles at which sleep phases result in the best quality sleep for the subject. Typically, for such applications, based upon the aforementioned analysis, the computer processor automatically designates temperature profiles to respective sleep stages.
  • As described hereinabove, for some applications, the computer processor additionally adjusts the temperature of an additional room-climate regulation device, such as an air-conditioning unit, an electric heater, and/or a radiator. For example, an air-conditioning unit may be used to provide additional control of the temperature of the subject's trunk by controlling the temperature of the air that the subject inhales. For some applications, the temperature profiles of the respective sleep stages include a setting for the additional room-climate regulation device.
  • Referring again to FIG. 2, it is noted that typically blanket 50 is sized for use with a single subject, and includes separate regions, the temperatures of which are controlled separately from one another. Typically, the length of the blanket is less than 250 cm, e.g., less than 220 cm, or less than 200 cm, and the width of the blanket is less than 130 cm, e.g., less than 120 cm, or less than 110 cm. For applications in which the temperature-control device for use with a single subject is a mattress, the mattress typically has dimensions similar to those described with respect to the blanket.
  • For some applications, the temperature control device is a portion of a blanket or a mattress that is suitable for being used by two subjects (e.g., partners in a double bed). Even for such applications, a portion of the blanket or mattress that is configured to be placed underneath or over a single subject (e.g., a left half of the blanket, or a left half of the mattress) includes at least first and second sections (e.g., a trunk section corresponding to the subject's trunk, leg sections corresponding to the subject's legs, and/or arm sections corresponding to the subject's arms). The temperature-regulation unit regulates the respective portions of the subject's body to be at respective temperatures by simultaneously setting the temperature of the first section of the temperature control device to a first temperature, setting the temperature of the second section of the temperature control device to a second temperature that is different from the first temperature, and, optionally, setting the temperature of additional sections of the temperature control device to further respective temperatures.
  • Typically, the techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference. For example, the computer processor may drive the user interface to prompt the subject to input changes to the temperature profiles corresponding to respective sleep stages, in response to a change in a relevant parameter. For example, in response to a change in season, an ambient temperature, an ambient humidity, and/or a going-to-sleep time (e.g., the subject is going to bed at an unusual time), the computer processor may drive the user interface to prompt the subject to re-enter his/her temperature profiles. (The computer processor may identify the change of the relevant parameter in a variety of ways, such as, for example, by receiving input from a sensor, or by checking the internet.)
  • For some applications, in response to analyzing the sensor signal, the computer processor calculates a sleep score of the subject. For example, the computer processor may calculate a score from one or more parameters such as a time to fall asleep, duration of sleep, or "sleep efficiency," which is the percentage of in-bed time during which the subject is sleeping. For some applications, the score is calculated using one or more of the aforementioned parameters, such that a higher sleep score is indicative of a more restful sleeping session relative to a lower sleep score. The computer processor may then compare the sleep score to a baseline value, e.g., an average sleep score over a previous period of time. In response to the calculated sleep score being lower than the baseline value, the computer processor may drive the user interface to prompt the subject to enter new temperature profiles for respective sleep stages, since it is possible that the temperature profiles were a contributing factor in the subject's low sleep score. Alternatively or additionally, the computer processor may drive the user interface to prompt the subject to input at least one factor that may have caused the low sleep score. The computer processor then controls the heating device in response to the input.
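As a hedged sketch, a sleep score of the kind described above might combine sleep efficiency with sleep latency. The weights, the 0-100 scale, and the latency scaling below are illustrative assumptions, not values from the disclosure:

```python
def sleep_efficiency(minutes_asleep, minutes_in_bed):
    """Percentage of in-bed time during which the subject is sleeping."""
    return 100.0 * minutes_asleep / minutes_in_bed

def sleep_score(time_to_fall_asleep_min, minutes_asleep, minutes_in_bed,
                w_efficiency=0.7, w_latency=0.3):
    """Combine sleep efficiency and (inverted) sleep latency into a 0-100
    score; a higher score indicates a more restful session.
    Weights and latency scaling are illustrative assumptions."""
    efficiency = sleep_efficiency(minutes_asleep, minutes_in_bed)
    # Map latency onto 0-100: 0 min -> 100, 60+ min -> 0 (assumed scaling).
    latency_score = max(0.0, 100.0 - (time_to_fall_asleep_min / 60.0) * 100.0)
    return w_efficiency * efficiency + w_latency * latency_score
```

The resulting score could then be compared against a rolling average over previous sessions, as described above, to decide whether to prompt the subject.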
  • In some applications, the computer processor computes a measure of relaxation, i.e., a relaxation score, for the subject, one or more times during a sleeping session. For example, a high relaxation score may be computed if the subject shows little movement, and little variation in both respiration rate and respiration amplitude. The relaxation score may be used to compute the sleep score. Alternatively or additionally, in response to a low relaxation score, the computer processor may immediately adjust the temperature of sections of the temperature control device.
  • In some applications, in response to a low sleep score, the computer processor adjusts the temperature profiles even without any input from the user, or the computer processor generates an output (e.g., via user interface device 35) that includes suggested temperature profiles, which the subject may edit and/or confirm via the user interface.
  • For some applications, when the temperature control device is initially used by the subject, the computer processor is configured to perform a “sweep” (or “optimization routine”) over a plurality of different temperature profiles at respective sleep stages, in order to ascertain which profiles at which sleep stages are conducive to a higher sleep score, relative to other settings, e.g., which setting maximizes the sleep score. For example, over the course of several sleeping sessions, the computer processor may change the temperature profiles that are used at respective sleep stages in different ways, and in response thereto, determine the optimal temperature profiles.
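The "sweep" described above can be sketched as trying each candidate profile over several sessions and keeping the one with the highest mean sleep score. The profile names are hypothetical placeholders:

```python
def best_profile(scores_by_profile):
    """scores_by_profile maps each candidate temperature profile (tried
    over several sleeping sessions during the sweep) to the sleep scores
    observed with it. Returns the profile with the highest mean score."""
    def mean(xs):
        return sum(xs) / len(xs)
    return max(scores_by_profile, key=lambda p: mean(scores_by_profile[p]))
```

A fuller optimization routine would run this per sleep stage, since the disclosure contemplates distinct profiles for each stage.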
  • Additional techniques as described in WO 16/035073 to Shinar, which is incorporated herein by reference, may be practiced in combination with the apparatus and methods described herein.
  • Reference is now made to FIG. 4, which is a flowchart showing steps that are performed by computer processor 28 in order to monitor sleep apnea of the subject, in accordance with some applications of the present invention.
  • For some applications, sensor 22 is configured to monitor the subject during a sleeping session of the subject. The computer processor receives and analyzes the sensor signal (step 70). Based on the analysis of the signal, the computer processor identifies the positions of the subject's body at respective times during the sleeping session (step 72). For example, the system may identify when during the sleeping session the subject was lying on his/her side, when during the sleeping session the subject was lying on his/her back (i.e., supine), and when during the sleeping session the subject was lying on his/her stomach. For some applications, the computer processor determines the positions of the subject's body by analyzing the sensor signal using analysis techniques as described in U.S. Pat. No. 8,821,418 to Meger, which is incorporated herein by reference. For some applications, when the computer processor is first used for monitoring sleep apnea events, in accordance with the procedure shown in FIG. 4, a calibration process is performed by the processor. For example, the processor may instruct the subject to lie on his/her back, side, and stomach, each for a given time period. The processor analyzes the subject's cardiac and respiratory related waveforms, and/or other signal components of the sensor signal that are recorded when the subject is lying in respective positions. Based upon this analysis, the processor correlates respective signal characteristics to respective positions of the subject. Thereafter, the processor identifies the subject's position based upon characteristics of the sensor signal.
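One way to implement the calibration described above is a nearest-centroid classifier over features extracted from the cardiac and respiratory waveforms. The two-element feature vectors and position labels below are illustrative placeholders for whatever signal characteristics a real system would extract:

```python
import math

def calibrate(labelled_features):
    """labelled_features: {position: [feature_vector, ...]}, recorded while
    the subject lies in each instructed position during calibration.
    Returns one mean feature vector (centroid) per position."""
    centroids = {}
    for position, vectors in labelled_features.items():
        n = len(vectors)
        centroids[position] = [sum(v[i] for v in vectors) / n
                               for i in range(len(vectors[0]))]
    return centroids

def classify_position(features, centroids):
    """Return the calibrated position whose centroid is closest (in
    Euclidean distance) to the current feature vector."""
    return min(centroids, key=lambda p: math.dist(features, centroids[p]))
```

After calibration, `classify_position` would be called on features extracted from each new stretch of the sensor signal.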
  • In addition, based upon the analysis of the sensor signal, the computer processor identifies apnea events that occur during the sleeping session (step 74). For example, the computer processor may identify apnea events by analyzing the sensor signal using analysis techniques as described in US 2007/0118054 to Pinhas (now abandoned), which is incorporated herein by reference. In step 76, the computer processor identifies a correspondence between positions of the subject and occurrences of apnea events of the subject during the sleeping session. The computer processor typically generates an output on an output device (e.g., any one of the output devices described with reference to FIG. 1), in response to the identified correspondence.
  • For example, the computer processor may generate an indication of:
  • (a) which positions cause the subject to undergo apnea events (e.g., “Sleeping on your back causes apnea events to occur”),
  • (b) a recommended position for the subject to assume while sleeping (e.g. “Try sleeping on your side”), and/or
  • (c) recommended steps to take in order to reduce the likelihood of apnea events occurring (e.g., “Try sleeping with a ball strapped to your back”).
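The correspondence of step 76 can be sketched as an events-per-hour tally in each position, from which the most apnea-prone position can be reported to the subject. The segment format and time units are assumptions for illustration:

```python
from collections import Counter, defaultdict

def apnea_rate_by_position(position_log, apnea_times):
    """position_log: list of (start_s, end_s, position) segments covering
    the sleeping session (step 72); apnea_times: times in seconds of the
    identified apnea events (step 74). Returns apnea events per hour
    spent in each position (step 76)."""
    hours_in = defaultdict(float)
    events = Counter()
    for start, end, position in position_log:
        hours_in[position] += (end - start) / 3600.0
        events[position] += sum(start <= t < end for t in apnea_times)
    return {position: events[position] / hours
            for position, hours in hours_in.items() if hours > 0}
```

The position with the highest rate would then drive the outputs listed above, e.g., the recommendation to avoid sleeping supine.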
  • For some applications, the analysis of the sensor signal (step 70), the identification of subject positions (step 72), the identification of apnea events (step 74), and/or the identification of correspondence between the apnea events and the subject positions (step 76) are performed in real time, as the sensor signal is received by the processor. Alternatively, one or more of the aforementioned steps are performed subsequent to the sleeping session.
  • For some applications, in response to detecting that the subject is lying in a given position that the processor has determined to cause the subject to undergo apnea events, the computer processor generates an alert and/or nudges the subject to change positions. For example, in response to detecting that the subject is in a supine position (and having determined that lying in this position causes the subject to undergo apnea events), the computer processor may cause the subject's bed to vibrate, or may adjust the tilt angle of the bed or a portion thereof.
  • For some applications, techniques described herein are practiced in combination with techniques described in US 2007/0118054 to Pinhas, which is incorporated into the present application by reference. For example, the apparatus described herein may be used with a bed or mattress with an adjustable tilt angle, and/or an inflatable pillow which, when activated, inflates or deflates to vary the elevation of the head of the subject as desired. For some applications, in response to detecting that the subject is lying in a given position that the processor has determined to cause the subject to undergo apnea events, the pillow's air pressure level is changed, and/or the tilt angle of the bed or the mattress is changed, in order to change the patient's posture and prevent an upcoming apnea event, or stop a currently-occurring apnea event.
  • Typically, the techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference. For some applications, a processor as described with reference to FIG. 4 is used in combination with a vibrating mechanism and/or an adjustable resting surface. The vibrating mechanism may include a vibrating mechanism disposed underneath mattress 26 and/or a vibrating wristwatch.
  • Typically, the subject is more likely to snore, cough, or have an apnea episode when the subject is in a supine position. The computer processor reduces the frequency of snoring, coughing, and/or apnea of subject 24 by encouraging (e.g., by “nudging”) the subject to move from a supine position to a different position.
  • As described hereinabove, the computer processor identifies the subject's sleeping position by analyzing the sensor signal from sensor 22. In response to the identified sleeping position, e.g., in response to the identified posture being a supine position, the computer processor drives the vibrating mechanism to vibrate, and/or adjusts a parameter (e.g., an angle) of the surface upon which the subject is lying. The vibration typically nudges the subject to change his posture, while the adjustment of the parameter may nudge the subject to change his posture or actually move the subject into the new posture.
  • In some applications, an inflatable pillow is used and the computer processor adjusts a level of inflation of the inflatable pillow. For example, to inhibit coughing and/or snoring, the computer processor may drive an inflating mechanism to inflate the inflatable pillow, by communicating a signal to the inflating mechanism.
  • As described hereinabove, for some applications, the computer processor is configured to identify a sleep stage of the subject. For some such applications, the computer processor drives the vibrating mechanism to vibrate, and/or adjusts the parameter of the resting surface, further in response to the identified sleep stage. For example, the computer processor may drive the vibrating mechanism to vibrate, and/or adjust the parameter of the resting surface, in response to the identified sleep stage being within 5 minutes of an onset or an end of an REM sleep stage, since at these points in time, the “nudging” or moving is less likely to disturb the subject's sleep.
  • Reference is now made to FIG. 5, which is a schematic illustration of a sensor unit 80 disposed under a seat 82 of a vehicle, in accordance with some applications of the present invention. Sensor unit 80 is configured to monitor physiological parameters of a subject who is sitting on seat 82, and to generate a sensor signal in response thereto. Typically, the subject is an operator of the vehicle (e.g., the driver of a car, the pilot of an airplane, the driver of a train, etc.). A computer processor, which is typically like computer processor 28 described herein, is configured to receive and analyze the sensor signal for any one of a number of reasons.
  • Typically, the computer processor derives vital signs of the subject (such as heart rate, respiratory rate, and/or heart-rate variability) from the sensor signal. For some applications, the computer processor compares the subject's vital signs to a baseline of the subject that was derived during previous occasions when the subject operated the vehicle. Based upon this comparison, the computer processor may determine that the subject's vital signs have changed substantially from the baseline, and that the subject is unwell, drowsy, asleep, and/or under the influence of drugs or alcohol. In response thereto, the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
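A minimal sketch of the baseline comparison, assuming each vital sign's baseline is summarized by a mean and standard deviation from previous driving sessions; the three-sigma threshold and the sign names are illustrative choices, not part of the disclosure:

```python
def deviates_from_baseline(current, baseline_mean, baseline_std, n_sigma=3.0):
    """Flag a vital sign (e.g., heart rate) that departs from the subject's
    driving baseline by more than n_sigma standard deviations."""
    return abs(current - baseline_mean) > n_sigma * baseline_std

def check_driver(vitals, baselines):
    """vitals and baselines are keyed by sign name; baselines holds
    (mean, std) pairs. Returns the list of flagged signs, any of which
    might trigger an alert or disable the vehicle."""
    return [name for name, value in vitals.items()
            if deviates_from_baseline(value, *baselines[name])]
```

An empty list would mean the subject's vital signs are within their usual range.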
  • For some applications, the computer processor integrates the analysis of the sensor signal from sensor unit 80 with the analysis of a sensor signal from an additional sensor, which may be disposed in the subject's bed, for example. For example, the computer processor may determine that the subject has not had enough sleep based upon the analysis of the signals from both sensors. Or, the computer processor may derive, from the combination of the sensor signals, that the subject has had enough sleep, but appears to be unwell, and/or under the influence of drugs or alcohol. In response thereto, the computer processor may generate an alert to the driver, or to a remote location (such as to a family member, and/or to a corporate control center). Alternatively or additionally, the computer processor may automatically disable the vehicle.
  • For some applications, sensor units 80 are disposed underneath more than one seat in the vehicle. For example, sensor units may be disposed underneath the seats of a pilot and a co-pilot in an airplane (e.g., as described in WO 16/035073 to Shinar, which is incorporated herein by reference). Or, sensor units may be disposed underneath each of the seats in an airplane or a car. Based upon the sensor signals from the sensor units, the computer processor may determine that a child has been left alone in a car, and may generate an alert in response thereto. For example, the alert may be generated on the driver's and/or parents' cellular phone(s). Alternatively or additionally, the computer processor may determine the number of people in the car. (It is noted that the sensor is typically configured to distinguish between a person who is disposed upon the seat and an inanimate object (such as a suitcase, or backpack) that is disposed upon the seat.) In response thereto, the computer processor may generate seatbelt alerts, for example. Alternatively or additionally, the computer processor may automatically communicate with the billing system of a toll road for which prices are determined based upon the number of passengers in the car.
  • Typically, in order to facilitate the above-described applications, sensor unit 80 is configured to generate a sensor signal that is such that the computer processor is able to distinguish between artifacts from motion of the vehicle, and motion that is indicative of physiological parameters of the subject. Typically, the sensor unit includes (a) a housing, (b) at least one first motion sensor disposed within the housing, such that the first motion sensor generates a first sensor signal that is indicative of the motion of the vehicle, and (c) at least one second motion sensor disposed within the housing, such that the second motion sensor generates a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle. The computer processor is configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • For some applications, the first motion sensor is disposed within the housing, such that the first motion sensor is isolated from the motion of the subject, and/or such that the first motion sensor only detects motion that is due to motion of the vehicle. The computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal(s), and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
  • Reference is now made to FIGS. 6A-C, which are schematic illustrations of sensor unit 80, in accordance with respective applications of the present invention.
  • As shown in FIG. 6A, for some applications, sensor unit 80 includes a housing 90 at least a portion 92 of which is flexible. A fluid compartment 94, which is filled with a gas or a liquid, is disposed on an inner surface of the housing. A first motion sensor 96 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on a surface of the fluid compartment, and is configured to generate a first sensor signal. For some applications (not shown), two or more first motion sensors are disposed on the surface of the fluid compartment, and each of the first motion sensors generates a respective sensor signal. A second motion sensor 98 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on at least one inner surface of flexible portion 92 of housing 90. The second motion sensor is configured to generate a second sensor signal. For some applications, as shown, two or more motion sensors 98 are disposed on respective inner surfaces of flexible portion 92 of housing 90, and each of motion sensors 98 generates a respective sensor signal. The computer processor is configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • Typically, fluid compartment 94 isolates first motion sensor 96 from motion of the subject who is sitting on the seat, such that motion sensor 96 only detects motion that is due to motion of the vehicle. Second motion sensor(s) detects both motion of the vehicle, and motion of the subject, the motion of the subject being conveyed to the second motion sensor(s) via the flexible portion of the housing. Thus, the computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal, and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
  • As shown in FIGS. 6B and 6C, for some applications, sensor unit 80 includes a housing 100 that includes a flexible portion 102 and a rigid portion 104. At least one first motion sensor 106 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on at least one inner surface of the rigid portion of the housing, and is configured to generate a first sensor signal. For some applications, as shown in FIG. 6C, two or more first motion sensors are disposed on respective inner surfaces of the rigid portion of the housing, and each of motion sensors 106 generates a respective sensor signal. At least one second motion sensor 108 (e.g., a deformation sensor, a piezoelectric sensor, and/or an accelerometer) is disposed on at least one inner surface of flexible portion 102 of housing 100. The second motion sensor is configured to generate a second sensor signal. For some applications, as shown in FIGS. 6B and 6C, two or more motion sensors 108 are disposed on respective inner surfaces of flexible portion 102 of housing 100, and each of motion sensors 108 generates a respective sensor signal. The computer processor is configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
  • Typically, the rigidity of the rigid portion of the housing isolates first motion sensor(s) 106 from motion of the subject who is sitting on the seat, such that first motion sensor(s) 106 only detects motion that is due to motion of the vehicle. Second motion sensor(s) detects both motion of the vehicle, and motion of the subject, the motion of the subject being conveyed to the second motion sensor(s) via the flexible portion of the housing. Thus, the computer processor at least partially distinguishes between the motion of the subject and the motion of the vehicle by (a) deriving the motion of the vehicle from the first sensor signal(s), and (b) based upon the derived motion of the vehicle, subtracting the vehicular motion (i.e., subtracting the portion of the sensor signal that is generated by the motion of the vehicle) from the sensor signal that is generated by the second sensor(s).
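The two-step subtraction described above (deriving the vehicular motion from the first sensor signal, then removing it from the second) can be sketched with a single least-squares gain. The scalar-gain model is an assumption made for simplicity; a real system might instead use an adaptive filter:

```python
def remove_vehicle_motion(subject_signal, vehicle_signal):
    """subject_signal: samples from the second motion sensor (subject +
    vehicle); vehicle_signal: samples from the first motion sensor
    (vehicle only). Fits a least-squares gain relating the reference
    signal to the combined signal, then subtracts the fitted vehicular
    component, leaving (approximately) the subject's motion."""
    num = sum(s * v for s, v in zip(subject_signal, vehicle_signal))
    den = sum(v * v for v in vehicle_signal)
    gain = num / den if den else 0.0
    return [s - gain * v for s, v in zip(subject_signal, vehicle_signal)]
```

With a constant subject component superimposed on an alternating vehicle vibration, the alternating component is removed and the subject component survives.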
  • Typically, the techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference. For some applications, a sensor unit as described with reference to FIGS. 5-6C is used in an airplane, and the computer processor generates one or more of the following outputs, based upon analysis of the sensor signal:
  • (a) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies an elevated stress level of a subject, e.g., by identifying an elevated heart rate, and/or a decreased stroke volume, e.g., as described in WO 2015/008285 to Shinar, which is incorporated herein by reference. For example, in response to the pilot experiencing an elevated stress level, the computer processor may generate an alert to another member of the flight crew, and/or individuals on the ground. The computer processor may also analyze the signal of the co-pilot, and generate an alert in response to both the pilot and co-pilot experiencing an elevated stress level, since the presence of an elevated stress level in both individuals at the same time is likely to be indicative of an emergency situation. Similarly, an alert may be generated if two or more passengers experience an elevated stress level at the same time.
  • (b) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that it is likely that the subject is experiencing, or will soon experience, a clinical event, such as a heart attack. For example, if the pilot or one of the passengers is experiencing a heart attack, members of the flight crew, and/or a physician who is travelling on the airplane, may be alerted to the situation.
  • (c) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that it is at least somewhat likely that the subject is a carrier of a disease, such as severe acute respiratory syndrome (SARS). For example, if the computer processor identifies a change in the baseline heart rate of the subject without any correlation to motion of the subject, the computer processor may ascertain that the subject has likely experienced a rapid change in body temperature, which may indicate that the subject is sick. (The baseline heart rate is typically an average heart rate over a period of time, e.g., 1-2 hours.) In response, the computer processor may alert the flight crew to isolate the subject.
  • (d) An alert may be generated if, by analyzing the sensor signal, the computer processor identifies that the subject (in particular, the pilot or co-pilot) is drowsy or sleeping.
  • (e) A sleep study may be performed. For example, the computer processor may analyze the sensor signals from various passengers, and identify which passengers were sleeping at which times. In response, the computer processor may generate an output to help the airline improve the sleeping conditions on their aircraft (e.g., by reducing lighting, or increasing leg room).
  • The computer processor may also control the lighting, temperature, or other cabin-environment parameters, in order to facilitate a more pleasant travelling experience. For example, upon detecting that a significant number of passengers are sleeping or are trying to fall asleep, the lights in the cabin may be dimmed, and/or the movie that is playing may be stopped. Alternatively or additionally, meals may be served to the passengers only if a given number of passengers are awake. To help prevent deep vein thrombosis (DVT), passengers may be prompted to stand up and take a walk, if the computer processor detects that they have been sitting in place for too long.
  • Reference is now made to FIGS. 7A-B, which are schematic illustrations of subject-monitoring apparatus, in accordance with some applications of the present invention. Components of subject-monitoring apparatus 20 are as described hereinabove with reference to FIG. 1. For some applications, as shown in FIG. 7B, sensor 22 is disposed under a chair 111 that the subject sits upon, and is configured to monitor the subject while the subject is sitting on the chair, in the manner described hereinabove, mutatis mutandis. For some applications, techniques described herein are practiced in combination with techniques described in WO 16/035073 to Shinar, which is incorporated herein by reference.
  • Subject-monitoring apparatus 20 comprises a sensor 22, which is generally as described hereinabove, and is configured to monitor subject 24. Subject-monitoring apparatus 20 includes a control unit, which is typically a computer processor, such as computer processor 28 described hereinabove. As described hereinabove, the computer processor typically communicates with memory 29. The computer processor is typically a control unit that performs the algorithms described herein, including analyzing the signal from sensor 22. It is noted that, in general, in the specification and claims of the present application, the terms "computer processor" and "control unit" are used interchangeably, since steps of the techniques described herein are typically performed by a computer processor that functions as a control unit. Therefore, the present application refers to component 28 both as a "computer processor" and a "control unit."
  • In response to analyzing the signal from sensor 22, computer processor 28 controls a property (e.g., the content, genre, volume, frequency, and/or phase-shift) of a sound signal, and drives a speaker 110 to play the sound signal. Typically, as described hereinbelow, the property of the sound signal is controlled such as to help the subject fall asleep or remain asleep.
  • For example, if the subject is trying to fall asleep, the computer processor may select a sound signal of the “relaxing nature sounds” genre, and may further select the content of the signal to be the sound of waves hitting the seashore. The computer processor may further set the frequency of the sound signal (e.g., the frequency of the waves) to an offset less than the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate. In some applications, the computer processor controls the offset, in response to analyzing the sensor signal; for example, as the heart rate of the subject approaches a target “relaxed” heart rate, the computer processor may reduce the offset, such that the frequency of the sound signal is very close to or identical with the subject's heart rate. As the subject begins to fall asleep, the computer processor may reduce the volume of the sound signal.
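The heart-rate-tracking offset described here can be sketched as follows. The target rate, maximum offset, and scaling factor are illustrative assumptions, not values from the disclosure:

```python
def wave_sound_frequency(heart_rate_bpm, target_bpm=55.0, max_offset_bpm=5.0):
    """Return the wave-sound frequency in waves per second, set slightly
    below the subject's current heart rate; the offset shrinks toward
    zero as the heart rate approaches the target 'relaxed' rate, so the
    sound frequency converges on the heart rate itself."""
    gap = max(0.0, heart_rate_bpm - target_bpm)
    offset = min(max_offset_bpm, 0.5 * gap)   # proportional to remaining gap
    return (heart_rate_bpm - offset) / 60.0
```

As the measured heart rate falls session-to-session, repeated calls yield a wave frequency that trails the heart rate by an ever-smaller margin.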
  • In some applications, the computer processor controls a phase-shift of the sound signal with respect to a cardiac signal and/or a respiratory signal of the subject. For example, the computer processor may cause the sound of a wave hitting the seashore to occur a given amount of time (e.g., 300 milliseconds) before or after each heartbeat of the subject, or a given amount of time (e.g., 1 second) after each expiration of the subject.
  • In some applications, the computer processor ascertains that the subject is trying to fall asleep, at least in response to analyzing the sensor signal. For example, by analyzing the sensor signal, the computer processor may ascertain that the subject is awake and is exhibiting a large amount of movement indicative of restlessness in bed. Alternatively or additionally, the ascertaining is in response to one or more other factors, such as a signal from a light sensor that indicates a low level of ambient light in the room, and/or the time of day. In response to ascertaining that the subject is trying to fall asleep, the computer processor controls the property of the sound signal, as described hereinabove.
  • In some applications, by analyzing the sensor signal, the computer processor ascertains a sleep stage of the subject, and controls the property of the sound signal in response to the ascertained sleep stage. For example, in response to ascertaining that the subject has entered a slow-wave (i.e., deep) sleep stage, the volume of the sound signal may be reduced to a relatively low level (e.g., zero). (In identifying a sleep stage of a subject, as described throughout the present application, the computer processor may use one or more of the techniques described in (a) US 2007/0118054 to Pinhas (now abandoned), (b) Shinar et al., Computers in Cardiology 2001; Vol. 28: 593-596, and/or (c) Shinar Z et al., “Identification of arousals using heart rate beat-to-beat variability,” Sleep 21(3 Suppl):294 (1998), each of which is incorporated herein by reference.)
  • Typically, the computer processor controls the property of the sound signal further in response to a historical physiological parameter of the subject that was exhibited in response to a historical sound signal. For example, the computer processor may “learn” the subject's typical responses to particular sound-signal properties, and control the sound signal in response thereto. Thus, for example, if the subject has historically responded well to a “relaxing nature sounds” genre, but less so to a “classical music” genre, the computer processor may select the former genre for the subject. To determine whether the subject has historically responded well to particular properties of the sound signal, the computer processor looks at some or all of historical physiological parameters such as a quality of sleep, a time-to-fall-asleep, a heart-rate-variability, a change in heart rate, a change in respiratory rate, a change in heart-rate-variability, a change in blood pressure, a rate of change in heart rate, a rate of change in respiratory rate, a rate of change in heart-rate-variability, and a rate of change in blood pressure.
  • In some applications, the computer processor controls the frequency of the sound signal by synthesizing the sound signal, or by selecting a pre-recorded sound signal that has the desired frequency; in other words, the computer processor selects the content of the signal, without the user's input. In other applications, the computer processor selects content of the sound signal in response to a manual input, e.g., an input entered via user interface device 35 (FIG. 1). For example, the subject may select a particular piece of classical music, and the computer processor may then control properties (such as the frequency, i.e., the tempo) of that particular piece. This may be done, for example, using appropriate software, such as Transcribe!™ by Seventh String Software of London, UK.
  • For some applications, in response to parameters of the signal detected by sensor 22, the computer processor controls a property of light (such as intensity, flicker frequency, or color) emitted by a light 112 in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110, mutatis mutandis. For example, the computer processor may select a light signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep. Alternatively or additionally, the computer processor may modulate the property of the light at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal. Further alternatively or additionally, the computer processor may ascertain a sleep stage of the subject, and modulate the property of the light in response to the ascertained sleep stage. For some applications, the computer processor controls the property of the light further in response to a historical physiological parameter of the subject that was exhibited in response to a historical light signal. For example, the computer processor may “learn” the subject's typical responses to particular light-signal properties, and control the light in response thereto. The computer processor may control parameters of light 112, as an alternative to, or in addition to, controlling properties of the sound that is generated by speaker 110.
  • For some applications, in response to parameters of the signal detected by sensor 22, the computer processor controls a property of light (such as intensity, flicker frequency, or color) that is emitted by a screen 122 of a device that the subject is using in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110, mutatis mutandis. For example, the device may be a laptop computer 32 (FIG. 1), a tablet device 34 (FIG. 1), a smartphone 36 (FIG. 7B), and/or a TV 124 (FIG. 7B). For example, the computer processor may select a light signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep. Alternatively or additionally, the computer processor may modulate the property of the light at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal. Further alternatively or additionally, the computer processor may ascertain a sleep stage of the subject, and modulate the property of the light in response to the ascertained sleep stage. For some applications, the computer processor controls the property of the light further in response to a historical physiological parameter of the subject that was exhibited in response to a historical light signal. For example, the computer processor may “learn” the subject's typical responses to particular light-signal properties, and control the light in response thereto. The computer processor may control parameters of light emitted by screen 122, as an alternative to, or in addition to, controlling parameters of the sound that is generated by speaker 110, and/or light that is generated by light 112.
  • For some applications, a vibrating element 126 is disposed underneath a surface of chair 111 upon which the subject sits. Alternatively (not shown), a vibrating element may be disposed underneath the surface of the bed upon which the subject lies. For some applications, in response to parameters of the signal detected by sensor 22, the computer processor controls a property of the vibration (such as vibrating frequency, or a strength of vibration) that is applied to the subject by the vibrating element, in a generally similar manner to that described with respect to controlling the sound that is generated by speaker 110, mutatis mutandis. For example, the computer processor may select a vibration signal that causes the subject to enter a relaxed state, in response to detecting that the subject is trying to fall asleep. Alternatively or additionally, the computer processor may modulate the property of the vibration at a frequency of modulation that is based upon the subject's current heart rate or respiratory rate, in order to facilitate slowing of the subject's heart rate and/or respiratory rate, as described hereinabove with respect to the sound signal. Further alternatively or additionally, the computer processor may ascertain a sleep stage of the subject, and modulate the property of the vibration in response to the ascertained sleep stage. For some applications, the computer processor controls the property of the vibration further in response to a historical physiological parameter of the subject that was exhibited in response to a historical vibration signal. For example, the computer processor may “learn” the subject's typical responses to particular vibration-signal properties, and control the vibrating element in response thereto. 
The computer processor may control parameters of the vibration of the vibrating element, as an alternative to, or in addition to, controlling parameters of the sound that is generated by speaker 110, and/or light that is generated by light 112 or by screen 122.
  • It is noted that, typically, for any of the embodiments described with reference to FIGS. 7A-B, in response to analysis of the signal from sensor 22, the computer processor controls a property of a stimulus-providing device, in a manner that changes a physiological parameter of the subject, such as the subject's heart rate, respiration rate, or sleep latency period. The stimulus-providing device may provide an audio stimulus (e.g., speaker 110), a visual stimulus (e.g., light 112 or screen 122), or a tactile stimulus (e.g., vibrating element 126). Typically, the stimulus is provided to the subject in a manner that does not require any compliance by the user, during the provision of the stimulus to the subject. (Prior to the monitoring of the subject and providing the stimulus to the subject, certain actions (such as purchasing the sensor, placing the sensor under the subject's mattress or chair, downloading software for use with the subject-monitoring apparatus, configuring software for use with the subject-monitoring apparatus, or turning on the stimulus-providing device) may need to be performed. The term “without requiring subject compliance” should not be interpreted as excluding such actions.) Typically, the subject may perform routine activities (such as browsing the internet or watching TV), and while the subject is performing routine activities, the computer processor automatically controls a property of the stimulus that is provided to the subject in the above-described manner. Furthermore, typically the stimulus is provided to the subject in a manner that does not require the subject to consciously change the physiological parameter upon which the stimulus has an effect. Rather, the stimulus is provided to the subject such that the physiological parameter of the subject is changed without requiring the subject to consciously adjust the physiological parameter.
  • FIG. 8 is a flowchart showing steps that are performed by a computer processor in order to monitor a subject who is pregnant, in accordance with some applications of the present invention. A pregnant woman's heart rate is typically expected to increase during pregnancy and be higher than the woman's heart rate prior to pregnancy. For some applications, during a calibration phase, a female subject is monitored using sensor 22 before pregnancy. Computer processor receives the sensor signal (step 130), analyzes the sensor signal (step 132), and, based upon the analysis, determines a baseline heart rate (e.g., a baseline average daily heart rate, or a baseline heart rate at a given time period of the day, and/or at a given period of the subject's circadian cycle) for the subject (step 134). Based upon the baseline heart rate, the computer processor determines a pregnancy heart rate measure, which is indicative of what the subject's heart rate is expected to be (e.g., what the average daily heart rate, or the heart rate at a given time period of the day, and/or at a given period of the subject's circadian cycle is expected to be) during a healthy pregnancy (step 136). Typically, the computer processor determines a range of heart rates that are considered to be healthy when the subject is pregnant, based upon the determined baseline heart rate. When the subject is pregnant, during a pregnancy monitoring phase, the computer processor receives the sensor signal (step 138), analyzes the sensor signal (step 140), and, based upon the analysis of the signal, determines the subject's heart rate (step 142). The computer processor compares the heart rate to the pregnancy heart rate measure that was determined based upon the baseline heart rate (step 144). Based on the comparison, the computer processor determines whether the subject's pregnancy is healthy. 
For some applications, the computer processor generates an output (e.g., an alert) on an output device (as described hereinabove), in response to the comparison (step 146). For some applications, in response to detecting that the subject's heart rate has returned to the pre-pregnancy baseline heart rate, the computer processor generates an output that is indicative of a recommendation to visit a physician.
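The calibration-and-comparison flow of FIG. 8 can be sketched as follows. The assumed increase of roughly 10-20 beats/min over baseline, the function names, and the string alert are illustrative assumptions of this sketch, not clinical values or interfaces taken from the disclosure:

```python
def pregnancy_heart_rate_range(baseline_hr_bpm):
    """Map a pre-pregnancy baseline heart rate (steps 130-134) to a range
    of heart rates considered healthy during pregnancy (step 136).

    The 10-20 beats/min increase over baseline used here is an
    illustrative assumption only.
    """
    return (baseline_hr_bpm + 10.0, baseline_hr_bpm + 20.0)


def check_pregnancy_heart_rate(baseline_hr_bpm, measured_hr_bpm):
    """Compare a heart rate measured during pregnancy (steps 138-144)
    against the expected range; return an alert string, or None if the
    measurement falls within the range (step 146)."""
    low, high = pregnancy_heart_rate_range(baseline_hr_bpm)
    if not (low <= measured_hr_bpm <= high):
        return "alert: heart rate outside expected pregnancy range"
    return None
```

For example, with an assumed baseline of 65 beats/min, a measured pregnancy heart rate of 80 beats/min falls inside the expected range, whereas 66 beats/min (close to the pre-pregnancy baseline) triggers an alert.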
  • Reference is now made to FIGS. 9A-C, which show histograms of patients' cardiac interbeat intervals that were recorded in accordance with some applications of the present invention. As described hereinabove, for some applications, sensor 22 performs monitoring of the subject without contacting the subject or clothes the subject is wearing, and/or without viewing the subject or clothes the subject is wearing. For some applications, the sensor is configured to detect the subject's cardiac cycle, using techniques as described herein. In some cases, typically due to the non-contact nature of the sensing, some of the subject's heartbeats are not reliably detected. For some such applications, for each heartbeat, the computer processor determines a quality indicator that indicates the quality of the sensed heartbeat. For example, the computer processor may determine the signal-to-noise ratio of the signal, and compare the signal-to-noise ratio to a threshold.
  • For some applications, the computer processor selects a subset of heartbeats, based upon the qualities of the heartbeats, and some steps of the subsequent analysis (as described herein) are performed only with respect to the subset of heartbeats. For some applications, only in cases in which two consecutive heartbeats have a quality indicator that exceeds a threshold, the interbeat interval is calculated and/or is selected for use in subsequent analysis. For some applications, the computer processor builds a histogram of the selected interbeat intervals. The computer processor analyzes the selected interbeat intervals over a period of time, and in response thereto, the computer processor determines whether the subject is healthy or is suffering from arrhythmia, which type of arrhythmia the subject is suffering from, and/or identifies or predicts arrhythmia episodes. For example, the computer processor may build a histogram of the selected interbeat intervals and may perform the above-described steps by analyzing the histogram.
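The quality-gated selection of interbeat intervals described above can be sketched as follows; the list-based representation of beat times and qualities is an assumption made for illustration:

```python
def select_interbeat_intervals(beat_times, qualities, quality_threshold):
    """Compute interbeat intervals, keeping an interval only when both of
    the two consecutive beats that bound it exceed the quality threshold.

    beat_times : beat timestamps in seconds
    qualities  : per-beat quality indicators (e.g., signal-to-noise ratio)
    """
    intervals = []
    for i in range(1, len(beat_times)):
        # Gate on the quality of BOTH beats bounding the interval.
        if qualities[i - 1] > quality_threshold and qualities[i] > quality_threshold:
            intervals.append(beat_times[i] - beat_times[i - 1])
    return intervals
```

With beats at 0.0, 0.9, 1.8, and 2.6 seconds and one poor-quality beat (the third), only the first interval survives the gating, since the two intervals touching the poor-quality beat are discarded.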
  • FIGS. 9A-C show sample histograms that were constructed using the above-described technique. The x-axes of the histograms measure the time at which the interbeat interval measurement was recorded, the y-axes measure the interbeat interval, and the color legend measures the amplitude of the histogram at each interbeat interval (with a lighter color representing greater amplitude). FIG. 9A shows measurements recorded from a healthy subject, there being only one peak at approximately 900 ms. FIG. 9B is the histogram of an arrhythmic subject, the histogram including two dominant peaks shown by the two light lines at approximately 450 ms and 800 ms. FIG. 9C is the histogram of a subject who begins with a normal cardiac rhythm and, at about an x-axis time of 5,500 sec, starts to show an arrhythmia that is manifested by the much wider distribution of the histogram.
  • In accordance with the above, for some applications, in response to the computer processor identifying two distinct peaks in a histogram that is plotted using the techniques described herein (or performing an equivalent algorithmic operation), an alert is generated that an arrhythmia event may be taking place. Alternatively or additionally, the computer processor may generate an alert in response to identifying that the width of a peak of a histogram exceeds a threshold (or performing an equivalent algorithmic operation). For example, the width of the peak may be compared to a threshold that is determined based upon population averages according to the age and/or other indications of the subject (such as, a level of fitness of the subject).
  • For some applications, in response to the computer processor identifying two distinct peaks in a histogram that is plotted using the techniques described herein (or performing an equivalent algorithmic operation), the computer processor performs the following steps. The computer processor identifies heartbeats belonging to respective interbeat interval groups (i.e., which heartbeats had an interbeat interval that corresponds to a first one of the peaks, and which heartbeats had an interbeat interval corresponding to the second one of the peaks). The average amplitude of the signal of each of these groups is then calculated. For some applications, the computer processor generates an output that is indicative of the average amplitude of each of the peaks, and/or the interbeat interval of each of the peaks. Alternatively or additionally, based upon these data, the computer processor automatically determines a condition of the subject. For example, the computer processor may determine which category of arrhythmia the subject is suffering from, e.g., atrial fibrillation or ventricular fibrillation.
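The grouping-by-peak step described above can be sketched as follows; the tolerance value and the dictionary-based return format are assumptions of this illustration:

```python
def group_beats_by_peak(intervals, amplitudes, peak1_s, peak2_s, tol_s=0.05):
    """Assign each heartbeat's interbeat interval to the histogram peak it
    falls near (within a tolerance, in seconds) and return the average
    beat amplitude of each group, keyed by peak location.
    """
    groups = {peak1_s: [], peak2_s: []}
    for ibi, amp in zip(intervals, amplitudes):
        for peak in (peak1_s, peak2_s):
            if abs(ibi - peak) <= tol_s:
                groups[peak].append(amp)
                break
    # Average amplitude per group (None if no beats fell near a peak).
    return {peak: (sum(a) / len(a) if a else None) for peak, a in groups.items()}
```

For the two-peak case of FIG. 9B (peaks near 450 ms and 800 ms), beats with intervals near 0.45 s and near 0.80 s would be averaged separately, giving one average amplitude per peak.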
  • It is noted that, although the analysis of the interbeat intervals is described as being performed using histogram analysis, the techniques described herein may be combined with other types of analysis that would yield similar results, mutatis mutandis. For example, the computer processor may perform algorithmic steps that do not include a step of generating a histogram, but which analyze the subject's interbeat interval over time, in a similar manner to that described hereinabove.
  • Reference is now made to FIG. 10, which shows components of a subject's cardiac cycle that were detected, in accordance with some applications of the present invention. For some applications, sensor 22 is used to monitor a cardiac-related signal of the subject. For some applications, a bank of matched filters with varying filter parameters (e.g., varying width properties) is applied to the raw signal, and one of the filtered signals is selected by the computer processor. For example, the filter having the greatest signal-to-noise ratio may be selected, by selecting the filter that generates the highest ratio of the main lobe to the side lobes in the filtered signal. Typically, the filters are designed to have a main lobe with a full-width-half-maximum value that fits a human biological beat as recorded with the contact-free sensor under the mattress. The bank of filters typically includes filters having a range of relevant full-width-half-maximum values for biological signals. Typically, the filters are zero-mean, e.g., in order to remove any trends, movements, or respiration.
  • Typically, the selection of which filter to use is repeated in response to certain events. For some applications, the selection of a filter is repeated if the signal quality falls below a threshold. Alternatively or additionally, the filter selection is repeated at fixed time intervals (e.g., once every 5 minutes, 10 minutes, or 15 minutes). Further, alternatively or additionally, the filter selection is repeated in response to detecting motion of the subject, e.g., large body motion of the subject. For example, in response to the sensor signal indicating that the subject has undergone motion (e.g., large body motion), the computer processor may perform the filter selection.
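The filter-bank selection can be sketched as follows. The score used here (output peak over median absolute output level) is one simple proxy for the main-lobe-to-side-lobe ratio described above, and is an assumption of the sketch rather than the disclosed criterion:

```python
import numpy as np

def select_matched_filter(raw_signal, filter_bank):
    """Apply each zero-mean matched filter in the bank to the raw signal
    and select the index of the filter whose output has the highest ratio
    of peak level to background (side-lobe) level.
    """
    best_index, best_score = None, -np.inf
    for i, filt in enumerate(filter_bank):
        out = np.convolve(raw_signal, filt, mode="same")
        peak = np.max(np.abs(out))
        # Crude side-lobe estimate: median absolute level of the output
        # (small constant avoids division by zero).
        sidelobe = np.median(np.abs(out)) + 1e-12
        score = peak / sidelobe
        if score > best_score:
            best_index, best_score = i, score
    return best_index
```

For an impulse-like input, each filter's output is just the filter itself, so the filter with the strongest main lobe wins; in practice the score would be recomputed whenever signal quality drops, at fixed intervals, or after large body motion, as described above.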
  • Referring to FIG. 10, a signal that was detected using the above-described technique is shown above the corresponding ECG signal. It may be observed that certain cardiac events, which correlate with the ECG signal, may be extracted from the sensor signal. For example, the following mechanical events can typically be extracted from the sensor signal: mitral valve closure (MC), aortic valve opening (AO), systolic ejection (SE), aortic valve closure (AC), and mitral valve opening (MO). Therefore, for some applications, a cardiac signal that is detected using techniques described herein is analyzed by the computer processor, and one or more of the events are identified. For some applications, in this manner, the computer processor monitors mechanical functioning of the heart. For example, the computer processor may use the identified events to measure the subject's left ventricular ejection time. For some applications, the computer processor analyzes the subject's cardiac cycle, by using the above-described technique in combination with ECG sensing.
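As one illustration of using the identified events, left ventricular ejection time is conventionally measured as the interval from aortic valve opening (AO) to aortic valve closure (AC). A minimal sketch follows; the dictionary layout and the sample timestamps are assumptions of the sketch:

```python
def left_ventricular_ejection_time(event_times):
    """Given timestamps (in seconds) of identified mechanical events, e.g.
    {"MC": ..., "AO": ..., "SE": ..., "AC": ..., "MO": ...}, estimate the
    left ventricular ejection time as the AO-to-AC interval."""
    return event_times["AC"] - event_times["AO"]
```

With illustrative event times of AO at 0.05 s and AC at 0.35 s after the start of a beat, the estimated ejection time would be about 0.30 s.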
  • In general, computer processor 28 may be embodied as a single computer processor 28, or a cooperatively networked or clustered set of computer processors. Computer processor 28 is typically a programmed digital computing device comprising a central processing unit (CPU), random access memory (RAM), non-volatile secondary storage, such as a hard drive or CD-ROM drive, network interfaces, and/or peripheral devices. Program code, including software programs, and data are loaded into the RAM for execution and processing by the CPU and results are generated for display, output, transmittal, or storage, as is known in the art. Typically, computer processor 28 is connected to one or more sensors via one or more wired or wireless connections. Computer processor 28 is typically configured to receive signals (e.g., motion signals) from the one or more sensors, and to process these signals as described herein. In the context of the claims and specification of the present application, the term “motion signal” is used to denote any signal that is generated by a sensor, upon the sensor sensing motion. Such motion may include, for example, respiratory motion, cardiac motion, or other body motion, e.g., large body-movement. Similarly, the term “motion sensor” is used to denote any sensor that senses motion, including the types of motion delineated above.
  • Applications of the invention described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as computer processor 28. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Typically, the computer-usable or computer readable medium is a non-transitory computer-usable or computer readable medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
  • A data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28) coupled directly or indirectly to memory elements (e.g., memory 29) through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the invention.
  • Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.
  • Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like, and conventional procedural programming languages, such as the C programming language or similar programming languages.
  • It will be understood that each block of the flowcharts shown in FIGS. 3, 4, and 8, and combinations of blocks in the flowcharts, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer (e.g., computer processor 28) or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowcharts and/or algorithms described in the present application. These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart blocks and algorithms. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowcharts and/or algorithms described in the present application.
  • Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described with reference to FIG. 3, computer processor 28 typically acts as a special purpose temperature control computer processor, when programmed to perform the algorithms described with reference to FIG. 4, computer processor 28 typically acts as a special purpose apnea monitoring processor, and when programmed to perform the algorithms described with reference to FIG. 8, computer processor 28 typically acts as a special purpose pregnancy monitoring processor. Typically, the operations described herein that are performed by computer processor 28 transform the physical state of memory 29, which is a real physical article, to have a different magnetic polarity, electrical charge, or the like depending on the technology of the memory that is used.
  • Techniques described herein may be practiced in combination with techniques described in one or more of the following patents and patent applications, which are incorporated herein by reference. In some applications, techniques and apparatus described in one or more of the following patents and patent applications, which are incorporated herein by reference, are combined with techniques and apparatus described herein:
      • U.S. patent application Ser. No. 11/048,100, filed Jan. 31, 2005, which issued as U.S. Pat. No. 7,077,810;
      • U.S. patent application Ser. No. 11/197,786, filed Aug. 3, 2005, which issued as U.S. Pat. No. 7,314,451;
      • U.S. patent application Ser. No. 11/446,281, filed Jun. 2, 2006, which issued as U.S. Pat. No. 8,376,954;
      • U.S. patent application Ser. No. 11/552,872, filed Oct. 25, 2006, now abandoned, which published as US 2007/0118054;
      • U.S. patent application Ser. No. 11/755,066, filed May 30, 2007, now abandoned, which published as US 2008/0114260;
      • U.S. patent application Ser. No. 11/782,750, filed Jul. 25, 2007, which issued as U.S. Pat. No. 8,403,865;
      • U.S. patent application Ser. No. 12/113,680, filed May 1, 2008, now abandoned, which published as US 2008/0275349;
      • U.S. patent application Ser. No. 12/842,634, filed Jul. 23, 2010, which issued as U.S. Pat. No. 8,517,953;
      • U.S. patent application Ser. No. 12/938,421, filed Nov. 3, 2010, which issued as U.S. Pat. No. 8,585,607;
      • U.S. patent application Ser. No. 12/991,749, filed Nov. 9, 2010, which issued as U.S. Pat. No. 8,821,418;
      • U.S. patent application Ser. No. 13/107,772, filed May 13, 2011, which issued as U.S. Pat. No. 8,491,492;
      • U.S. patent application Ser. No. 13/305,618, filed Nov. 28, 2011, now abandoned, which published as US 2012/0132211;
      • U.S. patent application Ser. No. 13/389,200, filed Jun. 13, 2012, now abandoned, which published as US 2012/0253142;
      • U.S. patent application Ser. No. 13/750,957, filed Jan. 25, 2013, which issued as U.S. Pat. No. 8,603,010;
      • U.S. patent application Ser. No. 13/750,962, filed Jan. 25, 2013, which issued as U.S. Pat. No. 8,679,034;
      • U.S. patent application Ser. No. 13/863,293, filed Mar. 15, 2013, now abandoned, which published as US 2013/0245502;
      • U.S. patent application Ser. No. 13/906,325, filed May 30, 2013, which issued as U.S. Pat. No. 8,882,684;
      • U.S. patent application Ser. No. 13/921,915, filed Jun. 19, 2013, which issued as U.S. Pat. No. 8,679,030;
      • U.S. patent application Ser. No. 14/019,371, filed Sep. 5, 2013, which published as US 2014/0005502;
      • U.S. patent application Ser. No. 14/020,574, filed Sep. 6, 2013, which issued as U.S. Pat. No. 8,731,646;
      • U.S. patent application Ser. No. 14/054,280, filed Oct. 15, 2013, which issued as U.S. Pat. No. 8,734,360;
      • U.S. patent application Ser. No. 14/150,115, filed Jan. 8, 2014, which issued as U.S. Pat. No. 8,840,564;
      • U.S. patent application Ser. No. 14/231,855, filed Apr. 1, 2014, which issued as U.S. Pat. No. 8,992,434;
      • U.S. patent application Ser. No. 14/454,300, filed Aug. 7, 2014, which issued as U.S. Pat. No. 8,942,779;
      • U.S. patent application Ser. No. 14/458,399, filed Aug. 13, 2014, which issued as U.S. Pat. No. 8,998,830;
      • U.S. patent application Ser. No. 14/474,357, filed Sep. 2, 2014, which published as US 2014/0371635;
      • U.S. patent application Ser. No. 14/557,654, filed Dec. 2, 2014, which issued as U.S. Pat. No. 9,026,199;
      • U.S. patent application Ser. No. 14/631,978, filed Feb. 26, 2015, which published as US 2015/0164438;
      • U.S. patent application Ser. No. 14/624,904, filed Feb. 18, 2015, which published as US 2015/0164433;
      • U.S. patent application Ser. No. 14/663,835, filed Mar. 20, 2015, which published as US 2015/0190087;
      • U.S. patent application Ser. No. 14/810,814, filed Jul. 28, 2015, which published as US 2015/0327792;
      • International Patent Application PCT/IL2005/000113, which published as WO 2005/074361;
      • International Patent Application PCT/IL2006/000727, which published as WO 2006/137067;
      • International Patent Application PCT/IB2006/002998, which published as WO 2007/052108;
      • International Patent Application PCT/IL2008/000601, which published as WO 2008/135985;
      • International Patent Application PCT/IL2009/000473, which published as WO 2009/138976;
      • International Patent Application PCT/IL2011/050045, which published as WO 2012/077113;
      • International Patent Application PCT/IL2013/050283, which published as WO 2013/150523;
      • International Patent Application PCT/IL2014/050644, which published as WO 2015/008285; and
      • International Patent Application No. PCT/IL2015/050880 to Shinar, which published as WO 2016/035073.
It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art, which would occur to persons skilled in the art upon reading the foregoing description.

Claims (22)

1. Apparatus for use with a seat of a vehicle, the apparatus comprising:
a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
a housing;
at least one first motion sensor disposed within the housing, such that the first motion sensor generates a first sensor signal that is indicative of the motion of the vehicle;
at least one second motion sensor disposed within the housing, such that the second motion sensor generates a second sensor signal that is indicative of the motion of the subject and the motion of the vehicle; and
a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
2. The apparatus according to claim 1, wherein the first motion sensor is disposed within the housing such that the first motion sensor is isolated from the motion of the subject and only detects motion that is due to the motion of the vehicle.
3. The apparatus according to claim 1, wherein the computer processor is configured to:
derive the motion of the vehicle from the first sensor signal, and
based upon the derived motion of the vehicle, subtract, from the second sensor signal, a portion of the second sensor signal that is generated by the motion of the vehicle.
4. The apparatus according to claim 1, wherein:
at least a portion of the housing is flexible,
the apparatus further comprises a fluid compartment disposed on an inner surface of the housing,
the at least one first motion sensor is disposed on a surface of the fluid compartment, and
the at least one second motion sensor is disposed on at least one inner surface of the flexible portion of the housing.
5. The apparatus according to claim 4, wherein the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
6. The apparatus according to claim 4, wherein the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
7. The apparatus according to claim 4, wherein the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
8. The apparatus according to claim 1, wherein:
the housing comprises flexible and rigid portions;
the at least one first motion sensor is disposed on at least one inner surface of the rigid portion of the housing; and
the at least one second motion sensor is disposed on at least one inner surface of the flexible portion of the housing, and is configured to generate the second sensor signal.
9. The apparatus according to claim 8, wherein the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
10. The apparatus according to claim 8, wherein the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
11. The apparatus according to claim 8, wherein the at least one first motion sensor comprises two or more first motion sensors disposed on respective inner surfaces of the rigid portion of the housing.
12. The apparatus according to claim 8, wherein the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
13. Apparatus for use with a seat of a vehicle, the apparatus comprising:
a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
a housing at least a portion of which is flexible;
a fluid compartment disposed on an inner surface of the housing;
at least one first motion sensor disposed on a surface of the fluid compartment, and configured to generate a first sensor signal;
at least one second motion sensor disposed on at least one inner surface of the flexible portion of the housing, the second motion sensor being configured to generate a second sensor signal; and
a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
14. The apparatus according to claim 13, wherein the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
15. The apparatus according to claim 13, wherein the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
16. The apparatus according to claim 13, wherein the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
17. Apparatus for use with a seat of a vehicle, the apparatus comprising:
a sensor unit configured to be placed underneath the seat and configured to detect motion of a subject who is sitting on the seat during motion of the vehicle, the sensor unit comprising:
a housing comprising flexible and rigid portions;
at least one first motion sensor disposed on at least one inner surface of the rigid portion of the housing, and configured to generate a first sensor signal;
at least one second motion sensor disposed on at least one inner surface of the flexible portion of the housing, and configured to generate a second sensor signal; and
a computer processor configured to at least partially distinguish between the motion of the subject and the motion of the vehicle by analyzing the first and second sensor signals.
18. The apparatus according to claim 17, wherein the first motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
19. The apparatus according to claim 17, wherein the second motion sensor comprises a sensor selected from the group consisting of: a deformation sensor, a piezoelectric sensor, and an accelerometer.
20. The apparatus according to claim 17, wherein the at least one first motion sensor comprises two or more first motion sensors disposed on respective inner surfaces of the rigid portion of the housing.
21. The apparatus according to claim 17, wherein the at least one second motion sensor comprises two or more second motion sensors disposed on respective inner surfaces of the flexible portion of the housing.
22-42. (canceled)
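Claims 1–3 recite a computer processor that at least partially distinguishes subject motion from vehicle motion by deriving the vehicle component from the first (reference) sensor signal and subtracting the corresponding portion from the second (composite) sensor signal. As an illustration only, not the patented implementation, the following minimal sketch performs such a reference subtraction using a single least-squares coupling gain; the function name, signal model, and synthetic signals are hypothetical:

```python
import numpy as np

def remove_vehicle_motion(second_signal: np.ndarray, first_signal: np.ndarray) -> np.ndarray:
    """Subtract the vehicle-motion component, derived from the reference
    (vehicle-only) channel, from the composite subject-plus-vehicle channel.

    Assumes the vehicle component in the second signal is approximately a
    scaled copy of the first signal; the scale is fit by least squares.
    """
    ref = first_signal - first_signal.mean()    # vehicle-only reference, zero-mean
    sig = second_signal - second_signal.mean()  # subject + vehicle, zero-mean
    gain = np.dot(sig, ref) / np.dot(ref, ref)  # least-squares coupling gain
    return sig - gain * ref                     # residual approximates subject motion

# Synthetic demonstration: 12 Hz road vibration plus a weak 1.2 Hz
# subject component (e.g., respiration-band motion of the occupant).
fs = 100.0
t = np.arange(0, 10, 1 / fs)
vehicle = 0.8 * np.sin(2 * np.pi * 12.0 * t)
subject = 0.05 * np.sin(2 * np.pi * 1.2 * t)
first = vehicle                   # sensor isolated from the subject's motion
second = subject + 0.9 * vehicle  # sensor coupled to both motions
clean = remove_vehicle_motion(second, first)
```

In practice the coupling between the two channels varies over time, so an adaptive filter (e.g., LMS) tracking a time-varying gain would typically replace the single static gain used in this sketch.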
US15/431,842 2016-02-14 2017-02-14 Apparatus and methods for monitoring a subject Abandoned US20170231545A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/431,842 US20170231545A1 (en) 2016-02-14 2017-02-14 Apparatus and methods for monitoring a subject
US16/877,543 US20200275876A1 (en) 2016-02-14 2020-05-19 Apparatus and methods for monitoring a subject
US17/103,826 US11547336B2 (en) 2016-02-14 2020-11-24 Apparatus and methods for monitoring a subject
US18/152,457 US20230157602A1 (en) 2016-02-14 2023-01-10 Apparatus and methods for monitoring a subject

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662295077P 2016-02-14 2016-02-14
US15/431,842 US20170231545A1 (en) 2016-02-14 2017-02-14 Apparatus and methods for monitoring a subject

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/877,543 Continuation US20200275876A1 (en) 2016-02-14 2020-05-19 Apparatus and methods for monitoring a subject

Publications (1)

Publication Number Publication Date
US20170231545A1 true US20170231545A1 (en) 2017-08-17

Family

ID=58266699

Family Applications (4)

Application Number Title Priority Date Filing Date
US15/431,842 Abandoned US20170231545A1 (en) 2016-02-14 2017-02-14 Apparatus and methods for monitoring a subject
US16/877,543 Abandoned US20200275876A1 (en) 2016-02-14 2020-05-19 Apparatus and methods for monitoring a subject
US17/103,826 Active US11547336B2 (en) 2016-02-14 2020-11-24 Apparatus and methods for monitoring a subject
US18/152,457 Pending US20230157602A1 (en) 2016-02-14 2023-01-10 Apparatus and methods for monitoring a subject

Family Applications After (3)

Application Number Title Priority Date Filing Date
US16/877,543 Abandoned US20200275876A1 (en) 2016-02-14 2020-05-19 Apparatus and methods for monitoring a subject
US17/103,826 Active US11547336B2 (en) 2016-02-14 2020-11-24 Apparatus and methods for monitoring a subject
US18/152,457 Pending US20230157602A1 (en) 2016-02-14 2023-01-10 Apparatus and methods for monitoring a subject

Country Status (2)

Country Link
US (4) US20170231545A1 (en)
WO (1) WO2017138005A2 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019053719A1 (en) 2017-09-17 2019-03-21 Earlysense Ltd. Apparatus and methods for monitoring a subject
US10661682B2 (en) 2018-07-17 2020-05-26 Honda Motor Co., Ltd. Vehicle seat haptic system and method
US10786211B2 (en) 2008-05-12 2020-09-29 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US10939829B2 (en) 2004-02-05 2021-03-09 Earlysense Ltd. Monitoring a condition of a subject
US20210307683A1 (en) * 2020-04-01 2021-10-07 UDP Labs, Inc. Systems and Methods for Remote Patient Screening and Triage
US11147476B2 (en) 2010-12-07 2021-10-19 Hill-Rom Services, Inc. Monitoring a sleeping subject
US11696691B2 (en) 2008-05-01 2023-07-11 Hill-Rom Services, Inc. Monitoring, predicting, and treating clinical episodes
US11813075B2 (en) * 2020-01-24 2023-11-14 Hb Innovations, Inc. Combinational output sleep system
US11812936B2 (en) 2014-09-03 2023-11-14 Hill-Rom Services, Inc. Apparatus and methods for monitoring a sleeping subject
WO2024095033A1 (en) 2022-10-31 2024-05-10 Bosch Car Multimedia Portugal, S.A. Occupant action recognition

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220015681A1 (en) 2018-11-11 2022-01-20 Biobeat Technologies Ltd. Wearable apparatus and method for monitoring medical properties
DE102022004333A1 (en) 2022-11-21 2023-10-19 Mercedes-Benz Group AG Method for adjusting a seat and the adjustable seat

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6438477B1 (en) * 2002-02-27 2002-08-20 Delphi Technologies, Inc. Vehicle seat occupant characterization method including empty seat detection
US6575902B1 (en) * 1999-01-27 2003-06-10 Compumedics Limited Vigilance monitoring system
US20040061615A1 (en) * 2000-12-21 2004-04-01 Mitsuru Takashima Doze alarm for driver using enclosed air sound sensor
US20060253238A1 (en) * 2005-05-06 2006-11-09 Murphy Morgan D Method of distinguishing between adult and cinched car seat occupants of a vehicle seat
US20080156602A1 (en) * 2006-05-31 2008-07-03 Techno-Sciences, Inc. Adaptive energy absorption system for a vehicle seat
US20100056937A1 (en) * 2006-09-28 2010-03-04 Aisin Seiki Kabushiki Kaisha Heartbeat detecting apparatus
US20150229341A1 (en) * 2013-04-06 2015-08-13 Honda Motor Co., Ltd. System and method for capturing and decontaminating photoplethysmopgraphy (ppg) signals in a vehicle
US20150317834A1 (en) * 2014-05-01 2015-11-05 Adam G. Poulos Determining coordinate frames in a dynamic environment

Family Cites Families (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4226972A1 (en) 1992-08-14 1994-02-17 Vladimir Dr Blazek Non-invasive measurement of venous and arterial blood pressure in finger or toe - correlating slow increase from zero air pressure in cuff, caused by electronically-controlled pump, with change in venous signal from reflection or transmission photoplethysmograph sensor
US6217525B1 (en) 1998-04-30 2001-04-17 Medtronic Physio-Control Manufacturing Corp. Reduced lead set device and method for detecting acute cardiac ischemic conditions
EP1416849A1 (en) 2001-08-08 2004-05-12 Product Systems Limited Activity monitoring device
US7190994B2 (en) * 2002-03-18 2007-03-13 Sonomedica, Inc. Method and system for generating a likelihood of cardiovascular disease, analyzing cardiovascular sound signals remotely from the location of cardiovascular sound signal acquisition, and determining time and phase information from cardiovascular sound signals
JP4277073B2 (en) * 2003-05-21 2009-06-10 株式会社デルタツーリング Seat load body determination device
US8942779B2 (en) 2004-02-05 2015-01-27 Early Sense Ltd. Monitoring a condition of a subject
US7077810B2 (en) 2004-02-05 2006-07-18 Earlysense Ltd. Techniques for prediction and monitoring of respiration-manifested clinical episodes
US20070118054A1 (en) 2005-11-01 2007-05-24 Earlysense Ltd. Methods and systems for monitoring patients for clinical episodes
US8403865B2 (en) 2004-02-05 2013-03-26 Earlysense Ltd. Prediction and monitoring of clinical episodes
US7314451B2 (en) 2005-04-25 2008-01-01 Earlysense Ltd. Techniques for prediction and monitoring of clinical episodes
US10194810B2 (en) 2004-02-05 2019-02-05 Earlysense Ltd. Monitoring a condition of a subject
US9131891B2 (en) 2005-11-01 2015-09-15 Earlysense Ltd. Monitoring a condition of a subject
US8491492B2 (en) 2004-02-05 2013-07-23 Earlysense Ltd. Monitoring a condition of a subject
DE602005016556D1 (en) * 2004-07-20 2009-10-22 Vincent Spruytte SYSTEM AND METHOD FOR DETECTING MOVEMENTS
US7542804B2 (en) 2004-12-03 2009-06-02 Alfred E. Mann Foundation For Scientific Research Neuromuscular stimulation to avoid pulmonary embolisms
DE102004058722A1 (en) 2004-12-06 2006-06-14 Airbus Deutschland Gmbh Electromagnetic resonance stimulation device for seat has receiving region for object, and massage device inserted within seat
JP2007130182A (en) 2005-11-09 2007-05-31 Toshiba Corp Illumination controller, illumination control system, illumination control method, and illumination control program
US20080275349A1 (en) 2007-05-02 2008-11-06 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US8585607B2 (en) 2007-05-02 2013-11-19 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
WO2009138976A2 (en) 2008-05-12 2009-11-19 Earlysense Ltd Monitoring, predicting and treating clinical episodes
KR101435680B1 (en) 2007-09-11 2014-09-02 삼성전자주식회사 Method for analyzing stress based on biometric signal measured multiple
WO2013150523A1 (en) 2012-04-01 2013-10-10 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US9883809B2 (en) 2008-05-01 2018-02-06 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US8882684B2 (en) 2008-05-12 2014-11-11 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US11696691B2 (en) 2008-05-01 2023-07-11 Hill-Rom Services, Inc. Monitoring, predicting, and treating clinical episodes
US10238351B2 (en) 2008-05-12 2019-03-26 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US9215987B2 (en) 2009-12-16 2015-12-22 The Johns Hopkins University Methodology for arrhythmia risk stratification by assessing QT interval instability
US8979765B2 (en) * 2010-04-19 2015-03-17 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US8788028B2 (en) 2010-07-28 2014-07-22 Medtronic, Inc. Parasympathetic stimulation to enhance tachyarrhythmia detection
US10292625B2 (en) 2010-12-07 2019-05-21 Earlysense Ltd. Monitoring a sleeping subject
EP2648616A4 (en) 2010-12-07 2014-05-07 Earlysense Ltd Monitoring, predicting and treating clinical episodes
WO2015008285A1 (en) 2013-07-18 2015-01-22 Earlysense Ltd. Monitoring a sleeping subject
US9011346B2 (en) 2011-01-27 2015-04-21 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for monitoring the circulatory system
JPWO2013062049A1 (en) 2011-10-25 2015-04-02 株式会社東芝 X-ray computed tomography system
US9636029B1 (en) * 2013-03-14 2017-05-02 Vital Connect, Inc. Adaptive selection of digital ECG filter
GB201314483D0 (en) * 2013-08-13 2013-09-25 Dolphitech As Ultrasound testing
WO2015187365A1 (en) 2014-06-02 2015-12-10 Cardiac Pacemakers, Inc. Method and apparatus for detecting atrial tachyarrhythmia using heart sounds
US10172593B2 (en) 2014-09-03 2019-01-08 Earlysense Ltd. Pregnancy state monitoring
US10575829B2 (en) 2014-09-03 2020-03-03 Earlysense Ltd. Menstrual state monitoring
AU2015321376B2 (en) * 2014-09-23 2019-10-03 Rr Sequences Inc. Contactless electric cardiogram system
JP2016146246A (en) 2015-02-06 2016-08-12 パナソニックIpマネジメント株式会社 Control device, control method and control system
KR20160110807A (en) * 2015-03-12 2016-09-22 주식회사 소소 Headset apparatus for detecting multi bio-signal
US10537263B2 (en) 2015-12-07 2020-01-21 Smart Solutions Technologies, S.L. Atrial fibrillation detection system and methods of use
WO2019053719A1 (en) 2017-09-17 2019-03-21 Earlysense Ltd. Apparatus and methods for monitoring a subject
US11011100B2 (en) 2018-09-10 2021-05-18 Lumileds Llc Dynamic pixel diagnostics for a high refresh rate LED array

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6575902B1 (en) * 1999-01-27 2003-06-10 Compumedics Limited Vigilance monitoring system
US20040061615A1 (en) * 2000-12-21 2004-04-01 Mitsuru Takashima Doze alarm for driver using enclosed air sound sensor
US6438477B1 (en) * 2002-02-27 2002-08-20 Delphi Technologies, Inc. Vehicle seat occupant characterization method including empty seat detection
US20060253238A1 (en) * 2005-05-06 2006-11-09 Murphy Morgan D Method of distinguishing between adult and cinched car seat occupants of a vehicle seat
US20080156602A1 (en) * 2006-05-31 2008-07-03 Techno-Sciences, Inc. Adaptive energy absorption system for a vehicle seat
US20100056937A1 (en) * 2006-09-28 2010-03-04 Aisin Seiki Kabushiki Kaisha Heartbeat detecting apparatus
US20150229341A1 (en) * 2013-04-06 2015-08-13 Honda Motor Co., Ltd. System and method for capturing and decontaminating photoplethysmopgraphy (ppg) signals in a vehicle
US20150317834A1 (en) * 2014-05-01 2015-11-05 Adam G. Poulos Determining coordinate frames in a dynamic environment

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10939829B2 (en) 2004-02-05 2021-03-09 Earlysense Ltd. Monitoring a condition of a subject
US12082913B2 (en) 2004-02-05 2024-09-10 Hill-Rom Services, Inc. Monitoring a condition of a subject
US11696691B2 (en) 2008-05-01 2023-07-11 Hill-Rom Services, Inc. Monitoring, predicting, and treating clinical episodes
US10786211B2 (en) 2008-05-12 2020-09-29 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US11147476B2 (en) 2010-12-07 2021-10-19 Hill-Rom Services, Inc. Monitoring a sleeping subject
US11812936B2 (en) 2014-09-03 2023-11-14 Hill-Rom Services, Inc. Apparatus and methods for monitoring a sleeping subject
WO2019053719A1 (en) 2017-09-17 2019-03-21 Earlysense Ltd. Apparatus and methods for monitoring a subject
US10661682B2 (en) 2018-07-17 2020-05-26 Honda Motor Co., Ltd. Vehicle seat haptic system and method
US11813075B2 (en) * 2020-01-24 2023-11-14 Hb Innovations, Inc. Combinational output sleep system
US20210307683A1 (en) * 2020-04-01 2021-10-07 UDP Labs, Inc. Systems and Methods for Remote Patient Screening and Triage
US11931168B2 (en) 2020-04-01 2024-03-19 Sleep Number Corporation Speech-controlled health monitoring systems and methods
WO2024095033A1 (en) 2022-10-31 2024-05-10 Bosch Car Multimedia Portugal, S.A. Occupant action recognition

Also Published As

Publication number Publication date
US11547336B2 (en) 2023-01-10
WO2017138005A3 (en) 2017-11-02
US20200275876A1 (en) 2020-09-03
US20230157602A1 (en) 2023-05-25
WO2017138005A2 (en) 2017-08-17
US20210100489A1 (en) 2021-04-08

Similar Documents

Publication Publication Date Title
US11547336B2 (en) Apparatus and methods for monitoring a subject
US20240074741A1 (en) Activation of home appliance in response to homeowner sleep state
US11510613B2 (en) Biological condition determining apparatus and biological condition determining method
Roebuck et al. A review of signals used in sleep analysis
EP2265173B1 (en) Method and system for sleep/wake condition estimation
CA2516093C (en) Automated insomnia treatment system
US10278638B2 (en) System and method to monitor and assist individual's sleep
KR101516016B1 (en) sleep control and/or monitoring apparatus based on portable eye-and-ear mask and method for the same
US9820680B2 (en) System and method for determining sleep and sleep stages of a person
US8512221B2 (en) Automated treatment system for sleep
CN102065753B (en) Non-invasive method and apparatus for determining light- sleep and deep-sleep stages
US20100100004A1 (en) Skin Temperature Measurement in Monitoring and Control of Sleep and Alertness
US20140275829A1 (en) Sleep stage annotation device
US20080275349A1 (en) Monitoring, predicting and treating clinical episodes
WO2016035073A1 (en) Monitoring a sleeping subject
KR20180075832A (en) Method and Apparatus for Monitoring Sleep State
JP2022520934A (en) Sleep monitoring system and method
CN109758281B (en) Safety system based on body position adjustment
CN116897008A (en) Bed with features for sensing sleeper pressure and generating an estimate of brain activity
Maritsa et al. Audio-based wearable multi-context recognition system for apnea detection
Cuppens Detection of epileptic seizures based on video and accelerometer recordings
Tahoun Nonintrusive nocturnal remote monitoring of vital signs in ambient assisted living environments
Adil et al. Importance of Sleep for Medical and General Wellness
Zuccala Methods for acquisition and integration of personal wellness parameters
Van Deun et al. Ambient intelligence in the bedroom

Legal Events

Date Code Title Description
AS Assignment

Owner name: EARLYSENSE LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHINAR, ZVIKA;TSOREF, LIAT;MEGER, GUY;SIGNING DATES FROM 20170313 TO 20170321;REEL/FRAME:041751/0445

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: KREOS CAPITAL V (EXPERT FUND) L.P., JERSEY

Free format text: SECURITY INTEREST;ASSIGNOR:EARLYSENSE LTD.;REEL/FRAME:044345/0219

Effective date: 20171129

AS Assignment

Owner name: EARLYSENSE LTD., ISRAEL

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:KREOS CAPITAL V (EXPERT FUND) L.P.;REEL/FRAME:049425/0725

Effective date: 20190502

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: KREOS CAPITAL VI (EXPERT FUND) L.P., JERSEY

Free format text: SECURITY INTEREST;ASSIGNOR:EARLYSENSE LTD.;REEL/FRAME:049827/0878

Effective date: 20190711

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

AS Assignment

Owner name: EARLYSENSE LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHINAR, ZVIKA;KARASIK, ROMAN;KATZ, YANIV;AND OTHERS;SIGNING DATES FROM 20200614 TO 20200726;REEL/FRAME:053314/0490

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION