
US20200281528A1 - Method and system for diagnosing a disease using eye optical data - Google Patents


Info

Publication number
US20200281528A1
US20200281528A1 (Application No. US16/882,616)
Authority
US
United States
Prior art keywords
user
eye
radiation
disease
optical data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/882,616
Inventor
Michael A. Brewer
Shannon Rose Hinkley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Renegade Optophysics LLC
Original Assignee
Renegade Optophysics LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Renegade Optophysics LLC filed Critical Renegade Optophysics LLC
Priority to US16/882,616 priority Critical patent/US20200281528A1/en
Assigned to Renegade OptoPhysics LLC reassignment Renegade OptoPhysics LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Brewer, Michael A, HINKLEY, SHANNON ROSE
Publication of US20200281528A1 publication Critical patent/US20200281528A1/en
Abandoned legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/1015Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for wavefront analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0075Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/1455Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/14551Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
    • A61B5/14555Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases specially adapted for the eye fundus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4842Monitoring progression or stage of a disease
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/02Goggles
    • A61F9/029Additional functions or features, e.g. protection for other parts of the face such as ears, nose or mouth; Screen wipers or cleaning devices
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/1225Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes using coherent radiation
    • A61B3/1233Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes using coherent radiation for measuring blood flow, e.g. at the retina
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02416Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Definitions

  • Non-invasive diagnostic techniques primarily rely on electromagnetic radiation. Based on how the radiation interacts with bodily tissues or analytes, an indication of the presence or absence of a disease state (e.g., cancer, liver disease) can be determined.
  • various non-invasive diagnostic techniques are known in the prior art; however, many techniques provide very limited information about the overall health of the patient.
  • the current non-invasive diagnostic techniques often utilize clunky benchtop devices that are primarily focused on the detection of a single blood analyte, the monitoring of volumetric changes of tissue structures (e.g., plethysmography), or the oxygenation levels of the blood (e.g., pulse oximetry), which are usually directed to the diagnosis or monitoring of a specific disease state.
  • the current techniques do not provide information about the presence or absence of non-tested diseases, whether the patient experienced a disease triggering event, or the severity of a disease (i.e., disease stage).
  • a diagnostic eye goggle system capable of collecting and analyzing multiple types of optical data from a user's eye and cross-correlating that data with historical data to identify one or more disease states of the user.
  • a diagnostic eye goggle system capable of tracking the biological and physical changes in the eye of a user with or without a disease, and using the tracked changes to identify one or more disease states of a future user.
  • the present invention relates to a diagnostic eye goggle system, and more particularly, to a diagnostic eye goggle system utilizing optical measurements of a user's eye and a master database having historical user data to identify a disease state of the user or provide lens-correcting suggestions.
  • the general purpose of the diagnostic eye goggle system is to provide a system with many novel features that, alone or in combination, are not anticipated, rendered obvious, suggested, or even implied by the prior art.
  • a method for diagnosing a disease, a disease state, or a disease stage of a user based on optical data is provided herein.
  • Goggles are provided having a radiation source, a radiation sensor, and a microcontroller.
  • the goggles are assembled about a user's head such that the radiation source and radiation sensor are situated in front of a user's eye. Radiation is emitted into the user's eye by the radiation source.
  • a user's optical data is detected with the radiation sensor.
  • the optical data includes at least two of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye.
  • a statistical match between the user's optical data and optical data from one or more historical users is determined, where the statistical match is determined with a diagnostic software module executed by a processor.
  • a disease, disease state, or disease stage of the user is diagnosed based on a diagnosed disease, disease state, or disease stage of the one or more historical users.
  • a diagnostic eye goggle system for diagnosing a disease, disease state, or disease stage of a user is also provided herein.
  • the system includes goggles, a master database, and a diagnostic software module.
  • the goggles are configured to detect a user's optical data.
  • the optical data includes at least two of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye.
  • the master database stores optical data from a plurality of historical users.
  • the diagnostic software module is stored on non-transient memory and executed by a processor.
  • the diagnostic software module when executed by the processor determines a statistical match between the user's optical data and optical data from one or more historical users.
  • a diagnosis of a disease, disease state, or disease stage of the user is determined based on a diagnosed disease, disease state, or disease stage of the one or more historical users.
  • FIG. 1 depicts a diagnostic eye goggle system having a user wearing goggles that interface with an external master database.
  • FIG. 2 is a perspective view of the goggles.
  • FIG. 3 depicts the components of the goggles and how the goggles interact with a user's eye.
  • FIGS. 4A-4D depict different types of optical data acquired by the goggles, where FIG. 4A depicts the detection of a wavefront, FIG. 4B depicts the detection of analytes with spectral analysis, FIG. 4C depicts the detection of frequency-shifted radiation, and FIG. 4D depicts the detection of reflected radiation patterns on specific regions of the eye.
  • FIG. 5 depicts a front panel of the goggles having a camera and eye directing lights.
  • FIG. 6 depicts a method of using the diagnostic eye goggle system.
  • the present invention has utility as a diagnostic eye goggle system to acquire optical data from a user's eye and cross-correlate the optical data with historical optical data to identify at least one of a disease state or a disease stage of the user.
  • the diagnostic eye goggle system has additional utility in providing lens-correcting instructions or suggestions to the user or health care provider.
  • the invention can be adapted to diagnose several diseases, disease states, and disease stages illustratively including: cancer; organ disease (e.g., liver, heart, brain, skin); nerve and vessel disease; bacterial, parasite and viral infections; and eye diseases (e.g., glaucoma, macular degeneration).
  • where a range of values is provided, the range is intended to encompass not only the end point values of the range but also intermediate values of the range, which are explicitly included within the range and vary by the last significant figure of that range.
  • a recited range of 1 to 4 is intended to include 1-2, 1-3, 2-4, 3-4, and 1-4.
  • with reference to FIGS. 1 through 6, examples of the instant diagnostic eye goggle system, employing the principles and concepts of the present diagnostic eye goggle system and generally designated by the reference number 10, will be described.
  • the diagnostic eye goggle system 10 generally includes goggles 12 and an external master database 14 .
  • the external master database 14 includes data from historical users of the goggles 12 , referred to herein as historical user data.
  • the external master database 14 is stored on one or more servers and accessible by an Internet connection; however, it should be appreciated that the external master database 14 may be stored on a private server or intranet and may be accessible by other wired or wireless connections.
  • the goggles 12 generally include a front panel 16 and a head securement feature 18 .
  • the front panel 16 is situated in front of the user's eyes when the goggles are worn about the user's head (U).
  • the front panel 16 includes a light shield 17 around the front panel 16 that conforms about the user's eyes to eliminate exposure of natural light to the user's eyes.
  • the light shield 17 may project around an outer edge of the front panel 16 to make contact with the user's face.
  • the light shield 17 may further be made of a flexible, light-absorbent material.
  • the head securement feature 18 is configured to secure the goggles 12 to the user's head (U).
  • the securement feature 18 may include an elastic strap, an adjustable strap, temples that fit on the user's ears, a nose clip that assembles to the user's nose, and equivalents thereof.
  • the front panel 16 of the goggles 12 is shown in the context of emitting and receiving radiation (denoted as the dashed arrows 17 and 19 , respectively) into and out of a user's eye (E).
  • the front panel 16 includes a plurality of components to acquire optical data from the user's eye (E).
  • the front panel 16 may include one or more electromagnetic radiation sources 19 disposed to emit radiation into one or more eyes (E) of the user.
  • the radiation source(s) 19 may include one or more light emitting diodes (LEDs), solid-state lasers, incandescent light, fluorescent light, or a combination thereof.
  • the radiation source(s) are configured to emit radiation having minimal harmful effects on the structures of the eye (E).
  • the emitted radiation wavelength may range from 380 nanometers to 2500 nanometers, corresponding to the visible and infrared spectrum of radiation. In some embodiments, the emitted radiation has a shorter wavelength, below 380 nanometers but greater than 50 nanometers.
  • the front panel 16 further includes one or more radiation sensors 20 to detect at least one of refraction, reflection, interference, intensity, frequency-shift, wavefront, or a spectrum of reflected radiation reflected from one or more structures in the user's eye (E).
  • the radiation sensors 20 may include a charge-coupled device (CCD) sensor, a Hartmann-Shack wavefront sensor, or an array of photodiodes.
  • the front panel 16 may further include one or more optical elements 22 disposed between the radiation source and the radiation sensor for manipulating at least one of the emitted radiation and the reflected radiation.
  • the one or more optical elements may include at least one of a slit, a pinhole, a collimator, a mirror, a beam-splitter, a lens, an x-y scanner, an x-y-z scanner, a prism, a reference arm, or a combination thereof.
  • the radiation emitted from the radiation source(s) 19 is directly detected by the radiation sensor(s) 20 without the use of the optical elements 22 .
  • a simple slit or pinhole disposed in front of the radiation source 19 may be regarded as an optical element 22 .
  • the front panel 16 further includes a microcontroller 24 disposed in communication with at least one of the radiation source 19 , the radiation sensor 20 , and optical elements 22 .
  • the microcontroller 24 generally coordinates the emission of radiation into the user's eye(s) (E) and analyzes the data received from the radiation sensor(s) 20 .
  • the microcontroller 24 further includes a processor and memory.
  • a transceiver 26 is further disposed in communication with the microcontroller 24 .
  • the transceiver 26 provides a datalink between the microcontroller 24 and the external master database 14 .
  • the interface may be accomplished with a wired or wireless connection including Ethernet cables, BUS cables, a power line, Bluetooth, Wi-Fi, radiofrequency, and equivalents thereof.
  • the datalink may be accomplished through a wired or wireless network, illustratively including, a local area network, or the Internet.
  • the term “in communication” refers to a wired or wireless connection between two or more stated elements (e.g., microcontroller 24 and transceiver 26 ) and does not necessarily require a direct one-to-one connection where other elements (e.g., circuitry, a network) may facilitate or be part of the connection between the two or more stated elements.
  • the diagnostic eye goggle system 10 further includes a diagnostic software module that cross-correlates analyzed data from the microcontroller with the historical data in the master database 14 to identify a disease state of the user.
  • the diagnostic software module is stored in memory associated with the microcontroller 24 and executed by a processor of the microcontroller 24 .
  • the diagnostic software module is stored in memory associated with the master database 14 and executed by a processor associated with the master database 14 .
  • the diagnostic software module may use several algorithms for identifying a statistical match, illustratively including: a) running the analyzed data through a decision tree to classify the analyzed data into a cohort and subsequently comparing the analyzed data to historical data within said cohort; b) comparing one or more finite outputs from the analyzed data (e.g., Zernike polynomials) with one or more outputs associated with the historical user data; c) Naïve Bayes classifiers to recognize specific patterns in the analyzed data and match the specific patterns with patterns associated with the historical user data; d) regression analysis to correlate how the analyzed set of data statistically compares to historical users' data; and e) clustering algorithms to cluster the analyzed set of data with historical user data to aid in finding a statistical match.
  • the diagnostic software module iteratively compares the analyzed data from the microcontroller 24 with historical analyzed data from each historical user of the diagnostic eye goggle system 10. For example, if the master database 14 includes historical analyzed data from 5,000 users, then the diagnostic software module compares the present user's analyzed data with each of the 5,000 previous users' analyzed data to identify a match.
  • the optical data from the 5,000 historical users are classified into one or more groups, which may or may not correspond to a particular disease, disease state, or disease stage. The present user's analyzed data is then first grouped or classified into one or more groups and subsequently compared with each of the historical users' data in that group. Specific types of optical data to be acquired, analyzed, and matched are further described below.
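  • As an illustration of the two-step matching just described (cohort classification followed by per-user comparison), the following minimal sketch matches a numeric feature vector derived from the analyzed optical data against historical records using nearest-centroid cohort assignment and a Euclidean distance threshold. The record schema, feature names, and threshold are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def match_user(user_features, historical_records, max_distance=1.0):
    """Return historical records that statistically match the user.

    user_features      : 1-D array of analyzed optical features (e.g. Zernike
                         terms, spectral scores, intensity ratios), assumed
                         already normalized.
    historical_records : list of dicts with keys "features", "cohort",
                         "diagnosis" (hypothetical schema).
    """
    user_features = np.asarray(user_features, dtype=float)

    # Step 1: coarse cohort classification via the nearest cohort centroid.
    cohorts = {}
    for rec in historical_records:
        cohorts.setdefault(rec["cohort"], []).append(rec)
    centroids = {c: np.mean([r["features"] for r in recs], axis=0)
                 for c, recs in cohorts.items()}
    user_cohort = min(centroids,
                      key=lambda c: np.linalg.norm(user_features - centroids[c]))

    # Step 2: fine comparison against every historical user in that cohort.
    matches = []
    for rec in cohorts[user_cohort]:
        d = np.linalg.norm(user_features - np.asarray(rec["features"], dtype=float))
        if d <= max_distance:
            matches.append((d, rec["diagnosis"]))
    return sorted(matches)  # closest matches first
```

  • Any of the alternatives listed above (Naïve Bayes classification, regression, clustering) could replace the distance test in step 2 without changing the overall flow.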
  • the diagnostic eye goggle system 10 further includes read-write memory 27 for performing offline tasks when the eye goggles 12 are disconnected from the master database 14 .
  • the read-write memory 27 is housed in the front panel 16 and disposed in communication with the microcontroller 24 .
  • the read-write memory 27 is external to the goggles 12 but in communication with the microcontroller 24 and in the same locational vicinity as the goggles 12, such as an external hard drive, universal serial bus (USB) drive, and equivalents thereof. In other embodiments, the read-write memory 27 is the same as the aforementioned memory associated with the microcontroller 24.
  • the read-write memory 27 is particularly advantageous as the memory 27 permits the goggles 12 to function without connectivity to the master database 14 .
  • the goggles 12 may be sent to a remote African village to acquire optical eye data from remote users in the local population.
  • the read-write memory 27 may then store optical eye data from a plurality of remote users in that local population.
  • once the goggles 12 re-connect to the master database 14 (e.g., through an Internet connection), the optical eye data from the plurality of remote users is transferred to and stored in the master database 14, and an identification of a disease state or disease stage for each individual may be provided.
  • the read-write memory 27 may further store historical user data to identify a disease state and/or stage without having to connect to the master database 14 .
  • the diagnostic software module may be stored in the read-write memory 27 and executed by the processor of the microcontroller 24 to identify a disease state and/or stage of the remote users.
  • the file size of the totality of the historical user data may be too large to store in the read-write memory 27 .
  • a selected portion of the historical user data is stored in read-write memory 27 .
  • the selected portion of the historical user data stored in the read-write memory 27 is selected based on a type of a disease and/or a prevalence of a disease.
  • the eye goggles 12 may be sent to an African village having an outbreak of malaria. Optical eye data from historical users having malaria is then selected as the portion of historical user data that is stored in the read-write memory 27. The eye goggles 12 are then equipped to quickly identify whether any users in the village population have malaria without having to connect with the master database 14.
  • the read-write memory 27 stores only historical data for common diseases, while data for uncommon diseases remains stored in the master database 14. Therefore, the read-write memory 27 is not overloaded with historical user data, and the computational time to cross-correlate and identify a disease is reduced. Then, once the goggles 12 re-connect with the master database 14, any uncommon diseases in the remote population may be identified.
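  • A minimal sketch of the disease-prevalence filtering just described is given below; the record format, field names, and size cap are assumptions for illustration only.

```python
def select_offline_subset(historical_records, target_diseases, max_records=10_000):
    """Pick the portion of historical data worth caching on the goggles.

    Keeps only records whose diagnosis is in the expected local disease set
    (e.g. {"malaria"}), capped at max_records to respect the limited
    capacity of the on-board read-write memory.
    """
    subset = [r for r in historical_records if r["diagnosis"] in target_diseases]
    return subset[:max_records]
```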
  • the memory associated with the microcontroller 24 stores three or more optical data acquisition modules.
  • the three or more optical data acquisition modules include software executable instructions to acquire three or more different types of optical data from the eye (E).
  • a first optical data acquisition module is configured to identify eye aberrations by detecting the refraction of reflected radiation from the eye (E).
  • the first optical data acquisition module includes instructions that, when executed by the processor, cause the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of radiation 28 onto the retina (R) of the user's eye (E), wherein a wavefront 30 of reflected radiation 32 is detected by the sensor 20 and transferred to the microcontroller 24 for eye aberration analysis.
  • the radiation sensor 20 for detecting the wavefront 30 may be a Hartmann-Shack wavefront sensor having a lenslet array and a CCD sensor.
  • the lenslet array is part of the optical elements 22 and the CCD sensor is the radiation sensor 20 .
  • the eye aberration analysis may include the determination of the Zernike Polynomials from the detected refractions of radiation over the area of the eye (E).
  • the wavefront is acquired using Tscherning aberroscopy or ray tracing.
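  • Eye-aberration analysis in terms of Zernike polynomials can be illustrated, under simplifying assumptions, by a least-squares fit of a few low-order Zernike terms to sampled wavefront data over the unit pupil (normalization constants omitted). A real Hartmann-Shack pipeline reconstructs the wavefront from lenslet spot displacements first, so this is a conceptual sketch rather than the patent's method.

```python
import numpy as np

def fit_low_order_zernike(x, y, wavefront):
    """Least-squares fit of low-order Zernike terms to sampled wavefront data.

    x, y      : 1-D pupil coordinates normalized to the unit circle.
    wavefront : measured wavefront error at each (x, y) sample.
    Returns coefficients for piston, tip, tilt, defocus, and the two
    astigmatism terms (Cartesian forms, normalization constants omitted).
    """
    x, y, wavefront = (np.asarray(a, dtype=float) for a in (x, y, wavefront))
    rho2 = x**2 + y**2
    basis = np.column_stack([
        np.ones_like(x),   # piston
        x,                 # tilt about y (x-slope)
        y,                 # tilt about x (y-slope)
        2 * rho2 - 1,      # defocus
        x**2 - y**2,       # astigmatism 0/90 degrees
        2 * x * y,         # astigmatism 45 degrees
    ])
    coeffs, *_ = np.linalg.lstsq(basis, wavefront, rcond=None)
    return coeffs
```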
  • a second optical data acquisition module is configured to identify the presence or absence of one or more analytes (A) in the blood vessels (BV) or other tissue structures of the user's eye (E).
  • the second optical data acquisition module includes instructions that, when executed by the processor, cause the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of a continuous spectrum of radiation 34 on one or more blood vessels (BV) or tissue structures in the user's eye (E).
  • a spectrum of the reflected radiation 36 is detected by the sensor 20 and transferred to the microcontroller 24 to analyze the presence, absence or concentration of an analyte (A) in the blood vessels (BV), tissues, or tissue structures in the user's eye (E).
  • the continuous spectrum of emitted radiation 34 may be white light comprised of the visible light spectrum of radiation.
  • the continuous spectrum may further include a spectrum of infrared light that may absorb, reflect, or interact with an analyte (A) in the blood vessel (BV), tissue, or tissue structure in the eye (E).
  • the reflected light 36 is detected and analyzed to determine one or more spectral line fingerprints by examining at least one of: a) the presence or absence of a particular wavelength of light that reflected from the eye (E); and/or b) the intensity of a particular wavelength of light reflected from the eye (E).
  • the optical elements 22 may include a prism to spread the reflected light 36 into their corresponding wavelengths for analysis.
  • the spectral line fingerprints provide an indication of the presence, absence, or a concentration of a particular analyte (A) in the user's blood or other tissue structures in the user's eye (E).
  • the radiation source 19 , optical elements 22 , and sensors 20 may include components to employ Raman spectroscopy for obtaining a spectral analysis of one or more analytes in the eye (E).
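  • The spectral-fingerprint comparison described above might be sketched as a normalized correlation between the detected reflectance spectrum and stored reference fingerprints; the reference spectra, analyte names, and scoring rule below are hypothetical placeholders rather than calibrated data.

```python
import numpy as np

def score_analytes(reflected_spectrum, reference_fingerprints):
    """Score how strongly each reference analyte fingerprint appears in the
    reflected spectrum using a simple Pearson-style correlation.

    reflected_spectrum     : 1-D array sampled on a fixed wavelength grid.
    reference_fingerprints : dict analyte_name -> reference spectrum sampled
                             on the same grid (hypothetical library).
    """
    s = np.asarray(reflected_spectrum, dtype=float)
    s = (s - s.mean()) / s.std()
    scores = {}
    for analyte, ref in reference_fingerprints.items():
        r = np.asarray(ref, dtype=float)
        r = (r - r.mean()) / r.std()
        scores[analyte] = float(np.dot(s, r) / len(s))  # 1.0 = perfect match
    return scores
```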
  • a third optical data acquisition module is configured to detect a frequency-shift in emitted radiation 38 compared to the reflected radiation 40 .
  • the emitted radiation 38 has a shorter wavelength than the reflected radiation 40 .
  • the third optical data acquisition module when executed by the processor causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more specific wavelengths of radiation 38 on one or more blood vessels (BV), tissues, or tissue structures in the eye (E), wherein a frequency-shifted wavelength of reflected radiation 40 is detected by the sensor and transferred to the microcontroller for at least one of analyte (A), tissue, or tissue structure analysis of the eye (E).
  • the microcontroller 24 may command the radiation source 19 and/or optical elements 22 to emit radiation 38 having a wavelength of 520 nm at a particular tissue structure or blood vessel (BV) in the eye (E), and detect a reflected wavelength 40 of 600 nm.
  • the frequency-shift in the reflected radiation 40 indicates how the light interacted with the particular analyte (A), tissue, or tissue structure to ascertain the quality of a tissue or tissue structure and identify at least one of the presence, absence, or concentration of an analyte (A) in the eye (E).
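  • As a worked illustration of the 520 nm to 600 nm example above, the helper below converts a detected wavelength shift into the corresponding frequency shift; it is a back-of-the-envelope calculation, not device firmware.

```python
C = 299_792_458  # speed of light, m/s

def frequency_shift_hz(emitted_nm, reflected_nm):
    """Frequency shift between emitted and reflected light.

    A positive result means the reflected light is shifted toward lower
    frequency (longer wavelength), as in the 520 nm -> 600 nm example.
    """
    f_emit = C / (emitted_nm * 1e-9)
    f_refl = C / (reflected_nm * 1e-9)
    return f_emit - f_refl

# frequency_shift_hz(520, 600) is roughly 7.7e13 Hz, i.e. about 77 THz.
```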
  • a fourth optical data acquisition module is configured to detect an angular degree of reflected radiation reflected from one or more specific target locations on the retina (R) or other tissue structures in the eye (E).
  • the fourth optical data acquisition module when executed by the processor causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of radiation 42 at one or more target locations on the retina (R) or other structures in the eye (E), wherein an angular degree of reflected radiation 44 is detected by the sensor 20 and transferred to the microcontroller 24 to analyze a topography of the targeted location(s).
  • the radiation may reflect in different directions due to an irregularly shaped surface.
  • An irregular topographical surface of a target location may be indicative of a particular disease, disease state, or disease stage.
  • a fifth optical data acquisition module is configured to emit one or more specific wavelengths of radiation and detect the intensity of reflected radiation.
  • the fifth optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of one or more specific wavelengths of radiation, wherein an intensity, or amount of reflected radiation, is detected by the sensor 20 and transferred to the microcontroller 24 to analyze the presence, absence, or concentration of one or more analytes (A) in a blood vessel (BV) or other tissue in the eye (E).
  • some analytes (A) may absorb radiation at a first wavelength (providing a low intensity reading), and reflect radiation at a second wavelength (providing a high intensity reading).
  • the difference between the detected intensities of reflected radiation between the two different emitted wavelengths may be indicative of a concentration of a particular analyte (A).
  • the optical elements 22 may include a prism that is adjusted in response to commands by the microcontroller 24 to emit a specific wavelength.
  • the radiation source 19 includes a plurality of LEDs that may each emit a specific wavelength when commanded to do so.
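  • The two-wavelength intensity comparison described above resembles the ratio method used in pulse oximetry. The sketch below turns the two detected intensities into a concentration estimate through a linear calibration; the Beer-Lambert-style ratio and the calibration constants are assumptions for illustration, not values from the patent.

```python
import numpy as np

def estimate_concentration(i_absorbing_wl, i_reflecting_wl, calibration=(0.0, 1.0)):
    """Estimate analyte concentration from reflected intensities at two
    wavelengths: one the analyte absorbs, one it largely reflects.

    A lower intensity at the absorbing wavelength relative to the reflecting
    wavelength implies more analyte.  The linear calibration (intercept, slope)
    is a placeholder; a real device would be calibrated empirically.
    """
    ratio = np.log(i_reflecting_wl / i_absorbing_wl)  # Beer-Lambert-style ratio
    intercept, slope = calibration
    return intercept + slope * ratio
```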
  • a sixth optical data acquisition module is configured to detect one or more volumetric changes of a blood vessel (BV) or tissue structure in the eye (E).
  • the sixth optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 or optical elements 22 to emit a plurality of pulses of radiation on and around one or more blood vessels in the user's eye (E), wherein the sensor detects a change in the reflected radiation between pulses that corresponds to a volumetric change in one or more of the blood vessels.
  • the sixth optical data acquisition module acts as a plethysmograph to monitor blood pressure, blood flow, and heart rate.
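  • A plethysmography-style pulse estimate from the sixth module's readings might look like the sketch below, which picks the dominant spectral peak of the reflected-intensity time series within a physiologic band; the sample rate, band limits, and signal format are assumptions.

```python
import numpy as np

def heart_rate_bpm(reflected_intensity, sample_rate_hz):
    """Estimate heart rate from a time series of reflected-radiation intensity
    over a retinal blood vessel (photoplethysmography-style analysis).

    Picks the dominant spectral peak in the band 0.7-3.5 Hz, i.e. roughly
    42-210 beats per minute; assumes at least a few seconds of data.
    """
    x = np.asarray(reflected_intensity, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate_hz)
    band = (freqs >= 0.7) & (freqs <= 3.5)
    dominant = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * dominant
```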
  • a seventh optical data acquisition module is configured to obtain images of surfaces and sub-surfaces of tissue structures in the eye (E).
  • the seventh optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 or optical elements 22 to emit a plurality of pulses of infrared radiation on one or more targeted tissue structures, wherein the sensor detects a reflectivity profile of the targeted tissue containing information about the spatial dimensions and location of tissue structures.
  • the seventh optical data acquisition module is generally referred to as optical coherence tomography.
  • the aforementioned tissues and tissue structures in the eye illustratively include specific regions of the retina (R), the corneal tear film, the macula, the fovea, the vitreous body, the aqueous humor (fluid), the optic nerve, the lens, the pupil, the cornea, and ganglion cells.
  • analytes (A) to be detected in the blood vessels (BV) or tissues illustratively include, but are not limited to: compounds such as glucose and bilirubin; enzymes such as amylase, lipase, aspartate transaminase, and alanine transaminase; metals such as mercury; cells such as white blood cells; and other proteins or metabolites such as growth factors and signaling proteins.
  • the front panel 16 may further include components for directing the emitted radiation.
  • the microcontroller 24 is disposed in communication with one or more optical elements 22 to actively manipulate at least one of the emitted radiation or the reflected radiation.
  • the optical elements 22 may include one or more actuating components, illustratively including, servo-motors, step-motors, pivots, ball screws, nuts, linear rails, and equivalents thereof to actively adjust one or more of the optical elements 22 based on commands from the microcontroller 24 (e.g., an x-y scanner for directing the radiation at a plurality of pre-programmed locations).
  • the three or more optical data acquisition modules when executed by the processor cause the processor to: actively direct the emitted radiation to a plurality of specific locations on the retina by actively adjusting one or more of the optical elements 22 (e.g., a mirror, a pinhole) with the actuating components.
  • the front panel 16 may further include a camera 46 disposed in communication with the microcontroller 24 .
  • the camera 46 includes an eye tracking software module for locating and tracking the pupil of the eye (E). Therefore, the emitted radiation may be actively and accurately directed to specific locations in the eye (E) based, in part, on a current position of the user's pupil.
  • the radiation may be directed to specific regions on the eye using a plurality of eye directing lights 48 .
  • the front panel 16 may include a plurality of eye directing lights 48 situated about a radiation emission aperture 50 in the front panel 16 .
  • the eye directing lights 48 are shown in a radial pattern about the radiation emission aperture 50 .
  • the eye directing lights 48 are configured to direct the user's line-of-sight in a particular direction to collect optical data on a specific region in the eye (E).
  • the three or more optical data acquisition modules may include additional instructions that, when executed by the processor, cause the processor to: illuminate a sequence of the eye directing lights to sequentially direct the user's eye (E) to an illuminated light; command the radiation source to emit one or more pulses of radiation into the user's eye (E) for each position the eye (E) is directed to an illuminated light; and collect the reflected radiation reflected from the retina when the eye (E) is directed to each illuminated light.
  • an eye directing light 48 located below the radiation emission aperture 50 is illuminated directing the user's line-of-sight down.
  • the eye directing lights 48 may be used in lieu of optical elements 22 that actively direct emitted radiation at specific locations in the eye (E), or the eye directing lights 48 may be used in conjunction with optical elements 22 that actively direct emitted radiation.
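  • The light-sequencing behavior described above can be sketched as a simple acquisition loop. The hardware calls (illuminate_light, pulse_source, read_sensor) are hypothetical placeholders for the goggles' firmware interface and are supplied by the caller; nothing here is an actual device API.

```python
import time

def sweep_eye_positions(light_ids, illuminate_light, pulse_source, read_sensor,
                        settle_s=0.5):
    """Illuminate each eye-directing light in turn, pulse the radiation source
    once the gaze has settled, and collect the reflected-radiation reading.

    light_ids        : identifiers of the eye-directing lights, in sweep order.
    illuminate_light : callable(light_id) that turns one directing light on.
    pulse_source     : callable() that emits a radiation pulse into the eye.
    read_sensor      : callable() returning the reflected-radiation reading.
    """
    readings = {}
    for light_id in light_ids:
        illuminate_light(light_id)          # direct the user's line of sight
        time.sleep(settle_s)                # allow the eye to fixate
        pulse_source()                      # emit radiation into the eye
        readings[light_id] = read_sensor()  # reflected radiation for this gaze
    return readings
```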
  • the microcontroller 24 generates a mathematical map of the eye (E) having map data corresponding to the analyzed data collected from the optical data acquisition modules.
  • the map data may include one or more analyzed wavefronts, one or more analyzed spectra of reflected radiation, one or more analyzed frequency-shifts of reflected radiation, one or more analyzed angular degrees of reflection, one or more analyzed intensities of reflected radiation from one or more emitted wavelengths of radiation, and one or more analyzed volumetric changes of a blood vessel (BV) or tissue structure.
  • the diagnostic software module compares the mathematical map of the eye (E) with historical users' mathematical maps to identify a disease state of the user using one or more of the aforementioned matching algorithms. For example, early detection of pancreatic cancer is determined by the combination of: i. blood composition as determined by the second optical acquisition module and the fifth optical acquisition module; ii. a given light wave reflection pattern as determined by the fourth optical acquisition module; and iii. a given wavefront aberration map as determined by the first acquisition module.
  • the microcontroller 24 generates a mapping identifier based on all of the map data.
  • a mapping identifier may be generated by combining, relating, and/or transforming i, ii, and iii above into a single value, range of values, or mathematical function.
  • the diagnostic software module then cross-correlates the mapping identifier with historical users' mapping identifiers located in the master database 14 to identify a particular disease, disease state, or disease stage.
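  • One possible representation of the mathematical map and its reduction to a mapping identifier is sketched below; the patent does not prescribe a schema, so the field names and the concatenation-based reduction are assumptions chosen only to make the idea concrete.

```python
import numpy as np

def build_eye_map(wavefront_coeffs, spectral_scores, reflection_angles,
                  intensity_ratios, volumetric_changes):
    """Collect the analyzed outputs of the acquisition modules into one map.
    Field names are illustrative, not taken from the patent."""
    return {
        "wavefront": np.asarray(wavefront_coeffs, dtype=float),
        "spectra": np.asarray(spectral_scores, dtype=float),
        "reflection": np.asarray(reflection_angles, dtype=float),
        "intensity": np.asarray(intensity_ratios, dtype=float),
        "volume": np.asarray(volumetric_changes, dtype=float),
    }

def mapping_identifier(eye_map):
    """Reduce the map to a single fixed-length vector that can be
    cross-correlated against historical identifiers (one possible reduction;
    a hash, range of values, or fitted function would work just as well)."""
    return np.concatenate([v.ravel() for v in eye_map.values()])
```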
  • the diagnostic software module may cross-correlate a remote user's analyzed optical data, mathematical maps, and/or mapping identifiers with historical users' analyzed optical data, mathematical maps, and/or mapping identifiers stored locally in the read-write memory 27 to identify a particular disease, disease state, or disease stage of the remote user in a remote location (e.g., an African village) if no connectivity to the master database 14 is possible, as described above.
  • the analyzed data, mathematical map, and/or mapping identifier of the user are transferred to and stored in the master database 14 to become a component of the user's longitudinal health record and made available for diagnosing a disease state for future users of the diagnostic eye goggle system 10.
  • the user repeats the data acquisition modules with the diagnostic eye goggle system 10 to track how the acquired optical data changes as a function of disease onset, disease progression, or disease regression. The tracked changes in the optical data provide valuable markers for diagnosing a disease, disease state, or disease stage of a future user of the diagnostic eye goggle system 10.
  • the tracked changes in the optical data further provide the potential to identify disease-triggering events, to aid in the diagnosis of future diseases, or to gauge a user's proneness to a particular disease.
  • the master database 14 further receives and stores medical history data of the user linked to the user's analyzed optical data, mathematical map, and/or mapping identifier.
  • the medical history data may include, but is not limited to, a current disease state, a current disease stage, a past disease, height, weight, gender, race, smoking status, alcohol use, family medical history, blood work, and a gene map or DNA sequence of the user.
  • the medical history data of the user and past users is stored in the master database, where the diagnostic software module cross-correlates a present user's analyzed data, mathematical map, and/or mapping identifier with a historical user's analyzed data, mathematical map, and/or mapping identifier to identify a statistical match therebetween.
  • a diagnosis of one or more disease states or disease stages of the present user may be made based on the medical history of a matched historical user. For example, past user A has a medical history of Alzheimer's disease. Past user A has a specific mathematical map Y generated by the diagnostic eye goggle system 10. A new user B then utilizes the eye goggle system 10, which generates a mathematical map Z. The diagnostic software module identifies that mathematical map Y and mathematical map Z are a statistical match. The new user B may then be diagnosed with Alzheimer's disease. It should be appreciated that a user may be matched with several past users having no diseases, and thus an identification of no disease for the present user is possible. In another inventive embodiment, the analyzed optical data, mathematical maps, or mapping identifiers may be combined with genetic and other population health data in the master database, where disease analysis and triggering markers for disease initiation can be studied.
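  • The "map Y matches map Z" example above can be sketched as a tolerance-based lookup over stored identifier/diagnosis pairs; the normalized distance metric and threshold are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

def diagnose_by_match(new_identifier, historical, tolerance=0.1):
    """Return diagnoses of historical users whose mapping identifiers are a
    statistical match for the new user's identifier.

    historical : list of (identifier_vector, diagnosis) pairs.
    A normalized Euclidean distance below `tolerance` counts as a match;
    an empty result means no disease indication from this comparison.
    """
    new_identifier = np.asarray(new_identifier, dtype=float)
    matches = []
    for ident, diagnosis in historical:
        ident = np.asarray(ident, dtype=float)
        d = np.linalg.norm(new_identifier - ident) / (np.linalg.norm(ident) + 1e-12)
        if d < tolerance:
            matches.append(diagnosis)
    return matches
```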
  • the diagnostic eye goggle system 10 further provides the user or health care provider with lens-correcting instructions or suggestions based on the analyzed wavefront data. Therefore a user receives a disease diagnosis, as well as a diagnosis of the user's visual acuity, which may be used to improve the user's visual acuity.
  • referring to FIG. 6, a particular inventive embodiment of a method for diagnosing a disease state or disease stage of a user with the eye goggle system of claim 1 is depicted.
  • the method includes assembling the goggles about the user's head wherein the front panel is situated in front of the user's eyes (E) [Block 100]. A first set, a second set, and a third set of optical eye data are then acquired from the user's eye [Blocks 102, 104, 106], interpreted by a processor [Block 108], transmitted to and stored in the master database [Block 110], and cross-correlated with historical user data to identify at least one of a disease state or disease stage of the user [Block 112].
  • a disease state is provided to the user [Block 114].
  • the method may further include locating one or more eye features prior to emitting at least one of the first set of radiation, the second set of radiation, or the third set of radiation [Block 116 ].
  • Medical history data of the user may also be transmitted and stored into the master database to provide historical medical data for identifying at least one of a disease state or disease stage of a future user [Block 118 ].
  • the method may include repeating the steps above at several time points for a user having a particular disease to track and store the changes of the mathematical map as a function of disease progression or regression. The tracked changes provide valuable markers to identify disease states or disease stages of a future user of the diagnostic eye goggle system 10.
  • also provided is a method for identifying a disease state or stage of a remote user.
  • the goggles 12 are sent to a remote location having no connectivity to the master database 14 .
  • a first set, second set, and third set of optical eye data are acquired from a plurality of remote users at the remote location [Blocks 102 , 104 , 106 ].
  • the first set, second set, and third set of optical data are interpreted by a processor [Block 108] and stored locally in the read-write memory 27 associated with the goggles 12 [Block 120].
  • the interpreted data is then transmitted to and stored in the master database 14 upon establishing an Internet connection between the goggles 12 and the master database 14 [Block 110].
  • the transmitted data is then cross-correlated with historical user data stored in the master database 14 to identify at least one of a disease state and/or stage of one or more of the plurality of remote users [Block 112 ].
  • alternatively, the interpreted data is cross-correlated with historical user data stored in the local read-write memory 27 associated with the goggles 12 to identify at least one of a disease state and/or disease stage of one or more of the plurality of remote users without having to connect with the master database 14.
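  • The offline-acquisition and later-synchronization flow of Blocks 108, 110, and 120 might be sketched as below, with a local append-only cache standing in for the read-write memory 27 and a caller-supplied upload callable standing in for the transceiver link to the master database 14; both are assumptions rather than the actual firmware or network interface.

```python
import json
import os

def store_locally(record, cache_path="eye_cache.jsonl"):
    """Append an interpreted optical-data record to the goggles' local
    read-write memory while no master-database connection is available."""
    with open(cache_path, "a") as f:
        f.write(json.dumps(record) + "\n")

def sync_to_master(upload, cache_path="eye_cache.jsonl"):
    """Once connectivity returns, push every cached record to the master
    database via the caller-supplied `upload` callable, then clear the cache.
    Returns the number of records synchronized."""
    if not os.path.exists(cache_path):
        return 0
    with open(cache_path) as f:
        records = [json.loads(line) for line in f if line.strip()]
    for rec in records:
        upload(rec)          # transport is whatever the real system uses
    os.remove(cache_path)    # cache is no longer needed after upload
    return len(records)
```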

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Ophthalmology & Optometry (AREA)
  • Pathology (AREA)
  • Cardiology (AREA)
  • Otolaryngology (AREA)
  • Vascular Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Physiology (AREA)
  • Optics & Photonics (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A method for diagnosing a disease, disease state, or disease stage of a user is described herein. Goggles are provided having a radiation source, a radiation sensor, and a microcontroller. The goggles are assembled about a user's head. Optical data is collected from the user's eye, where the optical data includes at least two of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye. A statistical match between the user's optical data and a historical user's optical data is determined. A diagnosis of a disease, disease state, or disease stage of the user is determined based on a diagnosed disease, disease state, or disease stage of the historical user.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority benefit to U.S. patent application Ser. No. 15/832,233 filed on 5 Dec. 2017, the contents of which are hereby incorporated by reference in their entirety.
  • FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • Not Applicable
  • INCORPORATION BY REFERENCE OF MATERIAL SUBMITTED ON A COMPACT DISK
  • Not Applicable
  • TO ALL WHOM IT MAY CONCERN
  • Be it known that, Michael A. Brewer and Shannon Rose Hinkley, have invented new and useful improvements in a system and method for diagnosing a disease with eye optical data as described in this specification.
  • BACKGROUND OF THE INVENTION
  • Human body diseases are triggered by a multitude of potential triggering events, including environmental pressures, physiological changes, or genetically induced causes, to name a few. The detection of a disease, or disease onset, is paramount to the health of the population and has been an evolving field in modern medicine. One of the more effective methods for detecting a disease is through blood tests. However, recent advances in optics and signal processing have given rise to several non-invasive diagnostic techniques to detect diseases. The non-invasive diagnostic techniques primarily rely on electromagnetic radiation. Based on how the radiation interacts with bodily tissues or analytes, an indication of the presence or absence of a disease state (e.g., cancer, liver disease) can be determined. Various non-invasive diagnostic techniques are known in the prior art; however, many techniques provide very limited information about the overall health of the patient. The current non-invasive diagnostic techniques often utilize clunky benchtop devices that are primarily focused on the detection of a single blood analyte, the monitoring of volumetric changes of tissue structures (e.g., plethysmography), or the oxygenation levels of the blood (e.g., pulse oximetry), which are usually directed to the diagnosis or monitoring of a specific disease state. In addition, the current techniques do not provide information about the presence or absence of non-tested diseases, whether the patient experienced a disease-triggering event, or the severity of a disease (i.e., disease stage).
  • Thus, there is a need in the art for a diagnostic eye goggle system capable of collecting and analyzing multiple types of optical data from a user's eye and cross-correlating that data with historical data to identify one or more disease states of the user. There is a further need for a diagnostic eye goggle system capable of tracking the biological and physical changes in the eye of a user with or without a disease, and using the tracked changes to identify one or more disease states of a future user.
  • FIELD OF THE INVENTION
  • The present invention relates to a diagnostic eye goggle system, and more particularly, to a diagnostic eye goggle system utilizing optical measurements of a user's eye and a master database having historical user data to identify a disease state of the user or provide lens-correcting suggestions.
  • SUMMARY OF THE INVENTION
  • The general purpose of the diagnostic eye goggle system, described subsequently in greater detail, is to provide a system with many novel features that, alone or in combination, are not anticipated, rendered obvious, suggested, or even implied by the prior art.
  • A method for diagnosing a disease, a disease state, or a disease stage of a user based on optical data is provided herein. Goggles are provided having a radiation source, a radiation sensor, and a microcontroller. The goggles are assembled about a user's head such that the radiation source and radiation sensor are situated in front of a user's eye. Radiation is emitted into the user's eye by the radiation source. A user's optical data is detected with the radiation sensor. The optical data includes at least two of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye. A statistical match between the user's optical data and optical data from one or more historical users is determined, where the statistical match is determined with a diagnostic software module executed by a processor. A disease, disease state, or disease stage of the user is diagnosed based on a diagnosed disease, disease state, or disease stage of the one or more historical users.
  • A diagnostic eye goggle system for diagnosing a disease, disease state, or disease stage of a user is also provided herein. The system includes goggles, a master database, and a diagnostic software module. The goggles are configured to detect a user's optical data. The optical data includes at least two of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye. The master database stores optical data from a plurality of historical users. The diagnostic software module is stored on non-transient memory and executed by a processor. The diagnostic software module, when executed by the processor, determines a statistical match between the user's optical data and optical data from one or more historical users. A diagnosis of a disease, disease state, or disease stage of the user is determined based on a diagnosed disease, disease state, or disease stage of the one or more historical users.
  • Thus has been broadly outlined the more important features of the present disease detecting eye goggle system so that the detailed description thereof that follows may be better understood and in order that the present contribution to the art may be better appreciated.
  • Objects of the present disease detecting eye goggle system, along with various novel features that characterize the invention are particularly pointed out in the claims forming a part of this disclosure. For better understanding of the disease detecting eye goggle system, its operating advantages and specific objects attained by its uses, refer to the accompanying drawings and description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 depicts a diagnostic eye goggle system having a user wearing goggles that interface with an external master database.
  • FIG. 2 is a perspective view of the goggles.
  • FIG. 3 depicts the components of the goggles and how the goggles interact with a user's eye.
  • FIGS. 4A-4D depict different types of optical data acquired by the goggles, where FIG. 4A depicts the detection of a wavefront, FIG. 4B depicts the detection of analytes with spectral analysis, FIG. 4C depicts the detection of frequency-shifted radiation, and FIG. 4D depicts the detection of reflected radiation patterns on specific regions of the eye.
  • FIG. 5 depicts a front panel of the goggles having a camera and eye directing lights.
  • FIG. 6 depicts a method of using the diagnostic eye goggle system.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • The present invention has utility as a diagnostic eye goggle system to acquire optical data from a user's eye and cross-correlate the optical data with historical optical data to identify at least one of a disease state or a disease stage of the user. The diagnostic eye goggle system has additional utility in providing lens-correcting instructions or suggestions to the user or health care provider. The following description of various embodiments of the invention is not intended to limit the invention to those specific embodiments, but rather to enable any person skilled in the art to make and use this invention through exemplary aspects thereof. It will be clear and apparent to one skilled in the art that the invention can be adapted to diagnose several diseases, disease states, and disease stages illustratively including: cancer; organ disease (e.g., liver, heart, brain, skin); nerve and vessel disease; bacterial, parasite and viral infections; and eye diseases (e.g., glaucoma, macular degeneration).
  • It is to be understood that, in instances where a range of values is provided, the range is intended to encompass not only the end point values of the range but also intermediate values of the range, which are explicitly included within the range and vary by the last significant figure of that range. By way of example, a recited range of 1 to 4 is intended to include 1-2, 1-3, 2-4, 3-4, and 1-4.
  • With reference now to the drawings, and in particular FIGS. 1 through 6 thereof, examples of the instant diagnostic eye goggle system employing the principles and concepts of the present diagnostic eye goggle system and generally designated by the reference number 10 will be described.
  • With reference to FIGS. 1 and 2, particular embodiments of the general components of the present diagnostic eye goggle system 10 are illustrated. The diagnostic eye goggle system 10 generally includes goggles 12 and an external master database 14. The external master database 14 includes data from historical users of the goggles 12, referred to herein as historical user data. In specific embodiments, the external master database 14 is stored on one or more servers and accessible by an Internet connection; however, it should be appreciated that the external master database 14 may be stored on a private server or intranet and may be accessible by other wired or wireless connections. The goggles 12 generally include a front panel 16 and a head securement feature 18. The front panel 16 is situated in front of the user's eyes when the goggles are worn about the user's head (U). In a particular embodiment, the front panel 16 includes a light shield 17 around the front panel 16 that conforms about the user's eyes to eliminate exposure of natural light to the user's eyes. The light shield 17 may project around an outer edge of the front panel 16 to make contact with the user's face. The light shield 17 may further be made of a flexible, light-absorbent material. The head securement feature 18 is configured to secure the goggles 12 to the user's head (U). The securement feature 18 may include an elastic strap, an adjustable strap, temples that fit on the user's ears, a nose clip that assembles to the user's nose, and equivalents thereof.
  • With reference to FIG. 3, a particular embodiment of the front panel 16 of the goggles 12 is shown in the context of emitting and receiving radiation (denoted by the dashed arrows 17 and 19, respectively) into and out of a user's eye (E). The front panel 16 includes a plurality of components to acquire optical data from the user's eye (E). The front panel 16 may include one or more electromagnetic radiation sources 19 disposed to emit radiation into one or more eyes (E) of the user. The radiation source(s) 19 may include one or more light emitting diodes (LEDs), solid-state lasers, incandescent lights, fluorescent lights, or a combination thereof. The radiation source(s) are configured to emit radiation having minimal harmful effects on the structures of the eye (E). In a specific embodiment, the emitted radiation has a wavelength ranging from 380 nanometers to 2500 nanometers, corresponding to the visible and infrared spectrum of radiation. In some embodiments, the emitted radiation has a shorter wavelength, below 380 nanometers but greater than 50 nanometers.
  • The front panel 16 further includes one or more radiation sensors 20 to detect at least one of refraction, reflection, interference, intensity, frequency-shift, wavefront, or a spectrum of reflected radiation reflected from one or more structures in the user's eye (E). The radiation sensors 20 may include a charge-coupled device (CCD) sensor, a Hartmann-Shack wavefront sensor, or an array of photodiodes.
  • The front panel 16 may further include one or more optical elements 22 disposed between the radiation source and the radiation sensor for manipulating at least one of the emitted radiation and the reflected radiation. The one or more optical elements 22 may include at least one of a slit, a pinhole, a collimator, a mirror, a beam-splitter, a lens, an x-y scanner, an x-y-z scanner, a prism, a reference arm, or a combination thereof. In another embodiment, the radiation emitted from the radiation source(s) 19 is directly detected by the radiation sensor(s) 20 without the use of the optical elements 22. However, it should be appreciated that a simple slit or pinhole disposed in front of the radiation source 19 may be regarded as an optical element 22.
  • The front panel 16 further includes a microcontroller 24 disposed in communication with at least one of the radiation source 19, the radiation sensor 20, and optical elements 22. The microcontroller 24 generally coordinates the emission of radiation into the user's eye(s) (E) and analyzes the data received from the radiation sensor(s) 20. The microcontroller 24 further includes a processor and memory. A transceiver 26 is further disposed in communication with the microcontroller 24. The transceiver 26 provides a datalink between the microcontroller 24 and the external master database 14. The datalink may be accomplished with a wired or wireless connection, including Ethernet cables, BUS cables, a power line, Bluetooth, Wi-Fi, radiofrequency, and equivalents thereof. In addition, the datalink may be established through a wired or wireless network, illustratively including a local area network or the Internet. Further, the term “in communication” refers to a wired or wireless connection between two or more stated elements (e.g., microcontroller 24 and transceiver 26) and does not necessarily require a direct one-to-one connection, where other elements (e.g., circuitry, a network) may facilitate or be part of the connection between the two or more stated elements.
  • The diagnostic eye goggle system 10 further includes a diagnostic software module that cross-correlates analyzed data from the microcontroller with the historical data in the master database 14 to identify a disease state of the user. In one embodiment, the diagnostic software module is stored in memory associated with the microcontroller 24 and executed by a processor of the microcontroller 24. In another embodiment, the diagnostic software module is stored in memory associated with the master database 14 and executed by a processor associated with the master database 14. The diagnostic software module may use several algorithms for identifying a statistical match, illustratively including: a) running the analyzed data through a decision tree to classify the analyzed data into a cohort and subsequently comparing the analyzed data to historical data within said cohort; b) comparing one or more finite outputs from the analyzed data (e.g., Zernike polynomials) with one or more outputs associated with the historical user data; c) Naïve Bayes classifiers to recognize specific patterns in the analyzed data and match the specific patterns with patterns associated with the historical user data; d) regression analysis to correlate how the analyzed set of data statistically compares to historical users' data; and e) clustering algorithms to cluster the analyzed set of data with historical user data to aid in finding a statistical match. In some embodiments, the diagnostic software module iteratively compares the analyzed data from the microcontroller 24 with historical analyzed data from each historical user of the diagnostic eye goggle system 10. For example, if the master database 14 includes historical analyzed data from 5,000 users, then the diagnostic software module compares the present user's analyzed data with each of the 5,000 previous users' analyzed data to identify a match. In other embodiments, the optical data from the 5,000 historical users is classified into one or more groups, which may or may not correspond to a particular disease, disease state, or disease stage. The present user's analyzed data is then first grouped or classified into one or more groups and subsequently compared with each historical user's data in said group. Specific types of optical data to be acquired, analyzed, and matched are further described below.
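  • By way of a non-limiting illustration, the following Python sketch shows one possible realization of the cohort-classification and nearest-match approach described above, assuming the analyzed data has already been reduced to a fixed-length feature vector. The feature values, cohort names, and user identifiers are hypothetical and are included solely to make the control flow concrete.

```python
import numpy as np

def classify_cohort(features: np.ndarray, centroids: dict) -> str:
    """Assign the analyzed feature vector to the cohort with the closest centroid."""
    return min(centroids, key=lambda name: np.linalg.norm(features - centroids[name]))

def best_match(features: np.ndarray, cohort_records: list) -> tuple:
    """Return the (user_id, distance) of the statistically closest historical record."""
    user_id, hist = min(cohort_records, key=lambda rec: np.linalg.norm(features - rec[1]))
    return user_id, float(np.linalg.norm(features - hist))

# Hypothetical 6-element feature vectors (e.g., a few Zernike coefficients plus
# two spectral intensities) and two stored cohorts.
centroids = {"healthy": np.zeros(6),
             "malaria": np.array([0.1, 0.4, 0.0, 0.2, 0.9, 0.3])}
records = {"healthy": [("user_001", np.full(6, 0.05))],
           "malaria": [("user_114", np.array([0.12, 0.38, 0.02, 0.18, 0.85, 0.31]))]}
sample = np.array([0.11, 0.41, 0.01, 0.21, 0.88, 0.30])
cohort = classify_cohort(sample, centroids)
print(cohort, best_match(sample, records[cohort]))
```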
  • In specific inventive embodiments, the diagnostic eye goggle system 10 further includes read-write memory 27 for performing offline tasks when the eye goggles 12 are disconnected from the master database 14. In one embodiment, the read-write memory 27 is housed in the front panel 16 and disposed in communication with the microcontroller 24. In another embodiment, the read-write memory 27 is external to the goggles 12 but in communication with the microcontroller 24 and in the same locational vicinity as the goggles 12, such as an external hard drive, universal serial bus (USB) drive, and equivalents thereof. In still other embodiments, the read-write memory 27 is the same as the aforementioned memory associated with the microcontroller 24. The read-write memory 27 is particularly advantageous as the memory 27 permits the goggles 12 to function without connectivity to the master database 14. For example, the goggles 12 may be sent to a remote African village to acquire optical eye data from remote users in the local population. The read-write memory 27 may then store optical eye data from a plurality of remote users in that local population. Once the goggles 12 are capable of re-connecting to the master database 14 (e.g., through an Internet connection), the optical eye data from the plurality of users is transferred to and stored in the master database 14, and an identification of a disease state or disease stage for each individual may be provided.
  • In a particular inventive embodiment, the read-write memory 27 may further store historical user data to identify a disease state and/or stage without having to connect to the master database 14. The diagnostic software module may be stored in the read-write memory 27 and executed by the processor of the microcontroller 24 to identify a disease state and/or stage of the remote users. In some instances, the file size of the totality of the historical user data may be too large to store in the read-write memory 27. In such a case, a selected portion of the historical user data is stored in the read-write memory 27. In a particular embodiment, the selected portion of the historical user data stored in the read-write memory 27 is selected based on a type of a disease and/or a prevalence of a disease. For example, the eye goggles 12 may be sent to an African village having an outbreak of malaria. Optical eye data from historical users having malaria is then selected as the portion of historical user data that is stored in the read-write memory 27. The eye goggles 12 are then equipped to quickly identify whether any users in the African village population have malaria without having to connect with the master database 14. In another example, the read-write memory 27 stores only historical user data for common diseases, while data for uncommon diseases remains stored in the master database 14. Therefore, the read-write memory 27 is not overloaded with historical user data and the computational time to cross-correlate and identify a disease is reduced. Then, once the goggles 12 re-connect with the master database 14, any uncommon diseases present in the remote population may be identified.
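  • The store-and-forward behavior described in the two preceding paragraphs may be pictured with the following sketch, in which acquired records are appended to a local cache and pushed to the master database once connectivity returns. The class name, file layout, and uploader callable are hypothetical; only the control flow is illustrative.

```python
import json
import pathlib

class OfflineStore:
    """Minimal local cache standing in for the read-write memory of the goggles."""

    def __init__(self, path="goggle_cache.jsonl"):
        self.path = pathlib.Path(path)

    def record(self, user_id: str, analyzed_data: dict) -> None:
        """Append one user's analyzed optical data to the local cache."""
        with self.path.open("a") as f:
            f.write(json.dumps({"user": user_id, "data": analyzed_data}) + "\n")

    def sync(self, upload) -> int:
        """When connectivity returns, push every cached record to the master database."""
        if not self.path.exists():
            return 0
        records = [json.loads(line) for line in self.path.read_text().splitlines()]
        for rec in records:
            upload(rec)            # e.g., a network transfer to the master database
        self.path.unlink()         # clear the cache after a successful transfer
        return len(records)

# Hypothetical usage with a stand-in uploader.
store = OfflineStore()
store.record("remote_user_01", {"wavefront_rms": 0.42, "spectrum_peak_nm": 585})
uploaded = store.sync(upload=lambda rec: None)
print(f"{uploaded} cached record(s) transferred")
```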
  • With reference to FIGS. 4A through 4D, several types of optical data to be acquired and analyzed from the eye (E) are illustrated. In a particular embodiment, the memory associated with the microcontroller 24 stores three or more optical data acquisition modules. The three or more optical data acquisition modules include software executable instructions to acquire three or more different types of optical data from the eye (E). In a particular embodiment, with reference to FIG. 4A, a first optical data acquisition module is configured to identify eye aberrations by detecting the refraction of reflected radiation from the eye (E). The first optical data acquisition module includes instructions that, when executed by the processor, cause the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of radiation 28 onto the retina (R) of the user's eye (E), wherein a wavefront 30 of reflected radiation 32 is detected by the sensor 20 and transferred to the microcontroller 24 for eye aberration analysis. The radiation sensor 20 for detecting the wavefront 30 may be a Hartmann-Shack wavefront sensor having a lenslet array and a CCD sensor. In one embodiment, the lenslet array is part of the optical elements 22 and the CCD sensor is the radiation sensor 20. The eye aberration analysis may include the determination of the Zernike polynomials from the detected refractions of radiation over the area of the eye (E). In other embodiments, the wavefront is acquired using Tscherning aberroscopy or ray tracing.
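  • As one hedged illustration of the eye aberration analysis, the sketch below fits a handful of low-order Zernike terms to wavefront height samples by ordinary least squares. Unnormalized Cartesian Zernike forms and a synthetic pupil grid are assumed; normalization conventions and the actual Hartmann-Shack slope-to-wavefront reconstruction are outside the scope of this example.

```python
import numpy as np

def fit_zernike(x, y, w):
    """Fit piston, tip/tilt, defocus, and astigmatism to wavefront heights w(x, y)."""
    r2 = x**2 + y**2
    basis = np.column_stack([
        np.ones_like(x),   # piston
        x,                 # tilt (x)
        y,                 # tilt (y)
        2 * r2 - 1,        # defocus
        x**2 - y**2,       # astigmatism 0/90 degrees
        2 * x * y,         # astigmatism 45 degrees
    ])
    coeffs, *_ = np.linalg.lstsq(basis, w, rcond=None)
    return coeffs

# Hypothetical example: a mostly defocused wavefront sampled on a 32x32 pupil grid.
xv, yv = np.meshgrid(np.linspace(-1, 1, 32), np.linspace(-1, 1, 32))
mask = xv**2 + yv**2 <= 1.0
x, y = xv[mask], yv[mask]
w = 0.8 * (2 * (x**2 + y**2) - 1) + 0.05 * x      # defocus plus a small tilt
print(fit_zernike(x, y, w).round(3))               # defocus coefficient near 0.8
```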
  • With reference to FIG. 4B, a second optical data acquisition module is configured to identify the presence or absence of one or more analytes (A) in the blood vessels (BV) or other tissue structures of the user's eye (E). The second optical data acquisition module includes instructions that, when executed by the processor, cause the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of a continuous spectrum of radiation 34 on one or more blood vessels (BV) or tissue structures in the user's eye (E). A spectrum of the reflected radiation 36 is detected by the sensor 20 and transferred to the microcontroller 24 to analyze the presence, absence, or concentration of an analyte (A) in the blood vessels (BV), tissues, or tissue structures in the user's eye (E). The continuous spectrum of emitted radiation 34 may be white light comprised of the visible light spectrum of radiation. The continuous spectrum may further include a spectrum of infrared light that may absorb, reflect, or interact with an analyte (A) in the blood vessel (BV), tissue, or tissue structure in the eye (E). The reflected light 36 is detected and analyzed to determine one or more spectral line fingerprints by examining at least one of: a) the presence or absence of a particular wavelength of light reflected from the eye (E); and/or b) the intensity of a particular wavelength of light reflected from the eye (E). The optical elements 22 may include a prism to spread the reflected light 36 into its component wavelengths for analysis. The spectral line fingerprints provide an indication of the presence, absence, or concentration of a particular analyte (A) in the user's blood or other tissue structures in the user's eye (E). In a specific embodiment, the radiation source 19, optical elements 22, and sensors 20 may include components to employ Raman spectroscopy for obtaining a spectral analysis of one or more analytes in the eye (E).
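  • The spectral line fingerprint analysis may be illustrated, without limitation, by the following sketch, which samples a reflected spectrum at characteristic wavelengths and flags absorption dips relative to a baseline. The wavelengths, threshold, and synthetic spectrum are hypothetical placeholders rather than calibrated values.

```python
import numpy as np

def spectral_fingerprint(wavelengths_nm, intensities, lines_nm, threshold=0.5):
    """Return per-line relative intensity and a presence flag for each spectral line."""
    baseline = np.median(intensities)
    sampled = np.interp(lines_nm, wavelengths_nm, intensities)
    relative = sampled / baseline
    # A low relative reflectance at a characteristic line is treated as an absorption dip.
    return {line: (float(rel), bool(rel < threshold))
            for line, rel in zip(lines_nm, relative)}

# Hypothetical reflected spectrum with an absorption dip near 660 nm.
wl = np.linspace(400, 1000, 601)
spectrum = np.ones_like(wl) - 0.7 * np.exp(-((wl - 660) / 8) ** 2)
print(spectral_fingerprint(wl, spectrum, lines_nm=[540.0, 660.0]))
```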
  • In a particular embodiment, with reference to FIG. 4C, a third optical data acquisition module is configured to detect a frequency-shift between the emitted radiation 38 and the reflected radiation 40. As illustrated in FIG. 4C, the emitted radiation 38 has a shorter wavelength than the reflected radiation 40. The third optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more specific wavelengths of radiation 38 on one or more blood vessels (BV), tissues, or tissue structures in the eye (E), wherein a frequency-shifted wavelength of reflected radiation 40 is detected by the sensor and transferred to the microcontroller for at least one of analyte (A), tissue, or tissue structure analysis of the eye (E). For example, the microcontroller 24 may command the radiation source 19 and/or optical elements 22 to emit radiation 38 having a wavelength of 520 nm at a particular tissue structure or blood vessel (BV) in the eye (E), and detect a reflected wavelength 40 of 600 nm. The frequency-shift in the reflected radiation 40 indicates how the light interacted with the particular analyte (A), tissue, or tissue structure, and is used to ascertain the quality of a tissue or tissue structure and to identify at least one of the presence, absence, or concentration of an analyte (A) in the eye (E).
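  • The 520 nm to 600 nm example above corresponds to a frequency downshift that can be computed directly from the two wavelengths, as in the short sketch below; the conversion uses only the speed of light, and the wavelength pair is the illustrative one given in the preceding paragraph.

```python
C = 2.998e8  # speed of light, m/s

def frequency_shift_thz(emitted_nm: float, detected_nm: float) -> float:
    """Return the downshift in frequency (THz) between emitted and reflected light."""
    f_emit = C / (emitted_nm * 1e-9)
    f_det = C / (detected_nm * 1e-9)
    return (f_emit - f_det) / 1e12

print(f"{frequency_shift_thz(520, 600):.1f} THz shift")  # roughly 77 THz
```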
  • In a specific embodiment, with reference to FIG. 4D, a fourth optical data acquisition module is configured to detect an angular degree of reflected radiation reflected from one or more specific target locations on the retina (R) or other tissue structures in the eye (E). The fourth optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of radiation 42 at one or more target locations on the retina (R) or other structures in the eye (E), wherein an angular degree of reflected radiation 44 is detected by the sensor 20 and transferred to the microcontroller 24 to analyze a topography of the targeted location(s). Depending on the topography of the target location, the radiation may reflect in different directions due to an irregularly shaped surface. An irregular topographical surface at a target location may be indicative of a particular disease, disease state, or disease stage.
  • In a particular embodiment, a fifth optical data acquisition module is configured to emit one or more specific wavelengths of radiation and detect the intensity of reflected radiation. The fifth optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit one or more pulses of one or more specific wavelengths of radiation, wherein an intensity, or amount, of reflected radiation is detected by the sensor 20 and transferred to the microcontroller 24 to analyze the presence, absence, or concentration of one or more analytes (A) in a blood vessel (BV) or other tissue in the eye (E). For example, some analytes (A) may absorb radiation at a first wavelength (providing a low intensity reading) and reflect radiation at a second wavelength (providing a high intensity reading). The difference between the detected intensities of reflected radiation at the two different emitted wavelengths may be indicative of a concentration of a particular analyte (A). In a particular embodiment, the optical elements 22 may include a prism that is adjusted in response to commands by the microcontroller 24 to emit a specific wavelength. In other embodiments, the radiation source 19 includes a plurality of LEDs that may each emit a specific wavelength when commanded to do so.
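  • One hedged way to picture the two-wavelength intensity comparison is the ratiometric sketch below, in which the log-ratio of reflected intensities is mapped to an analyte concentration. The calibration slope and intercept are hypothetical stand-ins for empirically fitted constants, not values disclosed herein.

```python
import math

def estimate_concentration(i_absorbed, i_reflected, slope=12.0, intercept=0.0):
    """Map the log-ratio of reflected intensities to an analyte concentration (a.u.)."""
    # Guard against zero readings before taking the ratio.
    ratio = max(i_reflected, 1e-9) / max(i_absorbed, 1e-9)
    return slope * math.log(ratio) + intercept

# Hypothetical readings: low return at the absorbed wavelength, high at the reflected one.
print(round(estimate_concentration(i_absorbed=0.22, i_reflected=0.81), 2))
```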
  • In a specific inventive embodiment, a sixth optical data acquisition module is configured to detect one or more volumetric changes of a blood vessel (BV) or tissue structure in the eye (E). The sixth optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit a plurality of pulses of radiation on and around one or more blood vessels in the user's eye (E), wherein the sensor detects a change in the reflected radiation between pulses that corresponds to a volumetric change in one or more of the blood vessels. The sixth optical data acquisition module acts as a plethysmograph to monitor blood pressure, blood flow, and heart rate.
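  • The plethysmographic use of the sixth module may be illustrated by estimating heart rate from a time series of reflected-intensity samples, as in the sketch below. The signal is synthetic, and the frequency band is an ordinary physiological assumption rather than a disclosed parameter.

```python
import numpy as np

def heart_rate_bpm(signal: np.ndarray, fs: float) -> float:
    """Return the dominant frequency between 0.7 and 3.5 Hz, expressed in beats/min."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 3.5)              # roughly 42-210 bpm
    return float(freqs[band][np.argmax(spectrum[band])] * 60.0)

fs = 50.0                                               # 50 reflected-intensity samples per second
t = np.arange(0, 20, 1 / fs)
pulses = 1.0 + 0.05 * np.sin(2 * np.pi * 1.2 * t)       # 1.2 Hz volumetric ripple -> 72 bpm
pulses += 0.01 * np.random.default_rng(0).standard_normal(t.size)
print(round(heart_rate_bpm(pulses, fs)))                # ~72
```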
  • In a particular inventive embodiment, a seventh optical data acquisition module is configured to obtain images of surfaces and sub-surfaces of tissue structures in the eye (E). The seventh optical data acquisition module, when executed by the processor, causes the processor to: command at least one of the radiation source 19 and optical elements 22 to emit a plurality of pulses of infrared radiation on one or more targeted tissue structures, wherein the sensor detects a reflectivity profile of the targeted tissue containing information about the spatial dimensions and location of tissue structures. The seventh optical data acquisition module implements a technique generally referred to as optical coherence tomography.
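  • As a simplified, non-limiting illustration of Fourier-domain optical coherence tomography, the sketch below recovers a depth reflectivity profile from a synthetic spectral interferogram. Dispersion compensation, wavenumber linearization, and windowing are deliberately omitted, and the fringe frequencies are arbitrary.

```python
import numpy as np

def a_scan(interferogram: np.ndarray) -> np.ndarray:
    """Return the magnitude depth profile of a background-subtracted spectral fringe."""
    return np.abs(np.fft.rfft(interferogram - interferogram.mean()))

# Hypothetical interferogram: two reflectors at different depths produce two fringe
# frequencies across the sampled wavenumber axis.
k = np.linspace(0, 2 * np.pi, 2048)
fringes = 1.0 + 0.4 * np.cos(30 * k) + 0.2 * np.cos(75 * k)
profile = a_scan(fringes)
print(profile.argsort()[-2:])   # the two largest peaks sit near depth bins 30 and 75
```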
  • It should be appreciated that the aforementioned tissues and tissue structures in the eye (E) illustratively include specific regions of the retina (R), the corneal tear film, the macula, the fovea, the vitreous body, the aqueous humor (fluid), the optic nerve, the lens, the pupil, the cornea, and ganglion cells. It should further be appreciated that the aforementioned analytes (A) to be detected in the blood vessels (BV) or tissues illustratively include, but are not limited to: compounds such as glucose and bilirubin; enzymes such as amylase, lipase, aspartate transaminase, and alanine transaminase; metals such as mercury; cells such as white blood cells; and other proteins or metabolites such as growth factors and signaling proteins.
  • To direct the emitted radiation to detect analytes and/or abnormalities on specific regions of the eye (E), the front panel 16 may further include components for directing the emitted radiation. In one embodiment, the microcontroller 24 is disposed in communication with one or more optical elements 22 to actively manipulate at least one of the emitted radiation or the reflected radiation. The optical elements 22 may include one or more actuating components, illustratively including servo-motors, step-motors, pivots, ball screws, nuts, linear rails, and equivalents thereof, to actively adjust one or more of the optical elements 22 based on commands from the microcontroller 24 (e.g., an x-y scanner for directing the radiation at a plurality of pre-programmed locations). The three or more optical data acquisition modules include instructions that, when executed by the processor, cause the processor to: actively direct the emitted radiation to a plurality of specific locations on the retina by actively adjusting one or more of the optical elements 22 (e.g., a mirror, a pinhole) with the actuating components. With reference to FIG. 5, the front panel 16 may further include a camera 46 disposed in communication with the microcontroller 24. The camera 46 includes an eye tracking software module for locating and tracking the pupil of the eye (E). Therefore, the emitted radiation may be actively and accurately directed to specific locations in the eye (E) based, in part, on a current position of the user's pupil.
  • With reference to FIG. 5, the radiation may be directed to specific regions of the eye using a plurality of eye directing lights 48. The front panel 16 may include a plurality of eye directing lights 48 situated about a radiation emission aperture 50 in the front panel 16. The eye directing lights 48 are shown in a radial pattern about the radiation emission aperture 50. The eye directing lights 48 are configured to direct the user's line-of-sight in a particular direction to collect optical data on a specific region of the eye (E). The three or more optical data acquisition modules may include additional instructions that, when executed by the processor, cause the processor to: illuminate a sequence of the eye directing lights to sequentially direct the user's eye (E) to an illuminated light; command the radiation source to emit one or more pulses of radiation into the user's eye (E) for each position to which the eye (E) is directed by an illuminated light; and collect the reflected radiation reflected from the retina when the eye (E) is directed to each illuminated light. For example, to target a region of the eye (E) below the macula, an eye directing light 48 located below the radiation emission aperture 50 is illuminated, directing the user's line-of-sight downward. Thus, radiation emitted through the pupil will make contact with retinal structures located below the macula. The eye directing lights 48 may be used in lieu of optical elements 22 that actively direct emitted radiation at specific locations in the eye (E), or the eye directing lights 48 may be used in conjunction with optical elements 22 that actively direct emitted radiation.
  • During and/or after the optical data acquisition process, in specific embodiments, the microcontroller 24 generates a mathematical map of the eye (E) having map data corresponding to the analyzed data collected from the optical data acquisition modules. The map data may include one or more analyzed wavefronts, one or more analyzed spectra of reflected radiation, one or more analyzed frequency-shifts of reflected radiation, one or more analyzed angular degrees of reflection, one or more analyzed intensities of reflected radiation from one or more emitted wavelengths of radiation, and one or more analyzed volumetric changes of a blood vessel (BV) or tissue structure. The diagnostic software module then compares the mathematical map of the eye (E) with historical users' mathematical maps to identify a disease state of the user using one or more of the aforementioned matching algorithms. For example, early detection of pancreatic cancer is determined by the combination of: i. blood composition as determined by the second optical acquisition module and the fifth optical acquisition module; ii. a given light wave reflection pattern as determined by the fourth optical acquisition module; and iii. a given wavefront aberration map as determined by the first acquisition module. In a specific inventive embodiment, the microcontroller 24 generates a mapping identifier based on all of the map data. For example, a mapping identifier may be generated by combining, relating, and/or transforming i, ii, and iii above into a single value, range of values, or mathematical function. The diagnostic software module then cross-correlates the mapping identifier with historical users' mapping identifiers located in the master database 14 to identify a particular disease, disease state, or disease stage. It should be appreciated that the diagnostic software module may cross-correlate a remote user's analyzed optical data, mathematical maps, and/or mapping identifiers with historical users' analyzed optical data, mathematical maps, and/or mapping identifiers stored locally in the read-write memory 27 to identify a particular disease, disease state, or disease stage of the remote user in a remote location (e.g., an African village) if no connectivity to the master database 14 is possible, as described above.
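  • The construction of a mapping identifier and its cross-correlation against stored identifiers may be pictured with the following sketch, in which the map data is concatenated into a normalized vector and compared by cosine similarity. The field names, vector lengths, and similarity measure are hypothetical choices made only for illustration.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class EyeMap:
    zernike: np.ndarray      # analyzed wavefront aberration coefficients
    spectral: np.ndarray     # analyzed spectral-line relative intensities
    reflection: np.ndarray   # analyzed angular-reflection / topography features

    def identifier(self) -> np.ndarray:
        """Concatenate the map data and normalize it into one comparable vector."""
        v = np.concatenate([self.zernike, self.spectral, self.reflection])
        return v / (np.linalg.norm(v) + 1e-12)

def correlate(current: EyeMap, historical: dict) -> tuple:
    """Return the historical user whose stored identifier is most similar (cosine)."""
    cur = current.identifier()
    def similarity(vec):
        return float(cur @ (vec / (np.linalg.norm(vec) + 1e-12)))
    best = max(historical, key=lambda name: similarity(historical[name]))
    return best, similarity(historical[best])

# Hypothetical comparison against two stored mapping identifiers.
current = EyeMap(np.array([0.0, 0.05, 0.8]), np.array([0.3, 1.0]), np.array([0.1]))
stored = {"user_114": current.identifier() + 0.01,   # near-duplicate of the current map
          "user_009": np.ones(6) / np.sqrt(6)}
print(correlate(current, stored))
```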
  • In particular embodiments, the analyzed data, mathematical map, and/or mapping identifier of the user are transferred to and stored in the master database 14 to become a component of the user's longitudinal health record and made available for diagnosing a disease state of future users of the diagnostic eye goggle system 10. In a particular embodiment, the user repeats the data acquisition modules with the diagnostic eye goggle system 10 to track how the acquired optical data changes as a function of disease onset, disease progression, or disease regression. The tracked changes in the optical data provide valuable markers for diagnosing a disease, disease state, or disease stage of a future user of the diagnostic eye goggle system 10. The tracked changes in the optical data further provide the potential to identify disease-triggering events, to aid in the diagnosis of future diseases, and to assess a user's proneness to a particular disease. By knowing how triggering events appear from an ophthalmological standpoint during an eye examination, several mathematical maps can be generated per disease, disease state, and disease stage for diagnosing future users with a particular disease, disease state, or disease stage.
  • In specific inventive embodiments, the master database 14 further receives and stores medical history data of the user linked to the user's analyzed optical data, mathematical map, and/or mapping identifier. The medical history data may include, but is not limited to, a current disease state, a current disease stage, a past disease, height, weight, gender, race, smoking status, alcohol use, family medical history, blood work, and a gene map or DNA sequence of the user. The medical history data of the user and of past users is stored in the master database, where the diagnostic software module cross-correlates a present user's analyzed data, mathematical map, and/or mapping identifier with historical users' analyzed data, mathematical maps, and/or mapping identifiers to identify a statistical match therebetween. If the diagnostic software module identifies a match, a diagnosis of one or more disease states or disease stages of the present user may be made based on the medical history of a matched historical user. For example, past user A has a medical history of Alzheimer's disease. Past user A has a specific mathematical map Y generated by the diagnostic eye goggle system 10. A new user B then utilizes the eye goggle system 10, which generates a mathematical map Z. The diagnostic software module identifies that mathematical map Y and mathematical map Z are a statistical match. The new user B may then be diagnosed with Alzheimer's disease. It should be appreciated that a user may be matched with several past users having no diseases, and thus an identification of no disease for the present user is possible. In another inventive embodiment, the analyzed optical data, mathematical maps, or mapping identifiers may be combined with genetic and other population health data in the master database, where disease analysis and triggering markers for disease initiation can be studied.
  • In specific embodiments, the diagnostic eye goggle system 10 further provides the user or health care provider with lens-correcting instructions or suggestions based on the analyzed wavefront data. Therefore, a user receives a disease diagnosis as well as an assessment of the user's visual acuity, which may be used to improve the user's visual acuity.
  • With reference to FIG. 6, a particular inventive embodiment of a method for diagnosing a disease state or disease stage of a user with the diagnostic eye goggle system 10 is depicted. The method includes assembling the goggles about the user's head such that the front panel is situated in front of the user's eyes (E) [Block 100]. The method further includes: emitting a first set of radiation on the user's retina with the radiation source, and detecting and collecting a wavefront of reflected radiation reflected from the user's retina with the sensor [Block 102]; emitting a second set of radiation on the user's retina with the radiation source, and detecting and collecting a spectrum of reflected radiation reflected from the user's retina with the sensor [Block 104]; emitting a third set of radiation on the user's retina with the radiation source, and detecting and collecting a wavelength of reflected radiation reflected from the user's retina with the sensor [Block 106]; generating a mathematical map of the eye (E) based on the wavefront, spectrum, and wavelength of the reflected radiation with the microcontroller 24 [Block 108]; transmitting and storing the mathematical map of the eye (E) to a master database for diagnosing future users of the goggles [Block 110]; and cross-correlating the mathematical map of the eye (E) with historical users' mathematical maps to provide at least one of a disease state, a disease stage, or lens-correcting suggestions to the user [Block 112]. Based on the cross-correlation as described above, a disease state is provided to the user [Block 114]. The method may further include locating one or more eye features prior to emitting at least one of the first set of radiation, the second set of radiation, or the third set of radiation [Block 116]. Medical history data of the user may also be transmitted to and stored in the master database to provide historical medical data for identifying at least one of a disease state or disease stage of a future user [Block 118]. Finally, the method may include repeating the steps above at several time points for a user having a particular disease to track and store the changes in the mathematical map as a function of disease progression or regression. The tracked changes provide valuable markers to identify disease states or disease stages of a future user of the diagnostic eye goggle system 10.
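  • The workflow of FIG. 6 may be summarized, purely for illustration, as the following orchestration sketch in which each step mirrors a numbered block; the acquisition, map-building, upload, and cross-correlation callables are hypothetical stubs rather than disclosed interfaces.

```python
def run_diagnostic_session(acquire, build_map, upload, cross_correlate):
    """Walk through the FIG. 6 blocks once for a single user."""
    wavefront = acquire("wavefront")                          # Block 102
    spectrum = acquire("spectrum")                            # Block 104
    wavelengths = acquire("wavelength")                       # Block 106
    eye_map = build_map(wavefront, spectrum, wavelengths)     # Block 108
    upload(eye_map)                                           # Block 110
    return cross_correlate(eye_map)                           # Blocks 112-114

# Hypothetical usage with stand-in callables.
result = run_diagnostic_session(
    acquire=lambda kind: {"kind": kind},
    build_map=lambda *parts: {"map": parts},
    upload=lambda m: None,
    cross_correlate=lambda m: {"disease_state": "no match found"},
)
print(result)
```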
  • In another inventive embodiment, a method is provided for identifying a disease state or stage of a remote user. The goggles 12 are sent to a remote location having no connectivity to the master database 14. A first set, second set, and third set of optical eye data are acquired from a plurality of remote users at the remote location [Blocks 102, 104, 106]. The first set, second set, and third set of optical data are interpreted by a processor [Block 108] and stored in the read-write memory 27 locally associated with the goggles 12 [Block 120]. The interpreted data is then transmitted to and stored in the master database 14 upon establishing an Internet connection between the goggles 12 and the master database 14 [Block 110]. In one embodiment, the transmitted data is then cross-correlated with historical user data stored in the master database 14 to identify at least one of a disease state and/or stage of one or more of the plurality of remote users [Block 112]. In another embodiment, the interpreted data is cross-correlated with historical user data stored in the local read-write memory 27 associated with the goggles 12 to identify at least one of a disease state and/or disease stage of one or more of the plurality of remote users without having to connect with the master database 14.
  • Other Embodiments
  • While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples and are not intended to limit the scope, applicability, or configuration of the described embodiments in any way. It should be understood that various changes may be made in the function and arrangement of elements without departing from the scope as set forth in the appended claims and the legal equivalents thereof.

Claims (20)

What is claimed is:
1. A method for diagnosing a disease, a disease state, or a disease stage of a user, said method comprising:
providing goggles, said goggles self-containing:
a radiation source to emit radiation into a user's eye;
a radiation sensor to detect radiation reflected from the user's eye; and
a microcontroller in communication with the radiation source and the radiation sensor;
assembling the goggles about a user's head such that the radiation source and radiation sensor are situated in front of a user's eye;
emitting radiation into the user's eye with the radiation source;
detecting a user's optical data with the radiation sensor, said optical data comprising at least two of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye;
determining a statistical match between the user's optical data and optical data from one or more historical users, wherein the statistical match is determined with a diagnostic software module executed by a processor; and
diagnosing a disease, disease state, or disease stage of the user based on a diagnosed disease, disease state, or disease stage of the one or more historical users.
2. The method of claim 1 wherein the optical data comprises all three of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye.
3. The method of claim 2 wherein the optical data further comprises an angular degree of reflected radiation.
4. The method of claim 1 wherein the optical data from the one or more historical users is stored in a master database.
5. The method of claim 4 wherein the microcontroller comprises the processor and non-transient memory, wherein the master database and diagnostic software module are stored in the non-transient memory, and the determination of the statistical match is performed by the processor of the microcontroller.
6. The method of claim 4 wherein the master database is stored in non-transient memory external to the goggles.
7. The method of claim 6 wherein the master database stores optical data from a plurality of historical users, wherein at least two historical users have been diagnosed with different diseases.
8. The method of claim 7 wherein the microcontroller comprises the processor and non-transient memory, wherein the non-transient memory of the microcontroller stores optical data from a selected portion of historical users having been diagnosed with one specific type of disease.
9. The method of claim 6 wherein the diagnostic software module is executed by a processor external to the goggles.
10. The method of claim 4 wherein the goggles further comprise a transmitter, wherein the method further comprises transmitting and storing the user's optical data in the master database to assist in diagnosing a disease, disease state, or disease stage of a future user.
11. The method of claim 1 wherein the optical data further comprises the presence or concentration of one or more blood analytes present in the user's eye blood vessels, eye tissues, or eye tissue structures.
12. The method of claim 1 wherein the one or more wavelengths of reflected radiation detected from the user's eye is frequency shifted from a wavelength of radiation emitted from the radiation source.
13. The method of claim 1 further comprising:
locating at least one tissue, tissue structure, or blood vessel in the user's eye; and
targeting the emitted radiation from the radiation source to the targeted tissue, tissue structure, or blood vessel in the user's eye.
14. The method of claim 1 wherein the radiation source includes at least one of a light emitting diode (LED), a solid-state laser, an incandescent light, or a fluorescent light.
15. The method of claim 14 wherein the radiation sensor includes at least one of a charge-coupled device (CCD) sensor, a Hartmann-Shack wavefront sensor, or an array of photodiodes.
16. A diagnostic eye goggle system for diagnosing a disease, disease state, or disease stage of a user, the system comprising:
the goggles of claim 1, wherein the goggles are configured to detect a user's optical data, said optical data comprising at least two of the following: a) a wavefront of reflected radiation from the user's eye; b) a spectrum of reflected radiation from the user's eye; and c) one or more wavelengths of reflected radiation from the user's eye;
a master database stored on non-transient memory for storing optical data from a plurality of historical users; and
a diagnostic software module stored on non-transient memory and executed by a processor, wherein the software module when executed by the processor determines a statistical match between the user's optical data and optical data from one or more historical users, wherein the diagnostic software module further diagnoses a disease, disease state, or disease stage of the user based on a diagnosed disease, disease state, or disease stage of the one or more historical users.
17. The system of claim 16 wherein the goggles further comprise a transmitter for establishing a data-link between the microcontroller and the master database.
18. The system of claim 16 wherein the microcontroller comprises the processor and the non-transient memory, wherein the processor executes the diagnostic software module, and the non-transient memory stores the master database and the diagnostic software module.
19. The system of claim 17 wherein the user's optical data is transmitted to the master database, and the diagnostic software module is executed by a processor external to the goggles.
20. The system of claim 17 wherein the master-database is stored in non-transient memory external to the goggles, and the diagnostic software module is executed by a processor associated with the microcontroller.
US16/882,616 2017-12-05 2020-05-25 Method and system for diagnosing a disease using eye optical data Abandoned US20200281528A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/882,616 US20200281528A1 (en) 2017-12-05 2020-05-25 Method and system for diagnosing a disease using eye optical data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/832,233 US10694995B2 (en) 2017-12-05 2017-12-05 Diagnostic eye goggle system
US16/882,616 US20200281528A1 (en) 2017-12-05 2020-05-25 Method and system for diagnosing a disease using eye optical data

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/832,233 Continuation US10694995B2 (en) 2017-12-05 2017-12-05 Diagnostic eye goggle system

Publications (1)

Publication Number Publication Date
US20200281528A1 true US20200281528A1 (en) 2020-09-10

Family

ID=66657745

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/832,233 Active 2038-04-26 US10694995B2 (en) 2017-12-05 2017-12-05 Diagnostic eye goggle system
US16/882,616 Abandoned US20200281528A1 (en) 2017-12-05 2020-05-25 Method and system for diagnosing a disease using eye optical data

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/832,233 Active 2038-04-26 US10694995B2 (en) 2017-12-05 2017-12-05 Diagnostic eye goggle system

Country Status (1)

Country Link
US (2) US10694995B2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7384246B2 (en) * 2001-10-15 2008-06-10 Robert Bosch Gmbh Pump element and piston pump for generating high fuel pressure
US20190142270A1 (en) * 2016-04-22 2019-05-16 Carl Zeiss Meditec, Inc. System and method for visual field testing

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5140990A (en) 1990-09-06 1992-08-25 Spacelabs, Inc. Method of measuring blood pressure with a photoplethysmograph
US6040578A (en) 1996-02-02 2000-03-21 Instrumentation Metrics, Inc. Method and apparatus for multi-spectral analysis of organic blood analytes in noninvasive infrared spectroscopy
US6305804B1 (en) 1999-03-25 2001-10-23 Fovioptics, Inc. Non-invasive measurement of blood component using retinal imaging
US20050010091A1 (en) 2003-06-10 2005-01-13 Woods Joe W. Non-invasive measurement of blood glucose using retinal imaging
US6648473B2 (en) 2001-11-13 2003-11-18 Philadelphia Retina Endowment Fund High-resolution retina imaging and eye aberration diagnostics using stochastic parallel perturbation gradient descent optimization adaptive optics
KR100675555B1 (en) 2003-07-07 2007-01-29 유선국 Pulse oximeter and thereof method
WO2006052479A2 (en) 2004-11-08 2006-05-18 Optovue, Inc. Optical apparatus and method for comprehensive eye diagnosis
ES2544585T3 (en) 2007-06-15 2015-09-01 University Of Southern California Analysis of retinal map patterns for the diagnosis of optic nerve diseases by optical coherence tomography
US8740381B2 (en) 2007-06-27 2014-06-03 Bausch & Lomb Incorporated Method and apparatus for extrapolating diagnostic data
US7641343B1 (en) 2007-07-26 2010-01-05 Motamedi Manouchehr E Method and apparatus for early diagnosis of Alzheimer's using non-invasive eye tomography by terahertz
US20110082355A1 (en) 2009-07-30 2011-04-07 Oxitone Medical Ltd. Photoplethysmography device and method
US8494606B2 (en) 2009-08-19 2013-07-23 Covidien Lp Photoplethysmography with controlled application of sensor pressure
US8649008B2 (en) 2010-02-04 2014-02-11 University Of Southern California Combined spectral and polarimetry imaging and diagnostics
US8632262B2 (en) 2011-08-30 2014-01-21 Kestrel Labs, Inc. Laser to fiber optical coupling in photoplethysmography
WO2015021036A1 (en) 2013-08-05 2015-02-12 Xhale, Inc. Sensors for photoplethysmography in the ophthalmic artery region
US20150313462A1 (en) 2014-05-04 2015-11-05 Alexander Reis Method and System of using Photorefractive effects to examine eyes using a portable device
NZ773841A (en) * 2015-03-16 2022-07-01 Magic Leap Inc Methods and systems for diagnosing and treating health ailments
EP3448233A4 (en) * 2016-04-30 2019-05-08 Envision Diagnostics, Inc. Medical devices, systems, and methods for performing eye exams using displays comprising mems scanning mirrors
US11701046B2 (en) * 2016-11-02 2023-07-18 Northeastern University Portable brain and vision diagnostic and therapeutic system
TR201704117A2 (en) * 2017-03-20 2017-07-21 Oduncu Abdulkadi̇r VISUAL WARNING GLOCOMAL TREATMENT GLASSES
US10617566B2 (en) * 2018-06-14 2020-04-14 Vestibular First, Llc Modular headset for diagnosis and treatment of vestibular disorders

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7384246B2 (en) * 2001-10-15 2008-06-10 Robert Bosch Gmbh Pump element and piston pump for generating high fuel pressure
US20190142270A1 (en) * 2016-04-22 2019-05-16 Carl Zeiss Meditec, Inc. System and method for visual field testing

Also Published As

Publication number Publication date
US20190167191A1 (en) 2019-06-06
US10694995B2 (en) 2020-06-30

Similar Documents

Publication Publication Date Title
JP2021130010A (en) Methods and systems for diagnosing and treating health-hazard diseases
US8801183B2 (en) Assessment of microvascular circulation
WO2016189966A1 (en) Binocular measurement device, binocular measurement method, and binocular measurement program
US8011785B2 (en) Optical alignment apparatus and method therefor
JP2004283609A (en) Pupillometer with pupil irregularity detection, pupil tracking, and pupil response detection capability, glaucoma screening capability, corneal topography measurement capability, intracranial pressure detection capability, and ocular aberration measurement capability
US11839427B2 (en) Systems, methods, and apparatuses for ocular measurements
US20180249941A1 (en) Oculometric Neurological Examination (ONE) Appliance
Parnandi et al. Contactless measurement of heart rate variability from pupillary fluctuations
Roig et al. Pupil detection and tracking for analysis of fixational eye micromovements
US11642068B2 (en) Device and method to determine objectively visual memory of images
US20210353141A1 (en) Systems, methods, and apparatuses for eye imaging, screening, monitoring, and diagnosis
WO2020005053A1 (en) Portable system for identifying potential cases of diabetic macular oedema using image processing and artificial intelligence
US20220151484A1 (en) Joint determination of accommodation and vergence
US10694995B2 (en) Diagnostic eye goggle system
US20230064792A1 (en) Illumination of an eye fundus using non-scanning coherent light
Sethi et al. Functional testing in glaucoma diagnosis
Quan et al. Automatic glaucoma screening hybrid cloud system with pattern classification algorithms
CN114176510B (en) Head-mounted intraocular pressure measuring instrument and cloud platform
Suba et al. Survey on Machine Learning Techniques Used for Central Serous Retinopathy Detection
KR20240141832A (en) Systems and methods for collecting retinal stimulation and/or retinal signal data
Koprowski et al. Image Processing in Ophthalmology
NZ753160B2 (en) Methods and systems for diagnosing and treating health ailments

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

AS Assignment

Owner name: RENEGADE OPTOPHYSICS LLC, COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BREWER, MICHAEL A;HINKLEY, SHANNON ROSE;REEL/FRAME:053303/0245

Effective date: 20200715

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION