US20210219843A1 - Motion-Adaptive Interactive Imaging - Google Patents
Motion-Adaptive Interactive Imaging
- Publication number
- US20210219843A1 (application US 17/213,235)
- Authority
- US
- United States
- Prior art keywords
- subject
- view
- camera
- frame frequency
- trigger
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/48—Diagnostic techniques
- A61B6/481—Diagnostic techniques involving the use of contrast agents
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- H04N5/23216—
-
- H04N5/23245—
-
- H04N5/23254—
-
- H04N5/23267—
-
- H04N5/2354—
Definitions
- Tumor margins are the healthy tissue surrounding the tumor, and more specifically, the distance between the tumor tissue and the edge of the surrounding tissue removed along with the tumor. Ideally, the margins are selected so that the risk of leaving tumor tissue within the patient is low.
- the current clinical approach to tumor margin assessment involves circumferential analysis of a resected specimen using frozen sections in a lab during surgical time (Chambers et al. (2015) Laryngoscope 125:636; Byers et al. (1978) Amer. J. Surgery 136:525). Tactile and visual measures are traditionally the only methods available in this sampling process. However, complex three-dimensional (3D) primary specimens are often difficult to manipulate and orient for full inspection and processing.
- margin assessment based on frozen sectioning is difficult to implement on breast cancer specimens due to their fatty tissue.
- X-ray imaging and gamma detection are thus used intraoperatively to estimate the border of a tumor by looking at contrasts from inserted biopsy markers such as clips, wires, or seeds which help locate the tumor mass (O'Kelly et al. (2016) J. Surgical Oncology 113:256).
- X-ray cabinet imagers used in surgical suites can give a two-dimensional (2D) projection view of a sample in the form of a snapshot.
- Fluorescence image-guided surgery and fluorescence image-guided margin assessment are emerging technologies taking advantage of recent developments of fluorescence dyes and tumor targeting imaging agents from translational and clinical research areas (Garland et al. (2016) Cell Chem. Bio. 23:122; Warram (2014) Cancer Metastasis Rev. 33:809).
- fluorescence imaging systems have been developed and commercialized for image-guided surgery (Zhu and Sevick-Muraca (2015) British J. Radiology 88:20140547; Chi et al. (2014) Theranostics 4:1072; D'Souza et al. (2016) J. Biomed. Opt. 21:80901).
- Some of these handheld and overhead devices have been tested for use in the margin assessment of tumors, rather than for their well-established purpose of imaging primary tumor tissues.
- these devices are subject to ambient interferences such as those associated with broadband ambient light, wound bed fluids, and electromagnetism.
- these interferences, together with the high frame rates required for fluorescence imaging, degrade the detection limits and sensitivities of the devices.
- One approach to avoiding ambient light interference without dimming operating room lights is to modulate the excitation light away from ambient frequencies.
- a cabinet biopsy imager (U.S. Patent Application No. 2016/0245753) can be used to isolate a sample from ambient interferences to provide higher sensitivity.
- the sensitivity of an imaging system is particularly important for margin assessment when a micro-extension of tumor tissue can reverse a negative result.
- microdose administration of imaging agents and low concentrations of the cellular receptors targeted by dye conjugates also demand high fluorescence detection sensitivity from the imaging system.
- Several aspects of imaging have been identified as targets for improving signal sensitivity in fluorescence systems, including optimization of fluorescence excitation sources and emission filters, improvement of light collection efficiency, elimination of ambient light interferences, modification of the image sensor, and suppression of system noise. While each of these can positively influence the sensitivity of fluorescence imaging, they can also negatively influence the video-rate performance of image-guided systems.
- Video rate or high-frame-rate imaging offers real-time viewing capability of a subject such as a surgical patient or tumor biopsy sample. Multi-channel co-registration of a reflected white light image with a fluorescence image can produce an overlaid image useful in tumor assessment.
- the typical performance characteristics of these overlays of reflected light and fluorescence images sacrifice real-time motion sensitivity as a compromise to boost fluorescence sensitivity. This is the case for systems with charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS), electron-multiplied CCD, or intensified CCD sensors.
- improvements to frame rate negatively impact the sensitivity of the imaging system by reducing the signal collection time of each frame, reducing the available data processing time, and increasing the read noise of the imaging sensor.
- provided herein are imaging methods, computer programs, and systems for viewing a subject from multiple angles or orientations while adapting imaging quality and imaging function in real time in response to motion.
- the provided methods and systems maintain the capability of providing a real-time preview during image-guided navigation, while also including high-sensitivity image properties when an adaptive function is turned on.
- Motion-sensitive feedback from, for example, the movement of an imager is used to assess user-controlled motion commands.
- the imager is switched to an adaptive or enhanced configuration that can include additional imaging modalities, enhanced image quality, longer integration time for a better sensitivity, overlapping of channels, noise filtering, color changes, computational results, or other alterations.
- the adaptive imaging configuration can allow for better detection of faint signals at the location of interest that would otherwise be difficult to appreciate with the original imaging properties in use during the real-time preview and prior to the motion-adaptive switch.
- the capability to interactively review a sample in a real-time 3D image, with optimal fluorescence imaging sensitivity adaptively triggered in response to a change in motion, has great potential for providing important information for the vetting of surgical samples.
- adaptive functional imaging with modalities of X-ray and high-sensitivity fluorescence imaging can significantly increase the efficiency, reduce the sampling error, and increase the diagnostic accuracy of intraoperative margin assessment in cancer surgery.
- the utility of a biopsy imager can be improved by providing an interactive preview of a specimen with useful additional information enabled by adaptive imaging modalities.
- One provided method for detecting and presenting faint signals from a real-time, interactive imager includes recording using a camera a first real-time view of a subject.
- the camera has a camera location and a camera orientation, and the subject has a subject location and a subject orientation.
- the camera also has a frame frequency, and the frame frequency when recording the first real-time view is a first frame frequency.
- the method further includes rendering for display the first real-time view.
- the method further includes changing the camera location or the camera orientation or the subject location or the subject orientation in response to an operator command.
- the method further includes monitoring a rate of movement in response to the operator command.
- the method further includes sending a trigger to the camera based upon the rate of movement descending below a target rate.
- the method further includes switching the frame frequency from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency.
- the method further includes capturing using the camera a second real-time view of the subject, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame.
- the method further includes outputting for display the second real-time view.
- the camera captures fluorescence images of the subject.
- the camera is a first camera, and a reflected white light real-time view is photographed with a second camera.
- the camera records reflected white light images with the first frame frequency, and the camera captures fluorescence images with the second frame frequency.
- the outputting includes overlaying a reflected white light image of the subject over the second real-time view of the subject, wherein the reflected white light image was recorded or photographed prior to the sending of the trigger.
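As an illustration of this overlay step, the following sketch blends a pseudo-colored fluorescence frame over a previously captured reflected white light frame. It is a minimal example assuming both views are already co-registered 8-bit arrays; the green pseudo-color and the blending weight are arbitrary illustrative choices, not values from this disclosure.

```python
import numpy as np

def overlay_fluorescence(white_light_rgb, fluorescence_gray, alpha=0.6):
    """Blend a pseudo-colored fluorescence frame over a reflected white light
    frame that was recorded before the trigger.

    `white_light_rgb` is an (H, W, 3) uint8 image; `fluorescence_gray` is an
    (H, W) uint8 image already registered to the same viewpoint.
    """
    # Pseudo-color the fluorescence channel (green, chosen arbitrarily).
    color = np.zeros_like(white_light_rgb)
    color[..., 1] = fluorescence_gray
    # Additive blend so bright fluorescent regions stand out on the white
    # light anatomy without dimming the rest of the scene.
    blended = np.clip(white_light_rgb.astype(np.float32) + alpha * color, 0, 255)
    return blended.astype(np.uint8)
```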
- the recording of the first real-time view includes a first illumination of the subject
- the capturing of the second real-time view includes a second illumination of the subject, wherein the second illumination is different from the first illumination.
- the first illumination has a first wavelength
- the second illumination has a second wavelength, wherein the second wavelength is different from the first wavelength.
- the first illumination has a first intensity
- the second illumination has a second intensity, wherein the second intensity is different from the first intensity.
- the first illumination has a first temporal modulation
- the second illumination has a second temporal modulation, wherein the second temporal modulation is different from the first temporal modulation.
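One way to picture these first/second illumination embodiments is as two illumination configurations that are swapped when the trigger fires. The sketch below is only an assumed representation; the dataclass fields, the example wavelengths, and the `light_source.apply()` call are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Illumination:
    wavelength_nm: float   # e.g., broadband white light vs. NIR excitation
    intensity: float       # relative drive level, 0.0-1.0
    modulation_hz: float   # 0.0 for continuous light, >0 for temporal modulation

# Hypothetical preview and enhanced illumination settings (illustrative values).
PREVIEW_LIGHT = Illumination(wavelength_nm=550.0, intensity=1.0, modulation_hz=0.0)
ENHANCED_LIGHT = Illumination(wavelength_nm=785.0, intensity=0.6, modulation_hz=120.0)

def select_illumination(light_source, triggered):
    """Apply the second illumination once the motion trigger fires."""
    light_source.apply(ENHANCED_LIGHT if triggered else PREVIEW_LIGHT)
```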
- the capturing of the second real-time view includes application of a noise filtering algorithm or a noise averaging algorithm.
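A minimal sketch of such a noise-averaging step, assuming a short burst of co-registered frames captured at the second (lower) frame frequency; the function name and the use of NumPy are illustrative only.

```python
import numpy as np

def average_frames(frames):
    """Average a short burst of frames to suppress uncorrelated sensor noise.

    `frames` is a sequence of 2-D arrays of identical shape. Averaging N
    frames reduces uncorrelated read/shot noise by roughly sqrt(N) while
    leaving a steady fluorescence signal unchanged.
    """
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    return stack.mean(axis=0)
```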
- the camera has a resolution.
- the recording of the first real-time view includes recording using the camera, wherein the resolution is a first resolution.
- the capturing of the second real-time view includes capturing using the camera, wherein the resolution is a second resolution.
- the method further includes altering the resolution from the first resolution to the second resolution based on the trigger, wherein the second resolution is higher than the first resolution.
- the camera has a field of view.
- the recording of the first real-time view includes recording using the camera, wherein the field of view is a first field of view.
- the capturing of the second real-time view includes capturing using the camera, wherein the field of view is a second field of view.
- the method further includes modifying the field of view from the first field of view to the second field of view based on the trigger, wherein the second field of view is smaller than the first field of view.
- the method further includes turning off a light directed at the subject based on the trigger. In some embodiments, the method further includes taking an X-ray image, a near-infrared image, or a radioisotope image based on the trigger.
- the monitoring includes registering a presence or an absence of an operator command. In some embodiments, the monitoring includes registering an acceleration of the sample or an acceleration of the camera. In some embodiments, the monitoring includes registering a difference between sequential frames of the first real-time view.
- the subject is a biological sample.
- the biological sample is an in vivo sample.
- the biological sample is an ex vivo sample.
- the biological sample is from a mammalian subject.
- the biological sample comprises a tumor.
- the operations include recording using a camera a first real-time view of a subject.
- the camera has a camera location and a camera orientation, and the subject has a subject location and a subject orientation.
- the camera has a frame frequency, and the frame frequency is a first frame frequency.
- the operations further include rendering for display the first real-time view.
- the operations further include changing the camera location or the camera orientation or the subject location or the subject orientation in response to an operator command.
- the operations further include monitoring a rate of movement in response to the operator command.
- the operations further include sending a trigger to the camera based upon the rate of movement descending below a target rate.
- the operations further include switching the frame frequency from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency.
- the operations further include capturing using the camera a second real-time view of the subject, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame.
- the operations further include outputting for display the second real-time view.
- the system includes at least one processor, and a memory operatively coupled with the at least one processor.
- the at least one processor executes instructions from the memory including program code for recording using a camera a first real-time view of a subject.
- the camera has a camera location and a camera orientation, and the subject has a subject location and a subject orientation.
- the camera has a frame frequency, and the frame frequency is a first frame frequency.
- the instructions further include program code for rendering for display the first real-time view.
- the instructions further include program code for changing the camera location or the camera orientation or the subject location or the subject orientation in response to an operator command.
- the instructions further include program code for monitoring a rate of movement in response to the operator command.
- the instructions further include program code for sending a trigger to the camera based upon the rate of movement descending below a target rate.
- the instructions further include program code for switching the frame frequency from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency.
- the instructions further include program code for capturing using the camera a second real-time view of the subject, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame.
- the instructions further include program code for outputting for display the second real-time view.
- FIG. 1 is an illustration of an imaging system in accordance with an embodiment.
- FIG. 2 is an illustration of methods for entering interactive motion commands in accordance with an embodiment.
- FIG. 3 is an illustration of motion-adaptive images rendered for display in accordance with an embodiment.
- FIG. 4 is a flowchart of a process for detecting and presenting faint signals from a real-time interactive imager in accordance with an embodiment.
- FIG. 5 is a time course for the use of a single camera to switch from a reflected light preview to a high sensitivity fluorescence mode in accordance with an embodiment.
- FIG. 6 is a flowchart of an adaptive imaging process in accordance with an embodiment.
- FIG. 7 is a time course for a multi-modal system with multiple motion-adaptive functions in accordance with an embodiment.
- FIG. 8 is a flowchart of a motion-adaptive multi-angle X-ray imaging process in accordance with an embodiment.
- Embodiments of the present invention relate in part to motion-adaptive switching of imaging functions, such as those, for example, used for surgical or biopsy imaging.
- the provided methods and systems can monitor for changes in motion, and responsively enhance or modify real-time images being captured or rendered for display. This can be accomplished by analyzing user control behaviors to determine when and how to optimize imaging results, while simultaneously responding to inputs from the user who is interactively directing a viewing motion around areas of interest.
- the user can use motion control to direct real-time 3-D imaging of a sample.
- the motion-adaptive imaging can then maintain a real-time, high-speed preview as the image is in motion, while automatically switching to an enhanced functional image when the image is relatively stationary. Because the imaging needs during movement and when stationary are often different, or even contradictory, the ability to automatically detect and respond to changes in motion is a valuable feature of the provided imaging methods and systems.
- functional imaging parameters or functional imaging channels can be switched on as enhancements to provide additional information such as enhanced imaging quality, longer integration for better sensitivity, additional imaging modalities, X-ray shots, overlapping of channels, filtering applications, color changes, adding computational results, and augmented information.
- the methods and systems can therefore be particularly helpful in identifying and characterizing areas of interest in real-time to assist in the localization of disease tissue in either a surgical suite or a pathology lab.
- FIG. 1 illustrates one embodiment as a descriptive example. Shown is a sample positioning module 101 that includes an imaging stage 102. The imaging stage has a first rotational axis 103 and a second rotational axis 104. A biological sample 105 is shown being supported by the imaging stage. The biological sample is within an imaging volume 106 proximate to the imaging stage. Also shown is an optical imaging module 107 that includes a camera 108 configured to have a depth of focus within the imaging volume. The camera records a first real-time view 109 of the biological sample using a first frame frequency, and the first real-time view is rendered for display as shown. While the camera is recording with the first frame frequency, the sensitivity of the camera to faint signals, such as those of the fluorescent region 110 of the biological sample, is low.
- the rotatable imaging stage 102 supporting the biological sample 105 is equipped with rotational motors to control the view angle and position of the sample within the imaging volume 106.
- the stage can allow the imager 107 to efficiently provide a substantially full-rotation 3D image.
- a first rotational axis can, for example, provide nearly 360-degree movement along the z-axis (roll) relative to the sample.
- a second rotational axis can, for example, tilt along the y-axis (pitch) for imaging at different perspectives. Tilting of the sample stage also can allow projection views from the top and the bottom of the sample via a transparent window.
- the rotational imaging stage can also be moved in an X-Y plane to allow for the registration of the sample to the center of the imaging volume.
- while the rotational stage 102 is in motion above a selected target motion rate, the camera 108 records the first real-time view 109 of the biological sample 105 using the first frame frequency.
- when the rate of movement descends below the target motion rate, a trigger is sent to the camera, and the frame frequency of the camera is switched from the first frame frequency to a second, lower frame frequency.
- the camera then captures a second real-time view 111 of the biological sample.
- because of the lower frame frequency used to capture and output this second real-time view, the amount of light collected in each frame is increased, and the sensitivity to the faint signals of the fluorescent region 110 is increased. Accordingly, the intensity of the perceived fluorescent signal is greater in the second real-time view than in the first real-time view.
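The sensitivity gain can be seen with a rough, back-of-the-envelope calculation. The numbers below are illustrative and assume the exposure fills the frame period with negligible readout overhead; they are not figures taken from this disclosure.

```python
# Illustrative relation between frame frequency and light collected per frame.
first_fps, second_fps = 30.0, 2.0                    # assumed example values
exposure_first_ms = 1000.0 / first_fps               # ~33 ms per preview frame
exposure_second_ms = 1000.0 / second_fps             # 500 ms per enhanced frame
light_gain = exposure_second_ms / exposure_first_ms  # ~15x more light per frame
```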
- the camera or imager can be configured to have any functional lens assembly, focal length, and exposure control sufficient to record the first real-time view of the subject.
- the camera or imager can record the first real-time view using an electronic image sensor.
- the camera or imager can include a charge-coupled device (CCD) and/or a complementary metal-oxide-semiconductor (CMOS) sensor.
- the first real-time view is monochromatic.
- the first real-time view includes two or more colors.
- the camera can have an actively or passively cooled heat exchanger to maintain imaging sensors at low temperatures. The cooling can reduce background noise such as dark noise or blooming. Exemplary camera and optical components are described in U.S. Pat. Nos. 7,286,232, 8,220,415, and 8,851,017.
- the first and second real-time views represent depictions of the subject as seen from the viewpoint, such that there is minimal lag between the time of recording, capturing, or photographing the subject, and the time of rendering or outputting the views for display.
- the lag can be less than 5 seconds, less than 4 seconds, less than 3 seconds, less than 2 seconds, less than 1 second, less than 900 milliseconds, less than 800 milliseconds, less than 700 milliseconds, less than 600 milliseconds, less than 500 milliseconds, less than 400 milliseconds, less than 300 milliseconds, less than 200 milliseconds, or less than 100 milliseconds.
- the camera used to record, capture, or photograph the real-time views of the subject has a frame frequency.
- the frame frequency can be at least 1 frame per second, at least 4 frames per second, at least 8 frames per second, at least 16 frames per second, at least 24 frames per second, at least 25 frames per second, at least 30 frames per second, at least 48 frames per second, at least 50 frames per second, at least 60 frames per second, at least 72 frames per second, at least 90 frames per second, at least 100 frames per second, at least 120 frames per second, at least 144 frames per second, or at least 240 frames per second.
- the frame frequency can be less than 1 frame per second.
- the frame frequency can be a relatively high frame frequency, sometimes referred to as a video rate. Typically, video rates are greater than or equal to 16 frames per second, with higher rates providing a smoother perception of motion.
- the rendered or output real-time views for display can be identical to images recorded, captured, or photographed using the camera.
- the rendered or output real-time views can be constructions based on information in the recorded, captured, or photographed images.
- the rendered or output real-time views can contain images or information collected with one channel or modality.
- the rendered or output real-time views can overlay images or information collected with two or more channels or modalities.
- a rendered image can overlay reflected light information showing a visible light view of the subject and X-ray information showing locations of radiodense regions within the biological sample.
- the models are co-registered in three-dimensional space so that the image presents information for each modality as seen from a single viewpoint.
- the subject can be any person, animal, plant, or object to be imaged.
- the subject can be a biological sample.
- the biological sample is one or more organisms.
- the biological subject can be one or more microscopic organisms.
- the biological sample can be a plant.
- the biological sample can be an animal.
- the biological sample can be a mammal.
- the biological sample can be a human.
- the biological sample is a tissue, organ, or other subsection of an organism.
- the biological sample can be an in vivo sample that is part or all of one or more living organisms.
- the biological sample can be an individual living microorganism.
- the biological sample can be a community of two or more living microorganisms.
- the biological sample can be a living plant or animal.
- the biological sample can be a living mammal.
- the biological sample can be a living human.
- the biological sample can be a tissue, organ, or other subsection of a living human. In some embodiments, the biological sample is a region of an animal or human undergoing a surgical or other medical operation or analysis.
- the biological sample can be an ex vivo sample of a tissue, organ, or other subsection removed from a plant or animal.
- the biological sample can be a tissue, organ, or other subsection removed from a mammal.
- the biological sample can be a tissue, organ, or other subsection removed from a human.
- the biological sample is a resected human tissue sample.
- the biological sample is a biopsy sample extracted from a human.
- the biological sample can be of a mammalian subject.
- the mammalian subject can be, for example, rodent, canine, feline, equine, ovine, porcine, or a primate.
- the mammalian subject can be human.
- the subject can be a patient suffering from a disease.
- the subject is a cancer patient.
- the biological sample comprises a tumor, such as tumor tissue or cells.
- the biological sample comprises a peripheral biopsy of a tissue sample previously removed.
- the biological sample is tumor tissue such as a breast core biopsy.
- the biological sample size can be as small as a tissue slice.
- the subject and the camera each have a location and an orientation.
- the camera is oriented such that it is pointing towards the subject, and the camera is located such that the subject is within the focal length of the camera.
- the orientation of the subject can be changed.
- the location of the subject can be changed.
- the orientation and/or location of the camera can change.
- the provided methods and systems can be applied, for example, to a cabinet biopsy imager or an image-guided surgical system.
- the changing of the location or orientation of the subject can occur when the subject is, for example, held on a sample stage that is interactively moved for 3D biopsy imaging.
- when the subject is a patient or a region of a patient within an operating room with an external or endoscopic imager, it is more likely that the subject will remain relatively stationary while the camera location or orientation is changed.
- a camera on a moving arm can provide different view angles at a wound bed.
- a camera can be used with a robotic image-guided surgical system. In each of these cases, the changing positions and orientations are used to enable 3D imaging of the subject.
- 3D imaging is a broadly-used common term representing the addition of depth perception to 2D images, including those from clinical areas such as computed tomography (CT), magnetic resonance imaging (MRI), and ultrasonography (Kopans (2014) Amer. J. Roentgenology 202:299; Ciatto et al. (2013) Lancet Oncology 14:583; Karatas and Toy (2014) Eur. J. Dentistry 8:132). It has also been applied in biomedical spaces such as those of microscopy, photoacoustic imaging, optical coherence tomography, spectroscopic imaging, and near-infrared imaging. In the imaging and film industries, particularly for entertainment purposes, 3D imaging is well-known as a technique for displaying depth perception in projected images in movie theaters or other viewing environments.
- 3D images can be produced through holography, which records a light field through an interference pattern and reproduces a spatial distribution of the light field in three dimensions.
- 3D imaging refers to the regeneration of the geometrical information of a subject via computing algorithms and calculations based on recorded data.
- 3D images can also simply be a stacking of optically sectioned X-Y 2D views, such as Z-stacking of images from confocal microscopy.
- 3D images are produced using a stereoscopic effect by re-creating binocular vision to both left and right eyes at the same time to create the perception of 3D depth of an image.
- 3D imaging is often applied to movies, gaming, and entertainment applications.
- 3D images can be the computed representation of a subject formed in a computer-generated model based on either multiple real images or created virtual content.
- Such technologies have been used, for example, in the fields of virtual reality, mixed reality, and augmented reality.
- 3D pictures and 3D videos can be panoramic images obtained by recording views surrounding an observer by up to 360 degrees.
- 3D images can also offer only a sectional view of a part of an object, rather than a visualization of the full object. This is particularly the case in optical coherence tomography (OCT), scanning microscopy, and photoacoustic microscopy, where imaging depth and scanning coverage are very limited in their ability to capture the full view of a large subject.
- 3D imaging refers to the multi-angle visualization of a subject.
- the dynamic viewing from different angles or a collection of different view perspectives of an object gives the perception of depth, geometry, dimensions, features, and location of the subject.
- the imaging and dynamic viewing in multiple angles and orientations enables a 3D perception that is not available from traditional 2D representations such as a static picture or a static view angle.
- FIG. 2 illustrates examples of interactive motion controls for changing the location and/or orientation of the subject and/or camera. Shown is a display 201 showing a real-time view of a subject 202.
- operator commands to change a location or orientation are entered by touching the screen with a finger 203 or a device such as a touch pen 204.
- the touch commands can include pressing areas of the display showing virtual buttons 205, 206 for rotating the view in opposite directions about a first axis, and virtual buttons 207, 208 for rotating the view in opposite directions about a second axis that is orthogonal to the first axis.
- the touching commands can include swiping, pinching, or spreading gestures to change the view by rotating, zooming, or panning.
- operator commands to change a location or orientation include moving a cursor 209 on the display by using physical buttons, a computer mouse, a scroll ball, a control stick, switches, or a handheld controller. Operator commands for movement can also be entered using waving gestures, voice activation, or accelerometers.
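Whatever the input device, these operator commands can be reduced to a small set of motion requests that the positioning module executes. The mapping below is an assumed illustration; the event names and the `stage.rotate/pan/zoom` methods are not interfaces defined in this disclosure.

```python
# Hypothetical mapping of interactive inputs to stage/camera motion requests.
COMMAND_MAP = {
    "button_rotate_left":  ("rotate", {"axis": "roll",  "degrees": -5.0}),
    "button_rotate_right": ("rotate", {"axis": "roll",  "degrees": +5.0}),
    "button_tilt_up":      ("rotate", {"axis": "pitch", "degrees": +5.0}),
    "button_tilt_down":    ("rotate", {"axis": "pitch", "degrees": -5.0}),
    "swipe":               ("pan",    {"dx": 0.0, "dy": 0.0}),  # drag deltas filled at runtime
    "pinch":               ("zoom",   {"direction": "out"}),
    "spread":              ("zoom",   {"direction": "in"}),
}

def dispatch(event_name, stage):
    """Translate a UI event into a motion request on the positioning stage."""
    action, params = COMMAND_MAP.get(event_name, (None, {}))
    if action is not None:
        getattr(stage, action)(**params)
```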
- the rate of movement of one or both of the subject and the camera can be monitored by registering the occurrence of operator commands.
- the rate of movement can be monitored by registering the presence or the absence of operator commands.
- the movement slows or halts when an operator stops inputting a command for rotating, panning, or zooming the real-time view.
- the movement slows or halts when an operator inputs a command for slowing or halting a rotating, panning, or zooming of the real-time view.
- the rate of movement of one or both of the subject and the camera can be monitored by registering the acceleration of the subject and camera, respectively.
- the camera includes a positioning system or one or more accelerometers that measure the location and movement rate of the camera.
- the camera is connected to a support system such as, for example, an arm, that includes a positioning system or one or more accelerometers that measure the location and movement rate of the camera.
- the subject is connected to an imaging platform, such as, for example, a rotational stage, that includes a positioning system or one or more accelerometers that measure the location and movement rate of the subject.
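When the stage or camera support carries accelerometers, the rate of movement can be estimated directly from the sensor stream rather than from the images. A minimal assumed sketch (the gravity handling here is deliberately crude):

```python
import math

def movement_rate_from_accelerometer(samples, gravity=9.81):
    """Estimate a movement rate from a burst of accelerometer samples.

    `samples` is a sequence of (ax, ay, az) readings in m/s^2. The gravity
    component is removed crudely by subtracting the nominal magnitude; the
    mean residual magnitude serves as the monitored rate of movement.
    """
    residuals = [abs(math.sqrt(ax * ax + ay * ay + az * az) - gravity)
                 for ax, ay, az in samples]
    return sum(residuals) / len(residuals) if residuals else 0.0
```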
- the rate of movement of one or both of the subject and the camera can be monitored by registering changes between two or more frames of the real-time view.
- the rate of movement of one or both of the subject and the camera is monitored by registering a difference between sequential frames of the real-time view.
- the registering of changes between frames of the real-time view can be accomplished with any algorithm suitable for video frame comparison or matching.
- the rate of movement is monitored by applying a block-matching algorithm, a phase correlation algorithm, a pixel recursive algorithm, an optical flow algorithm, or a corner detection algorithm.
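As one concrete realization of the frame-comparison approach, the per-frame motion can be approximated by the fraction of pixels that changed appreciably between sequential frames; block matching, phase correlation, or optical flow would be drop-in substitutes. The sketch below is a simplified stand-in, not an algorithm specified in this disclosure.

```python
import numpy as np

def frame_difference_rate(prev_frame, curr_frame, threshold=0.05):
    """Fraction of pixels whose intensity changed by more than `threshold`
    between two sequential frames (both 2-D arrays scaled to [0, 1]).

    Returns 0.0 for a static view and approaches 1.0 when the whole view moves.
    """
    prev = prev_frame.astype(np.float32)
    curr = curr_frame.astype(np.float32)
    changed = np.abs(curr - prev) > threshold
    return float(changed.mean())
```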
- a trigger can be sent to the camera when the monitored rate of movement of one or both of the subject and the camera descends below a target rate.
- the target rate can be expressed as a target degree of movement occurring in a target period of time.
- the target degree of movement can be, for example, 25%, 20%, 15%, 10%, 5%, 4%, 3%, 2%, or 1%.
- the target period of time can be, for example, 5 seconds, 4.5 seconds, 4 seconds, 3.5 seconds, 3 seconds, 2.5 seconds, 2 seconds, 1.5 seconds, 1 second, or 0.5 seconds. In some embodiments, the target rate of movement is 15% in 1.5 seconds.
- the camera switches to a motion-adaptive imaging function. In some embodiments, the camera switches its frame frequency from a first frame frequency to a lower second frame frequency based on the trigger.
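One way to express the target degree of movement within a target period of time is to accumulate the per-frame motion estimate over a sliding window and report when the accumulated movement is below the target, using the 15%-in-1.5-seconds example given above. The class below is one assumed interpretation of that condition, built on the frame-difference estimate sketched earlier.

```python
from collections import deque
import time

class MotionTrigger:
    """Report True when accumulated movement over a sliding time window is
    below a target fraction (e.g., 15% of movement within 1.5 seconds)."""

    def __init__(self, target_fraction=0.15, window_seconds=1.5):
        self.target_fraction = target_fraction
        self.window_seconds = window_seconds
        self.samples = deque()  # (timestamp, per-frame motion) pairs

    def update(self, per_frame_motion, now=None):
        now = time.monotonic() if now is None else now
        self.samples.append((now, per_frame_motion))
        # Drop samples that have aged out of the window.
        while self.samples and now - self.samples[0][0] > self.window_seconds:
            self.samples.popleft()
        total_motion = sum(m for _, m in self.samples)
        return total_motion < self.target_fraction  # True => send the trigger
```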
- FIG. 3 illustrates an embodiment in which touch gestures are used to enter operator commands for moving the subject or camera, and an adaptive imaging mode increases the amount of light collected in each frame in response to the rate of movement dropping below a target rate.
- the camera or the subject is moved such that the real-time view changes from the first image 301 to the second 302, third 303, and then fourth image 304, presenting different viewpoints of the subject in the 3D imaging visualization.
- the rate of movement in response to the operator commands is monitored by any of the techniques discussed above.
- a trigger is sent to the camera, and the frame frequency of the camera is changed to a lower frame frequency. Because of this lower frame frequency associated with the motion-adaptive imaging function, the camera collects more light in each frame, and the resulting fifth image 307 shows a stronger response to faint fluorescence signals emitted from the subject.
- a pause of motion input (or a gesture, or voice, or release of pressure on the touch screen, or increase of pressure on the screen, or a prolonged holding on the screen) at an area of interest indicates the expectation of higher-sensitivity imaging to interrogate the disease tissue distribution in detail.
- the computer then responsively triggers the adaptive imaging functions.
- the changes to the camera in response to the trigger signal are not limited to those affecting frame frequency rate, and can also include changes to imaging modes, channels, speeds, resolutions, or sensitivities.
- the changes can also include adjustments to the illumination of the subject, or to the processing of recorded, captured, or photographed information.
- the motion-adaptive imaging function of the camera can include one or more of different integration times of imaging pixels on the image sensor, filtering algorithms, noise suppression algorithms, or boosts to the signal-to-noise ratio.
- Other changes in the motion-adaptive imaging function can include manipulating the image to show, for example, different colors, contrasts, or overlaid information.
- the motion-adaptive imaging function can add another imaging modality displayed in an overlaid fashion.
- the switch to a motion-adaptive imaging function represents a shift from a high speed preview mode to a high-performance, high-sensitivity mode, with imaging parameters selected to improve or optimize each mode independently.
- FIG. 4 presents a flowchart of a process 400 for detecting and presenting faint signals from a real-time interactive imager.
- a first real-time view of a subject is recorded using a camera, wherein the camera has a camera location and a camera orientation, wherein the subject has a subject location and a subject orientation, wherein the camera has a frame frequency, and wherein the frame frequency is a first frame frequency.
- a first real-time view is rendered for display.
- the camera location or the camera orientation or the subject location or the subject orientation is changed in response to an operator command.
- at 404, a rate of movement in response to the operator command is monitored.
- a trigger is sent to the camera based upon the rate of movement descending below a target rate.
- the frame frequency is switched from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency.
- a second real-time view of the subject is captured using the camera, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame.
- the second real-time view is output for display.
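Putting the steps of FIG. 4 together, the process can be summarized as a simple control loop. The sketch below is a minimal, assumed implementation: `camera` (with `set_frame_frequency` and `grab_frame`) and `display` are hypothetical stand-ins, and the motion monitoring reuses the frame-difference estimator and `MotionTrigger` class sketched earlier; none of these interfaces are defined in this disclosure, and the frame-frequency values are arbitrary examples.

```python
PREVIEW_FPS = 30.0   # first frame frequency: fast, low-sensitivity preview
ENHANCED_FPS = 2.0   # second frame frequency: slow, long-integration frames

def run_motion_adaptive_imaging(camera, display, motion_trigger, motion_rate):
    camera.set_frame_frequency(PREVIEW_FPS)        # record the first real-time view
    previous, enhanced = None, False
    while True:
        frame = camera.grab_frame()
        display.show(frame)                        # render/output the current view
        if previous is not None:
            rate = motion_rate(previous, frame)    # monitor the rate of movement
            is_still = motion_trigger.update(rate)
            if is_still and not enhanced:
                # Movement fell below the target rate: switch to the lower
                # frame frequency so each frame collects more light.
                camera.set_frame_frequency(ENHANCED_FPS)
                enhanced = True
            elif not is_still and enhanced:
                # The operator is moving the view again: restore the preview.
                camera.set_frame_frequency(PREVIEW_FPS)
                enhanced = False
        previous = frame
```

In practice the same branch that changes the frame frequency could also swap illumination, resolution, or field of view, mirroring the other motion-adaptive functions described above.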
- the camera can be a fluorescence camera configured to capture fluorescence images of the subject upon excitation of the subject by a fluorescence excitation light source.
- the fluorescence excitation light source can be any device configured to emit electromagnetic radiation at an excitation wavelength capable of exciting a fluorescent material within the imaging volume.
- the fluorescent material can comprise a fluorophore or fluorescent dye.
- the fluorescence excitation light source can be configured to illuminate the imaging volume, and any sample within, with radiation comprising this excitation wavelength.
- the fluorescence excitation light source emits near-infrared light.
- the illumination of the subject with near-infrared light is performed at one or more wavelengths of from about 650 nm to about 1400 nm.
- wavelengths include, for example, about 700, 725, 750, 775, 800, 825, 850, 875, 900, 910, 920, 930, 940, 950, 960, 970, 980, 990, 1000, 1100, 1200, 1300, and 1400 nm. Sometimes these wavelengths are referred to as being in the NIR-I (between 750 and 1060 nm) and NIR-II (between 1000 nm and 1700 nm) wavelength regions.
- Fluorophore methods utilize molecules that absorb light of one wavelength and emit light of a different wavelength.
- a visible image in combination with a fluorophore (e.g., an infrared or near-infrared fluorophore)
- Filter sets can be used in the optical system to isolate excitation wavelengths with optimized emission collection for corresponding imaging agents.
- the subject comprises a fluorescent dye.
- the fluorescent group is a near-infrared (NIR) fluorophore that emits in the range of between about 650 to about 1400 nm.
- Use of near-infrared fluorescence technology can be advantageous as it substantially eliminates or reduces background from auto fluorescence of tissue.
- Another benefit to near-IR fluorescent technology is that scattered light from the excitation source is greatly reduced since the scattering intensity is proportional to the inverse fourth power of the wavelength. Low background fluorescence and low scattering result in a high signal to noise ratio, which is essential for highly sensitive detection.
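- As a purely illustrative calculation of this inverse-fourth-power relationship (the wavelengths chosen here are arbitrary examples):

```latex
% Rayleigh-type scattering falls off as the inverse fourth power of wavelength,
% so shifting excitation from 500 nm (visible) to 800 nm (NIR) reduces the
% scattered intensity by roughly a factor of 6 to 7.
\[
  I_s \propto \frac{1}{\lambda^{4}}, \qquad
  \frac{I_s(800\,\mathrm{nm})}{I_s(500\,\mathrm{nm})}
  = \left(\frac{500\,\mathrm{nm}}{800\,\mathrm{nm}}\right)^{4} \approx 0.15
\]
```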
- the optically transparent window in the near-IR region (650 nm to 990 nm) or NIR-II region (between about 1000 nm and 1400 nm) in biological tissue makes NIR fluorescence a valuable technology for imaging and subcellular detection applications that require the transmission of light through biological components.
- the fluorescent group is preferably selected from the group consisting of IRDYE® 800RS, IRDYE® 800CW, IRDYE® 800, ALEXA FLUOR® 660, ALEXA FLUOR® 680, ALEXA FLUOR® 700, ALEXA FLUOR® 750, ALEXA FLUOR® 790, Cy5, Cy5.5, Cy7, DY676, DY680, DY682, and DY780 molecular marker.
- the near infrared group is IRDYE® 800CW, IRDYE® 800, IRDYE® 700DX, IRDYE® 700, or Dynomic DY676 molecular marker.
- a fluorescent dye is contacted with a biological sample prior to excising the biological sample from the subject.
- the dye can be injected or administered to the subject prior to surgery or after surgery.
- the dye is conjugated to an antibody, ligand, or targeting moiety or molecule having an affinity to a tumor or recognizes a tumor antigen.
- the fluorescent dye comprises a targeting moiety.
- the surgeon “paints” the tumor with the dye.
- the fluorescent dye is contacted with the biological sample after excising the biological sample from the subject. In this manner, dye can be contacted to the tissue at the margins.
- the targeting molecule or moiety is an antibody that binds an antigen such as a lung cancer cell surface antigen, a brain tumor cell surface antigen, a glioma cell surface antigen, a breast cancer cell surface antigen, an esophageal cancer cell surface antigen, a common epithelial cancer cell surface antigen, a common sarcoma cell surface antigen, or an osteosarcoma cell surface antigen.
- the camera can be a reflected light camera configured to capture white light reflected by the subject.
- the camera can be a gray-scale or true-color imager.
- the camera can be a multichannel imager with a reflective light mode and at least one fluorescence channel.
- the imager can also be combined with X-ray imaging.
- the motion-adaptive method can also be used on an X-ray-only multi-angle, multi-dimensional imager for radiography of a sample.
- the first real-time view and the second real-time view can be recorded, captured, or photographed with the same camera.
- the first real-time view and the second real-time view can be recorded, captured, or photographed with different cameras.
- Each of the first and second real-time views can be recorded with multiple cameras.
- a reflected white light camera with a first frame frequency is used to record the first real-time view, and a fluorescence camera with a second frame frequency is used to capture the second real-time view.
- FIG. 5 presents an exemplary time course for a process 500 using a single camera to adaptively switch from a reflected light preview mode to a fluorescence mode characterized by a long exposure time of the camera image sensor.
- the camera records a first real-time view of a subject using reflected white light and a default frame rate of 30 frames per second.
- the user pauses motion control, and the method enters a 1.5-second waiting time period to determine if the monitored rate of movement descends below a target rate.
- the camera is switched to the motion-adaptive imaging function, with a fluorescence channel of the camera turned on, and the exposure time increased.
- the camera begins capturing the second real-time view of the subject using fluorescence and the second frame rate characteristic of the increased exposure time.
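- A schematic sketch of this waiting-period logic is given below; the 1.5-second window follows the description above, while the camera calls and exposure value are hypothetical placeholders.

```python
# Sketch of the FIG. 5 switch-over: after the user pauses motion control, wait 1.5 s,
# and only switch to the fluorescence long-exposure mode if motion stays below target.
import time

WAIT_PERIOD_S = 1.5

def on_motion_pause(camera, motion_monitor, target_rate):
    deadline = time.monotonic() + WAIT_PERIOD_S
    while time.monotonic() < deadline:
        if motion_monitor.rate_of_movement() >= target_rate:
            return False                      # motion resumed; stay in the 30 fps preview
        time.sleep(0.05)
    camera.enable_channel("fluorescence")     # turn on the fluorescence channel
    camera.set_exposure_ms(500)               # longer exposure implies the lower second frame rate
    return True
```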
- FIG. 6 presents one workflow of an adaptive imaging process.
- a user inputs commands for motion interactively to the imaging system.
- the system receives these interactive controls and provides 3D movement for real-time imaging.
- the interactive motion control commands are entered via touching the screen or other computer input methods such as with the use of a computer mouse, a control stick, a scroll ball, switches, or a handheld controller.
- real-time images, typically at a video rate, are continuously rendered for display.
- when the system recognizes a discontinuation of command inputs within a default period of time, or the release of the touch on the screen, it responsively triggers the adaptive imaging functions.
- These adaptive functions can include increasing the integration time of pixels of the image sensor and applying noise suppression algorithms.
- the result of these adaptive functions is the enhancement of imaging sensitivity in the fluorescence channel which can be overlaid with images from a reflective channel or other imaging modalities. The user will then observe an increase in signal level due to the higher image sensitivity at the fluorescence channel so that localization of disease tissue can be performed.
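- One simple, illustrative way to realize such an overlay is to alpha-blend a pseudocolored fluorescence frame onto the reflective-channel frame; the NumPy-based sketch below assumes a green pseudocolor, which is an arbitrary choice rather than a requirement of the method.

```python
# Overlay an enhanced fluorescence frame (single channel, 0..1) on an RGB
# reflected-light frame (uint8); the green pseudocolor is an arbitrary choice.
import numpy as np

def overlay_fluorescence(reflected_rgb: np.ndarray, fluorescence: np.ndarray,
                         alpha: float = 0.6) -> np.ndarray:
    base = reflected_rgb.astype(np.float32) / 255.0
    signal = np.clip(fluorescence, 0.0, 1.0)[..., None]   # shape (H, W, 1) for broadcasting
    pseudocolor = np.zeros_like(base)
    pseudocolor[..., 1] = signal[..., 0]                   # render the fluorescence signal in green
    blended = (1.0 - alpha * signal) * base + alpha * signal * pseudocolor
    return (np.clip(blended, 0.0, 1.0) * 255).astype(np.uint8)
```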
- X-ray, near-infrared, radioisotope, or ultrasound images can be captured based on the trigger signal.
- X-ray information can be used to show locations of radiodense regions within the subject or biological sample.
- the motion-adaptive imaging methods can then be used with an X-ray biopsy imager to provide responsive shots of, for example, lumpectomy samples at multiple user-selected angles.
- X-ray images can be produced by exposing the subject to X-ray irradiation from an X-ray source, and recording the X-rays with an X-ray imager.
- the X-ray source can be any artificial X-ray source configured to irradiate the imaging volume with X-rays.
- the X-ray source is an X-ray tube.
- the X-ray tube can comprise a rotating anode tube.
- the X-ray source is a solid-anode microfocus X-ray tube or a metal-jet-anode microfocus X-ray tube.
- the X-ray imager can be any device configured to measure the properties of X-rays exiting the imaging volume.
- the X-ray imager can comprise, for example, one or more of a sensitized photographic plate, sensitized photographic film, a photostimulable phosphor plate, a semiconductor or solid state detector, or a scintillator.
- the X-ray imager comprises a scintillator.
- the scintillator can comprise any material that converts an X-ray photon to a visible light photon.
- the scintillator can comprise one or more organic or inorganic compounds.
- the scintillator compounds can comprise, for example, barium fluoride, calcium fluoride doped with europium, bismuth germanate, cadmium tungstate, cesium iodide doped with thallium, cesium iodide doped with sodium, undoped cesium iodide, gadolinium oxysulfide, lanthanum bromide doped with cerium, lanthanum chloride doped with cerium, lead tungstate, lutetium iodide, lutetium oxyorthosilicate, sodium iodide doped with thallium, yttrium aluminum garnet, zinc sulfide, or zinc tungstate.
- the scintillator comprises sodium iodide, gadolinium oxysulfide, or cesium iodide.
- the X-ray imager is an X-ray flat panel detector.
- the flat panel detector can comprise a scintillator material and a photodiode transistor array.
- the flat panel detector can further comprise one or more readout circuits.
- the flat panel detector can comprise a detection face and a display face on opposite sides of the detector from one another.
- the detection face can be directed towards the biological sample and the X-ray source so as to be contacted with X-rays generated by the X-ray source and passing through the imaging volume.
- the display face can be directed towards a camera so that an X-ray image displayed on the display face can be recorded using the camera.
- the X-ray image is displayed on the display face by generating visible light that is recorded by a visible light camera configured to have a depth of focus that corresponds to the distance between the display face and the camera.
- Ultrasonic images can be produced with an ultrasonic transducer array configured to detect ultrasound waves exiting the imaging volume and convert the waves into electrical signals.
- Energy pulses transmitted into the imaging volume can cause a biological sample within to absorb this time-varying energy, inducing the generation of acoustic waves that can be detected by the ultrasonic transducer array.
- the ultrasonic transducer array is in contact with the biological sample via a coupling medium.
- the coupling medium can comprise water or gel to relay ultrasound waves.
- the energy pulses are non-ionizing light pulses and the ultrasonic transducer array can be used to record a photoacoustic image.
- the energy pulses are radio frequency pulses and the ultrasonic transducer array can be used to record a thermoacoustic image. In some embodiments, the energy pulses are ultrasonic pulses, and the ultrasonic transducer array can be used to record an ultrasound image.
- Optical coherence tomography images can be produced with an interferometer configured for tomography of the biological sample within the imaging volume.
- the interferometer is a Michelson interferometer.
- a camera can be configured to detect electromagnetic radiation emitted from the imaging volume for optical coherence tomography of the biological sample.
- the motion-adaptive imaging functions can include one or more noise filtering or noise averaging processes. Any signal processing algorithm suitable for image noise reduction can be used with the provided methods and systems.
- the noise reduction algorithms can include, for example, chroma and luminance noise separation, linear smoothing, anisotropic diffusion, non-local averaging, or wavelet transformations.
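- As a minimal, illustrative example of noise averaging (not the specific algorithm of the described system), several co-registered frames captured while the view is stationary can simply be averaged:

```python
# Temporal noise averaging over N co-registered frames captured while stationary.
import numpy as np

def average_frames(frames: list[np.ndarray]) -> np.ndarray:
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    return stack.mean(axis=0)   # shot-noise SNR improves roughly as sqrt(len(frames))
```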
- FIG. 7 presents an exemplary time course 700 for a multi-modal system that starts with both a white light and a fluorescence channel, and then adaptively switches on multiple functions.
- the camera records a first real-time view of a subject using both reflected white light and fluorescence, each with a frame rate of 15 frames per second.
- the subject is illuminated with a high-frequency modulated illumination that alternates between white light and fluorescence excitation wavelengths.
- the collection of reflected white light and fluorescence emissions is through one or two cameras that can be synchronized with the illumination modulation.
- the user pauses motion control, and the method enters a 1.5-second waiting time period to determine if the monitored rate of movement descends below a target rate.
- the imaging system is switched to multiple motion-adaptive imaging functions. These include the turning off of the white light illumination channel, the switching of the fluorescence camera frame frequency to a lower frequency characteristic of a long exposure mode, the stopping of binning in the image processing of the camera, the starting of the application of a noise filtering algorithm, and the readying of an additional imaging modality for X-ray image capture.
- the fluorescence camera begins capturing the second real-time view of the subject using fluorescence and the second frame rate, and the X-ray imager records an X-ray image for display.
- the imaging system begins to wait for user entry of new motion commands.
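- The alternating illumination phase of this time course can be viewed as an interleaved capture; the sketch below assumes even frames are white light and odd frames are fluorescence, purely as an illustrative convention.

```python
# Demultiplex an interleaved 30 fps stream into two synchronized 15 fps channels,
# assuming illumination alternates white light / fluorescence on successive frames.
import numpy as np

def demultiplex(frames: list[np.ndarray]) -> tuple[list[np.ndarray], list[np.ndarray]]:
    white_light = frames[0::2]    # frames captured under white-light illumination
    fluorescence = frames[1::2]   # frames captured under fluorescence excitation
    return white_light, fluorescence
```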
- FIG. 8 presents a workflow of a motion-adaptive multi-angle X-ray imaging process.
- a user inputs motion commands interactively to the imaging system.
- the system receives these commands and moves a sample stage to provide multi-dimensional movement for real-time reflective-light imaging in an imaging cabinet.
- Real-time images are rendered for display continuously to show the sample orientation inside the imaging cabinet.
- when the system recognizes that the movement rate has dropped, or that command inputs have paused for a period of time, it triggers an X-ray imaging instruction that a user can interrupt. In the absence of interruption, an X-ray image is captured.
- the X-ray image can be output for presentation along with the real-time reflective view in the same overlaid image or separate images. The user can then continue the motion to re-orient the sample or take another X-ray shot.
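- The interruptible X-ray instruction in this workflow can be sketched as a cancellable delayed capture; the imager and command-queue interfaces and the delay value below are hypothetical.

```python
# Schedule an X-ray shot when motion drops; cancel it if a new motion command
# arrives before the delay elapses (the "user can interrupt" behavior).
import time

def maybe_capture_xray(xray_imager, command_queue, delay_s: float = 1.5):
    deadline = time.monotonic() + delay_s
    while time.monotonic() < deadline:
        if command_queue.has_new_motion_command():
            return None                  # interrupted by the user; no exposure taken
        time.sleep(0.05)
    return xray_imager.capture()         # no interruption: capture the X-ray image
```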
- the first and second real-time views of the subject recorded using standard and motion-adaptive imaging functions can each contain one or more imaging modalities.
- the separate modalities can be rendered or output for display in overlaid or separate images.
- the imaging modalities can include reflected white light imaging, fluorescence imaging, X-ray imaging, near-infrared imaging, ultrasonic imaging, optical coherence tomography imaging, or others.
- the real-time views captured using the motion-adaptive functions can be overlaid with an image recorded using the standard imaging functions prior to the sending of the trigger to begin motion-adaptive imaging.
- a reflected white light image of the subject recorded or photographed prior to the sending of the trigger is overlaid with the real-time view of the subject captured subsequent to the sending of the trigger.
- the standard and motion-adaptive imaging functions can each include an illumination of the subject.
- the second illumination for the motion-adaptive imaging subsequent to the sending of the trigger can be different from the first illumination for the standard imaging prior to the sending of the trigger.
- the first and second illuminations, and the light received by cameras and imagers associated with the provided methods and systems can include any wavelength of electromagnetic radiation.
- the illuminations can be provided by one or more illumination sources. Illumination sources can be mounted proximate to the imaging volume in order to illuminate the sample with white light, monochrome light, near-infrared light, fluorescence light, or other electromagnetic radiation. One or more white lights can be used to illuminate the imaging volume.
- the illumination of the subject with visible light is performed at one or more wavelengths of about 380 nm to about 700 nm. These wavelengths include, for example, about 380, 390, 400, 425, 450, 475, 500, 525, 550, 575, 600, 625, 650, 675, or about 700 nm. These wavelengths can occur in combination, such as in broadband white light.
- illumination at one or more wavelengths is stopped by the switch to motion-adaptive imaging.
- a light is turned off by the switch to motion-adaptive imaging.
- the first and second illuminations can have similar or different intensities.
- the first and second illuminations can have similar or different temporal modulations.
- an illumination can include both white light and fluorescence excitation light used to alternately illuminate the subject in a modulated fashion.
- the modulated electromagnetic radiations can differ in amplitude, phase, frequency, or polarization.
- the first and second real-time views of the subject recorded using standard and motion-adaptive imaging functions, respectively, can have similar or different resolutions.
- the real-time views can have similar or different pixel resolutions, spatial resolutions, spectral resolutions, temporal resolutions, or radiometric resolutions.
- the first and second real-time views of the subject recorded using standard and motion-adaptive imaging functions, respectively, can have similar or different fields of view.
- the devices and methods can utilize a computing apparatus that is programmed or otherwise configured to automate and/or regulate one or more steps of the methods or features of the devices provided herein.
- Some embodiments provide machine executable code in a non-transitory storage medium that, when executed by a computing apparatus, implements any of the methods or operates any of the devices described herein.
- the computing apparatus operates one or more power sources, motors, and/or displays.
- a display can be used for review of the real-time views.
- the display can be a touch screen to receive interactive command inputs.
- the display can be a wireless device relaying the information wirelessly.
- first”, “second”, “third”, “fourth”, and “fifth”, and “sixth” when used herein with reference to images, views, frequencies, cameras, illuminations, wavelengths, intensities, modulations, resolutions, fields of view, axes, or other elements or properties are simply to more clearly distinguish the two or more elements or properties and unless stated otherwise are not intended to indicate order.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Surgery (AREA)
- Pathology (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Optics & Photonics (AREA)
- High Energy & Nuclear Physics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
Abstract
Methods and systems are provided for detecting and presenting faint signals from a real-time interactive imager. A real-time view of a subject is controlled by a user by entering commands that move the subject or a camera used to record the real-time view. The rate of this movement is monitored, and when the rate descends below a preset value, a trigger is sent to the camera. In response to the trigger, the camera switches to a lower frame frequency for capturing the real-time view of the subject, thereby increasing the amount of light collected in each frame, and increasing the sensitivity of the camera to faint signals. The methods and systems are particularly relevant for surgical or biopsy imaging, in which both a higher frame rate preview image of a subject for 3D visualization, and a higher sensitivity fluorescence image of the subject for tumor margin assessment, are desired.
Description
- This application is a continuation of U.S. application Ser. No. 15/819,603, filed Nov. 21, 2017, which claims the benefit of U.S. Provisional Patent Application No. 62/425,967, filed Nov. 23, 2016, and U.S. Provisional Patent Application No. 62/489,921, filed Apr. 25, 2017, the full disclosures of which are incorporated herein by reference in their entirety for all purposes.
- NOT APPLICABLE
- The success of oncologic surgery can depend on obtaining negative tumor margins so that no tumor tissues are left behind. Tumor margins are the healthy tissue surrounding the tumor, and more specifically, the distance between the tumor tissue and the edge of the surrounding tissue removed along with the tumor. Ideally, the margins are selected so that the risk of leaving tumor tissue within the patient is low. The current clinical approach to tumor margin assessment involves circumferential analysis of a resected specimen using frozen sections in a lab during surgical time (Chambers et al. (2015) Laryngoscope 125:636; Byers et al. (1978) Amer. J. Surgery 136:525). Tactile and visual measures are traditionally the only methods available in this sampling process. However, complex three-dimensional (3D) primary specimens are often difficult to manipulate and orient for full inspection and processing.
- As a specific example, margin assessment based on frozen sectioning is difficult to implement on breast cancer specimens due to fatty tissues. X-ray imaging and gamma detection are thus used intraoperatively to estimate the border of a tumor by looking at contrasts from inserted biopsy markers such as clips, wires, or seeds which help locate the tumor mass (O'Kelly et al. (2016) J. Surgical Oncology 113:256). X-ray cabinet imagers used in surgical suites can give a two-dimensional (2D) projection view of a sample in the form of a snap shot. However, the 3D localization of inserts within the sample can significantly benefit the assessment (Kopans (2014) Amer. J. Roentgenology 202:299; Ciatto et al. (2013) Lancet Oncology 14:583; Chagpar et al. (2015) Amer. J. Surgery 210:886). The alternative use of a real-time motion-controlled orientation and imaging of a sample, coupled with the use of periodic X-ray shots, can therefore be helpful in viewing samples from different directions efficiently in the surgical suite.
- Fluorescence image-guided surgery and fluorescence image-guided margin assessment are emerging technologies taking advantage of recent developments of fluorescence dyes and tumor targeting imaging agents from translational and clinical research areas (Garland et al. (2016) Cell Chem. Bio. 23:122; Warram (2014) Cancer Metastasis Rev. 33:809). In recent years, several fluorescence imaging systems have been developed and commercialized for image-guided surgery (Zhu and Sevick-Muraca (2015) British J. Radiology 88:20140547; Chi et al. (2014) Theranostics 4:1072; D'Souza et al. (2016) J. Biomed. Opt. 21:80901). Some of these handheld and overhead devices have been tested for use in the margin assessment of tumors, rather than for their well-established purpose of imaging primary tumor tissues. When used in situ with an image-guided system above the wound bed, there exist significant ambient interferences such as those associated with broadband ambient light, wound bed fluids, and electro-magnetism. These interferences, along with a fluorescence imaging requirement of a high frame rate, together produce reduced detection limits or sensitivities of the devices. One approach to avoiding ambient light interferences without dimming operating room lights is the use of modulation of excitation light away from ambient frequencies. Alternatively, a cabinet biopsy imager (U.S. Patent Application No. 2016/0245753) can be used to isolate a sample from ambient interferences to provide a higher sensitivity.
- The sensitivity of an imaging system is particularly important for margin assessment when a micro-extension of tumor tissue can reverse a negative result. In addition, microdose administration of imaging agents, and low concentration of cellular receptors of dye conjugates, would also demand high sensitivity of fluorescence detection of the imaging system. Several aspects of imaging have been identified as targets for improving signal sensitivity in fluorescence systems, including optimization of fluorescence excitation sources and emission filters, improvement of light collection efficiency, elimination of ambient light interferences, modification of the image sensor, and suppression of system noises. While each of these can positively influence the sensitivity of fluorescence imaging, they can also negatively influence the video-rate performance of image-guided systems.
- Video rate or high-frame rate imaging, often with reflected white light, offers real-time viewing capability of a subject such as a surgical patient or tumor biopsy sample. Multi-channel co-registration of this reflected white light image with a fluorescent image can produce an overlaid image useful in tumor assessment. However, the typical performance characteristics of these overlays of reflective light and fluorescence images sacrifice real-time motion sensitivity as a compromise to boost fluorescence sensitivity. This is the case for systems with charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS), electron-multiplied-CCD, or intensified-CCD sensors. In general, improvements to high frame rate imaging negatively impact the sensitivity of the imaging system by reducing signal collection time in each frame as well as by reducing data processing time and increasing read noises of the imaging sensor.
- In general, provided herein are imaging technology methods, computer programs, and systems for viewing a subject from multiple angles or orientations while adapting in real-time the imaging quality and imaging function in response to motion. The provided methods and systems maintain the capability of providing a real-time preview during image-guided navigation, while also including high-sensitivity image properties when an adaptive function is turned on. Motion-sensitive feedback from, for example, the movement of an imager, is used to assess user-controlled motion commands. When motion is substantially reduced or stopped, the imager is switched to an adaptive or enhanced configuration that can include additional imaging modalities, enhanced image quality, longer integration time for a better sensitivity, overlapping of channels, noise filtering, color changes, computational results, or other alterations. In this way, the adaptive imaging configuration can allow for better detection of faint signals at the location of interest that would otherwise be difficult to appreciate with the original imaging properties in use during the real-time preview and prior to the motion-adaptive switch.
- For example, the capability to interactively review a sample in a 3D real-time image, with an optimal fluorescence imaging sensitivity adaptively triggered in response to a change in motion, has great potential for providing important information for the vetting of surgical samples. Furthermore, adaptive functional imaging with modalities of X-ray and high-sensitivity fluorescence imaging can significantly increase the efficiency, reduce the sampling error, and increase the diagnosis accuracy of intraoperative margin assessment in cancer surgery. Similarly, the utility of a biopsy imager can be improved by providing an interactive preview of a specimen with useful additional information enabled by adaptive imaging modalities.
- One provided method for detecting and presenting faint signals from a real-time, interactive imager includes recording using a camera a first real-time view of a subject. The camera has a camera location and a camera orientation, and the subject has a subject location and a subject orientation. The camera also has a frame frequency, and the frame frequency when recording the first real-time view is a first frame frequency. The method further includes rendering for display the first real-time view. The method further includes changing the camera location or the camera orientation or the subject location or the subject orientation in response to an operator command. The method further includes monitoring a rate of movement in response to the operator command. The method further includes sending a trigger to the camera based upon the rate of movement descending below a target rate. The method further includes switching the frame frequency from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency. The method further includes capturing using the camera a second real-time view of the subject, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame. The method further includes outputting for display the second real-time view.
- In some embodiments, the camera captures fluorescence images of the subject. In some embodiments, the camera is a first camera, and a reflected white light real-time view is photographed with a second camera. In some embodiments, the camera records reflected white light images with the first frame frequency, and the camera captures fluorescence images with the second frame frequency. In some embodiments, the outputting includes overlaying a reflected white light image of the subject over the second real-time view of the subject, wherein the reflected white light image was recorded or photographed prior to the sending of the trigger.
- In some embodiments, the recording of the first real-time view includes a first illumination of the subject, and the capturing of the second real-time view includes a second illumination of the subject, wherein the second illumination is different from the first illumination. In some embodiments, the first illumination has a first wavelength, and the second illumination has a second wavelength, wherein the second wavelength is different from the first wavelength. In some embodiments, the first illumination has a first intensity, and the second illumination has a second intensity, wherein the second intensity is different from the first intensity. In some embodiments, the first illumination has a first temporal modulation, and the second illumination has a second temporal modulation, wherein the second temporal modulation is different from the first temporal modulation.
- In some embodiments, the capturing of the second real-time view includes application of a noise filtering algorithm or a noise averaging algorithm.
- In some embodiments, the camera has a resolution. The recording of the first real-time view includes recording using the camera, wherein the resolution is a first resolution. The capturing of the second real-time view includes capturing using the camera, wherein the resolution is a second resolution. The method further includes altering the resolution from the first resolution to the second resolution based on the trigger, wherein the second resolution is higher than the first resolution.
- In some embodiments, the camera has a field of view. The recording of the first real-time view includes recording using the camera, wherein the field of view is a first field of view. The capturing of the second real-time view includes capturing using the camera, wherein the field of view is a second field of view. The method further includes modifying the field of view from the first field of view to the second field of view based on the trigger, wherein the second field of view is smaller than the first field of view.
- In some embodiments, the method further includes turning off a light directed at the subject based on the trigger. In some embodiments, the method further includes taking an X-ray image, a near-infrared image, or a radioisotope image based on the trigger.
- In some embodiments, the monitoring includes registering a presence or an absence of an operator command. In some embodiments, the monitoring includes registering an acceleration of the sample or an acceleration of the camera. In some embodiments, the monitoring includes registering a difference between sequential frames of the first real-time view.
- In some embodiments, the subject is a biological sample. In some embodiments, the biological sample is an in vivo sample. In some embodiments, the biological sample is an ex vivo sample. In some embodiments, the biological sample is from a mammalian subject. In some embodiments, the biological sample comprises a tumor.
- Also provided is a machine-readable non-transitory medium embodying information indicative of instructions for causing the computer processor to perform operations for presenting images. The operations include recording using a camera a first real-time view of a subject. The camera has a camera location and a camera orientation, and the subject has a subject location and a subject orientation. The camera has a frame frequency, and the frame frequency is a first frame frequency. The operations further include rendering for display the first real-time view. The operations further include changing the camera location or the camera orientation or the subject location or the subject orientation in response to an operator command. The operations further include monitoring a rate of movement in response to the operator command. The operations further include sending a trigger to the camera based upon the rate of movement descending below a target rate. The operations further include switching the frame frequency from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency. The operations further include capturing using the camera a second real-time view of the subject, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame. The operations further include outputting for display the second real-time view.
- Also provided is a computer system for presenting images. The system includes at least one processor, and a memory operatively coupled with the at least one processor. The at least one processor executes instructions from the memory including program code for recording using a camera a first real-time view of a subject. The camera has a camera location and a camera orientation, and the subject has a subject location and a subject orientation. The camera has a frame frequency, and the frame frequency is a first frame frequency. The instructions further include program code for rendering for display the first real-time view. The instructions further include program code for changing the camera location or the camera orientation or the subject location or the subject orientation in response to an operator command. The instructions further include program code for monitoring a rate of movement in response to the operator command. The instructions further include program code for sending a trigger to the camera based upon the rate of movement descending below a target rate. The instructions further include program code for switching the frame frequency from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency. The instructions further include program code for capturing using the camera a second real-time view of the subject, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame. The instructions further include program code for outputting for display the second real-time view.
-
FIG. 1 is an illustration of an imaging system in accordance with an embodiment. -
FIG. 2 is an illustration of methods for entering interactive motion commands in accordance with an embodiment. -
FIG. 3 is an illustration of motion-adaptive images rendered for display in accordance with an embodiment. -
FIG. 4 is a flowchart of a process for detecting and presenting faint signals from a real-time interactive imager in accordance with an embodiment. -
FIG. 5 is a time course for the use of a single camera to switch from a reflected light preview to a high sensitivity fluorescence mode in accordance with an embodiment. -
FIG. 6 is a flowchart of an adaptive imaging process in accordance with an embodiment. -
FIG. 7 is a time course for a multi-modal system with multiple motion-adaptive functions in accordance with an embodiment. -
FIG. 8 is a flowchart of a motion-adaptive multi-angle X-ray imaging process in accordance with an embodiment. - Embodiments of the present invention relate in part to motion-adaptive switching of imaging functions, such as those, for example, used for surgical or biopsy imaging. The provided methods and systems can monitor for changes in motion, and responsively enhance or modify real-time images being captured or rendered for display. This can be accomplished by analyzing user control behaviors to determine when and how to optimize imaging results, while simultaneously responding to inputs from the user who is interactively directing a viewing motion around areas of interest. In brief, the user can use motion control to direct real-time 3-D imaging of a sample. The motion-adaptive imaging can then maintain a real-time, high-speed preview as the image is in motion, while automatically switching to an enhanced functional image when the image is relatively stationary. Because the imaging needs during movement and when stationary are often different, or even contradictory, the ability to automatically detect and respond to changes in motion is a valuable feature of the provided imaging methods and systems.
- In many medical imaging applications, it can be beneficial to have an accurate and responsive interactive real-time view of a subject that an operator can use as a navigational guide for examining the subject. It can also be beneficial to have enhanced images of the subject that the operator can use to collect more detailed information related to a view of the subject once navigated to. Often, the qualities of such an enhanced image that make it useful for closer investigations also make the enhanced image poor for use in real-time navigation. For this reason, the switching between different imaging functions can be helpful. In this way, for example, when the movement of a real-time view has substantially stopped, functional imaging parameters or functional imaging channels can be switched on as enhancements to provide additional information such as enhanced imaging quality, longer integration for better sensitivity, additional imaging modalities, X-ray shots, overlapping of channels, filtering applications, color changes, adding computational results, and augmented information. The methods and systems can therefore be particularly helpful in identifying and characterizing areas of interest in real-time to assist in the localization of disease tissue in either a surgical suite or a pathology lab.
-
FIG. 1 illustrates one embodiment as a descriptive example. Shown is a sample positioning module 101 that includes an imaging stage 102. The imaging stage has a first rotational axis 103 and a second rotational axis 104. A biological sample 105 is shown being supported by the imaging stage. The biological sample is within an imaging volume 106 proximate to the imaging stage. Also shown is an optical imaging module 107 that includes a camera 108 configured to have a depth of focus within the imaging volume. The camera records a first real-time view 109 of the biological sample using a first frame rate frequency, and the first real-time view is rendered for display as shown. While the camera is recording with the first frame rate frequency, the sensitivity of the camera to faint signals, such as those of the fluorescent region 110 of the biological sample, is low. - The
rotatable imaging stage 102 supporting the biological sample 105 is equipped with rotational motors to control the view angle and position of the sample within the imaging volume 106. By rotating a sample in two degrees of freedom, the stage works with the imager 107 to efficiently provide a substantially full-rotation 3D image. A first rotational axis can, for example, provide nearly 360-degree movement along the z-axis (roll) relative to the sample. A second rotational axis can, for example, tilt along the y-axis (pitch) for imaging at different perspectives. Tilting of the sample stage also can allow projection views from the top and the bottom of the sample via a transparent window. In some embodiments, the rotational imaging stage can also be moved in an X-Y plane to allow for the registration of the sample to the center of the imaging volume. - While the
rotational stage 102 is in motion above a selected target motion rate, thecamera 108 records the first real-time view 109 of thebiological sample 105 using a first frame rate frequency. When the motion rate of the rotational stage descends below this target rate, a trigger is sent to the camera, and the frame frequency of the camera is switched from the first frame frequency to a second, lower frame frequency. The camera then captures a second real-time view 111 of the biological sample. As a result of the lower frame frequency used to capture and output this second real-time view, the amount of light collected in each frame is increased, and the sensitivity to the faint signals of thefluorescent region 110 is increased. Accordingly, the intensity of the perceived fluorescent signal is greater in the second real-time view than in the first real-time view. - The camera or imager can be configured to have any functional lens assembly, focal length, and exposure control sufficient to record the first real-time view of the subject. The camera or imager can record the first real-time view using an electronic image sensor. The camera or imager can include a charged coupled device (CCD) and/or a complementary metal-oxide-semiconductor (CMOS) sensor. In some embodiments, the first real-time view is monochromatic. In some embodiments, the first real-time view includes two or more colors. The camera can have an actively or passively cooled heat exchanger to maintain imaging sensors at low temperatures. The cooling can prevent optical background noise such as darkness or blooming. Exemplary camera and optical components are described in U.S. Pat. Nos. 7,286,232, 8,220,415, and 8,851,017.
- The first and second real-time views represent depictions of the subject as seen from the viewpoint, such that there is minimal lag between the time of recording, capturing, or photographing the subject, and the time of rendering or outputting the views for display. The lag can be less than 5 seconds, less than 4 seconds, less than 3 seconds, less than 2 seconds, less than 1 second, less than 900 milliseconds, less than 800 milliseconds, less than 700 milliseconds, less than 600 milliseconds, less than 500 milliseconds, less than 400 milliseconds, less than 300 milliseconds, less than 200 milliseconds, or less than 100 milliseconds.
- The camera used to record, capture, or photograph the real-time views of the subject has a frame frequency. The frame frequency can be at least 1 frame per second, at least 4 frames per second, at least 8 frames per second, at least 16 frames per second, at least 24 frames per second, at least 25 frames per second, at least 30 frames per second, at least 48 frames per second, at least 50 frames per second, at least 60 frames per second, at least 72 frames per second, at least 90 frames per second, at least 100 frames per second, at least 120 frames per second, at least 144 frames per second, or at least 240 frames per second. The frame frequency can be less than 1 frame per second. The frame frequency can be a relatively high frame frequency, sometimes referred to as a video rate. Typically, video rates are greater than or equal to 16 frames per second, with higher rates providing a smoother perception of motion.
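- As a simple, purely illustrative calculation of how the frame frequency bounds the light collected in each frame (the two rates are arbitrary examples):

```python
# Per-frame exposure budget is bounded by the frame period (1 / frame frequency).
preview_fps = 30.0
enhanced_fps = 2.0
preview_budget_ms = 1000.0 / preview_fps    # ~33 ms available per frame
enhanced_budget_ms = 1000.0 / enhanced_fps  # 500 ms available per frame
print(f"Up to {enhanced_budget_ms / preview_budget_ms:.0f}x more light per frame "
      f"at {enhanced_fps:g} fps than at {preview_fps:g} fps")
```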
- The rendered or output real-time views for display can be identical to images recorded, captured, or photographed using the camera. The rendered or output real-time views can be constructions based on information in the recorded, captured, or photographed images. The rendered or output real-time views can contain images or information collected with one channel or modality. The rendered or output real-time views can overlay images or information collected with two or more channels or modalities. As a non-limiting example, a rendered image can overlay reflected light information showing a visible light view of the subject and X-ray information showing locations of radiodense regions within the biological sample. Typically, when a rendered image overlays images or information from multiple channels, modalities, or models, the models are co-registered in three-dimensional space so that the image presents information for each modality as seen from a single viewpoint.
- The subject can be any person, animal, plant, or object to be imaged. The subject can be a biological sample. In some embodiments, the biological sample is one or more organisms. The biological subject can be one or more microscopic organisms. The biological sample can be a plant. The biological sample can be an animal. The biological sample can be a mammal. The biological sample can be a human. In some embodiments, the biological sample is a tissue, organ, or other subsection of an organism.
- The biological sample can be an in vivo sample that is part or all of one or more living organisms. The biological sample can be an individual living microorganism. The biological sample can be a community of two or more living microorganisms. The biological sample can be a living plant or animal. The biological sample can be a living mammal. The biological sample can be a living human. The biological sample can be a tissue, organ, or other subsection of a living human. In some embodiments, the biological sample is a region of an animal or human undergoing a surgical or other medical operation or analysis.
- The biological sample can be an ex vivo sample of a tissue, organ, or other subsection removed from a plant or animal. The biological sample can be a tissue, organ, or other subsection removed from a mammal. The biological sample can be a tissue, organ, or other subsection removed from a human. In some embodiments, the biological sample is a resected human tissue sample. In some embodiments, the biological sample is a biopsy sample extracted from a human.
- The biological sample can be of a mammalian subject. The mammalian subject can be, for example, rodent, canine, feline, equine, ovine, porcine, or a primate. The mammalian subject can be human.
- The subject can be a patient suffering from a disease. In some embodiments, the subject is a cancer patient. In certain aspects, the biological sample comprises a tumor, such as tumor tissue or cells. In certain aspects, the biological sample comprises a peripheral biopsy of a tissue sample previously removed. In another aspect, the biological sample is tumor tissue such as a breast core biopsy. The biological sample size can be as small as a tissue slice.
- The subject and the camera each have a location and an orientation. Typically, the camera is oriented such that it is pointing towards the subject, and the camera is located such that the subject is within the focal length of the camera. In some embodiments, as shown in
FIG. 1, the orientation of the subject can be changed. In some embodiments, the location of the subject can be changed. In some embodiments, the orientation and/or location of the camera can change. By changing one or more of the locations and/or orientations of the subject and/or camera, real-time views of different areas of the subject can be captured, recorded, or photographed from multiple viewpoints. - The provided methods and systems can be applied, for example, to a cabinet biopsy imager or an image-guided surgical system. The changing of the location or orientation of the subject can occur when the subject is, for example, held on a sample stage that is interactively moved for 3D biopsy imaging. When the subject is a patient or a region of a patient within an operating room with an external or endoscopic imager, it is more likely that the subject will remain relatively stationary while the camera location or orientation is changed. In one example, a camera on a moving arm can provide different view angles at a wound bed. In another example, a camera can be used with a robotic image-guided surgical system. In each of these cases, the changing positions and orientations are used to enable 3D imaging of the subject.
- 3D imaging is a broadly-used common term representing the addition of depth perception to 2D images, including those from clinical areas such as computed tomography (CT), magnetic resonance imaging (MRI), and ultrasonography (Kopans (2014) Amer. J. Roentgenology 202:299; Ciatto et al. (2013) Lancet Oncology 14:583; Karatas and Toy (2014) Eur. J. Dentistry 8:132). It has also been applied in biomedical spaces such as those of microscopy, photoacoustic imaging, optical coherence tomography, spectroscopic imaging, and near-infrared imaging. In imaging and filming industries particular for entertainment purposes, 3D imaging has been well-known as a technique for displaying depth perception in projected images in movie theaters or other viewing environments.
- A variety of imaging techniques have been termed as “3D.” In some cases, 3D images can be produced through holography, which records a light field through an interference pattern and re-produces a spatial distribution of the light field in three dimensions. For a variety of tomographic methods, 3D imaging refers to the regeneration of the geometrical information of a subject via computing algorithms and calculations based on recorded data. 3D images can also simply be a stacking of optically sectioned X-Y 2D views, such as Z-stacking of images from confocal microscopy. Frequently, 3D images are produced using a stereoscopic effect by re-creating binocular vision to both left and right eyes at the same time to create the perception of 3D depth of an image. Such 3D imaging is often applied to movies, gaming, and entertainment applications. Through imaging reconstruction technologies, 3D images can be the computed representation of a subject formed in a computer-generated model based on either multiple real images or created virtual content. Such technologies have been used, for example, in the fields of virtual reality, mixed reality, and augmented reality. In photography, 3D pictures and 3D videos can be panoramic images obtained by recording views surrounding an observer by up to 360 degrees. 3D images can also offer only a sectional view of a part of an object, rather than a visualization of the full object. This is particularly the case in optical coherence tomography (OCT), scanning microscopy, and photoacoustic microscopy, where imaging depth and scanning coverage is very limited in its ability to capture the full view of a large subject.
- As used herein, 3D imaging refers to the multi-angle visualization of a subject. The dynamic viewing from different angles or a collection of different view perspectives of an object gives the perception of depth, geometry, dimensions, features, and location of the subject. The imaging and dynamic viewing in multi-angles and orientations enables a 3D perception, not appreciated by traditional 2D representations via a static picture or a static view angle.
-
FIG. 2 illustrates examples of interactive motion controls for changing the location and/or orientation of the subject and/or camera. Shown is adisplay 201 showing a real-time view of a subject 202. In some embodiments, operator commands to change a location or orientation are entered by touching the screen with afinger 203 or a device such as atouch pen 204. The touch commands can include pressing areas of the display showingvirtual buttons virtual buttons cursor 209 on the display by using physical buttons, a computer mouse, a scroll ball, a control stick, switches, or a handheld controller. Operator commands for movement can also be entered using waving gestures, voice activation, or accelerometers. - The rate of movement of one or both of the subject and the camera can be monitored by registering the occurrence of operator commands. The rate of movement can be monitored by registering the presence or the absence of operator commands. In some embodiments, the movement slows or halts when an operator stops inputting a command for rotating, panning, or zooming the real-time view. In some embodiments the movement slows or halts when an operator inputs a command for slowing or halting a rotating, panning, or zooming of the real-time view. In some embodiments, the movement quickens or begins when an operator starts inputting a command for rotating, panning, or zooming the real-time view. In some embodiments the movement quickens or begins when an operator stops inputting a command for slowing or halting a rotating, panning, or zooming of the real-time view.
- The rate of movement of one or both of the subject and the camera can be monitored by registering the acceleration of the subject and camera, respectively. In some embodiments, the camera includes a positioning system or one or more accelerometers that measure the location and movement rate of the camera. In some embodiments, the camera is connected to a support system such as, for example, an arm, that includes a positioning system or one or more accelerometers that measure the location and movement rate of the camera. In some embodiments, the subject is connected to an imaging platform, such as, for example, a rotational stage, that includes a positioning system or one or more accelerometers that measure the location and movement rate of the subject.
- The rate of movement of one or both of the subject and the camera can be monitored by registering changes between two or more frames of the real-time view. In some embodiments, the rate of movement of one or both of the subject and the camera is monitored by registering a difference between sequential frames of the real-time view. The registering of changes between frames of the real-time view can be accomplished with any algorithm suitable for video frame comparison or matching. In some embodiments, the rate of movement is monitored by applying a block-matching algorithm, a phase correlation algorithm, a pixel recursive algorithm, an optical flow algorithm, or a corner detection algorithm.
- A trigger can be sent to the camera when the monitored rate of movement of one or both of the subject and the camera descends below a target rate. The target rate can be expressed as a target degree of movement occurring in a target period of time. The target degree of movement can be, for example, 25%, 20%, 15%, 10%, 5%, 4%, 3%, 2%, or 1%. The target period of time can be, for example, 5 seconds, 4.5 seconds, 4 seconds, 3.5 seconds, 3 seconds, 2.5 seconds, 2 seconds, 1.5 seconds, 1 second, or 0.5 seconds. In some embodiments, the target rate of movement is 15% in 1.5 seconds. Upon receiving the trigger signal, the camera switches to a motion-adaptive imaging function. In some embodiments, the camera switches its frame frequency from a first frame frequency to a lower second frame frequency based on the trigger.
-
FIG. 3 illustrates an embodiment in which touch gestures are used to enter operator commands for moving the subject or camera, and an adaptive imaging mode increases the amount of light collected in each frame in response to the rate of movement dropping below a target rate. Shown are a series of fourimages touch gestures 306, the camera or subject are moved such that the real-time view changes from thefirst image 301 to the second 302, third 303, and thenfourth image 304, presenting different viewpoints of the subject in the 3D imaging visualization. The rate of movement in response to the operator commands is monitored by any of the techniques discussed above. When the rate of movement descends below a target rate, a trigger is sent to the camera, and the frame frequency of the camera is changed to a higher frame frequency. Becuase of this higher frame frequency associated with the motion-adaptive imaging function, the camera collects more light in each frame, and the resultingfifth image 307 shows a stronger response to faint fluorescence signals emitting from the subject. - In some embodiments, a pause of motion input (or a gesture, or voice, or release of pressure on the touch screen, or increase of pressure on the screen, or a prolonged holding on the screen) at an area of interest indicates the expectation of higher sensitivity imaging to interrogate the disease tissue distribution in details. The computer then responsively triggers the adaptive imaging functions. The changes to the camera in response to the trigger signal are not limited to those affecting frame frequency rate, and can also include changes to imaging modes, channels, speeds, resolutions, or sensitivities. The changes can also include adjustments to the illumination of the subject, or to the processing of recorded, captured, or photographed information. For example, the motion-adaptive imaging function of the camera can include one or more of different integration times of imaging pixels on the image sensor, filtering algorithms, noise suppression algorithms, or boosts to the signal to noise ratio. Other changes in the motion-adaptive imaging function can include manipulating the image to show, for example, different colors, contrasts, or overlaid information. The motion-adaptive imaging function can add another imaging modality displayed in an overlaid fashion. In some embodiments, the switch to a motion-adaptive imaging function represents a shift from a high speed preview mode to a high-performance, high-sensitivity mode, with imaging parameters selected to improve or optimize each mode independently.
-
FIG. 4 presents a flowchart of a process 400 for detecting and presenting faint signals from a real-time interactive imager. In operation 401, a first real-time view of a subject is recorded using a camera, wherein the camera has a camera location and a camera orientation, wherein the subject has a subject location and a subject orientation, wherein the camera has a frame frequency, and wherein the frame frequency is a first frame frequency. In operation 402, the first real-time view is rendered for display. In operation 403, the camera location or the camera orientation or the subject location or the subject orientation is changed in response to an operator command. In operation 404, a rate of movement in response to the operator command is monitored. In operation 405, a trigger is sent to the camera based upon the rate of movement descending below a target rate. In operation 406, the frame frequency is switched from the first frame frequency to a second frame frequency based on the trigger, wherein the second frame frequency is less than the first frame frequency. In operation 407, a second real-time view of the subject is captured using the camera, wherein the frame frequency of the camera is the second frame frequency, thereby increasing an amount of light collected in each frame. In operation 408, the second real-time view is output for display. A minimal sketch of this loop is given after the following paragraph. - The camera can be a fluorescence camera configured to capture fluorescence images of the subject upon excitation of the subject by a fluorescence excitation light source. The fluorescence excitation light source can be any device configured to emit electromagnetic radiation at an excitation wavelength capable of exciting a fluorescent material within the imaging volume. The fluorescent material can comprise a fluorophore or fluorescent dye. The fluorescence excitation light source can be configured to illuminate the imaging volume, and any sample within, with radiation comprising this excitation wavelength. In some embodiments, the fluorescence excitation light source emits near-infrared light. In certain aspects, the illumination of the subject with near-infrared light is performed at one or more wavelengths of from about 650 nm to about 1400 nm. These wavelengths include, for example, about 700, 725, 750, 775, 800, 825, 850, 875, 900, 910, 920, 930, 940, 950, 960, 970, 980, 990, 1000, 1100, 1200, 1300, and 1400 nm. Sometimes these wavelengths are referred to as being in the NIR-I (between 750 and 1060 nm) and NIR-II (between 1000 nm and 1700 nm) wavelength regions.
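As a rough illustration of operations 401 through 408, the loop below strings the steps together in order. It is a sketch under stated assumptions: the camera, display, motion_monitor, and trigger objects are placeholders with hypothetical method names standing in for the hardware and for the monitoring and trigger logic described above.

```python
def run_process_400(camera, display, motion_monitor, trigger,
                    first_hz: float = 30.0, second_hz: float = 2.0) -> None:
    """Illustrative control loop following operations 401-408."""
    camera.set_frame_frequency(first_hz)              # 401: record at the first frequency
    while True:
        frame = camera.record_frame()
        display.render(frame)                         # 402: render the first view
        # 403-404: the operator moves the camera or subject; the monitor
        # reports how quickly the view is changing.
        rate = motion_monitor.movement_fraction(frame)
        if trigger.update(rate):                      # 405: rate fell below the target
            camera.set_frame_frequency(second_hz)     # 406: lower frame frequency
            second_view = camera.capture_frame()      # 407: more light per frame
            display.render(second_view)               # 408: output the second view
            camera.set_frame_frequency(first_hz)      # resume the preview mode
```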
- Fluorophore methods utilize molecules that absorb light of one wavelength and emit light of a different wavelength. To utilize a visible image in combination with a fluorophore (e.g., an infrared or near-infrared fluorophore), care should be taken to ensure that the spectra of light variously absorbed, reflected, and emitted do not significantly overlap so as to confound differentiation of the components from each other and differentiation of the components from endogenous tissue material. Filter sets can be used in the optical system to isolate excitation wavelengths with optimized emission collection for corresponding imaging agents.
- In certain aspects, the subject comprises a fluorescent dye. In one aspect, the fluorescent group is a near-infrared (NIR) fluorophore that emits in the range of about 650 nm to about 1400 nm. Use of near-infrared fluorescence technology can be advantageous as it substantially eliminates or reduces background from autofluorescence of tissue. Another benefit of near-IR fluorescent technology is that scattered light from the excitation source is greatly reduced, since the scattering intensity is proportional to the inverse fourth power of the wavelength. Low background fluorescence and low scattering result in a high signal to noise ratio, which is essential for highly sensitive detection. Furthermore, the optically transparent window in the near-IR region (650 nm to 990 nm) or NIR-II region (between about 1000 nm and 1400 nm) in biological tissue makes NIR fluorescence a valuable technology for imaging and subcellular detection applications that require the transmission of light through biological components.
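The inverse-fourth-power dependence mentioned above can be made concrete with a short calculation. The snippet below is a simplified, hypothetical model comparing Rayleigh-type scattering at a near-infrared excitation wavelength against a 500 nm visible reference; the reference wavelength and the single-term model are assumptions made only for illustration.

```python
def relative_scattering(wavelength_nm: float, reference_nm: float = 500.0) -> float:
    """Rayleigh-type scattering scales with the inverse fourth power of the
    wavelength; the result is relative to the chosen visible reference."""
    return (reference_nm / wavelength_nm) ** 4

# Under this simple model, scattered excitation light at 800 nm is roughly
# 0.15 times the level at 500 nm, i.e. more than six-fold weaker.
print(round(relative_scattering(800.0), 3))
```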
- In certain aspects, the fluorescent group is preferably selected from the group consisting of IRDYE® 800RS, IRDYE® 800CW, IRDYE® 800, ALEXA FLUOR® 660, ALEXA FLUOR® 680, ALEXA FLUOR® 700, ALEXA FLUOR® 750, ALEXA FLUOR® 790, Cy5, Cy5.5, Cy7, DY676, DY680, DY682, and DY780 molecular markers. In certain aspects, the near-infrared group is IRDYE® 800CW, IRDYE® 800, IRDYE® 700DX, IRDYE® 700, or Dynomic DY676 molecular marker. - In certain aspects, a fluorescent dye is contacted with a biological sample prior to excising the biological sample from the subject. For example, the dye can be injected or administered to the subject prior to surgery or after surgery. In certain aspects, the dye is conjugated to an antibody, ligand, or targeting moiety or molecule having an affinity to a tumor or recognizing a tumor antigen. In certain aspects, the fluorescent dye comprises a targeting moiety. In one aspect, the surgeon "paints" the tumor with the dye. In certain aspects, the fluorescent dye is contacted with the biological sample after excising the biological sample from the subject. In this manner, dye can be contacted to the tissue at the margins.
- In some aspects, the targeting molecule or moiety is an antibody that binds an antigen such as a lung cancer cell surface antigen, a brain tumor cell surface antigen, a glioma cell surface antigen, a breast cancer cell surface antigen, an esophageal cancer cell surface antigen, a common epithelial cancer cell surface antigen, a common sarcoma cell surface antigen, or an osteosarcoma cell surface antigen.
- The camera can be a reflected light camera configured to capture white light reflected by the subject. The camera can be a gray-scale or true-color imager. The camera can be a multichannel imager with a reflected light mode and at least one fluorescence channel. The imager can also be combined with X-ray imaging. The motion-adaptive method can also be used on an X-ray-only multi-angle, multi-dimensional imager for radiography of a sample.
- The first real-time view and the second real-time view can be recorded, captured, or photographed with the same camera. The first real-time view and the second real-time view can be recorded, captured, or photographed with different cameras. Each of the first and second real-time views can be recorded with multiple cameras. In some embodiments, a reflected white light camera with a first frame frequency is used to record the first real-time view, and a fluorescence camera with a second frame frequency is used to capture the second real-time view.
-
FIG. 5 presents an exemplary time course for a process 500 using a single camera to adaptively switch from a reflected light preview mode to a fluorescence mode characterized by a long exposure time of the camera image sensor. During the times of the motion sensing and control mode 501, the camera records a first real-time view of a subject using reflected white light and a default frame rate of 30 frames per second. At time event 502, the user pauses motion control, and the method enters a 1.5-second waiting time period to determine if the monitored rate of movement descends below a target rate. At time events 503 and 504, the imaging system switches the camera to the fluorescence mode and to a lower second frame rate that increases the exposure time. At time event 505, the camera begins capturing the second real-time view of the subject using fluorescence and the second frame rate characteristic of the increased exposure time. -
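The benefit of the longer exposure in the fluorescence mode can be estimated with a back-of-the-envelope calculation. The sketch below assumes the exposure fills each frame period, that the faint signal is shot-noise limited, and an illustrative 2 frames per second for the second frame rate, which is not a value stated for this embodiment.

```python
import math

def snr_gain(preview_fps: float, adaptive_fps: float) -> float:
    """Approximate SNR improvement from the longer per-frame exposure,
    assuming exposure ~ 1/frame rate and shot-noise-limited signals."""
    exposure_ratio = preview_fps / adaptive_fps
    return math.sqrt(exposure_ratio)

# Dropping from the 30 fps preview to an assumed 2 fps long-exposure mode
# collects ~15x more light per frame, for roughly a 3.9x SNR improvement.
print(round(snr_gain(30.0, 2.0), 2))
```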
FIG. 6 presents one workflow of an adaptive imaging process. A user inputs commands for motion interactively to the imaging system. The system receives these interactive controls and provides 3D movement for real-time imaging. The interactive motion control commands are entered by touching the screen or through other computer input methods such as a computer mouse, a control stick, a scroll ball, switches, or a handheld controller. As these commands continue, real-time images, typically at a video rate, are continuously rendered for display. Once the system recognizes a discontinuation of command inputs within a default period of time, or the release of the touch on the screen, the system responsively triggers the adaptive imaging functions. These adaptive functions can include increasing the integration time of pixels of the image sensor and applying noise suppression algorithms. The result of these adaptive functions is the enhancement of imaging sensitivity in the fluorescence channel, which can be overlaid with images from a reflective channel or other imaging modalities. The user will then observe an increase in signal level due to the higher image sensitivity in the fluorescence channel, so that localization of disease tissue can be performed. - In addition to reflected white light imaging and fluorescence imaging, other imaging modalities can be used with the provided methods and systems. For example, X-ray, near-infrared, radioisotope, or ultrasound images can be captured based on the trigger signal. X-ray information can be used to show locations of radiodense regions within the subject or biological sample. The motion-adaptive imaging methods can then be used with an X-ray biopsy imager to provide responsive shots of, for example, lumpectomy samples at multiple user-selected angles.
- X-ray images can be produced by exposing the subject to X-ray irradiation from an X-ray source, and recording the X-rays with an X-ray imager. The X-ray source can be any artificial X-ray source configured to irradiate the imaging volume with X-rays. In some embodiments, the X-ray source is an X-ray tube. The X-ray tube can comprise a rotating anode tube. In some embodiments, the X-ray source is a solid-anode microfocus X-ray tube or a metal-jet-anode microfocus X-ray tube.
- The X-ray imager can be any device configured to measure the properties of X-rays exiting the imaging volume. The X-ray imager can comprise, for example, one or more of a sensitized photographic plate, sensitized photographic film, a photostimulable phosphor plate, a semiconductor or solid state detector, or a scintillator. In some embodiments, the X-ray imager comprises a scintillator. The scintillator can comprise any material that converts an X-ray photon to a visible light photon. The scintillator can comprise one or more organic or inorganic compounds. The scintillator compounds can comprise, for example, barium fluoride, calcium fluoride doped with europium, bismuth germanate, cadmium tungstate, cesium iodide doped with thallium, cesium iodide doped with sodium, undoped cesium iodide, gadolinium oxysulfide, lanthanum bromide doped with cerium, lanthanum chloride doped with cerium, lead tungstate, lutetium iodide, lutetium oxyorthosilicate, sodium iodide doped with thallium, yttrium aluminum garnet, zinc sulfide, or zinc tungstate. In some embodiments, the scintillator comprises sodium iodide, gadolinium oxysulfide, or cesium iodide.
- In some embodiments, the X-ray imager is an X-ray flat panel detector. The flat panel detector can comprise a scintillator material and a photodiode transistor array. The flat panel detector can further comprise one or more readout circuits. The flat panel detector can comprise a detection face and a display face on opposite sides of the detector from one another. The detection face can be directed towards the biological sample and the X-ray source so as to be contacted with X-rays generated by the X-ray source and passing through the imaging volume. The display face can be directed towards a camera so that an X-ray image displayed on the display face can be recorded using the camera. In some embodiments, the X-ray image is displayed on the display face by generating visible light that is recorded by a visible light camera configured to have a depth of focus that corresponds to the distance between the display face and the camera.
- Ultrasonic images can be produced with an ultrasonic transducer array configured to detect ultrasound waves exiting the imaging volume and convert the waves into electrical signals. Energy pulses transmitted into the imaging volume can cause a biological sample within to absorb this time-varying energy, inducing the generation of acoustic waves that can be detected by the ultrasonic transducer array. Within the imaging volume, the ultrasonic transducer array is in contact with the biological sample via a coupling medium. The coupling medium can comprise water or gel to relay ultrasound waves. In some embodiments, the energy pulses are non-ionizing light pulses and the ultrasonic transducer array can be used to record a photoacoustic image. In some embodiments, the energy pulses are radio frequency pulses and the ultrasonic transducer array can be used to record a thermoacoustic image. In some embodiments, the energy pulses are ultrasonic pulses, and the ultrasonic transducer array can be used to record an ultrasound image.
- Optical coherence tomography images can be produced with an interferometer configured for tomography of the biological sample within the imaging volume. In some embodiments, the interferometer is a Michelson interferometer. A camera can be configured to detect electromagnetic radiation emitted from the imaging volume for optical coherence tomography of the biological sample.
- The motion-adaptive imaging functions can include one or more noise filtering or noise averaging processes. Any signal processing algorithm suitable for image noise reduction can be used with the provided methods and systems. The noise reduction algorithms can include, for example, chroma and luminance noise separation, linear smoothing, anisotropic diffusion, non-local averaging, or wavelet transformations.
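One of the simplest noise averaging processes compatible with the long-exposure mode is temporal averaging of repeated frames, sketched below under the assumption that the frames are already co-registered; the function is illustrative and not a specific algorithm named in the disclosure.

```python
import numpy as np

def average_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Temporal noise averaging: the mean of N co-registered frames reduces
    uncorrelated noise by roughly the square root of N."""
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
    return stack.mean(axis=0)
```

Chroma/luminance separation, anisotropic diffusion, non-local means, or wavelet shrinkage could replace this step where spatial rather than temporal redundancy is available.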
-
FIG. 7 presents an exemplary time course 700 for a multi-modal system that starts with both a white light and a fluorescence channel, and then adaptively switches on multiple functions. During the times of the motion sensing and control mode 701, the camera records a first real-time view of a subject using both reflected white light and fluorescence, each with a frame rate of 15 frames per second. During this time, the subject is illuminated with a high-frequency modulated illumination that alternates between white light and fluorescence excitation wavelengths. The collection of reflected white light and fluorescence emissions is through one or two cameras that can be synchronized with the illumination modulation. At time event 702, the user pauses motion control, and the method enters a 1.5-second waiting time period to determine if the monitored rate of movement descends below a target rate. At time event 703, the imaging system is switched to multiple motion-adaptive imaging functions. These include the turning off of the white light illumination channel, the switching of the fluorescence camera frame frequency to a lower frequency characteristic of a long exposure mode, the stopping of binning in the image processing of the camera, the starting of the application of a noise filtering algorithm, and the readying of an additional imaging modality for X-ray image capture. At time event 704, the fluorescence camera begins capturing the second real-time view of the subject using fluorescence and the second frame rate, and the X-ray imager records an X-ray image for display. At time event 705, the imaging system begins to wait for user entry of new motion commands. -
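The alternating illumination during the motion sensing and control mode can be sketched as a frame-synchronized schedule. The helper below is hypothetical: it assumes one illumination channel per frame at a combined 30 frames per second, which yields the 15 frames per second per channel described above; the channel labels and function name are illustrative.

```python
from itertools import cycle

def modulated_schedule(combined_fps: float, n_frames: int) -> list[tuple[float, str]]:
    """Alternate white-light and fluorescence-excitation illumination,
    one channel per camera frame, synchronized to the frame clock."""
    period = 1.0 / combined_fps
    channels = cycle(["white_light", "fluorescence_excitation"])
    return [(i * period, next(channels)) for i in range(n_frames)]

# At a combined 30 fps, each channel is illuminated at 15 fps:
print(modulated_schedule(30.0, 4))
```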
FIG. 8 presents a workflow of a motion-adaptive multi-angle X-ray imaging process. A user inputs motion commands interactively to the imaging system. The system receives these commands and moves a sample stage to provide multi-dimensional movement for real-time reflective-light imaging in an imaging cabinet. Real-time images are rendered for display continuously to show the sample orientation inside the imaging cabinet. Once the system recognizes that the rate of movement has dropped, such as when command inputs pause for a period of time, the system triggers an X-ray imaging instruction that the user can interrupt. In the absence of interruption, an X-ray image is captured; a sketch of such an interruptible trigger is given after the following paragraph. The X-ray image can be output for presentation along with the real-time reflective view in the same overlaid image or in separate images. The user can then continue the motion to re-orient the sample or take another X-ray shot. - The first and second real-time views of the subject recorded using standard and motion-adaptive imaging functions, respectively, can each contain one or more imaging modalities. In the case of two or more imaging modalities, the separate modalities can be rendered or output for display in overlaid or separate images. The imaging modalities can include reflected white light imaging, fluorescence imaging, X-ray imaging, near-infrared imaging, ultrasonic imaging, optical coherence tomography imaging, or others. The real-time views captured using the motion-adaptive functions can be overlaid with an image recorded using the standard imaging functions prior to the sending of the trigger to begin motion-adaptive imaging. In some embodiments, a reflected white light image of the subject recorded or photographed prior to the sending of the trigger is overlaid with the real-time view of the subject captured subsequent to the sending of the trigger.
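The interruptible X-ray instruction of FIG. 8 can be pictured as a cancellable delayed capture. The sketch below is a hypothetical illustration: capture_fn stands in for whatever routine records the X-ray image, and the 1.5-second delay mirrors the waiting periods used elsewhere in this description rather than a value specified for this workflow.

```python
import threading

def schedule_xray_capture(capture_fn, delay_s: float = 1.5) -> threading.Event:
    """Arm an X-ray capture that fires after delay_s unless cancelled.

    Returns an Event; setting the event (for example, when the operator
    resumes motion) interrupts the pending shot."""
    cancelled = threading.Event()

    def worker() -> None:
        # Event.wait returns True only if the event was set before the timeout.
        if not cancelled.wait(timeout=delay_s):
            capture_fn()

    threading.Thread(target=worker, daemon=True).start()
    return cancelled
```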
- The standard and motion-adaptive imaging functions can each include an illumination of the subject. The second illumination for the motion-adaptive imaging subsequent to the sending of the trigger can be different from the first illumination for the standard imaging prior to the sending of the trigger. The first and second illuminations, and the light received by cameras and imagers associated with the provided methods and systems, can include any wavelength of electromagnetic radiation. The illuminations can be provided by one or more illumination sources. Illumination sources can be mounted proximate to the imaging volume in order to illuminate the sample with white light, monochrome light, near-infrared light, fluorescence light, or other electromagnetic radiation. One or more white lights can be used to illuminate the imaging volume. In some embodiments, the illumination of the subject with visible light is performed at one or more wavelengths of about 380 nm to about 700 nm. These wavelengths include, for example, about 380, 390, 400, 425, 450, 475, 500, 525, 550, 575, 600, 625, 650, 675, or about 700 nm. These wavelengths can occur in combination, such as in broadband white light. In some embodiments, illumination at one or more wavelengths is stopped by the switch to motion-adaptive imaging. In some embodiments, a light is turned off by the switch to motion-adaptive imaging.
- The first and second illuminations can have similar or different intensities. The first and second illuminations can have similar or different temporal modulations. In some embodiments, an illumination can include both white light and fluorescence excitation light used to alternately illuminate the subject in a modulated fashion. The modulated electromagnetic radiations can differ in amplitude, phase, frequency, or polarization.
- The first and second real-time views of the subject recorded using standard and motion-adaptive imaging functions, respectively, can have similar or different resolutions. The real-time views can have similar or different pixel resolutions, spatial resolutions, spectral resolutions, temporal resolutions, or radiometric resolutions. The first and second real-time views of the subject recorded using standard and motion-adaptive imaging functions, respectively, can have similar or different fields of view.
- The devices and methods can utilize a computing apparatus that is programmed or otherwise configured to automate and/or regulate one or more steps of the methods or features of the devices provided herein. Some embodiments provide machine executable code in a non-transitory storage medium that, when executed by a computing apparatus, implements any of the methods or operates any of the devices described herein. In some embodiments, the computing apparatus operates one or more power sources, motors, and/or displays. A display can be used for review of the real-time views. The display can be a touch screen to receive interactive command inputs. The display can be a wireless device relaying the information wirelessly.
- The terms "first", "second", "third", "fourth", "fifth", and "sixth", when used herein with reference to images, views, frequencies, cameras, illuminations, wavelengths, intensities, modulations, resolutions, fields of view, axes, or other elements or properties, are used simply to more clearly distinguish the two or more elements or properties and, unless stated otherwise, are not intended to indicate order.
- Although the foregoing invention has been described in some detail by way of illustration and example for purposes of clarity of understanding, one of skill in the art will appreciate that certain changes and modifications may be practiced within the scope of the appended claims. In addition, each reference provided herein is incorporated by reference in its entirety to the same extent as if each reference was individually incorporated by reference. Where a conflict exists between the instant application and a reference provided herein, the instant application shall dominate.
Claims (20)
1. A method of detecting and presenting images, the method comprising:
recording, using a fluorescence camera, a first view of a subject at a first frame frequency;
photographing a reflected white light view of the subject with a second camera;
monitoring, using a processor, a rate of movement of the subject in the reflected white light view from the second camera;
sending a trigger, using the processor, to the fluorescence camera based upon the rate of movement crossing a target rate;
switching, using the processor, the first frame frequency to a second frame frequency based on the trigger; and
capturing, using the fluorescence camera, a second view of the subject at the second frame frequency.
2. The method of claim 1 , wherein the second frame frequency is lower than the first frame frequency, and crossing the target rate includes descending below the target rate.
3. The method of claim 1 , further comprising:
overlaying a reflected white light image of the subject over the second view of the subject, wherein the reflected white light image was photographed prior to the sending of the trigger; and
outputting the overlaid reflected white light image of the subject over the second view of the subject.
4. The method of claim 1 , further comprising:
illuminating the subject with a first excitation light while recording the first view;
switching the first excitation light to a second excitation light in response to the trigger; and
illuminating the subject with a second excitation light while capturing the second view.
5. The method of claim 4 , wherein the first excitation light has a first wavelength and the second excitation light has a second wavelength.
6. The method of claim 4 , wherein the first excitation light has a first intensity and the second excitation light has a second intensity.
7. The method of claim 4 , wherein the first excitation light has a first temporal modulation and the second excitation light has a second temporal modulation.
8. The method of claim 1 , wherein the capturing of the second view includes an application of a noise filtering algorithm or a noise averaging algorithm.
9. The method of claim 1 , wherein the fluorescence camera has a resolution, wherein the recording of the first view is at a first resolution, and the capturing of the second view is at a second resolution, and wherein the method further comprises:
altering the resolution from the first resolution to the second resolution based on the trigger.
10. The method of claim 1 , wherein the fluorescence camera has a field of view, wherein the recording of the first view is of a first field of view, and the capturing of the second view is of a second field of view, and wherein the method further comprises:
modifying the field of view from the first field of view to the second field of view based on the trigger.
11. The method of claim 1 , further comprising:
turning on or off a light source directed at the subject based on the trigger.
12. The method of claim 1 , further comprising:
taking an X-ray image, a near-infrared image, or a radioisotope image based on the trigger.
13. The method of claim 1 , wherein the monitoring includes registering a presence or an absence of an operator command.
14. The method of claim 1 , wherein the subject is a biological sample.
15. The method of claim 14 , wherein the biological sample is an in vivo sample.
16. The method of claim 14 , wherein the biological sample is an ex vivo sample.
17. A machine-readable non-transitory medium embodying information indicative of instructions for causing a computer processor to perform operations for presenting images, the operations comprising:
controlling a fluorescence camera to record a first view of a subject at a first frame frequency;
controlling a second camera to photograph a reflected white light view of the subject;
monitoring a rate of movement of the subject in the reflected white light view from the second camera;
sending a trigger to the fluorescence camera based upon the rate of movement crossing a target rate;
switching the first frame frequency to a second frame frequency based on the trigger; and
controlling the fluorescence camera to capture a second view of the subject at the second frame frequency.
18. The medium of claim 17 , wherein the operations further comprise:
controlling a first excitation light to illuminate the subject while recording the first view;
switching the first excitation light to a second excitation light in response to the trigger; and
controlling a second excitation light to illuminate the subject while capturing the second view.
19. A computer system for presenting images, the system comprising:
at least one processor, and
a memory operatively coupled with the at least one processor, the at least one processor executing instructions from the memory, the memory comprising:
a) program code for controlling a fluorescence camera to record a first view of a subject at a first frame frequency;
b) program code for controlling a second camera to photograph a reflected white light view of the subject;
c) program code for monitoring a rate of movement of the subject in the reflected white light view from the second camera;
d) program code for sending a trigger to the fluorescence camera based upon the rate of movement crossing a target rate;
e) program code for switching the first frame frequency to a second frame frequency based on the trigger; and
f) program code for controlling the fluorescence camera to capture a second view of the subject at the second frame frequency.
20. The computer system of claim 19 , wherein the memory further comprises:
program code for controlling a first excitation light to illuminate the subject while recording the first view;
program code for switching the first excitation light to a second excitation light in response to the trigger; and
program code for controlling a second excitation light to illuminate the subject while capturing the second view.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/213,235 US20210219843A1 (en) | 2016-11-23 | 2021-03-26 | Motion-Adaptive Interactive Imaging |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662425967P | 2016-11-23 | 2016-11-23 | |
US201762489921P | 2017-04-25 | 2017-04-25 | |
US15/819,603 US10993622B2 (en) | 2016-11-23 | 2017-11-21 | Motion-adaptive interactive imaging method |
US17/213,235 US20210219843A1 (en) | 2016-11-23 | 2021-03-26 | Motion-Adaptive Interactive Imaging |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/819,603 Continuation US10993622B2 (en) | 2016-11-23 | 2017-11-21 | Motion-adaptive interactive imaging method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210219843A1 true US20210219843A1 (en) | 2021-07-22 |
Family
ID=60813964
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/819,603 Active 2039-03-10 US10993622B2 (en) | 2016-11-23 | 2017-11-21 | Motion-adaptive interactive imaging method |
US17/213,235 Abandoned US20210219843A1 (en) | 2016-11-23 | 2021-03-26 | Motion-Adaptive Interactive Imaging |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/819,603 Active 2039-03-10 US10993622B2 (en) | 2016-11-23 | 2017-11-21 | Motion-adaptive interactive imaging method |
Country Status (3)
Country | Link |
---|---|
US (2) | US10993622B2 (en) |
EP (1) | EP3545488A1 (en) |
WO (1) | WO2018098162A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016137899A1 (en) | 2015-02-23 | 2016-09-01 | Li-Cor, Inc. | Fluorescence biopsy specimen imager and methods |
EP3869184A1 (en) | 2015-06-26 | 2021-08-25 | Li-Cor, Inc. | Fluorescence biopsy specimen imager and methods |
WO2017184940A1 (en) | 2016-04-21 | 2017-10-26 | Li-Cor, Inc. | Multimodality multi-axis 3-d imaging |
US10278586B2 (en) | 2016-06-23 | 2019-05-07 | Li-Cor, Inc. | Complementary color flashing for multichannel image presentation |
US10993622B2 (en) | 2016-11-23 | 2021-05-04 | Li-Cor, Inc. | Motion-adaptive interactive imaging method |
WO2018200261A1 (en) | 2017-04-25 | 2018-11-01 | Li-Cor, Inc. | Top-down and rotational side view biopsy specimen imager and methods |
DE102019004233B4 (en) | 2018-06-15 | 2022-09-22 | Mako Surgical Corp. | SYSTEMS AND METHODS FOR TRACKING OBJECTS |
US11070762B2 (en) * | 2019-10-10 | 2021-07-20 | Titan Medical Inc. | Imaging apparatus for use in a robotic surgery system |
EP4024034A1 (en) * | 2021-01-05 | 2022-07-06 | The Boeing Company | Methods and apparatus for measuring fastener concentricity |
US20230254589A1 (en) * | 2022-02-04 | 2023-08-10 | Applied Materials, Inc. | Pulsed illumination for fluid inspection |
Family Cites Families (136)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2792502A (en) | 1953-02-06 | 1957-05-14 | Donald T O'connor | High sensitivity fluoroscope |
JPH02304366A (en) | 1989-05-18 | 1990-12-18 | Tosoh Corp | Manipulator for analyzing device |
US5103338A (en) | 1990-10-04 | 1992-04-07 | Crowley Kevin D | Apparatus for positioning objects for microscopic examination |
US5224141A (en) * | 1991-02-06 | 1993-06-29 | General Electric Company | Fluoroscopic method with reduced x-ray dosage |
JP2830492B2 (en) | 1991-03-06 | 1998-12-02 | 株式会社ニコン | Projection exposure apparatus and projection exposure method |
US5408294A (en) | 1993-05-28 | 1995-04-18 | Image Technology International, Inc. | 3D photographic printer with direct key-subject alignment |
EP0685709B1 (en) | 1994-05-31 | 2001-12-05 | Japan Em Co., Ltd. | Apparatus for measuring dimension of article and scale to be used in the same |
US5782762A (en) | 1994-10-27 | 1998-07-21 | Wake Forest University | Method and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen |
JP3602207B2 (en) | 1995-07-12 | 2004-12-15 | 富士写真フイルム株式会社 | Surgical fluorescent imaging device |
JPH09236755A (en) | 1996-02-29 | 1997-09-09 | Jeol Ltd | Method for correcting position of sample stage of microscope and sample stage |
DE19628765B4 (en) | 1996-07-17 | 2007-04-05 | Braun, Paul-Wilhelm, Dipl.-Ing. | Method and device for determining the position of non-rectilinearly moving, in particular rotating machine parts |
KR19980019031A (en) | 1996-08-27 | 1998-06-05 | 고노 시게오 | A stage device (A STAGE APPARATUS) |
CN1480903A (en) | 1996-08-29 | 2004-03-10 | ������������ʽ���� | Specificity information assignment, object extraction and 3-D model generation method and appts thereof |
JPH10123054A (en) | 1996-10-23 | 1998-05-15 | Kagaku Gijutsu Shinko Jigyodan | Method and apparatus for observing sample |
US6165170A (en) | 1998-01-29 | 2000-12-26 | International Business Machines Corporation | Laser dermablator and dermablation |
AU5920899A (en) | 1998-09-14 | 2000-04-03 | Lucid, Inc. | Imaging of surgical biopsies |
JP2000304688A (en) | 1999-04-16 | 2000-11-02 | Canon Inc | Substrate measuring method and device |
US6711433B1 (en) | 1999-09-30 | 2004-03-23 | Siemens Corporate Research, Inc. | Method for providing a virtual contrast agent for augmented angioscopy |
US6473489B2 (en) | 1999-09-30 | 2002-10-29 | Siemens Corporate Research, Inc | Apparatus for superimposition of X-ray and video images |
US7065242B2 (en) | 2000-03-28 | 2006-06-20 | Viewpoint Corporation | System and method of three-dimensional image capture and modeling |
US7118710B2 (en) | 2000-10-30 | 2006-10-10 | Sru Biosystems, Inc. | Label-free high-throughput optical technique for detecting biomolecular interactions |
GB0112392D0 (en) | 2001-05-22 | 2001-07-11 | Medical Res Council | Optical imaging appartus and associated specimen support means |
US7113217B2 (en) | 2001-07-13 | 2006-09-26 | Xenogen Corporation | Multi-view imaging apparatus |
KR100411631B1 (en) | 2001-10-18 | 2003-12-18 | 주식회사 메디미르 | Fluorescence endoscope apparatus and a method for imaging tissue within a body using the same |
DE10339784B4 (en) | 2002-08-28 | 2021-09-16 | Carl Zeiss Meditec Ag | Microscopy system and microscopy method |
US20110229023A1 (en) | 2002-11-01 | 2011-09-22 | Tenebraex Corporation | Technique for enabling color blind persons to distinguish between various colors |
US20040101088A1 (en) | 2002-11-27 | 2004-05-27 | Sabol John Michael | Methods and apparatus for discriminating multiple contrast agents |
GB0301775D0 (en) | 2003-01-25 | 2003-02-26 | Wilson John E | Device and method for 3Dimaging |
JP4043417B2 (en) | 2003-08-28 | 2008-02-06 | シスメックス株式会社 | Particle size measuring device |
JP4634304B2 (en) | 2003-10-10 | 2011-02-16 | 浜松ホトニクス株式会社 | Method and system for quantifying fluorescent dye concentration |
EP1607041B1 (en) | 2004-06-17 | 2008-01-16 | Cadent Ltd. | Method for providing data associated with the intraoral cavity |
US20080312540A1 (en) | 2004-12-08 | 2008-12-18 | Vasilis Ntziachristos | System and Method for Normalized Flourescence or Bioluminescence Imaging |
US7355700B2 (en) | 2004-12-10 | 2008-04-08 | International Business Machines Corporation | Automated inspection system and method |
US7822174B2 (en) | 2005-04-20 | 2010-10-26 | The Regents Of The University Of California | Cryotomography x-ray microscopy state |
US8351026B2 (en) | 2005-04-22 | 2013-01-08 | Affymetrix, Inc. | Methods and devices for reading microarrays |
US20090087046A1 (en) | 2005-09-08 | 2009-04-02 | Matthew Joseph Kuhn | Digital blink comparator apparatus and software and methods for operation |
US8504140B2 (en) | 2008-04-08 | 2013-08-06 | Bruker Biospin Corporation | Apparatus and method for fluorescence imaging and tomography using spatially structured illumination |
US8953909B2 (en) | 2006-01-21 | 2015-02-10 | Elizabeth T. Guckenberger | System, method, and computer software code for mimic training |
JP5006215B2 (en) | 2006-02-07 | 2012-08-22 | 古河電気工業株式会社 | Photodetector and measuring object reader |
CA2640441C (en) | 2006-02-15 | 2015-11-24 | Ahmed Bouzid | Fluorescence filtering system and method for molecular imaging |
CN101460953B (en) | 2006-03-31 | 2012-05-30 | 索雷克萨公司 | Systems and devices for sequence by synthesis analysis |
JP4979271B2 (en) | 2006-05-29 | 2012-07-18 | オリンパス株式会社 | ENDOSCOPE SYSTEM AND ENDOSCOPE OPERATING METHOD |
US20080025475A1 (en) | 2006-07-25 | 2008-01-31 | Synarc, Inc. | Apparatus for determining the position and orientation in medical imaging |
US7551711B2 (en) | 2006-08-07 | 2009-06-23 | Xoran Technologies, Inc. | CT scanner including a camera to obtain external images of a patient |
US8792968B2 (en) | 2006-09-25 | 2014-07-29 | Song Xiao | System and method for health evaluation |
US7715523B2 (en) | 2006-09-28 | 2010-05-11 | Lafferty Peter R | System and apparatus for rapid stereotactic breast biopsy analysis |
US8503602B2 (en) | 2006-09-28 | 2013-08-06 | Peter R. Lafferty | System and apparatus for rapid stereotactic breast biopsy analysis |
CN101301192B (en) | 2007-05-10 | 2010-06-23 | 中国科学院自动化研究所 | Multimode autofluorescence tomography molecule image instrument and rebuilding method |
JP2008298861A (en) | 2007-05-29 | 2008-12-11 | Olympus Corp | Observation device |
EP1998205B1 (en) | 2007-05-29 | 2011-07-06 | Olympus Corporation | Observation apparatus |
JP2009008739A (en) | 2007-06-26 | 2009-01-15 | Olympus Corp | Living body observation apparatus |
DE102007030768A1 (en) | 2007-07-02 | 2009-01-08 | Sirona Dental Systems Gmbh | Measuring device and method for 3D measurement of tooth models |
CN100593389C (en) | 2007-07-10 | 2010-03-10 | 清华大学 | Continuous dynamic gathering type beastie inducing fluorescence molecule imaging system |
US8220415B2 (en) | 2007-09-05 | 2012-07-17 | Li-Cor, Inc. | Modular animal imaging apparatus |
DE102007047461A1 (en) | 2007-09-28 | 2009-04-02 | Carl Zeiss Microimaging Gmbh | Method and optical arrangement for examining a sample |
US7929743B2 (en) | 2007-10-02 | 2011-04-19 | Hologic, Inc. | Displaying breast tomosynthesis computer-aided detection results |
WO2009089543A2 (en) | 2008-01-10 | 2009-07-16 | The Ohio State University Research Foundation | Fluorescence detection system |
US9332942B2 (en) | 2008-01-28 | 2016-05-10 | The General Hospital Corporation | Systems, processes and computer-accessible medium for providing hybrid flourescence and optical coherence tomography imaging |
US8143600B2 (en) | 2008-02-18 | 2012-03-27 | Visiongate, Inc. | 3D imaging of live cells with ultraviolet radiation |
CZ301826B6 (en) | 2008-03-21 | 2010-06-30 | Biologické centrum AV CR, v.v.i., Ústav molekulární biologie rostlin | Method of investigating dynamics of volume changes of physiologically, particularly photo synthetically active samples and apparatus for making the same |
US8169468B2 (en) * | 2008-04-26 | 2012-05-01 | Intuitive Surgical Operations, Inc. | Augmented stereoscopic visualization for a surgical robot |
WO2013109966A1 (en) | 2012-01-20 | 2013-07-25 | The Trustees Of Dartmouth College | Method and apparatus for quantitative hyperspectral fluorescence and reflectance imaging for surgical guidance |
WO2009155151A2 (en) | 2008-06-20 | 2009-12-23 | Visiongate, Inc. | Functional imaging of cells optical projection tomography |
CA2731956A1 (en) | 2008-07-25 | 2010-01-28 | Daniel S. Gareau | Rapid confocal microscopy to support surgical procedures |
CN101401722B (en) | 2008-11-07 | 2012-07-25 | 上海奥通激光技术有限公司 | Multi-mode co-focusing imaging method and apparatus |
US20120049088A1 (en) | 2009-03-06 | 2012-03-01 | The Trustees Of Columbia University In The City Of New York | Systems, methods and computer-accessible media for hyperspectral excitation-resolved fluorescence tomography |
US9155471B2 (en) | 2009-05-27 | 2015-10-13 | Lumicell, Inc'. | Methods and systems for spatially identifying abnormal cells |
US8310531B2 (en) | 2009-08-03 | 2012-11-13 | Genetix Corporation | Methods and apparatuses for processing fluorescence images |
CA2842721C (en) | 2009-10-19 | 2016-03-29 | Ventana Medical Systems, Inc. | Imaging system and techniques |
DE112010004507B4 (en) * | 2009-11-20 | 2023-05-25 | Given Imaging Ltd. | System and method for controlling power consumption of an in vivo device |
US8235530B2 (en) | 2009-12-07 | 2012-08-07 | C-Rad Positioning Ab | Object positioning with visual feedback |
ES2364916B1 (en) | 2010-03-05 | 2012-09-04 | Consejo Superior De Investigaciones Científicas (Csic) | INSTRUMENT FOR THE IMPLEMENTATION OF WIDE FIELD IMAGES TO DIFFERENT DEPTHS OF A SPECIMEN |
WO2012027542A2 (en) | 2010-08-25 | 2012-03-01 | California Institute Of Technology | Simultaneous orthogonal light sheet microscopy and computed optical tomography |
WO2012037414A1 (en) | 2010-09-15 | 2012-03-22 | The Charles Stark Draper Laboratory, Inc. | Systems and methods for multilayer imaging and retinal injury analysis |
CN101984928B (en) | 2010-09-29 | 2012-06-13 | 北京大学 | Multi-mode molecular tomography system |
WO2012065163A2 (en) | 2010-11-12 | 2012-05-18 | Emory University | Additional systems and methods for providing real-time anatomical guidance in a diagnostic or therapeutic procedure |
EP2455891A1 (en) | 2010-11-23 | 2012-05-23 | Synoptics Limited | Methods and systems for automatic capture of an image of a faint pattern of light emitted by a specimen |
WO2012071682A1 (en) | 2010-11-30 | 2012-06-07 | 中国科学院自动化研究所 | System and method for multimode three dimensional optical tomography based on specificity |
JP5677864B2 (en) | 2011-01-17 | 2015-02-25 | オリンパス株式会社 | Microscope imaging apparatus and microscope observation method |
CN102048525B (en) | 2011-01-26 | 2012-05-30 | 浙江大学 | Organism fluorescent three-dimensional imaging system and application thereof |
EP2673738A4 (en) | 2011-02-11 | 2017-08-23 | E-4 Endeavors, Inc. | System and method for modeling a biopsy specimen |
US9077910B2 (en) * | 2011-04-06 | 2015-07-07 | Dolby Laboratories Licensing Corporation | Multi-field CCD capture for HDR imaging |
WO2012171029A1 (en) | 2011-06-09 | 2012-12-13 | The Regents Of The University Of California | Excised specimen imaging using a combined pet and micro ct scanner |
DE102011104216A1 (en) | 2011-06-15 | 2012-12-20 | Scanbull Software Gmbh | Method for three-dimensional acquisition of object to be utilized in entertainment field, involves creating and storing image data set comprising coding for images of objects, captured by optical acquisition unit at different polar angles |
WO2013016651A1 (en) | 2011-07-28 | 2013-01-31 | Massachusetts Institute Of Technology | Camera configuration for three-dimensional imaging of interior spaces |
JP5796423B2 (en) * | 2011-09-06 | 2015-10-21 | リコーイメージング株式会社 | Imaging device |
US9218697B2 (en) | 2011-11-30 | 2015-12-22 | Waba Fun Llc | Systems and methods for authenticating objects using IR |
EP2788958B1 (en) | 2011-12-05 | 2019-09-18 | Commonwealth Scientific and Industrial Research Organisation | Method and system for characterising plant phenotype |
US10006922B2 (en) | 2011-12-22 | 2018-06-26 | Massachusetts Institute Of Technology | Raman spectroscopy for detection of glycated analytes |
WO2015023990A1 (en) | 2013-08-15 | 2015-02-19 | The Trustees Of Dartmouth College | Method and apparatus for quantitative and depth resolved hyperspectral fluorescence and reflectance imaging for surgical guidance |
CN109567724A (en) | 2012-02-23 | 2019-04-05 | 史密夫和内修有限公司 | Video-endoscope system |
KR101320712B1 (en) | 2012-02-23 | 2013-10-21 | 인하대학교 산학협력단 | Method for aligning rotation axis of two-axis rotation stage using alignment mark and apparatus thereof |
CN104204778A (en) | 2012-03-12 | 2014-12-10 | 三菱丽阳株式会社 | Fluorescence detection device and fluorescence detection method |
WO2013166497A1 (en) | 2012-05-04 | 2013-11-07 | University Of Rochester | Method and device for preserving and imaging specimens while retaining information on the spatial orientation of specimens with respect to reference objects |
US20150105283A1 (en) | 2012-05-30 | 2015-04-16 | Clarient Diagnostics Services, Inc. | Multiplexed diagnosis method for classical hodgkin lymphoma |
US8741232B2 (en) | 2012-09-05 | 2014-06-03 | Faxitron Bioptics, Llc | Specimen imaging device and methods for use thereof |
KR102251749B1 (en) | 2012-11-07 | 2021-05-13 | 모듈레이티드 이미징, 아이엔씨. | Efficient modulated imaging |
US20140125790A1 (en) | 2012-11-08 | 2014-05-08 | Wisconsin Alumni Research Foundation | Device And Method For Three Dimensional Imaging Of Biological Sample |
US9824440B2 (en) | 2012-11-20 | 2017-11-21 | Vanderbilt University | Methods and systems for three-dimensional real-time intraoperative surgical margin evaluation of tumor tissues |
JP2014115151A (en) | 2012-12-07 | 2014-06-26 | Shimadzu Corp | Optical imaging device |
US8988574B2 (en) * | 2012-12-27 | 2015-03-24 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information using bright line image |
US9442069B2 (en) | 2012-12-21 | 2016-09-13 | Quantum Dental Technologies Inc. | Apparatus for in-vitro imaging and analysis of dental samples |
CN103082997B (en) | 2013-01-28 | 2015-10-28 | 中国科学院自动化研究所 | Drum-type multimodality fusion three-dimension disclocation imaging system and method |
US9936858B2 (en) | 2013-02-04 | 2018-04-10 | Orpheus Medical Ltd | Color reduction in images of an interior of a human body |
EP2765591B1 (en) | 2013-02-08 | 2016-07-13 | FEI Company | Sample preparation stage |
US10231626B2 (en) | 2013-03-15 | 2019-03-19 | The Regents Of The University Of California | Imaging system and method for fluorescence guided surgery |
US9654704B2 (en) | 2013-03-15 | 2017-05-16 | Infrared Integrated Systems, Ltd. | Apparatus and method for multispectral imaging with three dimensional overlaying |
US9407838B2 (en) * | 2013-04-23 | 2016-08-02 | Cedars-Sinai Medical Center | Systems and methods for recording simultaneously visible light image and infrared light image from fluorophores |
GB2514125B (en) | 2013-05-13 | 2016-10-05 | Nikon Metrology Nv | X-ray imaging system with climate control |
US9632187B2 (en) | 2013-06-12 | 2017-04-25 | The Regents Of The University Of California | Modular positron emission tomography kit |
US9651525B2 (en) | 2013-06-27 | 2017-05-16 | TecScan Systems Inc. | Method and apparatus for scanning an object |
JP5726956B2 (en) | 2013-07-09 | 2015-06-03 | オリンパス株式会社 | Method and apparatus for analyzing faint light sample |
KR101514204B1 (en) * | 2013-07-12 | 2015-04-23 | 한국전기연구원 | Apparatus and method for detecting NIR fluorescence at Sentinel Lymph Node |
US10539772B2 (en) | 2013-10-09 | 2020-01-21 | Howard Hughes Medical Institute | Multiview light-sheet microscopy |
CN203677064U (en) | 2014-01-23 | 2014-07-02 | 清华大学 | Reflection-type fluorescence tomography system with linear optical fibers adopted |
US20150257653A1 (en) * | 2014-03-14 | 2015-09-17 | Elwha Llc | Device, system, and method for determining blood pressure in a mammalian subject |
WO2016014252A1 (en) | 2014-07-24 | 2016-01-28 | Apple Inc. | Invisible optical label for transmitting information between computing devices |
US10113910B2 (en) | 2014-08-26 | 2018-10-30 | Digimarc Corporation | Sensor-synchronized spectrally-structured-light imaging |
CN104299599B (en) | 2014-11-04 | 2017-05-24 | 深圳市华星光电技术有限公司 | Conversion system and conversion method from RGB data to WRGB data |
WO2016073569A2 (en) | 2014-11-05 | 2016-05-12 | Carestream Health, Inc. | Video detection of tooth condition using green and red fluorescence |
KR102476063B1 (en) | 2014-12-16 | 2022-12-12 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Ureter detection using waveband-selective imaging |
US10013796B2 (en) | 2015-01-22 | 2018-07-03 | Ditto Technologies, Inc. | Rendering glasses shadows |
WO2016137899A1 (en) | 2015-02-23 | 2016-09-01 | Li-Cor, Inc. | Fluorescence biopsy specimen imager and methods |
WO2016154589A1 (en) * | 2015-03-25 | 2016-09-29 | Camplex, Inc. | Surgical visualization systems and displays |
EP3869184A1 (en) | 2015-06-26 | 2021-08-25 | Li-Cor, Inc. | Fluorescence biopsy specimen imager and methods |
JPWO2017017745A1 (en) | 2015-07-27 | 2018-03-22 | 株式会社日立ハイテクノロジーズ | Defect determination method and X-ray inspection apparatus |
KR101849705B1 (en) | 2015-11-13 | 2018-05-30 | 한국전기연구원 | Method and system for generating 3D image using spectral x-ray and optical image |
EP3394579B1 (en) | 2015-12-21 | 2023-09-20 | Verily Life Sciences LLC | Systems and methods for determining an identity of a probe in a target based on colors and locations of two or more fluorophores in the probe and in the target |
WO2017160643A1 (en) | 2016-03-14 | 2017-09-21 | Massachusetts Institute Of Technology | Device and method for imaging shortwave infrared fluorescence |
WO2017184940A1 (en) | 2016-04-21 | 2017-10-26 | Li-Cor, Inc. | Multimodality multi-axis 3-d imaging |
CA2956230C (en) | 2016-04-29 | 2020-01-14 | Synaptive Medical (Barbados) Inc. | Multi-modal optical imaging system for tissue analysis |
US20170336706A1 (en) | 2016-05-20 | 2017-11-23 | Li-Cor, Inc. | X-ray biopsy specimen imager and methods |
CN109328036B (en) * | 2016-06-17 | 2024-03-08 | 皇家飞利浦有限公司 | System and method for determining hemodynamic parameters of a patient |
US10278586B2 (en) | 2016-06-23 | 2019-05-07 | Li-Cor, Inc. | Complementary color flashing for multichannel image presentation |
US10709333B2 (en) | 2016-07-25 | 2020-07-14 | PhotoSound Technologies, Inc. | Instrument for acquiring co-registered orthogonal fluorescence and photoacoustic volumetric projections of tissue and methods of its use |
US10993622B2 (en) | 2016-11-23 | 2021-05-04 | Li-Cor, Inc. | Motion-adaptive interactive imaging method |
-
2017
- 2017-11-21 US US15/819,603 patent/US10993622B2/en active Active
- 2017-11-21 WO PCT/US2017/062812 patent/WO2018098162A1/en unknown
- 2017-11-21 EP EP17822083.6A patent/EP3545488A1/en not_active Withdrawn
-
2021
- 2021-03-26 US US17/213,235 patent/US20210219843A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150373293A1 (en) * | 2014-06-24 | 2015-12-24 | Sony Corporation | Video acquisition with adaptive frame rate |
US20160155472A1 (en) * | 2014-12-02 | 2016-06-02 | Sony Corporation | Sensor configuration switching for adaptation of video capturing frame rate |
US20180228375A1 (en) * | 2015-11-18 | 2018-08-16 | The Board Of Trustees Of The Leland Stanford Junior University | Method and Systems for Measuring Neural Activity |
Non-Patent Citations (1)
Title |
---|
Choi et al., "A spatial-temporal multiresolution CMOS image sensor with adaptive frame rates for tracking the moving objects in region-of-interest and suppressing motion blur"., IEEE Journal of Solid-State Circuits, Vol. 42, No. 12, 2007. (Year: 2007) * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11670053B2 (en) * | 2017-12-05 | 2023-06-06 | Radalytica A.S. | Method of non-destructive imaging of the internal structure and device for carrying out the method |
Also Published As
Publication number | Publication date |
---|---|
US10993622B2 (en) | 2021-05-04 |
US20180140197A1 (en) | 2018-05-24 |
WO2018098162A1 (en) | 2018-05-31 |
EP3545488A1 (en) | 2019-10-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210219843A1 (en) | Motion-Adaptive Interactive Imaging | |
US10775309B2 (en) | Top-down and rotational side view biopsy specimen imager and methods | |
JP7319331B2 (en) | Open field handheld fluorescence imaging system and method | |
US10948415B2 (en) | Method of determining surgical margins using fluorescence biopsy specimen imager | |
US10314490B2 (en) | Method and device for multi-spectral photonic imaging | |
US10489964B2 (en) | Multimodality multi-axis 3-D imaging with X-ray | |
US10517483B2 (en) | System for detecting fluorescence and projecting a representative image | |
Shao et al. | Designing a wearable navigation system for image-guided cancer resection surgery | |
EP3338617B1 (en) | Goggle imaging systems and devices | |
US8041409B2 (en) | Method and apparatus for multi-modal imaging | |
US10830712B2 (en) | System and method for cabinet x-ray systems with camera | |
EP3641622A1 (en) | System for endoscopic imaging and method for processing images | |
WO2017043539A1 (en) | Image processing system, image processing device, projecting device, and projecting method | |
JP6485275B2 (en) | Imaging device | |
WO2021222015A1 (en) | Simultaneous top-down and rotational side-view fluorescence imager for excised tissue | |
BRPI0708897A2 (en) | device and method for imaging a cloudy medium, and, computer program product | |
KR101021989B1 (en) | Image acquisition apparatus of small animals for clinical trials | |
US10921265B2 (en) | System and method for cabinet x-ray systems with near-infrared optical system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: MIDCAP FINANCIAL TRUST, MARYLAND Free format text: SECURITY INTEREST;ASSIGNOR:LI-COR, INC.;REEL/FRAME:058293/0889 Effective date: 20211201 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |