
US20220392070A1 - Combined assessment of morphological and perivascular disease markers - Google Patents

Combined assessment of morphological and perivascular disease markers

Info

Publication number
US20220392070A1
Authority
US
United States
Prior art keywords
analyzer module
regions
segmenting
medical imaging
imaging data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/890,822
Inventor
Andrew J. Buckler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Elucid Bioimaging Inc.
Original Assignee
Elucid Bioimaging Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elucid Bioimaging Inc. filed Critical Elucid Bioimaging Inc.
Priority to US17/890,822 priority Critical patent/US20220392070A1/en
Publication of US20220392070A1 publication Critical patent/US20220392070A1/en
Assigned to Elucid Bioimaging Inc. reassignment Elucid Bioimaging Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BUCKLER, ANDREW J.
Priority to US18/319,003 priority patent/US20230386026A1/en
Abandoned legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/02007Evaluating blood vessel condition, e.g. elasticity, compliance
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4848Monitoring or testing the effects of treatment, e.g. of medication
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/504Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of blood vessels, e.g. by angiography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5217Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0891Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/155Segmentation; Edge detection involving morphological operators
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03Computed tomography [CT]
    • A61B6/032Transmission computed tomography [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10081Computed x-ray tomography [CT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10088Magnetic resonance imaging [MRI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30101Blood vessel; Artery; Vein; Vascular

Definitions

  • the invention relates to computer-aided phenotyping (CAP) of disease which can include applying computerized image analysis and/or data fusion algorithms to patient data.
  • the invention relates to quantitative imaging and analytics for elucidating the disease process of atherosclerosis, including delineating perivascular adipose tissue and/or determining the thickness of tissues between the lipid-rich necrotic core (LRNC) and the lumen (“cap thickness”) of a blood vessel.
  • Atherosclerosis can be life threatening, particularly in aging populations, but even among the relatively young.
  • Current methods for diagnosing atherosclerosis, for example, the use of blood markers (e.g., cholesterol levels) and/or determining the degree to which the lumen is narrowed (stenosis), are limited, and thus can result in suboptimal treatment decisions (e.g., to perform or not to perform surgeries, or to prescribe intensive medical therapy). For example, many vascular surgeries do not benefit the patient, some patients who need surgery do not receive it, and many could be effectively treated with drugs but may not be prescribed them.
  • Imaging modalities that can be used include computed tomography (CT), magnetic resonance imaging (MR), magnetic resonance angiography (MRA), dynamic contrast-enhanced MRI (DCE-MRI), multi-contrast MRI, ultrasound (b-mode or intravascular US), and targeted contrast agent approaches with various imaging modalities.
  • Imaging can be valuable because it can provide spatially and temporally localized anatomic and/or functional information, using non- or minimally invasive methods.
  • Techniques to deal with increasing resolution can be desired, both to exploit patterns and/or signatures in the data that are typically not readily assessed with the human eye, and to manage the large magnitude of data so that it can be efficiently integrated into the clinical workflow.
  • The radiologist can “drown” in data. Therefore, in order to integrate quantitative imaging into individual patient management, it can be desirable to provide a class of decision-support informatics tools that further exploit the capabilities of imaging within the realities of existing workflows and/or reimbursement constraints.
  • One difficulty with current imaging of atherosclerosis can include lack of robustness in the method used. For example, current methods typically only provide a low level of contrast between blood vessel outer wall and perivascular tissues, thus making it difficult to distinguish between the two. Some current methods simply employ annular rings around a lumen without specific determination of outer wall boundary. Vessel tapering, branching vessels, nearby tissues, etc. can also be problematic.
  • Another difficulty with current imaging of atherosclerosis can arise because a particular imaging device interrogates tissue using a limited excitation; despite the utility of multi-contrast MR on the one hand, or multi-energy CT on the other, the result can be a degree of non-specific response in the produced signal.
  • the invention includes a system comprising a processor and a non-transient storage medium including processor executable instructions implementing an analyzer module including a hierarchical analytics framework.
  • the hierarchical analytics framework can be configured to utilize a first set of machine learned algorithms to identify and quantify a set of biological properties utilizing medical imaging data, and to segment the medical imaging data based on the quantified biological properties to delineate existence of perivascular adipose tissue.
  • segmenting the medical imaging data further comprises segmenting the medical imaging data into at least a lumen boundary and an outer wall boundary.
  • the analyzer module is configured to partition a lumen and an outer wall based on the segmented lumen boundary and outer wall boundary into one or more vessel boundaries.
  • the biological properties include calcified regions, LRNC regions, intra-plaque regions, matrix regions, or any combination thereof.
  • delineating the perivascular adipose tissue further includes creating an evaluation region by extending the outer wall boundary by a predetermined distance and utilizing a second set of machine learned algorithms to identify whether the evaluation region includes the perivascular adipose tissue.
  • the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of the perivascular adipose tissue. In some embodiments, the analyzer module is configured to, for each partition, determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of each of the one or more vessel boundaries.
  • the analyzer module is configured to, for each partition, determine a volume of each of the one or more vessel boundaries. In some embodiments, the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area for a target.
  • segmenting the medical image data further comprises segmenting the medical image data into three-dimensional (3D) objects.
  • the invention involves a system including a processor and a non-transient storage medium including processor executable instructions implementing an analyzer module including a hierarchical analytics framework.
  • the hierarchical analytics framework can be configured to utilize a first set of machine learned algorithms to identify and quantify a set of biological properties utilizing medical imaging data, wherein the biological properties include LRNC regions of a blood vessel, segment the medical imaging data based on the quantified biological properties to determine a lumen boundary, and determine a cap thickness based on a minimum distance between the lumen boundary and the LRNC regions.
  • segmenting the medical imaging data further comprises segmenting the medical imaging data into an outer wall boundary.
  • the analyzer module is configured to partition a lumen and an outer wall based on the segmented lumen boundary and outer wall boundary into one or more vessel boundaries.
  • the biological properties include calcified regions, LRNC regions, intra-plaque regions, matrix regions, or any combination thereof.
  • FIG. 1 is a diagram of a system for determining and characterizing a medical condition by implementing a hierarchical analytics framework, according to some embodiments of the invention.
  • FIG. 2 is a flow chart for a re-sampling based model building method, according to some embodiments of the invention.
  • FIG. 3 is a flow chart of a method for delineating existence of perivascular adipose tissue and determining cap thickness based on medical imaging data of a patient, according to some embodiments of the invention.
  • FIG. 3 A is a diagram showing how restoring can affect accurate quantitation of tissues, being a cross section through a coronary artery with intensity profiles for the original source CT and after the model-based restoring, according to some embodiments of the invention.
  • FIG. 4 is a flow chart for a method for restoring and segmenting, according to some embodiments of the invention.
  • FIG. 5 is a flow chart for a method of an iterative optimization algorithm, according to some embodiments of the invention.
  • FIG. 6 is an example of an acquired image which includes blurring, according to the prior art.
  • FIGS. 7 A and 7 B show examples of an acquired image before and after restoring, respectively, according to some embodiments of the invention.
  • FIGS. 8 A and 8 B show examples of an acquired image before and after restoring, respectively, according to some embodiments of the invention.
  • FIG. 9 shows an example of an augmented restored/deblurred image, according to some embodiments of the invention.
  • FIG. 1 is a diagram of a system for determining and characterizing a medical condition by implementing a hierarchical analytics framework, according to some embodiments of the invention.
  • the system 100 can include a trainer module 110 , an analyzer module 120 and a cohort tool module 130 .
  • the analyzer module 120 can include a hierarchical analytics framework which can identify and/or quantify biological properties/analytes 123 based on medical imaging data (e.g., medical imaging data of patient).
  • the medical imaging data can include (i) imaging features 122 from one or more acquired images 121 A of a patient 50 and/or (ii) non-imaging input data 121 B for a patient 50 .
  • the analyzer module 120 can identify and/or characterize one or more pathologies (e.g., prognostic phenotypes) 124 based on the quantified biological properties/analytes 123 .
  • the analyzer module 120 can operate independent of ground truth and/or validation references by implementing one or more pre-trained algorithms, e.g., machine learned algorithms, for drawing inferences.
  • the analyzer module 120 includes algorithms for calculating imaging features 122 from the acquired images 121 A of the patient 50 .
  • the image features 122 are computed on a per-voxel basis, on a region-of-interest basis, or any combination thereof.
  • non-imaging inputs 121 B that can be utilized in calculating imaging features 122 include data from laboratory systems, patient-reported symptoms, patient history, or any combination thereof.
  • the image features 122 and/or non-imaging inputs 121 B can be utilized by the analyzer module 120 to calculate the biological properties/analytes 123 .
  • the biological properties/analytes are typically quantitative, objective properties (e.g., objectively verifiable rather than being stated as impressions or appearances) that may represent, e.g., a presence and degree of a marker (such as a chemical substance) or other measurements such as structure, size, or anatomic characteristics of a region of interest.
  • the quantified biological properties/analytes 123 are displayed and/or exported for direct consumption by a user, e.g., by a clinician, in addition to or independent of further processing by the analyzer module 120 .
  • the cohort tool module 130 can define a cohort of patients for group analyses thereof, e.g., based on a selected set of criteria related to the cohort study in question.
  • An example cohort analysis may be for a group of patients enrolled in a clinical trial, e.g., with the patients further being grouped based on one or more arms of the trial for example a treatment vs. control arm.
  • Another type of cohort analysis may be for a set of subjects for which ground truth or references exist, and this type of cohort may be further decomposed into a training set or “development” set and a test or “holdout” set. Development sets may be supported so as to train 112 the algorithms and models within analyzer module 120 , and holdout sets may be supported so as to evaluate/validate 113 the performance of the algorithms or models within analyzer module 120 .
  • the trainer module 110 can be utilized to train 112 the algorithms and models within analyzer module 120 .
  • the trainer module 110 can rely on ground truth 111 and/or reference annotations 114 so as to derive weights and/or models, e.g., according to established machine learning paradigms or by informing algorithm developers.
  • the trainer module 110 employs classification and/or regression models.
  • the classification and/or regression models can be highly adaptable, e.g., capable of uncovering complex relationships among the predictors and the response. However, their ability to adapt to the underlying structure within the existing data can enable the models to find patterns that are not reproducible for another sample of subjects. Adapting to irreproducible structures within the existing data is commonly known as model over-fitting. To avoid building an over-fit model, a systematic approach may be applied that prevents a model from finding spurious structure and enables the end-user to have confidence that the final model will predict new samples with a degree of accuracy similar to that achieved on the set of data on which the model was evaluated.
  • Successive training sets may be utilized to determine optimal tuning parameter(s), and a test set may be utilized to estimate an algorithm's or model's predictive performance.
  • Training sets may be used for training each of the classifiers via randomized cross-validation.
  • Datasets may be repeatedly split into training and testing sets and may be used to determine classification performance and model parameters. The splitting of the datasets into training and test sets can occur using a stratified and/or maximum dissimilarity approach.
  • a re-sampling approach e.g. bootstrapping
  • the classification models may be trained in whole or in part by application of multi-scale modeling techniques, such as for example partial differential equations, e.g., to represent likely cell signaling pathways or plausible biologically-motivated presentations.
  • the classification models may be trained as described in U.S. patent application Ser. No. 16/203,418, filed on Nov. 28, 2018, incorporated herein by reference in its entirety.
  • a patient report is generated, for example, as shown FIG. 3 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description.
  • vessel coordinates and/or topologies are determined, as shown in FIGS. 7, 9, 23, 26, of U.S. patent application Ser. No. 16/203,418, along with the corresponding description.
  • various analysis, models, and/or analytes are determined, as shown in FIGS. 10, 12, 18, 31, 34 and 39 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description.
  • system architectures are as shown in FIGS. 32, 33 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description.
  • FIG. 2 is a flow chart for a re-sampling based model building method 200 which may be utilized by a system (e.g., system 100 as described above with respect to FIG. 1 ), according to some embodiments of the invention.
  • a tuning parameter set can be defined.
  • data e.g., medical data as described above in FIG. 1
  • a model is fitted and hold-out samples can be predicted.
  • resampling estimates can be combined into a performance profile.
  • final tuning parameters can be determined.
  • the entire training set can be re-fitted with the final tuning parameters.
  • each model can be evaluated for predictive performance on the test set.
  • Test set evaluation can occur once for each model to, for example, ensure that the model building process does not over-fit the test set.
  • the optimal tuning parameter estimates, the re-sampled training set performance and/or the test set performance can be reported (e.g., transmitted to a display and/or written to a file to include in a report). Actual values of the model parameters over randomized splits can be compared to evaluate model stability and/or robustness to training data.
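  • As a concrete illustration of this re-sampling workflow, the following is a minimal sketch, assuming scikit-learn and a random-forest classifier (the patent does not specify a library or model type), with synthetic features standing in for image-derived biological properties/analytes:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import (GridSearchCV, RepeatedStratifiedKFold,
                                         train_test_split)

    # Hypothetical feature matrix X and tissue-type labels y standing in for analyte data.
    X, y = make_classification(n_samples=300, n_features=20, random_state=0)

    # Stratified split into a "development" (training) set and a "holdout" (test) set.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=0)

    # Define the tuning parameter set; resampling combines hold-out predictions into a
    # performance profile from which the final tuning parameters are chosen.
    param_grid = {"n_estimators": [100, 300], "max_depth": [3, None]}
    resampling = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=0)
    search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid,
                          cv=resampling, scoring="accuracy")
    search.fit(X_train, y_train)   # fits models and predicts hold-out samples per split;
                                   # refit=True re-fits the entire training set at the end

    # Evaluate predictive performance on the test set exactly once per model.
    print("final tuning parameters:", search.best_params_)
    print("re-sampled training performance:", round(search.best_score_, 3))
    print("test set performance:", round(search.score(X_test, y_test), 3))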
  • one or more models are tuned for each of the biological properties/analytes (e.g., tissue types) represented in ground truth maps (e.g., ground truth 111 as described above).
  • Model responses (e.g., responses from the models as described above) and endpoints may have continuous and categorical responses; some of the techniques in the above categories are used for both categorical and continuous responses, while others are specific to either categorical or continuous responses.
  • Optimal tuning parameter estimates, the re-sampled training set performance, as well as the test set performance may be reported for each model.
  • As model complexity grows (e.g., amount of computation, hidden layers, stages of optimization, and/or dimensionality of hyperplanes), predictive performance can improve.
  • the performance improvement can be achieved at the expense of model interpretability.
  • the parameter coefficients from a multiple linear regression model intuitively link each predictor to the response.
  • the same kind of interpretation typically cannot be uncovered in a neural network, support vector machine, or many other models as are known in the art.
  • these models may provide much better predictive ability, especially if the underlying relationship between the predictors and the response is non-linear.
  • variable importance calculations are performed to extract at least a portion of interpretive information.
  • Variable importance projection methods can provide a weight to the individual features based on an extent that the respective individual feature contributes to a low dimensional data representation. For example, for problems where the number of features is equal to or larger than the number of training instances, classifier models can be subject to the “curse of dimensionality” problem.
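  • As one assumed way to extract such interpretive information, the sketch below computes a permutation-based variable importance with scikit-learn; the patent does not name a specific importance method, so both the method and the model are illustrative:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a feature set where a few features carry most of the signal.
    X, y = make_classification(n_samples=400, n_features=25, n_informative=5, random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
    model = GradientBoostingClassifier(random_state=1).fit(X_tr, y_tr)

    # Weight each feature by how much shuffling it degrades held-out performance.
    result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=1)
    top = sorted(enumerate(result.importances_mean), key=lambda t: -t[1])[:5]
    print("top features by permutation importance:", top)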
  • the analyzer module 120 provides functionalities as described below in Table 1:
  • the analyzer module 120 can delineate fields, for example, to register multiple data streams across a field; to segment organs, vessels, lesions and other application-specific objects; and/or to reformat/reconfigure anatomy for specific analyses.
  • the segmenting the vessels includes segmenting the medical imaging data to one or more parts of a blood vessel, including, a lumen boundary, an outer wall boundary, and/or one or more vessel boundaries based on the identified and quantified biological properties.
  • the biological properties can include calcified regions, LRNC regions, intra-plaque regions, matrix regions, or any combination thereof.
  • FIG. 4 of U.S. Pat. No. 16/203,418, along with the corresponding description, shows an example segmentation levels for a multi-scale vessel wall analyte map, according to the present disclosure.
  • the analyzer module 120 can delineate a target, for example, a lesion, in a delineated field.
  • Delineating a target may, for example, include registering multiple data streams at a locale; conducting fine-grained segmentation; measuring size and/or other characteristics of relevant anatomic structures; and/or extracting whole-target features (e.g., biological properties/analytes characteristic of the entire target region).
  • one or more sub-target regions are delineated.
  • a target region may be split into sub-target regions according to a particular application with sub-target specific calculations (e.g., biological properties/analytes characteristic of a sub-target region).
  • the analyzer module 120 can delineate components and/or relevant features (such as composition), for example, in a particular field, target or sub-target region.
  • Pathologies may be determined, based on the biological quantified properties/analytes, and characterized, e.g., by determining phenotype and/or predictive outcomes for the pathologies.
  • the analyzer module 120 compares data across multiple timepoints, e.g., one or more of the biological components/analytes may involve a time based quantification.
  • a wide scan field may be utilized to assess multi-focal pathologies, e.g., based on aggregate quantifications of biological properties/analytes across a plurality of targets in the delineated field.
  • the analyzer module 120 may be configured to generate a patient report.
  • FIG. 3 is a flow chart of a method for delineating existence of perivascular adipose tissue and determining cap thickness based on medical imaging data of a patient, according to some embodiments of the invention.
  • the medical imaging data can include radiological data and non-radiological data.
  • the medical imaging data can include MRI image information, patient demographic information and/or MRI device information.
  • the method can involve segmenting a lumen boundary based on the medical imaging data (Step 305 ).
  • the method can also involve segmenting an outer wall boundary based on the medical imaging data (Step 310 ).
  • Segmenting the medical image data can involve receiving the medical image data (e.g., from a file, from a user, from another computer and/or from the cloud).
  • the medical imaging data can include data obtained via an MRI, CT, and/or ultrasound device.
  • the medical imaging data can be analyzed (e.g., via analyzing module 120 as described above in FIG. 1 ), to identify and/or quantify one or more biological properties.
  • the medical image data can be restored and/or deblurred, as discussed in further detail below.
  • the medical imaging data can include input from a user that indicates a region of interest containing a physiological target that is to be phenotyped.
  • Identifying and/or quantifying the one or more biological properties can involve utilizing one or more machine learned algorithms.
  • the machine learned algorithms can be retrieved from a file, input by a user, retrieved from the cloud, or any combination thereof.
  • the machine learned algorithms can be algorithms trained using AlexNet, which is a convolutional neural network (CNN).
  • the machine learned algorithms are further based on non-image medical data, for example, genomics, proteomics, and/or transcriptomics data.
  • the one or more biological properties and the medical imaging data itself can be used for segmenting into two-dimensional (2D) or three-dimensional (3D) objects. For example, upon segmenting the lumen boundary, a lumen of the blood vessel can be visualized in 3D. Similarly, upon segmenting the outer wall boundary, the outer wall can be visualized in 3D.
  • prior to the segmenting of the lumen, a user viewing a volume rendering of the blood vessel can define an initial vessel centerline.
  • the segmenting of the lumen can be performed by a thresholding level set evolution using the optimal local Otsu threshold.
  • the segmenting of the outer wall can be performed by using a geodesic active contour level set evolution, initialized with the lumen segmentation and initial centerline.
  • the lumen and/or outer wall can be manually edited by a user, for example, via a user input device.
  • centerline paths are determined by defining a speed function for a fast marching algorithm.
  • the speed function can be a linear function of distance from the lumen boundary and outside of the lumen.
  • a small nonzero value can be used to, for example, allow for pathfinding across complete stenoses.
  • Gradient descent can be used to define an initial centerline, which can be further centralized by a ball-and-spring model that optimizes monotonic equal spacing and distance from the lumen boundary.
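  • A minimal sketch of the speed-map construction described above, assuming NumPy/SciPy (the patent does not name libraries): speed grows linearly with distance from the lumen boundary inside the lumen and takes a small nonzero value outside it, so a fast-marching front can still cross complete stenoses. Only the speed function is built here; the fast marching, gradient descent, and ball-and-spring centering steps are not reproduced, and the epsilon value is an illustrative assumption.

    import numpy as np
    from scipy import ndimage

    def centerline_speed(lumen_mask: np.ndarray, epsilon: float = 1e-3) -> np.ndarray:
        """Speed map for a fast-marching centerline search from a binary lumen mask."""
        # Distance (in voxels) from each interior voxel to the lumen boundary.
        dist_to_boundary = ndimage.distance_transform_edt(lumen_mask)
        # Linear function of distance inside the lumen; small nonzero value outside it.
        return np.where(lumen_mask, dist_to_boundary, epsilon)

    # Toy example: a filled tube along the z-axis as the lumen.
    zz, yy, xx = np.mgrid[0:32, 0:32, 0:32]
    mask = (yy - 16) ** 2 + (xx - 16) ** 2 <= 36
    speed = centerline_speed(mask)
    print(speed.max(), speed.min())   # highest speed near the tube axis, epsilon outside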
  • the method can involve partitioning (e.g., bifurcating) vessel boundaries (Step 315 ).
  • Partitioning vessel boundaries can be based on the segmented lumen boundary and outer wall boundary into one or more vessel boundaries.
  • the one or more vessel boundaries can be partitioned into 2D or 3D objects.
  • partitioning the vessel boundaries involves applying image registrations utilizing a Mattes mutual information metric (for MR), a mean square error metric (for CT), a rigid versor transform, and/or an LBFGSB optimizer, as examples.
  • An initial lumen segmentation can utilize a confidence connected filter (e.g., coronary, carotid, vertebral and/or femoral) to distinguish the lumen.
  • Lumen segmentation can utilize MR imaging (such as a combination of normalized, e.g., inverted for dark contrast, images) or CT imaging (such as use of registered pre-contrast, post-contrast CT and 2D Gaussian distributions) to define a vessel-ness function.
  • Various components that are nearby but not necessarily connected can be dilated to connect them.
  • partitioning the vessels involves outer wall segmentation (e.g., utilizing a minimum curvature (k2) flow to account for lumen irregularities).
  • an edge potential map is calculated as outward-downward gradients in both contrast and non-contrast.
  • outer wall segmentation utilizes cumulative distribution functions (incorporating prior distributions of wall thickness, e.g., from 1-2 adjoining levels) in a speed function to allow for median thickness in the absence of any other edge information.
  • Feret diameters are employed for vessel characterization.
  • wall thickness is calculated as the sum of the distance to the lumen and the distance to the outer wall.
  • lumen and/or wall segmentations are performed using semantic segmentation using, for example, CNNs.
  • the lumen and wall segmentations can be partitioned according to path segments by using a greedy fast marching competition from all the path points of the three segments, resulting in three mutually exclusive partitions of each segmentation.
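  • The competition idea above can be illustrated with a simplified stand-in: the sketch below assigns each voxel of a segmentation mask to the nearest of several seed (path-point) sets using Euclidean distance, yielding mutually exclusive partitions. The patent describes a greedy fast marching competition, which respects geodesic distances within the vessel; this Euclidean nearest-seed version is only an approximation for illustration.

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def partition_by_nearest_seed(mask, seed_sets):
        """Assign each voxel of `mask` to the nearest of several boolean seed masks."""
        # Distance from every voxel to the nearest seed voxel of each path segment.
        dists = [distance_transform_edt(~seeds) for seeds in seed_sets]
        labels = np.argmin(np.stack(dists), axis=0) + 1   # labels 1..N by nearest seed set
        return np.where(mask, labels, 0)                  # mutually exclusive partitions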
  • the method can involve determining one or more of calcified regions, LRNC regions, intra-plaque hemorrhage regions, matrix regions based on the segmentations (e.g., as the segmentations are based on quantified biological properties) (Step 320 ).
  • the method can involve delineating the perivascular adipose tissue, wherein delineating the perivascular adipose tissue can involve creating an evaluation region by extending the outer wall boundary by a predetermined distance and utilizing a second set of machine learned algorithms to identify whether the evaluation region includes the perivascular adipose tissue (Step 325 ).
  • the predetermined distance can be input by a user, retrieved from a file, or based on quantified biological properties.
  • the outer wall boundary is extended as a 3D object.
  • the outer wall boundary is extended by a predetermined volume.
  • the predetermined volume can be input by a user, retrieved from a file or based on quantified biological properties.
  • identifying whether the evaluation region includes the perivascular adipose tissue involves employing one or more algorithms for evaluating vascular and perivascular structure.
  • the system (e.g., system 100 as described above in FIG. 1 ) can employ a target/vessel segment/cross-section model for segmenting the underlying structure of an imaged vessel.
  • luminal irregularities and/or plaque structure e.g., remodeling
  • a maximum, minimum, mean or any combination thereof of a cross-sectional area of the perivascular adipose tissue is determined.
  • a maximum, minimum, mean or any combination thereof of a cross-sectional area of each of the one or more vessel boundaries is determined. In some embodiments, a volume of each of the one or more vessel boundaries is determined.
  • evaluating the vascular and perivascular composition can involve the model accounting for an observed image intensity at a pixel or voxel being influenced by a local neighborhood of hidden analyte category nodes thereby accounting for partial volume and scanner point spread function (PSF).
  • FIG. 4 of U.S. patent application Ser. No. 16/203,418 depicts a multi-scale vessel wall analyte map that includes a wall-level segmentation 410 (e.g., a cross-sectional slice of the vessel), a blob-level segmentation, and a pixel-level segmentation 430 (e.g., based on individual image pixels).
  • the method can also involve determining cap thickness (e.g., a layer of tissue that can be described as a cap) based on a minimum distance between the lumen boundary and LRNC regions (Step 330 ).
  • the minimum distance between the lumen boundary and the LRNC regions can be determined by creating a first vector between a voxel in the lumen and a voxel in the LRNC, determining the length of the vector, creating a second vector between a different voxel in the lumen and a different voxel in the LRNC, determining its length, comparing it against the first determined length, and keeping the shorter of the two. Performing these steps for additional lumen and LRNC voxels finds the shortest distance, which is assigned as the cap thickness (an equivalent computation is sketched below).
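  • A sketch of that equivalent cap-thickness computation, assuming NumPy/SciPy: rather than comparing voxel pairs one at a time, a Euclidean distance transform yields the same minimum lumen-to-LRNC distance. The voxel spacing values are an illustrative assumption.

    import numpy as np
    from scipy import ndimage

    def cap_thickness(lumen_mask, lrnc_mask, spacing=(0.5, 0.5, 0.5)):
        """Shortest distance (in the units of `spacing`) from the lumen to the LRNC."""
        # Distance from every voxel to the nearest lumen voxel, in physical units.
        dist_to_lumen = ndimage.distance_transform_edt(~lumen_mask, sampling=spacing)
        # The cap thickness is the smallest such distance attained inside the LRNC.
        return float(dist_to_lumen[lrnc_mask].min())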
  • parameters related to cross-sections of tissues are determined and/or output, for example, within each positioned and oriented slab, a maximum, mean, minimum, and/or area of tissue characteristics in-wall (e.g., within the outer wall boundary) and/or perivascular.
  • parameters related to one or more vessels are determined and/or output. For example, within each partition as described above, a maximum, mean, and/or minimum cross-section measurements across all the cross-sections included in each respective vessel can be determined and/or output. Within each partition, a volume and/or volume proportion can be determined and/or output (e.g., for 3D objects).
  • parameters related to a target (e.g., a group of vessels) are determined and/or output. In some embodiments, determining the parameters related to a target involves performing functions similar to those at the vessel level, but for the target as a whole.
  • the readings can be marshalled for ML (e.g., out to training sets and/or in for per-patient inference). In some embodiments, the readings can be marshalled for ML either alone or with non-imaging data, e.g., bloodwork and/or transcriptomics in curated tissue collections.
  • images are stored (e.g., optionally enhanced) for DL (e.g., out to training sets and/or in for per-patient inference)
  • the images are stored for DL either alone or with non-imaging data, e.g., bloodwork and/or transcriptomics in curated tissue collections.
  • In FIG. 3 A, the restored/deblurred image shows a cross section through a coronary artery with intensity profiles for an original source CT scan after the restoring methods described above (e.g., FIGS. 4 and 5 ) are applied.
  • the change in comparison to the original image can be on the order of ±30 HU.
  • FIG. 3 A is a diagram showing how restoring can affect accurate quantitation of tissues, being a cross section through a coronary artery with intensity profiles for the original source CT and after the model-based restoring.
  • For PVAT (perivascular adipose tissue), one assumption can be that fully adipose tissue is −100 HU while water due to inflammation is 0 HU, such that adipose tissue with some inflammation would be somewhere between −100 and 0 HU depending on how much water has entered the tissue. If, for example, tissue were 50% adipose tissue and 50% water from inflammation, a change of 50 HU would result, which may be negated by uncorrected CT blur, thus making a potentially good biomarker ineffective (see the sketch below).
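  • The sketch below simply restates that arithmetic as a linear mixing of the assumed attenuation values for fat (−100 HU) and water (0 HU); the function name and interface are illustrative only.

    def pvat_hu(water_fraction: float, hu_fat: float = -100.0, hu_water: float = 0.0) -> float:
        """Expected attenuation of perivascular adipose tissue with a given water fraction."""
        return (1.0 - water_fraction) * hu_fat + water_fraction * hu_water

    print(pvat_hu(0.0))   # -100.0 HU: pure adipose tissue
    print(pvat_hu(0.5))   #  -50.0 HU: 50% adipose / 50% water, the 50 HU shift noted above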
  • the medical image data can be restored and/or restored/deblurred.
  • FIG. 21 of U.S. patent application Ser. No. 16/203,418 shows an example of a pre-processing step of restoring using a patient-specific point spread determination algorithm to mitigate artifacts and/or image limitations that can result from the image formation process that can, for example, decrease the ability to determine characteristics predictive of the phenotype.
  • the figure demonstrates a portion of the radiology analysis application's analysis of a plaque from a CT.
  • the restored/deblurred images are a result of iteratively fitting a physical model of the scanner point spread function with regularizing assumptions about the true latent density of different regions of the image.
  • restoring of the image can be performed as follows: An imaging device (such as an MRI or CT device) can be used to acquire a measured image.
  • a tissue characteristics image model can be initialized for the measured image representing a true underlying image.
  • a tissue characteristics model can apply a level-set method (LSM) as a conceptual framework for numerical analysis of surfaces and shapes in the image representing biological analytes.
  • the tissue characteristics model can map level sets to the image data via a set of characteristic equations, and thereby can represent specific biological analytes.
  • the characteristic equations can be utilized to solve an optimization problem to determine optimal transformation parameters for the tissue characteristics model, and can thereby optimize restoring for segmentation of the specific biological analytes being analyzed.
  • the tissue characteristics model and/or the optimization parameters can advantageously account/make use of a knowledge base of the underlying biology of the system, e.g., based on biological models for the analytes.
  • the optimization problem can be solved using an iterative process which iteratively adjusts the tissue characteristics image model in order to minimize an energy function which models imaging physics relating to the appearance of different analytes in a Bayesian framework (e.g., energy may be the negative log of probabilities for the Bayesian framework integrated over the image).
  • a restored/deblurred image may be outputted based on the transform parameters determined from the optimization problem.
  • the restored/deblurred image can include restoring which can be optimized for segmentation and/or for quantitative analysis of the biological analytes. This can represent a significant improvement over generalized restoring techniques that have not accounted for the underlying biology of the system being analyzed.
  • Various advantages and improvements can be provided by restoring, for example, removal of blur that derives from very bright as well as very dark signals. Unlike conventional techniques, this may advantageously account for both the technical image formation process in the scanner and the specific biology being imaged. Additional advantages can include deriving scanner blur based on the image and incorporating detailed statistical models of prior estimates of tissue characteristics drawn from a truth source, e.g., histopathology.
  • the prior estimates used can inform the classification process so as to provide the most plausible explanation for the observed image data. Additional advantages can include increased accuracy in readings of biological analytes, e.g., cross-sectional areas, volumes, and spatial locations for different types of tissues.
  • FIG. 4 is a flow chart for a method for restoring and segmenting, according to some embodiments of the invention.
  • the method can involve acquiring a measured image (Step 401 ) via an imaging device.
  • the method can also involve initializing a tissue characteristics image model for the measured image that represents a true underlying image (Step 402 ).
  • initializing a tissue characteristics image model can involve initializing φi level set functions and χi characteristic functions, and initializing g with the background region masked to a constant intensity.
  • the method can also involve solving an optimization algorithm using an iterative process which can iteratively adjust the tissue characteristics image model in order to minimize an energy function which models imaging physics relating to the appearance of different analytes (Step 403 ).
  • the optimization algorithm involves:
  • Volume fractions are computed from characteristic functions.
  • a stopping criterion for the iterations can be based upon a user-defined number of iterations.
  • the optimization algorithm can use the iterative process as shown below in FIG. 5 .
  • the method can also involve outputting a restored/deblurred image based on the transform parameters determined from the optimization algorithm (Step 404 ).
  • FIG. 5 is a flow chart for a method of an iterative optimization algorithm step (e.g., step 403 of FIG. 4 ) as applied within the context of multi-phase level sets, where analyte image regions are defined by characteristic functions as a function of level set functions for each tissue characteristic type.
  • the method can involve calculating characteristic functions from the level set functions for the current iteration, for example, based on the current level sets (Step 501 ).
  • the method can also involve calculating blurred characteristic functions from the characteristic functions, e.g., based on an IIR Gaussian blurring given a point spread function (PSF) for the imaging device (Step 502 ).
  • the method can also involve calculating image intensity constants for the blurred characteristic functions (Step 503 ).
  • the method can also involve calculating level set updates, e.g., based on a gradient descent approach to minimizing an energy function (Step 504 ).
  • the iterative process reinitializes the level sets and characteristic equations with every iteration (e.g., prior to repeating steps 501 - 504 ).
  • a signed distance property of the level set functions is relaxed during each iteration until reinitialization after the iteration.
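  • The iteration above (Steps 501 - 504 ) can be made concrete with a deliberately simplified sketch: a single level set (two phases), a piecewise-constant image model, a Gaussian PSF, and a Chan-Vese-style length penalty, written with NumPy/SciPy. This is an illustration under those assumptions, not the patent's multi-phase, biology-informed formulation; the parameter values, normalization, and initialization are arbitrary.

    import numpy as np
    from scipy.ndimage import distance_transform_edt, gaussian_filter

    def heaviside(phi, eps=1.5):
        # Smoothed Heaviside: characteristic function of the region {phi > 0} (Step 501).
        return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))

    def dirac(phi, eps=1.5):
        # Derivative of the smoothed Heaviside; concentrates updates near the zero level set.
        return (eps / np.pi) / (eps ** 2 + phi ** 2)

    def curvature(phi):
        # div(grad(phi)/|grad(phi)|): the smoothing (length-penalty) force.
        gy, gx = np.gradient(phi)
        norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
        return np.gradient(gx / norm, axis=1) + np.gradient(gy / norm, axis=0)

    def reinitialize(phi):
        # Restore the signed-distance property after each iteration.
        inside = phi > 0
        return distance_transform_edt(inside) - distance_transform_edt(~inside)

    def restore_two_phase(g, sigma_psf=2.0, mu=0.05, dt=2.0, n_iter=300):
        lo, hi = float(g.min()), float(g.max())
        gn = (g - lo) / (hi - lo + 1e-12)                  # work on a normalized copy
        phi = reinitialize(gn > gn.mean())                 # crude initial level set
        for _ in range(n_iter):
            chi = heaviside(phi)                           # Step 501: characteristic function
            chi_b = gaussian_filter(chi, sigma_psf)        # Step 502: blurred characteristic fn
            basis = np.stack([chi_b.ravel(), 1.0 - chi_b.ravel()], axis=1)
            (c1, c2), *_ = np.linalg.lstsq(basis, gn.ravel(), rcond=None)  # Step 503: intensities
            residual = gn - (c1 * chi_b + c2 * (1.0 - chi_b))   # blurred model vs. acquired image
            data_force = 2.0 * (c1 - c2) * gaussian_filter(residual, sigma_psf)
            phi = phi + dt * dirac(phi) * (data_force + mu * curvature(phi))  # Step 504: descent
            phi = reinitialize(phi)
        chi = heaviside(phi)
        return (c1 * chi + c2 * (1.0 - chi)) * (hi - lo) + lo   # piecewise-constant restored image

    # Toy usage: a bright disk blurred by a Gaussian "scanner PSF" plus noise.
    yy, xx = np.mgrid[0:64, 0:64]
    truth = np.where((yy - 32) ** 2 + (xx - 32) ** 2 < 150, 80.0, -80.0)
    acquired = gaussian_filter(truth, 2.0) + np.random.default_rng(0).normal(0, 2, truth.shape)
    restored = restore_two_phase(acquired)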
  • FIGS. 6 , 7 A- 7 B and 8 A- 8 B can illustrate effectiveness of the restoring/segmentation as described above.
  • FIG. 6 is an example of an acquired image which includes blurring.
  • FIG. 7 A shows an example of an acquired image which includes blurring, and FIG. 7 B shows the same acquired image of FIG. 7 A after restoring and segmenting according to embodiments of the invention are applied.
  • FIG. 8 A shows another example of an acquired image which includes blurring, and FIG. 8 B shows the same acquired image of FIG. 8 A after restoring and segmenting according to embodiments of the invention are applied.
  • the restoring and/or segmentation method of the invention can advantageously provide for improved recognition of biological analytes.
  • a restored/deblurred image is augmented by replacing segmented regions representing biological analytes with an overlay map (e.g., a color-coded overlay map) for the segmented regions.
  • FIG. 9 shows an example of an augmented restored/deblurred image, according to some embodiments of the invention.
  • the augmented restored/deblurred image can depict quantitative measurements associated with identified analyte regions as well as one or more graphical characterizations of structure and/or composition.
  • the augmented restored/deblurred image may advantageously provide improved tools for a clinician to evaluate a pathology of the patient.
  • the radial distance may be defined based on the shortest distance to the inner luminal surface and the shortest distance to the outer adventitial surface.
  • the expert-annotation of the histology images includes regions that define the lumen and the vessel (defined as the union of the lumen and vessel wall).
  • a signed distance function can be created for each of these, L(x) and V(x), respectively.
  • the convention is that the interior of these regions is negative so that in the wall L is positive and V is negative.
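  • A sketch of this signed-distance construction, assuming NumPy/SciPy: L(x) and V(x) are signed distances to the lumen and vessel regions, negative inside each region, so that within the wall L > 0 and V < 0. The normalized radial coordinate shown (0 at the luminal surface, 1 at the adventitial surface) is an illustrative convention rather than a formula taken from the patent.

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def signed_distance(mask):
        """Signed distance to the boundary of `mask`, negative inside the region."""
        return distance_transform_edt(~mask) - distance_transform_edt(mask)

    def radial_position(lumen_mask, vessel_mask):
        L = signed_distance(lumen_mask)           # positive in the wall, negative in the lumen
        V = signed_distance(vessel_mask)          # negative in the wall and in the lumen
        wall = vessel_mask & ~lumen_mask
        r = np.full(L.shape, np.nan)
        r[wall] = L[wall] / (L[wall] - V[wall])   # fraction of the way from lumen to adventitia
        return r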
  • one level set may be used for the entire vessel lumen initialized with the segmented lumen L.
  • Each distinct contiguous bright region can be initialized as its own level set and calculated as follows: Candidate bright regions are computed using a morphological watershed applied to the inverted image (to turn bright peaks into catchment basins).
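  • A sketch of that bright-region initialization, assuming scikit-image: the image is negated so bright peaks become catchment basins of a morphological watershed, and only basins containing a sufficiently bright peak are kept. The blob positions and the brightness threshold are illustrative assumptions.

    import numpy as np
    from skimage.segmentation import watershed

    rng = np.random.default_rng(0)
    img = rng.normal(0.0, 1.0, (64, 64))
    img[20:26, 20:26] += 10.0          # a hypothetical bright region (e.g., calcification)
    img[40:44, 45:49] += 8.0           # another hypothetical bright region

    labels = watershed(-img)           # basins of the inverted image correspond to bright peaks
    bright = [lab for lab in np.unique(labels) if img[labels == lab].max() > 5.0]
    print(bright)                      # candidate bright regions; each could seed its own level set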
  • energy functionals can represent an approach that integrates modeling between imaging physics and biology.
  • the imaging physics portion can account for image intensities and the PSF of the scanner while the biological portion of the model incorporates histology-driven knowledge of the structure and growth patterns of atherosclerotic plaques.
  • the model prior weights the model toward the most likely configurations and away from physically and biologically unrealistic solutions.
  • the model can be provided in probabilistic terms and the energy is the negative log of probabilities integrated over the image. In addition to providing analytic tractability, the logarithm super-linearly weights against decreasing probability solutions.
  • a Naïve Bayes [10] domain independence assumption is made between imaging physics and biology, e.g., that the likelihood of the residual between the blurred model and the blurred acquired image does not depend on the biological likelihood of a configuration of tissue characteristic regions next to each other.
  • the various model parameters that can be evolved throughout the algorithm include the level set functions mapped over the image, the true (e.g., restored/deblurred) image intensity of different biological tissue characteristics, and the width of the scanner PSF.
  • the pre-learned model parameters can include the model of the dependencies of the spatial distribution of tissue characteristics within a plaque.
  • the model is iteratively adjusted in order to minimize the energy function through a gradient descent trajectory.
  • the gradient descent approach can allow for the direct adjustment of model parameters, such as each level set φ, in order to minimize energy.
  • An imaging physics term in the energy functional can represent the L2 norm of the difference between the blurred idealized piecewise constant image and the acquired image.
  • the coefficients can allow for a balance between the effect of curvature evolution smoothing and minimizing the model-to-image residual.
  • the evidence variables can be the acquired image pixel intensities represented by the blurred image g. Within each iteration, the ordering of sub-steps can follow the flow of information through the variables.
  • the characteristic functions can serve as an intermediary and the Euler-Lagrange equation can be determined in terms of the level set functions.
  • the energy functional can be minimized using a gradient descent approach that moves each φ toward the local minimum of E at every point in space simultaneously and independently. Within each iteration, the signed distance property of the level set functions can be relaxed until reinitialization after the iteration, and thus the integral disappears.
  • One advantage of the invention can be that corrections to the image, (h*f − f), can be low frequency in that they can be simply step edges blurred by a Gaussian, thereby preventing erroneous amplification of high frequency noise, which may often occur with conventional deconvolution techniques that may never fully separate amplifying true image structure from amplifying image noise.
  • the error of this improved deconvolution process may be subject only (or substantially only) to the accuracy of the region image intensity constants, the location of the edges, and/or the imaging system blur, all of which can be highly intuitive and can easily be visually confirmed by the end user.
  • the terms “plurality” and “a plurality” as used herein can include, for example, “multiple” or “two or more”.
  • the terms “plurality” or “a plurality” can be used throughout the specification to describe two or more components, devices, elements, units, parameters, or the like.
  • the term set when used herein can include one or more items.
  • the method embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described method embodiments or elements thereof can occur or be performed simultaneously, at the same point in time, or concurrently.
  • a computer program can be written in any form of programming language, including compiled and/or interpreted languages, and the computer program can be deployed in any form, including as a stand-alone program or as a subroutine, element, and/or other unit suitable for use in a computing environment.
  • a computer program can be deployed to be executed on one computer or on multiple computers at one site.
  • Method steps can be performed by one or more programmable processors executing a computer program to perform functions of the invention by operating on input data and generating output. Method steps can also be performed by an apparatus and can be implemented as special purpose logic circuitry.
  • the circuitry can, for example, be a FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit). Modules, subroutines, and software agents can refer to portions of the computer program, the processor, the special circuitry, software, and/or hardware that implement that functionality.
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor receives instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data.
  • a computer can be operatively coupled to receive data from and/or transfer data to one or more mass storage devices for storing data (e.g., magnetic, magneto-optical disks, or optical disks).
  • Data transmission and instructions can also occur over a communications network.
  • Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices.
  • the information carriers can, for example, be EPROM, EEPROM, flash memory devices, magnetic disks, internal hard disks, removable disks, magneto-optical disks, CD-ROM, and/or DVD-ROM disks.
  • the processor and the memory can be supplemented by, and/or incorporated in special purpose logic circuitry.
  • the above described techniques can be implemented on a computer having a display device, a transmitting device, and/or a computing device.
  • the display device can be, for example, a cathode ray tube (CRT) and/or a liquid crystal display (LCD) monitor.
  • the interaction with a user can be, for example, a display of information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer (e.g., interact with a user interface element).
  • Other kinds of devices can be used to provide for interaction with a user.
  • Other devices can be, for example, feedback provided to the user in any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback).
  • Input from the user can be, for example, received in any form, including acoustic, speech, and/or tactile input.
  • the computing device can include, for example, a computer, a computer with a browser device, a telephone, an IP phone, a mobile device (e.g., cellular phone, personal digital assistant (PDA) device, laptop computer, electronic mail device), and/or other communication devices.
  • the computing device can be, for example, one or more computer servers.
  • the computer servers can be, for example, part of a server farm.
  • the browser device includes, for example, a computer (e.g., desktop computer, laptop computer, and tablet) with a World Wide Web browser (e.g., Microsoft® Internet Explorer® available from Microsoft Corporation, Chrome available from Google, Mozilla® Firefox available from Mozilla Corporation, Safari available from Apple).
  • the mobile computing device includes, for example, a personal digital assistant (PDA).
  • Website and/or web pages can be provided, for example, through a network (e.g., Internet) using a web server.
  • the web server can be, for example, a computer with a server module (e.g., Microsoft® Internet Information Services available from Microsoft Corporation, Apache Web Server available from Apache Software Foundation, Apache Tomcat Web Server available from Apache Software Foundation).
  • the storage module can be, for example, a random access memory (RAM) module, a read only memory (ROM) module, a computer hard drive, a memory card (e.g., universal serial bus (USB) flash drive, a secure digital (SD) flash card), a floppy disk, and/or any other data storage device.
  • Information stored on a storage module can be maintained, for example, in a database (e.g., relational database system, flat database system) and/or any other logical information storage mechanism.
  • the above-described techniques can be implemented in a distributed computing system that includes a back-end component.
  • the back-end component can, for example, be a data server, a middleware component, and/or an application server.
  • the above-described techniques can be implemented in a distributed computing system that includes a front-end component.
  • the front-end component can, for example, be a client computer having a graphical user interface, a Web browser through which a user can interact with an example implementation, and/or other graphical user interfaces for a transmitting device.
  • the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, wired networks, and/or wireless networks.
  • the system can include clients and servers.
  • a client and a server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • Packet-based networks can include, for example, the Internet, a carrier internet protocol (IP) network (e.g., local area network (LAN), wide area network (WAN), campus area network (CAN), metropolitan area network (MAN), home area network (HAN), a private IP network, an IP private branch exchange (IPBX), a wireless network (e.g., radio access network (RAN), 802.11 network, 802.16 network, general packet radio service (GPRS) network, HiperLAN)), and/or other packet-based networks.
  • Circuit-based networks can include, for example, the public switched telephone network (PSTN), a private branch exchange (PBX), a wireless network (e.g., RAN, Bluetooth®, code-division multiple access (CDMA) network, time division multiple access (TDMA) network, global system for mobile communications (GSM) network), and/or other circuit-based networks.
  • Some embodiments of the present invention may be embodied in the form of a system, a method or a computer program product. Similarly, some embodiments may be embodied as hardware, software or a combination of both. Some embodiments may be embodied as a computer program product saved on one or more non-transitory computer readable medium (or media) in the form of computer readable program code embodied thereon. Such non-transitory computer readable medium may include instructions that when executed cause a processor to execute method steps in accordance with embodiments. In some embodiments the instructions stored on the computer readable medium may be in the form of an installed application and/or in the form of an installation package.
  • Such instructions may be, for example, loaded and executed by one or more processors.
  • the computer readable medium may be a non-transitory computer readable storage medium.
  • a non-transitory computer readable storage medium may be, for example, an electronic, optical, magnetic, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
  • Computer program code may be written in any suitable programming language.
  • the program code may execute on a single computer system, or on a plurality of computer systems.

Abstract

A system including a hierarchical analytics framework that can utilize a first set of machine learned algorithms to identify and quantify a set of biological properties utilizing medical imaging data is provided. The system can segment the medical imaging data based on the quantified biological properties to delineate the existence of perivascular adipose tissue. The system can also segment the medical imaging data based on the quantified biological properties to determine a lumen boundary and/or determine a cap thickness based on a minimum distance between the lumen boundary and LRNC regions.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 16/984,640, filed Aug. 4, 2020, claiming the benefit of U.S. provisional patent application No. 62/882,881, filed on Aug. 5, 2019, all of which are hereby incorporated by reference in their entireties.
  • GOVERNMENT RIGHTS
  • This work was supported in part by NIH award ID HL 126224. The government may have certain rights in the invention.
  • FIELD OF THE INVENTION
  • The invention relates to computer-aided phenotyping (CAP) of disease which can include applying computerized image analysis and/or data fusion algorithms to patient data. In particular, the invention relates to quantitative imaging and analytics for elucidating the disease process of atherosclerosis, including delineating perivascular adipose tissue and/or determining the thickness of tissues between the lipid-rich necrotic core (LRNC) and the lumen (“cap thickness”) of a blood vessel.
  • BACKGROUND OF THE INVENTION
  • Atherosclerosis can be life threatening, particularly in aging populations, but even among the relatively young. Current methods for diagnosing atherosclerosis, for example, the use of blood markers (e.g., cholesterol levels) and/or determining the degree to which the lumen is narrowed (stenosis), are limited, and thus can result in suboptimal treatment decisions (e.g., to perform or not to perform surgeries, or prescribe intensive medical therapy). For example, many vascular surgeries do not benefit the patient, some patients who need surgeries don't get them, and many could be effectively treated with drugs but may not be prescribed them.
  • Current tools can analyze a blood vessel lumen, but this can be insufficient for truly diagnosing atherosclerosis, as atherosclerosis is a disease of the vessel wall, rather than the blood or the channel through which it flows. High rates of misclassified risk level, inability to assess likely response to drug therapy, and/or inability to measure response to drugs can occur.
  • Currently, radiological imaging can be used as a non-invasive and safe method for locating disease origin. Current medical imaging tools can include computed tomography (CT, including single energy, multi-energy, or spectral CT), magnetic resonance imaging (MR, MRA, DCE-MRI, or multi-contrast MRI), ultrasound (b-mode or intravascular US), and targeted contrast agent approaches with various imaging modalities.
  • Enhanced imaging techniques have made medical imaging an essential component of patient care. Imaging can be valuable because it can provide spatially and temporally localized anatomic and/or functional information, using non- or minimally invasive methods. However, techniques to deal with increasing resolution can be desired, both to exploit patterns and/or signatures in the data typically not readily assessed with the human eye, as well as to, for example, manage a large magnitude of data to efficiently integrate it into the clinical workflow. With newer high-resolution imaging techniques, unaided, the radiologist can “drown” in data. Therefore, in order to, for example, integrate quantitative imaging for individual patient management it can be desirable to provide a class of decision support informatics tools to enable further exploiting the capabilities of imaging within the realities of existing tool work flows and/or reimbursement constraints.
  • Currently, imaging of atherosclerosis is routinely performed both invasively through catheterization as well as non-invasively by ultrasound, CT, MR, and nuclear medicine techniques. The most typical assessment is luminal stenosis. Recent progress has been made in the determination of fractional flow reserve.
  • One difficulty with current imaging of atherosclerosis can include lack of robustness in the method used. For example, current methods typically only provide a low level of contrast between blood vessel outer wall and perivascular tissues, thus making it difficult to distinguish between the two. Some current methods simply employ annular rings around a lumen without specific determination of outer wall boundary. Vessel tapering, branching vessels, nearby tissues, etc. can also be problematic.
  • Another difficulty with current imaging of atherosclerosis can be due to a particular imaging device interrogating tissue using a limited excitation; despite the utility of multi-contrast MR on the one hand, or multi-energy CT on the other, the result can be a degree of non-specific response in the produced signal.
  • SUMMARY OF THE INVENTION
  • Current difficulties with the recent progress can include difficulty interpreting raw pixel reconstructed intensity values using simplistic thresholding operators. One aspect of this is that the physical imaging modality intrinsically limits the degree to which the pixel values are correct manifestations of the object being imaged, for example due to the fact that a given point is actually spread or blurred according to the finite physical characteristics of the imaging. For example, at the submillimeter scale of this analysis, scanner blur (e.g., manifestations such as "calcium blooming") plays a dominant role in quantitative accuracy and, thus, we compensated for the imaging system point spread function. Additionally, heterogeneity of tissues both within and outside the vessel wall presents classification and measurement challenges unless processed effectively.
  • In one aspect, the invention includes a system comprising a processor and a non-transient storage medium including processor executable instructions implementing an analyzer module including a hierarchical analytics framework. The hierarchical analytics framework can be configured to utilize a first set of machine learned algorithms to identify and quantify a set of biological properties utilizing medical imaging data, and to segment the medical imaging data based on the quantified biological properties to delineate the existence of perivascular adipose tissue.
  • In some embodiments, segmenting the medical imaging data further comprises segmenting the medical imaging data into at least a lumen boundary and an outer wall boundary. In some embodiments, the analyzer module is configured to partition a lumen and an outer wall based on the segmented lumen boundary and outer wall boundary into one or more vessel boundaries.
  • In some embodiments, the biological properties include calcified regions, LRNC regions, intra-plaque regions, matrix regions, or any combination thereof. In some embodiments, delineating the perivascular adipose tissue further includes creating an evaluation region by extending the outer wall boundary by a predetermined distance and utilizing a second set of machine learned algorithms to identify whether the evaluation region includes the perivascular adipose tissue.
  • In some embodiments, the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of the perivascular adipose tissue. In some embodiments, the analyzer module is configured to, for each partition, determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of each of the one or more vessel boundaries.
  • In some embodiments, the analyzer module is configured to, for each partition, determine a volume of each of the one or more vessel boundaries. In some embodiments, the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area for a target.
  • In some embodiments, segmenting the medical image data further comprises segmenting the medical image data into three-dimensional (3D) objects.
  • In another aspect, the invention involves a system including a processor and a non-transient storage medium including processor executable instructions implementing an analyzer module including a hierarchical analytics framework. The hierarchical analytics framework can be configured to utilize a first set of machine learned algorithms to identify and quantify a set of biological properties utilizing medical imaging data, wherein the biological properties include LRNC regions of a blood vessel, to segment the medical imaging data based on the quantified biological properties to determine a lumen boundary, and to determine a cap thickness based on a minimum distance between the lumen boundary and the LRNC regions.
  • In some embodiments, segmenting the medical imaging data further comprises segmenting the medical imaging data into an outer wall boundary. In some embodiments, the analyzer module is configured to partition a lumen and an outer wall based on the segmented lumen boundary and outer wall boundary into one or more vessel boundaries. In some embodiments, the biological properties include calcified regions, LRNC regions, intra-plaque regions, matrix regions, or any combination thereof.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Non-limiting examples of embodiments of the disclosure are described below with reference to figures attached hereto that are listed following this paragraph. Dimensions of features shown in the figures are chosen for convenience and clarity of presentation and are not necessarily shown to scale.
  • The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features and advantages thereof, can be understood by reference to the following detailed description when read with the accompanied drawings. Embodiments of the invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like reference numerals indicate corresponding, analogous or similar elements, and in which:
  • FIG. 1 is a diagram of a system for determining and characterizing a medical condition by implementing a hierarchical analytics framework, according to some embodiments of the invention.
  • FIG. 2 is a flow chart for a re-sampling based model building method, according to some embodiments of the invention.
  • FIG. 3 is a flow chart of a method for delineating existence of perivascular adipose tissue and determining cap thickness based on medical imaging data of a patient, according to some embodiments of the invention.
  • FIG. 3A is a diagram showing effects of restoring as can affect accurate quantitation of tissues, being a cross section through a coronary artery with intensity profiles for the original source CT and after the model-based restoring, according to some embodiments of the invention.
  • FIG. 4 is a flow chart for a method for restoring and segmenting, according to some embodiments of the invention.
  • FIG. 5 is a flow chart for a method of an iterative optimization algorithm, according to some embodiments of the invention.
  • FIG. 6 is an example of an acquired image which includes blurring, according to the prior art.
  • FIGS. 7A and 7B show an example of an acquired image before and after restoring, respectively, according to some embodiments of the invention.
  • FIGS. 8A and 8B show another example of an acquired image before and after restoring, respectively, according to some embodiments of the invention.
  • FIG. 9 shows an example of an augmented restored/deblurred image, according to some embodiments of the invention.
  • It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn accurately or to scale. For example, the dimensions of some of the elements can be exaggerated relative to other elements for clarity, or several physical components can be included in one functional block or element.
  • DETAILED DESCRIPTION
  • FIG. 1 is a diagram of a system for determining and characterizing a medical condition by implementing a hierarchical analytics framework, according to some embodiments of the invention. The system 100 can include a trainer module 110, an analyzer module 120 and a cohort tool module 130.
  • The analyzer module 120 can include a hierarchical analytics framework which can identify and/or quantify biological properties/analytes 123 based on medical imaging data (e.g., medical imaging data of a patient). The medical imaging data can include (i) imaging features 122 from one or more acquired images 121A of a patient 50 and/or (ii) non-imaging input data 121B for a patient 50. The analyzer module 120 can identify and/or characterize one or more pathologies (e.g., prognostic phenotypes) 124 based on the quantified biological properties/analytes 123. The analyzer module 120 can operate independent of ground truth and/or validation references by implementing one or more pre-trained algorithms, e.g., machine learned algorithms, for drawing inferences.
  • In some embodiments, the analyzer module 120 includes algorithms for calculating imaging features 122 from the acquired images 121A of the patient 50. In various embodiments, the image features 122 are computed on a per-voxel basis, on a region-of-interest basis, or any combination thereof. In some embodiments, non-imaging inputs 121B that can be utilized in calculating imaging features 122 include data from laboratory systems, patient-reported symptoms, patient history, or any combination thereof.
  • The image features 122 and/or non-imaging inputs 121B can be utilized by the analyzer module 120 to calculate the biological properties/analytes 123. The biological properties/analytes are typically quantitative, objective properties (e.g., objectively verifiable rather than being stated as impression or appearances) that may represent e.g., a presence and degree of a marker (such as a chemical substance) or other measurements such as structure, size, or anatomic characteristics of region of interest. In various embodiments, the quantified biological properties/analytes 123 are displayed and/or exported for direct consumption by a user, e.g., by a clinician, in addition to or independent of further processing by the analyzer module 120.
  • The cohort tool module 130 can define a cohort of patients for group analyses thereof, e.g., based on a selected set of criteria related to the cohort study in question. An example cohort analysis may be for a group of patients enrolled in a clinical trial, e.g., with the patients further being grouped based on one or more arms of the trial for example a treatment vs. control arm. Another type of cohort analysis may be for a set of subjects for which ground truth or references exist, and this type of cohort may be further decomposed into a training set or “development” set and a test or “holdout” set. Development sets may be supported so as to train 112 the algorithms and models within analyzer module 120, and holdout sets may be supported so as to evaluate/validate 113 the performance of the algorithms or models within analyzer module 120.
  • The trainer module 110 can be utilized to train 112 the algorithms and models within analyzer module 120. The trainer module 110 can rely on ground truth 111 and/or reference annotations 114 so as to derive weights and/or models, e.g., according to established machine learning paradigms or by informing algorithm developers. In some embodiments, the trainer module 110 employs classification and/or regression models. The classification and/or regression modules can be highly adaptable, e.g., capable of uncovering complex relationships among the predictors and the response. However, their ability to adapt to the underlying structure within the existing data can enable the models to find patterns that are not reproducible for another sample of subjects. Adapting to irreproducible structures within the existing data is commonly known as model over-fitting. To avoid building an over-fit model, a systematic approach may be applied that prevents a model from finding spurious structure and enables the end-user to have confidence that the final model will predict new samples with a degree of accuracy similar to that achieved on the set of data for which the model was evaluated.
  • Successive training sets may be utilized to determine optimal tuning parameter(s), and a test set may be utilized to estimate an algorithm's or model's predictive performance. Training sets may be used for training each of the classifiers via randomized cross-validation. Datasets may be repeatedly split into training and testing sets and may be used to determine classification performance and model parameters. The splitting of the datasets into training and test sets can occur using a stratified and/or maximum dissimilarity approach. In some embodiments, a re-sampling approach (e.g. bootstrapping) is utilized within the training set in order to obtain confidence intervals for (i) the optimal parameter estimate values, and/or (ii) the predictive performance of the models.
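  • As a rough illustration of the stratified splitting and within-training-set bootstrapping described above, the following Python sketch (assuming scikit-learn, with a placeholder logistic-regression classifier, synthetic data, and an AUC metric chosen purely for illustration, none of which are prescribed by the invention) estimates a confidence interval for predictive performance:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

# placeholder readings (features) and binary tissue labels
X, y = np.random.rand(200, 16), np.random.randint(0, 2, 200)

# stratified split into a development (training) set and a holdout (test) set
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# bootstrap re-sampling within the training set to obtain a confidence
# interval for predictive performance (AUC used purely for illustration)
aucs = []
for b in range(200):
    Xb, yb = resample(X_tr, y_tr, random_state=b)
    clf = LogisticRegression(max_iter=1000).fit(Xb, yb)
    aucs.append(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
ci_low, ci_high = np.percentile(aucs, [2.5, 97.5])
```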
  • In some embodiments, the classification models may be trained in whole or in part by application of multi-scale modeling techniques, such as for example partial differential equations, e.g., to represent likely cell signaling pathways or plausible biologically-motivated presentations.
  • In some embodiments, the classification models may be trained as described in U.S. patent application Ser. No. 16/203,418, filed on Nov. 28, 2018, incorporated herein by reference in its entirety. In some embodiments, a patient report is generated, for example, as shown in FIG. 3 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description. In various embodiments, vessel coordinates and/or topologies are determined, as shown in FIGS. 7, 9, 23, 26 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description. In various embodiments, various analyses, models, and/or analytes are determined, as shown in FIGS. 10, 12, 18, 31, 34 and 39 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description. In various embodiments, system architectures are as shown in FIGS. 32, 33 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description.
  • FIG. 2 is a flow chart for a re-sampling based model building method 200 which may be utilized by a system (e.g., system 100 as described above with respect to FIG. 1 ), according to some embodiments of the invention. At step 210, a tuning parameter set can be defined. At step 220, for each tuning parameter set, data (e.g., medical data as described above in FIG. 1 ) is resampled, a model is fitted and hold-out samples can be predicted. At step 230, resampling estimates can be combined into a performance profile. At step 240, final tuning parameters can be determined. At step 250, the entire training set can be re-fitted with the final tuning parameters. After each model has been tuned from the training set, each model can be evaluated for predictive performance on the test set. Test set evaluation can occur once for each model to, for example, ensure that the model building process does not over-fit the test set. For each model that is constructed, the optimal tuning parameter estimates, the re-sampled training set performance and/or the test set performance can be reported (e.g., transmitted to a display and/or written to a file to include in a report). Actual values of the model parameters over randomized splits can be compared to evaluate model stability and/or robustness to training data.
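  • A minimal sketch of the tuning workflow of steps 210-250, assuming scikit-learn; the random-forest classifier, parameter grid, and synthetic data are illustrative placeholders rather than the specific models described herein:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = np.random.rand(300, 20), np.random.randint(0, 3, 300)   # placeholder data
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# step 210: define the tuning parameter set
param_grid = {"n_estimators": [100, 300], "max_depth": [3, 6, None]}

# steps 220-230: resample (cross-validate), fit, predict hold-out folds,
# and combine the resampling estimates into a performance profile
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X_tr, y_tr)
profile = search.cv_results_["mean_test_score"]

# steps 240-250: final tuning parameters, re-fit on the entire training set
final_params = search.best_params_
final_model = search.best_estimator_            # refit on X_tr with final_params

# evaluate once on the test set so the model-building process does not over-fit it
test_score = final_model.score(X_te, y_te)
```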
  • In some embodiments, one or more models are tuned for each of the biological properties/analytes (e.g., tissue types) represented in ground truth maps (e.g., ground truth 111 as described above). Model responses (e.g., responses from the models as described above) may include, for example, covariance based techniques, non-covariance based techniques, and tree based models. Depending on their construction, endpoints may have continuous and categorical responses; some of the techniques in the above categories are used for both categorical and continuous responses, while others are specific to either categorical or continuous responses. Optimal tuning parameter estimates, the re-sampled training set performance, as well as the test set performance may be reported for each model.
  • As model complexity grows (e.g., amount of computation, hidden layers, stages of optimization and/or dimensionality of hyperplanes), predictive performance can improve. The performance improvement can be achieved at the expense of model interpretability. For example, the parameter coefficients from a multiple linear regression model intuitively link each predictor to the response. The same kind of interpretation typically cannot be uncovered in a neural network, support vector machine, or many other models as are known in the art. However, these models may provide much better predictive ability, especially if the underlying relationship between the predictors and the response is non-linear. In some embodiments, to extract at least a portion of interpretive information, variable importance calculations are performed. Variable importance projection methods can provide a weight to the individual features based on an extent that the respective individual feature contributes to a low dimensional data representation. For example, for problems where the number of features is equal to or larger than the number of training instances, classifier models can be subject to the “curse of dimensionality” problem.
  • In some embodiments, the analyzer module 120 provides functionalities as described below in Table 1:
  • TABLE 1
    Delineate Field: Register multiple data streams across a field; segment organs, vessels, lesions, and other application-specific objects; reformat anatomy for specific analyses.
    Delineate Target: Register multiple data streams at a locale; fine-grained segmentation; measure size and/or other relevant anatomic structure; extract whole-target features.
    Delineate Sub-target regions: Split target into sub-targets according to application; sub-target specific calculations.
    Delineate Components: (Re-)segment components; calculate readings; visualize probability map.
    Determine Disease Severity: Determine phenotype; predict outcome.
    Compare Multiple Timepoints: (Optional) compare multiple timepoints.
    Assess Multi-focal Disease: Aggregate across target lesions over a wide scan field.
    Generate Patient Report: Generate patient report.
  • As shown in Table 1, the analyzer module 120 can delineate fields, for example, to register multiple data streams across a field; to segment organs, vessels, lesions and other application-specific objects; and/or to reformat/reconfigure anatomy for specific analyses.
  • In some embodiments, segmenting the vessels includes segmenting the medical imaging data into one or more parts of a blood vessel, including a lumen boundary, an outer wall boundary, and/or one or more vessel boundaries based on the identified and quantified biological properties. The biological properties can include calcified regions, LRNC regions, intra-plaque regions, matrix regions, or any combination thereof.
  • FIG. 4 of U.S. patent application Ser. No. 16/203,418, along with the corresponding description, shows example segmentation levels for a multi-scale vessel wall analyte map, according to the present disclosure.
  • The analyzer module 120 can delineate a target, for example, a lesion, in a delineated field. Delineating a target may, for example, include registering multiple data streams at a locale; conducting fine-grained segmentation; measuring size and/or other characteristics of relevant anatomic structures; and/or extracting whole-target features (e.g., biological properties/analytes characteristic of the entire target region).
  • In some embodiments, one or more sub-target regions are delineated. For example, a target region may be split into sub-target regions according to a particular application with sub-target specific calculations (e.g., biological properties/analytes characteristic of a sub-target region). The analyzer module 120 can delineate components and/or relevant features (such as composition), for example, in a particular field, target or sub-target region.
  • This can include segmenting or re-segmenting the components/features, calculating values for the segmented components/features (e.g., biological properties/analytes characteristic of the component/feature) and assigning a probability map to the readings. Pathologies may be determined, based on the quantified biological properties/analytes, and characterized, e.g., by determining phenotype and/or predictive outcomes for the pathologies.
  • In some embodiments, the analyzer module 120 compares data across multiple timepoints, e.g., one or more of the biological components/analytes may involve a time based quantification. In further embodiments, a wide scan field may be utilized to assess multi-focal pathologies, e.g., based on aggregate quantifications of biological properties/analytes across a plurality of targets in the delineated field. Finally, based on the foregoing analytics, the analyzer module 120 may be configured to generate a patient report.
  • FIG. 3 is a flow chart of a method for delineating existence of perivascular adipose tissue and determining cap thickness based on medical imaging data of a patient, according to some embodiments of the invention. The medical imaging data can include radiological data and non-radiological data. For example, the medical imaging data can include MRI image information, patient demographic information and/or MRI device information.
  • The method can involve segmenting a lumen boundary based on the medical imaging data (Step 305). The method can also involve segmenting an outer wall boundary based on the medical imaging data (Step 310). Segmenting the medical image data can involve receiving the medical image data (e.g., from a file, from a user, from another computer and/or from the cloud). The medical imaging data can include data obtained via an MRI, CT, and/or ultrasound device. The medical imaging data can be analyzed (e.g., via the analyzer module 120 as described above in FIG. 1 ), to identify and/or quantify one or more biological properties. In some embodiments, the medical image data can be restored and/or deblurred, as discussed in further detail below.
  • The medical imaging data can include input from a user that indicates a region of interest containing a physiological target that is to be phenotyped.
  • Identifying and/or quantifying the one or more biological properties can involve utilizing one or more machine learned algorithms. The machine learned algorithms can be retrieved from a file, input by a user, retrieved from the cloud, or any combination thereof. The machine learned algorithms can be algorithms trained using AlexNet, which is a convolutional neural network (CNN). In some embodiments, the machine learned algorithms are further based on non-image medical data, for example, genomics, proteomics, and/or transcriptomics data.
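  • For illustration only, a CNN of the kind mentioned above could be instantiated roughly as follows, assuming PyTorch/torchvision are available; the four-class output and the patch dimensions are hypothetical choices, not the specific trained models of the invention:

```python
import torch
import torchvision

# hypothetical classifier over four analyte classes (e.g., CALC, LRNC, IPH, matrix)
model = torchvision.models.alexnet(num_classes=4)
patches = torch.randn(8, 3, 224, 224)   # a batch of illustrative image patches
logits = model(patches)                 # per-patch class scores, shape (8, 4)
```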
  • The one or more biological properties and the medical imaging data itself can be used for segmenting into two-dimensional (2D) or three-dimensional (3D) objects. For example, upon segmenting the lumen boundary, a lumen of the blood vessel can be visualized in 3D. Similarly, upon segmenting the outer wall boundary, the outer wall can be visualized in 3D.
  • In some embodiments, prior to segmenting, a user viewing a volume rendering of the blood vessel defines an initial vessel centerline. The segmenting of the lumen can be performed by a thresholding level set evolution using the optimal local Otsu threshold. The segmenting of the outer wall can be performed by using a geodesic active contour level set evolution, initialized with the lumen segmentation and initial centerline. In some embodiments, the lumen and/or outer wall can be manually edited by a user, for example, via a user input device.
  • In some embodiments, centerline paths are determined by defining a speed function for a fast marching algorithm. In the lumen interior, the speed function can be a linear function of distance from the lumen boundary; outside of the lumen, a small nonzero value can be used to, for example, allow for pathfinding across complete stenoses. Gradient descent can be used to define an initial centerline, which can be further centralized by a ball-and-spring model that optimizes monotonic equal spacing and distance from the lumen boundary.
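  • A minimal sketch of such a speed function and the resulting travel-time map, assuming the scikit-fmm (skfmm) package for the fast marching step; the seed mask and the small outside-lumen speed value are illustrative assumptions:

```python
import numpy as np
import skfmm                                   # scikit-fmm, assumed available
from scipy.ndimage import distance_transform_edt

def centerline_travel_time(lumen_mask, seed_mask, spacing=(1.0, 1.0, 1.0), eps=1e-3):
    # speed is a linear function of distance from the lumen boundary inside the
    # lumen, and a small nonzero value outside so paths can cross complete stenoses
    d_boundary = distance_transform_edt(lumen_mask, sampling=spacing)
    speed = np.where(lumen_mask, d_boundary + eps, eps)
    phi = np.where(seed_mask, -1.0, 1.0)       # zero contour surrounds the seed points
    return skfmm.travel_time(phi, speed, dx=list(spacing))

# gradient descent on the returned travel-time map, from a distal point back to
# the seed, gives an initial centerline that can then be centralized further
```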
  • The method can involve partitioning (e.g., bifurcating) vessel boundaries (Step 315). Partitioning the vessel boundaries can involve partitioning the segmented lumen boundary and outer wall boundary into one or more vessel boundaries. The one or more vessel boundaries can be partitioned into 2D or 3D objects.
  • In some embodiments, partitioning the vessel boundaries involves applying image registrations utilizing Mattes mutual information (MR), a mean square error (CT) metric, a rigid versor transform and/or an LBFGSB optimizer, as examples. An initial lumen segmentation can utilize a confidence connected filter (e.g., coronary, carotid, vertebral and/or femoral) to distinguish the lumen. Lumen segmentation can utilize MR imaging (such as a combination of normalized, e.g., inverted for dark contrast, images) or CT imaging (such as use of registered pre-contrast, post-contrast CT and 2D Gaussian distributions) to define a vessel-ness function. Various components that are nearby but not necessarily connected can be dilated to connect them, e.g., by applying thresholding.
  • In some embodiments, partitioning the vessels involves outer wall segmentation (e.g., utilizing a minimum curvature (k2) flow to account for lumen irregularities). In some embodiments, an edge potential map is calculated as outward-downward gradients in both contrast and non-contrast. In some embodiments, outer wall segmentation utilizes cumulative distribution functions (incorporating prior distributions of wall thickness, e.g., from 1-2 adjoining levels) in a speed function to allow for median thickness in the absence of any other edge information. In some embodiments, Feret diameters are employed for vessel characterization. In some embodiments, wall thickness is calculated as the sum of the distance to the lumen plus the distance to the outer wall. In some embodiments, lumen and/or wall segmentations are performed using semantic segmentation using, for example, CNNs. The lumen and wall segmentations can be partitioned according to path segments by using a greedy fast marching competition from all the path points of the three segments, resulting in three mutually exclusive partitions of each segmentation.
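  • A short sketch of the wall-thickness reading mentioned above (distance to the lumen plus distance to the outer wall), using Euclidean distance transforms; the boolean mask and spacing names are placeholders:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def wall_thickness_map(lumen_mask, outer_wall_mask, spacing=(1.0, 1.0, 1.0)):
    # distance from each voxel to the lumen, and to the outside of the outer wall
    d_to_lumen = distance_transform_edt(~lumen_mask, sampling=spacing)
    d_to_outer = distance_transform_edt(outer_wall_mask, sampling=spacing)
    wall = outer_wall_mask & ~lumen_mask
    # thickness is only meaningful inside the wall region itself
    return np.where(wall, d_to_lumen + d_to_outer, 0.0)
```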
  • The method can involve determining one or more of calcified regions, LRNC regions, intra-plaque hemorrhage regions, and/or matrix regions based on the segmentations (e.g., as the segmentations are based on quantified biological properties) (Step 320).
  • The method can involve delineating the perivascular adipose tissue, wherein delineating the perivascular adipose tissue can involve creating an evaluation region by extending the outer wall boundary by a predetermined distance and utilizing a second set of machine learned algorithms to identify whether the evaluation region includes the perivascular adipose tissue (Step 325). The predetermined distance can be input by a user, retrieved from a file, or based on quantified biological properties.
  • In some embodiments, the outer wall boundary is extended as a 3D object. In these embodiments, the outer wall boundary is extended by a predetermined volume. The predetermined volume can be input by a user, retrieved from a file or based on quantified biological properties.
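  • A minimal sketch of creating the evaluation region by extending the outer wall boundary outward by a predetermined distance, assuming boolean masks and SciPy distance transforms; names and defaults are illustrative:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def perivascular_evaluation_region(outer_wall_mask, distance_mm, spacing=(1.0, 1.0, 1.0)):
    # distance (in mm) from every voxel outside the vessel to the outer wall surface
    d_outside = distance_transform_edt(~outer_wall_mask, sampling=spacing)
    # ring of tissue within the predetermined distance of the outer wall boundary
    return (d_outside > 0) & (d_outside <= distance_mm)
```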
  • In some embodiments, identifying whether the evaluation region includes the perivascular adipose tissue involves employing one or more algorithms for evaluating vascular and perivascular structure. The system (e.g., system 100 as described above in FIG. 1 ) can employ a target/vessel segment/cross-section model for segmenting the underlying structure of an imaged vessel. In some embodiments, luminal irregularities and/or plaque structure (e.g., remodeling) are evaluated. In some embodiments, a maximum, minimum, mean or any combination thereof of a cross-sectional area of the perivascular adipose tissue is determined.
  • In some embodiments, a maximum, minimum, mean or any combination thereof of a cross-sectional area of each of the one or more vessel boundaries is determined. In some embodiments, a volume of each of the one or more vessel boundaries is determined.
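  • A simple sketch of per-partition cross-sectional and volumetric readings (maximum/mean/minimum cross-sectional area and volume), assuming a boolean partition mask and that axial slices along the first image axis serve as the cross-sections:

```python
import numpy as np

def partition_readings(partition_mask, spacing=(1.0, 1.0, 1.0)):
    # area of the partition in each cross-sectional slice (mm^2)
    pixel_area = spacing[1] * spacing[2]
    areas = partition_mask.reshape(partition_mask.shape[0], -1).sum(axis=1) * pixel_area
    areas = areas[areas > 0]                      # slices the partition actually crosses
    return {"max_area": areas.max(),
            "mean_area": areas.mean(),
            "min_area": areas.min(),
            "volume": partition_mask.sum() * spacing[0] * pixel_area}
```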
  • In some embodiments, evaluating the vascular and perivascular composition (within the wall, e.g., the cross-sectional area and volume of the lipid-rich necrotic core (LRNC) and how close it is to the lumen (e.g., cap thickness)) and/or perivascular tissue characteristics, e.g., perivascular adipose tissue (PVAT), can involve the model accounting for an observed image intensity at a pixel or voxel being influenced by a local neighborhood of hidden analyte category nodes, thereby accounting for partial volume effects and the scanner point spread function (PSF).
  • For example, FIG. 4 of U.S. patent application Ser. No. 16/203,418 depicts a multi-scale vessel wall analyte map that includes a wall-level segmentation 410 (e.g., a cross-sectional slice of the vessel), a blob-level segmentation, and a pixel-level segmentation 430 (e.g., based on individual image pixels).
  • The method can also involve determining cap thickness (e.g., the thickness of a layer of tissue that can be described as a cap) based on a minimum distance between the lumen boundary and the LRNC regions (Step 330). The minimum distance between the lumen boundary and the LRNC regions can be determined by creating a first vector between a voxel on the lumen boundary and a voxel in the LRNC, determining the length of that vector, creating a second vector between a different lumen-boundary voxel and a different LRNC voxel, determining its length and comparing it against the first, and keeping the shorter of the two. These steps can be performed for additional lumen and LRNC voxel pairs to find the shortest distance, which is assigned as the cap thickness.
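  • Rather than looping over voxel pairs one at a time, the same minimum distance can be obtained with a nearest-neighbour query; the following sketch assumes SciPy and (N, 3) voxel index arrays as inputs and is purely illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def cap_thickness(lumen_boundary_voxels, lrnc_voxels, spacing=(1.0, 1.0, 1.0)):
    # convert (N, 3) voxel indices to physical coordinates in mm
    lumen_mm = np.asarray(lumen_boundary_voxels, dtype=float) * spacing
    lrnc_mm = np.asarray(lrnc_voxels, dtype=float) * spacing
    # shortest LRNC distance for every lumen-boundary voxel, then the overall minimum
    dists, _ = cKDTree(lrnc_mm).query(lumen_mm, k=1)
    return dists.min()
```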
  • In some embodiments, parameters related to the cross-section of tissues are determined and/or output, for example, within each positioned and oriented slab, the maximum, mean, minimum, and/or area of tissue characteristics in-wall (e.g., within the outer wall boundary) and/or perivascular.
  • In some embodiments, parameters related to one or more vessels are determined and/or output. For example, within each partition as described above, maximum, mean, and/or minimum cross-section measurements across all the cross-sections included in each respective vessel can be determined and/or output. Within each partition, a volume and/or volume proportion can be determined and/or output (e.g., for 3D objects). In some embodiments, parameters related to a target (e.g., a group of vessels) are determined and/or output. In some embodiments, determining the parameters related to a target involves performing functions similar to those at the vessel level, but for the target as a whole.
  • In some embodiments, the readings can be marshalled for ML (e.g., out to training sets and/or in for per-patient inference). In some embodiments, the readings can be marshalled for ML either alone or with non-imaging data, e.g., bloodwork and/or transcriptomics in curated tissue collections.
  • In some embodiments, for each cross-section, images are stored (e.g., optionally enhanced) for DL (e.g., out to training sets and/or in for per-patient inference). In some embodiments, the images are stored for DL either alone or with non-imaging data, e.g., bloodwork and/or transcriptomics in curated tissue collections.
  • In FIG. 3A, the restored/deblurred image shows a cross section through a coronary artery with intensity profiles for an original source CT scan after the restoring methods described above (e.g., FIGS. 4 and 5 ) are applied. The change in comparison to the original image can be on the order of +/−30 HU.
  • In some scenarios, more subtle tissue characteristics are too prone to blurring artifacts for initial estimation from the source CT images, as exemplified in FIG. 3A. FIG. 3A is a diagram showing effects of restoring as can affect accurate quantitation of tissues, being a cross section through a coronary artery with intensity profiles for the original source CT and after the model-based restoring. For PVAT, one assumption can be that fully adipose tissue is −100 HU while water due to inflammation is 0 HU, such that adipose tissue with some inflammation would be somewhere in between −100 and 0 HU depending on how much water has entered the tissue. If, for example, tissue were 50% adipose tissue and 50% water from inflammation, a change of 50 HU can be achieved, which may be negated by uncorrected CT blur, thus making a potentially good biomarker ineffective.
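  • The two-compartment mixing assumption above amounts to a linear interpolation between −100 HU and 0 HU; a tiny illustrative helper:

```python
def pvat_water_fraction(hu, hu_adipose=-100.0, hu_water=0.0):
    """Approximate water (inflammation) fraction under the linear mixing assumption."""
    return (hu - hu_adipose) / (hu_water - hu_adipose)

# a 50/50 adipose/water mixture reads about -50 HU, i.e., a 50 HU shift:
assert abs(pvat_water_fraction(-50.0) - 0.5) < 1e-9
```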
  • As described above, in some embodiments, the medical image data can be restored and/or deblurred. For example, FIG. 21 of U.S. patent application Ser. No. 16/203,418 shows an example of a pre-processing step of restoring using a patient-specific point spread determination algorithm to mitigate artifacts and/or image limitations that can result from the image formation process and that can, for example, decrease the ability to determine characteristics predictive of the phenotype. The figure demonstrates a portion of the radiology analysis application's analysis of a plaque from a CT. As shown in the figure, the restored/deblurred images are a result of iteratively fitting a physical model of the scanner point spread function with regularizing assumptions about the true latent density of different regions of the image.
  • In some embodiments, restoring of the image can be performed as follows: An imaging device (such as an MRI or CT device) can be used to acquire a measured image. A tissue characteristics image model can be initialized for the measured image representing a true underlying image. A tissue characteristics model can apply a level-set method (LSM) as a conceptual framework for numerical analysis of surfaces and shapes in the image representing biological analytes. The tissue characteristics model can map level sets to the image data via a set of characteristic equations, and thereby can represent specific biological analytes. The characteristic equations can be utilized to solve an optimization problem to determine optimal transformation parameters for the tissue characteristics model, and can thereby optimize restoring for segmentation of the specific biological analytes being analyzed. The tissue characteristics model and/or the optimization parameters can advantageously account for and/or make use of a knowledge base of the underlying biology of the system, e.g., based on biological models for the analytes. The optimization problem can be solved using an iterative process which iteratively adjusts the tissue characteristics image model in order to minimize an energy function which models imaging physics relating to the appearance of different analytes in a Bayesian framework (e.g., energy may be the negative log of probabilities for the Bayesian framework integrated over the image). A restored/deblurred image may be outputted based on the transform parameters determined from the optimization problem. The restored/deblurred image can include restoring which can be optimized for segmentation and/or for quantitative analysis of the biological analytes. This can represent a significant improvement over generalized restoring techniques that have not accounted for the underlying biology of the system being analyzed.
  • Various advantages and improvements can be provided by restoring, for example, removing blur that derives from very bright as well as very dark signals. Unlike conventional techniques, this may advantageously account for both the technical image formation process in the scanner and the specific biology being imaged. Additional advantages can include deriving scanner blur based on the image and incorporating detailed statistical models of prior estimates of tissue characteristics drawn from a truth source, e.g., such as histopathology.
  • In some embodiments, prior estimates are used to inform the classification process so as to provide the most plausible explanation for the observed image data. Additional advantages can include increased accuracy in readings of biological analytes, e.g., readings that include cross-sectional areas, volumes, and spatial locations for different types of tissues.
  • FIG. 4 is a flow chart for a method for restoring and segmenting, according to some embodiments of the invention. The method can involve acquiring a measured image (Step 401) via an imaging device. The method can also involve initializing a tissue characteristics image model for the measured image that represents a true underlying image (Step 402). In some embodiments, initializing the tissue characteristics image model can involve initializing the ϕi level set functions and χi characteristic functions, and initializing g with the background region masked to a constant intensity.
  • The method can also involve solving an optimization algorithm using an iterative process which can iteratively adjust the tissue characteristics image model in order to minimize an energy function which models imaging physics relating to the appearance of different analytes (Step 403). In some embodiments, the optimization algorithm involves:
  • Until stopping_criterion
      • a. Compute characteristic functions χI,
      • b. Compute blurred characteristic functions h*χI,
      • c. Compute constants ci,
      • d. Compute dϕi/dt level set function updates,
        • i. Create image f with partial volume effect at edges
        • ii. Create image h*f;
  • Volume fractions are computed from characteristic functions.
  • In some embodiments, a stopping criterion for the iterations can be based upon one or more user-defined number of iterations.
  • The optimization algorithm can use the iterative process as shown below in FIG. 5 .
  • The method can also involve outputting a restored/deblurred image based on the transform parameters determined from the optimization algorithm (Step 404). In some embodiments, the restored/deblurred image is provided as Irestored/deblurred=g−(h*f−f).
  • FIG. 5 is a flow chart for a method of an iterative optimization algorithm step (e.g., step 403 of FIG. 4 ) as applied within the context of multi-phase level sets, where analyte image regions are defined by characteristic functions as a function of level set functions for each tissue characteristic type. The method can involve calculating characteristic functions from the level set functions for the current iteration, for example, based on the current level sets (Step 501). The method can also involve calculating blurred characteristic functions from the characteristic functions, e.g., based on an IIR Gaussian blurring given a point spread function (PSF) for the imaging device (Step 502). The method can also involve calculating image intensity constants for the blurred characteristic functions (Step 503). The method can also involve calculating level set updates, e.g., based on a gradient descent approach to minimizing an energy function (Step 504). The iterative process reinitializes the level sets and characteristic equations with every iteration (e.g., prior to repeating Steps 501-504). Thus, the signed distance property of the level set functions is relaxed during each iteration until reinitialization after the iteration.
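  • A deliberately simplified, two-region Python sketch of the loop of Steps 501-504 and the output of Step 404 (a Gaussian blur stands in for the scanner PSF; the initialization, step sizes, and smoothing term are illustrative assumptions and not the full multi-phase, biology-informed model described herein):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def restore_two_phase(g, psf_sigma=1.0, n_iter=50, dt=0.5, smooth_w=0.1):
    # g: acquired (blurred) image; one level set phi separates a bright region
    # from background, a crude stand-in for the multi-phase analyte level sets
    phi = np.where(g > g.mean(), 1.0, -1.0)
    for _ in range(n_iter):
        chi = (phi > 0).astype(float)                 # step 501: characteristic function
        chi_blur = gaussian_filter(chi, psf_sigma)    # step 502: blurred characteristic fn
        # step 503: region intensity constants from the blurred characteristic functions
        c1 = (g * chi_blur).sum() / (chi_blur.sum() + 1e-9)
        c0 = (g * (1.0 - chi_blur)).sum() / ((1.0 - chi_blur).sum() + 1e-9)
        f = c1 * chi + c0 * (1.0 - chi)               # idealized piecewise-constant image
        residual = gaussian_filter(f, psf_sigma) - g  # blurred model minus acquired image
        # step 504: gradient-descent-style level set update (data term plus a crude
        # smoothing term standing in for curvature regularization)
        force = -(c1 - c0) * gaussian_filter(residual, psf_sigma)
        phi = phi + dt * (force + smooth_w * (gaussian_filter(phi, 1.0) - phi))
    chi = (phi > 0).astype(float)
    f = c1 * chi + c0 * (1.0 - chi)
    # step 404: I_restored = g - (h*f - f)
    return g - (gaussian_filter(f, psf_sigma) - f), chi
```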
  • FIGS. 6, 7A-7B and 8A-8B can illustrate the effectiveness of the restoring/segmentation as described above. In particular, FIG. 6 is an example of an acquired image which includes blurring. FIG. 7A shows an example of an acquired image which includes blurring, and FIG. 7B shows the same acquired image after restoring and segmenting according to embodiments of the invention are applied. FIG. 8A shows another example of an acquired image which includes blurring, and FIG. 8B shows the same acquired image of FIG. 8A after restoring and segmenting according to embodiments of the invention are applied. As shown in FIGS. 7A-7B and 8A-8B, the restoring and/or segmentation method of the invention can advantageously provide for improved recognition of biological analytes. In some embodiments, a restored/deblurred image is augmented by replacing segmented regions representing biological analytes with an overlay map (e.g., a color-coded overlay map) for the segmented regions.
  • FIG. 9 shows an example of an augmented restored/deblurred image, according to some embodiments of the invention.
  • The augmented restored/deblurred image can depict quantitative measurements associated with identified analyte regions as well as one or more graphical characterizations of structure and/or composition. Thus, the augmented restored/deblurred image may advantageously provide improved tools for a clinician to evaluate a pathology of the patient.
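  • One simple way to render such an augmented image is to overlay a color-coded label map on the restored image; a matplotlib sketch in which the label values and colormap are arbitrary choices:

```python
import numpy as np
import matplotlib.pyplot as plt

def show_augmented(restored, label_map):
    # label_map: integer analyte labels, 0 = background (left transparent)
    plt.imshow(restored, cmap="gray")
    plt.imshow(np.ma.masked_where(label_map == 0, label_map),
               cmap="tab10", alpha=0.5, interpolation="nearest")
    plt.axis("off")
    plt.show()
```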
  • Because the geometry of the wall may be significantly non-circular, the radial distance may be defined based on the shortest distance to the inner luminal surface and the shortest distance to the outer adventitial surface. The expert-annotation of the histology images includes regions that define the lumen and the vessel (defined as the union of the lumen and vessel wall).
  • A signed distance function can be created for each of these, L(x) and V(x), respectively. The convention is that the interior of these regions is negative so that in the wall L is positive and V is negative. The relative radial distance is computed as r(x)=L(x)/(L(x)−V(x)). It has a value of 0 at the luminal surface and 1 at the adventitial surface. The direction of the r-axis is determined by ∇r(x).
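  • A sketch of the signed distance functions and relative radial distance defined above, assuming binary masks and Euclidean distance transforms; the sign convention matches the text (negative inside each region):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask, spacing):
    # negative inside the region, positive outside
    return (distance_transform_edt(~mask, sampling=spacing)
            - distance_transform_edt(mask, sampling=spacing))

def relative_radial_distance(lumen_mask, vessel_mask, spacing=(1.0, 1.0)):
    L = signed_distance(lumen_mask, spacing)    # lumen signed distance
    V = signed_distance(vessel_mask, spacing)   # vessel (lumen + wall) signed distance
    r = L / (L - V + 1e-9)                      # 0 at luminal surface, 1 at adventitia
    return np.clip(r, 0.0, 1.0)
```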
  • In some embodiments, one level set may be used for the entire vessel lumen initialized with the segmented lumen L. Each distinct contiguous bright region can be initialized as its own level set and calculated as follows: Candidate bright regions are computed using a morphological watershed applied to the inverted image (to turn bright peaks into catchment basins).
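  • A sketch of initializing candidate bright regions with a morphological watershed on the inverted image, assuming scikit-image; the marker selection via local maxima is an illustrative choice:

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def candidate_bright_regions(img, min_distance=3):
    inverted = -img                              # bright peaks become catchment basins
    coords = peak_local_max(img, min_distance=min_distance)
    markers = np.zeros(img.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return watershed(inverted, markers)          # one label per contiguous bright region
```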
  • In some embodiments, the energy functionals can represent an approach that integrates modeling between imaging physics and biology. The imaging physics portion can account for image intensities and the PSF of the scanner, while the biological portion of the model incorporates histology-driven knowledge of the structure and growth patterns of atherosclerotic plaques. The model prior weights the model toward the most likely configurations and away from physically and biologically unrealistic solutions. The model can be provided in probabilistic terms and the energy is the negative log of probabilities integrated over the image. In addition to providing analytic tractability, the logarithm super-linearly weights against decreasing probability solutions.
  • In some embodiments, a Naïve Bayes [10] domain independence assumption is made between imaging physics and biology, e.g., that the likelihood of the residual between blurred model and blurred acquired image does not depend on the biological likelihood of a configuration of tissue characteristic regions next to each other.
  • The various model parameters that can be evolved throughout the algorithm include the level set functions mapped over the image, the true (e.g., restored/deblurred) image intensity of different biological tissue characteristics, and the width of the scanner PSF. The pre-learned model parameters can include the model of the dependencies of the spatial distribution of tissue characteristics within a plaque.
  • In some embodiments, after initialization, the model is iteratively adjusted in order to minimize the energy function through a gradient descent trajectory. The gradient descent approach can allow for the direct adjustment of model parameters, such as each level set ϕ, in order to minimize energy.
  • An imaging physics term in the energy functional can represent the L2 norm of the difference between the blurred idealized piecewise constant image and the acquired image. The coefficients can allow for a balance between the effect of curvature evolution smoothing and minimizing the model-to-image residual. The evidence variables can be the acquired image pixel intensities represented by the blurred image g. Within each iteration, the ordering of sub-steps can follow the flow of information through the variables.
  • The characteristic functions can serve as an intermediary and the Euler-Lagrange equation can be determined in terms of the level set functions. The energy functional can be minimized using a gradient descent approach that moves each ϕ toward the local minimum of E at every point in space simultaneously and independently. Within each iteration, the signed distance property of the level set functions can be relaxed until reinitialization after the iteration and thus the integral disappears.
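  A highly simplified sketch of one gradient-descent iteration on a level set function, where energy_gradient stands in for the Euler-Lagrange derivative of the full energy functional (an assumption for illustration, not the patented formulation), and where the signed-distance property is reinitialized after the step as described above:

      import numpy as np
      from scipy import ndimage

      def reinitialize(phi):
          # Restore the signed distance property of phi (negative inside the zero level set).
          inside = phi < 0
          d_in = ndimage.distance_transform_edt(inside)
          d_out = ndimage.distance_transform_edt(~inside)
          return d_out - d_in

      def level_set_step(phi, energy_gradient, step_size=0.1):
          phi = phi - step_size * energy_gradient(phi)  # move phi toward lower energy
          return reinitialize(phi)                      # relaxed during the step, reinitialized after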
  • One advantage of the invention is that the corrections to the image, (h*f−f), can be low frequency: they can be simply step edges blurred by a Gaussian, thereby preventing the erroneous amplification of high-frequency noise that often occurs with conventional deconvolution techniques, which may never fully separate amplifying true image structure from amplifying image noise. In fact, the error of this improved deconvolution process may be subject only (or substantially only) to the accuracy of the region image intensity constants, the location of the edges, and/or the imaging system blur, all of which can be highly intuitive and easily confirmed visually by the end user.
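  The low-frequency nature of the correction term can be seen in a toy one-dimensional sketch (illustrative assumptions only): with a piecewise-constant profile f and a Gaussian blur h, the correction (h*f − f) consists only of smoothed step edges, so no high-frequency noise is amplified.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      f = np.zeros(256)
      f[100:160] = 1.0                                # idealized piecewise-constant profile
      correction = gaussian_filter(f, sigma=3.0) - f  # h*f - f: smooth, band-limited step edges
      restored = f                                    # the restored/deblurred image is the model itself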
  • Although embodiments of the invention are not limited in this regard, discussions utilizing terms such as, for example, “processing,” “computing,” “calculating,” “determining,” “establishing”, “analyzing”, “checking”, or the like, can refer to operation(s) and/or process(es) of a computer, a computing platform, a computing system, or other electronic computing device, that manipulates and/or transforms data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information non-transitory storage medium that can store instructions to perform operations and/or processes.
  • Although embodiments of the invention are not limited in this regard, the terms “plurality” and “a plurality” as used herein can include, for example, “multiple” or “two or more”. The terms “plurality” or “a plurality” can be used throughout the specification to describe two or more components, devices, elements, units, parameters, or the like. The term set when used herein can include one or more items. Unless explicitly stated, the method embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described method embodiments or elements thereof can occur or be performed simultaneously, at the same point in time, or concurrently.
  • A computer program can be written in any form of programming language, including compiled and/or interpreted languages, and the computer program can be deployed in any form, including as a stand-alone program or as a subroutine, element, and/or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site.
  • Method steps can be performed by one or more programmable processors executing a computer program to perform functions of the invention by operating on input data and generating output. Method steps can also be performed by an apparatus and can be implemented as special purpose logic circuitry. The circuitry can, for example, be a FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit). Modules, subroutines, and software agents can refer to portions of the computer program, the processor, the special circuitry, software, and/or hardware that implement that functionality.
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor receives instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer can be operatively coupled to receive data from and/or transfer data to one or more mass storage devices for storing data (e.g., magnetic, magneto-optical disks, or optical disks).
  • Data transmission and instructions can also occur over a communications network. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices. The information carriers can, for example, be EPROM, EEPROM, flash memory devices, magnetic disks, internal hard disks, removable disks, magneto-optical disks, CD-ROM, and/or DVD-ROM disks. The processor and the memory can be supplemented by, and/or incorporated in special purpose logic circuitry.
  • To provide for interaction with a user, the above described techniques can be implemented on a computer having a display device, a transmitting device, and/or a computing device. The display device can be, for example, a cathode ray tube (CRT) and/or a liquid crystal display (LCD) monitor. The interaction with a user can be, for example, a display of information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer (e.g., interact with a user interface element). Other kinds of devices can be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback). Input from the user can be, for example, received in any form, including acoustic, speech, and/or tactile input.
  • The computing device can include, for example, a computer, a computer with a browser device, a telephone, an IP phone, a mobile device (e.g., cellular phone, personal digital assistant (PDA) device, laptop computer, electronic mail device), and/or other communication devices. The computing device can be, for example, one or more computer servers. The computer servers can be, for example, part of a server farm. The browser device includes, for example, a computer (e.g., desktop computer, laptop computer, and tablet) with a World Wide Web browser (e.g., Microsoft® Internet Explorer® available from Microsoft Corporation, Chrome available from Google, Mozilla® Firefox available from Mozilla Corporation, Safari available from Apple). The mobile computing device includes, for example, a personal digital assistant (PDA).
  • Website and/or web pages can be provided, for example, through a network (e.g., Internet) using a web server. The web server can be, for example, a computer with a server module (e.g., Microsoft® Internet Information Services available from Microsoft Corporation, Apache Web Server available from Apache Software Foundation, Apache Tomcat Web Server available from Apache Software Foundation).
  • The storage module can be, for example, a random access memory (RAM) module, a read only memory (ROM) module, a computer hard drive, a memory card (e.g., universal serial bus (USB) flash drive, a secure digital (SD) flash card), a floppy disk, and/or any other data storage device. Information stored on a storage module can be maintained, for example, in a database (e.g., relational database system, flat database system) and/or any other logical information storage mechanism.
  • The above-described techniques can be implemented in a distributed computing system that includes a back-end component. The back-end component can, for example, be a data server, a middleware component, and/or an application server. The above-described techniques can also be implemented in a distributed computing system that includes a front-end component. The front-end component can, for example, be a client computer having a graphical user interface, a Web browser through which a user can interact with an example implementation, and/or other graphical user interfaces for a transmitting device. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, wired networks, and/or wireless networks.
  • The system can include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • The above described networks can be implemented in a packet-based network, a circuit-based network, and/or a combination of a packet-based network and a circuit-based network. Packet-based networks can include, for example, the Internet, a carrier internet protocol (IP) network (e.g., local area network (LAN), wide area network (WAN), campus area network (CAN), metropolitan area network (MAN), home area network (HAN), a private IP network, an IP private branch exchange (IPBX), a wireless network (e.g., radio access network (RAN), 802.11 network, 802.16 network, general packet radio service (GPRS) network, HiperLAN), and/or other packet-based networks. Circuit-based networks can include, for example, the public switched telephone network (PSTN), a private branch exchange (PBX), a wireless network (e.g., RAN, Bluetooth®, code-division multiple access (CDMA) network, time division multiple access (TDMA) network, global system for mobile communications (GSM) network), and/or other circuit-based networks.
  • Some embodiments of the present invention may be embodied in the form of a system, a method or a computer program product. Similarly, some embodiments may be embodied as hardware, software or a combination of both. Some embodiments may be embodied as a computer program product saved on one or more non-transitory computer readable medium (or media) in the form of computer readable program code embodied thereon. Such non-transitory computer readable medium may include instructions that when executed cause a processor to execute method steps in accordance with embodiments. In some embodiments the instructions stored on the computer readable medium may be in the form of an installed application or in the form of an installation package.
  • Such instructions may be, for example, loaded by one or more processors and executed. For example, the computer readable medium may be a non-transitory computer readable storage medium. A non-transitory computer readable storage medium may be, for example, an electronic, optical, magnetic, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
  • Computer program code may be written in any suitable programming language. The program code may execute on a single computer system, or on a plurality of computer systems.
  • One skilled in the art will realize the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments are therefore to be considered in all respects illustrative rather than limiting of the invention described herein. Scope of the invention is thus indicated by the appended claims, rather than by the foregoing description, and all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
  • In the foregoing detailed description, numerous specific details are set forth in order to provide an understanding of the invention. However, it will be understood by those skilled in the art that the invention can be practiced without these specific details. In other instances, well-known methods, procedures, and components, modules, units and/or circuits have not been described in detail so as not to obscure the invention. Some features or elements described with respect to one embodiment can be combined with features or elements described with respect to other embodiments.

Claims (18)

What is claimed is:
1. A system comprising a processor and a non-transient storage medium including processor executable instructions implementing an analyzer module including a hierarchical analytics framework configured to:
utilize a first set of machine learned algorithms to identify and quantify a set of biological properties utilizing medical imaging data;
segment the medical imaging data based on the quantified biological properties to delineate existence of perivascular adipose tissue;
create an evaluation region by extending the outer wall boundary by a predetermined distance; and
utilize a second set of machine learned algorithms to identify whether the evaluation region includes the perivascular adipose tissue as a fully resolved three-dimensional object.
2. The system of claim 1 wherein segmenting the medical imaging data further comprises segmenting the medical imaging data into at least a lumen boundary and an outer wall boundary.
3. The system of claim 2 wherein the analyzer module is further configured to partition a lumen and an outer wall based on the segmented lumen boundary and outer wall boundary into one or more vessel boundaries.
4. The system of claim 2 wherein the biological properties include calcified regions, Lipid-Rich Necrotic Core (LRNC) regions, intra-plaque regions, matrix regions, or any combination thereof.
5. The system of claim 1 wherein the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of the perivascular adipose tissue.
6. The system of claim 3 wherein the analyzer module is configured to, for each partition, determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of each of the one or more vessel boundaries.
7. The system of claim 3 wherein the analyzer module is configured to, for each partition, determine a volume of each of the one or more vessel boundaries.
8. The system of claim 3 wherein the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area for a target.
9. The system of claim 1 wherein segmenting the medical imaging data further comprises segmenting the medical imaging data into three-dimensional (3D) objects.
10. A method for determining existence of a perivascular adipose tissue via an analyzer module including a hierarchical analytics framework, the method comprising:
utilizing a first set of machine learned algorithms to identify and quantify a set of biological properties utilizing medical imaging data;
segmenting the medical imaging data based on the quantified biological properties to delineate existence of perivascular adipose tissue;
creating an evaluation region by extending the outer wall boundary by a predetermined distance; and
utilizing a second set of machine learned algorithms to identify whether the evaluation region includes the perivascular adipose tissue as a fully resolved three-dimensional object.
11. The method of claim 10 wherein segmenting the medical imaging data further comprises segmenting the medical imaging data into at least a lumen boundary and an outer wall boundary.
12. The method of claim 11 wherein the analyzer module is further configured to partition a lumen and an outer wall based on the segmented lumen boundary and outer wall boundary into one or more vessel boundaries.
13. The method of claim 11 wherein the biological properties include calcified regions, Lipid-Rich Necrotic Core (LRNC) regions, intra-plaque regions, matrix regions, or any combination thereof.
14. The method of claim 10 wherein the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of the perivascular adipose tissue.
15. The method of claim 12 wherein the analyzer module is configured to, for each partition, determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area of each of the one or more vessel boundaries.
16. The method of claim 12 wherein the analyzer module is configured to, for each partition, determine a volume of each of the one or more vessel boundaries.
17. The method of claim 12 wherein the analyzer module is configured to determine a maximum, minimum, mean, or any combination thereof of a cross-sectional area for a target.
18. The method of claim 10 wherein segmenting the medical imaging data further comprises segmenting the medical imaging data into three-dimensional (3D) objects.
US17/890,822 2019-08-05 2022-08-18 Combined assessment of morphological and perivascular disease markers Abandoned US20220392070A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/890,822 US20220392070A1 (en) 2019-08-05 2022-08-18 Combined assessment of morphological and perivascular disease markers
US18/319,003 US20230386026A1 (en) 2019-08-05 2023-05-17 Spatial analysis of cardiovascular imaging

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201962882881P 2019-08-05 2019-08-05
US16/984,640 US11508063B2 (en) 2019-08-05 2020-08-04 Non-invasive measurement of fibrous cap thickness
US17/890,822 US20220392070A1 (en) 2019-08-05 2022-08-18 Combined assessment of morphological and perivascular disease markers

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/984,640 Continuation US11508063B2 (en) 2019-08-05 2020-08-04 Non-invasive measurement of fibrous cap thickness

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/319,003 Continuation US20230386026A1 (en) 2019-08-05 2023-05-17 Spatial analysis of cardiovascular imaging

Publications (1)

Publication Number Publication Date
US20220392070A1 true US20220392070A1 (en) 2022-12-08

Family

ID=74498934

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/984,640 Active 2040-12-12 US11508063B2 (en) 2019-08-05 2020-08-04 Non-invasive measurement of fibrous cap thickness
US17/890,822 Abandoned US20220392070A1 (en) 2019-08-05 2022-08-18 Combined assessment of morphological and perivascular disease markers
US18/319,003 Abandoned US20230386026A1 (en) 2019-08-05 2023-05-17 Spatial analysis of cardiovascular imaging

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/984,640 Active 2040-12-12 US11508063B2 (en) 2019-08-05 2020-08-04 Non-invasive measurement of fibrous cap thickness

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/319,003 Abandoned US20230386026A1 (en) 2019-08-05 2023-05-17 Spatial analysis of cardiovascular imaging

Country Status (6)

Country Link
US (3) US11508063B2 (en)
EP (1) EP3899864A4 (en)
JP (2) JP2022543330A (en)
KR (1) KR20210121062A (en)
CN (1) CN113439287A (en)
WO (1) WO2021026125A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11642092B1 (en) 2019-01-25 2023-05-09 Cleerly, Inc. Systems and methods for characterizing high risk plaques
US11660058B2 (en) 2020-01-07 2023-05-30 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11690586B2 (en) 2020-01-07 2023-07-04 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11861833B2 (en) 2020-01-07 2024-01-02 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11922627B2 (en) 2022-03-10 2024-03-05 Cleerly, Inc. Systems, devices, and methods for non-invasive image-based plaque analysis and risk determination

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2023127785A1 (en) * 2021-12-28 2023-07-06

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050043614A1 (en) * 2003-08-21 2005-02-24 Huizenga Joel T. Automated methods and systems for vascular plaque detection and analysis
US20160203599A1 (en) * 2013-08-13 2016-07-14 H. Lee Moffitt Cancer Center And Research Institute, Inc. Systems, methods and devices for analyzing quantitative information obtained from radiological images
US9572495B2 (en) * 2009-09-23 2017-02-21 Lightlab Imaging, Inc. Optical coherence tomography lumen morphology and vascular resistance measurements methods for blood vessel evaluations
US20170245829A1 (en) * 2013-03-14 2017-08-31 Volcano Corporation System and Method of Adventitial Tissue Characterization
US20170265832A1 (en) * 2014-08-15 2017-09-21 Oxford University Innovation Limited Method for characterisation of perivascular tissue
US20170337343A1 (en) * 2010-07-16 2017-11-23 The University Of Houston System Methods of computing pericardial and abdominal fat and methods for motion compensation
US20190247050A1 (en) * 2006-11-21 2019-08-15 David S. Goldsmith Integrated system for the infixion and retrieval of implants
US20190287276A1 (en) * 2016-10-31 2019-09-19 Oxford University Innovation Limited Method
US20200126672A1 (en) * 2018-10-17 2020-04-23 Heartflow, Inc. Systems and methods for assessing cardiovascular disease and treatment effectiveness from adipose tissue
US20200237329A1 (en) * 2019-01-25 2020-07-30 Cleerly, Inc. Systems and method of characterizing high risk plaques

Family Cites Families (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6108635A (en) 1996-05-22 2000-08-22 Interleukin Genetics, Inc. Integrated disease information system
AU2001271326A1 (en) 2000-06-16 2001-12-24 Lockheed Martin Mission Systems Scaleable object recognition with a belief model
US20030105638A1 (en) 2001-11-27 2003-06-05 Taira Rick K. Method and system for creating computer-understandable structured medical data from natural language reports
US6946715B2 (en) 2003-02-19 2005-09-20 Micron Technology, Inc. CMOS image sensor and method of fabrication
US7493253B1 (en) 2002-07-12 2009-02-17 Language And Computing, Inc. Conceptual world representation natural language understanding system and method
US20050118632A1 (en) 2003-11-06 2005-06-02 Jian Chen Polynucleotides and polypeptides encoding a novel metalloprotease, Protease-40b
US7536370B2 (en) 2004-06-24 2009-05-19 Sun Microsystems, Inc. Inferential diagnosing engines for grid-based computing systems
US7127095B2 (en) 2004-10-15 2006-10-24 The Brigham And Women's Hospital, Inc. Factor analysis in medical imaging
WO2006062958A2 (en) * 2004-12-10 2006-06-15 Worcester Polytechnic Institute Image-based computational mechanical analysis and indexing for cardiovascular diseases
US20070130206A1 (en) 2005-08-05 2007-06-07 Siemens Corporate Research Inc System and Method For Integrating Heterogeneous Biomedical Information
US8734823B2 (en) 2005-12-14 2014-05-27 The Invention Science Fund I, Llc Device including altered microorganisms, and methods and systems of use
WO2007087113A2 (en) 2005-12-28 2007-08-02 The Scripps Research Institute Natural antisense and non-coding rna transcripts as drug targets
GB2434225A (en) 2006-01-13 2007-07-18 Cytokinetics Inc Random forest modelling of cellular phenotypes
CA2644760A1 (en) 2006-03-03 2007-09-13 Entelos, Inc. Apparatus and method for computer modeling respiratory disease
US7627156B2 (en) 2006-03-22 2009-12-01 Volcano Corporation Automated lesion analysis based upon automatic plaque characterization according to a classification criterion
US7899764B2 (en) 2007-02-16 2011-03-01 Siemens Aktiengesellschaft Medical ontologies for machine learning and decision support
US8296247B2 (en) 2007-03-23 2012-10-23 Three Palm Software Combination machine learning algorithms for computer-aided detection, review and diagnosis
US8553832B2 (en) 2007-05-21 2013-10-08 Siemens Aktiengesellschaft Device for obtaining perfusion images
WO2009105530A2 (en) 2008-02-19 2009-08-27 The Trustees Of The University Of Pennsylvania System and method for automated segmentation, characterization, and classification of possibly malignant lesions and stratification of malignant tumors
US8781250B2 (en) 2008-06-26 2014-07-15 Microsoft Corporation Image deconvolution using color priors
US8970578B2 (en) 2008-12-19 2015-03-03 Szilard Voros System and method for lesion-specific coronary artery calcium quantification
DE102009006636B4 (en) * 2008-12-30 2016-02-18 Siemens Aktiengesellschaft Method for determining a 2D contour of a vessel structure depicted in 3D image data
WO2010099016A1 (en) * 2009-02-25 2010-09-02 Worcester Polytechnic Institute Automatic vascular model generation based on fluid-structure interactions (fsi)
US9405886B2 (en) 2009-03-17 2016-08-02 The Board Of Trustees Of The Leland Stanford Junior University Method for determining cardiovascular information
US8108311B2 (en) 2009-04-09 2012-01-31 General Electric Company Systems and methods for constructing a local electronic medical record data store using a remote personal health record server
US20110245650A1 (en) * 2010-04-02 2011-10-06 Kerwin William S Method and System for Plaque Lesion Characterization
WO2011158165A2 (en) 2010-06-13 2011-12-22 Angiometrix Corporation Diagnostic kit and method for measuring balloon dimension in vivo
US8157742B2 (en) 2010-08-12 2012-04-17 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8315812B2 (en) 2010-08-12 2012-11-20 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9119540B2 (en) 2010-09-16 2015-09-01 Siemens Aktiengesellschaft Method and system for non-invasive assessment of coronary artery disease
EP2646939A2 (en) 2010-11-30 2013-10-09 Orange Phr/emr retrieval system based on body part recognition and method of operation thereof
WO2012094655A2 (en) 2011-01-07 2012-07-12 Indiana University Research And Technology Corporation Deductive multiscale simulation using order parameters
US8798984B2 (en) 2011-04-27 2014-08-05 Xerox Corporation Method and system for confidence-weighted learning of factored discriminative language models
US9974508B2 (en) 2011-09-01 2018-05-22 Ghassan S. Kassab Non-invasive systems and methods for determining fractional flow reserve
US10034614B2 (en) 2012-02-29 2018-07-31 General Electric Company Fractional flow reserve estimation
US8548778B1 (en) 2012-05-14 2013-10-01 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US9247918B2 (en) 2012-07-09 2016-02-02 Siemens Aktiengesellschaft Computation of hemodynamic quantities from angiographic data
KR101939778B1 (en) 2012-07-27 2019-01-18 삼성전자주식회사 Method and apparatus for determining blood flow requirement, method and apparatus for producing blood flow image, method and apparatus for processing myocardial perfusion image
US10433740B2 (en) 2012-09-12 2019-10-08 Heartflow, Inc. Systems and methods for estimating ischemia and blood flow characteristics from vessel geometry and physiology
US9262581B2 (en) 2012-09-24 2016-02-16 Heartflow, Inc. Method and system for facilitating physiological computations
US9320487B2 (en) 2012-09-25 2016-04-26 The Johns Hopkins University Method for estimating flow rates, pressure gradients, coronary flow reserve, and fractional flow reserve from patient specific computed tomography angiogram-based contrast distribution data
US9265473B2 (en) 2012-09-25 2016-02-23 The Johns Hopkins University Method for estimating flow rates and pressure gradients in arterial networks from patient specific computed tomography angiogram-based contrast distribution data
WO2014049574A2 (en) 2012-09-28 2014-04-03 Creed Jerett Methods for the stabilization of arterial plaque for the treatment of ischemic stroke and peripheral artery disease
US9675301B2 (en) 2012-10-19 2017-06-13 Heartflow, Inc. Systems and methods for numerically evaluating vasculature
EP3723041A1 (en) 2012-10-24 2020-10-14 CathWorks Ltd. Automated measurement system and method for coronary artery disease scoring
US9858387B2 (en) 2013-01-15 2018-01-02 CathWorks, LTD. Vascular flow assessment
BR112015010012A2 (en) 2012-11-06 2017-07-11 Koninklijke Philips Nv method; and system
JP6305742B2 (en) 2012-11-30 2018-04-04 キヤノンメディカルシステムズ株式会社 Medical image diagnostic apparatus and display method
JP6091870B2 (en) 2012-12-07 2017-03-08 東芝メディカルシステムズ株式会社 Blood vessel analysis device, medical image diagnostic device, blood vessel analysis method, and blood vessel analysis program
US8983993B2 (en) 2012-12-18 2015-03-17 Sap Se Data warehouse queries using SPARQL
US9042613B2 (en) 2013-03-01 2015-05-26 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
US9424395B2 (en) 2013-03-04 2016-08-23 Heartflow, Inc. Method and system for sensitivity analysis in modeling blood flow characteristics
US10052031B2 (en) 2013-03-04 2018-08-21 Siemens Healthcare Gmbh Determining functional severity of stenosis
US20150324527A1 (en) 2013-03-15 2015-11-12 Northrop Grumman Systems Corporation Learning health systems and methods
US9675257B2 (en) 2013-03-15 2017-06-13 3Dt Holdings, Llc Impedance devices and methods to use the same to obtain luminal organ measurements
US8824752B1 (en) 2013-03-15 2014-09-02 Heartflow, Inc. Methods and systems for assessing image quality in modeling of patient anatomic or blood flow characteristics
WO2014170385A1 (en) 2013-04-18 2014-10-23 Koninklijke Philips N.V. Stenosis therapy planning
US20140365239A1 (en) 2013-06-05 2014-12-11 Nuance Communications, Inc. Methods and apparatus for facilitating guideline compliance
JP6188434B2 (en) 2013-06-07 2017-08-30 東芝メディカルシステムズ株式会社 Medical image processing apparatus and medical image processing program
AU2014296238B2 (en) 2013-07-30 2016-10-06 Heartflow, Inc. Method and system for modeling blood flow with boundary conditions for optimized diagnostic performance
US9043191B2 (en) 2013-08-16 2015-05-26 Heartflow, Inc. Systems and methods for identifying personalized vascular implants from patient-specific anatomic data
EP3036715B1 (en) 2013-08-21 2018-12-19 Koninklijke Philips N.V. Segmentation apparatus for interactively segmenting blood vessels in angiographic image data
US9805463B2 (en) 2013-08-27 2017-10-31 Heartflow, Inc. Systems and methods for predicting location, onset, and/or change of coronary lesions
US9629563B2 (en) 2013-09-04 2017-04-25 Siemens Healthcare Gmbh Method and system for functional assessment of renal artery stenosis from medical images
CN105517492B (en) 2013-09-06 2019-10-18 皇家飞利浦有限公司 For handling the processing equipment of cardiac data
US9700219B2 (en) 2013-10-17 2017-07-11 Siemens Healthcare Gmbh Method and system for machine learning based assessment of fractional flow reserve
CN105792855A (en) 2013-10-18 2016-07-20 分子制药洞察公司 Methods of using SPECT/CT analysis for staging cancer
US10049447B2 (en) 2013-11-06 2018-08-14 H. Lee Moffitt Cancer Center and Research Insititute, Inc. Pathology case review, analysis and prediction
US8977339B1 (en) 2013-12-05 2015-03-10 Intrinsic Medical Imaging Llc Method for assessing stenosis severity through stenosis mapping
US9311570B2 (en) * 2013-12-06 2016-04-12 Kabushiki Kaisha Toshiba Method of, and apparatus for, segmentation of structures in medical images
US9220418B2 (en) 2013-12-18 2015-12-29 Heartflow, Inc. Systems and methods for predicting coronary plaque vulnerability from patient-specific anatomic image data
WO2015091892A1 (en) 2013-12-19 2015-06-25 Comprehensive Biomarker Center Gmbh Mirnas as non-invasive biomarkers for parkinson's disease
US20150234921A1 (en) 2014-02-20 2015-08-20 Howe Li Web-based Search Tool for Searching Literature Adverse Event Case Report
US9501622B2 (en) 2014-03-05 2016-11-22 Heartflow, Inc. Methods and systems for predicting sensitivity of blood flow calculations to changes in anatomical geometry
JP6262027B2 (en) 2014-03-10 2018-01-17 東芝メディカルシステムズ株式会社 Medical image processing device
NL2012459B1 (en) 2014-03-18 2016-01-08 Medis Ass B V Method and device for determining deviation in pressure in a blood vessel.
US9390232B2 (en) 2014-03-24 2016-07-12 Heartflow, Inc. Systems and methods for modeling changes in patient-specific blood vessel geometry and boundary conditions
US9785746B2 (en) 2014-03-31 2017-10-10 Heartflow, Inc. Systems and methods for determining blood flow characteristics using flow ratio
US9773219B2 (en) 2014-04-01 2017-09-26 Heartflow, Inc. Systems and methods for using geometry sensitivity information for guiding workflow
US9058692B1 (en) 2014-04-16 2015-06-16 Heartflow, Inc. Systems and methods for image-based object modeling using multiple image acquisitions or reconstructions
US9449145B2 (en) 2014-04-22 2016-09-20 Heartflow, Inc. Systems and methods for virtual contrast agent simulation and computational fluid dynamics (CFD) to compute functional significance of stenoses
US8958623B1 (en) 2014-04-29 2015-02-17 Heartflow, Inc. Systems and methods for correction of artificial deformation in anatomic modeling
US9595089B2 (en) 2014-05-09 2017-03-14 Siemens Healthcare Gmbh Method and system for non-invasive computation of hemodynamic indices for coronary artery stenosis
US9754082B2 (en) 2014-05-30 2017-09-05 Heartflow, Inc. Systems and methods for reporting blood flow characteristics
DE102014210591B4 (en) 2014-06-04 2022-09-22 Siemens Healthcare Gmbh Fluid dynamic analysis of a vascular tree using angiography
US9747525B2 (en) 2014-06-16 2017-08-29 Siemens Healthcare Gmbh Method and system for improved hemodynamic computation in coronary arteries
US9888968B2 (en) 2014-07-22 2018-02-13 Siemens Healthcare Gmbh Method and system for automated therapy planning for arterial stenosis
US9195801B1 (en) 2014-08-05 2015-11-24 Heartflow, Inc. Systems and methods for treatment planning based on plaque progression and regression curves
US9386933B2 (en) 2014-08-29 2016-07-12 Heartflow, Inc. Systems and methods for determination of blood flow characteristics and pathologies through modeling of myocardial blood supply
US9390224B2 (en) 2014-08-29 2016-07-12 Heartflow, Inc. Systems and methods for automatically determining myocardial bridging and patient impact
US9668700B2 (en) 2014-09-09 2017-06-06 Heartflow, Inc. Method and system for quantifying limitations in coronary artery blood flow during physical activity in patients with coronary artery disease
US9292659B1 (en) 2014-10-29 2016-03-22 Heartflow, Inc. Systems and methods for vessel reactivity to guide diagnosis or treatment of cardiovascular disease
US9594876B2 (en) 2014-11-04 2017-03-14 Heartflow, Inc. Systems and methods for simulation of occluded arteries and optimization of occlusion-based treatments
US9336354B1 (en) 2014-11-04 2016-05-10 Heartflow, Inc. Systems and methods for simulation of hemodialysis access and optimization
US9349178B1 (en) 2014-11-24 2016-05-24 Siemens Aktiengesellschaft Synthetic data-driven hemodynamic determination in medical imaging
US20160203289A1 (en) 2015-01-14 2016-07-14 Heartflow, Inc. Systems and methods for embolism prediction using embolus source and destination probabilities
US10002419B2 (en) 2015-03-05 2018-06-19 Siemens Healthcare Gmbh Direct computation of image-derived biomarkers
EP3282936B1 (en) 2015-04-17 2020-08-19 HeartFlow, Inc. Systems and methods for assessment of tissue function based on vascular disease
US9839483B2 (en) 2015-04-21 2017-12-12 Heartflow, Inc. Systems and methods for risk assessment and treatment planning of arterio-venous malformation
US10275876B2 (en) 2015-06-12 2019-04-30 International Business Machines Corporation Methods and systems for automatically selecting an implant for a patient
US9785748B2 (en) 2015-07-14 2017-10-10 Heartflow, Inc. Systems and methods for estimating hemodynamic forces acting on plaque and monitoring patient risk
US11071501B2 (en) * 2015-08-14 2021-07-27 Elucid Bioimaging Inc. Quantitative imaging for determining time to adverse event (TTE)
US10176408B2 (en) * 2015-08-14 2019-01-08 Elucid Bioimaging Inc. Systems and methods for analyzing pathologies utilizing quantitative imaging
US11113812B2 (en) * 2015-08-14 2021-09-07 Elucid Bioimaging Inc. Quantitative imaging for detecting vulnerable plaque
US11087459B2 (en) * 2015-08-14 2021-08-10 Elucid Bioimaging Inc. Quantitative imaging for fractional flow reserve (FFR)
US11094058B2 (en) 2015-08-14 2021-08-17 Elucid Bioimaging Inc. Systems and method for computer-aided phenotyping (CAP) using radiologic images
US10740880B2 (en) 2017-01-18 2020-08-11 Elucid Bioimaging Inc. Systems and methods for analyzing pathologies utilizing quantitative imaging
EP4393386A3 (en) 2017-01-23 2024-09-11 Shanghai United Imaging Healthcare Co., Ltd. Method and system for analyzing blood flow condition
US10503959B2 (en) 2017-03-03 2019-12-10 Case Western Reserve University Predicting cancer progression using cell run length features
EP3431005A1 (en) * 2017-07-19 2019-01-23 Koninklijke Philips N.V. Inflammation estimation from x-ray image data

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050043614A1 (en) * 2003-08-21 2005-02-24 Huizenga Joel T. Automated methods and systems for vascular plaque detection and analysis
US20190247050A1 (en) * 2006-11-21 2019-08-15 David S. Goldsmith Integrated system for the infixion and retrieval of implants
US9572495B2 (en) * 2009-09-23 2017-02-21 Lightlab Imaging, Inc. Optical coherence tomography lumen morphology and vascular resistance measurements methods for blood vessel evaluations
US20170337343A1 (en) * 2010-07-16 2017-11-23 The University Of Houston System Methods of computing pericardial and abdominal fat and methods for motion compensation
US20170245829A1 (en) * 2013-03-14 2017-08-31 Volcano Corporation System and Method of Adventitial Tissue Characterization
US20160203599A1 (en) * 2013-08-13 2016-07-14 H. Lee Moffitt Cancer Center And Research Institute, Inc. Systems, methods and devices for analyzing quantitative information obtained from radiological images
US20170265832A1 (en) * 2014-08-15 2017-09-21 Oxford University Innovation Limited Method for characterisation of perivascular tissue
US20190287276A1 (en) * 2016-10-31 2019-09-19 Oxford University Innovation Limited Method
US20200126672A1 (en) * 2018-10-17 2020-04-23 Heartflow, Inc. Systems and methods for assessing cardiovascular disease and treatment effectiveness from adipose tissue
US20200237329A1 (en) * 2019-01-25 2020-07-30 Cleerly, Inc. Systems and method of characterizing high risk plaques

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11751831B2 (en) 2019-01-25 2023-09-12 Cleerly, Inc. Systems and methods for characterizing high risk plaques
US11642092B1 (en) 2019-01-25 2023-05-09 Cleerly, Inc. Systems and methods for characterizing high risk plaques
US11759161B2 (en) 2019-01-25 2023-09-19 Cleerly, Inc. Systems and methods of characterizing high risk plaques
US11766229B2 (en) 2020-01-07 2023-09-26 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11832982B2 (en) 2020-01-07 2023-12-05 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11737718B2 (en) 2020-01-07 2023-08-29 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11751829B2 (en) 2020-01-07 2023-09-12 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11690586B2 (en) 2020-01-07 2023-07-04 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11751826B2 (en) 2020-01-07 2023-09-12 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11751830B2 (en) 2020-01-07 2023-09-12 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11672497B2 (en) 2020-01-07 2023-06-13 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11766230B2 (en) 2020-01-07 2023-09-26 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11660058B2 (en) 2020-01-07 2023-05-30 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11779292B2 (en) 2020-01-07 2023-10-10 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11730437B2 (en) 2020-01-07 2023-08-22 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11861833B2 (en) 2020-01-07 2024-01-02 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11896415B2 (en) 2020-01-07 2024-02-13 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US12097063B2 (en) 2020-01-07 2024-09-24 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US12076175B2 (en) 2020-01-07 2024-09-03 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11967078B2 (en) 2020-01-07 2024-04-23 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11969280B2 (en) 2020-01-07 2024-04-30 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US12023190B2 (en) 2020-01-07 2024-07-02 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11948301B2 (en) 2022-03-10 2024-04-02 Cleerly, Inc. Systems, devices, and methods for non-invasive image-based plaque analysis and risk determination
US11922627B2 (en) 2022-03-10 2024-03-05 Cleerly, Inc. Systems, devices, and methods for non-invasive image-based plaque analysis and risk determination

Also Published As

Publication number Publication date
CN113439287A (en) 2021-09-24
KR20210121062A (en) 2021-10-07
WO2021026125A1 (en) 2021-02-11
JP2023093605A (en) 2023-07-04
US20210042918A1 (en) 2021-02-11
US20230386026A1 (en) 2023-11-30
JP2022543330A (en) 2022-10-12
US11508063B2 (en) 2022-11-22
EP3899864A1 (en) 2021-10-27
EP3899864A4 (en) 2022-08-31

Similar Documents

Publication Publication Date Title
US11508063B2 (en) Non-invasive measurement of fibrous cap thickness
US12131472B2 (en) Non-invasive imaging to determine health and disease
KR102491988B1 (en) Methods and systems for using quantitative imaging
US20210312622A1 (en) Quantitative imaging for instantaneous wave-free ratio (ifr)
US20210282719A1 (en) Non-invasive risk stratification for atherosclerosis
US12008751B2 (en) Quantitative imaging for detecting histopathologically defined plaque fissure non-invasively
Lafata et al. Radiomics: a primer on high-throughput image phenotyping
US20190180153A1 (en) Methods and systems for utilizing quantitative imaging
US20190180438A1 (en) Methods and systems for utilizing quantitative imaging
US20210390689A1 (en) Non-invasive quantitative imaging biomarkers of atherosclerotic plaque biology
Göçeri Fully automated liver segmentation using Sobolev gradient‐based level set evolution
US11657486B2 (en) Systems and methods for improving soft tissue contrast, multiscale modeling and spectral CT
US20220012865A1 (en) Quantitative imaging for detecting histopathologically defined plaque erosion non-invasively
US20240371000A1 (en) Combined assessment of morphological and perivascular disease markers

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: ELUCID BIOIMAGING INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BUCKLER, ANDREW J.;REEL/FRAME:062958/0818

Effective date: 20200801

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION