WO2012140396A1 - Biomedical visualisation - Google Patents
- Publication number
- WO2012140396A1 · PCT/GB2012/000333
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data set
- data
- histological
- modality
- information
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4416—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5247—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
Definitions
- the present invention relates to a method and apparatus for generating a three-dimensional biomedical data set.
- the invention further relates to a method and apparatus for displaying the generated biomedical data set.
- US — ultrasound
- 3D — three-dimensional
- Other visualisation techniques such as electrical impedance (EI) mapping enable better biomedical characterisation of tissue type, but may not contain the same high resolution structural information as US.
- EI — electrical impedance
- a method of generating a three-dimensional biomedical data set comprising: acquiring a first set of three-dimensional biomedical data utilising a first modality; acquiring a second set of three-dimensional biomedical data utilising a second modality; generating a histological data set in dependence on the second set of data; combining the first data set with said histological data set, wherein said combined data set provides structural and histological information.
- the histological data set is generated utilising a look-up table.
- the look-up table may comprise reference histological data correlated to a reference data set acquired using the second modality.
- the histological data set comprises tissue type classification.
- the histological data set may comprise malignancy type classification.
- the malignancy type classification may define the probability of the tissue being malignant or benign.
- the first modality primarily provides structural information, and may utilise ultrasound.
- the second modality primarily provides histological information, and may use electrical impedance.
- the second modality may utilise a plurality of frequencies to acquire the second data set.
- the method further comprises correlating the first and second sets of data, such that the coordinates of each set of data are substantially aligned.
- the method further comprises displaying said combined data set, the combined data set comprising a plurality of voxels, wherein each said voxel is displayed in dependence on said first data set and said histological data set.
- the hue of each voxel may be dependent on the histological data set.
- At least one of: the intensity; opacity; and gray level, of each voxel may be dependent on the first data set.
- each structural feature of the combined data set is provided with a label in dependence on the histological data set.
- the combined data set is displayed utilising an autostereoscopic display.
- the combined data set may be rendered for display utilising ray casting, and the autostereoscopic display may be electronic or printed.
- an apparatus for three-dimensional biomedical visualisation comprising: means for generating a three-dimensional biomedical data set according to the above method; and an autostereoscopic display.
- an apparatus for generating a three-dimensional biomedical data set comprising: an input for receiving a first set of three-dimensional biomedical data; a further input for receiving a second set of three-dimensional biomedical data; means (preferably in the form of a processor and associated memory) for generating a histological data set in dependence on the second set of data; and means (preferably in the form of a processor and associated memory) for combining the first data set with said histological data set, wherein said combined data set provides structural and histological information.
- the apparatus further comprises means (preferably in the form of a biomedical data acquisition device) for acquiring the first set of three-dimensional biomedical data utilising a first modality.
- the apparatus preferably further comprises means (preferably in the form of a biomedical data acquisition device) for acquiring the second set of three-dimensional biomedical data utilising a second modality.
- the histological data set is generated utilising a look-up table.
- the look-up table may comprise reference histological data correlated to a reference data set acquired using the second modality.
- the histological data set comprises tissue type classification.
- the histological data set may comprise malignancy type classification.
- the malignancy type classification may define the probability of the tissue being malignant or benign.
- the first modality primarily provides structural information, and may utilise ultrasound.
- the second modality primarily provides histological information, and may use electrical impedance.
- the second modality may utilise a plurality of frequencies to acquire the second data set.
- the apparatus further comprises means for correlating the first and second sets of data, such that the coordinates of each set of data are substantially aligned.
- the apparatus further comprises means for displaying said combined data set, the combined data set comprising a plurality of voxels, wherein each said voxel is displayed in dependence on said first data set and said histological data set.
- the hue of each voxel may be dependent on the histological data set. At least one of: the intensity; opacity; and gray level, of each voxel may be dependent on the first data set.
- each structural feature of the combined data set is provided with a label in dependence on the histological data set.
- the combined data set is displayed utilising an autostereoscopic display.
- the combined data set may be rendered for display utilising ray casting, and the autostereoscopic display may be electronic or printed.
- the three-dimensional biomedical information is displayed by autostereoscopic means.
- the autostereoscopic display means is an electronic display with suitable autostereoscopic encoding.
- the three-dimensional biomedical information is displayed in real-time on an autostereoscopic electronic display.
- the autostereoscopic display means is a printed image with suitable autostereoscopic encoding.
- the three-dimensional biomedical information is documented and/or archived in an autostereoscopic printed display.
- the structural information is represented by voxel opacity.
- the histological classification information is represented by voxel hue.
- one measurement modality is ultrasound, and the other modality is electrical impedance.
- the electrical impedance is measured at a plurality of frequencies.
- histological classification information is obtained by comparison of electrical impedance measurement data to electrical impedance reference data for known histological classifications.
- the biomedical visualisation is applied to mammography.
- the biomedical visualisation is applied to breast cancer detection.
- combination of data sets preferably refers to the fusion of two data sets into a single new data set. This process is distinct from co-registration and co-display of two distinct data sets.
- where histological information is referred to, it could equally refer to: cytological information; pathological information; tissue type information; cell type information; or cell characteristic information.
- three-dimensional (biomedical) data preferably connotes a plurality of data, each associated with a respective position in a three-dimensional volume, and each datum relating to a (biomedical) property at the position it is associated with.
- Data from projections (such as x-ray images) are therefore not three-dimensional data in this sense.
- the combination of two different datasets preferably comprises associating two different groups of data with positions in a common volume.
- each position in the volume is associated with a value for the first data type and a value for the second data type.
- the combination of more than two datasets is analogous.
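As a minimal sketch of this common-volume representation (the array shape and field names here are purely illustrative, not taken from the patent), each voxel record can carry one value per data type:

```python
import numpy as np

# One record per voxel of the common volume: a structural value from the
# first modality and a tissue-class value derived from the second.
combined = np.zeros((8, 8, 8), dtype=[("structure", np.float32),
                                      ("tissue_class", np.int8)])

# Populate a single voxel: both data types are addressed at the same position.
combined["structure"][2, 3, 4] = 0.75
combined["tissue_class"][2, 3, 4] = 1
```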
- the invention also provides a computer program and a computer program product for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, and a computer readable medium having stored thereon a program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
- the invention also provides a signal embodying a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, a method of transmitting such a signal, and a computer product having an operating system which supports a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
- Any apparatus feature as described herein may also be provided as a method feature, and vice versa.
- means plus function features may be expressed alternatively in terms of their corresponding structure, such as a suitably programmed processor and associated memory.
- any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination.
- method aspects may be applied to apparatus aspects, and vice versa.
- any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.
- Figure 1 illustrates the procedure of combining three-dimensional data sets from two different measurement modalities to obtain combined 3D tissue type/structure visualisation of biological tissue;
- Figure 2 shows combination of three-dimensional data sets from two different measurement modalities, one of which provides structural information with high resolution, and the other of which provides tissue type information with high specificity;
- Figure 3 shows the process of combining 3D US and EI data in more detail;
- Figure 4 shows individual data sets that occur in different coordinate systems (top view);
- Figure 5 shows individual data sets that occur in different coordinate systems (side view);
- Figure 6 shows a comparison between simply colouring US data with raw EI data, and the process of combining US structural information with tissue type information derived from EI;
- Figure 7 shows how a 3D display allows easier recognition of features;
- Figure 8 shows the two differing processes of co-registration / co-display and combination by fusion;
- Figure 9 shows a scale with different levels of abstraction (requiring interpretive skill) with examples of modalities.
- Figure 10 shows an apparatus for three-dimensional biomedical visualisation.
- FIG. 1 illustrates the procedure for the system.
- Biological tissue 100 is interrogated with two different measurement modalities 102, 112.
- a first measurement modality 102 provides 3D measurement data 104 that may receive further processing 106 before using the measurement data for tissue type classification 108, resulting in 3D tissue type data 110.
- a second measurement modality 112 provides 3D measurement data 114 that may receive further processing 116 and results in 3D structure data 118.
- FIG. 1 illustrates how two measurement modalities are combined into a monolithic tissue type/structure visualisation.
- the system extracts the core strength of the first measurement modality 102 to give highly specific tissue type information 110, and the core strength of the second measurement modality to give high resolution spatial structural information 118.
- the system combines the two data sets to provide and visualise 3D tissue type structure data 122.
- the combination of different modalities of imaging to produce a 3D image is distinct from the simpler process of co-registering and overlaying separate data sets.
- Co-registration and overlaying effectively gives two separate displays, shown on the same coordinate frame.
- the process of combination entails fusion of two data sets into a single combined representation, generating a new data set.
- the combination is an irreversible transformation (a many-to-one mapping). This is represented in Figure 8.
- the results still require the same specialist interpretative skills from the clinician as the individual modalities would in order to have diagnostic value.
- a new data set is created that requires less abstract interpretation by providing direct, higher level information about tissue type (cell characteristics).
- Figure 9 illustrates examples of modalities that require different levels of abstract interpretation.
- the clinician needs more training to interpret results.
- the combination by fusion creates a data set that is at a higher level of immediate clinical relevance.
- the fused representation still requires interpretation on the part of the clinician, albeit at a less abstract level.
- Image pixel values could, for example, reflect probabilities of spatial areas having tissue features that map to a disease process, rather than attempting to offer a direct diagnosis of a particular pathology with absolute accuracy. Such accuracy is not realistic; interpretation and balance are therefore an essential part of the diagnostic process and must be maintained in any new modality and visualisation method.
- the use of probability images is described in more detail below.
- the tissue type information relates to cell type or cell characteristic information.
- Such information may be cytological information (concerning the function and structure of cells), histological information (concerning the microscopic structure of tissues), or pathological information (concerning the causes of diseases, disease processes, and examination for diagnostic or forensic purposes).
- the tissue type information is referred to as histological information; where histological information is referred to, it equally could refer to: cytological information; pathological information; tissue type information; cell type information; or cell characteristic information.
- US — ultrasound
- EI — electrical impedance
- US is a well-established technique that provides detailed information on the acoustic reflectivity of substructures within a body. In particular, it provides boundary information for substructures, allowing determination of the form of a substructure.
- 3D US permits distinguishing spatial positioning, shape, and relationship to other substructures within a body. In medical imaging this allows quick, non-invasive visualisation of subcutaneous structures including internal organs, structures, and tissues within the human body (e.g. tendons, muscles, joints, vessels) for possible pathology or lesions. Resolution of less than a millimetre can be achieved.
- EI is a non-invasive electrical technique that allows characterisation of the electrical properties of tissues. The electrical properties of the tissue may be correlated with its physiological properties.
- EI measurements may be performed at one, two, or a multitude of frequencies.
- the measurement may include a frequency sweep.
- the use of multiple EI measurement frequencies may assist identification of physiological properties of the tissue.
- the EI measurement data can be used to produce high-level tissue characterisation information.
- EI is capable of producing 3D data characterising substructures within a body.
- while EI provides sensitive detection of tissue type, it typically has lower spatial resolution than, for instance, US.
- the system overcomes this problem by combining the higher resolution structural data from 3D US with the lower resolution tissue type data that can be extracted from 3D EI.
- Figure 3 illustrates the process of combining 3D US and EI data in more detail.
- the raw US data 300 is subjected to data processing 116.
- the data processing is analogous to that used for 2D US data, applying well-known operations such as speckle reduction 302 (for instance by a 3D truncated median filter) and 3D feature emphasis 304, in particular edge detection (for instance with a 3D Sobel operator), to emphasise structural features.
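These two preprocessing steps can be sketched as follows. This is a minimal illustration, not the patented implementation: it uses a plain 3D median filter (the patent mentions a truncated median filter) and a 3D Sobel gradient magnitude for edge emphasis, with a hypothetical function name and combination rule:

```python
import numpy as np
from scipy import ndimage

def preprocess_us_volume(us, median_size=3):
    """Speckle reduction by a 3D median filter, followed by 3D edge
    emphasis via the Sobel gradient magnitude."""
    smoothed = ndimage.median_filter(us.astype(np.float32), size=median_size)
    # 3D gradient magnitude from per-axis Sobel operators
    grads = [ndimage.sobel(smoothed, axis=ax) for ax in range(3)]
    edges = np.sqrt(sum(g * g for g in grads))
    # add the edge response to the smoothed volume to emphasise boundaries
    return smoothed + edges

vol = np.random.rand(16, 16, 16).astype(np.float32)
out = preprocess_us_volume(vol)
```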
- the processed US data is composed of intensity/opacity (grey level) data representing structural information.
- Raw EI data 306 is used to obtain tissue type information 110.
- the tissue type classification step 108 uses a lookup database and/or model 310, or a lookup table that correlates raw EI data (at one, two, or a multitude of measurement frequencies) with tissue type information, and provides a 3D tissue-type data set 110.
- the EI measurement data is compared to EI measurements corresponding to samples of known tissue types, and for matching EI data the tissue types are assumed to be the same.
- the tissue type information may comprise classification of tissue types, or distinction of malignant and benign tissue.
- the tissue classification can include further quantitative information, for instance a probability or confidence measure that is associated with a particular classification. For example a probability measure may accompany a malignant/benign classification. A variety of statistical measures are applicable, for instance confidence level or chi square value.
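A per-voxel classification with an accompanying probability measure might be sketched as below. The reference spectra, class meanings, and distance-based confidence rule are all hypothetical stand-ins for the patent's lookup database/model:

```python
import numpy as np

# Hypothetical reference table: one EI spectrum per tissue class, measured
# at the same frequencies as the acquisition (illustrative values only).
REF_SPECTRA = np.array([
    [120.0, 95.0, 70.0],   # class 0: benign
    [60.0, 40.0, 25.0],    # class 1: malignant
])

def classify_voxel(spectrum):
    """Nearest-reference classification with a soft confidence measure."""
    d = np.linalg.norm(REF_SPECTRA - spectrum, axis=1)
    w = np.exp(-d)                  # closer spectra get larger weights
    p = w / w.sum()                 # normalise to a probability per class
    k = int(np.argmax(p))
    return k, float(p[k])           # (class index, probability)

cls, prob = classify_voxel(np.array([118.0, 96.0, 69.0]))
```

Applied voxel-by-voxel over the EI volume, this yields the 3D tissue-type data set with a confidence value per voxel.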
- Standard transformation operations, as are well known in the art, are used to express the geometric relationship between the US and the EI volumes, and to correlate the coordinate systems of the individual data sets. As illustrated in Figure 4 (top view) and Figure 5 (side view), the individual data sets 400, 402 may also occur in different coordinate systems, such as cylindrical 402, 502 and Cartesian 400, 500.
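One step of such a correlation, mapping Cartesian voxel centres (e.g. from the US grid) into a cylindrical frame so the other volume can be sampled at those positions, could look like this. The function name and origin handling are assumptions; a full registration would also account for rotation and scaling between the frames:

```python
import numpy as np

def cartesian_to_cylindrical(xyz_points, origin=(0.0, 0.0, 0.0)):
    """Convert an (N, 3) array of Cartesian positions to cylindrical
    (r, theta, z) coordinates about the given origin."""
    p = np.asarray(xyz_points, dtype=float) - np.asarray(origin, dtype=float)
    r = np.hypot(p[:, 0], p[:, 1])          # radial distance in the xy-plane
    theta = np.arctan2(p[:, 1], p[:, 0])    # azimuth angle
    return np.stack([r, theta, p[:, 2]], axis=1)

cyl = cartesian_to_cylindrical([[1.0, 1.0, 2.0]])
```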
- the structural information from the processed 3D US data set is associated with tissue type information from the processed 3D EI data in a 3D data combination step 120. Starting with a voxel of 3D US data (intensity/opacity or gray level data), that voxel then has a colour applied that represents tissue type.
- the colour is determined from the volume coverage of the US voxel, and the value of the tissue-type voxel in its immediate neighbourhood.
- structural features are presented and labelled with the associated tissue type, rendering the measurement results easy to interpret.
- the 3D tissue type structure data is displayed for further investigation and measurement.
- Labelling of the structural features may be achieved with a colour scheme as described above, where one colour represents a specific type of tissue, a second colour represents a second type of tissue, and so forth.
- the colour can represent a probability measure, such as confidence level, that is associated with the tissue type data. For example, if tissue is subject to either benign or malignant classification, then one colour represents a high probability of malignant tissue, a second colour represents a moderate probability of malignant tissue, and so forth.
- Other means of indicating the tissue type of a structural feature are possible, for instance: text labels placed proximal to the structural feature; text labels that appear when the structural feature is activated (for instance by mouse hover-over); or annotation of the structural feature with a symbol and a legend of symbols.
- the system preferably includes 3D display of the three-dimensional data.
- An example of a 3D display is an auto-stereoscopic display. Unlike, for instance, tomographic display, true 3D display shows a volume continuum. With an auto-stereoscopic display, parallax and other natural visual cues provide highly intuitive and efficient visualisation, as illustrated in Figure 7. If the data volume 700 (top view) contains one object 702 in front of another object 704, then the front view 706 shows the two objects 702, 704 almost coinciding.
- With a 3D display such as an autostereoscopic display, a plurality of views 706, 708, 710 can be seen (in dependence on the viewing position), and the viewer intuitively recognises that one object 702 is in front of the other object 704.
- 3D electronic autostereoscopic displays provide dynamic visualisation, and can be used for real-time imaging, and for user-responsive display.
- 3D auto-stereoscopic displays that are printed to appropriate hardcopy 3D decoder material provide static visualisation, and can be used for portable images that can be easily communicated for diagnosis and measurement.
- a suitable method of rendering 3D data for autostereoscopic display is disclosed, and is hereby incorporated by reference, in a GB patent application co-filed today titled "Rendering Images for Autostereoscopic Display", and having agent reference P36726GB.
- Other 3D displays such as holographic displays or volumetric displays are equally suitable.
- (c) axes/coordinate freedom during measurement can be achieved.
- Collecting the data sets in three dimensions maintains the integrity of volume spatial information and this in turn allows the axes to be non-coincident during capture.
- the tissue type/structure image data can be interrogated throughout the volume, as the spatial integrity is maintained. Measurement can be made in the native coordinate frames of the modalities' data sets.
- the combined tissue type/structure data set can be addressed at any location in space producing highly specific spatial Information for clinical and surgical use.
- the resulting combined data set may be rendered for interactive autostereoscopic 3D display with natural parallax (look-around). Suitable controls are used to aid full interactive exploration of the data set. Metrological features can be overlaid in real time and used, together with suitable controls, to effect measurement of volumes and distances in full 3D space with natural 3D visual feedback.
- the combined data set may also be stored in a patient database system (a PACS system, for example), ideally together with the source individual modality volumes and visualisation parameters.
- a patient database system a PACS system, for example
- a high resolution 3D print with natural parallax may be produced.
- a high resolution 3D print is further useful for ease of communication, static viewing (e.g. using a light box), and in situations where an electronic display is not available.
- the system can be applied to breast cancer detection by electrical impedance mammography (EIM).
- EIM electrical impedance mammography
- the system can equally be applied to the detection of other histological pathologies such as other carcinomas, ulcers, or thrombosis.
The tissue type/structure data allows determination of whether breast tissue is malignant without the need for a biopsy. The spatial locations of clinically significant areas can be related dimensionally to easily identifiable structures. Breast cancer diagnosis and measurement are improved by the pre-clinical, pre-surgical visualisation environment the system provides.
Visualising the combined higher resolution structural data from 3D US with the lower resolution tissue type data that can be extracted from 3D EI may be implemented with a volume ray-casting procedure. In volume ray-casting, a ray is cast through the data volume for each pixel of the rendered image, and the pixel colour value is determined as follows. Four colour components are determined: r, g, b and a (red, green, blue and opacity). As the ray is cast through the data volume, at each stepping position the corresponding value from the US structural information data set is looked up, as is the tissue type information:
TissueTypeValue = TT_data_texture_lookup(RayLookupPosition)
A combined classification operation is then applied to determine the contribution to the current pixel. The classified contribution from the current stepping position to the current pixel colour has both colour and opacity (alpha) components. The colour component is found by lookup in a colour table for the appropriate tissue type value. The intensity value from the US (structural) data volume is assigned to the target alpha (opacity) component, having possibly been soft thresholded:
ClassifiedColour.rgb = TT_colour_map_lookup(TissueTypeValue)
ClassifiedColour.a = optional_soft_threshold(USIntensityValue)
Soft thresholding uses a transfer function with a linear, rather than step, transition; it is parameterised by the data values at the lower and upper limits of the linear transition region. An exemplary implementation is the GLSL 'smoothstep()' function.
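The soft-threshold transfer function mentioned above can be sketched in Python as a minimal analogue of the GLSL smoothstep() built-in (the Hermite form matches the GLSL specification; the parameter names are illustrative):

```python
def smoothstep(edge0, edge1, x):
    """Soft threshold: 0 below edge0, 1 above edge1, smooth ramp between.

    edge0 and edge1 are the lower and upper limits of the linear-like
    transition region described in the text (a sketch, not the patent's
    exact implementation).
    """
    # Normalise x into [0, 1] across the transition region, then clamp.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    # Hermite interpolation gives a smooth transition rather than a hard step.
    return t * t * (3.0 - 2.0 * t)
```

Applied to the US intensity value, this suppresses low-intensity (background) voxels while passing strong structural echoes through with full opacity.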
The contribution of the current stepping position to the current pixel is then evaluated. The emissive classified colour r, g, b components determined above are premultiplied by the associated opacity value, and the result is used to contribute to the final accumulated render pixel colour.
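The per-pixel accumulation described above can be sketched as a front-to-back compositing loop (a minimal sketch: the sampling functions, the colour map, and the early-termination threshold are illustrative assumptions, not the patent's implementation):

```python
def cast_ray(sample_us, sample_tt, tt_colour_map, ray_positions):
    """Accumulate one pixel colour by front-to-back compositing along a ray.

    sample_us(p)  -> US intensity in [0, 1] at position p (structural data)
    sample_tt(p)  -> tissue type value at position p (histological data)
    tt_colour_map -> dict mapping a tissue type value to an (r, g, b) tuple
    """
    acc_rgb = [0.0, 0.0, 0.0]
    acc_alpha = 0.0
    for p in ray_positions:
        # Look up both data sets at the current stepping position.
        intensity = sample_us(p)
        tissue = sample_tt(p)
        # Classified colour: hue from tissue type, opacity from US intensity.
        rgb = tt_colour_map[tissue]
        alpha = intensity  # optionally soft-thresholded first
        # Premultiply by opacity and composite front to back.
        remaining = 1.0 - acc_alpha
        for i in range(3):
            acc_rgb[i] += remaining * alpha * rgb[i]
        acc_alpha += remaining * alpha
        if acc_alpha >= 0.99:  # early ray termination once nearly opaque
            break
    return acc_rgb, acc_alpha
```

The loop makes the division of labour explicit: the histological data set only ever selects the hue, while the structural data set only ever controls the opacity of each contribution.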
The structural data (from US, for example) is not required to evaluate or reconstruct the tissue type information (from EI, for example), unlike in some modalities. The structural data and the tissue type information are two completely independent data sets that can be used standalone, overlaid as two independent data sets, or, as described here, fused (combined) into a single data set with higher immediate clinical relevance than the overlaid data sets, or than either data set alone. Because the two data sets are independent (and not convolved in reconstruction of raw data), the fused data set is of higher information richness and greater confidence in the data is possible. The fusion of the data sets creates a new, distinct data set.
Figure 10 shows an apparatus 1012 for three-dimensional biomedical visualisation. The apparatus 1012 includes an input 1004 for receiving a first set of three-dimensional biomedical data; a further input 1006 for receiving a second set of three-dimensional biomedical data; means 1002 for generating a histological data set in dependence on the second set of data; and means 1000 for combining the first data set with said histological data set. A biomedical acquisition device 1008 (or other means for acquiring the first set of three-dimensional biomedical data) utilises a first modality, and a further biomedical acquisition device 1010 (or other means for acquiring the second set of three-dimensional biomedical data) utilises a second modality.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Radiology & Medical Imaging (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
- Radiation-Therapy Devices (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Architecture (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
The present invention relates to a method and apparatus for generating a three-dimensional biomedical data set. Said method comprises acquiring a first set of three-dimensional biomedical data utilising a first modality, acquiring a second set of three-dimensional biomedical data utilising a second modality, generating a histological data set in dependence on the second set of data, and combining the first data set with said histological data set, wherein said combined data set provides structural and histological information. The invention further relates to a method of and apparatus for displaying the generated biomedical data set.
Description
Biomedical visualisation
The present invention relates to a method and apparatus for generating a three-dimensional biomedical data set. The invention further relates to a method of and apparatus for displaying the generated biomedical data set.

Background
For biomedical visualisation, ultrasound (US) can provide high resolution three-dimensional (3D) structural image information, but only very limited information regarding the type of biological tissue. Other visualisation techniques such as electrical impedance (EI) mapping enable better biomedical characterisation of tissue type, but may not contain the same high resolution structural information as US. To satisfy the need to identify and visualise types of tissue with high specificity while maintaining high spatial resolution, a more powerful biomedical visualisation technique is required.
According to a first aspect of the present invention, there is provided a method of generating a three-dimensional biomedical data set, comprising: acquiring a first set of three-dimensional biomedical data utilising a first modality; acquiring a second set of three-dimensional biomedical data utilising a second modality; generating a histological data set in dependence on the second set of data; and combining the first data set with said histological data set, wherein said combined data set provides structural and histological information. By such a combination of data sets, the biomedical data set produced may be improved.
Preferably, the histological data set is generated utilising a look-up table. The look-up table may comprise reference histological data correlated to a reference data set acquired using the second modality.
Preferably, the histological data set comprises tissue type classification. The histological data set may comprise malignancy type classification. The malignancy type classification may define the probability of the tissue being malignant or benign.
Preferably, the first modality primarily provides structural information, and may utilise ultrasound.
Preferably, the second modality primarily provides histological information, and may use electrical impedance. The second modality may utilise a plurality of frequencies to acquire the second data set.
Preferably, the method further comprises correlating the first and second sets of data, such that the coordinates of each set of data are substantially aligned.
Preferably, the method further comprises displaying said combined data set, the combined data set comprising a plurality of voxels, wherein each said voxel is displayed in dependence on said first data set and said histological data set. The hue of each voxel may be dependent on the histological data set. At least one of the intensity, opacity, and gray level of each voxel may be dependent on the first data set.
Preferably, each structural feature of the combined data set is provided with a label in dependence on the histological data set. Preferably, the combined data set is displayed utilising an autostereoscopic display. The combined data set may be rendered for display utilising ray casting, and the autostereoscopic display may be electronic or printed.
According to a further aspect of the present invention, there is provided an apparatus for three-dimensional biomedical visualisation, comprising: means for generating a three-dimensional biomedical data set according to the above method; and an autostereoscopic display.
According to a yet further aspect of the present invention, there is provided an apparatus for generating a three-dimensional biomedical data set, comprising: an input for receiving a first set of three-dimensional biomedical data; a further input for receiving a second set of three-dimensional biomedical data; means (preferably in the form of a processor and associated memory) for generating a histological data set in dependence
on the second set of data; and means (preferably in the form of a processor and associated memory) for combining the first data set with said histological data set, wherein said combined data set provides structural and histological information. Preferably the apparatus further comprises means (preferably in the form of a biomedical data acquisition device) for acquiring the first set of three-dimensional biomedical data utilising a first modality.
The apparatus preferably further comprises means (preferably in the form of a biomedical data acquisition device) for acquiring the second set of three-dimensional biomedical data utilising a second modality.
Preferably, the histological data set is generated utilising a look-up table. The look-up table may comprise reference histological data correlated to a reference data set acquired using the second modality.
Preferably, the histological data set comprises tissue type classification. The histological data set may comprise malignancy type classification. The malignancy type classification may define the probability of the tissue being malignant or benign.
Preferably, the first modality primarily provides structural information, and may utilise ultrasound.
Preferably, the second modality primarily provides histological information, and may use electrical impedance. The second modality may utilise a plurality of frequencies to acquire the second data set.
Preferably, the apparatus further comprises means for correlating the first and second sets of data, such that the coordinates of each set of data are substantially aligned.
Preferably, the apparatus further comprises means for displaying said combined data set, the combined data set comprising a plurality of voxels, wherein each said voxel is displayed in dependence on said first data set and said histological data set. The hue of each voxel may be dependent on the histological data set. At least one of the intensity, opacity, and gray level of each voxel may be dependent on the first data set.
Preferably, each structural feature of the combined data set is provided with a label in dependence on the histological data set.
Preferably, the combined data set is displayed utilising an autostereoscopic display. The combined data set may be rendered for display utilising ray casting, and the autostereoscopic display may be electronic or printed.
According to a yet further aspect of the present invention, there is provided a method of obtaining three-dimensional biomedical information from amalgamating data from two different measurement modalities, whereby one modality contributes structural information, and the other histological classification information.
Preferably, the three-dimensional biomedical information is displayed by autostereoscopic means.
Preferably, the autostereoscopic display means is an electronic display with suitable autostereoscopic encoding.
Preferably, the three-dimensional biomedical information is displayed in real-time on an autostereoscopic electronic display. Preferably, the autostereoscopic display means is a printed image with suitable autostereoscopic encoding.
Preferably, the three-dimensional biomedical information is documented and/or archived in an autostereoscopic printed display.
Preferably, the structural information is represented by voxel opacity, and the histological classification information is represented by voxel hue.
Preferably, one measurement modality is ultrasound, and the other modality is electrical impedance.
Preferably, the electrical impedance is measured at a plurality of frequencies.
Preferably, histological classification information is obtained by comparison of electrical impedance measurement data to electrical impedance reference data for known histological classifications. Preferably, the biomedical visualisation is applied to mammography.
Preferably, the biomedical visualisation is applied to breast cancer detection.
Where combination of data sets is referred to, it preferably refers to the fusion of two data sets into a single new data set. This process is distinct from co-registration and co-display of two distinct data sets.
Where histological information is referred to, it equally could refer to: cytological information; pathological information; tissue type information; cell type information; or cell characteristic information.
As used herein, three-dimensional (biomedical) data preferably connotes a plurality of data, each associated with a respective position in a three-dimensional volume, and each datum relating to a (biomedical) property at the respective position it is associated with. Data from projections (such as x-ray images), for example, require reconstruction (due to superposition) before they are capable of providing three-dimensional (biomedical) data. The combination of two different datasets preferably comprises associating two different groups of data with positions in a common volume. Preferably, each position in the volume is associated with a value for the first data type and a value for the second data type. The combination of more than two datasets is analogous.
Further features of the invention are characterised by the dependent claims.
The invention extends to methods and/or apparatus substantially as herein described with reference to the accompanying drawings.
The invention also provides a computer program and a computer program product for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, and a computer readable medium having stored thereon a program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein. The invention also provides a signal embodying a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, a method of transmitting such a signal, and a computer product having an operating system which supports a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
Any apparatus feature as described herein may also be provided as a method feature, and vice versa. As used herein, means plus function features may be expressed alternatively in terms of their corresponding structure, such as a suitably programmed processor and associated memory.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa. Furthermore, any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.
It should also be appreciated that particular combinations of the various features described and defined in any aspects of the invention can be implemented and/or supplied and/or used independently.
Furthermore, features implemented in hardware may generally be implemented in software, and vice versa. Any reference to software and hardware features herein
should be construed accordingly.
These and other aspects of the present invention will become apparent from the following exemplary embodiments that are described with reference to the following figures in which:
Figure 1 illustrates the procedure of combining three-dimensional data sets from two different measurement modalities to obtain combined 3D tissue type/structure visualisation of biological tissue;
Figure 2 shows combination of three-dimensional data sets from two different measurement modalities, one of which provides structural information with high resolution, and the other of which provides tissue type information with high specificity;
Figure 3 shows the process of combining 3D US and EI data in more detail;
Figure 4 shows individual data sets that occur in different coordinate systems (top view);
Figure 5 shows individual data sets that occur in different coordinate systems (side view);
Figure 6 shows a comparison between simply colouring US data with raw EI data, and the process of combining US structural information with tissue type information derived from EI;
Figure 7 shows how a 3D display allows easier recognition of features;
Figure 8 shows the two differing processes of co-registration / co-display and combination by fusion;
Figure 9 shows a scale with different levels of abstraction (requiring interpretive skill) with examples of modalities; and
Figure 10 shows an apparatus for three-dimensional biomedical visualisation.
Detailed Description
To satisfy the need to identify and visualise types of tissue with high specificity while maintaining high spatial resolution, a powerful biomedical visualisation technique is required. The present system combines three-dimensional data sets from two different measurement modalities, one of which provides structural information with high resolution, and the other of which provides tissue type information with high specificity.
Figure 1 illustrates the procedure for the system. Biological tissue 100 is interrogated with two different measurement modalities 102, 112. A first measurement modality 102 provides 3D measurement data 104 that may receive further processing 106 before the measurement data is used for tissue type classification 108, resulting in 3D tissue type data 110. A second measurement modality 112 provides 3D measurement data 114 that may receive further processing 116 and results in 3D structure data 118. The 3D tissue type data 110 and the 3D structure data 118 are then combined 120. The combined 3D tissue type/structure data is finally visualised three-dimensionally 122.

Figure 2 illustrates how two measurement modalities are combined into a monolithic tissue type/structure visualisation. The system extracts the core strength of the first measurement modality 102 to give highly specific tissue type information 110, and the core strength of the second measurement modality to give high resolution spatial structural information 118. The system combines the two data sets to provide and visualise 3D tissue type/structure data 122.
In particular, the combination of different modalities of imaging to produce a 3D image is distinct from a more simple process of co-registering and overlaying separate data sets. Co-registration and overlaying effectively gives two separate displays, shown on the same coordinate frame. By contrast, the process of combination entails fusion of two data sets into a single combined representation, generating a new data set. The combination is an irreversible transformation (a many-to-one mapping). This is represented in Figure 8. In the case of co-registration and co-display, the results still require the same specialist interpretative skills of the clinician as they would for the individual modalities used in order to have diagnostic value. In the case of combination by fusion a new data set is created that requires less abstract interpretation by providing direct, higher level information about tissue type (cell characteristics). Figure 9 illustrates examples of modalities that require different levels of abstract interpretation. For higher levels of abstraction the clinician needs more training to interpret results. The combination by fusion creates a data set that is at a higher level of immediate clinical relevance. The fused representation still requires interpretation on the part of the clinician, albeit at a
less abstract level. Image pixel values could, for example, reflect probabilities of spatial areas having tissue features which map to a disease process, rather than attempting to offer a direct diagnosis of a particular pathology with absolute accuracy. Such accuracy is not realistic and hence interpretation and balance are an essential part of the diagnostic process that must be maintained in any new modality and visualisation method. The use of probability images is described in more detail below.
The tissue type information relates to cell type or cell characteristic information. Such information may be cytological information (concerning the function and structure of cells), histological information (concerning the microscopic structure of tissues), or pathological information (concerning the causes of diseases, disease processes, and examination for diagnostic or forensic purposes). In the following, the tissue type information is referred to as histological information; where histological information is referred to, it equally could refer to: cytological information; pathological information; tissue type information; cell type information; or cell characteristic information.
The system applies to biomedical processing and combined visualisation of 3D spatial structure data and 3D histological characterisation data in general. An exemplary combination of measurement modalities is ultrasound (US) and electrical impedance (EI). US provides high resolution spatial structural information, and EI can provide highly specific tissue type information. Processing and combined visualisation of dual-modality EI/US 3D data is described in further detail.
US is a well established technique that provides detailed information on acoustic reflectivity of substructures within a body. This provides in particular boundary information for substructures, allowing determination of the form of a substructure. 3D US permits distinguishing spatial positioning, shape, and relationship to other substructures within a body. In medical imaging this allows quick, non-invasive visualisation of subcutaneous structures including internal organs, structures, and tissues within the human body (e.g. tendons, muscles, joints, vessels) for possible pathology or lesions. Resolution of less than a millimetre can be achieved.
EI is a non-invasive electrical technique that allows characterisation of the electrical properties of tissues. The electrical properties of the tissue may be brought into correlation with the physiological properties of the tissue. EI measurements may be performed at one, two, or a multitude of frequencies. The measurement may include a scan frequency sweep. The use of multiple EI measurement frequencies may assist identification of physiological properties of the tissue. By comparing a set of EI measurement data with a database and/or model of EI measurements corresponding to samples of known tissue types, the EI measurement data can be used to produce high-level tissue characterisation information. EI is capable of producing 3D data characterising substructures within a body. Although EI provides sensitive detection of tissue type, it typically has lower spatial resolution than, for instance, US. The system overcomes this problem by combining the higher resolution structural data from 3D US with the lower resolution tissue type data that can be extracted from 3D EI.

Figure 3 illustrates the process of combining 3D US and EI data in more detail. First the raw US data 300 is subjected to data processing 116. The data processing applies operations well known from 2D US, such as speckle reduction 302 (for instance by 3D truncated median filter) and 3D feature emphasis 304, in particular edge detection (for instance with a 3D Sobel operator), to emphasise structural features. The processed US data is composed of intensity/opacity (grey level) data representing structural information.
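The US pre-processing chain described above can be sketched with SciPy's ndimage filters (a sketch under stated assumptions: a plain median filter stands in for the truncated median variant, and the filter size and edge-blend weight are illustrative):

```python
import numpy as np
from scipy import ndimage


def preprocess_us_volume(us, median_size=3, edge_weight=0.5):
    """Speckle reduction followed by edge emphasis on a 3D US volume."""
    # Speckle reduction: 3D median filter (approximates the truncated median).
    smoothed = ndimage.median_filter(np.asarray(us, dtype=float),
                                     size=median_size)
    # 3D Sobel edge detection: gradient magnitude over the three axes.
    grads = [ndimage.sobel(smoothed, axis=axis) for axis in range(3)]
    edges = np.sqrt(sum(g * g for g in grads))
    if edges.max() > 0:
        edges /= edges.max()
    # Emphasise structural boundaries in the intensity/opacity volume.
    return np.clip(smoothed + edge_weight * edges, 0.0, 1.0)
```

The output remains a single grey-level (intensity/opacity) volume, which is what the later combination step consumes as structural data.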
Raw EI data 306 is used to obtain tissue type information 110. The tissue type classification step 108 uses a lookup database and/or model 310 or a lookup table that correlates raw EI data (at one, two, or a multitude of measurement frequencies) with tissue type information, and provides a 3D tissue-type data set 110. In the simplest case the EI measurement data is compared to EI measurements corresponding to samples of known tissue types, and for matching EI data the tissue types are assumed to be the same. The tissue type information may comprise classification of tissue types, or distinction of malignant and benign tissue. The tissue classification can include further quantitative information, for instance a probability or confidence measure that is associated with a particular classification. For example, a probability measure may accompany a malignant/benign classification. A variety of statistical measures are applicable, for instance confidence level or chi-square value.
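The lookup classification described above can be sketched as a nearest-reference match over multi-frequency impedance values (a sketch only: the reference spectra and the distance-based confidence measure are illustrative assumptions, not the patent's specific reference data or statistics):

```python
import math

# Illustrative reference database: tissue class -> EI values at three frequencies.
EI_REFERENCE = {
    "benign":    (120.0, 95.0, 80.0),
    "malignant": (60.0, 40.0, 30.0),
}


def classify_tissue(ei_measurement):
    """Return (tissue_type, confidence) for one voxel's multi-frequency EI data.

    The voxel is assigned the tissue type of the closest reference spectrum;
    confidence decreases with distance from that reference.
    """
    def dist(ref):
        return math.dist(ei_measurement, ref)

    best = min(EI_REFERENCE, key=lambda t: dist(EI_REFERENCE[t]))
    # Simple confidence: distance to the best match relative to all references.
    total = sum(dist(ref) for ref in EI_REFERENCE.values())
    confidence = 1.0 - dist(EI_REFERENCE[best]) / total if total else 1.0
    return best, confidence
```

Applying this per voxel yields exactly the 3D tissue-type data set with an accompanying probability measure that the text describes.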
Standard transformation operations, as are well known in the art, are used to express the geometric relationship between the US and the EI volumes, and to correlate the coordinate systems of the individual data sets. As illustrated in Figure 4 (top view) and Figure 5 (side view), the individual data sets 400, 402 may also occur in different coordinate systems, such as cylindrical 402, 502 and Cartesian 400, 500. The structural information from the processed 3D US data set is associated with tissue type information from the processed 3D EI data in a 3D data combination step 120. Starting with a voxel of 3D US data (intensity/opacity or gray level data), that voxel then has a colour applied that represents tissue type. The colour is determined from the volume coverage of the US voxel, and the value of the tissue-type voxel in its immediate neighbourhood. Thus structural features are presented and labelled with the associated tissue type, rendering the measurement results easy to interpret. The 3D tissue type/structure data is displayed for further investigation and measurement.
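For the case where the two volumes are already co-registered on axis-aligned grids, the combination step can be sketched as resampling the lower-resolution tissue-type volume at each higher-resolution US voxel position, here by nearest-neighbour lookup (the integer scale factor between the two grids is an illustrative assumption; the patent also allows non-congruent and non-Cartesian grids related by transformation operations):

```python
import numpy as np


def combine_volumes(us, tissue, scale):
    """Fuse a high-res US volume with a low-res tissue-type volume.

    us     : float array, US intensity/opacity (structural data)
    tissue : int array, tissue-type labels on a coarser grid
    scale  : US voxels per tissue-type voxel along each axis
    Returns an array holding a (tissue_type, opacity) pair per US voxel.
    """
    combined = np.zeros(us.shape + (2,))
    for idx in np.ndindex(*us.shape):
        # Nearest tissue-type voxel covering this US voxel.
        tt_idx = tuple(min(i // scale, s - 1)
                       for i, s in zip(idx, tissue.shape))
        combined[idx] = (tissue[tt_idx], us[idx])
    return combined
```

The result is a single fused data set: every position carries one value from each source modality, matching the definition of combination given earlier in the document.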
Labelling of the structural features may be achieved with a colour scheme as described above, where one colour represents a specific type of tissue, a second colour represents a second type of tissue, and so forth. Alternatively, the colour can represent a probability measure, such as confidence level, that is associated to the tissue type data. For example if tissue is subject to either benign or malignant classification, then one colour represents a high probability of malignant tissue, a second colour represents moderate probability of malignant tissue, and so forth. Other means of indicating the tissue type of a structural feature are possible, for instance: text labels placed proximal to the structural feature; text labels that appear when the structural feature is activated, for instance by mouse hover-over, or annotation of the structural feature with a symbol and a legend of symbols.
In the case of labelling structural features by colour, it should be appreciated that this is not to be confused with simply colouring (grayscale) US data with raw EI data. This is illustrated in Figure 6. If 3D US data 118 is coloured with raw 3D EI data 102, the entire scan volume is flooded with colour that is likely to obscure important features. The presented data 600 is confusing, and the vast amount of data being visualised simultaneously may potentially occlude important diagnostic information. Furthermore, the user is required to interpret the tissue impedance value as to whether tissue is healthy or pathological, resulting in subjective, difficult analysis and unreliable results. Finally, the accommodation of EI measurement at more than one frequency is problematic, as only one spectral value is assigned to each voxel.
To provide maximum benefit, the system preferably includes 3D display of the three-dimensional data. An example of a 3D display is an auto-stereoscopic display. Unlike, for instance, tomographic display, true 3D display shows a volume continuum. With an auto-stereoscopic display, parallax and other natural visual cues provide highly intuitive and efficient visualisation, as illustrated in Figure 7. If the data volume 700 (top view) contains one object 702 in front of another object 704, then the front view 706 shows the two objects 702, 704 almost coinciding. If however a true 3D display such as an autostereoscopic display is used, then a plurality of views 706, 708, 710 can be seen (in dependence on the viewing position), and the viewer intuitively recognises that one object 702 is in front of the other object 704. These features of 3D displays facilitate spatial location of clinically significant regions. Further, significant regions can be related dimensionally, and linear and volume measurements can be made. 3D electronic autostereoscopic displays provide dynamic visualisation, and can be used for real-time imaging and for user-responsive display. 3D auto-stereoscopic displays that are printed to appropriate hardcopy 3D decoder material provide static visualisation, and can be used for portable images that can be easily communicated for diagnosis and measurement. A suitable method of rendering 3D data for autostereoscopic display is disclosed, and is hereby incorporated by reference, in a GB patent application co-filed today titled "Rendering Images for Autostereoscopic Display", and having agent reference P36726GB. Other 3D displays such as holographic displays or volumetric displays are equally suitable.
Under some circumstances the system operates with data elements that are not arranged in slices (as in tomographic 3D data), but in more complex voxels. The volume continuum nature of this data ('true' 3D) prevents problems that occur, for instance, when the tomographic planes of the different data sets do not coincide. The two data sets that are combined might comprise non-congruent volumes that can be related to one another using reference coordinates and transformation operations. The advantages of using true three-dimensional (spatial) information from multiple modalities (dual modality for the combination of US and EI) are:
(a) Sub-summation of data with no loss in integrity is possible;
(b) identification and display of spatially discriminatory information is possible; and
(c) axes/coordinate freedom during measurement can be achieved. Collecting the data sets in three dimensions maintains the integrity of volume spatial information, and this in turn allows the axes to be non-coincident during capture. The tissue type/structure image data can be interrogated throughout the volume, as the spatial integrity is maintained. Measurement can be made in the native coordinate frames of the modalities' data sets. The combined tissue type/structure data set can be addressed at any location in space, producing highly specific spatial information for clinical and surgical use. For the most natural and effective clinical visualisation environment, the resulting combined data set may be rendered for interactive autostereoscopic 3D display with natural parallax (look-around). Suitable controls are used to aid full interactive exploration of the data set. Metrological features can be overlaid in real time and used, together with suitable controls, to effect measurement of volumes and distances in full 3D space with natural 3D visual feedback.
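The reference-coordinate registration of non-congruent volumes and the in-frame distance/volume measurements described above can be sketched as follows. This is a minimal illustration only: the rotation angle, translation, and voxel spacing are invented values, and a real system would derive the transform from shared reference coordinates rather than hard-code it.

```python
import numpy as np

# --- Relating non-congruent volumes via a reference-coordinate transform ---

def make_affine(rotation_deg, translation_mm):
    """Build a 4x4 homogeneous transform: rotation about z, then translation."""
    t = np.radians(rotation_deg)
    c, s = np.cos(t), np.sin(t)
    m = np.eye(4)
    m[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    m[:3, 3] = translation_mm
    return m

# Hypothetical transform taking EI-volume coordinates into the US frame.
EI_TO_US = make_affine(10.0, [5.0, -2.0, 0.0])

def to_us_frame(point_ei):
    """Map a 3D point (mm) from the EI coordinate frame into the US frame."""
    p = np.append(np.asarray(point_ei, float), 1.0)  # homogeneous coordinates
    return (EI_TO_US @ p)[:3]

# --- Measurement in the native voxel grid of the combined data set ---
SPACING = np.array([0.5, 0.5, 1.0])  # assumed voxel spacing in mm (x, y, z)

def distance_mm(idx_a, idx_b, spacing=SPACING):
    """Euclidean distance between two voxel indices, in millimetres."""
    return float(np.linalg.norm((np.asarray(idx_a) - np.asarray(idx_b)) * spacing))

def region_volume_mm3(mask, spacing=SPACING):
    """Volume of a region given as a boolean voxel mask, in cubic millimetres."""
    return float(np.count_nonzero(mask) * np.prod(spacing))
```

Because the fused data set preserves spatial integrity, these measurements can be made anywhere in the volume without first resampling onto a common slice plane.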
The combined data set may also be stored in a patient database system (a PACS system, for example), ideally together with the source individual modality volumes and visualisation parameters. For conventional storage and filing a high resolution 3D print with natural parallax may be produced. A high resolution 3D print is further useful for ease of communication, static viewing (e.g. using a light box), and in situations where an electronic display is not available. The system can be applied to breast cancer detection by electrical impedance mammography (EIM). The system can equally be applied to the detection of other histological pathologies such as other carcinomas, ulcers, or thrombosis.
In a breast cancer detection system, the tissue type/structure data allows determination of whether breast tissue is malignant without the need for a biopsy. The spatial locations of clinically significant areas can be related dimensionally to easily identifiable structures. Breast cancer diagnosis and measurement is improved by the pre-clinical, pre-surgical visualisation environment the system provides.
Visualising the combined higher resolution structural data from 3D US with the lower resolution tissue type data that can be extracted from 3D EI may be implemented with a volume ray-casting procedure. In volume ray-casting, the following steps are performed for each pixel:
1. Calculate the ray direction from a given viewpoint through the current pixel;
2. Move back along the ray vector by one data volume side length to ensure the start point is always at a point in front of the data volume;
3. Cast the ray through the volume for a number of volume side lengths (to ensure it terminates behind the data volume if rotated), in a stepped manner, accumulating the pixel colour value;
4. Adjust image and viewing distance parameters for display and application.
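The per-pixel loop of steps 1 to 3 can be sketched as follows. The step size and the number of side lengths marched are not specified in the text, so the values below are illustrative assumptions, and `sample` is a hypothetical stand-in for the per-step classification and accumulation the text goes on to describe.

```python
import numpy as np

SIDE = 1.0      # data volume side length (normalised units)
STEP = 0.01     # assumed sampling step along the ray
N_SIDES = 3     # march length in side lengths; illustrative value only

def cast_ray(viewpoint, pixel_world, sample):
    """Sketch of steps 1-3 of the ray-casting procedure.

    `sample(position, colour)` is called at each stepping position and
    returns the updated accumulated RGBA colour.
    """
    d = np.asarray(pixel_world, float) - np.asarray(viewpoint, float)
    d /= np.linalg.norm(d)                           # step 1: ray direction
    p = np.asarray(pixel_world, float) - d * SIDE    # step 2: move back one side length
    colour = np.zeros(4)                             # RGBA accumulator
    for _ in range(int(N_SIDES * SIDE / STEP)):      # step 3: stepped march
        colour = sample(p, colour)
        p = p + d * STEP
    return colour
```

Moving the start point back by one side length (step 2) guarantees the march begins in front of the volume regardless of viewpoint, at the cost of some empty steps.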
The procedure for determining the pixel colour value is described in more detail as follows. Four colour components can be determined: r, g, b and a (red, green, blue and opacity). As the ray is cast through the data volume, for each stepping position the corresponding data from the US structural information data set is looked up, as is the tissue type information data:
USIntensityValue = US_data_texture_lookup(RayLookupPosition)
TissueTypeValue = TT_data_texture_lookup(RayLookupPosition)
With these values for the current position, a combined classification operation is applied to determine the contribution to the current pixel. The classified contribution from the current stepping position to the current pixel colour has both colour and opacity (alpha) components. The colour component is found by lookup in a colour table for the appropriate tissue type value. The intensity value from the US (structural) data volume (having possibly been soft-thresholded) is assigned to the alpha (opacity) component:
ClassifiedColour.rgb = TT_colour_map_lookup(TissueTypeValue)
ClassifiedColour.a = optional_soft_threshold( USIntensityValue)
Soft thresholding uses a transfer function with a linear, rather than step, transition. It is parameterised by the data values at the lower and upper limits of the linear transition region. An exemplary implementation is the GLSL 'smoothstep()' function.
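The soft threshold can be mimicked in plain Python as below. Note that GLSL's smoothstep() actually uses a clamped cubic Hermite transition rather than a strictly linear ramp, but it serves the same role: a smooth transfer function parameterised by the two edge values.

```python
def smoothstep(edge0, edge1, x):
    """Soft threshold between edge0 and edge1, matching GLSL smoothstep():
    clamp the normalised input to [0, 1], then apply the Hermite curve
    3t^2 - 2t^3 for a smooth (rather than step) transition."""
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)
```

Applied to the US intensity, values below `edge0` map to fully transparent, values above `edge1` to fully opaque, with a smooth ramp between.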
The contribution of the current stepping position to the current pixel is then evaluated. The emissive classified colour r, g, b components determined above are premultiplied by the associated opacity value, and the result is used to contribute to the final accumulated render pixel colour:
FinalColour.rgb = FinalColour.rgb + (1 - FinalColour.a) · ClassifiedColour.rgb · ClassifiedColour.a
FinalColour.a = FinalColour.a + (1 - FinalColour.a) · ClassifiedColour.a
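One stepping position of the classification and front-to-back compositing above might look like this as a sketch. The tissue-type colour table entries are invented purely for illustration; a real system would populate the table from the histological classification scheme in use.

```python
import numpy as np

# Hypothetical colour table: tissue type code -> display colour (r, g, b).
TT_COLOUR_MAP = {0: (0.2, 0.2, 0.8), 1: (0.2, 0.8, 0.2), 2: (0.9, 0.1, 0.1)}

def composite_step(final, tissue_type, us_intensity):
    """One stepping position: look up the classified colour, take the
    (optionally soft-thresholded) US intensity as opacity, then apply the
    front-to-back premultiplied 'over' accumulation from the equations above."""
    rgb = np.array(TT_COLOUR_MAP[tissue_type])   # ClassifiedColour.rgb
    a = us_intensity                             # ClassifiedColour.a
    out = np.array(final, float)
    out[:3] += (1.0 - final[3]) * rgb * a        # premultiplied colour
    out[3] += (1.0 - final[3]) * a               # accumulated opacity
    return out
```

Because the accumulation is front-to-back, once `FinalColour.a` reaches 1 the factor `(1 - FinalColour.a)` is zero and later steps contribute nothing, so a renderer may terminate the ray early at that point.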
The structural data (from US, for example) is not required to evaluate or reconstruct the tissue type information (from EI, for example), unlike in some modalities. On the contrary, the structural data and the tissue type information are two completely independent datasets that can be used standalone, overlaid as two independent datasets, or, as described here, fused (combined) into a single dataset with higher immediate clinical relevance than the overlaid datasets or either dataset alone. Because the two datasets are independent (and not convolved in the reconstruction of raw data), the fused dataset has higher information richness and greater confidence in the data is possible. The fusion of the datasets creates a new, distinct dataset.
The fusion of structural data with tissue type information can eliminate the need for segmentation. In the case of segmentation, a finite set of points with tissue type information is used to identify and categorise compartments within the structural data. In the case of fusion, there is no need to identify and categorise compartments, as these are evident when displaying and viewing the fused dataset.
Figure 10 shows an apparatus 1012 for three-dimensional biomedical visualisation. The apparatus 1012 includes an input 1004 for receiving a first set of three-dimensional biomedical data; a further input 1006 for receiving a second set of three-dimensional biomedical data; means 1002 for generating a histological data set in dependence on the second set of data; and means 1000 for combining the first data set with said histological data set. The combined data set provides structural and histological information. A biomedical acquisition device 1008 (or other means for acquiring the first set of three- dimensional biomedical data) utilises a first modality. A further biomedical acquisition device 1010 (or other means for acquiring the second set of three-dimensional biomedical data) utilises a second modality.
While the invention has been described in reference to its preferred embodiments, it is to be understood that the words which have been used are words of description rather than limitation and that changes may be made to the invention without departing from its scope as defined by the appended claims.
Each feature disclosed in the description, and (where appropriate) the claims and drawings may be provided independently or in any appropriate combination.
Claims
1. A method of generating a three-dimensional biomedical data set, comprising:
acquiring a first set of three-dimensional biomedical data utilising a first modality;
acquiring a second set of three-dimensional biomedical data utilising a second modality;
generating a histological data set in dependence on the second set of data; and
combining the first data set with said histological data set, wherein said combined data set provides structural and histological information.
2. A method according to Claim 1, wherein the histological data set is generated utilising a look-up table.
3. A method according to Claim 2, wherein said look-up table comprises reference histological data correlated to a reference data set acquired using the second modality.
4. A method according to Claim 1 or 2, wherein said histological data set comprises tissue type classification.
5. A method according to Claim 1, 2 or 3, wherein said histological data set comprises malignancy type classification.
6. A method according to any of the preceding claims, wherein the first modality primarily provides structural information.
7. A method according to any of the preceding claims, wherein the first modality utilises ultrasound.
8. A method according to any of the preceding claims, wherein the second modality primarily provides histological information.
9. A method according to any of the preceding claims, wherein the second modality utilises electrical impedance.

10. A method according to Claim 9, wherein the second modality utilises a plurality of frequencies.

11. A method according to any of the preceding claims, further comprising correlating the first and second sets of data, such that the coordinates of each set of data are substantially aligned.

12. A method according to any of the preceding claims, further comprising displaying said combined data set, the combined data set comprising a plurality of voxels, wherein each said voxel is displayed in dependence on said first data set and said histological data set.

13. A method according to Claim 12, wherein the hue of each voxel is dependent on the histological data set.

14. A method according to Claim 12 or 13, wherein at least one of: the intensity; opacity; and gray level, of each voxel is dependent on the first data set.

15. A method according to Claim 12, 13 or 14, wherein each structural feature of said combined data set is provided with a label in dependence on the histological data set.

16. A method according to any of Claims 12 to 15, wherein said combined data set is displayed utilising an autostereoscopic display.

17. A method according to Claim 16, wherein said combined data set is rendered for display utilising ray casting.
18. A method according to Claim 16 or 17, wherein said autostereoscopic display is electronic or printed.
19. Apparatus for three-dimensional biomedical visualisation, comprising:
means for generating a three-dimensional biomedical data set according to the method of any of Claims 1 to 18; and
an autostereoscopic display.

20. Apparatus for generating a three-dimensional biomedical data set, comprising:
an input for receiving a first set of three-dimensional biomedical data;
a further input for receiving a second set of three-dimensional biomedical data;
means for generating a histological data set in dependence on the second set of data; and
means for combining the first data set with said histological data set, wherein said combined data set provides structural and histological information.

21. Apparatus according to Claim 20, further comprising means for acquiring the first set of three-dimensional biomedical data utilising a first modality.

22. Apparatus according to Claim 20 or 21, further comprising means for acquiring the second set of three-dimensional biomedical data utilising a second modality.

23. Apparatus according to any of Claims 20 to 22, wherein said histological data set comprises tissue type classification.

24. Apparatus according to any of Claims 20 to 23, wherein the first modality utilises ultrasound.

25. Apparatus according to any of Claims 20 to 24, wherein the second modality utilises electrical impedance.

26. Apparatus according to any of Claims 20 to 25, further comprising means for displaying said combined data set, the combined data set comprising a plurality of voxels, wherein each said voxel is displayed in dependence on said first data set and said histological data set.

27. Apparatus according to Claim 26, wherein the hue of each voxel is dependent on the histological data set.

28. Apparatus according to Claim 26 or 27, wherein at least one of: the intensity; opacity; and gray level, of each voxel is dependent on the first data set.

29. A method of generating a three-dimensional biomedical data set substantially as herein described with reference to the accompanying figures.

30. Apparatus for generating a three-dimensional biomedical data set substantially as herein described with reference to the accompanying figures.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB201106110A GB201106110D0 (en) | 2011-04-11 | 2011-04-11 | Biomedical visualisation |
GB1106110.8 | 2011-04-11 | ||
GB1203278.5 | 2012-02-24 | ||
GB201203278A GB201203278D0 (en) | 2011-04-11 | 2012-02-24 | Biomedical visualisation |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012140396A1 true WO2012140396A1 (en) | 2012-10-18 |
Family
ID=44122923
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2012/000333 WO2012140396A1 (en) | 2011-04-11 | 2012-04-11 | Biomedical visualisation |
Country Status (2)
Country | Link |
---|---|
GB (2) | GB201106110D0 (en) |
WO (1) | WO2012140396A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016092390A1 (en) * | 2014-12-08 | 2016-06-16 | Koninklijke Philips N.V. | Interactive physiologic data and intravascular imaging data and associated devices, systems, and methods |
US20230181165A1 (en) * | 2021-12-15 | 2023-06-15 | GE Precision Healthcare LLC | System and methods for image fusion |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090238334A1 (en) * | 2008-03-19 | 2009-09-24 | C-Rad Innovation Ab | Phase-contrast x-ray imaging |
US20100268063A1 (en) * | 2009-04-15 | 2010-10-21 | Sebastian Schmidt | Method and device for imaging a volume section by way of pet data |
US20110034806A1 (en) * | 2008-01-09 | 2011-02-10 | The Trustees Of Dartmouth College | Systems And Methods For Combined Ultrasound And Electrical Impedance Imaging |
2011
- 2011-04-11 GB GB201106110A patent/GB201106110D0/en not_active Ceased
2012
- 2012-02-24 GB GB201203278A patent/GB201203278D0/en not_active Ceased
- 2012-04-11 WO PCT/GB2012/000333 patent/WO2012140396A1/en active Application Filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016092390A1 (en) * | 2014-12-08 | 2016-06-16 | Koninklijke Philips N.V. | Interactive physiologic data and intravascular imaging data and associated devices, systems, and methods |
US20230181165A1 (en) * | 2021-12-15 | 2023-06-15 | GE Precision Healthcare LLC | System and methods for image fusion |
US12089997B2 (en) * | 2021-12-15 | 2024-09-17 | GE Precision Healthcare LLC | System and methods for image fusion |
Also Published As
Publication number | Publication date |
---|---|
GB201106110D0 (en) | 2011-05-25 |
GB201203278D0 (en) | 2012-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11983799B2 (en) | System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement | |
US10628930B1 (en) | Systems and methods for generating fused medical images from multi-parametric, magnetic resonance image data | |
US9098935B2 (en) | Image displaying apparatus, image displaying method, and computer readable medium for displaying an image of a mammary gland structure without overlaps thereof | |
EP3267894B1 (en) | Retrieval of corresponding structures in pairs of medical images | |
US7935055B2 (en) | System and method of measuring disease severity of a patient before, during and after treatment | |
US9087400B2 (en) | Reconstructing an object of interest | |
JP2021041268A (en) | System and method for navigating x-ray guided breast biopsy | |
US20140348404A1 (en) | Semantic navigation and lesion mapping from digital breast tomosynthesis | |
US20110125016A1 (en) | Fetal rendering in medical diagnostic ultrasound | |
US9826958B2 (en) | Automated detection of suspected abnormalities in ultrasound breast images | |
WO2013028762A1 (en) | Method and system for integrated radiological and pathological information for diagnosis, therapy selection, and monitoring | |
US9089307B2 (en) | Three-dimensional analysis of lesions represented by image data | |
US20170365051A1 (en) | Medical image data processing system and method | |
EP1923840B1 (en) | Picking on fused 3D volume rendered images and updating corresponding views according to a picking action | |
CN102548480B (en) | Device and method for displaying medical image and program | |
US9449425B2 (en) | Apparatus and method for generating medical image | |
US20120078101A1 (en) | Ultrasound system for displaying slice of object and method thereof | |
JP2000350722A (en) | Arrangement of notable elements of organs and three- dimensional expression method thereof | |
WO2012140396A1 (en) | Biomedical visualisation | |
McDonald | 3-Dimensional breast ultrasonography: What have we been missing? | |
Papavasileiou et al. | Towards a CAD System for Breast Cancer Based on Individual Microcalcifications? | |
Grace Anabela | Three-dimensional (3D) reconstruction of ultrasound foetal images using visualisation toolkit (VTK)/Grace Anabela Henry Dusim | |
Dusim | Three-Dimensional (3D) Reconstruction of Ultrasound Foetal Images Using Visualisation Toolkit (VTK) | |
Pant et al. | Enhancing Liver Biopsy Simulations: 3D Model Development, Endoscopic Imaging, and Needle Tracking | |
CN118576241A (en) | Liver ultrasonic information display method and ultrasonic imaging system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12722175 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18/12/2013) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12722175 Country of ref document: EP Kind code of ref document: A1 |