WO2016055902A1 - Three dimensional ultrasound imaging by intersecting 2D scanning
- Publication number
- WO2016055902A1 (international application PCT/IB2015/057435)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- image
- sequence
- ultrasound
- probe
- Prior art date
- 2014-10-09
Classifications
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves:
  - A61B8/4254—Determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors mounted on the probe
  - A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
- G01S15/89—Sonar systems specially adapted for mapping or imaging (short-range pulse-echo imaging):
  - G01S15/8918—Using a static transducer configuration with a linear transducer array
  - G01S15/892—Using a static transducer configuration with a curvilinear transducer array
  - G01S15/8925—Using a static transducer configuration with a two-dimensional transducer array (matrix or orthogonal linear arrays)
  - G01S15/8936—Using a dynamic transducer configuration with transducers mounted for mechanical movement in three dimensions
  - G01S15/894—Using a dynamic transducer configuration with transducers mounted for mechanical movement in two dimensions by rotation about a single axis
  - G01S15/8945—Using a dynamic transducer configuration with transducers mounted for linear mechanical movement
  - G01S15/8993—Three dimensional imaging systems
- G01S7/52—Details of sonar systems particularly adapted to short-range imaging:
  - G01S7/52065—Compound scan display, e.g. panoramic imaging
  - G01S7/52085—Details related to the ultrasound signal acquisition, e.g. scan sequences
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Acoustics & Sound (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
An ultrasound system (10) has a probe (11) with a one dimensional array transducer (12) which is rocked or slid across the body in two orthogonal directions to acquire two sequences of elevationally offset planar images swept through the same region of interest. An image of one sequence is processed with one or more images of the other sequence to find a common line of intersection with each of those images. After all of the intersecting images have been processed to find their lines of intersection with other, substantially orthogonal images, an egg crate-like grid of images has been identified. The elevational spaces between the images can be filled in by scan conversion and the image data processed to form a three dimensional image of the region of interest.
Description
THREE DIMENSIONAL ULTRASOUND IMAGING
BY INTERSECTING 2D SCANNING
This application claims priority to International Application No. PCT/CN2014/088188, filed on October 9, 2014, which is incorporated herein by reference in its entirety.
This invention relates to medical ultrasound imaging and, in particular, to three dimensional ultrasonic imaging by intersecting scanning of a region of interest.
Ultrasonic imaging can be used to produce two dimensional or three dimensional images of a region of interest. Premium ultrasound systems conventionally are capable of operating with probes having two dimensional transducer arrays which can be operated in a phased mode to steer transmit and receive beams over a three dimensional region of the body. Such solid state probes can acquire beams steered over a three dimensional region which are used to form three dimensional images in real time. Two dimensional array probes can also be used to produce static images of extended volumes, referred to as panoramic images, as described in U.S. patent 8,539,838 (Yoo et al.). The probe shown in this patent can be moved along the surface of the body with a row of transducers extending normal to the direction of travel, continuously
acquiring 2D images in planes (the "T" direction) which are approximately parallel to each other. At the same time images are continuously acquired in overlapping planes in the direction of travel (the "S" direction). By comparing the image content of these overlapping S planes, the relative positions of the parallel T planes can be deduced. This information is used to produce a static 3D volume image from the content of the
approximately parallel planes.
Lower priced, portable systems are now becoming available in which the processor for signal and image processing is simply the processor of a laptop or tablet computer, with the images displayed on the screen of the portable device. Such highly portable systems are inexpensive by reason of the lack of any further sophisticated processors and generally are operable with only one dimensional array transducer probes which are only capable of 2D imaging. However, such a one dimensional array transducer probe can also be used to produce static 3D images as described in U.S. patent 5,474,073 (Schwartz et al.). The 1D array probe shown in this patent is moved along the surface of the body by manual hand scanning, repetitively acquiring a sequence of power Doppler images in approximately parallel planes. The use of the power Doppler mode avoids the acquisition of signals from tissue surrounding the flow of blood in blood vessels which would clutter and obscure the blood flow. This technique produces 3D images of the blood flow in the region scanned by the approximately parallel planes. However, these blood flow images can be distorted if the motion of the probe is not perfectly linear and at a constant speed. Movement of the probe at a changing speed and nonlinear rocking of the probe will cause the acquired images to be in planes which are not exactly parallel to each other and are variably spaced from each other, which causes distortion of the resultant 3D images. It is desirable to produce 3D images of both tissue and blood flow by manual hand scanning which are not distorted by such variations in movement.
In accordance with the principles of the present invention, an ultrasonic imaging system and method are described in which manual freehand 2D image scanning is performed using a 1D array probe, a mechanically swept single piston transducer probe, or a row of elements of a 2D array probe. With the 2D array, whether one row of elements or more than one row of elements is used is not significant as long as a sequence of 2D images is acquired where each image plane in each sequence is formed in a fixed location and orientation relative to the transducer. The inventive technique is designed specifically to operate without the need for a priori knowledge of the exact orientation or spacing of the 2D images in any of the sequences, although the technique can also be performed with a priori knowledge.
In some embodiments, a region of interest can be scanned by at least two intersecting sweeps of images. In certain embodiments, the present invention can be used generally if sweeps (2 or more) intersect in such a manner that the lines of intersection of the sweeps, taken together, are not all parallel. In one
embodiment, a region of interest can be scanned at least twice in approximately orthogonal directions to acquire two sets of approximately parallel planes, or in three or more different directions to acquire three or more sets of image planes, which spatially intersect each other. The line or lines of intersection of images from the two sets of image planes are then found and the intersecting images fitted together in
accordance with their lines of intersection. This produces an egg crate-like grid of intersecting images
covering the scanned 3D region. The fitting of the lines of intersection effectively eliminates much of the distortion due to nonlinear scanning motion of the probe. Scan conversion of this 3D spatial data set is then performed to fill in the interstices between planes with image data, and a 3D volume image is thus produced.
In the drawings:
FIGURE 1 provides an example ultrasound system in accordance with an embodiment of the present invention.
FIGURE 2 illustrates in perspective two
approximately orthogonally oriented ultrasound images and their line of intersection.
FIGURE 3 illustrates the two intersecting
ultrasound images of FIGURE 2 and the assumed positions of other images which are approximately parallel to one of the images .
FIGURE 4 illustrates a grid showing how an assumed line of intersection of two images can fail to be an actual line of intersection of the two images by reason of nonlinear motion of the probe during scanning.
FIGURE 5 illustrates the translation needed for one line of image data of one image to bring it into registration with the same image data in an
approximately orthogonal image.
FIGURE 6 illustrates the actual alignment of one set of approximately parallel images with an image which is approximately orthogonal to the approximately parallel images.
FIGURE 7 illustrates a response surface of function values for unknown variables in an iterative 3D image reconstruction technique of the present invention.
In some aspects, the present invention includes an ultrasound system which produces a three dimensional image of a region of interest. The ultrasound system can include an ultrasound probe having, e.g., a one dimensional array of transducer elements. A first frame sequence buffer, coupled to the ultrasound probe, can be configured to store a first sequence of ultrasound images, such as spatially offset two dimensional images as the probe is moved in a first direction. A second frame sequence buffer, coupled to the ultrasound probe, can be configured to store a second sequence of
ultrasound images, such as spatially offset two
dimensional images as the probe is moved in a second direction. An image line registration processor, coupled to receive images from the first and second frame sequence buffers, can be configured to identify a common line of intersection of an image of one buffer with an image of the other buffer. A 3D volume buffer, coupled to the image line registration processor, can be configured to store the image data of images found to have one or more lines of intersection by the image line registration processor. A display, coupled to the 3D volume buffer, can be configured to produce a three dimensional image based on the intersecting sweeps of the ultrasound probe.
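For illustration only, the storage components described above could be organized roughly as in the following Python sketch: two frame sequence buffers for the intersecting sweeps and a 3D volume buffer for the registered result. The class and field names are assumptions introduced here, not terms from the patent.

```python
# Hypothetical sketch of the storage components: two frame sequence buffers for the
# intersecting sweeps (cf. buffers 32 and 34) and a 3D volume buffer for the registered
# result (cf. buffer 38). Names and fields are illustrative, not from the patent.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple
import numpy as np


@dataclass
class FrameSequenceBuffer:
    """Stores one freehand sweep of spatially offset 2D frames."""
    frames: List[np.ndarray] = field(default_factory=list)

    def store(self, frame: np.ndarray) -> None:
        self.frames.append(frame)


@dataclass
class VolumeBuffer:
    """Holds the registered frames and the lines of intersection found between them."""
    registered_frames: List[np.ndarray] = field(default_factory=list)
    # (frame index in sweep 1, frame index in sweep 2) -> location of their common line
    intersections: Dict[Tuple[int, int], Tuple[int, float]] = field(default_factory=dict)


first_sweep, second_sweep = FrameSequenceBuffer(), FrameSequenceBuffer()
volume = VolumeBuffer()
```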
Referring first to FIGURE 1, an ultrasonic
diagnostic imaging system 10 of the present invention is illustrated in block diagram form. An ultrasound probe 11 contains a one dimensional transducer array 12. In this example the array is a curved linear array which scans a sector image when the face of the probe is pressed into acoustic contact with the body. The linear array may also be a straight linear array or a
single row of a two dimensional array transducer. The probe is connected by a T/R switch 14 to a transmit beamformer 16 and a receive beamformer 18. The
transmit beamformer actuates predetermined groups of elements at specifically related times to transmit beams in desired directions over an image plane and the receive beamformer steers and focuses receive beams in response to those transmission events. The receive beams are filtered and may undergo other signal
processing by a filter 20 and desired signals are detected by a detector 22. The detected signals may be amplitude detected tissue signals or Doppler detected signals from moving substances such as blood flow.
They may additionally be signals at fundamental or harmonic frequencies. A series of adjacent receive beams are stored in a frame buffer 24 from which they will be further processed into a planar image frame as described below.
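As a concrete illustration of the amplitude detection path (detector 22), the sketch below envelope-detects beamformed RF lines into a log-compressed B-mode frame. This is a generic approach rather than the patent's specific implementation; the dynamic range value and the use of SciPy's Hilbert transform are assumptions.

```python
# Minimal sketch of amplitude ("envelope") detection of beamformed RF lines into a
# B-mode frame; parameter values are illustrative.
import numpy as np
from scipy.signal import hilbert


def detect_frame(rf_beams: np.ndarray, dynamic_range_db: float = 60.0) -> np.ndarray:
    """rf_beams: 2D array (samples along each beam x adjacent beams).
    Returns a log-compressed envelope image scaled to [0, 1]."""
    envelope = np.abs(hilbert(rf_beams, axis=0))     # analytic-signal magnitude per beam
    envelope /= envelope.max() + 1e-12               # normalize before log compression
    db = 20.0 * np.log10(envelope + 1e-12)
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)


# Example: 2048 samples per beam, 128 adjacent beams of synthetic RF data
frame = detect_frame(np.random.randn(2048, 128))
```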
In some embodiments, the ultrasound system 10 of the present invention can be configured to determine an approximate position of the ultrasound probe 11 in space. For example, the ultrasound probe can include position sensors to determine the probe's position in space. A variety of position sensors can be used, such as accelerometers, gyroscopes, and/or electromagnetic field sensors. In certain embodiments, an
electromagnetic instrument tracking system or other similar system may be used alone or in combination with the ultrasonic diagnostic imaging system to determine the probe position. The electromagnetic tracking system uses electromagnetic field generators and
sensors to determine the location of an object in space. The location may be determined in relation to a fixed
plane, an ultrasound probe, and/or other reference point. One example of an electromagnetic tracking system that may be used is the PercuNav system by Philips®.
For ultrasound systems of the present invention, an electromagnetic tracking and navigation system may be used, such as the PercuNav system, elements of which are shown in FIGURE 1. The electromagnetic tracking and navigation system has a field generator 46 which generates an electromagnetic field permeating a surrounding space of the probe during imaging. Sensors 44 located on the probe 11 interact with the electromagnetic field and produce signals used to calculate the position and orientation of the image plane of the probe. This calculation is done by a coordinate generator 42 of the system, which is shown receiving signals representing the position and orientation of the image plane, and further receiving signals from the field generator 46, to which the coordinate generator 42 is also coupled for field registration purposes. The electromagnetic tracking and navigation system can be controlled via operator control signals from the control panel 30.
In some aspects, the present invention includes methods for producing a three dimensional ultrasound image by freehand scanning. The methods can include, for example, moving an ultrasound probe containing a one dimensional array of transducer elements in a first direction while acquiring a first sequence of spatially offset images from a region of interest. The first direction can be generally orthogonal to the plane of an image produced by the array. The methods can include storing the first sequence of images. Another
step of the methods can include moving the ultrasound probe in a second direction while acquiring a second sequence of spatially offset images from a region of interest. The probe can be moved in intersecting sweeps in the first and second directions such that at least one image in the first sequence intersects at least one image in the second sequence. The
intersecting sweeps can be generally orthogonal, e.g., the first direction can be generally orthogonal to the second direction. The methods can include storing the second sequence of images, and/or processing an image from the first sequence with an image of the second sequence to identify at least one common line of intersection of the two images. In some embodiments, the methods can include processing additional images from both sequences to identify one or more lines of intersection of each additional image with one or more lines of previously processed images to form a three dimensional grid of intersecting images. A step of the methods can include processing the three dimensional grid of intersecting images to produce a three
dimensional ultrasound image of a region of interest.
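The overall method might be organized as in the following high-level sketch. Every function here is a hypothetical stand-in for a step described above (acquisition, pairwise line-of-intersection registration, and scan conversion into a voxel volume); later sketches flesh out individual steps.

```python
# High-level sketch of the freehand intersecting-sweep method; all functions are
# placeholders for the processing steps described in the text.
from typing import Dict, List, Tuple
import numpy as np


def acquire_sweep(n_frames: int = 50) -> List[np.ndarray]:
    """Stand-in for moving the probe in one direction while storing 2D frames."""
    return [np.random.rand(256, 256) for _ in range(n_frames)]


def find_intersection(img_a: np.ndarray, img_b: np.ndarray) -> Tuple[int, float]:
    """Stand-in for the line-of-intersection search (see the line-matching sketch below);
    returns, e.g., a column index and a tilt describing the common line."""
    return 0, 0.0


def build_3d_image() -> np.ndarray:
    sweep1 = acquire_sweep()    # first direction: approximately parallel planes
    sweep2 = acquire_sweep()    # second, roughly orthogonal direction
    grid: Dict[Tuple[int, int], Tuple[int, float]] = {}
    for i, img1 in enumerate(sweep1):          # register every intersecting pair
        for j, img2 in enumerate(sweep2):
            grid[(i, j)] = find_intersection(img1, img2)
    # Scan conversion would fill the registered "egg crate" grid into a regular volume.
    return np.zeros((128, 128, 128))           # placeholder voxel volume


volume = build_3d_image()
```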
In accordance with the principles of the present invention a first image sequence which has been
acquired by sliding or rocking the probe in a first direction is stored in a first frame sequence buffer 32. The probe is moved so that the images received during its motion will sweep through a region of interest.
Prior to the start of the probe motion, the user touches a control on the user interface 30 to condition the frame sequence buffer to begin storing images.
When the sweep is completed, the control is touched again to inform the buffer that the first sequence is
complete. Any identical images stored at the beginning and the end of the sweep before and after the probe has moved may be identified by image matching and discarded. Since the probe is moved in a given direction the
images will be approximately parallel to each other
(non-intersecting in the elevation dimension) in the sequence. If a linear (straight line) motion is used, the acquired images will be approximately parallel to each other. If a rocking motion is used to sweep the images, they will be slightly angled in relation to each other. Because freehand scanning is used, the image planes are not expected to be exactly parallel to each other or evenly spaced or angulated, but are only approximately uniformly spaced. This variation in spacing and angulation is resolved by the processing of the present invention.
After a first sequence of images has been acquired and stored in buffer 32, the probe is slid or rocked in a second direction (e.g., an approximately orthogonal direction) to acquire a second sequence of images. The second sequence of images will also be in a spaced relationship (e.g., approximately parallel) by reason of the rocking or sliding motion but in an
approximately orthogonal relationship to the images of the first sequence. Spatially, the images of the two sequences will nest together like the dividers of an egg crate. The pattern of the intersecting planes is not perfectly orthogonal or perfectly uniformly spaced since manual hand scanning was used to acquire them.
The second sequence of images is stored in a second frame sequence buffer 34. The user interface 30 is used as before to command the second frame sequence buffer when to start and stop storing acquired images.
Images stored in the two frame sequence buffers are then spatially registered by detecting the image line of intersection of an image of one sequence to an image of the other sequence by an image line
registration processor 36 as described more fully below. This registration is based upon the image content such as the features or speckle pattern of each image.
Since the intersecting images were hand scanned it is expected that the lines of intersection with other images will not be precisely uniformly spaced through each image, but will exhibit variation from a perfectly uniform pattern. This variation will cause a warping of the images by respacing their beam locations as necessary for registration, for instance. This warping accounts for the non-uniform or irregular displacement (and/or orientation) of the image planes from one image frame to the next which is inherent in manual hand scanning and will cause the final 3D image to be spatially undistorted despite the inherent
irregularities in its acquisition. The registered grid of planar images is stored in a 3D volume buffer 38. Spaces between the planes of the registered images may then be filled in by scan conversion using scan
converter 26. The data of the 3D volume image can thus be converted to uniform density in all three dimensions. The 3D dataset is then rendered into a 3D image by volume renderer 28 for display on an image display 40. Alternatively, accurately spaced image planes of the scanned volume can be produced, enabling quantitative distance, area and volume measurements to be made of a scanned organ (e.g., a kidney) or a mass, so that the dimensions of the organ or mass can be quantitatively measured. User control of the ultrasound system is
provided by a user interface 30.
FIGURE 2 illustrates how the image registration process may begin. In this example the center frame of each sequence is initially selected for registration on the assumption that, if the frame acquisition were perfect, the two center frames of the two orthogonal sequences would intersect at their center line. FS1-c is the center image frame of the first frame sequence, and FS2 is the center image frame of the second frame sequence. Since the orientation of the images from the respective sequences is approximately orthogonal, one frame will spatially intersect the other at a common line of intersection L1. If the scanning were done in perfect alignment the line of intersection L1 would be the aligned center lines of each image. But since the sequences were acquired by manual hand scanning, it is not, as can be seen from the tilt of line L1 and its position at the tops of the images.
The line of intersection L1 of the first two images selected for processing generally requires the most exhaustive search to determine because it can have virtually any
orientation in the two images and cannot be referenced to any previously found line of intersection. A pixel matching process such as block matching used in
panoramic imaging may be employed to compare and try to align a line of pixels of one image with a line of pixels in the other image. The degree of match of one line with another may not be perfect because the two images were acquired at different times and the anatomy images may have changed due to bodily function or movement or other effects. Hence a degree of match less than 100% should be used to ascertain matching lines of intersection. When the common line of pixels in the two
images has been found, the location of the common line L1 is noted and used to initially estimate the positions of other intersecting images as shown in FIGURE 3. When the common line of intersection of FS1-c and FS2 has been found as illustrated in FIGURE 3, in which FS1-c is in the plane of the drawing and FS2 is orthogonal to the drawing, the positions of the other images of the second sequence can be estimated as illustrated in FIGURE 3. In this example the two image sequences have been acquired by rocking the probe or sliding it in an arc as indicated by arrow R for the second sequence. This motion will cause the respective images to be approximately parallel but differently angularly tilted by a differential angle ΔΘ, as shown for images 61, 62, 63, FS2 64, 65, and 66 of the second sequence. The magnitude of the assumed differential angle ΔΘ is predicated upon the knowledge that a rocking motion was used to scan the region of interest, which can be selected by the user interface, an assumed breadth of scan, and the time of acquisition and number of images acquired for the sequence. The angular increment ΔΘ is not exact, of course, but provides a starting point for the search for the line of intersection of each second sequence image with the first sequence image FS1-c.
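A minimal sketch of how such a starting estimate of the inter-frame tilt could be formed, assuming only a user-supplied guess of the sweep's angular breadth and the number of frames acquired; the uniform division is just an initial value for the subsequent intersection search.

```python
# Minimal sketch of an initial guess for the inter-frame tilt increment (delta-theta).
# The assumed sweep breadth in degrees is a user-supplied estimate, as in the text;
# the uniform division is only a starting point for the line-of-intersection search.
def initial_tilt_increment(assumed_sweep_deg: float, n_frames: int) -> float:
    """Evenly divide the assumed angular breadth of the rocking sweep
    across the gaps between the n_frames acquired images."""
    if n_frames < 2:
        return 0.0
    return assumed_sweep_deg / (n_frames - 1)


# Example: a rocking sweep assumed to cover about 30 degrees over 61 frames
delta_theta = initial_tilt_increment(30.0, 61)   # 0.5 degrees per frame, as a first estimate
```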
The reason why ΔΘ is not an exact inter-image spacing can be appreciated from FIGURE 4. In this illustration line A1 is a line of intersection L1 for two images, and the next adjacent image is assumed to be intersecting at a line c which is offset from line A1 by the angular increment ΔΘ. This presumed line of intersection c is shown in relation to a pixel grid 80 for purposes of illustration. Line c is seen to intersect the grid at center point c' at the top of the grid and pass through the center point at the bottom of the grid (not visible in this view). But when a search is performed for a common line of pixels in the common image and the next orthogonal image, the actual line of intersection found may be line A2, which is seen to intersect the pixel grid at a different location A'2 at the top, to exit the grid at a location other than the center, and to be tilted at a distinctly different angle than the presumed intersection line c. FIGURE 5 illustrates how the line of intersection matching process proceeds for two intersecting image planes FS1-n and FS2-n of the two sequences. Image FS1-n has a line of image content indicated at 70, which is shown as a row of circles at the current line of intersection of the two images. But this same line of image content is at a different planar location 70' in the second image, which is not in alignment at this orientation. When a search is made for this image content in the second image, it is seen that the lines of image content are not registered until the first image is tilted forward to bring 70 and 70' into alignment. When this registration is made, the spacing to the line of intersection with an adjacent image may be increased or decreased from the nominal pixel spacing in the common image, in which case the pixel spacing in the common image can be stretched or compressed (warped) to maintain the geometric accuracy of the intervening image region.
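One way to realize the line-matching search described above is normalized cross-correlation between a candidate line of pixels from one image and every candidate line of the other, accepting the best match only if it exceeds a threshold below 100%. The sketch below restricts candidates to image columns for brevity; a fuller search would also consider tilted and translated lines, as the text describes.

```python
# Sketch of a line-matching search: compare one line of pixels from image A against
# every column of image B using normalized cross-correlation, and accept the best
# match only if it exceeds a threshold below 100% (anatomy may have changed between
# sweeps). Searching only columns is a simplifying assumption.
import numpy as np


def ncc(a: np.ndarray, b: np.ndarray) -> float:
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0


def find_matching_line(line_a: np.ndarray, image_b: np.ndarray, threshold: float = 0.8):
    """Return (best column index in image_b, score) for the column most similar to
    line_a, or None if no candidate reaches the match threshold."""
    scores = [ncc(line_a, image_b[:, col]) for col in range(image_b.shape[1])]
    best = int(np.argmax(scores))
    return (best, scores[best]) if scores[best] >= threshold else None


# Example with synthetic data: plant line_a into column 40 of a noisy image_b
rng = np.random.default_rng(0)
line_a = rng.standard_normal(256)
image_b = rng.standard_normal((256, 128)) * 0.2
image_b[:, 40] += line_a
print(find_matching_line(line_a, image_b))      # expected to report column 40
```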
When the registration of the images of the second sequence is done to find their lines of intersection with the image FS1-c of the first sequence, the final alignment may not be the initially assumed alignment shown in FIGURE 3, but the pattern of alignment shown in FIGURE 6. Other images of the first sequence may now be
registered to common lines of intersection with the second sequence images until all of the images of both sequences have been registered with each other,
effectively forming an egg crate-like grid of
intersecting planar images. The spacing between the image planes of the grid may then be filled in by
interpolating pixels between the image planes with the scan converter. Multiple-point interpolation may be used for this 3D pattern of images and pixels. The data with its final pixel addresses in the volume buffer after registrations and interpolation may then be
rendered into a 3D volume image by volume rendering and the image viewed dynamically by dynamic parallax display.
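The gap-filling idea can be illustrated with a simple linear interpolation between two adjacent registered planes; actual scan conversion would use the registered plane geometry and the multiple-point interpolation mentioned above, so this is only a simplified stand-in.

```python
# Minimal sketch of filling the space between two adjacent registered image planes by
# linear interpolation along the sweep (elevation) direction. Real scan conversion
# would account for the registered plane geometry; this only shows the gap-filling
# idea on two parallel planes of equal size.
import numpy as np


def fill_between_planes(plane_a: np.ndarray, plane_b: np.ndarray, n_slices: int) -> np.ndarray:
    """Return an (n_slices, H, W) stack running from plane_a to plane_b inclusive."""
    weights = np.linspace(0.0, 1.0, n_slices)[:, None, None]
    return (1.0 - weights) * plane_a[None] + weights * plane_b[None]


# Example: interpolate 8 slices between two 256 x 256 frames
a = np.zeros((256, 256))
b = np.ones((256, 256))
stack = fill_between_planes(a, b, 8)      # stack[0] == a, stack[-1] == b
```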
Another approach to reconstructing an image is to recognize that each scan sequence has a number of
variables due to the use of freehand scanning, such as the inter-image plane spacing, the relative angles of inclination of the image planes, the possible rotation of each plane in several dimensions, and so on. These image-to-image variables are unknowns. However, there is a common attribute of the two scan sequences, which is that each contains a view of the same target anatomy. This second approach is to reconstruct the common
anatomical image or portions thereof with the data of each image sequence using different estimates of the unknown variables, then to compare the reconstructions. This process is iterated with different estimates of the variables until the result is optimized, producing a near identical reconstruction from each image sequence. The 3D reconstruction resulting from the optimized variables can then be displayed as the three dimensional image.
In this iterative reconstruction method, the two scanning sequences have a common constraint, which is that each has been applied to the same target anatomy or region of interest in 3D space. For each scan sequence, there are unknown variable parameters that prevent an accurate reconstruction of the target of interest by use of just one sequence. Assume that values are randomly chosen for uncertain parameter sets of the two sequences, Θ1 for one sequence and Θ2 for the other sequence. Using the chosen values, a
reconstruction of the target of interest is performed using each image sequence. To test whether the chosen parameter sets are correct, a comparison is made to determine whether the two reconstructed targets are similar, since they describe the same target.
It is not necessary that the reconstruction be a full reconstruction of the entire target volume. For instance, the reconstructed data can be just the image information that shares the same spatial coordinates, the lines of intersection Li of the two image sequences. Since the original scan plane data are discrete, interpolation may be needed in each image to obtain the exactly intersecting lines. This interpolation can be extended, if desired, to interpolate data for a full reconstruction of the target of interest. For example, a direct reconstruction of the 3D volume data of each image sequence with additional interpolated lines can be performed and then a comparison of the results is made. It can be seen that the reconstructed lines of intersection denote the data information for comparison once some parameter sets for the scanned sequences are chosen.
The next step is to perform a comparison to see whether the reconstructed lines of intersection of each scanning sequence are similar. One method of
comparison is to subtract one reconstructed data set from the other and sum the differences directly. Another method is to sum the squares of the differences of the two data sets.
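These two comparison measures can be written compactly as follows; the absolute value in the first metric is an assumption made so that positive and negative differences do not cancel.

```python
import numpy as np

def dist_sum(a, b):
    """Sum of (absolute) differences between the two reconstructed data sets."""
    return np.abs(a - b).sum()

def dist_ssd(a, b):
    """Sum of squared differences between the two reconstructed data sets."""
    return ((a - b) ** 2).sum()
```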
It can be seen that once one or more unknown parameter sets are chosen for the two scan sequences, a value can be obtained which denotes the extent of similarity of the reconstructed lines of intersection or complete or partial volumes. If the correct parameter sets are found, the reconstructions from the two scanning sequences can ideally describe the exact same target. An optimization process is now performed to find the optimal solution (the correct values) of the unknown variable parameters which will minimize the function as described in the right part of the
following equation:
$(\hat{\Theta}_1, \hat{\Theta}_2) := \arg\min_{\Theta_1, \Theta_2} \mathrm{Dist}\big(D(\Theta_1), D(\Theta_2)\big)$
where D denotes the representation of the data information, Θ denotes the spatial uncertainty parameters of the current data, and Dist is a distance metric used to evaluate the data constraints.
One intuitive way to perform the optimization is to search all the possible combinations of Θ1 and Θ2. Given a finite number of variables and value ranges and sufficient computing power, this can be done. The combination that yields the minimum value of the "distance" function described in the right part of the above equation is then found, and that
corresponding combination is the optimal solution for Θ1 and Θ2. FIGURE 7 illustrates the result of this kind of search, known as a "grid search". From FIGURE 7 it can be seen that the function values of the different parameter combinations build up a surface called a "response surface". The location with the minimum value on the response surface gives the optimal values of the parameters, which are [4,8] in this example.
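A minimal sketch of such a grid search is shown below, assuming an objective(theta1, theta2) function that reconstructs the lines of intersection under the chosen parameter sets and returns the distance value; the function and variable names are illustrative only.

```python
import itertools
import numpy as np

def grid_search(theta1_grid, theta2_grid, objective):
    """Exhaustive grid search: evaluate the distance function for every
    combination of candidate parameter sets and keep the minimum.

    theta1_grid / theta2_grid are sequences (e.g. lists) of candidate
    parameter sets; objective(theta1, theta2) is assumed to return
    Dist(D(theta1), D(theta2)) for the reconstructed lines of intersection.
    """
    response_surface = np.empty((len(theta1_grid), len(theta2_grid)))
    best = (None, None, np.inf)
    for (i, t1), (j, t2) in itertools.product(enumerate(theta1_grid),
                                              enumerate(theta2_grid)):
        value = objective(t1, t2)
        response_surface[i, j] = value          # builds up the "response surface"
        if value < best[2]:
            best = (t1, t2, value)
    return best, response_surface
```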
Another way to obtain the optimal solution is to first obtain several values of sampled points on the response surface. Then, based on a spatial shape assumption, an effort is made to reconstruct the response surface and find the location of the minimum value on the
reconstructed response surface. After evaluating the true function value at that minimum location, a new response surface is reconstructed and a further search for the minimum value is performed until the iteration converges. This iterative method is a global optimization, which can be performed by a number of methods, such as Hardy multiquadric and kriging-based methods, among others.
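The following sketch illustrates one possible form of this iterative response-surface optimization, using a multiquadric radial basis function surrogate via SciPy's RBFInterpolator; the initial sampling, smoothing value, and local minimizer are illustrative choices rather than the method prescribed by the text.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def surrogate_search(objective, samples, n_iterations=10):
    """Iterative response-surface (surrogate) optimization.

    samples: (n, d) array of already-evaluated parameter combinations
    (theta1 and theta2 stacked into one vector); objective(x) returns the
    distance value for a parameter vector x.
    """
    x = np.asarray(samples, dtype=float)
    y = np.array([objective(p) for p in x])
    for _ in range(n_iterations):
        # Reconstruct the response surface from the sampled points.
        surrogate = RBFInterpolator(x, y, kernel="multiquadric",
                                    epsilon=1.0, smoothing=1e-9)
        # Search the reconstructed surface, starting from the best point so far.
        x0 = x[np.argmin(y)]
        res = minimize(lambda p: surrogate(p[None, :])[0], x0, method="Nelder-Mead")
        # Evaluate the true function at that location and refine the surface.
        x = np.vstack([x, res.x])
        y = np.append(y, objective(res.x))
    best = np.argmin(y)
    return x[best], y[best]
```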
It will be understood that each block of the block diagram illustrations, and combinations of blocks in the block diagram illustrations, as well as any portion of the systems and methods disclosed herein, include hardware components coupled together in a computer-based system. In some embodiments, various steps can be implemented by computer program instructions. These program instructions may be provided to a processor to produce a machine, such that the instructions, which execute on the processor, create means for implementing the actions specified in the block diagram block or blocks or described for the systems and methods
disclosed herein. The computer program instructions may be executed by a processor to cause a series of
operational steps to be performed by the processor to produce a computer implemented process. The computer program instructions may also cause at least some of the operational steps to be performed in parallel.
Moreover, some of the steps may also be performed across more than one processor, such as might arise in a multi-processor computer system. In addition, one or more processes may also be performed concurrently with other processes, or even in a different sequence than illustrated without departing from the scope or spirit of the invention.
The computer program instructions can be stored on any suitable computer-readable hardware medium
including, but not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage,
magnetic cassettes, magnetic tape, magnetic disk
storage or other magnetic storage devices, or any other medium which can be used to store the desired
information and which can be accessed by a computing device.
For example, the present invention can include a system having instructions thereon, which when executed, cause the system to perform the following steps: moving an ultrasound probe containing a one dimensional array of transducer elements in a first direction while acquiring a first sequence of spatially offset images from a region of interest; storing the first sequence of images; moving the ultrasound probe in a second direction while acquiring a second sequence of
spatially offset images from a region of interest,
wherein at least one image in the first sequence intersects at least one image in the second sequence; storing the second sequence of images; processing an image from the first sequence with an image of the second sequence to identify at least one common line of intersection of the two images; processing additional images from both sequences to identify one or more lines of intersection of each additional image with one or more lines of previously processed images to form a three dimensional grid of intersecting images; and processing the three dimensional grid of intersecting images to produce a three dimensional ultrasound image of a region of interest.
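The overall flow of these steps can be summarized in the following minimal sketch, in which registration, scan_convert, and volume_render stand in for the processing stages described above and are hypothetical callables rather than components defined in this disclosure.

```python
def reconstruct_volume(sweep1, sweep2, registration, scan_convert, volume_render):
    """End-to-end sketch of the two-sweep method (names are illustrative).

    sweep1 / sweep2: stored sequences of spatially offset 2D images.
    registration(img_a, img_b): returns their common line of intersection, or None.
    """
    grid = []
    for img_a in sweep1:
        for img_b in sweep2:
            line = registration(img_a, img_b)   # common line of intersection
            if line is not None:
                grid.append((img_a, img_b, line))
    volume = scan_convert(grid)                 # interpolate pixels between planes
    return volume_render(volume)                # 3D image for display
```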
Claims
1. A method for producing a three dimensional ultrasound image by freehand scanning comprising:
moving an ultrasound probe containing a one dimensional array of transducer elements in a first direction while acquiring a first sequence of spatially offset images from a region of interest;
storing the first sequence of images;
moving the ultrasound probe in a second direction while acquiring a second sequence of spatially offset images from a region of interest, wherein at least one image in the first sequence intersects at least one image in the second sequence;
storing the second sequence of images;
processing an image from the first sequence with an image of the second sequence to identify at least one common line of intersection of the two images;
processing additional images from both sequences to identify one or more lines of intersection of each additional image with one or more lines of previously processed images to form a three dimensional grid of intersecting images; and
processing the three dimensional grid of
intersecting images to produce a three dimensional ultrasound image of a region of interest.
2. The method of Claim 1, wherein processing the three dimensional grid of intersecting images further comprises scan converting additional image elements between images of the three dimensional grid.
3. The method of Claim 1, wherein moving the
ultrasound probe further comprises rocking or tilting the ultrasound probe.
4. The method of Claim 1, wherein moving the ultrasound probe further comprises sliding the probe in a substantially straight line.
5. The method of Claim 1, wherein moving the ultrasound probe further comprises moving an ultrasound probe containing a linear array of transducer elements.
6. The method of Claim 5, wherein moving the ultrasound probe further comprises moving an ultrasound probe containing a curved array of transducer elements.
7. The method of Claim 1, wherein processing images from both sequences to identify a common line of intersection further comprises searching the image content of two intersecting images to identify
substantially the same line of image content in the intersecting images.
8. The method of Claim 7, wherein identifying substantially the same line of image content further comprises identifying a line of image content in one image which is less than 100% identical to a line of image content of an intersecting image.
9. An ultrasound system for producing a three dimensional image of a region of interest, comprising: an ultrasound probe comprising a one dimensional array of transducer elements;
a first frame sequence buffer that is coupled to
the ultrasound probe and configured to store a first sequence of spatially offset two dimensional images when the probe is moved in a first direction;
a second frame sequence buffer that is coupled to the ultrasound probe and configured to store a second sequence of spatially offset two dimensional images when the probe is moved in a second direction;
an image line registration processor that is coupled to the first and second frame sequence buffers and configured to identify a common line of intersection of an image of the first frame sequence buffer with an image of the second frame sequence buffer;
a 3D volume buffer that is coupled to the image line registration processor and configured to store the image data of images having the common line of
intersection; and
a display that is coupled to the 3D volume buffer and configured to display a three dimensional image.
10. The ultrasound system of Claim 9, further comprising a scan converter coupled to the 3D volume buffer and configured to produce image data between images found to have one or more common lines of intersection.
11. The ultrasound system of Claim 9, further comprising a volume renderer that is responsive to image data stored in the 3D volume buffer and
configured to produce a volume rendered three
dimensional image.
12. The ultrasound system of Claim 9, further comprising a detector that is coupled between the probe
and the frame sequence buffers and configured to detect tissue data for ultrasound images.
13. The ultrasound system of Claim 9, further comprising a detector that is coupled between the probe and the frame sequence buffers and configured to detect motion data for ultrasound images.
14. The ultrasound system of Claim 9, further comprising a user interface that is coupled to the first and second frame sequence buffers and configured to inform each buffer when to store a sequence of images.
15. The ultrasound system of Claim 9, wherein the one dimensional array of transducer elements further comprises one of a linear array of transducer elements, a curved array of transducer elements, or one row of elements of a two dimensional array of transducer elements.
16. The ultrasound system of Claim 9, wherein the ultrasound probe further comprises a position sensing device, and the ultrasound system is configured to determine a probe position based at least in-part on the position sensing device.
17. The ultrasound system of Claim 16, wherein the ultrasound system includes an electromagnetic tracking system configured to determine the probe position .
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNPCT/CN2014/088188 | 2014-10-09 | ||
CN2014088188 | 2014-10-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016055902A1 (en) | 2016-04-14 |
Family
ID=54360493
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2015/057435 (WO2016055902A1) | Three dimensional ultrasound imaging by intersecting 2d scanning | 2014-10-09 | 2015-09-29 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016055902A1 (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0487339A1 (en) * | 1990-11-22 | 1992-05-27 | Advanced Technology Laboratories, Inc. | Acquisition and display of ultrasonic images from sequentially orientated image planes |
US5474073A (en) | 1994-11-22 | 1995-12-12 | Advanced Technology Laboratories, Inc. | Ultrasonic diagnostic scanning for three dimensional display |
US6234968B1 (en) * | 1999-06-15 | 2001-05-22 | Acuson Corporation | 3-D diagnostic medical ultrasound imaging using a 1-D array |
US8539838B2 (en) | 2008-06-05 | 2013-09-24 | Koninklijke Philips N.V. | Extended field of view ultrasonic imaging with a two dimensional array probe |
Non-Patent Citations (1)
Title |
---|
AARON FENSTER ET AL: "3-D Ultrasound Imaging: A Review", IEEE ENGINEERING IN MEDICINE AND BIOLOGY MAGAZINE, vol. 15, no. 6, November 1996 (1996-11-01), pages 41 - 50, XP011084747, ISSN: 0739-5175 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11911223B2 (en) * | 2018-02-23 | 2024-02-27 | Brainlab Ag | Image based ultrasound probe calibration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 15787015; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: PCT application non-entry in European phase | Ref document number: 15787015; Country of ref document: EP; Kind code of ref document: A1 |