
EP3894892A1 - Non-invasive diffuse acoustic confocal three-dimensional imaging - Google Patents

Non-invasive diffuse acoustic confocal three-dimensional imaging

Info

Publication number
EP3894892A1
EP3894892A1 EP19897458.6A EP19897458A EP3894892A1 EP 3894892 A1 EP3894892 A1 EP 3894892A1 EP 19897458 A EP19897458 A EP 19897458A EP 3894892 A1 EP3894892 A1 EP 3894892A1
Authority
EP
European Patent Office
Prior art keywords
acoustic
coherent
beams
source
focuser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19897458.6A
Other languages
German (de)
French (fr)
Other versions
EP3894892A4 (en)
Inventor
Rodney HERRING
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/216,938 external-priority patent/US20190117189A1/en
Application filed by Individual filed Critical Individual
Publication of EP3894892A1 publication Critical patent/EP3894892A1/en
Publication of EP3894892A4 publication Critical patent/EP3894892A4/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8993Three dimensional imaging systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/04Analysing solids
    • G01N29/06Visualisation of the interior, e.g. acoustic microscopy
    • G01N29/0654Imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8909Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
    • G01S15/8931Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration co-operating with moving reflectors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8934Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8965Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using acousto-optical or acousto-electronic conversion techniques
    • G01S15/897Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using acousto-optical or acousto-electronic conversion techniques using application of holographic techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52023Details of receivers
    • G01S7/52036Details of receivers using analysis of echo signal for target characterisation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52079Constructional features
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0808Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the brain
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42Details of probe positioning or probe attachment to the patient
    • A61B8/4209Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
    • A61B8/4227Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by straps, belts, cuffs or braces
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4444Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B8/4472Wireless probes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48Diagnostic techniques
    • A61B8/485Diagnostic techniques involving measuring strain or elastic properties
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2291/00Indexing codes associated with group G01N29/00
    • G01N2291/02Indexing codes associated with the analysed material
    • G01N2291/024Mixtures
    • G01N2291/02475Tissue characterisation

Definitions

  • the present technology is a method of non-invasively scanning a body or body part to obtain a three-dimensional image, in which size, location and shape are determined. The method further allows a determination of the state of the body or body part. An apparatus for scanning is also provided.
  • phase information of a beam that passes through an object can provide information on the object's temperature, composition, magnetic field or electrostatic field, whereas amplitude measurements can provide information on the opaqueness or density of the object.
  • the beams are comprised of waves of radiation, where a wave, F, can be described as having both an amplitude, A, and a phase, Θ, described mathematically as F = A·e^(iΘ).
  • the information obtained from the method depends on whether it is detecting the amplitude or both the amplitude and phase of a beam's wave. If the method measures only a beam's amplitude, as is the case for X-ray, only density differences in the object are reported. This is a limitation of the technology as it does not provide information such as an object's temperature, composition, elasticity, strain field, magnetic or electrostatic fields.
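The distinction drawn above can be illustrated numerically. The following is only an illustrative sketch with hypothetical values, using the standard complex-exponential form F = A·e^(iΘ): an intensity-only detector retains the amplitude but discards the phase.

```python
import numpy as np

# A wave F = A * exp(i*Theta).  Hypothetical amplitude and phase values.
A, theta = 2.0, np.pi / 3
F = A * np.exp(1j * theta)

amplitude = np.abs(F)    # what an amplitude-only method (e.g. X-ray) measures
phase = np.angle(F)      # discarded by amplitude-only methods; carries the
                         # temperature, composition and field information
```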
  • An additional disadvantage of a number of imaging techniques such as X-ray imaging methods is the strength of radiation employed. When used in diagnosis, the levels employed may have the potential to damage cells in the body.
  • the images are not suitable for providing information other than that information that pertains to surfaces or interfaces.
  • a further deficiency of these microscopes is that the image produced has a significant amount of background intensity caused by the diffuse scattering of beams after striking the surface or interface. Taking as an example a prostate gland, an ultrasound image poorly identifies the interface between the prostate and other tissue and can also identify the urethra; however, it cannot identify any abnormalities within the prostate.
  • Confocal scanning laser microscopes were developed in the 1980s for seeing three-dimensional objects.
  • Confocal scanning laser microscopy uses a laser beam passing through an object to create a three-dimensional amplitude image of the object by detecting the amplitude of the beam through a pinhole aperture placed confocal with a point on a focal plane of the object.
  • Confocal microscopes have now found widespread applications in the materials, biological, and medical sciences. As a diagnostic tool, confocal microscopes are limited to detecting only thin tissue and the density differences of objects, which produce amplitude differences of the detected beam. The beams cannot penetrate far into tissues and other materials. They do not measure the object's phase information. Hence, confocal microscopes cannot measure an object's composition or temperature.
  • the method measures changes in the phase of a beam
  • information can be provided about the object's temperature and composition.
  • Acoustic beams can be used for this.
  • the phase of an acoustic beam is modified by an object's refractive index, where the refractive index is dependent on the object's temperature and composition and is a measure of the acoustic beam's speed of sound.
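As a numerical sketch of this dependence (hypothetical sound speeds; the plane-wave relation φ = 2πf·d/c is assumed), the relative phase picked up across a region whose speed of sound differs from the surrounding medium can be computed as follows:

```python
import numpy as np

def phase_shift(freq_hz, path_m, c_obj, c_med):
    """Relative phase (radians) accumulated over a path of length path_m
    through a region with sound speed c_obj embedded in a medium with
    sound speed c_med, for a beam of frequency freq_hz."""
    return 2.0 * np.pi * freq_hz * path_m * (1.0 / c_obj - 1.0 / c_med)

# Hypothetical values: 1 MHz beam crossing a 1 cm region at 1580 m/s
# embedded in soft tissue at 1540 m/s.
dphi = phase_shift(1e6, 0.01, 1580.0, 1540.0)   # about -1.03 rad
```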
  • the absolute phase of an object can be measured using a Confocal Scanning Holography Microscope, as described in US Patent No. 7,639,365. This approach cannot be used to image the inside of the human body as laser beams do not readily pass through the human body.
  • the relative phase of an object can be measured using an Acoustic Confocal Interferometry Microscope, as described in US Patent No. 8,485,034. This approach requires an interference beam and a complex arrangement of mirrors and prisms and is not suitable for imaging the inside of the human body because of the geometric constraints.
  • Standard interferometry microscopes, standard holography microscopes, and standard holographic interferometry microscopes have been used to measure both the phase and the amplitude of objects, giving important information of objects such as their density, composition and temperature.
  • These microscopes create a three-dimensional amplitude image and phase image of the object by measuring both the phase and the amplitude.
  • the three-dimensional information measured from these microscopes comes only from the surface of the object and not from points within the object.
  • a reference beam and an object beam are used to collect data that results in the creation of the images. This limits the use of these microscopes to collecting data from or about surfaces of objects. In medical diagnosis they would therefore be potentially useful for diseases of the skin, but not for diseases of internal tissues or organs.
  • Spatially-filtered transmission ultrasound phase imaging involves measuring the amplitude and phase of an emitted beam and then again measuring the amplitude and phase of the acoustic beam after it passes through the object upon its arrival at a detector. The difference in amplitude and phase is attributed to the object. From the sound source, the beam diffusely scatters outward leading to background scatter that is not wanted. Within or around this background scatter will be the image of interest. That image is representative of the interfaces of the object being imaged.
  • United States Patent Application 20040059265 discloses noninvasively focusing acoustical energy on a mass such as a tumor within tissue to reduce or eliminate the mass.
  • the presence of the mass in the tissue is detected by applying acoustic energy to the substance.
  • the mass is localized to determine its position.
  • Temporal signatures are developed to drive the acoustical energy on the mass. Dynamic focusing of the acoustical energy on the mass to reduce or eliminate it is accomplished utilizing the temporal signatures.
  • Imaging of the mass is done using standard ultrasound imaging techniques (use of multiple parallel beams from an acoustic source outside of the body of the patient that are reflected off the object of interest in the body to provide intensity information [this reflected signal is referred to as a virtual source, but is not a source]), then either modeling or time reversal is used for driving acoustical energy which is then focused on the mass to treat the mass.
  • the deficiencies of this method include that it is based on models and assumptions, that the imaging of the mass is no better than ordinary ultrasound images (as that is how the mass is imaged), and that the information attainable using the method is limited. Also, because the three-dimensional position cannot be determined, treatment of a mass relies on the models, assumptions or time reversal being correct.
  • a device, system and method that can detect both the amplitude and phase of a beam.
  • Such a device, system and method would be able to provide information on the object's density, temperature, composition, elasticity, strain field, magnetic or electrostatic fields. This is of great significance in the medical field, as being able to obtain information on density, temperature, and composition makes it possible to diagnose, treat and assess the effectiveness of treatments for diseases such as cancer.
  • the device would be suitable for being hand-held, with a variety of different shaped detector holders for application to different parts of the body, for example, but not limited to, the prostate, breast, head, and skin. It would be advantageous if the same beam used to detect and provide a three-dimensional image of the object could be used to treat the object.
  • Examples of an application where the measurement of temperature and composition is important include medical diagnostics aimed at understanding the function of organs, tissue and diseased regions in the body. Presently, medical researchers do not have good means to non-invasively measure the internal temperature and composition of the body. It is an object of the present technology to provide such capabilities.
  • the present technology provides a method of creating three-dimensional images of objects of one optical density in another object of a different optical density that is far superior to ultrasound methods.
  • a coherent beam is focused into a virtual source outside of the objects and within a fluid or an amorphous material.
  • an interference zone can be created. This interference zone is defocused to provide the three-dimensional image.
  • a method of imaging an object in a first material having a different optical density to the object comprising: focusing an acoustic coherent beam to a virtual acoustic source in a fluid or an amorphous second material outside of the first material; moving the virtual acoustic source in the fluid or amorphous material such that at least a plurality of scattered beams from the virtual acoustic source scan the first material and at least one scattered beam is reflected from the object to form a reflected beam and at least one scattered beam bypasses the object to form a bypass beam, and wherein the reflected beam and the bypass beam intercept one another to form a coherent interference zone; and defocusing the coherent interference zone to provide a Fresnel fringe, the Fresnel fringe forming an image of the object.
  • the method may further comprise an acoustic detector detecting the image of the object.
  • the method may further comprise moving the virtual acoustic source in the fluid or amorphous material such that at least one scattered beam passes through the object to become an object beam and is detected by the acoustic detector.
  • the method may further comprise comparing the phase of the object beam with the phase of the bypass beam to provide information about the object.
  • the method may further comprise comparing the amplitude of the object beam with the amplitude of the bypass beam to provide information about the object.
  • the information from the phase may be the temperature, composition, magnetic field or electrostatic field of the object and the information from the amplitude may be the optical density of the object.
  • the speed of sound of the object may be determined.
  • the speed of sound of the object may be used to identify the object.
  • the object may be identified as a tumour or lesion in the first material.
  • the acoustic coherent beam may be focused in a fluid or amorphous second material in the body of a patient.
  • the acoustic coherent beam may be focused in one of urine in the bladder, fatty tissue, brain tissue, peritoneum fluid, stomach fluid, the gall bladder and the spleen.
  • the acoustic coherent beam may be focused in a fluid outside of the body of a patient in which at least the part of the body of interest is immersed.
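The method steps above can be sketched as a control loop. The following is only an illustration, not the patented implementation: the hardware call `focus_virtual_source` is a hypothetical stub, and `record_interference` uses a toy two-beam model in which a reflected beam and a bypass beam arriving at different angles overlap to give cosine fringes.

```python
import numpy as np

def focus_virtual_source(pos):
    """Stub for driving the source and focuser actuators to place the
    beam cross-over (virtual source) at position pos in the fluid."""
    return pos

def record_interference(pos, n=256):
    """Toy model of the interference zone: a reflected beam and a bypass
    beam arriving at different angles overlap to give cosine fringes."""
    x = np.linspace(-5e-3, 5e-3, n)          # detector coordinate (m)
    k = 2.0 * np.pi / 1.5e-3                 # ~1 MHz in tissue, lambda ~1.5 mm
    reflected = np.exp(1j * k * 0.10 * x)    # hypothetical beam directions
    bypass = np.exp(1j * k * 0.40 * x)
    return np.abs(reflected + bypass) ** 2   # recorded fringe intensity

images = []
for pos in np.linspace(0.0, 1e-2, 5):        # move the virtual source
    focus_virtual_source(pos)
    images.append(record_interference(pos))  # one fringe record per position
```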
  • a system for imaging an object in a first material of different optical density to the object comprising: a coherent acoustic beam source which emits a coherent acoustic beam; a focuser positioned to focus the coherent acoustic beam to a virtual acoustic imaging source, the virtual acoustic imaging source emitting scattered beams; a coherent acoustic beam source actuator in mechanical communication with the coherent acoustic beam source; a focuser actuator in mechanical communication with the focuser; a processor in electronic communication with the coherent acoustic beam source actuator and the focuser actuator; a memory in communication with the processor and having instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser such that at least some of the scattered beams are reflected off the object to become reflected beams and pass by the object to become bypass beams and such that the reflected beams and the bypass beams form an interference zone, the memory further configured to move the coherent acoustic beam source and the focuser to produce a Fresnel fringe in the interference zone; and an acoustic detector positioned to image the Fresnel fringe.
  • the memory may include instructions for the processor to sharpen the image.
  • the system may further comprise a spatial filter in front of the acoustic detector.
  • the system may further comprise a cone reflector, the cone reflector between the coherent acoustic beam source and the focuser.
  • the system may further comprise a pair of articulating arms, each with a distal end, and a second acoustic detector, each acoustic detector mounted on an arm, proximate the distal end.
  • the system may be sized for imaging ovaries.
  • a system for imaging an object in a first material of different optical density to the object comprising:
  • a coherent acoustic beam source which emits a coherent acoustic beam
  • a focuser positioned to focus the coherent acoustic beam to a virtual acoustic imaging source, the virtual acoustic imaging source emitting scattered beams
  • a coherent acoustic beam source actuator in mechanical communication with the coherent acoustic beam source
  • a focuser actuator in mechanical communication with the focuser
  • an acoustic detector and
  • a computing device including a processor, a user interface and a memory, the processor in electronic communication with the acoustic detector, the memory in communication with the processor and having instructions thereon to instruct the processor to display an image on the user interface.
  • the memory may include instructions for the processor to sharpen the image.
  • the system may further comprise a spatial filter in front of the acoustic detector.
  • the focuser may be a concave curved cone shaped reflector.
  • the system may further comprise a conical mirrored surface between the concave curved cone shaped reflector and the acoustic detector.
  • the processor may be in electronic communication with the coherent acoustic beam source actuator and the focuser actuator and the memory may be in communication with the processor and has instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser such that at least some of the scattered beams are reflected off the object to be reflected beams and pass by the object to be bypass beams and such that the reflected beams and the bypass beams form an interference zone, the memory further configured to move the coherent acoustic beam and the focuser to produce a Fresnel fringe in the interference zone and the acoustic detector may be positioned to image the Fresnel fringe.
  • Figure 1 is a schematic of the operation of a diffuse acoustic confocal three-dimensional imaging device of the present technology.
  • Figure 2A is a schematic showing the beams used in a prior art ultrasound imager.
  • Figure 2B is a schematic showing the beams used in the present technology.
  • Figure 3 is an embodiment of a detection system of the present technology.
  • Figure 4 is an embodiment for imaging an ovary using the present technology.
  • Figure 5 is a schematic of the operation of the diffuse acoustic confocal three-dimensional imaging device for scanning an ovary using the embodiment of Figure 4.
  • Figure 6 is an embodiment for imaging both ovaries using the present technology.
  • Figure 7 is an embodiment for imaging a breast using the present technology.
  • Figure 8 is an embodiment for imaging the brain using the present technology.
  • Figure 9 is an embodiment of the detection system of the present technology.
  • Figure 10A is a schematic of a slice through the detection system of Figure 9;
  • Figure 10B is a schematic of a top view of a slice through the conical mirror;
  • Figure 10C is a schematic looking into the conical mirror.
  • Figure 11 is a schematic showing the path of the beams in the embodiment of Figure 9.
  • Figure 12A is a side view of the detection system of Figure 9; and Figure 12B is a perspective view of the detection system of Figure 9.
  • a coherent acoustic source 12 such as a coherent acoustic emitter emits a single coherent acoustic beam 14.
  • the coherent acoustic source 12 can be manually moved or can be moved with source actuator 16 that is in mechanical communication with the coherent acoustic source 12.
  • the source actuator 16 is preferably controlled by a processor 18, which is under control of a memory 19, which has instructions thereon for instructing the processor 18 to actuate the source actuator 16.
  • the coherent acoustic source 12 provides a coherent acoustic beam 14 with a beam frequency between and including about 0.5 megahertz and about 100 megahertz for obtaining information including one or more of density, temperature, composition, elasticity, or strain field in a mammalian body.
  • the coherent acoustic beam 14 has a large cross-sectional area, typically on the order of a centimeter or a few centimeters.
  • the coherent acoustic beam 14 is directed to a cone shaped reflector 22 and then to a focusing (convergent) mirror or lens 24 where it is reflected by the curved surface and focused into a convergent beam 30 that is transmitted into a fluid or amorphous medium 32.
  • the focusing mirror 24 pivots under control of a focusing mirror actuator 26, which is under control of the processor 18, which in turn is controlled by the memory 19, which has instructions thereon for instructing the processor 18 to actuate the actuator 26.
  • the cone shaped reflector 22 is under control of an actuator 28 that moves it towards and away from the acoustic emitter (source) 12.
  • the actuator 28 is under control of the processor 18, which in turn is controlled by the memory 19, which has instructions thereon for instructing the processor 18 to actuate the actuator 28.
  • the fluid or amorphous medium 32 is within the body and is, for example, but not limited to, urine in the bladder, fatty tissue, brain tissue, peritoneum fluid, stomach fluid, the gall bladder and the spleen.
  • the convergent beam 30 converges to a point which is a virtual focused acoustic imaging source 34 at the point of cross-over.
  • the processor 18 under control of the memory 19 is configured to direct the source actuator 16 to cause the coherent acoustic source 12 to move the source 34 into the fluid or amorphous medium 32 and to move it around within the fluid or amorphous medium 32. Further, the processor 18 under control of the memory 19 is configured to move the cone shaped reflector 22 towards and away from the focusing mirror 24, thus moving the virtual source 34 towards and away from the acoustic emitter 12, again positioning the virtual source 34 in the fluid or amorphous medium 32.
  • the virtual source 34 is located outside of the object being imaged and inside the fluid or amorphous medium 32.
  • the source 34 transmits a plurality of beams 36 that are scattered in all directions three-dimensionally.
  • the scattered beams 36 pass out of the medium 32.
  • the plurality of beams 36 scan around outside of the medium 32.
  • the beams 36 enter into any object 10 that they encounter, then exit the object 10 as object beams 38, which are detected by an acoustic detector 40.
  • the acoustic detector 40 is aimed at the source 34 such that it can detect the object beams 38.
  • the object beams 38 are refracted as they pass through the object 10.
  • the acoustic detector 40 can move to collect object beams 38 having a range of angular directions.
  • the acoustic detector 40 moves towards and away from the object 10 in order to defocus the image created by combinations of reflected beams 37 and bypass beams 39, such that it becomes photographically visible.
  • a detector actuator 42 is in mechanical communication with the acoustic detector 40 and is under control of the processor 18 that is in electronic communication with the detector actuator 42. Again, the processor 18 receives instructions from the memory 19.
  • the object beams 38 also contain information about the object 10. The information carried by the object beams 38 is analyzed to determine its amplitude and phase according to techniques known in the art. The phase information of the object beam 38 provides information on the object's temperature, composition, magnetic field or electrostatic field and amplitude measurements provide information on the opaqueness or density of the object.
  • a spatial filter 46 reduces the noise from unwanted scattered beams 36 and is located in front of the acoustic detector 40.
  • a couplant 60 is used between the patient and each of the coherent acoustic source 12 and the acoustic detector 40.
  • the patient may be immersed in fluid, or the relevant part of the patient may be immersed in fluid, such that there is a fluid interface between the patient and the device 8.
  • the source 34 is moved inside the medium 32 by pivoting the focusing mirror 24 and the acoustic detector 40, or by shifting the microscope 6, or by repositioning the patient.
  • a vector network analyzer is not required as the amplitude and phase information of the emitted and received intensities is not used to produce the image. However, a better intensity image can result when the vector network analyzer is used as the temporal filter. An intensity image using Fresnel fringes will form without the temporal filter and spatial filter, but with these filters the intensity image will improve, i.e., achieve better spatial resolution, by reducing the apparent size of the virtual source.
  • the vector network analyzer is needed to measure the time difference for receiving the acoustic beam at each element in the detector. Since the path length traveled by the acoustic beam from the emitter/lens assembly to the detector is known, measuring the time with the vector network analyzer allows the speed (m/s) to be determined.
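The speed measurement described here reduces to dividing the known path length by the measured transit time. A minimal sketch with hypothetical numbers:

```python
def speed_of_sound(path_m, t_emit_s, t_arrive_s):
    """Speed (m/s) from a known emitter-to-detector path length and the
    emission/arrival times measured at one detector element."""
    return path_m / (t_arrive_s - t_emit_s)

# Hypothetical: a 15 cm path traversed in 97.4 microseconds gives about
# 1540 m/s, a typical soft-tissue sound speed.
c = speed_of_sound(0.15, 0.0, 97.4e-6)
```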
  • Figure 2A and 2B show a comparison of the beams and how they produce an image in a prior art ultrasound (2A) and in the present diffuse acoustic confocal imager (2B).
  • the coherent beam 14 strikes the object 10 and produces reflected beams 2 and scattered beams 4.
  • the reflected beams 2 produce the image (noting that a relatively large difference in refractive index between the tissue types is needed to identify the interface, and that only the interface is identified).
  • the scattered beams 4 produce noise that needs to be removed in order to produce a clean image.
  • the emitted coherent beam 14 is very flat (planar), giving spatial coherence, and constant in time, giving temporal coherence.
  • the coherent beam 14 is focused to a convergent beam 30 which converges to a point, the virtual source 34, which emits scattered beams 36.
  • the virtual source 34 can be moved around to allow the scattered beams 36 to hit the object 10 from many spots, distances and angles.
  • the scattered beams 36 do one of miss the object 10, hit the edge of the object and reflect off the edge or pass through the object 10.
  • Those scattered beams that are reflected off the object are referred to as reflected beams 37. Those that pass through the object are referred to as the object beams 38. Those that miss the object but overlap with the reflected beams 37 are called bypass beams 39.
  • the object beams 38 can be considered for diagnostic purposes of the object 10.
  • the emission time of the beam is measured and then the arrival time of the beam at each pixel in the image is measured. Any differences in the speed of sound across the image can be used to diagnose structures in the image.
  • the spatial coherent interference between the reflected beams 37 and the bypass beams 39 in the overlap is used to create the image.
  • the image of the object 10 can be considered as an inline hologram for diagnostic purposes of the object 10. More specifically, the image is created using the principle of Fresnel diffraction.
  • the bypass beam 39 overlaps with the reflected beam 37 to produce an interference zone 141. The interference zone 141 between these two beams forms the Fresnel fringe or defocus fringe.
  • the retention of coherence in the bypass beam 39 and the reflected beam 37 is required for forming an interference fringe or Fresnel fringe in the image.
  • Some intensity in the defocus or Fresnel fringe also comes from the refracted object beams 38 exiting the object 10 and interfering with the bypass beam 39, in the same manner in which the bypass beam and the reflected beam interfere, although the refracted object beam has a less significant role in the image formation process of the Fresnel fringe.
  • the Fresnel diffraction produces a fringe (Fresnel fringe) in an acoustic image when defocusing occurs.
  • the Fresnel fringe enhances the contrast between the forms and the background and allows for the imaging of soft tissues, and the interface between different soft tissues. This includes tissues that have very little difference in refractive indexes, such as for example, but not limited to, breast tissue and milk glands in the breast tissue, lesions in tissues, and an egg in a fallopian tube.
  • the width of the overlap increases with defocus, which increases the width of the Fresnel fringe.
  • the defocus decreases to zero where the object and camera are on the same plane. In this condition, the object disappears and can't be seen because a fringe cannot be made as there is no overlap.
  • the spatial resolution is determined by the width of the Fresnel fringe.
  • the smallest width of the Fresnel fringe found in the image is the size of the virtual source.
  • the size of the virtual source is determined by the focusing ability of the emitter/lens assembly and the wavelength of the emitted beam from the emitter.
  • An image formed with a large defocus, i.e., broad fringe lines, can be processed with the processor to sharpen the features (i.e., the Fresnel fringes) in the image by applying a defocus amount, delta f, and knowing the cone angle, alpha, of the beam, such that the reduction in fringe width is delta f times 1/2 the cone angle.
  • knowing the cone angle and the change in fringe width by a known or measured defocus can be used to determine the distance or position of the object, z, in the image, enabling a 3D image to be produced since the lateral dimensions, x, y, are already measurable in the image. This is one more advantage of a transmission image as produced by the present technology versus a reflected image (ultrasound).
  • Ultrasound acquires z by a complex reflected time measurement of the reflected beam.
  • the distance between the virtual source 34 and the object 10 determines the magnification of the object. The further the virtual source 34 is from the object 10, the closer the magnification approaches one. The magnification of the object increases the closer the virtual source 34 approaches the object 10.
  • the spatial filter 46 and a temporal filter 54 (see Figure 1), which is provided by the software in the memory 19, restrict the volume of the acoustic virtual source 34 used for imaging, with the smaller the volume, the better the resolution for imaging.
  • the spatial resolution in part, is set by the size of the convergent beam 30 at the focused virtual source 34.
  • the spatial filter 46 defines a lateral x,y dimension or angular acceptance angle of the virtual source 34.
  • the spatial filter 46 can be made smaller than the virtual source 34 and is one of the factors that determines the spatial resolution. It can be the determining factor for the spatial resolution in the x,y plane.
  • the temporal filter 54 determines the z or axial spatial resolution of the virtual source 34. It determines the acceptance time for receiving the acoustic signal, i.e., the start time and stop time.
  • the fluid or amorphous medium 132 is outside the person and is contained in a container 102.
  • the container 102 is preferably adjustably attached to the device 110.
  • the components of the system are as described above.
  • the convergent beam 130 converges to a point which is a virtual focused acoustic imaging source 134 at the point of crossover.
  • the processor 18 is under control of the memory 19 which is configured to direct the processor to move the source actuator 16 to cause the coherent acoustic source 12 to move the virtual source 134 into the fluid or amorphous medium 132 and to move it around within the fluid or amorphous medium 132.
  • the virtual source 134 transmits a plurality of beams 136 that are scattered in all directions three-dimensionally.
  • These scattered beams 136 pass out of the medium 132.
  • the plurality of scattered beams 136 scan around outside of the medium 132.
  • the scattered beams 136 enter into any object 10 that they encounter and pass through the object 10 and are detected as object beams 138 by an acoustic detector 140.
  • Other scattered beams 136 are reflected off the object 10 and are called reflected beams 137.
  • the scattered beams 136 that miss the object but overlap with the reflected beams 137 are called bypass beams 139.
  • the acoustic detector 140 is focused on the source 134 and is located such that it can detect the plurality of scattered beams 136 that are proximate the object 10.
  • the source 134 is moved within the medium 132 as needed to get a complete image of the object 10.
  • the acoustic detector 140 can move to collect scattered beams 136 having a range of angular directions as indicated in Figure 1.
  • a detector actuator 142 is in mechanical communication with the acoustic detector 140 and is under control of a processor 144 that is in electronic communication with the detector actuator 142.
  • the coherent interference between the reflected beams 137 and the bypass beams 139 in the overlap (interference zone 141) is used to create the image.
  • the image of the object 10 can be considered as an inline hologram for diagnostic purposes of the object 10.
  • the information carried by the object beams 138 can also be analyzed by the processor 18 to determine its amplitude and phase according to techniques known in the art. The amplitude and phase are compared with the amplitude and phase of a bypass beam 139.
  • the source 134 is moved inside the medium 132 by pivoting the focusing (convergent) mirror 124 and the acoustic detector 140, by moving the device 110, or by repositioning the patient.
  • the object 10 is, for example, a disease, a mass, a tumour, a growth or the like.
  • the coherent interference between the reflected beams 137 and the bypass beams 139 in the overlap (interference zone 141) is used to create the image of the object.
  • the Fresnel fringe is what creates the image and is caused by defocusing, as described above.
  • a device 200 for imaging the ovary is provided.
  • the acoustic emitter 202 is located at one end of the device and a focusing (convergent) mirror 204 is located at or proximate the other end of the device 200.
  • An acoustic detector 206 is aligned with the ovary.
  • the detector 206 may be a linear array detector.
  • a schematic is shown in Figure 5.
  • a coherent beam 210 is emitted by the acoustic emitter 202 and is focused by the focusing mirror 204 into a convergent beam 211 that results in a virtual source 212, which in turn produces scattered beams 214 in three dimensions.
  • the acoustic detector 206 detects object beams 216, which are the beams that pass through the object 217, which in this case is the ovary.
  • the acoustic detector 206 also detects the Fresnel fringe 218 which is produced by bypass beams 220 and reflected beams 222, as described above.
  • the virtual source 212 is positioned in the bladder 213 and is moved around so that the scattered beams 214 scan around until the ovary is found.
  • the scattered beams 214 then reflect and become reflected beams 222, bypass the ovary to be bypass beams 220 and pass through the ovary to become object beams 216.
  • a device 300 for imaging both ovaries is provided.
  • the acoustic emitter 302 is centrally located in a vertical housing 304 which also houses the focusing (convergent) mirror 306.
  • the device 300 has two arms, a right arm 308 and a left arm 310. Proximate the ends 312 of the arms 308, 310 are located a right detector 314 and a left detector 316, respectively.
  • the arms 308, 310 include articulating segments 320 that are under control of mechanical manipulators 322.
  • a right detector actuator 324 and a left detector actuator 326 are in mechanical communication with the acoustic detectors 314, 316 and are under control of a processor 344 that is in electronic communication with the detector actuators 324, 326.
  • a handheld device 400 for imaging the breast includes an acoustic emitter 402, a focusing (convergent) mirror 404 and an acoustic detector 406 arranged as described above. It is manually controlled and has a Bluetooth radio 408 for communicating with a processor 410.
  • a system including a handheld emitter device 500 can be used to image the brain.
  • the detector or detectors 502 are placed as needed and can be retained in position simply by the patient leaning their head on the detectors.
  • the virtual source 504 is focused on the fatty tissue of the brain and the scattered beams 506 are emitted in three dimensions from the virtual source 504.
  • the beams that go through the object 510 are object beams 514. Again, the image is created by defocusing the interference zone 508 between the reflected beams 510 and the bypass beams 512.
  • a device generally referred to as 600 has a concave curved, cone-shaped reflector 610 that replaces the cone shaped reflector and has a conical mirror 612 that replaces the convergent mirror.
  • the concave curved, cone-shaped reflector 610 is the focusing lens/mirror.
  • the acoustic emitter 614 is located at the base 615 of the interior 617 of the conical mirror 612 and is aligned with the concave curved, cone-shaped reflector 610.
  • a support base 616 and support members 618 retain the concave curved, cone-shaped reflector 610 in the interior 617 of the conical mirror 612.
  • a detector 620 may also be retained by the support base 616.
  • a housing 622 retains the conical mirror 612.
  • the device 600 includes a virtual aperture 624 that is produced by two features of the device 600. First, there is essentially a low-pass filter produced by the concave-curved cone-shaped reflector reflecting the planar coherent beam 640's intensity from the optic axis 648 (see Figure 11) to the edge of the concave-curved cone-shaped reflector 610, i.e., its maximum diameter. The acoustic intensity that has a diameter greater than the diameter of the concave-curved cone-shaped reflector 610 doesn't get reflected by it and passes straight through the device 600. It has an outer limit of passage, though, that is determined by the radial distance to the edge of the conical mirror 612.
  • the other blockage of intensity reaching the virtual source 646 position is the cut-off of the reflecting intensity coming from the conical mirror 612 to the virtual source 646 position by the bottom 632 (see Figure 10A) of the concave-curved cone-shaped reflector 610.
  • FIG. 10A-C The path of the acoustic beams from the acoustic emitter 614 to the conical mirror 612 is shown in Figures 10A-C.
  • the coherent beam 640 is emitted from the acoustic emitter 614 and strikes the concave curved, cone shaped reflector 610.
  • Figure 10A is a thin slice through the concave curved, cone shaped reflector 610 and the conical mirror 612. As can be seen in the slice, the side of the conical mirror 612 is flat, while the side of the concave curved, cone shaped reflector 610 is curved.
  • the coherent beams 640 reflect off the concave curved, cone shaped reflector 610.
  • the converging beams 642 strike the flat surface 626 of the conical mirror 612.
  • the converging beams 642 from the slice of Figure 10A hit the flat wall of the conical mirror 612 in one plane.
  • the converging beams 642 from the slice of Figure 10A form a ring 628 on the flat surface 626 of the conical mirror 612.
  • the coherent beams 640 will strike the concave surface of the concave curved, cone shaped reflector 610, 360 degrees around the concave curved, cone shaped reflector 610, and from the top 630 to the bottom 632 of the concave curved, cone shaped reflector 610.
  • the path of the beams is also shown in Figure 11.
  • the coherent beam 640 is emitted from the acoustic emitter 614 and strikes the concave curved, cone shaped reflector 610.
  • the coherent beams 640 reflect off the concave curved, cone shaped reflector 610 and converge, thus these are the converging beams 642.
  • the converging beams 642 strike the flat surface 626 of the conical mirror 612 and are reflected as convergent beams 644 which converge to a point which is a virtual focused acoustic imaging source 646 at the point of cross-over.
  • the virtual source 646 is at or near the optic axis 648 of the conical mirrored surface 612. Note that the curvature of the concave curved, cone shaped reflector 610 is selected such that the virtual source 646 is on or near the optic axis 648.
  • the acoustic detector 620 can be placed on the end of the concave-curved cone-shaped reflector 610 or anywhere else around the virtual source 646 that points towards the virtual source 646. Without being bound to theory, the design of Figures 9-
  • the concave curved, cone shaped reflector 610 and the curved, focusing (convergent) mirror provide the same functionality; however, the concave curved, cone shaped reflector is easier to manufacture than the curved, focusing (convergent) mirror; the concave curved, cone shaped reflector 610 can easily be replaced with another concave curved, cone shaped reflector 610 having different dimensions and curvature, which allows for changes in the focus depth of the virtual source 646; the concave curved, cone shaped reflector 610 can be moved forward and backwards on the optic axis 648 to change the focus depth of the virtual source 646; and the design has increased spatial resolution and a smaller virtual source size.
  • the concave-curved cone-shaped reflector 610 reflects the converging beam 642 to the conical mirror 612, where the surface of the conical mirror 612 is at a consistent angle with regard to the converging beam 642 as the concave-curved cone-shaped reflector 610 moves back and forth in the conical mirror 612. As noted, this can move the probe/virtual source 646 back and forth.
  • This design increases the capability of the device by giving it an added degree of freedom for focusing the beam and thus moving the virtual source within the body.
  • FIG. 12A An exemplary device is shown in Figures 12A and 12B.
  • the acoustic emitter 614 is at one end of the housing 622.
  • the concave curved, cone-shaped reflector is located in the interior 617 of the housing 622 and is retained by support members 618.
  • the conical mirrored surface 612 is on the inside of the housing 622.
  • the acoustic detector 620 is at the other end of the housing 622.
  • the device 600 may be hand-held and user adjusted and actuated, or it may be under control of a processor and memory in a computing device as described above. Regardless, the computing device includes a user interface on which the image is displayed.
  • the method involves the acoustic emitter emitting a coherent beam which is reflected and focused or focused and reflected to provide a point which is virtual source in an amorphous material or fluid.
  • the virtual source is moved around so that it emits scattered beams of which some pass through an object, some bypass the object and some reflect off the object.
  • the bypass beams and the reflected beams interfere with one another in an interference zone to provide an image which can be seen as a Fresnel fringe by defocusing the image.
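The defocus geometry described in the bullets above (the reduction in fringe width equals the defocus, delta f, times half the cone angle, alpha, and the relation can be inverted to recover the axial position z) can be sketched numerically. This is a minimal illustration only; the function names and sample values are assumptions, not part of the patent disclosure.

```python
# Sketch of the Fresnel-fringe defocus geometry described above.
# All names and numeric values are illustrative assumptions.

def fringe_width_change(delta_f, alpha):
    """Change in Fresnel-fringe width (m) for a defocus change delta_f (m)
    and a beam cone angle alpha (rad): delta_w = delta_f * alpha / 2."""
    return delta_f * alpha / 2.0

def axial_position(delta_w, alpha):
    """Invert the relation to estimate the axial distance z of the object
    from a measured fringe-width change delta_w (m)."""
    return 2.0 * delta_w / alpha

alpha = 0.2      # cone angle in radians (assumed)
delta_f = 1e-3   # 1 mm of applied defocus (assumed)

dw = fringe_width_change(delta_f, alpha)  # 1e-4 m, i.e. 0.1 mm narrower fringe
z = axial_position(dw, alpha)             # recovers the 1 mm defocus distance
print(dw, z)
```

Together with the lateral x, y coordinates read directly from the image, a z value recovered this way is what enables the three-dimensional image described above.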

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Acoustics & Sound (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)

Abstract

A method and a device for imaging an object in a first material having a different optical density to the object. The method comprises: focusing an acoustic coherent beam to a virtual acoustic source in a fluid or an amorphous second material outside of the first material; moving the virtual acoustic source in the fluid or amorphous material such that at least a plurality of scattered beams from the virtual acoustic source scan the first material and at least one scattered beam is reflected from the object to form a reflected beam and at least one scattered beam bypasses the object to form a bypass beam, and wherein the reflected beam and the bypass beam intercept one another to form a coherent interference zone; and defocusing the coherent interference zone to provide a Fresnel fringe, the Fresnel fringe forming an image of the object.

Description

NON-INVASIVE DIFFUSE ACOUSTIC CONFOCAL THREE-DIMENSIONAL IMAGING
FIELD
The present technology is a method of non-invasively scanning a body or body part to obtain a three-dimensional image, in which size, location and shape are determined. The method further allows a determination of the state of the body or body part. An apparatus for scanning is also provided.
BACKGROUND
The use of beams of radiation to obtain information about an object by detecting the amplitude or phase of the beam is well known for scientific and medical purposes. For example, the phase information of a beam that passes through an object can provide information on the object's temperature, composition, magnetic field or electrostatic field, whereas amplitude measurements can provide information on the opaqueness or density of the object. The beams are comprised of waves of radiation, where a wave, F, can be described as having both an amplitude, A, and phase, Q, described mathematically as,
F = A exp(iQ) (1)
The information obtained from the method depends on whether it is detecting the amplitude or both the amplitude and phase of a beam's wave. If the method measures only a beam's amplitude, as is the case for X-ray, only density differences in the object are reported. This is a limitation of the technology as it does not provide information such as an object's temperature, composition, elasticity, strain field, magnetic or electrostatic fields. An additional disadvantage of a number of imaging techniques such as X-ray imaging methods is the strength of radiation employed. When used in diagnosis, the levels employed may have the potential to damage cells in the body.
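The distinction drawn above can be illustrated numerically: a complex wave per Equation (1) carries both amplitude and phase, and an amplitude-only detector records just the magnitude, discarding the phase. The values below are arbitrary illustrations.

```python
import cmath

# A wave F = A*exp(i*Q) per Eq. (1), with illustrative amplitude and phase.
A_in, Q_in = 2.0, 0.75
F = A_in * cmath.exp(1j * Q_in)

# Both quantities are recoverable from the full complex wave...
A_out, Q_out = abs(F), cmath.phase(F)

# ...but an amplitude-only measurement (e.g. X-ray) records only |F|,
# losing Q and hence temperature/composition/field information.
print(A_out, Q_out)  # 2.0 0.75
```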
Acoustic microscopes, including ultrasound, are now widely used to image the inside of the body, such as the fetus in the womb and blood flow in arteries and veins. These microscopes emit a parallel acoustic beam that is reflected as a diffuse and scattered beam when it contacts surfaces such as bones and interfaces such as the interface between the embryonic fluid and the fetus. The reflected beams are used to measure the intensity of the acoustic beam. These microscopes cannot measure the intensity and phase of the beam passing through or reflected from soft tissue such as muscles or embryonic fluid. These microscopes also cannot measure temperature or composition as they only use the intensity of the acoustic beams and not the phase of the acoustic beams. Hence the images are not suitable for providing information other than that information that pertains to surfaces or interfaces. A further deficiency of these microscopes is that the image produced has a significant amount of background intensity caused by the diffuse scattering of beams after striking the surface or interface. Taking as an example a prostate gland, an ultrasound image poorly identifies the interface between the prostate and other tissue and can also identify the urethra; however, it cannot identify any abnormalities within the prostate.
Another method that measures a beam's amplitude is confocal microscopy. Confocal scanning laser microscopes were developed in the 1980s for seeing three-dimensional objects. Confocal scanning laser microscopy uses a laser beam passing through an object to create a three- dimensional amplitude image of the object by detecting the amplitude of the beam through a pinhole aperture placed confocal with a point on a focal plane of the object.
Confocal microscopes have now found widespread applications in the materials, biological, and medical sciences. As a diagnostic tool, confocal microscopes are limited to detecting only thin tissue and the density differences of objects, which produce amplitude differences of the detected beam. The beams cannot penetrate far in to tissues and other materials. They do not measure the object's phase information. Hence, confocal microscopes cannot measure an object's composition or temperature.
If the method measures changes in the phase of a beam, then information can be provided about the object's temperature and composition. Acoustic beams can be used for this. The phase of acoustic beams are modified by an object's refractive index, where the refractive index is dependent on the object's temperature and composition and is a measure of the acoustic beam's speed of sound.
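As a rough numerical sketch of this dependence (all values below are assumed purely for illustration): the phase accumulated over a path of length L at frequency f in a medium with speed of sound c is 2*pi*f*L/c, so a change in composition or temperature that alters c produces a measurable phase shift.

```python
import math

# Phase accumulated by an acoustic wave of frequency f_hz over path_m metres
# in a medium with speed of sound c_m_per_s. Illustrative sketch only.
def acoustic_phase(f_hz, path_m, c_m_per_s):
    return 2.0 * math.pi * f_hz * path_m / c_m_per_s

f = 2.0e6          # 2 MHz probe frequency (assumed)
L = 0.05           # 5 cm path through the medium (assumed)
c_water = 1480.0   # m/s, approximate speed of sound in water
c_tissue = 1540.0  # m/s, a typical soft-tissue value

# A ~60 m/s change in speed of sound yields many radians of phase shift,
# which is readily detectable.
dphi = acoustic_phase(f, L, c_water) - acoustic_phase(f, L, c_tissue)
print(dphi)
```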
The absolute phase of an object can be measured using a Confocal Scanning Holography Microscope, as described in US Patent No. 7,639,365. This approach cannot be used to image the inside of the human body as laser beams do not readily pass through the human body.
The relative phase of an object can be measured using an Acoustic Confocal Interferometry Microscope, as described in US Patent No. 8,485,034. This approach requires an interference beam and a complex arrangement of mirrors and prisms and is not suitable for imaging the inside of the human body because of the geometric constraints.
Standard interferometry microscopes, standard holography microscopes, and standard holographic interferometry microscopes have been used to measure both the phase and the amplitude of objects, giving important information of objects such as their density, composition and temperature. These microscopes create a three-dimensional amplitude image and phase image of the object by measuring both the phase and the amplitude. As they are light microscopes the three-dimensional information measured from these microscopes comes only from the surface of the object and not at points within the object. In all cases, a reference beam and an object beam are used to collect data that results in the creation of the images. This limits the use of these microscopes to collecting data from or about surfaces of objects. In medical diagnosis they would therefore be potentially useful for diseases of the skin, but not for diseases of internal tissues or organs.
Other means able to measure the amplitude and phase of objects using an acoustic beam is spatially-filtered transmission ultrasound phase imaging as disclosed in US Patent Nos. 6,679,846, 6,436,046, 6,132,375 and 6,193,663. Spatially-filtered transmission ultrasound phase imaging involves measuring the amplitude and phase of an emitted beam and then again measuring the amplitude and phase of the acoustic beam after it passes through the object upon its arrival at a detector. The difference in amplitude and phase is attributed to the object. From the sound source, the beam diffusely scatters outward leading to background scatter that is not wanted. Within or around this background scatter will be the image of interest. That image is representative of the interfaces of the object being imaged. It does not represent a three dimensional image, nor can it locate diseased tissue within the tissue or organ of interest. Similarly, in materials, it cannot provide a three dimensional image nor can it show a different material within the material or a region having different physical characteristics within the material, unless there is an interface, such as the interface between a liquid and a solid.
United States Patent Application 20040059265 discloses noninvasively focusing acoustical energy on a mass such as a tumor within tissue to reduce or eliminate the mass. The presence of the mass in the tissue is detected by applying acoustic energy to the substance. The mass is localized to determine its position. Temporal signatures are developed to drive the acoustical energy on the mass. Dynamic focusing of the acoustical energy on the mass to reduce or eliminate it is accomplished utilizing the temporal signatures. Imaging of the mass is done using standard ultrasound imaging techniques (use of multiple parallel beams from an acoustic source outside of the body of the patient that are reflected off the object of interest in the body to provide intensity information [this reflected signal is referred to as a virtual source, but is not a source]); then either modeling or time reversal is used for driving acoustical energy which is then focused on the mass to treat the mass. The deficiencies in this method include the fact that it is based on models and assumptions, the imaging of the mass is not better than the images obtained with ultrasound, as that is how they image, and the information attainable using the method is limited. Also, as the three-dimensional position cannot be determined, when used to treat a mass, such a treatment is reliant on the models and assumptions or time reversal being correct.
It would be advantageous to provide a device, system and method that can detect both the amplitude and phase of a beam. Such a device, system and method would be able to provide information on the object's density, temperature, composition, elasticity, strain field, magnetic or electrostatic fields. This is of great significance in the medical field, as being able to obtain information on density, temperature, and composition allows one to potentially diagnose, treat and assess effectiveness of treatments for diseases such as cancer. Ideally, the device would be suitable for being hand-held, with a variety of different shaped detector holders for application to different parts of the body, for example, but not limited to the prostate, breast, head, and skin. It would be advantageous if the same beam used to detect and provide a three-dimensional image of the object could be used to treat the object.
Examples of an application where the measurement of temperature and composition is important include medical diagnostics aimed at understanding the function of organs, tissue and diseased regions in the body. Presently medical researchers do not have good means to non-invasively measure the internal temperature and composition of the body. It is an object of the present technology to provide such capabilities.
SUMMARY
The present technology provides a method of creating three-dimensional images of objects of one optical density in another object of a different optical density that is far superior to ultrasound methods. A coherent beam is focused into a virtual source outside of the objects and within a fluid or an amorphous material. A bypass beam and a reflected beam overlap to create an interference zone. This interference zone is defocused to provide the three-dimensional image.
In one embodiment, a method of imaging an object in a first material having a different optical density to the object is provided, the method comprising: focusing an acoustic coherent beam to a virtual acoustic source in a fluid or an amorphous second material outside of the first material; moving the virtual acoustic source in the fluid or amorphous material such that at least a plurality of scattered beams from the virtual acoustic source scan the first material and at least one scattered beam is reflected from the object to form a reflected beam and at least one scattered beam bypasses the object to form a bypass beam, and wherein the reflected beam and the bypass beam intercept one another to form a coherent interference zone; and defocusing the coherent interference zone to provide a Fresnel fringe, the Fresnel fringe forming an image of the object. The method may further comprise an acoustic detector detecting the image of the object.
The method may further comprise moving the virtual acoustic source in the fluid or amorphous material such that at least one scattered beam passes through the object to become an object beam and is detected by the acoustic detector.
The method may further comprise comparing the phase of the object beam with the phase of the bypass beam to provide information about the object.
The method may further comprise comparing the amplitude of the object beam with the amplitude of the bypass beam to provide information about the object.
In the method the information from the phase may be the temperature, composition, magnetic field or electrostatic field of the object and the information from the amplitude may be the optical density of the object.
In the method, the speed of sound of the object may be determined.
In the method, the speed of sound of the object may be used to identify the object.
In the method, the object may be identified as a tumour or lesion in the first material.
In the method, the acoustic coherent beam may be focused in a fluid or amorphous second material in the body of a patient.
In the method, the acoustic coherent beam may be focused in one of urine in the bladder, fatty tissue, brain tissue, peritoneum fluid, stomach fluid, the gall bladder and the spleen.
In the method the acoustic coherent beam may be focused in a fluid outside of the body of a patient in which at least the part of the body of interest is immersed.
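The speed-of-sound determination referred to in the method above reduces to a simple time-of-flight calculation once the emitter-to-detector path length is known. The sketch below is illustrative only: the path length, times, and the tissue reference ranges are assumed values, not figures from the disclosure.

```python
# Time-of-flight speed estimate: the path length is known from the
# emitter/lens-assembly-to-detector geometry, and the emission and
# arrival times are measured (e.g. by a vector network analyzer).
def speed_of_sound(path_length_m, emit_time_s, arrival_time_s):
    return path_length_m / (arrival_time_s - emit_time_s)

# Hypothetical reference ranges (m/s) for labelling what the beam traversed.
REFERENCE = {"fat": (1420.0, 1480.0), "soft tissue": (1480.0, 1600.0)}

def classify(c):
    for name, (lo, hi) in REFERENCE.items():
        if lo <= c < hi:
            return name
    return "unknown"

c = speed_of_sound(0.12, 0.0, 80e-6)  # 0.12 m traveled in 80 microseconds
print(c, classify(c))                  # 1500.0 soft tissue
```

Mapping the measured speed pixel-by-pixel across the image is what allows a region with an anomalous speed of sound, such as a tumour or lesion, to be distinguished from the surrounding first material.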
In another embodiment, a system for imaging an object in a first material of different optical density to the object is provided, the system comprising: a coherent acoustic beam source which emits a coherent acoustic beam; a focuser positioned to focus the coherent acoustic beam to a virtual acoustic imaging source, the virtual acoustic imaging source emitting scattered beams; a coherent acoustic beam source actuator in mechanical communication with the coherent acoustic beam source; a focuser actuator in mechanical communication with the focuser; a processor in electronic communication with the coherent acoustic beam source actuator and the focuser actuator; a memory in communication with the processor and having instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser such that at least some of the scattered beams are reflected off the object to be reflected beams and pass by the object to be bypass beams and such that the reflected beams and the bypass beams form an interference zone, the memory further configured to move the coherent acoustic beam and the focuser to produce a Fresnel fringe in the interference zone; and an acoustic detector positioned to image the Fresnel fringe.
In the system, the memory may include instructions for the processor to sharpen the image.
The system may further comprise a spatial filter in front of the acoustic detector.
The system may further comprise a cone reflector, the cone reflector between the coherent acoustic beam source and the focuser.
The system may further comprise a pair of articulating arms, each with a distal end, and a second acoustic detector, each acoustic detector mounted on an arm, proximate the distal end.
The system may be sized for imaging ovaries.
In yet another embodiment, a system for imaging an object in a first material of different optical density to the object is provided, the system comprising:
-an apparatus including: a coherent acoustic beam source which emits a coherent acoustic beam; a focuser positioned to focus the coherent acoustic beam to a virtual acoustic imaging source, the virtual acoustic imaging source emitting scattered beams; a coherent acoustic beam source actuator in mechanical communication with the coherent acoustic beam source; a focuser actuator in mechanical communication with the focuser; and an acoustic detector; and
-a computing device including a processor, a user interface and a memory, the processor in electronic communication with the acoustic detector, the memory in communication with the processor and having instructions thereon to instruct the processor to display an image on the user interface.
In the system, the memory may include instructions for the processor to sharpen the image.
The system may further comprise a spatial filter in front of the acoustic detector.
In the system, the focuser may be a concave curved cone shaped reflector.
The system may further comprise a conical mirrored surface between the concave curved cone shaped reflector and the acoustic detector.
In the system, the processor may be in electronic communication with the coherent acoustic beam source actuator and the focuser actuator and the memory may be in communication with the processor and has instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser such that at least some of the scattered beams are reflected off the object to be reflected beams and pass by the object to be bypass beams and such that the reflected beams and the bypass beams form an interference zone, the memory further configured to move the coherent acoustic beam source and the focuser to produce a Fresnel fringe in the interference zone and the acoustic detector may be positioned to image the Fresnel fringe.
FIGURES
Figure 1 is a schematic of the operation of a diffuse acoustic confocal three-dimensional imaging device of the present technology.
Figure 2A is a schematic showing the beams used in a prior art ultrasound imager; Figure 2B is a schematic showing the beams used in the present technology.
Figure 3 is an embodiment of a detection system of the present technology.
Figure 4 is an embodiment for imaging an ovary using the present technology.
Figure 5 is a schematic of the operation of the diffuse acoustic confocal three-dimensional imaging device for scanning an ovary using the embodiment of Figure 4.
Figure 6 is an embodiment for imaging both ovaries using the present technology.
Figure 7 is an embodiment for imaging a breast using the present technology.
Figure 8 is an embodiment for imaging the brain using the present technology.
Figure 9 is an embodiment of the detection system of the present technology.
Figure 10A is a schematic of a slice through the detection system of Figure 9; Figure 10B is a schematic of a top view of a slice through the conical mirror; and Figure 10C is a schematic looking into the conical mirror.
Figure 11 is a schematic showing the path of the beams in the embodiment of Figure 9.
Figure 12 A is a side view of the detection system of Figure 9; and Figure 12B is a perspective view of the detection system of Figure 9.
DESCRIPTION
Except as otherwise expressly provided, the following rules of interpretation apply to this specification (written description and claims): (a) all words used herein shall be construed to be of such gender or number (singular or plural) as the circumstances require; (b) the singular terms "a", "an", and "the", as used in the specification and the appended claims include plural references unless the context clearly dictates otherwise; (c) the antecedent term "about" applied to a recited range or value denotes an approximation within the deviation in the range or value known or expected in the art from the measurement method; (d) the words "herein", "hereby", "hereof", "hereto", "hereinbefore", and "hereinafter", and words of similar import, refer to this specification in its entirety and not to any particular paragraph, claim or other subdivision, unless otherwise specified; (e) descriptive headings are for convenience only and shall not control or affect the meaning or construction of any part of the specification; and (f) "or" and "any" are not exclusive and "include" and "including" are not limiting. Further, the terms "comprising," "having," "including," and "containing" are to be construed as open-ended terms (i.e., meaning "including, but not limited to,") unless otherwise noted. Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. Where a specific range of values is provided, it is understood that each intervening value, to the tenth of the unit of the lower limit unless the context clearly dictates otherwise, between the upper and lower limit of that range and any other stated or intervening value in that stated range, is included therein. All smaller sub-ranges are also included.
The upper and lower limits of these smaller ranges are also included therein, subject to any specifically excluded limit in the stated range.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the relevant art. Although any methods and materials similar or equivalent to those described herein can also be used, the acceptable methods and materials are now described.
An overview of the system, generally referred to as 8, for imaging a tissue, an organ, or a body part (an object), generally referred to as 10, is shown schematically in Figure 1. It creates and uses the scattered beams that are considered to be noise (speckle noise). A coherent acoustic source 12 such as a coherent acoustic emitter emits a single coherent acoustic beam 14. The coherent acoustic source 12 can be manually moved or can be moved with source actuator 16 that is in mechanical communication with the coherent acoustic source 12. The source actuator 16 is preferably controlled by a processor 18, which is under control of a memory 19, which has instructions thereon for instructing the processor 18 to actuate the source actuator 16. The coherent acoustic source 12 provides a coherent acoustic beam 14 with a beam frequency between and including about 0.5 megahertz and about 100 megahertz for obtaining information including one or more of density, temperature, composition, elasticity, or strain field in a mammalian body.
The coherent acoustic beam 14 has a large cross-sectional area, typically on the order of a centimeter or a few centimeters. The coherent acoustic beam 14 is directed to a cone shaped reflector 22 and then to a focusing (convergent) mirror or lens 24 where it is reflected by the curved surface and focused into a convergent beam 30 that is transmitted into a fluid or amorphous medium 32. The focusing mirror 24 pivots under control of a focusing mirror actuator 26, which is under control of the processor 18, which in turn is controlled by the memory 19, which has instructions thereon for instructing the processor 18 to actuate the actuator 26. The cone shaped reflector 22 is under control of an actuator 28 that moves it towards and away from the acoustic emitter (source) 12. The actuator 28 is under control of the processor 18, which in turn is controlled by the memory 19, which has instructions thereon for instructing the processor 18 to actuate the actuator 28. In one embodiment, the fluid or amorphous medium 32 is within the body and is, for example, but not limited to, urine in the bladder, fatty tissue, brain tissue, peritoneum fluid, stomach fluid, the gall bladder and the spleen. The convergent beam 30 converges to a point which is a virtual focused acoustic imaging source 34 at the point of cross-over. The processor 18 under control of the memory 19 is configured to direct the source actuator 16 to cause the coherent acoustic source 12 to move the source 34 into the fluid or amorphous medium 32 and to move it around within the fluid or amorphous medium 32. Further, the processor 18 under control of the memory 19 is configured to move the cone shaped reflector 22 towards and away from the focusing mirror 24, thus moving the virtual source 34 towards and away from the acoustic emitter 12, again positioning the virtual source 34 in the fluid or amorphous medium 32.
The virtual source 34 is located outside of the object being imaged and inside the fluid or amorphous medium 32. The source 34 transmits a plurality of beams 36 that are scattered in all directions three-dimensionally. The scattered beams 36 pass out of the medium 32. By moving the source 34 in the fluid or amorphous medium 32, the plurality of beams 36 scan around outside of the medium 32. The beams 36 enter into any object 10 that they encounter, then out of the object 10 as object beams 38, which are detected by an acoustic detector 40. The acoustic detector 40 is aimed at the source 34 such that it can detect the object beams 38. The object beams 38 are refracted as they pass through the object 10. The acoustic detector 40 can move to collect object beams 38 having a range of angular directions. The acoustic detector 40 moves towards and away from the object 10 in order to defocus the image created by combinations of reflected beams 37 and bypass beams 39, such that it becomes photographically visible. A detector actuator 42 is in mechanical communication with the acoustic detector 40 and is under control of the processor 18 that is in electronic communication with the detector actuator 42. Again, the processor 18 receives instructions from the memory 19. The object beams 38 also contain information about the object 10. The information carried by the object beams 38 is analyzed to determine its amplitude and phase according to techniques known in the art. The phase information of the object beam 38 provides information on the object's temperature, composition, magnetic field or electrostatic field and amplitude measurements provide information on the opaqueness or density of the object. A spatial filter 46 reduces the noise from unwanted scattered beams 36 and is located in front of the acoustic detector 40.
A couplant 60 is used between the patient and each of the coherent acoustic source 12 and the acoustic detector 40. Alternatively, the patient may be immersed in fluid, or the relevant part of the patient may be immersed in fluid, such that there is a fluid interface between the patient and the device 8. In order for the object 10 to be observed, the source 34 is moved inside the medium 32 by pivoting the focusing mirror 24 and the acoustic detector 40, by shifting the device 8, or by repositioning the patient. A vector network analyzer is not required, as the amplitude and phase information of the emitted and received intensities is not used to produce the image. However, a better intensity image can result using the vector network analyzer for the temporal filter. An intensity image using Fresnel fringes will form without using the temporal filter and spatial filter, but using these filters the intensity image will improve (i.e., better spatial resolution) by reducing the apparent size of the virtual source.
For phase or speed of sound imaging, the vector network analyzer is needed to measure the time difference for receiving the acoustic beam at each element in the detector. Since the path length traveled by the acoustic beam from the emitter/lens assembly to the detector is known, by measuring the travel time using the vector network analyzer, the speed (m/s) can be determined.
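The speed-of-sound determination described above reduces to dividing a known path length by a measured travel time. A minimal illustrative sketch follows; the function names and the example numbers are assumptions for illustration only and are not part of the disclosed apparatus.

```python
# Illustrative sketch: speed of sound per detector element, assuming the
# geometric path length from the emitter/lens assembly to each element is
# known and the travel time is measured (e.g., by a vector network analyzer).

def speed_of_sound(path_length_m, emit_time_s, arrival_time_s):
    """Speed (m/s) = known path length / measured travel time."""
    travel_time = arrival_time_s - emit_time_s
    return path_length_m / travel_time

# Example: a 0.15 m path traversed in 100 microseconds gives 1500 m/s,
# typical of soft tissue or water.
v = speed_of_sound(0.15, 0.0, 100e-6)
```

Differences in the computed speed across the detector elements would then map to differences in tissue along each path, as described for the speed-of-sound image below.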
Figures 2A and 2B show a comparison of the beams and how they produce an image in a prior art ultrasound (2A) and in the present diffuse acoustic confocal imager (2B). The coherent beam 14 strikes the object 10 and produces reflected beams 2 and scattered beams 4. The reflected beams 2 produce the image (noting that a relatively large difference in refractive index between the tissue types is needed to identify the interface, and that only the interface is identified). The scattered beams 4 produce noise that needs to be removed in order to produce a clean image.
Using the diffuse acoustic confocal imager, the emitted coherent beam 14 is very flat (planar) (spatial coherence) and constant (temporal coherence). The coherent beam 14 is focused to a convergent beam 30 which converges to a point, the virtual source 34, which emits scattered beams 36. The virtual source 34 can be moved around to allow the scattered beams 36 to hit the object 10 from many spots, distances and angles. The scattered beams 36 do one of three things: miss the object 10, hit the edge of the object and reflect off the edge, or pass through the object 10.
Those that are reflected off the object are referred to as reflected beams 37. Those that pass through the object are referred to as the object beams 38. Those that miss the object but overlap with the reflected beams 37 are called bypass beams 39.
The object beams 38 can be considered for diagnostic purposes of the object 10. To form a speed of sound image using temporal coherence, the emission time of the beam is measured and then the arrival time of the beam at each pixel in the image is measured. Any differences in the speed of sound across the image can be used to diagnose structures in the image.
The spatial coherent interference between the reflected beams 37 and the bypass beams 39 in the overlap is used to create the image. The image of the object 10 can be considered as an inline hologram for diagnostic purposes of the object 10. More specifically, the image is created using the principle of Fresnel diffraction. The bypass beam 39 overlaps with the reflected beam 37 to produce an interference zone 141. The interference zone 141 between these two beams forms the Fresnel fringe or defocus fringe. The retention of coherence in the bypass beam 39 and the reflected beam 37 is required for forming an interference fringe or Fresnel fringe in the image. Some intensity in the defocus or Fresnel fringe also comes from the refracted object beams 38 exiting the object 10 and interfering with the bypass beam 39 in the same manner in which the bypass beam and the reflected beam interfere, although the refracted object beam has a less significant role in the image formation process of the Fresnel fringe. The Fresnel diffraction produces a fringe (Fresnel fringe) in an acoustic image when defocusing occurs. The Fresnel fringe enhances the contrast between the forms and the background and allows for the imaging of soft tissues, and the interface between different soft tissues. This includes tissues that have very little difference in refractive index, such as, for example, but not limited to, breast tissue and milk glands in the breast tissue, lesions in tissues, and an egg in a fallopian tube.
The width of the overlap increases with defocus, which increases the width of the Fresnel fringe. The defocus decreases to zero where the object and camera are on the same plane. In this condition, the object disappears and can't be seen because a fringe cannot be made, as there is no overlap.
Additionally, the spatial resolution is determined by the width of the Fresnel fringe. The smallest width of the Fresnel fringe found in the image is the size of the virtual source. The size of the virtual source is determined by the focusing ability of the emitter/lens assembly and the wavelength of the emitted beam from the emitter. The resolution can approach the wavelength of the acoustic beam, which for a 50 MHz acoustic beam approaches the size of a cell (lambda = speed/frequency = 1500 m/s / 50,000,000 per second = 0.00003 m = 30 microns). This is much higher resolution than ultrasound.
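The resolution estimate above is the standard wavelength relation lambda = speed/frequency; a one-line sketch makes the arithmetic explicit (the function name and example values are illustrative only):

```python
# Sketch of the resolution limit stated in the text: the acoustic
# wavelength, lambda = speed / frequency, bounds the achievable
# spatial resolution of the Fresnel-fringe image.

def wavelength_m(speed_m_s, frequency_hz):
    return speed_m_s / frequency_hz

# 50 MHz beam in soft tissue (~1500 m/s) -> 3e-5 m = 30 microns.
lam = wavelength_m(1500.0, 50e6)
```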
An image formed with a large defocus, i.e., broad fringe lines, can be processed with the processor to sharpen the features (i.e., the Fresnel fringes) in the image by applying a defocus amount, delta f, and knowing the cone angle, alpha, of the beam, such that the reduction in fringe width is delta f times 1/2 the cone angle. Likewise, knowing the cone angle and the change in fringe width by a known or measured defocus can be used to determine the distance or position of the object, z, in the image, enabling a 3D image to be produced since the lateral dimensions, x, y, are already measurable in the image. This is one more advantage of a transmission image as produced by the present technology versus a reflected image (ultrasound). Ultrasound acquires z by a complex time measurement of the reflected beam. The distance between the virtual source 34 and the object 10 determines the magnification of the object. The further the virtual source 34 is from the object 10, the closer the magnification approaches one. The magnification of the object increases the closer the virtual source 34 approaches the object 10.
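The defocus relation stated above — fringe-width change equal to the defocus delta f times half the cone angle alpha — and its inversion to recover the axial position z can be sketched as follows. This is an illustrative sketch only, assuming the small-angle form given in the text (alpha in radians); the function names and example numbers are assumptions, not part of the disclosure.

```python
# Illustrative sketch of the defocus/fringe-width relations in the text.

def fringe_width_change(delta_f, alpha):
    """Fringe-width change produced by a defocus delta_f,
    per the stated relation: delta_f * (alpha / 2)."""
    return delta_f * alpha / 2.0

def depth_from_fringe(delta_width, alpha):
    """Invert the relation: recover the defocus (axial position z)
    from a measured change in fringe width."""
    return 2.0 * delta_width / alpha

# Round trip: a 2 mm defocus with a 0.2 rad cone angle gives a
# 0.2 mm fringe-width change; inverting recovers the 2 mm.
dw = fringe_width_change(2e-3, 0.2)
z = depth_from_fringe(dw, 0.2)
```

Combined with the lateral x, y positions measurable directly in the image, such a z estimate is what would enable the 3D reconstruction described.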
The spatial filter 46 and a temporal filter 54 (see Figure 1), which is provided by the software in the memory 19, restrict the volume of the acoustic virtual source 34 used for imaging; the smaller the volume, the better the resolution for imaging. The spatial resolution, in part, is set by the size of the convergent beam 30 at the focused virtual source 34. The spatial filter 46 defines a lateral x,y dimension or angular acceptance angle of the virtual source 34. The spatial filter 46 can be made smaller than the virtual source 34 and is one of the factors that determines the spatial resolution. It can be the determining factor for the spatial resolution in the x,y plane. The temporal filter 54 determines the z or axial spatial resolution of the virtual source 34. It determines the acceptance time for receiving the acoustic signal: start time and stop time.
In an embodiment shown in Figure 3, the fluid or amorphous medium 132 is outside the person and is contained in a container 102. The container 102 is preferably adjustably attached to the device 110. The components of the system are as described above. The convergent beam 130 converges to a point which is a virtual focused acoustic imaging source 134 at the point of crossover. The processor 18 is under control of the memory 19 which is configured to direct the processor to move the source actuator 16 to cause the coherent acoustic source 12 to move the virtual source 134 into the fluid or amorphous medium 132 and to move it around within the fluid or amorphous medium 132. The virtual source 134 transmits a plurality of beams 136 that are scattered in all directions three-dimensionally. These scattered beams 136 pass out of the medium 132. By moving the virtual source 134 in the fluid or amorphous medium 132, the plurality of scattered beams 136 scan around outside of the medium 132. Depending on the angle of incidence, the scattered beams 136 enter into any object 10 that they encounter and pass through the object 10 and are detected as object beams 138 by an acoustic detector 140. Other scattered beams 136 are reflected off the object 10 and are called reflected beams 137. The scattered beams 136 that miss the object but overlap with the reflected beams 137 are called bypass beams 139. The acoustic detector 140 is focused on the source 134 and is located such that it can detect the plurality of scattered beams 136 that are proximate the object 10. The source 134 is moved within the medium 132 as needed to get a complete image of the object 10. The acoustic detector 140 can move to collect scattered beams 136 having a range of angular directions as indicated in Figure 1. A detector actuator 142 is in mechanical communication with the acoustic detector 140 and is under control of a processor 144 that is in electronic communication with the detector actuator 142.
The coherent interference between the reflected beams 137 and the bypass beams 139 in the overlap (interference zone 141) is used to create the image. The image of the object 10 can be considered as an inline hologram for diagnostic purposes of the object 10. The information carried by the object beams 138 can also be analyzed by the processor 18 to determine its amplitude and phase according to techniques known in the art. The amplitude and phase are compared with the amplitude and phase of a bypass beam 139.
In order for the object 10 to be observed, the source 134 is moved inside the medium 132 by pivoting the focusing (convergent) mirror 124 and the acoustic detector 140, by moving the device 110, or by repositioning the patient. If the object 10 is, for example, a disease, a mass, a tumour, a growth or the like, the coherent interference between the reflected beams 137 and the bypass beams 139 in the overlap (interference zone 141) is used to create the image of the object. The Fresnel fringe is what creates the image and is caused by defocusing, as described above.
As shown in Figure 4, in one embodiment, a device 200 for imaging the ovary is provided. The acoustic emitter 202 is located at one end of the device and a focusing (convergent) mirror 204 is located at or proximate the other end of the device 200. An acoustic detector 206 is aligned with the ovary. The detector 206 may be a linear array detector. A schematic is shown in Figure 5. As described above, a coherent beam 210 is emitted by the acoustic emitter 202 and is focused by the focusing mirror 204 into a convergent beam 211 that results in a virtual source 212, which in turn produces scattered beams 214 in three dimensions. The acoustic detector 206 detects object beams 216, which are the beams that pass through the object 217, which in this case is the ovary. The acoustic detector 206 also detects the Fresnel fringe 218 which is produced by bypass beams 220 and reflected beams 222, as described above. The virtual source 212 is positioned in the bladder 213 and is moved around so that the scattered beams 214 scan around until the ovary is found. The scattered beams 214 then reflect and become reflected beams 222, bypass the ovary to be bypass beams 220 and pass through the ovary to become object beams 216.
As shown in Figure 6, a device 300 for imaging both ovaries is provided. The acoustic emitter 302 is centrally located in a vertical housing 304 which also houses the focusing (convergent) mirror 306. The device 300 has two arms, a right arm 308 and a left arm 310. Proximate the ends 312 of the arms 308, 310 are located a right detector 314 and a left detector 316, respectively. The arms 308, 310 include articulating segments 320 that are under control of mechanical manipulators 322. As a further means of detecting the beams, a right detector actuator 324 and a left detector actuator 326 are in mechanical communication with the acoustic detectors 314, 316 and are under control of a processor 344 that is in electronic communication with the detector actuators 324, 326.
As shown in Figure 7, a handheld device 400 for imaging the breast is provided. It includes an acoustic emitter 402, a focusing (convergent) mirror 404 and an acoustic detector 406 arranged as described above. It is manually controlled and has a Bluetooth radio 408 for communicating with a processor 410.
As shown in Figure 8, a system including a handheld emitter device 500 can be used to image the brain. The detector or detectors 502 are placed as needed and can be retained in position simply by the patient leaning their head on the detectors. The virtual source 504 is focused on the fatty tissue of the brain and the scattered beams 506 are emitted in three dimensions from the virtual source 504. The beams that go through the object 510 are object beams 514. Again, the image is created by defocusing the interference zone 508 between the reflected beams 510 and the bypass beams 512.

As shown in Figure 9, in one embodiment, a device generally referred to as 600 has a concave curved, cone-shaped reflector 610 that replaces the cone shaped reflector and has a conical mirror 612 that replaces the convergent mirror. Functionally, the concave curved, cone-shaped reflector 610 is the focusing lens/mirror. The acoustic emitter 614 is located at the base 615 of the interior 617 of the conical mirror 612 and is aligned with the concave curved, cone-shaped reflector 610. A support base 616 and support members 618 retain the concave curved, cone-shaped reflector 610 in the interior 617 of the conical mirror 612. A detector 620 may also be retained by the support base 616. A housing 622 retains the conical mirror 612. The device 600 includes a virtual aperture 624 that is produced by two features of the device 600. First there is essentially a low-pass filter produced by the concave-curved cone-shaped reflector reflecting the planar (coherent) beam 640's intensity from the optic axis 648 (see Figure 11) to the edge of the concave-curved cone-shaped reflector 610, i.e., its maximum diameter. The acoustic intensity that has a diameter greater than the diameter of the concave-curved cone-shaped reflector 610 doesn't get reflected by it and passes straight through the device 600.
It has an outer limit of passage, though, that is determined by the radial distance to the edge of the conical mirror 612, i.e., the hole made by the conical mirror 612 from which the acoustic beam must pass in order to focus to the virtual source 646 position. The other blockage of intensity reaching the virtual source 646 position is the cut-off of the reflecting intensity coming from the conical mirror 612 to the virtual source 646 position by the bottom 632 (see Figure 10A) of the concave-curved cone-shaped reflector 610.
The path of the acoustic beams from the acoustic emitter 614 to the conical mirror 612 is shown in Figures 10A-C. The coherent beam 640 is emitted from the acoustic emitter 614 and strikes the concave curved, cone shaped reflector 610. Figure 10A is a thin slice through the concave curved, cone shaped reflector 610 and the conical mirror 612. As can be seen in the slice, the side of the conical mirror 612 is flat, while the side of the concave curved, cone shaped reflector 610 is concave. The coherent beams 640 reflect off the concave curved, cone shaped reflector 610 and converge, thus producing the converging beams 642. The converging beams 642 strike the flat surface 626 of the conical mirror 612. As shown in Figure 10B, the converging beams 642 from the slice of Figure 10A hit the flat wall of the conical mirror 612 in one plane. As shown in Figure 10C, the converging beams 642 from the slice of Figure 10A form a ring 628 on the flat surface 626 of the conical mirror 612. As would be known to one skilled in the art, the coherent beams 640 will strike the concave surface of the concave curved, cone shaped reflector 610 360 degrees around the concave curved, cone shaped reflector 610 and from the top 630 to the bottom 632 of the concave curved, cone shaped reflector 610.
The path of the beams is also shown in Figure 11. The coherent beam 640 is emitted from the acoustic emitter 614 and strikes the concave curved, cone shaped reflector 610. The coherent beams 640 reflect off the concave curved, cone shaped reflector 610 and converge, thus these are the converging beams 642. The converging beams 642 strike the flat surface 626 of the conical mirror 612 and are reflected as convergent beams 644 which converge to a point which is a virtual focused acoustic imaging source 646 at the point of cross-over. The virtual source 646 is at or near the optic axis 648 of the conical mirrored surface 612. Note that the curvature of the concave curved, cone shaped reflector 610 is selected such that the virtual source 646 is on or near the optic axis 648.
From the virtual source 646 the acoustic beam diffusely scatters in all directions as described above. The acoustic detector 620 can be placed on the end of the concave-curved cone-shaped reflector 610 or anywhere else around the virtual source 646 that points towards the virtual source 646. Without being bound to theory, the design of Figures 9-12B has advantages as follows: the concave curved, cone shaped reflector 610 and the curved, focusing (convergent) mirror provide the same functionality; however, the concave curved cone shaped reflector is easier to manufacture than is the curved, focusing (convergent) mirror; the concave curved, cone shaped reflector 610 can easily be replaced with another concave curved, cone shaped reflector 610 having different dimensions and curvature, which allows for changes in the focus depth of the virtual source 646; the concave curved, cone shaped reflector 610 can be moved forwards and backwards on the optic axis 648 to change the focus depth of the virtual source 646; and the design has increased spatial resolution and a smaller virtual source size. The concave-curved cone-shaped reflector 610 reflects the converging beam 642 to the conical mirror 612, where the surface of the conical mirror 612 is at a consistent angle with regard to the converging beam 642 as the concave-curved cone-shaped reflector 610 moves back and forth in the conical mirror 612. As noted, this can move the probe/virtual source 646 back and forth. This design increases the capability of the device by giving it an added degree of freedom for focusing the beam and thus moving the virtual source within the body.
An exemplary device is shown in Figures 12A and 12B. As shown in Figure 12A, the acoustic emitter 614 is at one end of the housing 622. As shown in Figure 12B, the concave curved, cone-shaped reflector is located in the interior 617 of the housing 622 and is retained by support members 618. The conical mirrored surface 612 is on the inside of the housing 622. The acoustic emitter 614 is at the other end of the housing 622. The device 600 may be hand-held and user adjusted and actuated, or it may be under control of a processor and memory in a computing device as described above. Regardless, the computing device includes a user interface on which the image is displayed.
In all embodiments, the method involves the acoustic emitter emitting a coherent beam which is reflected and focused, or focused and reflected, to provide a point which is the virtual source in an amorphous material or fluid. The virtual source is moved around so that it emits scattered beams, of which some pass through an object, some bypass the object and some reflect off the object. The bypass beams and the reflected beams interfere with one another in an interference zone to provide an image which can be seen as a Fresnel fringe, by defocusing the image.
While example embodiments have been described in connection with what is presently considered to be an example of a possible most practical and/or suitable embodiment, it is to be understood that the descriptions are not to be limited to the disclosed embodiments, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the example embodiment. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific example embodiments specifically described herein. Such equivalents are intended to be encompassed in the scope of the claims, if appended hereto or subsequently filed.

Claims

1. A method of imaging an object in a first material having a different optical density to the object, the method comprising: focusing an acoustic coherent beam to a virtual acoustic source in a fluid or an amorphous second material outside of the first material; moving the virtual acoustic source in the fluid or amorphous material such that at least a plurality of scattered beams from the virtual acoustic source scan the first material and at least one scattered beam is reflected from the object to form a reflected beam and at least one scattered beam bypasses the object to form a bypass beam, and wherein the reflected beam and the bypass beam intercept one another to form a coherent interference zone; and defocusing the coherent interference zone to provide a Fresnel fringe, the Fresnel fringe forming an image of the object.
2. The method of claim 1, further comprising an acoustic detector detecting the image of the object.
3. The method of claim 2, further comprising moving the virtual acoustic source in the fluid or amorphous material such that at least one scattered beam passes through the object to become an object beam and is detected by the acoustic detector.
4. The method of claim 3, further comprising comparing the phase of the object beam with the phase of the bypass beam to provide information about the object.
5. The method of claim 4, further comprising comparing the amplitude of the object beam with the amplitude of the bypass beam to provide information about the object.
6. The method of claim 5, wherein the information from the phase is the temperature, composition, magnetic field or electrostatic field of the object and the information from the amplitude is the optical density of the object.
7. The method of claim 6, wherein the speed of sound of the object is determined.
8. The method of claim 7, wherein the speed of sound of the object is used to identify the object.
9. The method of claim 1, wherein the object is identified as a tumour or lesion in the first material.
10. The method of claim 9, wherein the acoustic coherent beam is focused in a fluid or amorphous second material in the body of a patient.
11. The method of claim 10, wherein the acoustic coherent beam is focused in one of urine in the bladder, fatty tissue, brain tissue, peritoneum fluid, stomach fluid, the gall bladder and the spleen.
12. The method of claim 9, wherein the acoustic coherent beam is focused in a fluid outside of the body of a patient in which at least the part of the body of interest is immersed.
13. A system for imaging an object in a first material of different optical density to the object, the system comprising: a coherent acoustic beam source which emits a coherent acoustic beam; a focuser positioned to focus the coherent acoustic beam to a virtual acoustic imaging source, the virtual acoustic imaging source emitting scattered beams; a coherent acoustic beam source actuator in mechanical communication with the coherent acoustic beam source; a focuser actuator in mechanical communication with the focuser; a processor in electronic communication with the coherent acoustic beam source actuator and the focuser actuator; a memory in communication with the processor and having instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser such that at least some of the scattered beams are reflected off the object to be reflected beams and pass by the object to be bypass beams and such that the reflected beams and the bypass beams form an interference zone, the memory further having instructions to move the coherent acoustic beam source and the focuser to produce a Fresnel fringe in the interference zone; and an acoustic detector positioned to image the Fresnel fringe.
14. The system of claim 13, wherein the memory includes instructions for the processor to sharpen the image.
15. The system of claim 14, further comprising a spatial filter in front of the acoustic detector.
16. The system of claim 15, further comprising a cone reflector, the cone reflector between the coherent acoustic beam source and the focuser.
17. The system of claim 16, further comprising a pair of articulating arms, each with a distal end, and a second acoustic detector, each acoustic detector mounted on an arm, proximate the distal end.
18. The system of claim 17, sized for imaging ovaries.
19. A system for imaging an object in a first material of different optical density to the object, the system comprising:
-an apparatus including: a coherent acoustic beam source which emits a coherent acoustic beam; a focuser positioned to focus the coherent acoustic beam to a virtual acoustic imaging source, the virtual acoustic imaging source emitting scattered beams; a coherent acoustic beam source actuator in mechanical communication with the coherent acoustic beam source; a focuser actuator in mechanical communication with the focuser; and an acoustic detector; and
-a computing device including a processor, a user interface and a memory, the processor in electronic communication with the acoustic detector, the memory in communication with the processor and having instructions thereon to instruct the processor to display an image on the user interface.
20. The system of claim 19, wherein the memory includes instructions for the processor to sharpen the image.
21. The system of claim 20, further comprising a spatial filter in front of the acoustic detector.
22. The system of claim 21, wherein the focuser is a concave curved cone shaped reflector.
23. The system of claim 22, further comprising a conical mirrored surface between the concave curved cone shaped reflector and the acoustic detector.
24. The system of any one of claims 19 to 23, wherein the processor is in electronic communication with the coherent acoustic beam source actuator and the focuser actuator and the memory is in communication with the processor and has instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser such that at least some of the scattered beams are reflected off the object to be reflected beams and pass by the object to be bypass beams and such that the reflected beams and the bypass beams form an interference zone, the memory further having instructions to move the coherent acoustic beam source and the focuser to produce a Fresnel fringe in the interference zone, and the acoustic detector is positioned to image the Fresnel fringe.
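The processor, memory and actuator arrangement recited in the system claims amounts to a scan loop: step the beam-source and focuser actuators over a grid of virtual-source positions and record a detector frame at each position. The sketch below is a minimal illustration of that logic only; all class, method and parameter names are hypothetical, since the patent does not specify a software interface.

```python
from dataclasses import dataclass, field

@dataclass
class ScanController:
    """Illustrative sketch of the claimed processor/memory behaviour:
    move the coherent-beam-source actuator and the focuser actuator over
    a grid of virtual-source positions, recording a detector frame at
    each position so the Fresnel fringe in the interference zone can be
    imaged.  Names are hypothetical, not from the patent."""
    positions: list = field(default_factory=list)
    frames: list = field(default_factory=list)

    def scan(self, grid, move_source, move_focuser, read_detector):
        for depth, lateral in grid:
            move_source(lateral)      # coherent acoustic beam source actuator
            move_focuser(depth)       # focuser actuator sets virtual-source depth
            self.positions.append((depth, lateral))
            self.frames.append(read_detector())
        return self.frames

# Usage with stand-in hardware callbacks:
controller = ScanController()
grid = [(d, l) for d in (10, 20) for l in (0, 1)]
frames = controller.scan(grid, lambda l: None, lambda d: None, lambda: "frame")
```

In a real system the three callbacks would drive the two actuators and read the acoustic detector; here they are stubs so the loop structure can be shown in isolation.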
EP19897458.6A 2018-12-11 2019-12-06 Non-invasive diffuse acoustic confocal three-dimensional imaging Withdrawn EP3894892A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/216,938 US20190117189A1 (en) 2015-01-15 2018-12-11 Non-invasive diffuse acoustic confocal three-dimensional imaging
PCT/CA2019/000164 WO2020118406A1 (en) 2018-12-11 2019-12-06 Non-invasive diffuse acoustic confocal three-dimensional imaging

Publications (2)

Publication Number Publication Date
EP3894892A1 true EP3894892A1 (en) 2021-10-20
EP3894892A4 EP3894892A4 (en) 2022-05-25

Family

ID=71075900

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19897458.6A Withdrawn EP3894892A4 (en) 2018-12-11 2019-12-06 Non-invasive diffuse acoustic confocal three-dimensional imaging

Country Status (3)

Country Link
EP (1) EP3894892A4 (en)
JP (1) JP7510697B2 (en)
WO (1) WO2020118406A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4367648A (en) * 1980-03-25 1983-01-11 National Research Development Corporation Dark field viewing apparatus
US5535751A (en) * 1994-12-22 1996-07-16 Morphometrix Technologies Inc. Confocal ultrasonic imaging system
US6450960B1 (en) * 2000-08-29 2002-09-17 Barbara Ann Karmanos Cancer Institute Real-time three-dimensional acoustoelectronic imaging and characterization of objects
US20040059265A1 (en) 2002-09-12 2004-03-25 The Regents Of The University Of California Dynamic acoustic focusing utilizing time reversal
US8485034B2 (en) 2009-12-14 2013-07-16 Rodney Arthur Herring Acoustic confocal interferometry microscope
CN107530046B (en) 2015-01-15 2021-07-13 罗德尼·赫林 Diffuse acoustic confocal imager

Also Published As

Publication number Publication date
WO2020118406A1 (en) 2020-06-18
JP7510697B2 (en) 2024-07-04
EP3894892A4 (en) 2022-05-25
JP2022513899A (en) 2022-02-09

Similar Documents

Publication Publication Date Title
US9964747B2 (en) Imaging system and method for imaging an object
JP6643251B2 (en) Device and method for photoacoustic imaging of objects
AU2020257073B2 (en) Diffuse acoustic confocal imager
US20150201902A1 (en) Dual-modality endoscope, method of manufacture, and use thereof
JP6478572B2 (en) SUBJECT INFORMATION ACQUISITION DEVICE AND ACOUSTIC WAVE DEVICE CONTROL METHOD
US9880381B2 (en) Varifocal lens, optical scanning probe including the varifocal lens, and medical apparatus including the optical scanning probe
KR20150053630A (en) Probe and medical imaging apparatus employing the same
JP6472437B2 (en) Photoacoustic apparatus and acoustic wave receiving apparatus
JP2017003587A (en) Device and method for hybrid optoacoustic tomography and ultrasonography
JP6742745B2 (en) Information acquisition device and display method
JP2017038917A (en) Subject information acquisition device
US6702747B2 (en) Acoustically generated images having selected components
JP7510697B2 (en) Non-invasive diffuse acoustic confocal 3D imaging
US20190117189A1 (en) Non-invasive diffuse acoustic confocal three-dimensional imaging
JP2020039809A (en) Subject information acquisition device and control method therefor
WO2022243714A1 (en) Depth-surface imaging device for registering ultrasound images to each other and to surface images by using surface information
JP6679327B2 (en) Ultrasonic device
JP2020110362A (en) Information processing device, information processing method, and program
JP2015211708A (en) Object information acquisition device

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210709

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: G01S0015890000

Ipc: G01S0007520000

A4 Supplementary search report drawn up and despatched

Effective date: 20220425

RIC1 Information provided on ipc code assigned before grant

Ipc: G01N 29/06 20060101ALI20220419BHEP

Ipc: A61B 8/08 20060101ALI20220419BHEP

Ipc: G01S 15/89 20060101ALI20220419BHEP

Ipc: G01S 7/52 20060101AFI20220419BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20230701