US20090060311A1 - Systems and methods for processing x-ray images
- Publication number
- US20090060311A1 (application US12/182,932)
- Authority
- US
- United States
- Prior art keywords
- image
- images
- fluoroscopic
- input
- target region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1049—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4429—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
- A61B6/4435—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
- A61B6/4441—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/467—Arrangements for interfacing with the operator or the patient characterised by special input means
- A61B6/469—Arrangements for interfacing with the operator or the patient characterised by special input means for selecting a region of interest [ROI]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/541—Control of apparatus or devices for radiation diagnosis involving acquisition triggered by a physiological signal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7285—Specific aspects of physiological measurement analysis for synchronising or triggering a physiological measurement or image acquisition with a physiological event or waveform, e.g. an ECG signal
- A61B5/7289—Retrospective gating, i.e. associating measured signals or images with a physiological event after the actual measurement or image acquisition, e.g. by simultaneously recording an additional physiological signal during the measurement or image acquisition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5288—Devices using data or image processing specially adapted for radiation diagnosis involving retrospective matching to a physiological signal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1049—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
- A61N2005/1061—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam using an x-ray imaging system having a separate imaging source
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/103—Treatment planning systems
- A61N5/1037—Treatment planning systems taking into account the movement of the target, e.g. 4D-image based planning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1064—Monitoring, verifying, controlling systems and methods for adjusting radiation treatment in response to monitoring
Definitions
- the field of the invention relates to methods and systems for processing images, and more particularly, to methods and systems for processing x-ray images.
- Radiation therapy involves medical procedures that selectively expose certain areas of a human body, such as cancerous tumors, to doses of radiation.
- the purpose of the radiation therapy is to irradiate the targeted biological tissue such that undesirable tissue is destroyed. Radiation has also been used to obtain images of tissue for diagnostic or treatment purposes.
- the position and movement of a target tissue can be monitored by an imaging system, such as a fluoroscopic imaging system, while radiation is delivered to the target tissue.
- soft tissue targets such as a variety of tumors are not visible in x-ray fluoroscopic images. This is because structures in front of or behind the target tissue are also visible in the x-ray images, increasing the clutter to the level that the target tissue cannot be distinguished.
- Radio-opaque markers have been used to aid physicians in identifying a target tissue under fluoroscopic imaging.
- the radio-opaque markers can be injected or implanted at desired sites within a patient, and they show up as high contrast features in fluoroscopic images. By observing the positions of the internal radio-opaque markers in fluoroscopic images, a physician can determine a position of a target tissue.
- implantation of markers is intrusive to the patient, and it may not be practical or feasible in all cases.
- a method includes obtaining a first image of an object, obtaining a second image of an object, determining a level of correlation between the first and second images, and using the determined level of correlation between the first and second images to obtain information regarding a motion of the object.
- a computer product having a set of instructions, an execution of which causes a process to be performed, the process including obtaining a first image of an object, obtaining a second image of an object, determining a level of correlation between the first and second images, and using the determined level of correlation between the first and second images to obtain information regarding a motion of the object.
- a system includes a processor that is configured for obtaining a first image of an object, obtaining a second image of an object, determining a level of correlation between the first and second images, and using the determined level of correlation between the first and second images to obtain information regarding a motion of the object.
- FIG. 1 illustrates a fluoroscopic imaging system with which embodiments of the present invention may be implemented
- FIG. 2 is a flowchart showing a process for targeting an object in accordance with an embodiment of the invention
- FIG. 3 shows an algorithm for processing images in accordance with an embodiment of the invention
- FIG. 4 shows examples of images generated during a treatment or diagnostic session performed in accordance with the process of FIG. 2 ;
- FIG. 5 is a flowchart showing a process for generating templates that may be used in the process of FIG. 2 ;
- FIG. 6 shows examples of images generated at different stages of the template generation process
- FIG. 7 is a block diagram showing a system for processing images in accordance with an embodiment of the invention.
- FIG. 8 shows a motion signal chart and a gating signal chart
- FIG. 9 shows a motion signal chart, a phase chart, and a gating signal chart
- FIG. 10 is a flowchart showing a process for gating a medical procedure in accordance with an embodiment of the invention.
- FIG. 11 is a flowchart showing a process for monitoring a patient's position in accordance with an embodiment of the invention.
- FIG. 12 is a diagram of a computer hardware system with which embodiments of the present invention can be implemented.
- FIG. 1 illustrates a fluoroscopic system 10 with which embodiments of the present invention may be implemented.
- the system 10 includes a fluoroscope 12 , a processor 14 , and a work station 16 having a display 18 and a user interface 20 , such as a keyboard and/or a mouse.
- the processor 14 may be an integral component of the work station 16 , or alternatively, a separate component that is connected to the work station 16 .
- the fluoroscope 12 is illustrated as a C-arm fluoroscope in which an x-ray source 22 is mounted on a structural member or C-arm 24 opposite to an imaging assembly 26 , which is configured to receive and detect x-rays emitted from the x-ray source 22 .
- the C-arm 24 is capable of moving about a patient for producing two dimensional projection images of the patient from different angles.
- a patient 30 is positioned between the x-ray source 22 and the imaging assembly 26 .
- an x-ray beam 32 is then directed towards a target region 34 within the patient 30 , and is attenuated as it passes through the patient 30 .
- the imaging assembly 26 receives the attenuated x-ray beam 32 , and generates electrical signals in response thereto.
- the electrical signals are transmitted to the processor 14 , which is configured to generate images in the display 18 based on the electrical signals in accordance with an embodiment of the present invention.
- another radiation source 28 may be positioned adjacent the fluoroscopic system 10 for delivering treatment radiation 29 to the target region 34 . Similar imaging systems or other types of imaging systems may also be used to implement embodiments of the present invention.
- FIG. 2 is a block diagram illustrating an embodiment of a process 200 for tracking a position of the target region 34 of the patient 30 as the target region 34 is being imaged using the fluoroscopic system 10 of FIG. 1 .
- the target region 34 may include a tissue, such as a lung tissue or a heart tissue, that undergoes periodic physiological movements.
- the target region 34 may also include tissue that does not undergo periodic physiological movements, such as a bone tissue or prostate.
- the processor 14 processes the fluoroscopic image to enhance a feature, such as a moving feature of an object, in the fluoroscopic image (Step 206 ).
- the enhancement of the moving feature may be performed based on image averaging and image subtraction techniques.
- the sixth input fluoroscopic image IRFI₆ may be enhanced or modified by performing image averaging on the previous five input fluoroscopic images to obtain a composite image, and by subtracting the composite image from the sixth input fluoroscopic image (a composite image includes an array of data that may be stored in a medium, and is therefore not limited to a displayed image).
- the image averaging may be performed based on a weighted average prescribed as a function over time. For example, if later input fluoroscopic images are to be accounted for more in the averaging, later input fluoroscopic images may be multiplied by a higher weighted factor during the image averaging, and vice versa.
- FIG. 3 shows a recursive algorithm for enhancing a moving feature of an object in an image, in which the current input fluoroscopic image is multiplied by a weighted factor (1-a) while the previous recursive average of the input fluoroscopic image(s) is multiplied by a weighted factor (a).
- the Z⁻¹ block represents a memory that holds one frame with a one-frame time delay. This results in an exponentially decreasing weighted average for the earlier samples. Other types of weighted averaging may also be used.
- the process of enhancing a feature in the fluoroscopic image is not limited to the examples described previously, and that other modified versions of the process may also be used.
- the boxcar averaging may be performed on certain previous input fluoroscopic images (e.g., the last three images), instead of on all of the previous input fluoroscopic images.
- other functions or algorithms may be applied to any combination of the previous input fluoroscopic images and/or the current input fluoroscopic image before or after the image averaging is performed.
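The subtraction-plus-recursive-averaging enhancement described above can be sketched in a few lines. This is an illustrative reading of the technique, not the patent's implementation; the smoothing factor `a` and the array shapes are assumed values.

```python
import numpy as np

def enhance_motion(frames, a=0.8):
    """Enhance moving features by subtracting a recursive weighted
    average of earlier frames from each incoming frame.

    The running average plays the role of the Z^-1 one-frame memory:
    avg <- a * avg + (1 - a) * frame, which gives exponentially
    decreasing weights to earlier samples. `a` is illustrative.
    """
    avg = frames[0].astype(float)
    enhanced = []
    for frame in frames[1:]:
        frame = frame.astype(float)
        # Moving features differ from the average of prior frames and
        # survive the subtraction; static background cancels out.
        enhanced.append(frame - avg)
        avg = a * avg + (1.0 - a) * frame
    return enhanced
```

Setting `a` closer to 1 makes the average (and hence the suppressed background) change more slowly, which is one way to realize the weighted averaging the text mentions.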
- the processor 14 next registers the enhanced input fluoroscopic image with a template (Step 208 ).
- a sequence of templates is provided, and each of the templates contains an image of at least a portion of the target region 34 that is created at a certain time-point or a phase of a physiological cycle.
- the processor 14 selects a template from the sequence of templates that best matches an image of the target region 34 in the enhanced input fluoroscopic image.
- the construction of the templates will be described later.
- the term “phase” refers to a variable that represents, measures, or associates with, a degree of completion of a physiological cycle.
- the input fluoroscopic image is compared with the templates, and the template that best matches with an image in the input fluoroscopic image is registered or cross correlated with the input fluoroscopic image.
- the processor 14 performs an image comparison to determine which portion of the enhanced input fluoroscopic image best matches with each of the template images.
- image analysis, such as pattern matching, may be used. For example, if a template contains an image formed by 50×50 pixels, the processor 14 may perform a spatial analysis to determine a region (having 50×50 pixels) within the enhanced input fluoroscopic image that best matches the template image.
- the processor 14 then computes values representative of degrees of match between the templates and an image in the input fluoroscopic image, and selects the template associated with the highest value to be registered with the input fluoroscopic image.
- the position of the image within the input fluoroscopic image that best matches the registered template may be stored in a computer-readable medium for later use.
- each cross correlation between the enhanced input image and a template results in a 2D correlation function with a correlation peak.
- the location of the peak indicates the position of the target region 34
- the value of the peak indicates a degree of match between the input fluoroscopic image and the template.
- the template that provides the highest peak value is then selected as the matching template, and the corresponding peak position in the correlation function is used to determine the position of the target region 34 .
- the input fluoroscopic image is compared with all of the templates to determine the matching template. In another embodiment, instead of comparing the input fluoroscopic image with all of the templates, the input fluoroscopic image is compared with only a subset of templates.
- the subset of templates is selected such that their corresponding phase values (or time points of a respiration cycle at which they are generated) are centered around, or proximate to, the phase of the template that had the best match with the last input fluoroscopic image (i.e., from the last tracking cycle).
- Such technique increases the efficiency for registering the input fluoroscopic image with the template because an input fluoroscopic image and a template that are collected at the same phase or time-point of a physiological cycle are likely to have similar image contrast.
- if it is determined that a template does not match the input fluoroscopic image (i.e., the degree of match does not exceed a prescribed threshold), another template is selected to determine if it matches with an image in the input fluoroscopic image. For example, the next template or the previous template in the sequence may be selected, until a match is found.
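The registration of steps 208-210 can be sketched as a brute-force normalized cross-correlation search over each template, keeping the template with the highest correlation peak and the peak's position. This is a simplified stand-in for the 2D correlation functions described above; the patent does not prescribe this exact formulation, and a practical system would use FFT-based correlation for speed.

```python
import numpy as np

def match_template(image, template):
    """Slide `template` over `image`; return the best-match position
    (top-left corner) and a normalized cross-correlation score in
    [-1, 1]. The peak location indicates the target position and the
    peak value indicates the degree of match."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * tnorm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

def register(image, templates):
    """Pick the template whose correlation peak is highest; its peak
    position gives the target region's position."""
    results = [match_template(image, t) for t in templates]
    best = max(range(len(templates)), key=lambda i: results[i][1])
    return best, results[best]
```

Restricting `templates` to a phase-adjacent subset, as the text suggests, simply means passing fewer candidates to `register`.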
- the position of the target region 34 in the fluoroscopic image is determined (Step 210 ). Particularly, the position of the image in the input fluoroscopic image that matches with the template is the position of the target region 34 .
- a marker may be displayed in the display 18 to indicate the position of the identified target region 34 in the input fluoroscopic image. For example, a frame or an outline having a similar shape as that of the corresponding registered template may be displayed in the input fluoroscopic image.
- the phase associated with the input fluoroscopic image can be determined based on the phase of the matched template. Alternatively, the phase associated with the input fluoroscopic image can be determined by a separate tracking mechanism, such as RPM external markers, available from Varian Medical Systems, Inc., Palo Alto, Calif.
- the next real-time input fluoroscopic image is generated and the previously described process is repeated until the end of the session is reached (Step 212 ).
- the templates and the input fluoroscopic images may be generated at the same or different time intervals.
- the templates may be generated at a shorter time interval as compared to that for the input fluoroscopic images, thereby allowing more matching variations between different sets of the input fluoroscopic images and the templates.
- the steps described previously with reference to the process 200 can be carried out in substantially real-time. That is, the input fluoroscopic images can be processed to determine a position of the target region immediately or shortly after they are generated in step 204 . Alternatively, the input fluoroscopic images can be generated in a batch, time-stamped, and stored for subsequent processing. In this case, the enhancing step 206 , the registering step 208 , and the determining step 210 can be performed subsequently.
- FIG. 4 shows examples of images generated at different stages of the dynamic targeting process described previously.
- An example of an input fluoroscopic image 400 created during a phase of a respiratory cycle, and its corresponding motion enhanced fluoroscopic image 402 created using the technique described with reference to step 206 are shown.
- in the enhanced image 402 , the moving object (i.e., the lung tissue 404 ) is enhanced, while the contrast of the relatively stationary object (i.e., the bone 406 ) is reduced.
- FIG. 4 also shows a rectangular frame 408 displayed in the fluoroscopic image 402 identifying a region in the fluoroscopic image 402 that matches with the template 410 .
- the template 410 is selected from a group 412 of available templates.
- the group 412 can include all of the generated templates, or alternatively, a subset of the generated templates, as discussed previously.
- FIG. 5 shows a process 500 for generating the sequence of templates in accordance with an embodiment of the present invention.
- the x-ray source 22 of the fluoroscopic system 10 is positioned and aimed towards an area of the body that includes the target region 34 , and a plurality of reference fluoroscopic images RFI is generated using the fluoroscopic system 10 (Step 502 ).
- the position and orientation of the x-ray source 22 relative to the patient 30 may be stored for later use.
- the position and orientation of the x-ray source 22 used during the template generation session may be used to set up the x-ray source 22 for generating the input fluoroscopic images.
- the image in the input fluoroscopic image would be similar to that in the template, thereby allowing matching of the template with the input fluoroscopic image.
- the plurality of reference fluoroscopic images is preferably collected over a physiological cycle, such as a respiratory cycle or a cardiac cycle, of the moving tissue.
- 120 to 200 reference fluoroscopic images are collected over a period of 12 to 20 seconds in order to capture movements of the target region 34 during a respiratory cycle.
- the collected reference fluoroscopic images are time-stamped and are then stored in digital format in a computer readable medium, such as a hard-drive, a CD-ROM, a diskette, or a server.
- the reference fluoroscopic images are associated with phases or time-points of a physiological cycle (Step 504 ).
- the generated reference fluoroscopic images are time-stamped as they are generated in Step 502 .
- a patient position monitoring system, such as that available from Varian Medical Systems, Inc., Palo Alto, Calif., may be used to detect physiological motion of the patient and to generate motion data as the reference fluoroscopic images are generated.
- the reference fluoroscopic images are then associated with phases or time-points of a physiological cycle based on their corresponding stamped time and the motion data.
- the reference fluoroscopic images can be synchronized with the motion data to a common time line.
- the reference fluoroscopic images may also be registered in phase with three-dimensional computed tomography images generated during a planning session (described below).
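One simple way to associate time-stamped reference images with phases of a physiological cycle, given cycle-start times detected from the motion data, is linear interpolation between successive cycle starts. This is a hypothetical helper for illustration; the patent describes the association but does not prescribe a formula.

```python
import numpy as np

def phases_from_motion(timestamps, cycle_starts):
    """Assign each time-stamped image a phase in [0, 1), where 0 marks
    the start of a cycle (e.g., peak inhale). `cycle_starts` are the
    cycle-start times detected from an external motion signal
    (assumed input, e.g., from a position monitoring system)."""
    cycle_starts = np.asarray(cycle_starts, dtype=float)
    phases = []
    for t in timestamps:
        # index of the cycle containing t, clamped to the known cycles
        i = np.searchsorted(cycle_starts, t, side="right") - 1
        i = min(max(i, 0), len(cycle_starts) - 2)
        start, end = cycle_starts[i], cycle_starts[i + 1]
        phases.append(((t - start) / (end - start)) % 1.0)
    return phases
```

With phases assigned this way, a reference image and an input image acquired at similar phases can be paired, which is what makes the phase-adjacent template subset effective.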
- images of the target region 34 are identified in the respective reference fluoroscopic images.
- the images of the target region 34 may be determined manually by a user, such as a physician or a technician.
- the user examines each of the selected reference fluoroscopic images and identifies the target region 34 in each of the selected reference fluoroscopic images.
- the user may place a marker representative of the position of the target region 34 in the corresponding reference fluoroscopic image.
- the user may operate the user interface 20 and place a frame around a region of interest (ROI) containing the target region 34 in the corresponding reference fluoroscopic image.
- the user may also draw an outline around a ROI having a shape that resembles the target region 34 in the corresponding reference fluoroscopic image.
- the outline may represent a boundary of the target region 34 to which treatment may be applied.
- the image of the target region 34 in the respective reference fluoroscopic images may be determined by projecting a three-dimensional (3D) treatment volume onto the respective reference fluoroscopic images.
- 3D computed tomography (CT) images of the treatment volume are obtained such that they cover a period, such as a physiological cycle.
- CT images may be generated simultaneously with the sequence of the reference fluoroscopic images.
- the 3D CT images may be generated separately from the reference fluoroscopic images, in which case, the reference fluoroscopic images may subsequently be registered in phase with the 3D CT images.
- Conventional techniques may be employed to register the sequence of the reference fluoroscopic images with the CT images.
- the RPM Respiratory Gating System, available from Varian Medical Systems, Inc., Palo Alto, Calif., may also be used to register the reference fluoroscopic images with the CT images.
- the 3D CT images are then examined to determine the position of the target region 34 in the respective images.
- the position of the target region 34 in each of the respective CT images is projected onto the respective two-dimensional (2D) reference fluoroscopic image using known transformation techniques.
- ROIs containing images of the target region 34 can then be defined in the respective reference fluoroscopic images.
- a rectangular frame circumscribing the target region 34 may be used to define a ROI.
- an outline having a shape that resembles the target region 34 may define a ROI.
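The "known transformation techniques" for projecting a 3D target position from a CT image onto a 2D reference fluoroscopic image can be illustrated with a minimal point-source (central) projection. The geometry here, with the source at a given position and the detector plane perpendicular to the z-axis, is a simplifying assumption, not the patent's calibration model.

```python
import numpy as np

def project_to_detector(point, source, detector_dist):
    """Project a 3D point onto a 2D detector plane by central
    projection from a point source. Assumes the detector plane is
    perpendicular to the z-axis at `detector_dist` from the source."""
    p = np.asarray(point, float) - np.asarray(source, float)
    scale = detector_dist / p[2]       # similar-triangles magnification
    return p[0] * scale, p[1] * scale  # (u, v) on the detector
```

Projecting each corner of a 3D treatment volume this way yields the 2D footprint from which an ROI (e.g., a circumscribing rectangle) can be defined in the reference fluoroscopic image.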
- the reference fluoroscopic images are processed to enhance a moving object in the images (Step 508 ).
- the enhancement of a moving object may be performed using a similar technique described previously with reference to the input fluoroscopic images.
- each of the reference fluoroscopic images in the sequence is modified based on image averaging and image subtraction techniques.
- the sixth reference fluoroscopic image RFI₆ in the sequence is modified by performing image averaging on the previous five fluoroscopic images to obtain an average image, and by subtracting the average image from the sixth fluoroscopic image RFI₆.
- the image averaging may be performed based on boxcar or recursive techniques.
- the image averaging may be performed based on a weighted average prescribed as a function over time, as described previously.
- the images contained within the ROIs in the reference fluoroscopic images are stored as a sequence of templates (Step 510 ).
- the templates may be stored in a computer-readable medium, such as a hard drive, a CD-ROM, a diskette, or a server.
- the motion enhancement is performed after the ROIs are determined in the reference fluoroscopic images.
- the order of the steps of enhancing a moving object and ROI determination can be different from the process 500 .
- digitally reconstructed radiographs (DRRs) are produced from each reference 3D CT image for the direction of the fluoroscopic imaging that will be used in treatment.
- the target volume is projected in each DRR, and the DRRs are used as the reference fluoroscopic images in the same manner as the previous embodiment.
- the above-described process 500 for generating the sequence of templates may be performed in the same session (e.g., a treatment session) in which the process 200 is being performed.
- the templates may be generated in another session that is carried out separately and prior to a treatment or diagnostic session.
- FIG. 6 shows examples of images generated at different stages of the template generation process 500 described previously.
- An example of a reference fluoroscopic image 600 created during a phase of a respiratory cycle, and its corresponding motion enhanced fluoroscopic image 602 created using the technique described with reference to step 508 are shown.
- in the motion enhanced fluoroscopic image 602 , the contrast of the moving object(s) (i.e., the lung tissue 604 ) is enhanced, while the contrast of the stationary object(s) (i.e., the bone 606 ) is reduced.
- FIG. 6 shows a ROI 608 in the fluoroscopic image 602 that has been selected as a template 610 .
- the input fluoroscopic image 400 described previously with reference to FIG. 4 is similar to the reference fluoroscopic image 600 because (1) the images 400 and 600 are collected from substantially the same angle and position relative to the patient 30 , and (2) the input fluoroscopic image 400 and the reference fluoroscopic image 600 are collected at substantially the same time-point of a physiological cycle.
- FIG. 7 shows a system 700 for performing the above described processes.
- the system 700 includes a template generation module 702 and an image matching module 704 , either or both of which may be implemented using the processor 14 or a computer system.
- the template generation module 702 includes a phase association module 706 , which associates the reference images 708 with phases or time-points of a physiological cycle.
- the template generation module 702 also includes a projection module 710 that projects a four dimensional treatment plan (3D treatment plan over time) onto the selected reference images 708 , and a motion enhancement module 712 for enhancing a feature in the selected reference images 708 .
- the motion enhancement module 712 enhances a feature in the entire image for each of the selected reference images 708 .
- the motion enhancement module 712 enhances a feature in only the projected overlay on the selected reference images 708 .
- in some embodiments, the motion enhancement module 712 is optional, in which case the system 700 does not include it.
- the image matching module 704 includes a motion enhancement module 720 for enhancing a feature in the input images 722 that are generated during a treatment or diagnostic session.
- the image matching module 704 also includes a spatial and temporal matching module 724 for matching the input images 722 with the generated templates 714 . Particularly, for each of the input images 722 , the spatial and temporal matching module 724 selects a template 714 that best matches an image in the input image 722 , and generates an output 726 .
- the output 726 includes the position (X.sub.n, Y.sub.n) of the sub-image in the input image 722 that best matches the template T.sub.n, and an index n of the best-matching template T.sub.n.
- the index n may be used to determine the time-point or phase of a physiological cycle at which the input image 722 is generated.
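The spatial and temporal matching performed by module 724 can be sketched as an exhaustive normalized cross-correlation search: each template is scored at every sub-image position, and the best-scoring template index n and position (x, y) form the output 726. This brute-force Python sketch is illustrative only; a practical system would use a faster matcher, and the function and variable names are assumptions.

```python
import numpy as np

def match_templates(image, templates):
    """Return (n, (x, y)) for the template T_n with the highest normalized
    cross-correlation score over all sub-image positions of `image`."""
    best_score, best_n, best_xy = -np.inf, None, None
    H, W = image.shape
    for n, t in enumerate(templates):
        th, tw = t.shape
        tz = t - t.mean()  # zero-mean template
        for y in range(H - th + 1):
            for x in range(W - tw + 1):
                patch = image[y:y + th, x:x + tw]
                pz = patch - patch.mean()
                denom = np.sqrt((pz ** 2).sum() * (tz ** 2).sum())
                score = (pz * tz).sum() / denom if denom > 0 else 0.0
                if score > best_score:
                    best_score, best_n, best_xy = score, n, (x, y)
    return best_n, best_xy
```

The returned index n identifies the phase or time-point of the physiological cycle, and (x, y) the sub-image position, as described in the text.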
- the previously described method allows a user to determine a position of the target region 34 during a session without the use of a radio-opaque marker, and may be implemented using existing imaging systems.
- the method may be used by a physician to perform a wide range of operations or procedures.
- the position of the target region 34 obtained using the previously described process may be used as an input signal to control and aim a radiation treatment beam 29 towards the target region 34 .
- the radiation treatment beam 29 is continuously positioned to follow the target region 34 based on the positions of the target region 34 identified in the fluoroscopic images.
- the aim point of a treatment radiation beam may be controlled by a moving collimator based on data regarding the position of the target region 34 received from the processor 14 .
- a treatment couch supporting a patient can be moved to control a position of the target region 34 at which the beam 29 is directed.
- the above-described method may be used to detect a movement of the target region 34 , based on which a medical procedure may be gated.
- the radiation source 28 may be gated to be turned on or off based on the positions of the target region 34 identified in the input fluoroscopic images.
- the position of the image within the input fluoroscopic image that is registered with the corresponding template may be used to determine if the target region 34 has moved beyond a prescribed threshold position. If the target region 34 remains within the prescribed threshold position, the radiation beam 29 is turned on, and if the target region 34 has moved beyond the threshold position, the radiation beam 29 is then deactivated.
- FIG. 8 shows an example of a motion signal chart 800 and a gating signal chart 802 that is aligned with the motion signal chart 800 .
- the motion signal chart 800 may be created by using position data of the target region 34 obtained using the previously described process 200 .
- a treatment interval 804 may be defined by an upper bound 806 and a lower bound 808 , as shown in the motion signal chart 800 .
- the upper bound 806 has a value of 0.8, and the lower bound 808 has a value of −0.8.
- any position of the target region 34 that falls outside the prescribed treatment interval 804 results in a “beam off” gating signal 810 that stops the application of radiation to the patient 30 .
- Any position of the target region 34 that falls within the prescribed treatment interval 804 results in a “beam on” gating signal 812 that allows radiation to be applied to the patient 30 .
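The amplitude-based gating rule illustrated by FIG. 8 reduces to a simple interval test: positions inside the treatment interval 804 produce a "beam on" signal, positions outside produce "beam off". A minimal sketch (the ±0.8 bounds are the example values from the chart; the function name is an assumption):

```python
def gating_signal(positions, lower=-0.8, upper=0.8):
    """Map each target-region position to a gating signal according to
    the prescribed treatment interval [lower, upper]."""
    return ['beam on' if lower <= p <= upper else 'beam off'
            for p in positions]
```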
- the radiation source 28 may be gated to be turned on or off based on the phase of a physiological cycle.
- the position vs. time history of the image within the input fluoroscopic image that is registered with the corresponding template may be used to determine a phase of a physiological cycle. If the target region 34 remains within a prescribed phase interval, the radiation beam 29 is turned on, and if the target region 34 has moved beyond the prescribed phase interval, the radiation beam 29 is then deactivated.
- FIG. 9 shows an example of a motion signal chart 900 , a corresponding phase chart 902 for the target region 34 , and a gating signal chart 904 that is aligned with the phase chart 902 .
- the motion signal chart 900 may be created by using position data of the target region 34 obtained using the previously described method (i.e., at step 210 ).
- the phase chart 902 may be created based on a beginning and an end of a physiological cycle in the motion signal chart 900 .
- the phase chart 902 shows the phase progression of a physiological movement of the target region 34 over time.
- a prescribed phase interval 906 may be defined by an upper bound 908 and a lower bound 910 , which are represented as dotted lines in the phase chart 902 .
- the upper bound 908 has a value of 185° and the lower bound 910 has a value of 25°.
- any position of the target region 34 corresponding to a phase that falls outside the prescribed phase interval 906 results in a “beam off” gating signal 912 that stops the application of radiation to the patient 30 .
- Any position of the target region 34 corresponding to a phase that falls within the prescribed phase interval 906 results in a “beam on” gating signal 914 that allows radiation to be applied to the patient 30 .
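The phase-based gating of FIG. 9 can be sketched similarly: a phase is derived from the position-vs-time history by linearly mapping each point of one physiological cycle onto 0°-360°, and the beam is enabled only inside the prescribed phase interval (25° to 185° in the example). Both function names in this sketch are assumptions:

```python
def phase_of(t, t_begin, t_end):
    """Phase in degrees of time t within a cycle spanning [t_begin, t_end)."""
    return 360.0 * (t - t_begin) / (t_end - t_begin)

def phase_gate(phases_deg, lower=25.0, upper=185.0):
    """Gate on phase: 'beam on' only inside the prescribed phase interval."""
    return ['beam on' if lower <= (p % 360.0) <= upper else 'beam off'
            for p in phases_deg]
```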
- the radiation treatment beam may be gated to be turned on or off by associating the templates with treatment data.
- certain templates may be associated with a “beam on” signal, while the rest of the templates are associated with a “beam off” signal.
- templates generated within a prescribed treatment phase interval may be associated with a “beam on” signal, while templates generated outside the prescribed treatment phase interval may be associated with a “beam off” signal.
- the treatment data may also include a “beam on duration” signal.
- the templates may also be associated with treatment data that are commonly used in radiation therapy, such as beam shape data and radiation dosage data.
- a template may also contain "beam shape" data.
- the processor 14 then directs a signal to a beam-shaping (e.g., a multi-leaf) collimator to change the shape of the treatment beam 29 based on the “beam shape” data.
- values may be computed to indicate a degree of correlation between the previously generated input fluoroscopic images and their corresponding registered templates.
- the registered template for the current input fluoroscopic image is likely to be correct, and treatment may be applied in accordance with the treatment data prescribed by the corresponding registered template.
- radiation may be delivered to the patient during a desired portion of a physiological cycle, such as a quiescent period.
- quiescent periods occur during the respiratory cycle at the ends of expiration and inspiration.
- the determined position of the target region 34 can be used to detect quiescent periods of physiological cycles.
- the motion of the target region 34 slows down or may even cease for a fraction of a moment, thereby allowing a radiation treatment to be directed to the target region 34 .
- the activation of a radiation beam may be gated in substantially real-time, or alternatively, in a predictive fashion. For example, based on a detected position of a target region and a degree of match between previous input fluoroscopic images and the templates, the processor 14 can predictively activate a radiation source (an example of predictive gating) so as to compensate for delay of activation time inherent in some x-ray systems. Predictive gating has been described in U.S. patent application Ser. No. 09/893,122 referenced herein.
- FIG. 10 shows a method 1000 for gating a medical treatment based on a degree of detected motion of the target region 34 in accordance with an embodiment of the present invention.
- a real-time input fluoroscopic image is generated using the fluoroscopic system 10 of FIG. 1 (Step 1004 ).
- a ROI in the input fluoroscopic image is determined (Step 1006 ).
- the ROI includes at least a portion of the target region 34 , which can be a tissue targeted for treatment, or alternatively, any other tissue captured in the input fluoroscopic image.
- the ROI can be determined by a physician during a treatment or planning session.
- the ROI may be defined by a frame circumscribing a portion of the input fluoroscopic image.
- a composite image CI is created by subtracting the image in the ROI in the previous input fluoroscopic image from the image in the ROI in the current input fluoroscopic image (Step 1008 ).
- a value associated with a contrast of the composite image is next calculated over the ROI (Step 1010 ).
- the variance of the pixels in the composite image which is associated with a contrast of the composite image CI, may be calculated over the ROI, and may be used as a measure of the extent of motion undergone by the tissue within the ROI (e.g., the target region 34 ). In other embodiments, different measures of the contrast in the composite image may be used.
- a beam gating signal is determined based on the calculated value (Step 1012 ). Since an image of an object in the ROI having low contrast indicates that the object has not moved significantly over time, and vice versa, a radiation beam may be disabled when the calculated value (associated with the contrast of the composite image in the ROI) exceeds a certain threshold, and be enabled when the value is below the threshold. In one embodiment, if the calculated value m > T·A, then the radiation beam is disabled, and vice versa, where T is a prescribed threshold value and A is a normalization factor that compensates for changes or daily variations in the operation of the fluoroscopic imaging system 10 .
- the next real-time input fluoroscopic image is generated and the previously described process is repeated until a sufficient radiation has been delivered to the target region 34 (Step 1014 ).
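Steps 1008-1012 can be sketched as follows: subtract the previous ROI from the current ROI to form the composite image, use the variance of its pixels as the contrast measure m (the measure named in the text), and disable the beam when m exceeds T·A. The function name and return convention in this sketch are assumptions.

```python
import numpy as np

def motion_gate(prev_roi, curr_roi, threshold, norm=1.0):
    """Gate a beam from frame-difference contrast in a ROI: high variance
    in the composite (difference) image implies motion, so the beam is
    disabled when m > threshold * norm."""
    composite = np.asarray(curr_roi, float) - np.asarray(prev_roi, float)
    m = composite.var()  # contrast measure of the composite image
    return ('beam off' if m > threshold * norm else 'beam on'), m
```

The normalization factor `norm` plays the role of A, compensating for day-to-day variation in imaging-system output.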
- the target object may be a patient or an internal organ.
- a position of the object 30 may be determined using a method that is similar to that discussed previously with reference to FIG. 2 .
- one template is generated using the process 500 discussed previously.
- a portion of the reference fluoroscopic image containing the target object (i.e., an object that is not expected to move beyond a certain prescribed threshold during a session) is selected as the template.
- input fluoroscopic images of the target object 30 are analyzed and compared with the template to determine the position of the object in the input fluoroscopic images.
- the processor 14 may perform image analysis to determine a portion in each of the input fluoroscopic images that best matches with the template.
- the position of the matched portion in each of the input fluoroscopic images represents the position of the object.
- FIG. 11 shows a method 1100 for target object position monitoring (i.e., determining whether there is target object movement) in accordance with an embodiment of the present invention.
- the x-ray source 22 and the image detector of the fluoroscopic system 10 are positioned and aimed towards the target object 30 , and a reference fluoroscopic image RFI is generated using the fluoroscopic system 10 (Step 1102 ).
- a portion of the reference fluoroscopic image is selected as a ROI (Step 1104 ).
- the portion of the reference fluoroscopic image should contain an image of a target object that is expected to be held relatively stationary during a treatment or diagnostic session.
- the position of the ROI in the reference fluoroscopic image may be stored in a computer-readable medium for later use.
- a real-time input fluoroscopic image IFI.sub.n is generated using the fluoroscopic system 10 (Step 1106 ).
- the reference fluoroscopic image and the input fluoroscopic image are generated in the same session with the patient 30 staying in substantially the same position.
- the reference fluoroscopic image and the input fluoroscopic image may be generated in different sessions.
- the x-ray source 22 and image detector are set up such that their position and orientation relative to the patient 30 are substantially the same as those in which the reference fluoroscopic image was generated.
- a portion of the input fluoroscopic image IFI.sub.n having the same position as the ROI in the reference fluoroscopic image RFI is selected and subtracted from the image in the ROI to obtain the composite image CI.sub.n.
- the composite image CI.sub.n is then analyzed to determine whether there has been target object movement (Step 1110 ). If there has been target object movement, the pixels in the composite image CI.sub.n should show an increase in contrast.
- the target object 30 may be considered to have moved if the contrast increase is above a certain prescribed threshold. With respect to radiation therapy, the radiation beam 29 may be deactivated when the contrast increase is above a prescribed threshold.
- the next real-time input fluoroscopic image is then generated and the previously described process is repeated until the end of the session is reached (Step 1112 ).
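The movement test of steps 1108-1110 compares each input ROI against the fixed reference ROI rather than against the previous frame: a rise in the contrast of the difference image beyond a prescribed threshold flags movement (and may deactivate the beam 29). A minimal sketch, with the standard deviation standing in for "contrast" (an assumption, since the text does not fix the measure), and a hypothetical function name:

```python
import numpy as np

def target_moved(reference_roi, input_roi, contrast_threshold):
    """Flag target-object movement: subtract the input ROI from the
    reference ROI and test whether the contrast (here, the standard
    deviation) of the composite exceeds the prescribed threshold."""
    composite = np.asarray(input_roi, float) - np.asarray(reference_roi, float)
    return bool(composite.std() > contrast_threshold)
```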
- target object position monitoring and determination may be performed in conjunction with the dynamic targeting or gating of a medical procedure described previously.
- other techniques for monitoring or determining a target object position such as those described in U.S. patent application Ser. No. 09/893,122, may also be used.
- the entire disclosure of the U.S. patent application Ser. No. 09/893,122 is expressly incorporated by reference herein.
- FIG. 12 is a block diagram that illustrates an embodiment of a computer system 1200 upon which an embodiment of the invention may be implemented.
- Computer system 1200 includes a bus 1202 or other communication mechanism for communicating information, and a processor 1204 coupled with the bus 1202 for processing information.
- the processor 1204 may be an example of the processor 14 of FIG. 1 .
- the computer system 1200 also includes a main memory 1206 , such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 1202 for storing information and instructions to be executed by the processor 1204 .
- the main memory 1206 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1204 .
- the computer system 1200 further includes a read only memory (ROM) 1208 or other static storage device coupled to the bus 1202 for storing static information and instructions for the processor 1204 .
- a data storage device 1210 such as a magnetic disk or optical disk, is provided and coupled to the bus 1202 for storing information and instructions.
- the computer system 1200 may be coupled via the bus 1202 to a display 1212 , such as a cathode ray tube (CRT), for displaying information to a user.
- An input device 1214 is coupled to the bus 1202 for communicating information and command selections to processor 1204 .
- Another type of user input device is a cursor control 1216 , such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to the processor 1204 and for controlling cursor movement on the display 1212 .
- This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
- the invention is related to the use of computer system 1200 for processing images. According to one embodiment of the invention, such use is provided by computer system 1200 in response to processor 1204 executing one or more sequences of one or more instructions contained in the main memory 1206 . Such instructions may be read into the main memory 1206 from another computer-readable medium, such as storage device 1210 . Execution of the sequences of instructions contained in the main memory 1206 causes the processor 1204 to perform the process steps described herein. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in the main memory 1206 . In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware circuitry and software.
- Non-volatile media includes, for example, optical or magnetic disks, such as the storage device 1210 .
- Volatile media includes dynamic memory, such as the main memory 1206 .
- Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 1202 . Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
- Computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor 1204 for execution.
- the instructions may initially be carried on a magnetic disk of a remote computer.
- the remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem.
- a modem local to the computer system 1200 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal.
- An infrared detector coupled to the bus 1202 can receive the data carried in the infrared signal and place the data on the bus 1202 .
- the bus 1202 carries the data to the main memory 1206 , from which the processor 1204 retrieves and executes the instructions.
- the instructions received by the main memory 1206 may optionally be stored on the storage device 1210 either before or after execution by the processor 1204 .
- the computer system 1200 also includes a communication interface 1218 coupled to the bus 1202 .
- the communication interface 1218 provides a two-way data communication coupling to a network link 1220 that is connected to a local network 1222 .
- the communication interface 1218 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line.
- the communication interface 1218 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN.
- Wireless links may also be implemented.
- the communication interface 1218 sends and receives electrical, electromagnetic or optical signals that carry data streams representing various types of information.
- the network link 1220 typically provides data communication through one or more networks to other devices.
- the network link 1220 may provide a connection through local network 1222 to a host computer 1224 or to medical equipment 1226 such as a radiation beam source or a switch operatively coupled to a radiation beam source.
- the data streams transported over the network link 1220 can comprise electrical, electromagnetic or optical signals.
- the signals through the various networks, and the signals on the network link 1220 and through the communication interface 1218 , which carry data to and from the computer system 1200 , are exemplary forms of carrier waves transporting the information.
- the computer system 1200 can send messages and receive data, including program code, through the network(s), the network link 1220 , and the communication interface 1218 .
- the systems and methods may also be implemented using other types of imaging.
- the previously described methods may be modified, and are intended to be within the scope of the present invention.
- the step of enhancing a moving object (i.e., steps 206 and 508 ) is optional. If the contrasts or features of an image in the templates and the input images are such that they allow registration between the templates and the input images, then the methods 200 and 500 may not include steps 206 and 508 , respectively.
- although the methods have been described with reference to radiation treatment, it should be understood that the same or similar methods may also be used to perform other types of medical procedures.
- the gating methods described with reference to FIGS. 8-10 may be used in various diagnostic imaging procedures as well as image-guided surgery in which movement of surgical instruments is controlled by the position of the target object.
- the above-described methods may also have applications in retrospective gating. In this case, the input fluoroscopic images or the processed input fluoroscopic images can be time-stamped and stored for future processing.
- physiological data (e.g., a position of the target region or the patient) obtained from the processed input fluoroscopic images may likewise be time-stamped and stored.
- the raw data associated with the imaging application is synchronized to a common time base with the physiological motion data. Segments of the raw data that correspond to movement-cycle intervals of interest are used to reconstruct the volumetric image, thus minimizing the distortion and size changes caused by patient motion.
- the method 200 is not limited to determining a position of a portion of a patient or animal body.
- the method 200 may also be used to determine a position of a non-animal body or other objects in a medical or non-medical environment.
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 11/593,950, filed on Nov. 6, 2006, which is a continuation of U.S. patent application Ser. No. 10/656,063, filed on Sep. 5, 2003, now U.S. Pat. No. 7,158,610, the disclosures of both of which are expressly incorporated by reference herein.
- The field of the invention relates to methods and systems for processing images, and more particularly, to methods and systems for processing x-ray images.
- Radiation therapy involves medical procedures that selectively expose certain areas of a human body, such as cancerous tumors, to doses of radiation. The purpose of the radiation therapy is to irradiate the targeted biological tissue such that the undesirable tissue is destroyed. Radiation has also been used to obtain images of tissue for diagnostic or treatment purposes.
- In a radiation treatment session, the position and movement of a target tissue can be monitored by an imaging system, such as a fluoroscopic imaging system, while radiation is delivered to the target tissue. This ensures that the target tissue is in a desired position while the radiation is being delivered. However, soft tissue targets such as a variety of tumors are often not visible in x-ray fluoroscopic images. This is because structures in front of or behind the target tissue are also visible in the x-ray images, increasing the clutter to a level at which the target tissue cannot be distinguished.
- Internal radio-opaque markers have been used to aid physicians in identifying a target tissue under fluoroscopic imaging. The radio-opaque markers can be injected or implanted at desired sites within a patient, and they show up as high-contrast features in fluoroscopic images. By observing the positions of the internal radio-opaque markers in fluoroscopic images, a physician can determine a position of a target tissue. However, implantation of markers is intrusive to the patient, and it may not be practical or feasible in all cases.
- Accordingly, systems and methods for visualization of internal tissue without use of internal markers would be useful.
- In accordance with some embodiments, a method includes obtaining a first image of an object, obtaining a second image of an object, determining a level of correlation between the first and second images, and using the determined level of correlation between the first and second images to obtain information regarding a motion of the object.
- In accordance with other embodiments, a computer product having a set of instructions, an execution of which causes a process to be performed, the process includes obtaining a first image of an object, obtaining a second image of an object, determining a level of correlation between the first and second images, and using the determined level of correlation between the first and second images to obtain information regarding a motion of the object.
- In accordance with other embodiments, a system includes a processor that is configured for obtaining a first image of an object, obtaining a second image of an object, determining a level of correlation between the first and second images, and using the determined level of correlation between the first and second images to obtain information regarding a motion of the object.
- Other and further aspects and features will be evident from reading the following detailed description of the embodiments, which are intended to illustrate, not limit, the invention.
- The drawings illustrate the design and utility of preferred embodiments of the present invention, in which similar elements are referred to by common reference numerals. In order to better appreciate how advantages and objects of the present inventions are obtained, a more particular description of the present inventions briefly described above will be rendered by reference to specific embodiments thereof, which are illustrated in the accompanying drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings.
-
FIG. 1 illustrates a fluoroscopic imaging system with which embodiments of the present invention may be implemented; -
FIG. 2 is a flowchart showing a process for targeting an object in accordance with an embodiment of the invention; -
FIG. 3 shows an algorithm for processing images in accordance with an embodiment of the invention; -
FIG. 4 shows examples of images generated during a treatment or diagnostic session performed in accordance with the process of FIG. 2 ; -
FIG. 5 is a flowchart showing a process for generating templates that may be used in the process of FIG. 2 ; -
FIG. 6 shows examples of images generated at different stages of the template generation process; -
FIG. 7 is a block diagram showing a system for processing images in accordance with an embodiment of the invention; -
FIG. 8 shows a motion signal chart and a gating signal chart; -
FIG. 9 shows a motion signal chart, a phase chart, and a gating signal chart; -
FIG. 10 is a flowchart showing a process for gating a medical procedure in accordance with an embodiment of the invention; -
FIG. 11 is a flowchart showing a process for monitoring a patient's position in accordance with an embodiment of the invention; and -
FIG. 12 is a diagram of a computer hardware system with which embodiments of the present invention can be implemented. - Various embodiments of the present invention are described hereinafter with reference to the figures. It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are represented by like reference numerals throughout the figures. It should also be noted that the figures are only intended to facilitate the description of specific embodiments of the invention. They are not intended as an exhaustive description of the invention or as a limitation on the scope of the invention. In addition, an illustrated embodiment need not have all the aspects or advantages of the invention shown. An aspect or an advantage described in conjunction with a particular embodiment of the present invention is not necessarily limited to that embodiment and can be practiced in any other embodiments of the present invention even if not so illustrated.
-
FIG. 1 illustrates a fluoroscopic system 10 with which embodiments of the present invention may be implemented. The system 10 includes a fluoroscope 12, a processor 14, and a work station 16 having a display 18 and a user interface 20, such as a keyboard and/or a mouse. The processor 14 may be an integral component of the work station 16, or alternatively, a separate component that is connected to the work station 16. The fluoroscope 12 is illustrated as a C-arm fluoroscope in which an x-ray source 22 is mounted on a structural member or C-arm 24 opposite to an imaging assembly 26, which is configured to receive and detect x-rays emitted from the x-ray source 22. The C-arm 24 is capable of moving about a patient for producing two-dimensional projection images of the patient from different angles. - During use of the
fluoroscopic system 10, a patient 30 is positioned between the x-ray source 22 and the imaging assembly 26. An x-ray beam 32 is then directed towards a target region 34 within the patient 30, and is attenuated as it passes through the patient 30. The imaging assembly 26 receives the attenuated x-ray beam 32, and generates electrical signals in response thereto. The electrical signals are transmitted to the processor 14, which is configured to generate images in the display 18 based on the electrical signals in accordance with an embodiment of the present invention. During a treatment session, another radiation source 28 may be positioned adjacent the fluoroscopic system 10 for delivering treatment radiation 29 to the target region 34. Similar imaging systems or other types of imaging systems may also be used to implement embodiments of the present invention. -
FIG. 2 is a block diagram illustrating an embodiment of a process 200 for tracking a position of the target region 34 of the patient 30 as the target region 34 is being imaged using the fluoroscopic system 10 of FIG. 1. - To track a position of the
target region 34 of the patient 30 undergoing fluoroscopic imaging, a real-time input fluoroscopic image is generated using the fluoroscopic system 10 (Step 204). The target region 34 may include a tissue, such as lung tissue or heart tissue, that undergoes periodic physiological movements. Alternatively, the target region 34 may include tissue that does not undergo periodic physiological movements, such as bone tissue or the prostate. - Next, the
processor 14 processes the fluoroscopic image to enhance a feature, such as a moving feature of an object, in the fluoroscopic image (Step 206). By enhancing a moving feature in the input fluoroscopic image, the contrast of an image of a moving object is enhanced while the contrast of an image of a relatively stationary object is reduced. In the illustrated embodiment, the enhancement of the moving feature may be performed based on image averaging and image subtraction techniques. - In one embodiment, a boxcar averaging technique may be used. Particularly, to obtain an enhanced input fluoroscopic image EIFI_n for the nth input fluoroscopic image IFI_n, a long-term average of the previous input fluoroscopic images is calculated and subtracted from the nth input fluoroscopic image IFI_n (i.e., EIFI_n=IFI_n−Avg(IFI_x, x=n−m to n−1), where m=length of boxcar). For example, the sixth input fluoroscopic image IFI_6 may be enhanced or modified by performing image averaging on the previous five input fluoroscopic images to obtain a composite image (i.e., an average image), and by subtracting the composite image from the sixth input fluoroscopic image IFI_6. As used in this specification, the term “composite image” includes an array of data that may be stored in a medium, and therefore, is not limited to a displayed image.
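For illustration only, the boxcar enhancement described above might be sketched as follows in NumPy. The function name, the boxcar length m=5, and the toy frames are assumptions made for this sketch, not details taken from the specification:

```python
import numpy as np

def enhance_boxcar(frames, m=5):
    """Motion-enhance the latest frame by subtracting the boxcar (moving)
    average of the previous m frames: EIFI_n = IFI_n - Avg(IFI_x, x=n-m..n-1).
    `frames` is a list of 2-D arrays; m=5 is an illustrative choice."""
    current = frames[-1]
    previous = frames[-1 - m:-1] if len(frames) > m else frames[:-1]
    if not previous:
        return current.astype(float)      # no history yet: nothing to subtract
    background = np.mean(previous, axis=0)  # the composite (average) image
    return current - background

# Toy example: a stationary bright pixel (bone-like) and a feature that moves
static = np.zeros((4, 4))
static[0, 0] = 100.0                      # stationary structure
frames = []
for i in range(6):
    f = static.copy()
    f[1, i % 4] = 50.0                    # feature at a new column each frame
    frames.append(f)

enhanced = enhance_boxcar(frames, m=5)
```

In the result, the stationary pixel cancels exactly against its average, while the moving feature retains most of its intensity, which is the contrast behavior described above.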
- In an alternative embodiment, the image averaging may be performed based on a weighted average prescribed as a function over time. For example, if later input fluoroscopic images are to be accounted for more in the averaging, later input fluoroscopic images may be multiplied by a higher weighted factor during the image averaging, and vice versa.
FIG. 3 shows a recursive algorithm for enhancing a moving feature of an object in an image, in which the current input fluoroscopic image is multiplied by a weighted factor (1-a) while the previous recursive average of the input fluoroscopic image(s) is multiplied by a weighted factor (a). The Z^−1 represents a memory that holds one frame with one frame time delay. This results in an exponentially decreasing weighted average for the earlier samples. Other types of weighted averaging may also be used. - It should be noted that the process of enhancing a feature in the fluoroscopic image is not limited to the examples described previously, and that other modified versions of the process may also be used. For example, in another embodiment, the boxcar averaging may be performed on certain previous input fluoroscopic images (e.g., the last three images), instead of on all of the previous input fluoroscopic images. In other embodiments, other functions or algorithms may be applied to any combination of the previous input fluoroscopic images and/or the current input fluoroscopic image before or after the image averaging is performed.
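The recursive weighted average of FIG. 3 can be sketched as below; the weight a=0.8 and the example frames are illustrative assumptions, not values from the specification:

```python
import numpy as np

def recursive_enhance(frames, a=0.8):
    """Recursive (exponentially weighted) motion enhancement in the spirit
    of FIG. 3: the running average is updated as avg <- a*avg + (1-a)*frame,
    so earlier frames receive exponentially decreasing weight. The latest
    frame minus the running average of the prior frames is returned."""
    avg = None
    for frame in frames[:-1]:
        f = frame.astype(float)
        avg = f if avg is None else a * avg + (1 - a) * f
    if avg is None:                       # first frame: no history to subtract
        return frames[-1].astype(float)
    return frames[-1] - avg

# A perfectly static scene leaves essentially no residual after subtraction
still = [np.full((2, 2), 7.0) for _ in range(4)]
residual = recursive_enhance(still)
```

The one-frame memory (Z^−1 in FIG. 3) corresponds to carrying `avg` from one iteration to the next.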
- The
processor 14 next registers the enhanced input fluoroscopic image with a template (Step 208). In the illustrated embodiment, a sequence of templates is provided, and each of the templates contains an image of at least a portion of the target region 34 that is created at a certain time-point or phase of a physiological cycle. The processor 14 selects a template from the sequence of templates that best matches an image of the target region 34 in the enhanced input fluoroscopic image. The construction of the templates will be described later. As used in this specification, the term “phase” refers to a variable that represents, measures, or associates with, a degree of completion of a physiological cycle. - In one embodiment, the input fluoroscopic image is compared with the templates, and the template that best matches with an image in the input fluoroscopic image is registered or cross correlated with the input fluoroscopic image. In this case, the
processor 14 performs an image comparison to determine which portion of the enhanced input fluoroscopic image best matches with each of the template images. Known techniques for performing image analysis, such as pattern matching, may be used. For example, if a template contains an image formed by 50×50 pixels, the processor 14 may perform a spatial analysis to determine a region (having 50×50 pixels) within the enhanced input fluoroscopic image that best matches the template image. The processor 14 then computes values representative of degrees of match between the templates and an image in the input fluoroscopic image, and selects the template associated with the highest value to be registered with the input fluoroscopic image. The position of the image within the input fluoroscopic image that best matches the registered template may be stored in a computer-readable medium for later use. - In one embodiment, each cross correlation between the enhanced input image and a template results in a 2D correlation function with a correlation peak. In each correlation function, the location of the peak indicates the position of the
target region 34, and the value of the peak indicates a degree of match between the input fluoroscopic image and the template. The template that provides the highest peak value is then selected as the matching template, and the corresponding peak position in the correlation function is used to determine the position of the target region 34. - Examples of algorithms that may be used to search for the template that best matches the input fluoroscopic image will now be described. However, it should be understood that the determination of the template that best matches the input fluoroscopic image may also be performed using other algorithms or techniques. In one embodiment, the input fluoroscopic image is compared with all of the templates to determine the matching template. In another embodiment, instead of comparing the input fluoroscopic image with all of the templates, the input fluoroscopic image is compared with only a subset of templates. In this case, the subset of templates is selected such that their corresponding phase values (or time points of a respiration cycle at which they are generated) are centered around, or proximate to, the phase of the template that had the best match with the last input fluoroscopic image (i.e., from the last tracking cycle). Such a technique increases the efficiency of registering the input fluoroscopic image with the template because an input fluoroscopic image and a template that are collected at the same phase or time-point of a physiological cycle are likely to have similar image contrast. In another embodiment, if a match is found between the previous input fluoroscopic image and a template, and if the templates and the fluoroscopic images are generated at substantially the same phases or time-points of a physiological cycle, the next template in the sequence may be selected to determine if it matches with an image in the current input fluoroscopic image.
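A minimal sketch of this peak-picking template match, using a brute-force normalized cross-correlation (the function names, image sizes, and toy data are assumptions for illustration; a production system would use an optimized correlation routine):

```python
import numpy as np

def match_score(image, template):
    """Slide the template over the image and return the correlation peak:
    (peak value, peak position). The peak value measures the degree of
    match; the peak position locates the candidate target region."""
    th, tw = template.shape
    t = template - template.mean()
    best_val, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum())
            val = (p * t).sum() / denom if denom > 0 else 0.0
            if val > best_val:
                best_val, best_pos = val, (r, c)
    return best_val, best_pos

def best_template(image, templates):
    """Select the template with the highest correlation peak; return its
    index and the peak position (the estimated target position)."""
    scores = [match_score(image, t) for t in templates]
    idx = int(np.argmax([s[0] for s in scores]))
    return idx, scores[idx][1]

# Toy example: a 2x2 pattern embedded at row 2, column 3 of a blank image
image = np.zeros((6, 6))
image[2:4, 3:5] = np.array([[1.0, 2.0], [3.0, 4.0]])
templates = [np.ones((2, 2)), np.array([[1.0, 2.0], [3.0, 4.0]])]
index, position = best_template(image, templates)
```

The returned index plays the role of the matching template's position in the sequence, and the peak position plays the role of the target region's location in the input image.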
If it is determined that the template does not match the input fluoroscopic image (i.e., the degree of match does not exceed a prescribed threshold), another template is then selected to determine if it matches with an image in the input fluoroscopic image. For example, the next template or the previous template in the sequence may be selected, until a match is found.
- Once the input fluoroscopic image is matched with the template, the position of the
target region 34 in the fluoroscopic image is determined (Step 210). Particularly, the position of the image in the input fluoroscopic image that matches with the template is the position of the target region 34. A marker may be displayed in the display 18 to indicate the position of the identified target region 34 in the input fluoroscopic image. For example, a frame or an outline having a similar shape as that of the corresponding registered template may be displayed in the input fluoroscopic image. The phase associated with the input fluoroscopic image can be determined based on the phase of the matched template. Alternatively, the phase associated with the input fluoroscopic image can be determined by a separate tracking mechanism, such as RPM external markers, available from Varian Medical Systems, Inc., Palo Alto, Calif. - The next real-time input fluoroscopic image is generated and the previously described process is repeated until the end of the session is reached (Step 212). The templates and the input fluoroscopic images may be generated at the same or different time intervals. For example, the templates may be generated at a shorter time interval as compared to that for the input fluoroscopic images, thereby allowing more matching variations between different sets of the input fluoroscopic images and the templates.
- It should be noted that the steps described previously with reference to the
process 200 can be carried out in substantially real-time. That is, the input fluoroscopic images can be processed to determine a position of the target region immediately or shortly after they are generated in step 204. Alternatively, the input fluoroscopic images can be generated in a batch, time-stamped, and stored for subsequent processing. In this case, the enhancing step 206, the registering step 208, and the determining step 210 can be performed subsequently. -
FIG. 4 shows examples of images generated at different stages of the dynamic targeting process described previously. An example of an input fluoroscopic image 400 created during a phase of a respiratory cycle, and its corresponding motion enhanced fluoroscopic image 402 created using the technique described with reference to step 206, are shown. As can be seen in the figure, by subtracting the average image from the current input fluoroscopic image, the moving object(s), i.e., the lung tissue 404, is enhanced while the contrast of the relatively stationary object(s), i.e., the bone 406, is reduced. FIG. 4 also shows a rectangular frame 408 displayed in the fluoroscopic image 402 identifying a region in the fluoroscopic image 402 that matches with the template 410. The template 410 is selected from a group 412 of available templates. The group 412 can include all of the generated templates, or alternatively, a subset of the generated templates, as discussed previously. - The construction of the templates will now be described. Various methods may be used to generate the templates.
FIG. 5 shows a process 500 for generating the sequence of templates in accordance with an embodiment of the present invention. First, the radiation source 22 of the fluoroscopic system 10 is positioned and aimed towards an area of the body that includes the target region 34, and a plurality of reference fluoroscopic images RFI is generated using the fluoroscopic system 10 (Step 502). The position and orientation of the x-ray source 22 relative to the patient 30 may be stored for later use. Particularly, the position and orientation of the x-ray source 22 used during the template generation session may be used to set up the x-ray source 22 for generating the input fluoroscopic images. As a result, the image in the input fluoroscopic image would be similar to that in the template, thereby allowing matching of the template with the input fluoroscopic image. If the target region 34 includes a moving tissue, the plurality of reference fluoroscopic images is preferably collected over a physiological cycle, such as a respiratory cycle or a cardiac cycle, of the moving tissue. In one embodiment, 120 to 200 reference fluoroscopic images are collected over a period of 12 to 20 seconds in order to capture movements of the target region 34 during a respiratory cycle. The collected reference fluoroscopic images are time-stamped and are then stored in digital format in a computer-readable medium, such as a hard drive, a CD-ROM, a diskette, or a server. - Next, the reference fluoroscopic images are associated with phases or time-points of a physiological cycle (Step 504). In one embodiment, the generated reference fluoroscopic images are time-stamped as they are generated in
Step 502. A patient position monitoring system, such as that available from Varian Medical Systems, Inc., Palo Alto, Calif., may be used to detect physiological motion of the patient and to generate motion data as the reference fluoroscopic images are generated. The reference fluoroscopic images are then associated with phases or time-points of a physiological cycle based on their corresponding stamped time and the motion data. For example, the reference fluoroscopic images can be synchronized with the motion data to a common time line. In another embodiment, the reference fluoroscopic images may also be registered in phase with three-dimensional computed tomography images generated during a planning session (described below). - In
Step 506, images of the target region 34 are identified in the respective reference fluoroscopic images. In one embodiment, the images of the target region 34 may be determined manually by a user, such as a physician or a technician. In this case, the user examines each of the selected reference fluoroscopic images and identifies the target region 34 in each of the selected reference fluoroscopic images. For each identified target region 34 in the reference fluoroscopic images, the user may place a marker representative of the position of the target region 34 in the corresponding reference fluoroscopic image. For example, the user may operate the user interface 20 and place a frame around a region of interest (ROI) containing the target region 34 in the corresponding reference fluoroscopic image. Alternatively, the user may also draw an outline around a ROI having a shape that resembles the target region 34 in the corresponding reference fluoroscopic image. In this case, the outline may represent a boundary of the target region 34 to which treatment may be applied. - In another embodiment, the image of the
target region 34 in the respective reference fluoroscopic images may be determined by projecting a three-dimensional (3D) treatment volume onto the respective reference fluoroscopic images. In this case, a number of 3D computed tomography (CT) images of the treatment volume are obtained such that they cover a period, such as a physiological cycle. The 3D CT images may be generated simultaneously with the sequence of the reference fluoroscopic images. Alternatively, the 3D CT images may be generated separately from the reference fluoroscopic images, in which case, the reference fluoroscopic images may subsequently be registered in phase with the 3D CT images. Conventional techniques may be employed to register the sequence of the reference fluoroscopic images with the CT images. The RPM Respiratory Gating System, available from Varian Medical Systems, Inc., Palo Alto, Calif., may also be used to register the reference fluoroscopic images with the CT images. - The 3D CT images are then examined to determine the position of the
target region 34 in the respective images. In one embodiment, the position of the target region 34 in each of the respective CT images is projected onto the respective two-dimensional (2D) reference fluoroscopic image using known transformation techniques. Based on the projected positions of the target region 34 in the respective reference fluoroscopic images, ROIs containing images of the target region 34 can then be defined in the respective reference fluoroscopic images. For example, a rectangular frame circumscribing the target region 34 may be used to define a ROI. Alternatively, an outline having a shape that resembles the target region 34 may define a ROI. - Next, the reference fluoroscopic images are processed to enhance a moving object in the images (Step 508). The enhancement of a moving object may be performed using a similar technique to that described previously with reference to the input fluoroscopic images. In the illustrated embodiment, each of the reference fluoroscopic images in the sequence is modified based on image averaging and image subtraction techniques. Particularly, to obtain an enhanced reference fluoroscopic image ERFI_n for the nth reference fluoroscopic image RFI_n in the sequence, a long-term average of the previous reference fluoroscopic images is calculated and subtracted from the nth reference fluoroscopic image RFI_n (i.e., ERFI_n=RFI_n−Avg(RFI_x, x=1 to n−1)). For example, the sixth reference fluoroscopic image RFI_6 in the sequence is modified by performing image averaging on the previous five fluoroscopic images to obtain an average image, and by subtracting the average image from the sixth fluoroscopic image RFI_6. In one embodiment, the image averaging may be performed based on boxcar or recursive techniques. In alternative embodiments, the image averaging may be performed based on a weighted average prescribed as a function over time, as described previously.
- Next, the images contained within the ROIs in the reference fluoroscopic images are stored as a sequence of templates (Step 510). The templates may be stored in a computer-readable medium, such as a hard drive, a CD-ROM, a diskette, or a server.
- In the previously described embodiment, the motion enhancement is performed after the ROIs are determined in the reference fluoroscopic images. However, this need not be the case. In an alternative embodiment, the order of the steps of enhancing a moving object and ROI determination can be different from the
process 500. Furthermore, in another embodiment, instead of generating reference fluoroscopic images, digitally reconstructed radiographs (DRRs) are produced from each reference 3D CT image for the fluoroscopic imaging direction that will be used during treatment. In this case, the target volume is projected in each DRR, and the DRRs are used as the reference fluoroscopic images in the same manner as in the previous embodiment. - It should be noted that the above-described
process 500 for generating the sequence of templates may be performed in the same session (e.g., a treatment session) in which the process 200 is being performed. Alternatively, the templates may be generated in another session that is carried out separately and prior to a treatment or diagnostic session. -
FIG. 6 shows examples of images generated at different stages of the template generation process 500 described previously. An example of a reference fluoroscopic image 600 created during a phase of a respiratory cycle, and its corresponding motion enhanced fluoroscopic image 602 created using the technique described with reference to step 508, are shown. As can be seen in the figure, by subtracting the composite image of previously generated reference fluoroscopic images from the current reference fluoroscopic image, the moving object(s), i.e., the lung tissue 604, is enhanced while the contrast of the stationary object(s), i.e., the bone 606, is minimized. Furthermore, FIG. 6 shows a ROI 608 in the fluoroscopic image 602 that has been selected as a template 610. Note that the input fluoroscopic image 400 described previously with reference to FIG. 4 is similar to the reference fluoroscopic image 600 because (1) the images are generated using the same imaging setup relative to the patient 30, and (2) the input fluoroscopic image 400 and the reference fluoroscopic image 600 are collected at substantially the same time-point of a physiological cycle. -
FIG. 7 shows a system 700 for performing the above described processes. The system 700 includes a template generation module 702 and an image matching module 704, either or both of which may be implemented using the processor 14 or a computer system. The template generation module 702 includes a phase association module 706, which associates the reference images 708 with phases or time-points of a physiological cycle. The template generation module 702 also includes a projection module 710 that projects a four-dimensional treatment plan (a 3D treatment plan over time) onto the selected reference images 708, and a motion enhancement module 712 for enhancing a feature in the selected reference images 708. In one embodiment, the motion enhancement module 712 enhances a feature in the entire image for each of the selected reference images 708. In another embodiment, the motion enhancement module 712 enhances a feature in only the projected overlay on the selected reference images 708. In yet another embodiment, the motion enhancement module 712 is optional, in which case the system 700 does not include the motion enhancement module 712. - The
image matching module 704 includes a motion enhancement module 720 for enhancing a feature in the input images 722 that are generated during a treatment or diagnostic session. The image matching module 704 also includes a spatial and temporal matching module 724 for matching the input images 722 with the generated templates 714. Particularly, for each of the input images 722, the spatial and temporal matching module 724 selects a template 714 that best matches an image in the input image 722, and generates an output 726. The output 726 includes the position (X_n, Y_n) of the sub-image in the input image 722 that best matches the template T_n, and an index n of the best-matching template T_n. The index n may be used to determine the time-point or phase of a physiological cycle at which the input image 722 is generated. - The previously described method allows a user to determine a position of the
target region 34 during a session without the use of a radio-opaque marker, and may be implemented using existing imaging systems. The method may be used by a physician to perform a wide range of operations or procedures. - Dynamic Targeting
- In one embodiment, the position of the
target region 34 obtained using the previously described process may be used as an input signal to control and aim a radiation treatment beam 29 towards the target region 34. In this case, the radiation treatment beam 29 is continuously positioned to follow the target region 34 based on the positions of the target region 34 identified in the fluoroscopic images. For example, the aim point of a treatment radiation beam may be controlled by a moving collimator based on data regarding the position of the target region 34 received from the processor 14. Alternatively, a treatment couch supporting a patient can be moved to control a position of the target region 34 at which the beam 29 is directed. - Physiological Gating
- In another embodiment, the above-described method may be used to detect a movement of the
target region 34, based on which a medical procedure may be gated. Several examples of applications to physiological gating will now be described with reference to radiation therapy. However, it should be understood by those skilled in the art that similar techniques or methods may be used to control other types of treatments or diagnostic procedures. - In one embodiment, the
radiation source 28 may be gated to be turned on or off based on the positions of the target region 34 identified in the input fluoroscopic images. In this case, the position of the image within the input fluoroscopic image that is registered with the corresponding template may be used to determine if the target region 34 has moved beyond a prescribed threshold position. If the target region 34 remains within the prescribed threshold position, the radiation beam 29 is turned on, and if the target region 34 has moved beyond the threshold position, the radiation beam 29 is then deactivated. -
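A minimal sketch of this position-threshold gating, assuming an illustrative treatment interval of ±0.8 and a hypothetical list of tracked positions (both are assumptions for this sketch):

```python
def gate_by_position(positions, lower=-0.8, upper=0.8):
    """Return one gating decision per tracked target-region position:
    True ("beam on") while the position stays inside the treatment
    interval, False ("beam off") once it moves beyond the bounds."""
    return [lower <= p <= upper for p in positions]

# A short motion trace sampled from a breathing-like signal
trace = [0.0, 0.5, 0.9, 1.0, 0.6, 0.1, -0.5, -0.95, -0.7, 0.0]
gates = gate_by_position(trace)
```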
FIG. 8 shows an example of a motion signal chart 800 and a gating signal chart 802 that is aligned with the motion signal chart 800. The motion signal chart 800 may be created by using position data of the target region 34 obtained using the previously described process 200. A treatment interval 804 may be defined by an upper bound 806 and a lower bound 808, as shown in the motion signal chart 800. In the illustrated example, the upper bound 806 has a value of 0.8 and the lower bound 808 has a value of −0.8. As shown in the gating signal chart 802, any position of the target region 34 that falls outside the prescribed treatment interval 804 results in a “beam off” gating signal 810 that stops the application of radiation to the patient 30. Any position of the target region 34 that falls within the prescribed treatment interval 804 results in a “beam on” gating signal 812 that allows radiation to be applied to the patient 30. - In another embodiment, the
radiation source 28 may be gated to be turned on or off based on the phase of a physiological cycle. In this case, the position vs. time history of the image within the input fluoroscopic image that is registered with the corresponding template may be used to determine a phase of a physiological cycle. If the target region 34 remains within a prescribed phase interval, the radiation beam 29 is turned on, and if the target region 34 has moved beyond the prescribed phase interval, the radiation beam 29 is then deactivated. -
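The phase-interval check itself reduces to a single comparison. A sketch, using the 25°-185° bounds that appear in the FIG. 9 example (the function name and the wrap-around handling are assumptions of this sketch):

```python
def gate_by_phase(phase_deg, lower=25.0, upper=185.0):
    """Beam on only while the physiological phase (in degrees) lies
    within the prescribed phase interval [lower, upper]. Phases are
    wrapped into [0, 360) before the comparison."""
    return lower <= (phase_deg % 360.0) <= upper
```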
FIG. 9 shows an example of a motion signal chart 900, a corresponding phase chart 902 for the target region 34, and a gating signal chart 904 that is aligned with the phase chart 902. The motion signal chart 900 may be created by using position data of the target region 34 obtained using the previously described method (i.e., at step 210). The phase chart 902 may be created based on a beginning and an end of a physiological cycle in the motion signal chart 900. The phase chart 902 shows the phase progression of a physiological movement of the target region 34 over time. A prescribed phase interval 906 may be defined by an upper bound 908 and a lower bound 910, which are represented as dotted lines in the phase chart 902. In the illustrated example, the upper bound 908 has a value of 185° and the lower bound 910 has a value of 25°. According to the illustrated gating signal chart 904, any position of the target region 34 corresponding to a phase that falls outside the prescribed phase interval 906 results in a “beam off” gating signal 912 that stops the application of radiation to the patient 30. Any position of the target region 34 corresponding to a phase that falls within the prescribed phase interval 906 results in a “beam on” gating signal 914 that allows radiation to be applied to the patient 30. - In yet another embodiment, the radiation treatment beam may be gated to be turned on or off by associating the templates with treatment data. In one embodiment, certain templates may be associated with a “beam on” signal, while the rest of the templates are associated with a “beam off” signal. For example, templates generated within a prescribed treatment phase interval may be associated with a “beam on” signal, while templates generated outside the prescribed treatment phase interval may be associated with a “beam off” signal. In an alternative embodiment, in addition to the “beam off” and “beam on” signals, the treatment data may also include a “beam on duration” signal.
In other embodiments, the templates may also be associated with treatment data that are commonly used in radiation therapy, such as beam shape data and radiation dosage data. During a radiation treatment session, real time input fluoroscopic images are obtained and are registered with the templates in accordance with the previously described method. When an input fluoroscopic image is registered with a template that contains a “beam on” signal, the
treatment radiation source 28 then directs a treatment radiation beam 29 towards the target region 34 for a duration prescribed by the corresponding “beam on duration” signal. On the other hand, when an input fluoroscopic image is registered with a template that contains a “beam off” signal, the treatment radiation source 28 then holds off the treatment beam 29 and ceases directing radiation towards the target region 34. If a template also contains “beam shape” data, when an input fluoroscopic image is registered with such a template, the processor 14 then directs a signal to a beam-shaping (e.g., a multi-leaf) collimator to change the shape of the treatment beam 29 based on the “beam shape” data. In one embodiment, to ensure that a correct treatment is being delivered to the target region 34, values may be computed to indicate a degree of correlation between the previously generated input fluoroscopic images and their corresponding registered templates. If the values indicate that there has been a high correlation in the temporal and/or spatial matching between the previously generated input fluoroscopic images and their corresponding registered templates, the registered template for the current input fluoroscopic image is likely to be correct, and treatment may be applied in accordance with the treatment data prescribed by the corresponding registered template. - In yet another embodiment, radiation may be delivered to the patient during a desired portion of a physiological cycle. In radiation therapy, it may be desirable to apply the
radiation beam 29 towards the target region 34 during a portion, such as a quiescent period, of a physiological cycle. For example, quiescent periods occur during the respiratory cycle at the ends of expiration and inspiration. In this case, the determined position of the target region 34 can be used to detect quiescent periods of physiological cycles. During the quiescent periods, the motion of the target region 34 slows down or may even cease for a fraction of a moment, thereby allowing a radiation treatment to be directed to the target region 34. - It should be noted that in the above described embodiments, the activation of a radiation beam may be gated in substantially real-time, or alternatively, in a predictive fashion. For example, based on a detected position of a target region and a degree of match between previous input fluoroscopic images and the templates, the
processor 14 can predictively activate a radiation source (an example of predictive gating) so as to compensate for delay of activation time inherent in some x-ray systems. Predictive gating has been described in U.S. patent application Ser. No. 09/893,122 referenced herein. -
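The degree-of-correlation check just described can be sketched as a mean correlation coefficient between recent input images and their registered templates. The following sketch is illustrative only; the function name, the use of a plain correlation coefficient, and the 2-D NumPy arrays are assumptions, not part of the disclosure:

```python
import numpy as np

def registration_confidence(input_images, registered_templates):
    """Mean correlation coefficient between each recent input image and
    its registered template; a high value suggests the current template
    match is trustworthy and treatment may proceed."""
    coeffs = []
    for img, tpl in zip(input_images, registered_templates):
        a = img.ravel().astype(np.float64)
        b = tpl.ravel().astype(np.float64)
        a -= a.mean()
        b -= b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        coeffs.append((a * b).sum() / denom if denom else 0.0)
    return float(np.mean(coeffs))
```

A system following this idea might, for example, honor a template's “beam on” signal only while the confidence over the last few frames stays above some prescribed level.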
FIG. 10 shows a method 1000 for gating a medical treatment based on a degree of detected motion of the target region 34 in accordance with an embodiment of the present invention. - To gate a medical treatment on the
target region 34 of the patient 30 undergoing fluoroscopic imaging, a real-time input fluoroscopic image is generated using the fluoroscopic system 10 of FIG. 1 (Step 1004). - Next, a ROI in the input fluoroscopic image is determined (Step 1006). In one embodiment, the ROI includes at least a portion of the
target region 34, which can be a tissue targeted for treatment, or alternatively, any other tissue captured in the input fluoroscopic image. The ROI can be determined by a physician during a treatment or planning session. For example, the ROI may be defined by a frame circumscribing a portion of the input fluoroscopic image. - Next, a composite image CI is created by subtracting the image in the ROI in the previous input fluoroscopic image from the image in the ROI in the current input fluoroscopic image (Step 1008). For example, for the third input fluoroscopic image IFI.sub.3 generated in a sequence, a corresponding composite image CI.sub.3 is created by subtracting the image in the ROI in the previous input fluoroscopic image (i.e., the second fluoroscopic image IFI.sub.2) from the third input fluoroscopic image IFI.sub.3 (i.e., CI.sub.n=IFI.sub.n−IFI.sub.n−1). It should be understood that this step need not be performed for the first input fluoroscopic image in the sequence since there is no previous input fluoroscopic image before the first input fluoroscopic image.
- A value associated with a contrast of the composite image is next calculated over the ROI (Step 1010). In one embodiment, the variance of the pixels in the composite image, which is associated with a contrast of the composite image CI, may be calculated over the ROI, and may be used as a measure of the extent of motion undergone by the tissue within the ROI (e.g., the target region 34). In other embodiments, different measures of the contrast in the composite image may be used.
- A beam gating signal is determined based on the calculated value (Step 1012). Since an image of an object in the ROI having low contrast indicates that the object has not moved significantly over time, and vice versa, a radiation beam may be disabled when the calculated value (associated with the contrast of the composite image in the ROI) exceeds a certain threshold, and be enabled when the value is below the threshold. In one embodiment, if the calculated value m > T·A, then a radiation beam is disabled, and vice versa, where T is a prescribed threshold value, and A is a normalization factor for compensating for changes or daily variations in the operation of the
fluoroscopic imaging system 10. One possible value for A is A=|max m(t)−min m(t)| where max m(t) and min m(t) are derived from observing m over a recent physiological cycle, such as a respiratory cycle or a cardiac cycle. - The next real-time input fluoroscopic image is generated and the previously described process is repeated until sufficient radiation has been delivered to the target region 34 (Step 1014).
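Steps 1008 through 1012 can be sketched as follows. This is a minimal illustration under stated assumptions (NumPy arrays for the ROI images, and a deque holding roughly one physiological cycle of recent variance values); the disclosed system is not limited to this form:

```python
import numpy as np
from collections import deque

def beam_enabled(curr_roi, prev_roi, history, T=0.5):
    """Decide whether the treatment beam may stay on, based on the
    contrast (pixel variance) of the ROI difference image."""
    # Step 1008: composite image CI_n = IFI_n - IFI_{n-1} over the ROI
    composite = curr_roi.astype(np.float64) - prev_roi.astype(np.float64)
    # Step 1010: variance of the composite pixels as the motion measure m
    m = composite.var()
    history.append(m)
    # Normalization A = |max m(t) - min m(t)| over the recent history
    # (roughly one respiratory or cardiac cycle); fall back to 1.0
    # before any spread has been observed.
    A = abs(max(history) - min(history)) or 1.0
    # Step 1012: disable the beam when m > T * A
    return bool(m <= T * A)

# History sized to roughly one physiological cycle of frames (assumed rate).
history = deque(maxlen=30)
```

The deque length and the default threshold T are placeholders; in practice both would be set from the observed frame rate and the planning session.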
- Target Object Position Monitoring
- Besides dynamically targeting a moving object and gating a medical procedure, methods similar to those described previously may also be used to monitor or determine the position of a target object during a session. The target object may be a patient or an internal organ.
- In one embodiment, a position of the
object 30 may be determined using a method that is similar to that discussed previously with reference to FIG. 2. In this case, instead of generating a sequence of templates, one template is generated using the process 500 discussed previously. Specifically, a portion of the reference fluoroscopic image containing the target object (i.e., an object that is not expected to move beyond a certain prescribed threshold during a session) is selected as the template. During a treatment or diagnostic session, input fluoroscopic images of the target object 30 are analyzed and compared with the template to determine the position of the object in the input fluoroscopic images. For example, the processor 14 may perform image analysis to determine a portion in each of the input fluoroscopic images that best matches with the template. The position of the matched portion in each of the input fluoroscopic images represents the position of the object. By observing the determined positions of the object in the input fluoroscopic images, one can determine how much the target object 30 has moved during a session. With respect to radiation therapy, if it is determined that the object 30 has moved beyond a certain prescribed threshold, the radiation beam 29 may be deactivated. - In certain situations, it may be desirable to determine that there is target object movement, and it may not be necessary to determine how much an object has moved.
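The best-match search described above can be sketched as an exhaustive sum-of-squared-differences scan. This is a hedged illustration only; practical systems often use normalized cross-correlation or gradient-based registration instead, and the names below are invented for the example:

```python
import numpy as np

def locate_template(image, template):
    """Return the (row, col) of the top-left corner of the patch in
    `image` that best matches `template`, found by exhaustively scanning
    all positions and minimizing the sum of squared differences."""
    H, W = image.shape
    h, w = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            patch = image[r:r + h, c:c + w].astype(np.float64)
            ssd = np.sum((patch - template.astype(np.float64)) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos
```

Comparing the returned position across successive input images gives the displacement of the object; exceeding a prescribed threshold could then deactivate the beam.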
FIG. 11 shows a method 1100 for target object position monitoring (i.e., determining whether there is target object movement) in accordance with an embodiment of the present invention. First, the radiation source 22 of the fluoroscopic system 10 and the image detector are positioned and aimed towards the target object 30, and a reference fluoroscopic image RFI is generated using the fluoroscopic system 10 (Step 1102). - Next, a portion of the reference fluoroscopic image is selected as a ROI (Step 1104). Particularly, the portion of the reference fluoroscopic image should contain an image of a target object that is expected to be held relatively stationary during a treatment or diagnostic session. The position of the ROI in the reference fluoroscopic image may be stored in a computer-readable medium for later use.
- To perform target object position monitoring during a treatment or diagnostic session, a real-time input fluoroscopic image IFI.sub.n is generated using the fluoroscopic system 10 (Step 1106). In the illustrated embodiment, the reference fluoroscopic image and the input fluoroscopic image are generated in the same session with the patient 30 staying in substantially the same position. Alternatively, the reference fluoroscopic image and the input fluoroscopic image may be generated in different sessions. In this case, the
x-ray source 22 and image detector are set up such that their positions and orientations relative to the patient 30 are substantially the same as those in which the reference fluoroscopic image was generated. - In
Step 1108, the reference fluoroscopic image RFI is subtracted from the current input fluoroscopic image IFI.sub.n over the ROI to obtain a composite image CI.sub.n (i.e., CI.sub.n=IFI.sub.n−RFI). In other words, a portion of the input fluoroscopic image IFI.sub.n having the same position as the ROI in the reference fluoroscopic image RFI is selected, and the image in the ROI is subtracted from it to obtain the composite image CI.sub.n. The composite image CI.sub.n is then analyzed to determine whether there has been target object movement (Step 1110). If there has been target object movement, the pixels in the composite image CI.sub.n should show an increase in contrast. The target object 30 may be considered to have moved if the contrast increase is above a certain prescribed threshold. With respect to radiation therapy, the radiation beam 29 may be deactivated when the contrast increase is above a prescribed threshold. - The next real-time input fluoroscopic image is then generated and the previously described process is repeated until the end of the session is reached (Step 1112).
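Steps 1108 through 1110 can be sketched as a reference-subtraction test. The use of pixel variance as the contrast measure and the names below are assumptions for illustration:

```python
import numpy as np

def target_moved(input_roi, reference_roi, threshold):
    """Subtract the reference image from the current input image over
    the ROI and flag movement when the contrast (pixel variance) of the
    composite image exceeds a prescribed threshold."""
    composite = input_roi.astype(np.float64) - reference_roi.astype(np.float64)
    return bool(composite.var() > threshold)
```

When the target has not moved, the composite image is nearly uniform and its variance stays low; movement introduces structure and raises it.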
- The above-described target object position monitoring and determination may be performed in conjunction with the dynamic targeting or gating of a medical procedure described previously. Alternatively, other techniques for monitoring or determining a target object position, such as those described in U.S. patent application Ser. No. 09/893,122, may also be used. The entire disclosure of the U.S. patent application Ser. No. 09/893,122 is expressly incorporated by reference herein.
- Computer System Architecture
-
FIG. 12 is a block diagram that illustrates an embodiment of a computer system 1200 upon which an embodiment of the invention may be implemented. Computer system 1200 includes a bus 1202 or other communication mechanism for communicating information, and a processor 1204 coupled with the bus 1202 for processing information. The processor 1204 may be an example of the processor 14 of FIG. 1. The computer system 1200 also includes a main memory 1206, such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 1202 for storing information and instructions to be executed by the processor 1204. The main memory 1206 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1204. The computer system 1200 further includes a read only memory (ROM) 1208 or other static storage device coupled to the bus 1202 for storing static information and instructions for the processor 1204. A data storage device 1210, such as a magnetic disk or optical disk, is provided and coupled to the bus 1202 for storing information and instructions.
computer system 1200 may be coupled via the bus 1202 to a display 1212, such as a cathode ray tube (CRT), for displaying information to a user. An input device 1214, including alphanumeric and other keys, is coupled to the bus 1202 for communicating information and command selections to processor 1204. Another type of user input device is cursor control 1216, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1204 and for controlling cursor movement on display 1212. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. - The invention is related to the use of
computer system 1200 for processing images. According to one embodiment of the invention, such use is provided by computer system 1200 in response to processor 1204 executing one or more sequences of one or more instructions contained in the main memory 1206. Such instructions may be read into the main memory 1206 from another computer-readable medium, such as storage device 1210. Execution of the sequences of instructions contained in the main memory 1206 causes the processor 1204 to perform the process steps described herein. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in the main memory 1206. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware circuitry and software. - The term “computer-readable medium” as used herein refers to any medium that participates in providing instructions to the
processor 1204 for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical or magnetic disks, such as the storage device 1210. Volatile media includes dynamic memory, such as the main memory 1206. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 1202. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. - Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the
processor 1204 for execution. For example, the instructions may initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the computer system 1200 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector coupled to the bus 1202 can receive the data carried in the infrared signal and place the data on the bus 1202. The bus 1202 carries the data to the main memory 1206, from which the processor 1204 retrieves and executes the instructions. The instructions received by the main memory 1206 may optionally be stored on the storage device 1210 either before or after execution by the processor 1204. - The
computer system 1200 also includes a communication interface 1218 coupled to the bus 1202. The communication interface 1218 provides a two-way data communication coupling to a network link 1220 that is connected to a local network 1222. For example, the communication interface 1218 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, the communication interface 1218 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, the communication interface 1218 sends and receives electrical, electromagnetic or optical signals that carry data streams representing various types of information. - The
network link 1220 typically provides data communication through one or more networks to other devices. For example, the network link 1220 may provide a connection through the local network 1222 to a host computer 1224 or to medical equipment 1226 such as a radiation beam source or a switch operatively coupled to a radiation beam source. The data streams transported over the network link 1220 can comprise electrical, electromagnetic or optical signals. The signals through the various networks and the signals on the network link 1220 and through the communication interface 1218, which carry data to and from the computer system 1200, are exemplary forms of carrier waves transporting the information. The computer system 1200 can send messages and receive data, including program code, through the network(s), the network link 1220, and the communication interface 1218. - Although the embodiments of the systems and methods have been described with reference to fluoroscopic imaging, it should be understood that the systems and methods may also be implemented using other types of imaging. Depending on the type of imaging used, the previously described methods may be modified, and are intended to be within the scope of the present invention. For example, if the type of imaging technique used is such that it can generate images of a target region with sufficient contrast or desired features, then the steps (i.e.,
steps 206 and 508) of enhancing a moving object in an image may not be necessary. Particularly, in other embodiments, if the contrasts or features of an image in the templates and the input images are such that they allow registration between the templates and the input images, then the steps 206 and 508 of enhancing a moving object need not be performed. - Although the methods have been described with reference to radiation treatment, it should be understood that the same or similar methods may also be used to perform other types of medical procedures. For example, the gating methods described with reference to
FIGS. 8-10 may be used in various diagnostic imaging procedures as well as in image-guided surgery in which movement of surgical instruments is controlled by the position of the target object. In addition, besides the real-time and predictive gating described previously, the above-described methods may also have applications in retrospective gating. In this case, the input fluoroscopic images or the processed input fluoroscopic images can be time-stamped and stored for future processing. For example, in three-dimensional imaging applications such as computed tomography, PET, and MRI, physiological data (e.g., the position of the target region or patient) obtained from the processed input fluoroscopic images can be used to retrospectively “gate” a reconstruction process. For this purpose, the raw data associated with the imaging application is synchronized to a common time base with the physiological motion data. Segments of the raw data that correspond to movement cycle intervals of interest are used to reconstruct the volumetric image, thus minimizing the distortion and size changes caused by patient motion. - Furthermore, the
method 200 is not limited to determining a position of a portion of a patient or animal body. The method 200 may also be used to determine a position of a non-animal body or other objects in a medical or non-medical environment.
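The retrospective gating described above amounts to selecting time-stamped raw-data segments by physiological phase before reconstruction. A schematic sketch follows; the function name, the pair representation of samples, and the phase-in-[0, 1) convention are all invented for illustration:

```python
def retrospective_gate(samples, phase_of, lo, hi):
    """Keep only the time-stamped raw-data samples whose physiological
    phase falls inside the movement-cycle interval of interest; only
    those segments would then feed the volumetric reconstruction."""
    # samples: iterable of (timestamp, raw_data) pairs on a common time base
    # phase_of: maps a timestamp to a phase in [0, 1) of the motion cycle
    return [data for (t, data) in samples if lo <= phase_of(t) <= hi]
```

Selecting, say, the end-expiration interval of each respiratory cycle in this way discards the segments acquired during rapid motion, reducing distortion in the reconstructed volume.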
processor 14 can be performed by any combination of hardware and software within the scope of the invention, and should not be limited to particular embodiments comprising a particular definition of “processor”. The specification and drawings are, accordingly, to be regarded in an illustrative rather than restrictive sense. The present inventions are intended to cover alternatives, modifications, and equivalents, which may be included within the spirit and scope of the present inventions as defined by the claims.
Claims (24)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/182,932 US20090060311A1 (en) | 2003-09-05 | 2008-07-30 | Systems and methods for processing x-ray images |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/656,063 US7158610B2 (en) | 2003-09-05 | 2003-09-05 | Systems and methods for processing x-ray images |
US11/593,950 US7620146B2 (en) | 1998-10-23 | 2006-11-06 | Systems and methods for processing x-ray images |
US12/182,932 US20090060311A1 (en) | 2003-09-05 | 2008-07-30 | Systems and methods for processing x-ray images |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/593,950 Continuation US7620146B2 (en) | 1998-10-23 | 2006-11-06 | Systems and methods for processing x-ray images |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090060311A1 true US20090060311A1 (en) | 2009-03-05 |
Family
ID=34226273
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/656,063 Expired - Fee Related US7158610B2 (en) | 1998-10-23 | 2003-09-05 | Systems and methods for processing x-ray images |
US11/593,950 Expired - Fee Related US7620146B2 (en) | 1998-10-23 | 2006-11-06 | Systems and methods for processing x-ray images |
US12/182,932 Abandoned US20090060311A1 (en) | 2003-09-05 | 2008-07-30 | Systems and methods for processing x-ray images |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/656,063 Expired - Fee Related US7158610B2 (en) | 1998-10-23 | 2003-09-05 | Systems and methods for processing x-ray images |
US11/593,950 Expired - Fee Related US7620146B2 (en) | 1998-10-23 | 2006-11-06 | Systems and methods for processing x-ray images |
Country Status (4)
Country | Link |
---|---|
US (3) | US7158610B2 (en) |
EP (1) | EP1661440A4 (en) |
JP (1) | JP4842820B2 (en) |
WO (1) | WO2005025279A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100231709A1 (en) * | 2009-03-16 | 2010-09-16 | Fuji Xerox Co., Ltd. | Position measurement system, position measurement method and computer-readable medium |
US20100278414A1 (en) * | 2009-04-29 | 2010-11-04 | Kajetan Berlinger | Method and device for determining preferred alignments of a treatment beam generator |
US20150327828A1 (en) * | 2013-03-06 | 2015-11-19 | Fujifilm Corporation | Body motion display device and body motion display method |
Families Citing this family (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6937696B1 (en) | 1998-10-23 | 2005-08-30 | Varian Medical Systems Technologies, Inc. | Method and system for predictive physiological gating |
US7158610B2 (en) * | 2003-09-05 | 2007-01-02 | Varian Medical Systems Technologies, Inc. | Systems and methods for processing x-ray images |
US7769430B2 (en) | 2001-06-26 | 2010-08-03 | Varian Medical Systems, Inc. | Patient visual instruction techniques for synchronizing breathing with a medical procedure |
US7620444B2 (en) | 2002-10-05 | 2009-11-17 | General Electric Company | Systems and methods for improving usability of images for medical applications |
US8571639B2 (en) * | 2003-09-05 | 2013-10-29 | Varian Medical Systems, Inc. | Systems and methods for gating medical procedures |
JP4439882B2 (en) * | 2003-11-14 | 2010-03-24 | キヤノン株式会社 | Radiation image processing apparatus and processing method |
US7366336B2 (en) * | 2004-03-09 | 2008-04-29 | Siemens Medical Solutions Usa, Inc. | System to link periodic X-ray images |
US7388976B2 (en) * | 2004-03-09 | 2008-06-17 | Siemens Medical Solutions Usa, Inc. | Time-based system to link periodic X-ray images |
US20070053491A1 (en) * | 2005-09-07 | 2007-03-08 | Eastman Kodak Company | Adaptive radiation therapy method with target detection |
US20070189455A1 (en) * | 2006-02-14 | 2007-08-16 | Accuray Incorporated | Adaptive x-ray control |
US7570738B2 (en) * | 2006-08-04 | 2009-08-04 | Siemens Medical Solutions Usa, Inc. | Four-dimensional (4D) image verification in respiratory gated radiation therapy |
WO2008044572A1 (en) * | 2006-10-04 | 2008-04-17 | Hitachi Medical Corporation | Medical image diagnostic device |
US7620147B2 (en) * | 2006-12-13 | 2009-11-17 | Oraya Therapeutics, Inc. | Orthovoltage radiotherapy |
US7535991B2 (en) | 2006-10-16 | 2009-05-19 | Oraya Therapeutics, Inc. | Portable orthovoltage radiotherapy |
WO2008086434A2 (en) * | 2007-01-09 | 2008-07-17 | Cyberheart, Inc. | Depositing radiation in heart muscle under ultrasound guidance |
US20080177280A1 (en) * | 2007-01-09 | 2008-07-24 | Cyberheart, Inc. | Method for Depositing Radiation in Heart Muscle |
US10974075B2 (en) | 2007-03-16 | 2021-04-13 | Varian Medical Systems, Inc. | Radiation treatment planning and delivery for moving targets in the heart |
WO2008115830A2 (en) * | 2007-03-16 | 2008-09-25 | Cyberheart, Inc. | Radiation treatment planning and delivery for moving targets in the heart |
US8363783B2 (en) * | 2007-06-04 | 2013-01-29 | Oraya Therapeutics, Inc. | Method and device for ocular alignment and coupling of ocular structures |
US8920406B2 (en) * | 2008-01-11 | 2014-12-30 | Oraya Therapeutics, Inc. | Device and assembly for positioning and stabilizing an eye |
US8315354B2 (en) * | 2007-12-07 | 2012-11-20 | Konica Minolta Medical & Graphic, Inc. | Dynamic radiographing system |
CN101951990A (en) | 2007-12-23 | 2011-01-19 | Oraya治疗公司 | Methods and devices for detecting, controlling, and predicting radiation delivery |
US7801271B2 (en) | 2007-12-23 | 2010-09-21 | Oraya Therapeutics, Inc. | Methods and devices for orthovoltage ocular radiotherapy and treatment planning |
US10667727B2 (en) | 2008-09-05 | 2020-06-02 | Varian Medical Systems, Inc. | Systems and methods for determining a state of a patient |
JP2012501792A (en) | 2008-09-12 | 2012-01-26 | アキュレイ インコーポレイテッド | Control of X-ray imaging based on target movement |
US8396248B2 (en) * | 2008-09-16 | 2013-03-12 | Varian Medical Systems, Inc. | Sequential stereo imaging for estimating trajectory and monitoring target position |
US8337512B2 (en) * | 2008-12-08 | 2012-12-25 | Siemens Aktiengesellschaft | Device and workflow for minimally-invasive therapy, in particular needle guidance |
EP2453793A1 (en) | 2009-07-17 | 2012-05-23 | Cyberheart, Inc. | Heart treatment kit, system, and method for radiosurgically alleviating arrhythmia |
EP2543018B1 (en) * | 2010-03-02 | 2016-03-02 | Brainlab AG | Tracking representations of indicator body parts |
US8798347B2 (en) * | 2010-03-15 | 2014-08-05 | Siemens Aktiengesellschaft | System and method for image-based respiratory motion compensation for fluoroscopic coronary roadmapping |
JP5441781B2 (en) * | 2010-03-25 | 2014-03-12 | キヤノン株式会社 | Photoacoustic imaging apparatus, photoacoustic imaging method, and program |
US8460166B2 (en) * | 2010-10-01 | 2013-06-11 | Elekta Ab (Publ) | Radiotherapy planning and delivery |
US10307619B2 (en) * | 2012-02-06 | 2019-06-04 | Insightec, Ltd. | Reference-library extension during imaging of moving organs |
US9002079B2 (en) * | 2012-05-21 | 2015-04-07 | General Electric Company | Systems and methods for motion detecting for medical imaging |
JP5981220B2 (en) * | 2012-05-21 | 2016-08-31 | 東芝メディカルシステムズ株式会社 | Medical image processing apparatus and X-ray imaging apparatus |
JP5954734B2 (en) * | 2012-09-11 | 2016-07-20 | 株式会社日立製作所 | Moving body tracking device and radiation therapy system |
DE102013201822B4 (en) * | 2013-02-05 | 2022-12-29 | Siemens Healthcare Gmbh | Method for generating a PET or SPECT image data set and hybrid imaging modality therefor |
US20140243579A1 (en) * | 2013-02-27 | 2014-08-28 | Loyola University Chicago | Dual-energy image suppression method |
US9972088B2 (en) * | 2013-05-28 | 2018-05-15 | Konica Minolta, Inc. | Image processing apparatus and storage medium |
US9375184B2 (en) * | 2013-09-12 | 2016-06-28 | Technische Universität München | System and method for prediction of respiratory motion from 3D thoracic images |
CN106061376B (en) * | 2014-03-03 | 2020-07-28 | 瓦里安医疗系统公司 | System and method for patient position monitoring |
JP6215108B2 (en) * | 2014-03-27 | 2017-10-18 | 株式会社日立製作所 | Bed positioning device for radiation therapy equipment |
BE1022455B1 (en) * | 2014-10-28 | 2016-04-06 | Erik Billiet | METHOD OF OPTIMIZING IN-VIVO MEASUREMENT ACCURACY WHEN MEASURING INVASIVE BLOOD PRESSURE WITH A LIQUID-FILLED CATHETER MANOMETER SYSTEM |
DE102015215584B4 (en) * | 2015-08-14 | 2022-03-03 | Siemens Healthcare Gmbh | Method and system for the reconstruction of planning images |
JP6533991B2 (en) * | 2016-02-16 | 2019-06-26 | 東芝エネルギーシステムズ株式会社 | MEDICAL IMAGE PROCESSING APPARATUS, METHOD, PROGRAM, AND RADIATION THERAPY APPARATUS |
JP6811960B2 (en) * | 2016-11-15 | 2021-01-13 | 株式会社島津製作所 | X-ray fluoroscopy method and X-ray fluoroscopy device |
JP2019017867A (en) * | 2017-07-20 | 2019-02-07 | 株式会社東芝 | Information processing apparatus, information processing system, and program |
JP7140320B2 (en) * | 2017-12-20 | 2022-09-21 | 国立研究開発法人量子科学技術研究開発機構 | MEDICAL DEVICE, METHOD OF CONTROLLING MEDICAL DEVICE, AND PROGRAM |
JP7116944B2 (en) | 2017-12-20 | 2022-08-12 | 国立研究開発法人量子科学技術研究開発機構 | MEDICAL DEVICE, METHOD OF CONTROLLING MEDICAL DEVICE, AND PROGRAM |
EP3412208B1 (en) * | 2018-01-29 | 2021-05-26 | Siemens Healthcare GmbH | Provision of a medical image |
WO2020115520A1 (en) * | 2018-12-02 | 2020-06-11 | Playsight Interactive Ltd. | Ball tracking in sport events |
JP2019147062A (en) * | 2019-06-18 | 2019-09-05 | 株式会社東芝 | Medical image processing device |
Citations (90)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3952201A (en) * | 1973-07-21 | 1976-04-20 | Emi Limited | Radiography |
US3974386A (en) * | 1974-07-12 | 1976-08-10 | Wisconsin Alumni Research Foundation | Differential X-ray method and apparatus |
US4289142A (en) * | 1978-11-24 | 1981-09-15 | Kearns Kenneth L | Physiological occurrence, such as apnea, monitor and X-ray triggering device |
US4335427A (en) * | 1980-04-21 | 1982-06-15 | Technicare Corporation | Method of selecting a preferred difference image |
US4387722A (en) * | 1978-11-24 | 1983-06-14 | Kearns Kenneth L | Respiration monitor and x-ray triggering apparatus |
US4545384A (en) * | 1983-02-23 | 1985-10-08 | Tokyo Shibaura Denki Kabushiki Kaisha | Nuclear magnetic resonance diagnostic apparatus |
US4663591A (en) * | 1985-08-16 | 1987-05-05 | General Electric Company | Method for reducing image artifacts due to periodic signal variations in NMR imaging |
US4672651A (en) * | 1985-03-28 | 1987-06-09 | Hitachi Medical Corporation | Method of and apparatus for reconstructing shape of interested part of object through irradiation with X-rays |
US4686999A (en) * | 1985-04-10 | 1987-08-18 | Tri Fund Research Corporation | Multi-channel ventilation monitor and method |
US4802486A (en) * | 1985-04-01 | 1989-02-07 | Nellcor Incorporated | Method and apparatus for detecting optical pulses |
US4928692A (en) * | 1985-04-01 | 1990-05-29 | Goodman David E | Method and apparatus for detecting optical pulses |
US5051903A (en) * | 1989-08-14 | 1991-09-24 | General Electric Company | Method and apparatus for predicting values of a varying periodic phenomenon |
US5109435A (en) * | 1988-08-08 | 1992-04-28 | Hughes Aircraft Company | Segmentation method for use against moving objects |
US5107845A (en) * | 1987-11-23 | 1992-04-28 | Bertin & Cie | Method and device for monitoring human respiration |
Family Cites Families (85)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3861807A (en) | 1972-08-17 | 1975-01-21 | Charles Lescrenier | Position locating and maintaining method and means |
US3871360A (en) | 1973-07-30 | 1975-03-18 | Brattle Instr Corp | Timing biological imaging, measuring, and therapeutic timing systems |
FR2273505A1 (en) | 1974-06-07 | 1976-01-02 | Inst Nat Sante Rech Med | RESPIRATORY AND HEART CYCLE CORRELATION DEVICE, AND APPLICATION TO HEART RATE MEASUREMENT |
DE2718804C3 (en) | 1977-04-27 | 1979-10-31 | Karlheinz Prof. Dr. 3000 Hannover Renner | Device for positioning control of patients and / or radiation sources |
US4463425A (en) | 1980-07-17 | 1984-07-31 | Terumo Corporation | Period measurement system |
JPS58136334A (en) * | 1982-02-05 | 1983-08-13 | 株式会社日立メデイコ | X-ray mobile image measuring apparatus |
US4971065A (en) | 1985-02-11 | 1990-11-20 | Pearce Stephen D | Transducer for detecting apnea |
DE3514542A1 (en) | 1985-04-22 | 1986-10-23 | Siemens AG, 1000 Berlin und 8000 München | METHOD AND DEVICE FOR COMPOSING AN MR IMAGE FROM BREATH-CONTROLLED IMAGE DATA |
EP0205931B1 (en) | 1985-05-23 | 1990-11-22 | Heinrich Prof. Dr. Ing. Reents | Device for measuring vital functions of a human, in particular of an infant |
US4853771A (en) | 1986-07-09 | 1989-08-01 | The United States Of America As Represented By The Secretary Of The Navy | Robotic vision system |
US4710717A (en) | 1986-12-29 | 1987-12-01 | General Electric Company | Method for fast scan cine NMR imaging |
FR2637189A1 (en) | 1988-10-04 | 1990-04-06 | Cgr Mev | SYSTEM AND METHOD FOR MEASURING AND / OR VERIFYING THE POSITION OF A PATIENT IN RADIOTHERAPY EQUIPMENT |
US4994965A (en) | 1988-11-23 | 1991-02-19 | General Electric Company | Method for reducing motion induced image artifacts in projection imaging |
US5295483A (en) | 1990-05-11 | 1994-03-22 | Christopher Nowacki | Locating target in human body |
US5662111A (en) | 1991-01-28 | 1997-09-02 | Cosman; Eric R. | Process of stereotactic optical navigation |
US5279309A (en) | 1991-06-13 | 1994-01-18 | International Business Machines Corporation | Signaling device and method for monitoring positions in a surgical operation |
US5262945A (en) | 1991-08-09 | 1993-11-16 | The United States Of America As Represented By The Department Of Health And Human Services | Method for quantification of brain volume from magnetic resonance images |
US5619995A (en) | 1991-11-12 | 1997-04-15 | Lobodzinski; Suave M. | Motion video transformation system and method |
DE4207632C2 (en) | 1992-03-11 | 1995-07-20 | Bodenseewerk Geraetetech | Device and method for positioning a body part for treatment purposes |
US5389101A (en) | 1992-04-21 | 1995-02-14 | University Of Utah | Apparatus and method for photogrammetric surgical localization |
US5603318A (en) | 1992-04-21 | 1997-02-18 | University Of Utah Research Foundation | Apparatus and method for photogrammetric surgical localization |
US5265142A (en) | 1992-05-08 | 1993-11-23 | General Electric Company | Image reconstruction technique for a computer tomography system |
JP2785921B2 (en) | 1992-10-21 | 1998-08-13 | シャープ株式会社 | Semiconductor laser drive circuit for optical memory readout device |
US5513646A (en) | 1992-11-09 | 1996-05-07 | I Am Fine, Inc. | Personal security monitoring system and method |
JPH06292085A (en) * | 1993-04-01 | 1994-10-18 | Toshiba Corp | Method and device for processing radiograph |
EP0699050B1 (en) | 1993-04-26 | 2004-03-03 | St. Louis University | Indicating the position of a probe |
JPH0723945A (en) | 1993-07-07 | 1995-01-27 | Toshiba Corp | Medical image photographing device |
US5363844A (en) | 1993-08-13 | 1994-11-15 | Mayo Foundation For Medical Education And Research | Breath-hold monitor for MR imaging |
JP3393895B2 (en) | 1993-09-13 | 2003-04-07 | 株式会社東芝 | Magnetic resonance imaging |
US5394875A (en) | 1993-10-21 | 1995-03-07 | Lewis; Judith T. | Automatic ultrasonic localization of targets implanted in a portion of the anatomy |
US5396875A (en) | 1994-02-08 | 1995-03-14 | Ford Motor Company | Air/fuel control with adaptively learned reference |
US5538494A (en) | 1994-03-17 | 1996-07-23 | Hitachi, Ltd. | Radioactive beam irradiation method and apparatus taking movement of the irradiation area into consideration |
JP3810019B2 (en) * | 1994-04-08 | 2006-08-16 | 株式会社東芝 | Image shooting device |
US5912656A (en) | 1994-07-01 | 1999-06-15 | Ohmeda Inc. | Device for producing a display from monitored data |
US5531520A (en) | 1994-09-01 | 1996-07-02 | Massachusetts Institute Of Technology | System and method of registration of three-dimensional data sets including anatomical body data |
US5622187A (en) | 1994-09-30 | 1997-04-22 | Nomos Corporation | Method and apparatus for patient positioning for radiation therapy |
US5582182A (en) | 1994-10-03 | 1996-12-10 | Sierra Biotechnology Company, Lc | Abnormal dyspnea perception detection system and method |
US5588430A (en) | 1995-02-14 | 1996-12-31 | University Of Florida Research Foundation, Inc. | Repeat fixation for frameless stereotactic procedure |
US6259943B1 (en) | 1995-02-16 | 2001-07-10 | Sherwood Services Ag | Frameless to frame-based registration system |
JP2000500031A (en) | 1995-07-16 | 2000-01-11 | ウルトラ−ガイド リミティド | Aiming for freehand needle guidance |
GB9515311D0 (en) | 1995-07-26 | 1995-09-20 | 3D Scanners Ltd | Stripe scanners and methods of scanning |
DE19529639C2 (en) | 1995-08-11 | 1997-06-19 | Siemens Ag | Process for the temporal and location-resolved representation of functional brain activities of a patient and arrangement for carrying out the process |
US5638819A (en) | 1995-08-29 | 1997-06-17 | Manwaring; Kim H. | Method and apparatus for guiding an instrument to a target |
JP3597918B2 (en) | 1995-09-11 | 2004-12-08 | 株式会社日立メディコ | X-ray CT system |
US5794621A (en) | 1995-11-03 | 1998-08-18 | Massachusetts Institute Of Technology | System and method for medical imaging utilizing a robotic device, and robotic device for use in medical imaging |
US5828770A (en) | 1996-02-20 | 1998-10-27 | Northern Digital Inc. | System for determining the spatial position and angular orientation of an object |
US5771310A (en) | 1996-12-30 | 1998-06-23 | Shriners Hospitals For Children | Method and apparatus for recording three-dimensional topographies |
US5823192A (en) | 1996-07-31 | 1998-10-20 | University Of Pittsburgh Of The Commonwealth System Of Higher Education | Apparatus for automatically positioning a patient for treatment/diagnoses |
US5820553A (en) | 1996-08-16 | 1998-10-13 | Siemens Medical Systems, Inc. | Identification system and method for radiation therapy |
US6296613B1 (en) | 1997-08-22 | 2001-10-02 | Synthes (U.S.A.) | 3D ultrasound recording device |
US5727554A (en) | 1996-09-19 | 1998-03-17 | University Of Pittsburgh Of The Commonwealth System Of Higher Education | Apparatus responsive to movement of a patient during treatment/diagnosis |
US5764723A (en) | 1996-10-16 | 1998-06-09 | The Trustees Of Columbia University In The City Of New York | Apparatus and method to gate a source for radiation therapy |
US5784431A (en) | 1996-10-29 | 1998-07-21 | University Of Pittsburgh Of The Commonwealth System Of Higher Education | Apparatus for matching X-ray images with reference images |
US5906202A (en) | 1996-11-21 | 1999-05-25 | Aradigm Corporation | Device and method for directing aerosolized mist to a specific area of the respiratory tract |
FR2760277B1 (en) | 1997-02-28 | 1999-03-26 | Commissariat Energie Atomique | METHOD AND DEVICE FOR LOCATING AN OBJECT IN SPACE |
US5997883A (en) | 1997-07-01 | 1999-12-07 | General Electric Company | Retrospective ordering of segmented MRI cardiac data using cardiac phase |
DE19732784C1 (en) | 1997-07-30 | 1999-03-04 | Bruker Medizintech | Positioning system and method for exact position determination of a manually operated manipulator in an MR tomograph |
US6434507B1 (en) | 1997-09-05 | 2002-08-13 | Surgical Navigation Technologies, Inc. | Medical instrument and method for use with computer-assisted image guided surgery |
DE29724767U1 (en) | 1997-10-01 | 2004-01-08 | Siemens Ag | Medical apparatus with device for detecting position of object - derives three dimensional data corresponding to object position to prevent collision of objects |
US6348058B1 (en) | 1997-12-12 | 2002-02-19 | Surgical Navigation Technologies, Inc. | Image guided spinal surgery guide, system, and method for use thereof |
US5993397A (en) | 1998-01-23 | 1999-11-30 | Branson; Krista Lynn | Infant respiratory monitor |
US6076005A (en) | 1998-02-25 | 2000-06-13 | St. Jude Children's Research Hospital | Respiration responsive gating means and apparatus and methods using the same |
US6198959B1 (en) | 1998-03-27 | 2001-03-06 | Cornell Research Foundation Inc. | Coronary magnetic resonance angiography using motion matched acquisition |
US6185446B1 (en) | 1998-08-21 | 2001-02-06 | William F. Carlsen, Jr. | Method and apparatus for monitoring the breathing of a patient during magnetic resonance imaging |
US6144874A (en) | 1998-10-15 | 2000-11-07 | General Electric Company | Respiratory gating method for MR imaging |
US6279579B1 (en) * | 1998-10-23 | 2001-08-28 | Varian Medical Systems, Inc. | Method and system for positioning patients for medical treatment procedures |
US6621889B1 (en) | 1998-10-23 | 2003-09-16 | Varian Medical Systems, Inc. | Method and system for predictive physiological gating of radiation therapy |
US6973202B2 (en) * | 1998-10-23 | 2005-12-06 | Varian Medical Systems Technologies, Inc. | Single-camera tracking of an object |
US6138302A (en) | 1998-11-10 | 2000-10-31 | University Of Pittsburgh Of The Commonwealth System Of Higher Education | Apparatus and method for positioning patient |
JP4794708B2 (en) * | 1999-02-04 | 2011-10-19 | オリンパス株式会社 | 3D position and orientation sensing device |
US6144875A (en) | 1999-03-16 | 2000-11-07 | Accuray Incorporated | Apparatus and method for compensating for respiratory and patient motion during treatment |
US6501981B1 (en) | 1999-03-16 | 2002-12-31 | Accuray, Inc. | Apparatus and method for compensating for respiratory and patient motions during treatment |
DE19917867B4 (en) | 1999-04-20 | 2005-04-21 | Brainlab Ag | Method and device for image support in the treatment of treatment objectives with integration of X-ray detection and navigation system |
US6370217B1 (en) | 1999-05-07 | 2002-04-09 | General Electric Company | Volumetric computed tomography system for cardiac imaging |
DE19946948A1 (en) | 1999-09-30 | 2001-04-05 | Philips Corp Intellectual Pty | Method and arrangement for determining the position of a medical instrument |
US6661617B1 (en) | 1999-12-14 | 2003-12-09 | Seagate Technology Llc | Structure and fabrication process for integrated moving-coil magnetic micro-actuator |
US7058440B2 (en) * | 2001-06-28 | 2006-06-06 | Koninklijke Philips Electronics N.V. | Dynamic computed tomography imaging using positional state modeling |
DE10133237B4 (en) | 2001-07-09 | 2007-04-19 | Siemens Ag | Method for computed tomography and computed tomography (CT) device |
US6526117B1 (en) | 2001-11-09 | 2003-02-25 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus to minimize phase misregistration artifacts in gated CT images |
US20030210812A1 (en) * | 2002-02-26 | 2003-11-13 | Ali Khamene | Apparatus and method for surgical navigation |
US7182083B2 (en) * | 2002-04-03 | 2007-02-27 | Koninklijke Philips Electronics N.V. | CT integrated respiratory monitor |
US7620444B2 (en) * | 2002-10-05 | 2009-11-17 | General Electric Company | Systems and methods for improving usability of images for medical applications |
US7778691B2 (en) * | 2003-06-13 | 2010-08-17 | Wisconsin Alumni Research Foundation | Apparatus and method using synchronized breathing to treat tissue subject to respiratory motion |
US7306564B2 (en) * | 2003-11-26 | 2007-12-11 | Denso Corporation | Breath monitor |
US7314451B2 (en) * | 2005-04-25 | 2008-01-01 | Earlysense Ltd. | Techniques for prediction and monitoring of clinical episodes |
2003
- 2003-09-05 US US10/656,063 patent/US7158610B2/en not_active Expired - Fee Related
2004
- 2004-09-03 EP EP04783505A patent/EP1661440A4/en not_active Withdrawn
- 2004-09-03 JP JP2006525540A patent/JP4842820B2/en not_active Expired - Fee Related
- 2004-09-03 WO PCT/US2004/029277 patent/WO2005025279A1/en active Search and Examination
2006
- 2006-11-06 US US11/593,950 patent/US7620146B2/en not_active Expired - Fee Related
2008
- 2008-07-30 US US12/182,932 patent/US20090060311A1/en not_active Abandoned
Patent Citations (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3952201A (en) * | 1973-07-21 | 1976-04-20 | Emi Limited | Radiography |
US3974386A (en) * | 1974-07-12 | 1976-08-10 | Wisconsin Alumni Research Foundation | Differential X-ray method and apparatus |
US4289142A (en) * | 1978-11-24 | 1981-09-15 | Kearns Kenneth L | Physiological occurrence, such as apnea, monitor and X-ray triggering device |
US4387722A (en) * | 1978-11-24 | 1983-06-14 | Kearns Kenneth L | Respiration monitor and x-ray triggering apparatus |
US4335427A (en) * | 1980-04-21 | 1982-06-15 | Technicare Corporation | Method of selecting a preferred difference image |
US4545384A (en) * | 1983-02-23 | 1985-10-08 | Tokyo Shibaura Denki Kabushiki Kaisha | Nuclear magnetic resonance diagnostic apparatus |
US4672651A (en) * | 1985-03-28 | 1987-06-09 | Hitachi Medical Corporation | Method of and apparatus for reconstructing shape of interested part of object through irradiation with X-rays |
US4802486A (en) * | 1985-04-01 | 1989-02-07 | Nellcor Incorporated | Method and apparatus for detecting optical pulses |
US4928692A (en) * | 1985-04-01 | 1990-05-29 | Goodman David E | Method and apparatus for detecting optical pulses |
US4686999A (en) * | 1985-04-10 | 1987-08-18 | Tri Fund Research Corporation | Multi-channel ventilation monitor and method |
US4663591A (en) * | 1985-08-16 | 1987-05-05 | General Electric Company | Method for reducing image artifacts due to periodic signal variations in NMR imaging |
US5199424A (en) * | 1987-06-26 | 1993-04-06 | Sullivan Colin E | Device for monitoring breathing during sleep and control of CPAP treatment that is patient controlled |
US5522382A (en) * | 1987-06-26 | 1996-06-04 | Rescare Limited | Device and method for treating obstructed breathing having a delay/ramp feature |
US6635021B1 (en) * | 1987-06-26 | 2003-10-21 | Resmed Limited | Method and apparatus useful in the diagnosis of obstructive sleep apnea of a patient |
US6705315B2 (en) * | 1987-06-26 | 2004-03-16 | Resmed Limited | Device for monitoring breathing during sleep and ramped control of CPAP treatment |
US6398739B1 (en) * | 1987-06-26 | 2002-06-04 | Resmed Limited | Device and method for nonclinical monitoring of breathing during sleep, control of CPAP treatment and preventing apnea |
US5107845A (en) * | 1987-11-23 | 1992-04-28 | Bertin & Cie | Method and device for monitoring human respiration |
US5109435A (en) * | 1988-08-08 | 1992-04-28 | Hughes Aircraft Company | Segmentation method for use against moving objects |
US5134472A (en) * | 1989-02-08 | 1992-07-28 | Kabushiki Kaisha Toshiba | Moving object detection apparatus and method |
US5051903A (en) * | 1989-08-14 | 1991-09-24 | General Electric Company | Method and apparatus for predicting values of a varying periodic phenomenon |
US5377681A (en) * | 1989-11-13 | 1995-01-03 | University Of Florida | Method of diagnosing impaired blood flow |
US5207223A (en) * | 1990-10-19 | 1993-05-04 | Accuray, Inc. | Apparatus for and method of performing stereotaxic surgery |
US5150426A (en) * | 1990-11-20 | 1992-09-22 | Hughes Aircraft Company | Moving target detection method using two-frame subtraction and a two quadrant multiplier |
US20020188194A1 (en) * | 1991-01-28 | 2002-12-12 | Sherwood Services Ag | Surgical positioning system |
US5285786A (en) * | 1991-06-12 | 1994-02-15 | Kabushiki Kaisha Toshiba | Apparatus and method for radiographic diagnosis |
US5239591A (en) * | 1991-07-03 | 1993-08-24 | U.S. Philips Corp. | Contour extraction in multi-phase, multi-slice cardiac mri studies by propagation of seed contours between images |
US5271055A (en) * | 1992-08-19 | 1993-12-14 | General Electric Company | Methods for reducing motion induced artifacts in a projection imaging system |
US5535289A (en) * | 1993-09-13 | 1996-07-09 | Fuji Photo Film Co., Ltd. | Method for reducing noise in energy subtraction images |
US5446548A (en) * | 1993-10-08 | 1995-08-29 | Siemens Medical Systems, Inc. | Patient positioning and monitoring system |
US5515849A (en) * | 1994-01-25 | 1996-05-14 | Aloka Co., Ltd. | Diagnostic ultrasound apparatus |
US5738102A (en) * | 1994-03-31 | 1998-04-14 | Lemelson; Jerome H. | Patient monitoring system |
US5573012A (en) * | 1994-08-09 | 1996-11-12 | The Regents Of The University Of California | Body monitoring and imaging apparatus and method |
US5924989A (en) * | 1995-04-03 | 1999-07-20 | Polz; Hans | Method and device for capturing diagnostically acceptable three-dimensional ultrasound image data records |
US6084939A (en) * | 1996-11-26 | 2000-07-04 | Canon Kabushiki Kaisha | Image processing apparatus and method |
US6526156B1 (en) * | 1997-01-10 | 2003-02-25 | Xerox Corporation | Apparatus and method for identifying and tracking objects with view-based representations |
US6075557A (en) * | 1997-04-17 | 2000-06-13 | Sharp Kabushiki Kaisha | Image tracking system and method and observer tracking autostereoscopic display |
US5982915A (en) * | 1997-07-25 | 1999-11-09 | Arch Development Corporation | Method of detecting interval changes in chest radiographs utilizing temporal subtraction combined with automated initial matching of blurred low resolution images |
US6333991B1 (en) * | 1997-11-15 | 2001-12-25 | Elekta Ab | Analysis of radiographic images |
US6125166A (en) * | 1998-01-13 | 2000-09-26 | Fuji Photo Film Co., Ltd. | Method of forming energy subtraction images |
US6375612B1 (en) * | 1998-03-24 | 2002-04-23 | P. Timothy Guichon | Method and system for monitoring animals |
US6569092B1 (en) * | 1998-03-24 | 2003-05-27 | P. Timothy Guichon | Method and system for monitoring animals |
US6067373A (en) * | 1998-04-02 | 2000-05-23 | Arch Development Corporation | Method, system and computer readable medium for iterative image warping prior to temporal subtraction of chest radiographs in the detection of interval changes |
US6486604B1 (en) * | 1998-04-23 | 2002-11-26 | Thales Electron Devices GmbH | Traveling-wave valve arrangement |
US5993390A (en) * | 1998-09-18 | 1999-11-30 | Hewlett-Packard Company | Segmented 3-D cardiac ultrasound imaging method and apparatus |
US6370417B1 (en) * | 1998-09-22 | 2002-04-09 | Siemens Aktiengesellschaft | Method for positioning a catheter in a vessel, and device for implementing the method |
US7123758B2 (en) * | 1998-10-23 | 2006-10-17 | Varian Medical Systems Technologies, Inc. | Method and system for monitoring breathing activity of a subject |
US20050201510A1 (en) * | 1998-10-23 | 2005-09-15 | Hassan Mostafavi | Method and system for predictive physiological gating |
US6980679B2 (en) * | 1998-10-23 | 2005-12-27 | Varian Medical Systems Technologies, Inc. | Method and system for monitoring breathing activity of a subject |
US6690965B1 (en) * | 1998-10-23 | 2004-02-10 | Varian Medical Systems, Inc. | Method and system for physiological gating of radiation therapy |
US20040005088A1 (en) * | 1998-10-23 | 2004-01-08 | Andrew Jeung | Method and system for monitoring breathing activity of an infant |
US20070053494A1 (en) * | 1998-10-23 | 2007-03-08 | Varian Medical Systems Technologies, Inc. | Systems and methods for processing x-ray images |
US6266443B1 (en) * | 1998-12-22 | 2001-07-24 | Mitsubishi Electric Research Laboratories, Inc. | Object boundary detection using a constrained viterbi search |
US6475156B1 (en) * | 1999-06-14 | 2002-11-05 | Taema | Apparatus for the diagnosis or treatment of respiratory sleep disorders and operating process |
US6546124B1 (en) * | 1999-07-02 | 2003-04-08 | General Electric Company | Method and apparatus for performing an adaptive extended dynamic range algorithm |
US6766064B1 (en) * | 2000-03-10 | 2004-07-20 | General Electric Company | Method and apparatus for performing a contrast based dynamic range management algorithm |
US7003146B2 (en) * | 2000-04-20 | 2006-02-21 | Koninklijke Philips Electronics, N.V. | X-ray examination apparatus and method for forming an X-ray image |
US20020091314A1 (en) * | 2000-07-07 | 2002-07-11 | Cornel Schlossbauer | Method for breath compensation in radiation therapy |
US6731970B2 (en) * | 2000-07-07 | 2004-05-04 | Brainlab Ag | Method for breath compensation in radiation therapy |
US7058204B2 (en) * | 2000-10-03 | 2006-06-06 | Gesturetek, Inc. | Multiple camera control system |
US7062078B2 (en) * | 2000-11-04 | 2006-06-13 | Koninklijke Philips Electronics, N.V. | Method and device for the registration of images |
US6473634B1 (en) * | 2000-11-22 | 2002-10-29 | Koninklijke Philips Electronics N.V. | Medical imaging at two temporal resolutions for tumor treatment planning |
US20020097155A1 (en) * | 2001-01-23 | 2002-07-25 | Cassel Cynthia L. | Combination breathing monitor alarm and audio baby alarm |
US6487274B2 (en) * | 2001-01-29 | 2002-11-26 | Siemens Medical Solutions Usa, Inc. | X-ray target assembly and radiation therapy systems and methods |
US20020118274A1 (en) * | 2001-01-31 | 2002-08-29 | Akira Yahashi | Three-dimensional measuring method and system |
US20020115931A1 (en) * | 2001-02-21 | 2002-08-22 | Strauss H. William | Localizing intravascular lesions on anatomic images |
US6434215B1 (en) * | 2001-06-28 | 2002-08-13 | General Electric Company | EKG-less cardiac image reconstruction |
US7006862B2 (en) * | 2001-07-17 | 2006-02-28 | Accuimage Diagnostics Corp. | Graphical user interfaces and methods for retrospectively gating a set of images |
US20030026758A1 (en) * | 2001-07-27 | 2003-02-06 | Baker Gregg S. | Method and device for monitoring real-time position of an area targeted by a radiosurgery system |
US20060165267A1 (en) * | 2001-10-15 | 2006-07-27 | Bradley Wyman | System and method for determining convergence of image set registration |
US6535574B1 (en) * | 2001-11-01 | 2003-03-18 | Siemens Medical Solutions Usa, Inc. | Patient positioning system employing surface photogrammetry and portal imaging |
US20030086596A1 (en) * | 2001-11-07 | 2003-05-08 | Medical Metrics, Inc. | Method, computer software, and system for tracking, stabilizing, and reporting motion between vertebrae |
US20030135103A1 (en) * | 2001-11-12 | 2003-07-17 | Mistretta Charles A. | Three-dimensional phase contrast imaging using interleaved projection data |
US6678399B2 (en) * | 2001-11-23 | 2004-01-13 | University Of Chicago | Subtraction technique for computerized detection of small lung nodules in computer tomography images |
US20030099388A1 (en) * | 2001-11-23 | 2003-05-29 | University Of Chicago | Novel subtraction technique for computerized detection of small lung nodules in computer tomography images |
US20050002546A1 (en) * | 2001-11-30 | 2005-01-06 | Raoul Florent | Medical viewing system and method for enhancing structures in noisy images |
US7221733B1 (en) * | 2002-01-02 | 2007-05-22 | Varian Medical Systems Technologies, Inc. | Method and apparatus for irradiating a target |
US20030185450A1 (en) * | 2002-02-13 | 2003-10-02 | Garakani Arman M. | Method and apparatus for acquisition, compression, and characterization of spatiotemporal signals |
US7257436B2 (en) * | 2002-06-05 | 2007-08-14 | Anzai Medical Kabushiki Kaisha | Apparatus for generating radiation application synchronizing signal |
US6904126B2 (en) * | 2002-06-19 | 2005-06-07 | Canon Kabushiki Kaisha | Radiological imaging apparatus and method |
US20050080336A1 (en) * | 2002-07-22 | 2005-04-14 | Ep Medsystems, Inc. | Method and apparatus for time gating of medical images |
US6984208B2 (en) * | 2002-08-01 | 2006-01-10 | The Hong Kong Polytechnic University | Method and apparatus for sensing body gesture, posture and movement |
US7227925B1 (en) * | 2002-10-02 | 2007-06-05 | Varian Medical Systems Technologies, Inc. | Gantry mounted stereoscopic imaging system |
US6940945B2 (en) * | 2002-10-08 | 2005-09-06 | Siemens Aktiengesellschaft | Method for producing an x-ray image |
US20040215077A1 (en) * | 2002-11-08 | 2004-10-28 | Witt Jerome F. | Color ensemble interleave with artifact elimination in time-gated anatomical imaging |
US20040092816A1 (en) * | 2002-11-08 | 2004-05-13 | Koninklijke Philips Electronics N.V. | Artifact elimination in time-gated anatomical imaging |
US7103400B2 (en) * | 2002-11-08 | 2006-09-05 | Koninklijke Philips Electronics, N.V. | Artifact elimination in time-gated anatomical imaging |
US7260426B2 (en) * | 2002-11-12 | 2007-08-21 | Accuray Incorporated | Method and apparatus for tracking an internal target region without an implanted fiducial |
US20040092815A1 (en) * | 2002-11-12 | 2004-05-13 | Achim Schweikard | Method and apparatus for tracking an internal target region without an implanted fiducial |
US20040097805A1 (en) * | 2002-11-19 | 2004-05-20 | Laurent Verard | Navigation system for cardiac therapies |
US20040114718A1 (en) * | 2002-11-28 | 2004-06-17 | Elekta Ab | Radiotherapy apparatus and operating method |
US20040234115A1 (en) * | 2003-01-21 | 2004-11-25 | Lambert Zijp | Imaging internal structures |
US20050027196A1 (en) * | 2003-07-30 | 2005-02-03 | Fitzgerald Loretta A. | System for processing patient radiation treatment data |
US20050053267A1 (en) * | 2003-09-05 | 2005-03-10 | Varian Medical Systems Technologies, Inc. | Systems and methods for tracking moving targets and monitoring object positions |
US7158610B2 (en) * | 2003-09-05 | 2007-01-02 | Varian Medical Systems Technologies, Inc. | Systems and methods for processing x-ray images |
US20050054916A1 (en) * | 2003-09-05 | 2005-03-10 | Varian Medical Systems Technologies, Inc. | Systems and methods for gating medical procedures |
US20050113672A1 (en) * | 2003-11-26 | 2005-05-26 | Salla Prathyusha K. | Method and system for composite gating using multiple inputs |
US20060241443A1 (en) * | 2004-11-22 | 2006-10-26 | Whitmore Willet F Iii | Real time ultrasound monitoring of the motion of internal structures during respiration for control of therapy delivery |
US7349522B2 (en) * | 2005-06-22 | 2008-03-25 | Board Of Trustees Of The University Of Arkansas | Dynamic radiation therapy simulation system |
US20070053491A1 (en) * | 2005-09-07 | 2007-03-08 | Eastman Kodak Company | Adaptive radiation therapy method with target detection |
US20070189455A1 (en) * | 2006-02-14 | 2007-08-16 | Accuray Incorporated | Adaptive x-ray control |
US20080144772A1 (en) * | 2006-12-14 | 2008-06-19 | Byong Yong Yi | Treatment-Speed Regulated Tumor-Tracking |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100231709A1 (en) * | 2009-03-16 | 2010-09-16 | Fuji Xerox Co., Ltd. | Position measurement system, position measurement method and computer-readable medium |
US20100278414A1 (en) * | 2009-04-29 | 2010-11-04 | Kajetan Berlinger | Method and device for determining preferred alignments of a treatment beam generator |
US8693763B2 (en) * | 2009-04-29 | 2014-04-08 | Brainlab Ag | Method and device for determining preferred alignments of a treatment beam generator |
US20150327828A1 (en) * | 2013-03-06 | 2015-11-19 | Fujifilm Corporation | Body motion display device and body motion display method |
US10194880B2 (en) * | 2013-03-06 | 2019-02-05 | Fujifilm Corporation | Body motion display device and body motion display method |
Also Published As
Publication number | Publication date |
---|---|
JP2007503937A (en) | 2007-03-01 |
US20070053494A1 (en) | 2007-03-08 |
EP1661440A4 (en) | 2010-07-14 |
EP1661440A1 (en) | 2006-05-31 |
US7620146B2 (en) | 2009-11-17 |
US7158610B2 (en) | 2007-01-02 |
WO2005025279A1 (en) | 2005-03-17 |
US20050053196A1 (en) | 2005-03-10 |
JP4842820B2 (en) | 2011-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7158610B2 (en) | 2007-01-02 | Systems and methods for processing x-ray images |
US8571639B2 (en) | | Systems and methods for gating medical procedures |
US20050053267A1 (en) | 2005-03-10 | Systems and methods for tracking moving targets and monitoring object positions |
US8396248B2 (en) | | Sequential stereo imaging for estimating trajectory and monitoring target position |
US9232928B2 (en) | | Method and system for predictive physiological gating |
US11295449B2 (en) | | Three-dimensional tracking of a target in a body |
EP3565634B1 (en) | | System for patient-specific motion management for treatment |
US20130121551A1 (en) | | Systems and methods for segmentation of radiopaque structures in images |
US10631778B2 (en) | | Patient setup using respiratory gated and time resolved image data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: VARIAN MEDICAL SYSTEMS, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MOSTAFAVI, HASSAN;REEL/FRAME:021318/0591. Effective date: 20030903 |
| AS | Assignment | Owner name: VARIAN MEDICAL SYSTEMS, INC., CALIFORNIA. Free format text: MERGER;ASSIGNOR:VARIAN MEDICAL SYSTEMS TECHNOLOGIES, INC.;REEL/FRAME:021728/0352. Effective date: 20080926 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |