US20050031176A1 - Method and apparatus of multi-modality image fusion - Google Patents

Method and apparatus of multi-modality image fusion

Info

Publication number
US20050031176A1
US20050031176A1 (Application No. US10/604,673)
Authority
US
United States
Prior art keywords
anatomical
image
data
functional
fiducial markers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/604,673
Inventor
Sarah Hertel
Gopal Avinash
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
General Electric Co
GE Medical Systems Global Technology Co LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US10/604,673
Assigned to GE MEDICAL SYSTEMS GLOBAL TECHNOLOGY CO. LLC. Assignors: AVINASH, GOPAL B.; HERTEL, SARAH R. (assignment of assignors' interest; see document for details)
Assigned to GENERAL ELECTRIC COMPANY. Assignor: GE MEDICAL SYSTEMS GLOBAL TECHNOLOGY COMPANY, LLC (assignment of assignors' interest; see document for details)
Publication of US20050031176A1
Priority to US11/931,078 (US7848553B2)
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G06T 7/38: Registration of image sequences


Abstract

The present invention is directed to a method and apparatus for fusing or combining functional image data and anatomical image data. The invention, which may be carried out through user interaction or automatically, enables composite and clinically valuable images to be generated that display functional and anatomical data acquired with different imaging systems. By identifying fiducial markers on a functional data image and correlating the fiducial markers with anatomical markers or indicia on the anatomical data image, the respective images may be aligned with one another before a composite image is generated.

Description

    BACKGROUND OF INVENTION
  • The present invention relates generally to diagnostic imaging and, more particularly, to a method and apparatus of combining or fusing functional diagnostic data and anatomical diagnostic data acquired of a subject with imaging systems of different modalities to generate a composite image for clinical inspection.
  • The fusion of functional image data and anatomical image data is a widely practiced technique to provide composite images for improved pathology identification and clinical diagnosis. Typically, the functional and anatomical image data are acquired using nuclear-medicine-based systems such as single-photon emission computed tomography (SPECT) and positron emission tomography (PET), or radiology-based imaging systems such as computed tomography (CT), magnetic resonance (MR), ultrasound, and x-ray. Generally, it is desirable to “fuse” an image from SPECT or PET with an image from CT or MR. In this regard, it is typically desired for the functional image from SPECT or PET to be superimposed on the anatomical image acquired using CT or MR.
  • Fusion of functional and anatomical data that has been acquired separately, with imaging systems predicated on different imaging technologies, can be problematic. That is, the functional data may be acquired at a different time than the anatomical data, so patient positioning typically varies between the separate acquisitions. Acquisitions of different sizes, with different slice thicknesses, pixel sizes, and central points, are also not uncommon. For a clinically valuable composite image to be produced, these differences, as well as others typically encountered, must be resolved.
  • One solution has been the development of a hybrid scanner capable of acquiring PET and CT images during a single scan study in such a manner as to avoid many of the drawbacks enumerated above. A combined PET/CT scanner, however, may not be feasible in all circumstances. For instance, it may not be practical for a diagnostic imaging center, hospital, or the like to replace existing PET and CT systems with a combined imager. Moreover, a combined PET/CT scanner, by definition, may generate a composite image of functional and anatomical data acquired using PET and CT, respectively. However, the scanner cannot provide a composite image of PET and MR data, SPECT and MR data, or SPECT and CT data. As such, a hybrid system may not address the myriad of diagnostic needs required of a radiologist or other health care provider in rendering a diagnosis to a patient.
  • Another solution, consistent with conventional fusion techniques, fails to adequately address the drawbacks associated with overlaying collocated functional and anatomical data that are not registered. That is, present fusion protocols combine data having a common coordinate alignment, but fail to register the functional and anatomical images. Registering is commonly defined as the process of aligning medical image data. This approach is based on the premise that the functional and anatomical data sets were acquired under identical physiological states and therefore can be fused without taking additional measures into account. In this regard, conventional fusion techniques orient the functional and anatomical data but do not take measures to sufficiently align them. Furthermore, the image resolution of PET and SPECT is physically limited by the emission characteristics of the radioactive isotopes, and the resolution of functional images is notably inferior to that of anatomical images. Another consideration that specifically affects cardiac imaging is the considerable amount of motion that can add additional blurring to any image set. The goal of anatomical imaging of the heart is to observe the heart without motion. Functional imaging of the heart can compensate for motion by dividing the acquisition into gated bins, but each bin then contains only a fraction of the total dataset, roughly the total counts divided by the number of bins (for example, an eight-bin gated study places about one eighth of the counts in each image). Because the number of coincidence events is limited by the number of radioactive decay events, observing as much of the data as possible is desirable for a successful diagnosis. As a result, the radiologist or other health care provider must decipher a single composite image in which the functional and anatomical information are misaligned with respect to one another. Additional post-fusion processing steps may be taken to correct the misalignment of the respective images.
  • A conventional fusion of CT and PET image data is illustrative of the above drawbacks. During a PET/CT cardiac acquisition, the CT study is performed with ECG gating and the PET study may or may not be performed with ECG gating. The anatomical position of the heart typically changes over the ECG cycle. During image processing, the CT image is reconstructed from a portion of the data centered on a selected phase of the cardiac cycle in order to provide an image with the least amount of motion-blurring artifact. The coronary arteries are then tracked and segmented out of the CT image. The segmented images retain the coordinate system of the original data, frozen at one particular phase of the cardiac cycle. A static or dynamic PET image may then be reconstructed from the entire set of PET data averaged over many ECG cycles. Alternatively, a gated PET image set is reconstructed for each bin in the gated study, and one of these bins may correspond to the selected phase for which the CT data set was reconstructed; under such conditions the alignment may further improve. These PET images are then processed such that the left ventricle is segmented based on the long axis of the heart. Using this information, a PET 3D model can be displayed in “model” space that approximates the anatomical shape of a left ventricle. The CT image is then fused with the PET image along the model coordinates to form a composite image. However, the respective images from which the composite image is formed are not registered because the coordinate systems are not common to both image sets. Depending on the amount of image blurring due to radioactive tracer energy, the degree of cardiac motion, and the modeling techniques, different amounts of misalignment may be introduced. As such, the composite image typically must undergo additional and time-consuming processing to effectively align the functional data with the anatomical data in the clinical area of interest to provide optimal images for diagnosis.
  • Another classic multi-modality paradigm aligns internal or external fiducial markers from a functional image with corresponding anatomical points on an anatomical image. This conventional fiducial-marker-based approach implements a manual method of fusion that does not take local variations in the datasets into account. The conventional automated rigid or non-rigid body registration process uses mutual information as the cost function for highlighting differences between the functional and anatomical images; the cost function therefore defines or guides the registration of the functional data to the anatomical data. There are also methods that use fiducial markers and rigid and non-rigid affine transformations to register images. However, these automated methods do not use any localized anatomical constraints to guide them. As a result, these conventional approaches may only perform data-to-data fusion and, as such, are inapplicable when fusion between data and modeled data, or between modeled data and modeled data, is desired.
  • Therefore, it would be desirable to design an apparatus and method of fusing multi-modality images in which alignment is resolved prior to fusion of the separate images, so that post-fusion processing is reduced and fusion of modeled functional and/or anatomical data is supported.
  • BRIEF DESCRIPTION OF INVENTION
  • The present invention is directed to a method and apparatus for fusing or combining functional image data and anatomical image data that overcome the aforementioned drawbacks. The invention, which may be carried out through user interaction or automatically, enables composite and clinically valuable images to be generated that display functional and anatomical data acquired with different imaging systems. By identifying fiducial markers on a functional data image and correlating the fiducial markers with anatomical markers or indicia on the anatomical data image, the respective images may be aligned with one another before a composite image is generated. Warping is carried out that takes into consideration anatomical constraints while maintaining alignment of the fiducial and anatomical markers.
  • Therefore, in accordance with one aspect of the invention, a method of medical image overlap comprises the steps of determining at least two anatomical fiducial markers on a functional image and determining corresponding points to the at least two anatomical fiducial markers on an anatomical image. The method also includes the step of aligning the at least two anatomical fiducial markers with the corresponding points on the anatomical image and the step of warping the functional image to fit constraints of the anatomical image while maintaining alignment of the at least two anatomical fiducial markers and the corresponding points on the anatomical image.
  • According to another aspect of the invention, a diagnostic image generation system includes at least one database containing functional and anatomical image data and a computer programmed to determine at least a pair of fiducial markers on a functional image. The computer is also programmed to locate corresponding anatomical indicia on an anatomical image and generate a composite image of the functional image and the anatomical image such that the fiducial markers and the anatomical indicia are aligned and anatomical constraints are observed.
  • In accordance with yet another aspect of the present invention, a computer readable storage medium has a computer program stored thereon. The computer program represents a set of instructions that when executed by a computer cause the computer to access functional image data of a medical patient as well as anatomical image data of the medical patient. The computer is then programmed to identify more than one fiducial marker in the functional image data and identify anatomical locations in the anatomical image data that correspond to the more than one fiducial marker. The set of instructions further cause the computer to generate an image with the functional image data superimposed on the anatomical image data that considers anatomical constraints.
  • Various other features, objects and advantages of the present invention will be made apparent from the following detailed description and the drawings.
  • BRIEF DESCRIPTION OF DRAWINGS
  • The drawings illustrate one preferred embodiment presently contemplated for carrying out the invention.
  • In the drawings:
  • FIG. 1 is a schematic representation of a multi-node network of medical imaging systems applicable with the present invention.
  • FIG. 2 is a flow chart setting forth the steps of a functional image data and anatomical image data fusion technique in accordance with the present invention.
  • DETAILED DESCRIPTION
  • The present invention will be described with respect to a process, which may be carried out through interaction with a user or automatically, to generate a composite diagnostic image of functional and anatomical data acquired separately with a PET imaging system and a CT imaging system. One skilled in the art will appreciate, however, that imaging systems of other modalities such as MR, SPECT, ultrasound, x-ray, and the like may be used to acquire the functional and anatomical data to be combined into a composite image. Further, the present invention will be described with respect to the acquisition and imaging of data from a cardiac region of a patient. However, one skilled in the art will appreciate that the present invention is equivalently applicable with data acquisition and imaging of other anatomical regions of a patient.
  • Referring now to FIG. 1, an overview block diagram of a medical diagnostic and service networked system 10 is shown which includes a plurality of remote treatment stations, such as Station A, referenced with numeral 12, and Station B, referenced with numeral 14, each of which may include a medical treatment facility, hospital, clinic, or mobile imaging facility. It is understood that the number of treatment stations can be limitless, but two specific embodiments are shown with Station A and Station B, which will be further explained hereinafter. The treatment stations 12, 14 are connected to a centralized facility 16 through a communications link 18, such as a network of interconnected server nodes. This network of interconnected nodes may be a secure internal intranet or a public communications network, such as the internet. Although a single centralized facility is shown and described, it is understood that the present invention contemplates the use of multiple centralized facilities, each capable of communication with each treatment station. Each treatment station has operational software associated therewith which can be serviced by the centralized facility 16.
  • The various systems disclosed are configured to be selectively linked to the centralized facility 16 by a workstation, which in the example of treatment station 12, includes a laptop computer 20 or permanent workstation 26 connected to an internal network 22. Such selective linking is desirable for accessing data from the systems and transmitting data to the systems.
  • In general, a treatment site may have a number of devices, such as a variety of medical diagnostic systems of various modalities. For example, in the present embodiment, the devices may include a number of networked medical image scanners 24 connected to the internal network 22. Alternately, a treatment station or treatment site 14 can include a number of non-networked medical image scanners 28, 30, and 32, each having a computer or workstation associated therewith and having an internal modem or network connection device 34, 36, and 38 to connect the remote treatment station to a communications link 18, such as the internet, to communicate with the centralized facility 16.
  • It is understood that each of the networked scanners 24 has its own workstation for individual operation and is linked to the others by the internal network 22. Additionally, each of the networked scanners may be linked to a local database 40 configured to store data associated with imaging scan sessions, as will be discussed shortly. Further, such a system is provided with communications components allowing it to send and receive data over a communications link 18. Similarly, for the non-networked medical image scanners at remote treatment station 14, each of the scanners 28, 30, and 32 is connected to communications link 18, through which they can communicate with the centralized facility 16. Furthermore, each scanner 28, 30, 32 may include a database 42, 44, 46, respectively, for storing scanning data. Scanning data may be transferred to a centralized database 48 through communications link 18 and router 50. The centralized database 48 is included in a remote file server 52, where workstations and scanners external to the local intranet containing the centralized database 48 can access the database as though it were located locally on the intranet 54. More specifically, as will be described, workstations 20, 26 can access the data stored in the centralized database 48, or in another remote database such as database 40, as though the data were stored in a database within the specific workstation requesting the data.
  • The embodiment shown in FIG. 1 contemplates a medical facility having such systems as MRI systems, ultrasound systems, x-ray systems, CT systems, as well as PET systems, nuclear imaging systems, or any other type of medical imaging system; however, the present invention is not so limited. Such facilities may also provide services to centralized medical diagnostic management systems, picture archiving and communications systems (PACS), teleradiology systems, etc. Such systems can be either stationary, located in a fixed place and available at a known network address, or mobile, having various network addresses. In the embodiment shown in FIG. 1, each treatment station 12, 14 can include any combination of the aforementioned systems, or a treatment station may have all of a single type of system. Each system is connectable and can transmit data through a network and/or with at least one database 40, 48. However, it is understood that the single representation of the centralized database 48 is for demonstrative purposes only, and it is assumed that there is a need for multiple databases in such a system.
  • As previously discussed, each of the systems and substations described herein and referenced in FIG. 1 may be linked selectively to the centralized facility 16 via a network 18. According to the present invention, any acceptable network may be employed, whether public, open, dedicated, private, or so forth. The communications links to the network may be of any acceptable type, including conventional telephone lines, fiber optics, cable modem links, digital subscriber lines, wireless data transfer systems, or the like. Each of the systems is provided with communications interface hardware and software of generally known design, permitting them to establish network links and exchange data with the centralized facility 16. However, the systems, and particularly workstations 20, 26, are provided with specialized software so as to communicate with the centralized facility 16, and particularly with the remote database 48, as though the data stored in the remote database were located locally on workstation 20. In some cases, during periods when no data is exchanged between the customer stations and the centralized facility, the network connection can be terminated. In other cases, the network connection is maintained continuously.
  • In one embodiment, the scanning data from an imaging session, for example on scanner 24, is automatically transmitted from the scanner to the database 48. That is, database 48 is automatically updated after each imaging scan is executed. Records must be maintained as to the dosage used and catalogued according to the particular diagnostic procedure as well as the individual patient. From these records, the treatment facilities or institutions may ensure conformity with dosage guidelines and regulations. Further, as a result of maintaining an active database storing the scan parameter values of executed imaging sessions, a user or prescriber of an imminent imaging session may query the database to later retrieve scanning data for review from any workstation 20, 26 that is permitted to access the remote database 48.
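
The patent does not specify a storage schema or query interface for the centralized database 48. As a rough illustration only, the sketch below assumes a simple relational table of scan-session records keyed by patient and procedure, using Python's built-in sqlite3 module; every table name, column name, and value is hypothetical.

```python
import sqlite3

# Hypothetical schema for the kind of scan-session records described above.
conn = sqlite3.connect(":memory:")  # a file path would be used for a persistent database
conn.execute("""
    CREATE TABLE scan_sessions (
        scanner_id   TEXT,
        patient_id   TEXT,
        procedure    TEXT,      -- e.g. 'cardiac PET', 'gated cardiac CT'
        dose         REAL,      -- recorded dose, kept for conformity checks
        scan_params  TEXT,      -- serialized scan parameter values
        acquired_at  TEXT       -- ISO-8601 timestamp
    )""")

# A scanner pushing a record after an imaging session completes.
conn.execute(
    "INSERT INTO scan_sessions VALUES (?, ?, ?, ?, ?, ?)",
    ("scanner24", "patient-001", "cardiac PET", 8.2,
     '{"slice_thickness_mm": 4.0, "bins": 8}', "2003-08-08T10:15:00"))
conn.commit()

# A workstation later querying the records for a given patient and procedure.
rows = conn.execute(
    "SELECT scanner_id, dose, scan_params FROM scan_sessions "
    "WHERE patient_id = ? AND procedure = ?",
    ("patient-001", "cardiac PET")).fetchall()
print(rows)
```
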
  • As described above, the database having the scan parameter values stored thereon may be accessed from a number of scanners that are remotely located from the database. Furthermore, there is no requirement that each scanner be physically located in the same treatment station or facility. That is, a scanner located in station 12 may electronically transmit and receive data from the remote database 48 while, simultaneously, any scanner 28, 30, 32 in station 14 may likewise transmit and receive data to and from database 48. Later, a workstation 20, 26 at any locality, including one remote from both the scanner 24 and the centralized facility 16, can access the data from any scanner 24, 28, 30, 32 by accessing the centralized facility 16. Furthermore, database 48 need not be located in a separate centralized facility 16. That is, database 48 may be located in either of stations 12, 14, and may even be located remotely, within that station or treatment facility, from the workstation 20, 26 requiring access to the scanning data.
  • Referring now to FIG. 2, the steps of a processing technique or method for aligning and registering functional and anatomical data acquired from separate imaging systems built on separate imaging technologies are set forth. The process may be automated or guided through user interactions and commands.
  • Process 56 begins with the accessing of anatomical image data 58 and functional image data, or a model of functional image data, 60. A model of functional image data may be defined as segmented image data having arbitrary intensities or intensities similar to those of the original functional image data from which the model was generated. Arrow 62 indicates that the anatomical image data and the functional image data are geometrically collocated. That is, the anatomical and functional data are geometrically oriented about a common coordinate system; however, the data are not registered.
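
For clarity, geometric collocation can be thought of as both datasets sharing a patient (world) coordinate frame even though their sampling grids differ. The short sketch below illustrates this with hypothetical voxel-to-world affine matrices; it is not part of the patent disclosure, and the matrix values are illustrative only.

```python
import numpy as np

# Hypothetical 4x4 voxel-to-world affines (mm) for the anatomical (CT) and
# functional (PET) volumes; real values would come from the scan geometry.
anat_affine = np.array([[0.7, 0.0, 0.0, -180.0],
                        [0.0, 0.7, 0.0, -180.0],
                        [0.0, 0.0, 2.5,  -90.0],
                        [0.0, 0.0, 0.0,    1.0]])
func_affine = np.array([[4.0, 0.0, 0.0, -256.0],
                        [0.0, 4.0, 0.0, -256.0],
                        [0.0, 0.0, 4.0, -128.0],
                        [0.0, 0.0, 0.0,    1.0]])

def voxel_to_world(affine, ijk):
    """Map a voxel index (i, j, k) to shared patient coordinates (x, y, z) in mm."""
    return (affine @ np.append(np.asarray(ijk, float), 1.0))[:3]

def world_to_voxel(affine, xyz):
    """Map patient coordinates back to (fractional) voxel indices."""
    return (np.linalg.inv(affine) @ np.append(np.asarray(xyz, float), 1.0))[:3]

# A functional voxel and the anatomical voxel it overlies in the common frame.
xyz = voxel_to_world(func_affine, (64, 64, 32))
print(xyz, world_to_voxel(anat_affine, xyz))  # collocated, but still unregistered
```
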
  • Process 56 continues with the identification of anatomical fiducial markers on the functional image 64. Preferably, at least two reference markers are identified. The reference markers, as will be described below, are used to identify corresponding anatomical locations on the anatomical image. Additionally, it is preferred that the fiducial markers be internal anatomical features. External surface markers may be used, but they must be present during the acquisition of the functional data as well as the anatomical data. This may be problematic given that the functional data may be acquired at a different time and location than the anatomical data. For example, in a cardiac study, the reference markers may include the ventricular grooves between the respective ventricles of a patient's heart.
  • Following determination and identification of fiducial markers on the functional image, corresponding anatomical indicia or points are determined 66 on the anatomical image. In the cardiac study example given above, the ventricular grooves would be identified on the anatomical image. Once the corresponding anatomical indicia are determined and identified, the functional image is overlaid 68 on the anatomical image such that anatomical indicia and the fiducial markers are cooperatively aligned. In this regard, in a preferred embodiment, the anatomical image remains fixed and the functional image is superimposed thereon.
  • The aligning of the fiducial markers and the corresponding anatomical indicia may be carried out automatically by a computer programmed to do so, or may be done through user interaction with a graphical user interface (GUI) displaying each of the images. In this regard, the user, such as a radiologist, technician, or other health care professional, may electronically “grab” the functional image, “drag” the image over the anatomical image such that the fiducial markers and anatomical indicia are aligned, and “drop” the functional image on the anatomical image. In another embodiment, the user may identify or “highlight” the respective fiducial markers and anatomical indicia, and then instruct the computer to overlay or superimpose the functional image on the anatomical image. Additionally, to sufficiently align the fiducial markers and the corresponding anatomical indicia, it may be necessary to carry out various translation, scaling, and rotation processes.
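
The patent leaves the translation, scaling, and rotation step open. One common way to realize it, sketched below as an illustration rather than as the patented method, is a least-squares similarity transform computed from the matched marker coordinates; the function name and the marker coordinates are hypothetical.

```python
import numpy as np

def fiducial_alignment(func_pts, anat_pts, allow_scaling=True):
    """Least-squares similarity transform (rotation R, scale s, translation t)
    mapping functional fiducial points onto their anatomical counterparts.
    func_pts, anat_pts: (N, 3) arrays of matched markers; N >= 3 gives a unique
    3D rotation (the patent itself only requires at least two markers)."""
    func_pts = np.asarray(func_pts, float)
    anat_pts = np.asarray(anat_pts, float)
    mu_f, mu_a = func_pts.mean(axis=0), anat_pts.mean(axis=0)
    fc, ac = func_pts - mu_f, anat_pts - mu_a
    U, S, Vt = np.linalg.svd(ac.T @ fc)              # cross-covariance decomposition
    D = np.eye(3)
    D[2, 2] = np.sign(np.linalg.det(U @ Vt))         # guard against a reflection
    R = U @ D @ Vt
    s = (S * np.diag(D)).sum() / (fc ** 2).sum() if allow_scaling else 1.0
    t = mu_a - s * R @ mu_f
    return R, s, t

# Hypothetical matched marker coordinates in mm (e.g. points on the ventricular grooves).
func_markers = np.array([[10.0, 42.0, 15.0], [35.0, 40.0, 18.0], [22.0, 60.0, 30.0]])
anat_markers = np.array([[12.5, 40.5, 14.0], [37.0, 39.0, 17.5], [24.0, 58.5, 29.0]])
R, s, t = fiducial_alignment(func_markers, anat_markers)
aligned_markers = (s * (R @ func_markers.T)).T + t   # markers after translation/scaling/rotation
```
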
  • Process 56 continues at step 70 with the warping of the functional data to the anatomical data such that anatomical constraints are met while maintaining alignment of the fiducial markers and the corresponding anatomical indicia. In this regard, the process tailors the warping to anatomical constraints of the anatomical data rather than directly warping the functional data onto the anatomical data. For instance, in the cardiac example above, the health care provider will recognize that the functional data corresponds to ventricular anatomy and the anatomical data corresponds to the coronary arteries. As it is common for the coronary arteries to be located on the outer surfaces of the ventricles, warping would be applied locally such that the coronary arteries of the anatomical image lie on the outer surface of the ventricular anatomy of the functional image. In this case, enforcing the anatomical constraint requires that the nearest point on the ventricular surface project onto the location of the coronary artery while maintaining a smooth surface. As a result, the functional and anatomical data are more precisely aligned, and the composite image generated at step 72 is clinically valuable. As noted above, anatomical constraints are application and modality dependent and are useful for creating clinically meaningful results. In this invention, the anatomic constraints are used to define physical relationships between aspects covered by the functional and anatomic data, and to enforce known relationships between them.
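
The patent does not give an algorithm for this projection. The sketch below is one plausible, simplified way to apply such a local constraint: it pulls the nearest ventricular surface point onto each coronary artery point and spreads the displacement with a smooth Gaussian falloff so distant points, including the aligned fiducials, barely move. The point arrays, function name, and sigma value are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def constrain_surface_to_artery(surface_pts, artery_pts, sigma=8.0):
    """Locally warp functional surface points (M, 3) so the nearest surface point
    lands on each anatomical artery point (K, 3), with a smooth Gaussian falloff."""
    surface_pts = np.asarray(surface_pts, float)
    tree = cKDTree(surface_pts)
    displacement = np.zeros_like(surface_pts)
    weight_sum = np.zeros(len(surface_pts))
    for artery in np.asarray(artery_pts, float):
        _, idx = tree.query(artery)                   # nearest ventricular surface vertex
        shift = artery - surface_pts[idx]             # shift projecting it onto the artery
        d = np.linalg.norm(surface_pts - surface_pts[idx], axis=1)
        w = np.exp(-0.5 * (d / sigma) ** 2)           # smooth, local influence
        displacement += w[:, None] * shift
        weight_sum += w
    norm = np.maximum(weight_sum, 1.0)[:, None]       # average overlapping constraints; the
                                                      # floor keeps distant points nearly fixed
    return surface_pts + displacement / norm
```
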
  • Warping is an elastic registration process that may be used to fuse or combine images acquired from scanners of separate modalities. With such elastic transformation techniques, multi-scale, multi-region, pyramidal approaches are implemented. A cost function is utilized to highlight differences between the images on a scale-by-scale basis such that the differences are optimized at every scale. That is, an image is sampled at a given scale and then segmented or divided into multiple regions. Separate shift vectors are then determined or calculated for the different regions. The vectors are interpolated to generate a smooth shift transformation, which is applied to warp the image. The image is then re-sampled and the registration process is repeated at successive scales until a pre-determined final scale is reached.
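
As a rough, non-authoritative sketch of the multi-scale scheme just described, the code below resamples a 2D image pair at a series of scales, estimates a shift vector per block by minimizing a simple sum-of-squared-differences cost (the patent does not specify which cost function is used), interpolates the block shifts into a smooth field, and warps the moving image before proceeding to the next scale. Function names, block sizes, and scales are hypothetical.

```python
import numpy as np
from scipy.ndimage import zoom, gaussian_filter, map_coordinates

def block_shifts(moving, fixed, block=8, search=3):
    """Integer shift per block, chosen to minimize a sum-of-squared-differences cost."""
    h, w = fixed.shape
    field = np.zeros((2, h, w))
    for y0 in range(0, h - block + 1, block):
        for x0 in range(0, w - block + 1, block):
            ref = fixed[y0:y0 + block, x0:x0 + block]
            best = (np.inf, 0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    ys, xs = y0 + dy, x0 + dx
                    if ys < 0 or xs < 0 or ys + block > h or xs + block > w:
                        continue
                    cost = np.sum((moving[ys:ys + block, xs:xs + block] - ref) ** 2)
                    best = min(best, (cost, dy, dx))
            field[0, y0:y0 + block, x0:x0 + block] = best[1]
            field[1, y0:y0 + block, x0:x0 + block] = best[2]
    return field

def pyramidal_elastic_register(moving, fixed, scales=(4, 2, 1), block=8, search=3):
    """Coarse-to-fine elastic warp of `moving` toward `fixed` (2D arrays, same shape)."""
    warped = np.asarray(moving, float)
    fixed = np.asarray(fixed, float)
    yy, xx = np.mgrid[0:fixed.shape[0], 0:fixed.shape[1]].astype(float)
    for s in scales:
        f = zoom(fixed, 1.0 / s, order=1)             # sample both images at this scale
        m = zoom(warped, 1.0 / s, order=1)
        field = block_shifts(m, f, block=block, search=search)
        # interpolate the piecewise block shifts into a smooth, full-resolution field
        field = np.stack([
            zoom(gaussian_filter(c, sigma=block),
                 np.array(fixed.shape, float) / np.array(c.shape, float), order=1)
            for c in field]) * s
        warped = map_coordinates(warped, [yy + field[0], xx + field[1]],
                                 order=1, mode='nearest')
    return warped
```
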
  • The above process has been described with respect to the fusion of data between anatomical image data and either functional image data or modeled functional image data. The process, however, may be equivalently carried out to fuse modeled anatomical image data and either functional image data or modeled functional image data.
  • Therefore, in accordance with one aspect of the invention, a method of medical image overlap comprises the steps of determining at least two anatomical fiducial markers on a functional image and determining corresponding points to the at least two anatomical fiducial markers on an anatomical image. The method also includes the step of aligning the at least two anatomical fiducial markers with the corresponding points on the anatomical image and the step of warping the functional image to fit constraints of the anatomical image while maintaining alignment of the at least two anatomical fiducial markers and the corresponding points on the anatomical image.
  • According to another aspect of the invention, a diagnostic image generation system includes at least one database containing functional and anatomical image data and a computer programmed to determine at least a pair of fiducial markers on a functional image. The computer is also programmed to locate corresponding anatomical indicia on an anatomical image and generate a composite image of the functional image and the anatomical image such that the fiducial markers and the anatomical indicia are aligned and anatomical constraints are observed.
  • In accordance with yet another aspect of the present invention, a computer readable storage medium has a computer program stored thereon. The computer program represents a set of instructions that when executed by a computer cause the computer to access functional image data of a medical patient as well as anatomical image data of the medical patient. The computer is then programmed to identify more than one fiducial marker in the functional image data and identify anatomical locations in the anatomical image data that correspond to the more than one fiducial marker. The set of instructions further cause the computer to generate an image with the functional image data superimposed on the anatomical image data that considers anatomical constraints.
  • The present invention has been described in terms of the preferred embodiment, and it is recognized that equivalents, alternatives, and modifications, aside from those expressly stated, are possible and within the scope of the appended claims.

Claims (22)

1. A method of medical image overlap comprising the steps of:
determining at least two anatomical fiducial markers on a functional image;
determining corresponding points to the at least two anatomical fiducial markers on an anatomical image;
aligning the at least two anatomical fiducial markers with the corresponding points on the anatomical image; and
warping the functional image to fit constraints of the anatomical image while maintaining alignment of the at least two anatomical fiducial markers and the corresponding points on the anatomical image.
2. The method of claim 1 further comprising the step of accessing a model of functional data prior to determining the at least two anatomical fiducial markers.
3. The method of claim 1 wherein the functional image includes perfusion data and the anatomical image includes anatomical data of a coronary artery.
4. The method of claim 3 wherein the at least two anatomical fiducial markers and the corresponding points on the anatomical image correspond to ventricle grooves between ventricles of a medical patient.
5. The method of claim 4 wherein the data acquired with PET and the data acquired with CT include gated images.
6. The method of claim 3 wherein the perfusion data includes data acquired with positron emission tomography (PET) and the anatomical data includes data acquired with computed tomography (CT).
7. The method of claim 3 wherein anatomical constraints of the functional image take into account cardiac motion.
8. The method of claim 1 wherein the step of determining the at least two anatomical fiducial markers includes the step of locating the at least two anatomical fiducial markers in a three-dimensional image.
9. The method of claim 1 wherein the step of aligning includes registering the functional image and the anatomical image by at least one of translating, scaling, and rotating the functional image and the anatomical image with respect to one another.
10. The method of claim 1 further comprising the step of enforcing anatomical constraints during the step of warping by projecting a nearest point on the functional image onto the anatomical image while maintaining surface smoothness.
11. A diagnostic image generation system comprising:
at least one database containing functional and anatomical image data; and
a computer programmed to:
determine at least a pair of fiducial markers on a functional image;
locate corresponding anatomical indicia on an anatomical image; and
generate a composite image of the functional image and the anatomical image such that the fiducial markers and the anatomical indicia are aligned and anatomical constraints are considered.
12. The system of claim 11 wherein the computer is further programmed to at least one of translate, scale, and rotate the functional image and the anatomical image with respect to one another such that the at least the pair of fiducial markers and the anatomical indicia are cooperatively aligned.
13. The system of claim 11 wherein the functional image corresponds to perfusion data acquired of a patient using PET and the anatomical image corresponds to coronary artery data of the patient acquired using CT.
14. The system of claim 13 wherein the functional image data and the anatomical image data include gated data.
15. The system of claim 11 wherein the functional image is a 3D approximate model of a patient anatomy.
16. The system of claim 11 wherein the computer is further programmed to warp the functional image such that functional image data is fit to anatomical constraints of the anatomical image.
17. The system of claim 11 wherein the computer is further programmed to isolate ventricular grooves when determining the at least a pair of fiducial markers.
18. A computer readable storage medium having a computer program stored thereon, the computer program representing a set of instructions that when executed by a computer cause the computer to:
access functional image data of a medical patient;
access anatomical image data of the medical patient;
identify more than one fiducial marker in the functional image data;
identify anatomical locations in the anatomical image data that correspond to the more than one fiducial marker; and
generate an image with the functional image data superimposed on the anatomical image data that considers anatomical constraints.
19. The computer readable storage medium of claim 18 wherein the set of instructions further causes the computer to align the more than one fiducial marker with the anatomical locations.
20. The computer readable storage medium of claim 19 wherein the set of instructions further causes the computer to warp the functional image data to fit constraints of the anatomical image data.
21. The computer readable storage medium of claim 18 wherein the functional data includes positron emission tomographic perfusion data of a coronary region of a medical patient and the anatomical image data includes computed tomographic coronary artery data of the medical patient.
22. The computer readable storage medium of claim 18 wherein the functional image data and the anatomical image data are geometrically collocated.
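Claims 9 and 10 above spell out the two computational steps in slightly more detail: registration by translating, scaling, and rotating the images with respect to one another, and enforcement of anatomical constraints during warping by projecting the nearest point on the functional image onto the anatomical image while maintaining surface smoothness. The sketch below is one hedged reading of that constraint-enforcement step, not the patent's procedure: both images are treated as point-sampled surfaces, a k-d tree supplies the nearest anatomical point for every functional vertex, and the displacement field is smoothed over each vertex's nearest neighbours before it is applied. The function name constrain_to_anatomy, the neighbourhood size, the blending weights, and the assumption that each surface has at least six vertices are all choices made for this illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def constrain_to_anatomy(functional_surface, anatomical_surface,
                         iterations=10, step=0.5, smoothing=0.5, k=6):
    """Iteratively pull functional-surface vertices toward their nearest
    points on the anatomical surface while smoothing the displacement
    field so the warped surface stays locally smooth."""
    warped = np.asarray(functional_surface, dtype=float).copy()
    anatomy = cKDTree(np.asarray(anatomical_surface, dtype=float))

    for _ in range(iterations):
        # Nearest anatomical point for every functional vertex.
        _, nearest = anatomy.query(warped)
        displacement = anatomy.data[nearest] - warped

        # Smooth the displacement field over each vertex's k nearest
        # neighbours on the current functional surface.
        neighbours = cKDTree(warped).query(warped, k=k)[1]
        smoothed = displacement[neighbours].mean(axis=1)

        warped += step * (smoothing * smoothed + (1.0 - smoothing) * displacement)

    return warped

# Illustrative use on two small synthetic point clouds (arbitrary units).
rng = np.random.default_rng(0)
functional = rng.normal(size=(200, 3))              # stand-in for a PET perfusion surface
anatomical = rng.normal(size=(400, 3)) * 1.1 + 0.2  # stand-in for a CT epicardial surface
fitted = constrain_to_anatomy(functional, anatomical)
```

In a gated cardiac study, as contemplated in claims 6 and 14, such a constraint step might be applied per gate so that the warped perfusion data tracks the anatomy through the cardiac cycle.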
US10/604,673 2003-08-08 2003-08-08 Method and apparatus of multi-modality image fusion Abandoned US20050031176A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/604,673 US20050031176A1 (en) 2003-08-08 2003-08-08 Method and apparatus of multi-modality image fusion
US11/931,078 US7848553B2 (en) 2003-08-08 2007-10-31 Method and apparatus of multi-modality image fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/604,673 US20050031176A1 (en) 2003-08-08 2003-08-08 Method and apparatus of multi-modality image fusion

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/931,078 Continuation US7848553B2 (en) 2003-08-08 2007-10-31 Method and apparatus of multi-modality image fusion

Publications (1)

Publication Number Publication Date
US20050031176A1 true US20050031176A1 (en) 2005-02-10

Family

ID=34115672

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/604,673 Abandoned US20050031176A1 (en) 2003-08-08 2003-08-08 Method and apparatus of multi-modality image fusion
US11/931,078 Active 2025-03-06 US7848553B2 (en) 2003-08-08 2007-10-31 Method and apparatus of multi-modality image fusion

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/931,078 Active 2025-03-06 US7848553B2 (en) 2003-08-08 2007-10-31 Method and apparatus of multi-modality image fusion

Country Status (1)

Country Link
US (2) US20050031176A1 (en)

Cited By (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050206967A1 (en) * 2004-03-19 2005-09-22 General Electric Company Method and system for managing modality worklists in hybrid scanners
US20050249434A1 (en) * 2004-04-12 2005-11-10 Chenyang Xu Fast parametric non-rigid image registration based on feature correspondences
US20060004275A1 (en) * 2004-06-30 2006-01-05 Vija A H Systems and methods for localized image registration and fusion
US20080013810A1 (en) * 2006-07-12 2008-01-17 Ziosoft, Inc. Image processing method, computer readable medium therefor, and image processing system
WO2008083874A2 (en) * 2007-01-11 2008-07-17 Sicat Gmbh & Co. Kg Image registration
US20080221440A1 (en) * 2007-03-08 2008-09-11 Sync-Rx, Ltd. Imaging and tools for use with moving organs
US20080298664A1 (en) * 2007-05-22 2008-12-04 Diana Martin Method for data evaluation
US20090010540A1 (en) * 2007-07-03 2009-01-08 General Electric Company Method and system for performing image registration
US20090136099A1 (en) * 2007-11-26 2009-05-28 Boyden Edward S Image guided surgery with dynamic image reconstruction
US20090147024A1 (en) * 2007-12-11 2009-06-11 The Boeing Company Graphical display system and method
WO2009077971A1 (en) * 2007-12-18 2009-06-25 Koninklijke Philips Electronics, N.V. Fusion of cardiac 3d ultrasound and x-ray information by means of epicardial surfaces and landmarks
US20090248447A1 (en) * 2008-03-25 2009-10-01 Kabushiki Kaisha Toshiba Report generation support system
US20090306547A1 (en) * 2007-03-08 2009-12-10 Sync-Rx, Ltd. Stepwise advancement of a medical tool
US20100067755A1 (en) * 2006-08-08 2010-03-18 Koninklijke Philips Electronics N.V. Registration of electroanatomical mapping points to corresponding image data
US20100128928A1 (en) * 2008-11-27 2010-05-27 Sony Corporation Image processing apparatus, image processing method, and program
US20100157041A1 (en) * 2007-03-08 2010-06-24 Sync-Rx, Ltd. Automatic stabilization of an image stream of a moving organ
US20100198063A1 (en) * 2007-05-19 2010-08-05 The Regents Of The University Of California Multi-Modality Phantoms and Methods for Co-registration of Dual PET-Transrectal Ultrasound Prostate Imaging
US20110013220A1 (en) * 2009-07-20 2011-01-20 General Electric Company Application server for use with a modular imaging system
US20110034801A1 (en) * 2009-08-06 2011-02-10 Siemens Medical Solutions Usa, Inc. System for Processing Angiography and Ultrasound Image Data
WO2011112559A3 (en) * 2010-03-08 2012-01-12 Bruce Adams System, method and article for normalization and enhancement of tissue images
US20120033865A1 (en) * 2009-04-15 2012-02-09 Koninklijke Philips Electronics N.V. Quantification of medical image data
US8243882B2 (en) 2010-05-07 2012-08-14 General Electric Company System and method for indicating association between autonomous detector and imaging subsystem
US20120230568A1 (en) * 2011-03-09 2012-09-13 Siemens Aktiengesellschaft Method and System for Model-Based Fusion of Multi-Modal Volumetric Images
WO2013132402A3 (en) * 2012-03-08 2014-02-27 Koninklijke Philips N.V. Intelligent landmark selection to improve registration accuracy in multimodal image fusion
US8855744B2 (en) 2008-11-18 2014-10-07 Sync-Rx, Ltd. Displaying a device within an endoluminal image stack
US9095313B2 (en) 2008-11-18 2015-08-04 Sync-Rx, Ltd. Accounting for non-uniform longitudinal motion during movement of an endoluminal imaging probe
US9101286B2 (en) 2008-11-18 2015-08-11 Sync-Rx, Ltd. Apparatus and methods for determining a dimension of a portion of a stack of endoluminal data points
US9144394B2 (en) 2008-11-18 2015-09-29 Sync-Rx, Ltd. Apparatus and methods for determining a plurality of local calibration factors for an image
US9226683B2 (en) 2012-04-16 2016-01-05 Siemens Medical Solutions Usa, Inc. System scan timing by ultrasound contrast agent study
US9286673B2 (en) 2012-10-05 2016-03-15 Volcano Corporation Systems for correcting distortions in a medical image and methods of use thereof
US9292918B2 (en) 2012-10-05 2016-03-22 Volcano Corporation Methods and systems for transforming luminal images
US9301687B2 (en) 2013-03-13 2016-04-05 Volcano Corporation System and method for OCT depth calibration
US9305334B2 (en) 2007-03-08 2016-04-05 Sync-Rx, Ltd. Luminal background cleaning
US9307926B2 (en) 2012-10-05 2016-04-12 Volcano Corporation Automatic stent detection
US9324141B2 (en) 2012-10-05 2016-04-26 Volcano Corporation Removal of A-scan streaking artifact
US9360630B2 (en) 2011-08-31 2016-06-07 Volcano Corporation Optical-electrical rotary joint and methods of use
US9367965B2 (en) 2012-10-05 2016-06-14 Volcano Corporation Systems and methods for generating images of tissue
US9375164B2 (en) 2007-03-08 2016-06-28 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US9383263B2 (en) 2012-12-21 2016-07-05 Volcano Corporation Systems and methods for narrowing a wavelength emission of light
US9478940B2 (en) 2012-10-05 2016-10-25 Volcano Corporation Systems and methods for amplifying light
US9486143B2 (en) 2012-12-21 2016-11-08 Volcano Corporation Intravascular forward imaging device
US20170014648A1 (en) * 2014-03-03 2017-01-19 Varian Medical Systems, Inc. Systems and methods for patient position monitoring
US9596993B2 (en) 2007-07-12 2017-03-21 Volcano Corporation Automatic calibration systems and methods of use
US9612105B2 (en) 2012-12-21 2017-04-04 Volcano Corporation Polarization sensitive optical coherence tomography system
US9622706B2 (en) 2007-07-12 2017-04-18 Volcano Corporation Catheter for in vivo imaging
US9629571B2 (en) 2007-03-08 2017-04-25 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US9709379B2 (en) 2012-12-20 2017-07-18 Volcano Corporation Optical coherence tomography system that is reconfigurable between different imaging modes
US9730613B2 (en) 2012-12-20 2017-08-15 Volcano Corporation Locating intravascular images
US9770172B2 (en) 2013-03-07 2017-09-26 Volcano Corporation Multimodal segmentation in intravascular images
US9858668B2 (en) 2012-10-05 2018-01-02 Volcano Corporation Guidewire artifact removal in images
US9867530B2 (en) 2006-08-14 2018-01-16 Volcano Corporation Telescopic side port catheter device with imaging system and method for accessing side branch occlusions
US9888969B2 (en) 2007-03-08 2018-02-13 Sync-Rx Ltd. Automatic quantitative vessel analysis
US9974509B2 (en) 2008-11-18 2018-05-22 Sync-Rx Ltd. Image super enhancement
US20180232925A1 (en) * 2017-02-10 2018-08-16 Arizona Board Of Regents On Behalf Of Arizona State University Real-time medical image visualization systems and related methods
US10058284B2 (en) 2012-12-21 2018-08-28 Volcano Corporation Simultaneous imaging, monitoring, and therapy
US10070827B2 (en) 2012-10-05 2018-09-11 Volcano Corporation Automatic image playback
US10166003B2 (en) 2012-12-21 2019-01-01 Volcano Corporation Ultrasound imaging with variable line density
US10191220B2 (en) 2012-12-21 2019-01-29 Volcano Corporation Power-efficient optical circuit
US10219887B2 (en) 2013-03-14 2019-03-05 Volcano Corporation Filters with echogenic characteristics
US10219780B2 (en) 2007-07-12 2019-03-05 Volcano Corporation OCT-IVUS catheter for concurrent luminal imaging
US10226597B2 (en) 2013-03-07 2019-03-12 Volcano Corporation Guidewire with centering mechanism
US10238367B2 (en) 2012-12-13 2019-03-26 Volcano Corporation Devices, systems, and methods for targeted cannulation
US10292677B2 (en) 2013-03-14 2019-05-21 Volcano Corporation Endoluminal filter having enhanced echogenic properties
US10332228B2 (en) 2012-12-21 2019-06-25 Volcano Corporation System and method for graphical processing of medical data
US10362962B2 (en) 2008-11-18 2019-07-30 Synx-Rx, Ltd. Accounting for skipped imaging locations during movement of an endoluminal imaging probe
CN110192894A (en) * 2018-02-27 2019-09-03 Leica Instruments (Singapore) Pte. Ltd. Combined ultrasonic and optical ultrasonic head
US10413317B2 (en) 2012-12-21 2019-09-17 Volcano Corporation System and method for catheter steering and operation
US10420530B2 (en) 2012-12-21 2019-09-24 Volcano Corporation System and method for multipath processing of image signals
US10426590B2 (en) 2013-03-14 2019-10-01 Volcano Corporation Filters with echogenic characteristics
US10568586B2 (en) 2012-10-05 2020-02-25 Volcano Corporation Systems for indicating parameters in an imaging data set and methods of use
US10595820B2 (en) 2012-12-20 2020-03-24 Philips Image Guided Therapy Corporation Smooth transition catheters
US10638939B2 (en) 2013-03-12 2020-05-05 Philips Image Guided Therapy Corporation Systems and methods for diagnosing coronary microvascular disease
US10716528B2 (en) 2007-03-08 2020-07-21 Sync-Rx, Ltd. Automatic display of previously-acquired endoluminal images
US10724082B2 (en) 2012-10-22 2020-07-28 Bio-Rad Laboratories, Inc. Methods for analyzing DNA
US10748289B2 (en) 2012-06-26 2020-08-18 Sync-Rx, Ltd Coregistration of endoluminal data points with values of a luminal-flow-related index
US10758207B2 (en) 2013-03-13 2020-09-01 Philips Image Guided Therapy Corporation Systems and methods for producing an image from a rotational intravascular ultrasound device
US10939826B2 (en) 2012-12-20 2021-03-09 Philips Image Guided Therapy Corporation Aspirating and removing biological material
US10942022B2 (en) 2012-12-20 2021-03-09 Philips Image Guided Therapy Corporation Manual calibration of imaging system
US10993694B2 (en) 2012-12-21 2021-05-04 Philips Image Guided Therapy Corporation Rotational ultrasound imaging catheter with extended catheter body telescope
US11026591B2 (en) 2013-03-13 2021-06-08 Philips Image Guided Therapy Corporation Intravascular pressure sensor calibration
US11040140B2 (en) 2010-12-31 2021-06-22 Philips Image Guided Therapy Corporation Deep vein thrombosis therapeutic methods
US11064964B2 (en) 2007-03-08 2021-07-20 Sync-Rx, Ltd Determining a characteristic of a lumen by measuring velocity of a contrast agent
US11064903B2 (en) 2008-11-18 2021-07-20 Sync-Rx, Ltd Apparatus and methods for mapping a sequence of images to a roadmap image
US11141063B2 (en) 2010-12-23 2021-10-12 Philips Image Guided Therapy Corporation Integrated system architectures and methods of use
US11154313B2 (en) 2013-03-12 2021-10-26 The Volcano Corporation Vibrating guidewire torquer and methods of use
US11197651B2 (en) 2007-03-08 2021-12-14 Sync-Rx, Ltd. Identification and presentation of device-to-vessel relative motion
US11272845B2 (en) 2012-10-05 2022-03-15 Philips Image Guided Therapy Corporation System and method for instant and automatic border detection
US11406498B2 (en) 2012-12-20 2022-08-09 Philips Image Guided Therapy Corporation Implant delivery system and implants

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070167806A1 (en) * 2005-11-28 2007-07-19 Koninklijke Philips Electronics N.V. Multi-modality imaging and treatment
US8280483B2 (en) 2006-06-14 2012-10-02 Koninklijke Philips Electronics N.V. Multi-modality medical image viewing
JP5562740B2 (en) * 2009-07-15 2014-07-30 株式会社東芝 Medical image display system and medical image communication method
EP2491531B1 (en) * 2009-10-22 2015-03-04 Koninklijke Philips N.V. Alignment of an ordered stack of images from a specimen.
CN102804228B (en) 2010-03-18 2015-08-19 皇家飞利浦电子股份有限公司 Functional image data strengthens and/or booster
JP5893607B2 (en) 2010-04-05 2016-03-23 プログノシス バイオサイエンシズ インコーポレイテッドPrognosys Biosciences,Inc. Spatial-encoded biological assay
US10787701B2 (en) 2010-04-05 2020-09-29 Prognosys Biosciences, Inc. Spatially encoded biological assays
US20190300945A1 (en) 2010-04-05 2019-10-03 Prognosys Biosciences, Inc. Spatially Encoded Biological Assays
BR112013001487B1 (en) * 2010-07-21 2022-05-03 Armin E. Moehrle Image report creation method and apparatus
EP2651302A2 (en) * 2010-12-16 2013-10-23 Koninklijke Philips N.V. Apparatus for ct-mri and nuclear hybrid imaging, cross calibration, and performance assessment
GB201106254D0 (en) 2011-04-13 2011-05-25 Frisen Jonas Method and product
EP2645330B1 (en) * 2012-03-29 2017-11-29 Siemens Healthcare GmbH Method and system for associating at least two different medical findings with each other
US9269140B2 (en) 2012-12-10 2016-02-23 The Cleveland Clinic Foundation Image fusion with automated compensation for brain deformation
WO2014110169A1 (en) 2013-01-08 2014-07-17 Biocardia, Inc. Target site selection, entry and update with automatic remote image annotation
WO2014210225A1 (en) 2013-06-25 2014-12-31 Prognosys Biosciences, Inc. Methods and systems for determining spatial patterns of biological targets in a sample
CN105874507B (en) 2013-11-27 2019-09-27 模拟技术公司 More image mode navigation system
US11344382B2 (en) 2014-01-24 2022-05-31 Elucent Medical, Inc. Systems and methods comprising localization agents
CN104021536B (en) * 2014-06-16 2017-01-04 西北工业大学 A kind of adaptive SAR image and Multispectral Image Fusion Methods
US10774374B2 (en) 2015-04-10 2020-09-15 Spatial Transcriptomics AB and Illumina, Inc. Spatially distinguished, multiplex nucleic acid analysis of biological specimens
WO2017059228A1 (en) 2015-10-02 2017-04-06 Elucent Medical, Inc. Signal tag detection components, devices, and systems
US9730764B2 (en) 2015-10-02 2017-08-15 Elucent Medical, Inc. Signal tag detection components, devices, and systems
JP6974853B2 (en) 2015-10-02 2021-12-01 エルセント メディカル,インコーポレイテッド Signal tag detection elements, devices and systems
WO2018031826A1 (en) 2016-08-12 2018-02-15 Elucent Medical, Inc. Surgical device guidance and monitoring devices, systems, and methods
US9905044B1 (en) 2016-08-25 2018-02-27 General Electric Company Systems and methods for functional imaging
US10803633B2 (en) 2018-02-06 2020-10-13 General Electric Company Systems and methods for follow-up functional imaging
US10278779B1 (en) 2018-06-05 2019-05-07 Elucent Medical, Inc. Exciter assemblies
US20230323447A1 (en) 2018-08-28 2023-10-12 10X Genomics, Inc. Method for transposase-mediated spatial tagging and analyzing genomic dna in a biological sample
WO2020123316A2 (en) 2018-12-10 2020-06-18 10X Genomics, Inc. Methods for determining a location of a biological analyte in a biological sample
US11309072B2 (en) 2020-04-21 2022-04-19 GE Precision Healthcare LLC Systems and methods for functional imaging

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5384674A (en) * 1991-02-08 1995-01-24 Sharp Kabushiki Kaisha Image recording/reproducing apparatus which displays images to be searched
US5555098A (en) * 1991-12-05 1996-09-10 Eastman Kodak Company Method and apparatus for providing multiple programmed audio/still image presentations from a digital disc image player
US5687160A (en) * 1993-12-10 1997-11-11 Sony Corporation Optical recording medium with lists having playback control information
US5712947A (en) * 1993-08-14 1998-01-27 Sony Corporation Method of recording ID signals for retrieving images, method of retrieving images, and apparatus for reproducing recorded images
US5731852A (en) * 1995-01-16 1998-03-24 Samsung Electronics Co., Ltd. Image/audio information recording and reproducing apparatus using a semiconductor memory
US5745643A (en) * 1995-04-06 1998-04-28 Kabushiki Kaisha Toshiba System for and method of reproducing playback data appropriately by the use of attribute information on the playback data
US5926568A (en) * 1997-06-30 1999-07-20 The University Of North Carolina At Chapel Hill Image object matching using core analysis and deformable shape loci
US6067400A (en) * 1996-03-29 2000-05-23 Matsushita Electric Industrial Co., Ltd. Multimedia optical disc having improved interactive reproduction procedure, a reproduction apparatus and a method for such a disc
US6148138A (en) * 1996-03-15 2000-11-14 Pioneer Electronics Corporation Information record medium, apparatus for recording the same and apparatus for reproducing the same
US6185365B1 (en) * 1995-08-21 2001-02-06 Matshushita Electric Industrial Co., Ltd. Multimedia optical disk, reproduction apparatus and method for achieving variable scene development based on interactive control
US6205235B1 (en) * 1998-07-23 2001-03-20 David Roberts Method and apparatus for the non-invasive imaging of anatomic tissue structures
US6253026B1 (en) * 1997-09-17 2001-06-26 Matsushita Electric Industrial Co., Ltd. Optical disc, recording apparatus, and computer-readable recording medium
US20010036302A1 (en) * 1999-12-10 2001-11-01 Miller Michael I. Method and apparatus for cross modality image registration
US6353702B1 (en) * 1998-07-07 2002-03-05 Kabushiki Kaisha Toshiba Information storage system capable of recording and playing back a plurality of still pictures
US6385389B1 (en) * 1998-01-21 2002-05-07 Kabushiki Kaisha Toshiba Information recording medium, method for recording information, and method for reproduction information
US20020054049A1 (en) * 1996-11-12 2002-05-09 Kenji Toyoda Image playback apparatus, image recording apparatus, and methods thereof
US6389222B1 (en) * 1998-07-07 2002-05-14 Kabushiki Kaisha Toshiba Management system for protected and temporarily-erased still picture information
US6526226B2 (en) * 1995-09-29 2003-02-25 Matsushita Electric Industrial Co., Ltd. Method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween
US20030053668A1 (en) * 2001-08-22 2003-03-20 Hendrik Ditt Device for processing images, in particular medical images
US6594442B1 (en) * 1998-06-17 2003-07-15 Hitachi, Ltd. Optical disk recording still image data, a method and apparatus for recording and playing back still image data to and from the optical disk
US6721493B1 (en) * 1998-06-24 2004-04-13 Samsung Electronics Co., Ltd. Recording medium for storing information for still picture, recording and/or reproducing method and apparatus therefor

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5970499A (en) * 1997-04-11 1999-10-19 Smith; Kurt R. Method and apparatus for producing and accessing composite data
FI113615B (en) * 2002-10-17 2004-05-31 Nexstim Oy Three-dimensional modeling of skull shape and content

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5384674A (en) * 1991-02-08 1995-01-24 Sharp Kabushiki Kaisha Image recording/reproducing apparatus which displays images to be searched
US5555098A (en) * 1991-12-05 1996-09-10 Eastman Kodak Company Method and apparatus for providing multiple programmed audio/still image presentations from a digital disc image player
US5712947A (en) * 1993-08-14 1998-01-27 Sony Corporation Method of recording ID signals for retrieving images, method of retrieving images, and apparatus for reproducing recorded images
US5687160A (en) * 1993-12-10 1997-11-11 Sony Corporation Optical recording medium with lists having playback control information
US5731852A (en) * 1995-01-16 1998-03-24 Samsung Electronics Co., Ltd. Image/audio information recording and reproducing apparatus using a semiconductor memory
US5745643A (en) * 1995-04-06 1998-04-28 Kabushiki Kaisha Toshiba System for and method of reproducing playback data appropriately by the use of attribute information on the playback data
US6185365B1 (en) * 1995-08-21 2001-02-06 Matshushita Electric Industrial Co., Ltd. Multimedia optical disk, reproduction apparatus and method for achieving variable scene development based on interactive control
US6526226B2 (en) * 1995-09-29 2003-02-25 Matsushita Electric Industrial Co., Ltd. Method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween
US6148138A (en) * 1996-03-15 2000-11-14 Pioneer Electronics Corporation Information record medium, apparatus for recording the same and apparatus for reproducing the same
US6067400A (en) * 1996-03-29 2000-05-23 Matsushita Electric Industrial Co., Ltd. Multimedia optical disc having improved interactive reproduction procedure, a reproduction apparatus and a method for such a disc
US20020054049A1 (en) * 1996-11-12 2002-05-09 Kenji Toyoda Image playback apparatus, image recording apparatus, and methods thereof
US5926568A (en) * 1997-06-30 1999-07-20 The University Of North Carolina At Chapel Hill Image object matching using core analysis and deformable shape loci
US6253026B1 (en) * 1997-09-17 2001-06-26 Matsushita Electric Industrial Co., Ltd. Optical disc, recording apparatus, and computer-readable recording medium
US6385389B1 (en) * 1998-01-21 2002-05-07 Kabushiki Kaisha Toshiba Information recording medium, method for recording information, and method for reproduction information
US6594442B1 (en) * 1998-06-17 2003-07-15 Hitachi, Ltd. Optical disk recording still image data, a method and apparatus for recording and playing back still image data to and from the optical disk
US6721493B1 (en) * 1998-06-24 2004-04-13 Samsung Electronics Co., Ltd. Recording medium for storing information for still picture, recording and/or reproducing method and apparatus therefor
US6353702B1 (en) * 1998-07-07 2002-03-05 Kabushiki Kaisha Toshiba Information storage system capable of recording and playing back a plurality of still pictures
US6389222B1 (en) * 1998-07-07 2002-05-14 Kabushiki Kaisha Toshiba Management system for protected and temporarily-erased still picture information
US6560405B2 (en) * 1998-07-07 2003-05-06 Kabushiki Kaisha Toshiba Information storage system capable of recording and playing back a plurality of still pictures
US6205235B1 (en) * 1998-07-23 2001-03-20 David Roberts Method and apparatus for the non-invasive imaging of anatomic tissue structures
US20010036302A1 (en) * 1999-12-10 2001-11-01 Miller Michael I. Method and apparatus for cross modality image registration
US20030053668A1 (en) * 2001-08-22 2003-03-20 Hendrik Ditt Device for processing images, in particular medical images

Cited By (145)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050206967A1 (en) * 2004-03-19 2005-09-22 General Electric Company Method and system for managing modality worklists in hybrid scanners
US20050249434A1 (en) * 2004-04-12 2005-11-10 Chenyang Xu Fast parametric non-rigid image registration based on feature correspondences
US7596283B2 (en) * 2004-04-12 2009-09-29 Siemens Medical Solutions Usa, Inc. Fast parametric non-rigid image registration based on feature correspondences
US20060004275A1 (en) * 2004-06-30 2006-01-05 Vija A H Systems and methods for localized image registration and fusion
US8090429B2 (en) * 2004-06-30 2012-01-03 Siemens Medical Solutions Usa, Inc. Systems and methods for localized image registration and fusion
US20080013810A1 (en) * 2006-07-12 2008-01-17 Ziosoft, Inc. Image processing method, computer readable medium therefor, and image processing system
US8437518B2 (en) * 2006-08-08 2013-05-07 Koninklijke Philips Electronics N.V. Registration of electroanatomical mapping points to corresponding image data
US20100067755A1 (en) * 2006-08-08 2010-03-18 Koninklijke Philips Electronics N.V. Registration of electroanatomical mapping points to corresponding image data
US9867530B2 (en) 2006-08-14 2018-01-16 Volcano Corporation Telescopic side port catheter device with imaging system and method for accessing side branch occlusions
US8798346B2 (en) 2007-01-11 2014-08-05 Sicat Gmbh & Co. Kg Image registration
WO2008083874A3 (en) * 2007-01-11 2008-09-04 Sicat Gmbh & Co Kg Image registration
WO2008083874A2 (en) * 2007-01-11 2008-07-17 Sicat Gmbh & Co. Kg Image registration
US20100124367A1 (en) * 2007-01-11 2010-05-20 Sicat Gmbh & Co. Kg Image registration
US8781193B2 (en) 2007-03-08 2014-07-15 Sync-Rx, Ltd. Automatic quantitative vessel analysis
US11064964B2 (en) 2007-03-08 2021-07-20 Sync-Rx, Ltd Determining a characteristic of a lumen by measuring velocity of a contrast agent
US20090306547A1 (en) * 2007-03-08 2009-12-10 Sync-Rx, Ltd. Stepwise advancement of a medical tool
US11197651B2 (en) 2007-03-08 2021-12-14 Sync-Rx, Ltd. Identification and presentation of device-to-vessel relative motion
US9305334B2 (en) 2007-03-08 2016-04-05 Sync-Rx, Ltd. Luminal background cleaning
US9216065B2 (en) 2007-03-08 2015-12-22 Sync-Rx, Ltd. Forming and displaying a composite image
US20100157041A1 (en) * 2007-03-08 2010-06-24 Sync-Rx, Ltd. Automatic stabilization of an image stream of a moving organ
US20100161023A1 (en) * 2007-03-08 2010-06-24 Sync-Rx, Ltd. Automatic tracking of a tool upon a vascular roadmap
US20100160764A1 (en) * 2007-03-08 2010-06-24 Sync-Rx, Ltd. Automatic generation and utilization of a vascular roadmap
US20100161022A1 (en) * 2007-03-08 2010-06-24 Sync-Rx, Ltd. Pre-deployment positioning of an implantable device within a moving organ
US20100160773A1 (en) * 2007-03-08 2010-06-24 Sync-Rx, Ltd. Automatic quantitative vessel analysis at the location of an automatically-detected tool
US20100172556A1 (en) * 2007-03-08 2010-07-08 Sync-Rx, Ltd. Automatic enhancement of an image stream of a moving organ
US20100191102A1 (en) * 2007-03-08 2010-07-29 Sync-Rx, Ltd. Automatic correction and utilization of a vascular roadmap comprising a tool
US9308052B2 (en) 2007-03-08 2016-04-12 Sync-Rx, Ltd. Pre-deployment positioning of an implantable device within a moving organ
US20100222671A1 (en) * 2007-03-08 2010-09-02 Sync-Rx, Ltd. Identification and presentation of device-to-vessel relative motion
US20100228076A1 (en) * 2007-03-08 2010-09-09 Sync-Rx, Ltd Controlled actuation and deployment of a medical device
US10226178B2 (en) 2007-03-08 2019-03-12 Sync-Rx Ltd. Automatic reduction of visibility of portions of an image
US9968256B2 (en) 2007-03-08 2018-05-15 Sync-Rx Ltd. Automatic identification of a tool
US11179038B2 (en) 2007-03-08 2021-11-23 Sync-Rx, Ltd Automatic stabilization of a frames of image stream of a moving organ having intracardiac or intravascular tool in the organ that is displayed in movie format
US20080221439A1 (en) * 2007-03-08 2008-09-11 Sync-Rx, Ltd. Tools for use with moving organs
US9375164B2 (en) 2007-03-08 2016-06-28 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US9888969B2 (en) 2007-03-08 2018-02-13 Sync-Rx Ltd. Automatic quantitative vessel analysis
US20080221440A1 (en) * 2007-03-08 2008-09-11 Sync-Rx, Ltd. Imaging and tools for use with moving organs
US9014453B2 (en) 2007-03-08 2015-04-21 Sync-Rx, Ltd. Automatic angiogram detection
US10307061B2 (en) 2007-03-08 2019-06-04 Sync-Rx, Ltd. Automatic tracking of a tool upon a vascular roadmap
US9855384B2 (en) 2007-03-08 2018-01-02 Sync-Rx, Ltd. Automatic enhancement of an image stream of a moving organ and displaying as a movie
US9008367B2 (en) 2007-03-08 2015-04-14 Sync-Rx, Ltd. Apparatus and methods for reducing visibility of a periphery of an image stream
US10499814B2 (en) 2007-03-08 2019-12-10 Sync-Rx, Ltd. Automatic generation and utilization of a vascular roadmap
US9717415B2 (en) 2007-03-08 2017-08-01 Sync-Rx, Ltd. Automatic quantitative vessel analysis at the location of an automatically-detected tool
US8670603B2 (en) 2007-03-08 2014-03-11 Sync-Rx, Ltd. Apparatus and methods for masking a portion of a moving image stream
US8693756B2 (en) 2007-03-08 2014-04-08 Sync-Rx, Ltd. Automatic reduction of interfering elements from an image stream of a moving organ
US8700130B2 (en) 2007-03-08 2014-04-15 Sync-Rx, Ltd. Stepwise advancement of a medical tool
US10716528B2 (en) 2007-03-08 2020-07-21 Sync-Rx, Ltd. Automatic display of previously-acquired endoluminal images
US9629571B2 (en) 2007-03-08 2017-04-25 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US9008754B2 (en) 2007-03-08 2015-04-14 Sync-Rx, Ltd. Automatic correction and utilization of a vascular roadmap comprising a tool
US20100198063A1 (en) * 2007-05-19 2010-08-05 The Regents Of The University Of California Multi-Modality Phantoms and Methods for Co-registration of Dual PET-Transrectal Ultrasound Prostate Imaging
US20080298664A1 (en) * 2007-05-22 2008-12-04 Diana Martin Method for data evaluation
US8144951B2 (en) * 2007-05-22 2012-03-27 Siemens Aktiengesellschaft Method for data evaluation
US20090010540A1 (en) * 2007-07-03 2009-01-08 General Electric Company Method and system for performing image registration
US7995864B2 (en) * 2007-07-03 2011-08-09 General Electric Company Method and system for performing image registration
US9596993B2 (en) 2007-07-12 2017-03-21 Volcano Corporation Automatic calibration systems and methods of use
US11350906B2 (en) 2007-07-12 2022-06-07 Philips Image Guided Therapy Corporation OCT-IVUS catheter for concurrent luminal imaging
US10219780B2 (en) 2007-07-12 2019-03-05 Volcano Corporation OCT-IVUS catheter for concurrent luminal imaging
US9622706B2 (en) 2007-07-12 2017-04-18 Volcano Corporation Catheter for in vivo imaging
US20090136099A1 (en) * 2007-11-26 2009-05-28 Boyden Edward S Image guided surgery with dynamic image reconstruction
US9076203B2 (en) * 2007-11-26 2015-07-07 The Invention Science Fund I, Llc Image guided surgery with dynamic image reconstruction
US8031209B2 (en) * 2007-12-11 2011-10-04 The Boeing Company Graphical display system and method
US20090147024A1 (en) * 2007-12-11 2009-06-11 The Boeing Company Graphical display system and method
WO2009077971A1 (en) * 2007-12-18 2009-06-25 Koninklijke Philips Electronics, N.V. Fusion of cardiac 3d ultrasound and x-ray information by means of epicardial surfaces and landmarks
US20090248447A1 (en) * 2008-03-25 2009-10-01 Kabushiki Kaisha Toshiba Report generation support system
US11064903B2 (en) 2008-11-18 2021-07-20 Sync-Rx, Ltd Apparatus and methods for mapping a sequence of images to a roadmap image
US8855744B2 (en) 2008-11-18 2014-10-07 Sync-Rx, Ltd. Displaying a device within an endoluminal image stack
US9974509B2 (en) 2008-11-18 2018-05-22 Sync-Rx Ltd. Image super enhancement
US9144394B2 (en) 2008-11-18 2015-09-29 Sync-Rx, Ltd. Apparatus and methods for determining a plurality of local calibration factors for an image
US9101286B2 (en) 2008-11-18 2015-08-11 Sync-Rx, Ltd. Apparatus and methods for determining a dimension of a portion of a stack of endoluminal data points
US10362962B2 (en) 2008-11-18 2019-07-30 Synx-Rx, Ltd. Accounting for skipped imaging locations during movement of an endoluminal imaging probe
US11883149B2 (en) 2008-11-18 2024-01-30 Sync-Rx Ltd. Apparatus and methods for mapping a sequence of images to a roadmap image
US9095313B2 (en) 2008-11-18 2015-08-04 Sync-Rx, Ltd. Accounting for non-uniform longitudinal motion during movement of an endoluminal imaging probe
US20100128928A1 (en) * 2008-11-27 2010-05-27 Sony Corporation Image processing apparatus, image processing method, and program
US8494217B2 (en) * 2008-11-27 2013-07-23 Sony Corporation Image processing apparatus, image processing method, and program
US20120033865A1 (en) * 2009-04-15 2012-02-09 Koninklijke Philips Electronics N.V. Quantification of medical image data
US8811708B2 (en) * 2009-04-15 2014-08-19 Koninklijke Philips N.V. Quantification of medical image data
US20110013220A1 (en) * 2009-07-20 2011-01-20 General Electric Company Application server for use with a modular imaging system
US8786873B2 (en) 2009-07-20 2014-07-22 General Electric Company Application server for use with a modular imaging system
US20110034801A1 (en) * 2009-08-06 2011-02-10 Siemens Medical Solutions Usa, Inc. System for Processing Angiography and Ultrasound Image Data
US8909323B2 (en) * 2009-08-06 2014-12-09 Siemens Medical Solutions Usa, Inc. System for processing angiography and ultrasound image data
US9339194B2 (en) 2010-03-08 2016-05-17 Cernoval, Inc. System, method and article for normalization and enhancement of tissue images
WO2011112559A3 (en) * 2010-03-08 2012-01-12 Bruce Adams System, method and article for normalization and enhancement of tissue images
US10201281B2 (en) 2010-03-08 2019-02-12 Cernoval, Inc. System, method and article for normalization and enhancement of tissue images
US8243882B2 (en) 2010-05-07 2012-08-14 General Electric Company System and method for indicating association between autonomous detector and imaging subsystem
US11141063B2 (en) 2010-12-23 2021-10-12 Philips Image Guided Therapy Corporation Integrated system architectures and methods of use
US11040140B2 (en) 2010-12-31 2021-06-22 Philips Image Guided Therapy Corporation Deep vein thrombosis therapeutic methods
US9824302B2 (en) * 2011-03-09 2017-11-21 Siemens Healthcare Gmbh Method and system for model-based fusion of multi-modal volumetric images
US20120230568A1 (en) * 2011-03-09 2012-09-13 Siemens Aktiengesellschaft Method and System for Model-Based Fusion of Multi-Modal Volumetric Images
US9360630B2 (en) 2011-08-31 2016-06-07 Volcano Corporation Optical-electrical rotary joint and methods of use
WO2013132402A3 (en) * 2012-03-08 2014-02-27 Koninklijke Philips N.V. Intelligent landmark selection to improve registration accuracy in multimodal image fusion
US9478028B2 (en) * 2012-03-08 2016-10-25 Koninklijke Philips N.V. Intelligent landmark selection to improve registration accuracy in multimodal image fusion
US20150016728A1 (en) * 2012-03-08 2015-01-15 Koninklijke Philips N.V. Intelligent landmark selection to improve registration accuracy in multimodal image fusion
JP2015514447A (en) * 2012-03-08 2015-05-21 コーニンクレッカ フィリップス エヌ ヴェ Intelligent landmark selection to improve registration accuracy in multimodal image integration
US9226683B2 (en) 2012-04-16 2016-01-05 Siemens Medical Solutions Usa, Inc. System scan timing by ultrasound contrast agent study
US10748289B2 (en) 2012-06-26 2020-08-18 Sync-Rx, Ltd Coregistration of endoluminal data points with values of a luminal-flow-related index
US10984531B2 (en) 2012-06-26 2021-04-20 Sync-Rx, Ltd. Determining a luminal-flow-related index using blood velocity determination
US9367965B2 (en) 2012-10-05 2016-06-14 Volcano Corporation Systems and methods for generating images of tissue
US11272845B2 (en) 2012-10-05 2022-03-15 Philips Image Guided Therapy Corporation System and method for instant and automatic border detection
US9324141B2 (en) 2012-10-05 2016-04-26 Volcano Corporation Removal of A-scan streaking artifact
US10070827B2 (en) 2012-10-05 2018-09-11 Volcano Corporation Automatic image playback
US11890117B2 (en) 2012-10-05 2024-02-06 Philips Image Guided Therapy Corporation Systems for indicating parameters in an imaging data set and methods of use
US9478940B2 (en) 2012-10-05 2016-10-25 Volcano Corporation Systems and methods for amplifying light
US9307926B2 (en) 2012-10-05 2016-04-12 Volcano Corporation Automatic stent detection
US10568586B2 (en) 2012-10-05 2020-02-25 Volcano Corporation Systems for indicating parameters in an imaging data set and methods of use
US11510632B2 (en) 2012-10-05 2022-11-29 Philips Image Guided Therapy Corporation Systems for indicating parameters in an imaging data set and methods of use
US9286673B2 (en) 2012-10-05 2016-03-15 Volcano Corporation Systems for correcting distortions in a medical image and methods of use thereof
US9858668B2 (en) 2012-10-05 2018-01-02 Volcano Corporation Guidewire artifact removal in images
US9292918B2 (en) 2012-10-05 2016-03-22 Volcano Corporation Methods and systems for transforming luminal images
US11864870B2 (en) 2012-10-05 2024-01-09 Philips Image Guided Therapy Corporation System and method for instant and automatic border detection
US10724082B2 (en) 2012-10-22 2020-07-28 Bio-Rad Laboratories, Inc. Methods for analyzing DNA
US10238367B2 (en) 2012-12-13 2019-03-26 Volcano Corporation Devices, systems, and methods for targeted cannulation
US10595820B2 (en) 2012-12-20 2020-03-24 Philips Image Guided Therapy Corporation Smooth transition catheters
US11892289B2 (en) 2012-12-20 2024-02-06 Philips Image Guided Therapy Corporation Manual calibration of imaging system
US9730613B2 (en) 2012-12-20 2017-08-15 Volcano Corporation Locating intravascular images
US11141131B2 (en) 2012-12-20 2021-10-12 Philips Image Guided Therapy Corporation Smooth transition catheters
US10942022B2 (en) 2012-12-20 2021-03-09 Philips Image Guided Therapy Corporation Manual calibration of imaging system
US10939826B2 (en) 2012-12-20 2021-03-09 Philips Image Guided Therapy Corporation Aspirating and removing biological material
US11406498B2 (en) 2012-12-20 2022-08-09 Philips Image Guided Therapy Corporation Implant delivery system and implants
US9709379B2 (en) 2012-12-20 2017-07-18 Volcano Corporation Optical coherence tomography system that is reconfigurable between different imaging modes
US11253225B2 (en) 2012-12-21 2022-02-22 Philips Image Guided Therapy Corporation System and method for multipath processing of image signals
US10420530B2 (en) 2012-12-21 2019-09-24 Volcano Corporation System and method for multipath processing of image signals
US9612105B2 (en) 2012-12-21 2017-04-04 Volcano Corporation Polarization sensitive optical coherence tomography system
US11786213B2 (en) 2012-12-21 2023-10-17 Philips Image Guided Therapy Corporation System and method for multipath processing of image signals
US10413317B2 (en) 2012-12-21 2019-09-17 Volcano Corporation System and method for catheter steering and operation
US10166003B2 (en) 2012-12-21 2019-01-01 Volcano Corporation Ultrasound imaging with variable line density
US10332228B2 (en) 2012-12-21 2019-06-25 Volcano Corporation System and method for graphical processing of medical data
US10993694B2 (en) 2012-12-21 2021-05-04 Philips Image Guided Therapy Corporation Rotational ultrasound imaging catheter with extended catheter body telescope
US10058284B2 (en) 2012-12-21 2018-08-28 Volcano Corporation Simultaneous imaging, monitoring, and therapy
US9486143B2 (en) 2012-12-21 2016-11-08 Volcano Corporation Intravascular forward imaging device
US10191220B2 (en) 2012-12-21 2019-01-29 Volcano Corporation Power-efficient optical circuit
US9383263B2 (en) 2012-12-21 2016-07-05 Volcano Corporation Systems and methods for narrowing a wavelength emission of light
US9770172B2 (en) 2013-03-07 2017-09-26 Volcano Corporation Multimodal segmentation in intravascular images
US10226597B2 (en) 2013-03-07 2019-03-12 Volcano Corporation Guidewire with centering mechanism
US11154313B2 (en) 2013-03-12 2021-10-26 The Volcano Corporation Vibrating guidewire torquer and methods of use
US10638939B2 (en) 2013-03-12 2020-05-05 Philips Image Guided Therapy Corporation Systems and methods for diagnosing coronary microvascular disease
US11026591B2 (en) 2013-03-13 2021-06-08 Philips Image Guided Therapy Corporation Intravascular pressure sensor calibration
US9301687B2 (en) 2013-03-13 2016-04-05 Volcano Corporation System and method for OCT depth calibration
US10758207B2 (en) 2013-03-13 2020-09-01 Philips Image Guided Therapy Corporation Systems and methods for producing an image from a rotational intravascular ultrasound device
US10426590B2 (en) 2013-03-14 2019-10-01 Volcano Corporation Filters with echogenic characteristics
US10292677B2 (en) 2013-03-14 2019-05-21 Volcano Corporation Endoluminal filter having enhanced echogenic properties
US10219887B2 (en) 2013-03-14 2019-03-05 Volcano Corporation Filters with echogenic characteristics
US20170014648A1 (en) * 2014-03-03 2017-01-19 Varian Medical Systems, Inc. Systems and methods for patient position monitoring
US10737118B2 (en) * 2014-03-03 2020-08-11 Varian Medical Systems, Inc. Systems and methods for patient position monitoring
US10643360B2 (en) * 2017-02-10 2020-05-05 Arizona Board Of Regents On Behalf Of Arizona State University Real-time medical image visualization systems and related methods
US20180232925A1 (en) * 2017-02-10 2018-08-16 Arizona Board Of Regents On Behalf Of Arizona State University Real-time medical image visualization systems and related methods
CN110192894A (en) * 2018-02-27 2019-09-03 Leica Instruments (Singapore) Pte. Ltd. Combined ultrasonic and optical ultrasonic head

Also Published As

Publication number Publication date
US7848553B2 (en) 2010-12-07
US20080064949A1 (en) 2008-03-13

Similar Documents

Publication Publication Date Title
US7848553B2 (en) Method and apparatus of multi-modality image fusion
CN109844865B (en) Network, decision support system and related Graphical User Interface (GUI) application for medical image analysis
Chandrashekara et al. Analysis of 3-D myocardial motion in tagged MR images using nonrigid image registration
US6563941B1 (en) Model-based registration of cardiac CTA and MR acquisitions
Bidaut et al. Automated registration of dynamic MR images for the quantification of myocardial perfusion
Maintz et al. An overview of medical image registration methods
JP5984251B2 (en) Medical image processing system, medical image processing apparatus, and medical image processing method
Akbarzadeh et al. Evaluation of whole‐body MR to CT deformable image registration
Banerjee et al. A completely automated pipeline for 3D reconstruction of human heart from 2D cine magnetic resonance slices
CN101019152A (en) System and method for loading timepoints for analysis of disease progression or response to therapy
Li et al. Multi-modality cardiac image computing: A survey
Aladl et al. Automated image registration of gated cardiac single‐photon emission computed tomography and magnetic resonance imaging
Hill et al. Medical image registration using knowledge of adjacency of anatomical structures
US11495346B2 (en) External device-enabled imaging support
CN103829962B (en) PET and CT scan interlock method and PET/CT scanning systems
Collignon et al. Surface-based registration of 3D medical images
EP2332124B1 (en) Patient specific anatomical sketches for medical reports
Lemke et al. Applications of picture processing, image analysis and computer graphics techniques to cranial CT scans
CN113689477A (en) Multi-modality medical image registration method, system, readable storage medium and device
Goble et al. Real-time system for 3D neurosurgical planning
Jiang et al. Using maximal cross-section detection for the registration of 3D image data of the head
Positano et al. Automatic characterization of myocardial perfusion in contrast enhanced MRI
Speight MRI to CT image registration
Jiao et al. Anatomy-aware self-supervised fetal MRI synthesis from unpaired ultrasound images
Collignon et al. New high-performance 3D registration algorithms for 3D medical images

Legal Events

Date Code Title Description
AS Assignment

Owner name: GE MEDICAL SYSTEMS GLOBAL TECHNOLOGY CO. LLC, WISC

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HERTEL, SARAH R.;AVINASH, GOPAL B.;REEL/FRAME:013860/0329

Effective date: 20030808

AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GE MEDICAL SYSTEMS GLOBAL TECHNOLOGY COMPANY, LLC;REEL/FRAME:016212/0534

Effective date: 20030331

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION