US20050096538A1 - Image plane stabilization for medical imaging - Google Patents
- Publication number
- US20050096538A1 (U.S. application Ser. No. 10/696,608)
- Authority
- US
- United States
- Prior art keywords
- dimensional
- region
- motion
- interest
- scan plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B 8/14 — Echo-tomography
- A61B 8/469 — Interfacing with the operator or patient: input means for selection of a region of interest
- A61B 8/483 — Diagnostic techniques involving the acquisition of a 3D volume of data
- A61B 8/5276 — Data or image processing for detection or reduction of artifacts due to motion
- A61B 8/5284 — Data or image processing involving retrospective matching to a physiological signal
- A61B 8/543 — Control of the diagnostic device involving acquisition triggered by a physiological signal
- G01S 15/8925 — Pulse-echo imaging using a two-dimensional transducer array (matrix or orthogonal linear arrays)
- G01S 15/8979 — Combined Doppler and pulse-echo imaging systems
- G01S 15/8993 — Three-dimensional imaging systems
- G01S 7/52077 — Short-range imaging with means for elimination of unwanted signals, e.g. noise or interference
- G01S 7/52085 — Details related to the ultrasound signal acquisition, e.g. scan sequences
- G01S 7/52088 — Signal acquisition using synchronization techniques involving retrospective scan line rearrangements
Definitions
- The present invention relates to image stabilization in medical imaging. An imaging position is stabilized with respect to a region of interest as images are acquired over time.
- In medical diagnostic ultrasound imaging, a transducer is positioned adjacent to a patient. The sonographer attempts to maintain the transducer in a given position relative to a region of interest within the patient. Temporal variations in the transducer position due to movements by the sonographer, movements by the patient, breathing motion, heart motion or other sources of motion cause the transducer to move relative to the patient.
- The scan plane is typically fixed, at least in the elevation dimension, with respect to the transducer. The undesired or unintended motion results in scanning different tissue within the patient.
- Images may be stabilized within the scan plane. Motion between subsequent images in a sequence is tracked, and the acquired image data is adjusted or shifted along the azimuth or range dimensions so that a region of interest is maintained at the same location on the display. Other processes, such as contrast agent quantification, use motion tracking to reduce motion artifacts: previously acquired data is processed or shifted as a function of the motion. However, some motion artifacts may remain despite the shifts, and the shifted data may not optimally represent the region of interest. To provide maximum versatility, a large amount of unused image information is acquired and stored to allow for the shifts, and acquiring this additional ultrasound information may reduce frame rates.
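- The in-plane shifting just described can be sketched as follows. This is a minimal illustration under assumptions, not the patent's implementation; the function name and the constant fill value are hypothetical. Each acquired frame is shifted by the tracked range/azimuth offsets so the region of interest stays at a fixed display location, with the uncovered margin left blank.

```python
import numpy as np

def stabilize_frame(frame, shift_rg, shift_az, fill=0.0):
    """Shift a 2-D frame by tracked range/azimuth offsets (in samples)
    so a region of interest stays at a fixed display location; areas
    uncovered by the shift are filled with a constant."""
    out = np.full_like(frame, fill)
    rows, cols = frame.shape
    # Source window that remains visible after the shift.
    r0, r1 = max(0, -shift_rg), min(rows, rows - shift_rg)
    c0, c1 = max(0, -shift_az), min(cols, cols - shift_az)
    if r1 <= r0 or c1 <= c0:          # shift larger than the frame
        return out
    out[r0 + shift_rg:r1 + shift_rg,
        c0 + shift_az:c1 + shift_az] = frame[r0:r1, c0:c1]
    return out
```

Note that the shift only hides the motion in the display; it cannot recover tissue that has moved out of the fixed acquisition plane, which is the limitation the invention addresses.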
- Motion tracking is also used in three-dimensional and extended field of view imaging.
- For extended field of view imaging, a plurality of two-dimensional scans are performed at different positions within a same plane. The motion between the various acquired images is determined for assembling the images together into an extended field of view.
- For three-dimensional imaging, a plurality of two-dimensional images are acquired for a plurality of scan planes within a three-dimensional volume.
- Motion tracking is performed using ultrasound data, motion sensors on the transducer or other techniques for determining the relative positions of the scan planes.
- An image representing three-dimensional space is then rendered from the acquired sets of ultrasound data. In either case, multiple images or sets of data are acquired to form the extended field of view or three-dimensional representation.
- Another motion adaptive process is disclosed in U.S. Pat. No. 5,873,830. An amount of motion between different images is detected. When little motion is detected, the beamformer is configured to increase spatial resolution, such as by increasing the line density or the number of transmit beams. When more motion is detected, the frame rate is increased by decreasing the line density or number of beams. However, changing the density or number of beams as a function of detected motion may still allow desired tissue to fade into or out of the image scan plane due to the motion.
- A medical imaging system automatically acquires two-dimensional images representing a user-defined region of interest despite motion. The plane of acquisition is updated or altered adaptively as a function of detected motion, so the user-designated region of interest is continually scanned.
- A multi-dimensional array is used to stabilize imaging of a region of interest in a three-dimensional volume. The user defines a region of interest for two-dimensional imaging. Motion is then detected for six or another number of degrees of freedom, such as translation along each of three dimensions and rotation about each of those three dimensions. The position of the scan plane used to generate a subsequent two-dimensional image is then oriented as a function of the detected motion within the three-dimensional volume, such that the region of interest designated by the user remains within the scan plane.
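- The geometry of this repositioning can be sketched as follows. This is an illustrative model only, with hypothetical names: the six-degree-of-freedom motion estimate is expressed as a 3×3 rotation matrix plus a translation vector, applied to a scan plane represented by a point on the plane and its unit normal so that previously imaged tissue stays in-plane.

```python
import numpy as np

def reposition_plane(origin, normal, rotation, translation):
    """Apply a detected rigid-body motion (3x3 rotation plus 3-vector
    translation, i.e. six degrees of freedom) to a scan plane, given
    as a point on the plane and its unit normal."""
    new_origin = rotation @ origin + translation
    new_normal = rotation @ normal    # a pure rotation keeps unit length
    return new_origin, new_normal
```

Representing the plane by a point and normal is one of several equivalent parameterizations; a beamformer controller would translate the result into steering delays rather than manipulate the plane directly.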
- A method for stabilizing an image plane in medical imaging is provided. Motion is tracked within a region, and an acquisition scan plane position is automatically altered relative to the transducer as a function of the motion.
- A method for stabilizing a scan plane within a volume in medical diagnostic ultrasound imaging is also provided. A region of interest is identified. Data representing at least portions of a three-dimensional volume positioned at least partially around the region of interest is acquired. Data representing sub-volumes of the three-dimensional volume is then acquired using fewer scan lines. The data representing the sub-volumes is compared with the data representing the portions of the three-dimensional volume, and motion is detected as a function of the comparison. A two-dimensional scan plane is positioned within the three-dimensional volume as a function of the region of interest and the detected motion, and a two-dimensional image is acquired using the positioned scan plane.
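- The flow of this method can be illustrated with a toy one-dimensional simulation (purely illustrative; in the actual method the motion estimate comes from comparing sparse tracking scans against reference data, not from direct knowledge of the tissue position). Each frame, the detected motion repositions the scan plane so the residual offset between tissue and plane stays at zero despite drift.

```python
def simulate_stabilization(tissue_positions):
    """Toy 1-D version of the acquisition loop: per frame, motion of the
    tissue relative to the current scan-plane position is 'detected' and
    the plane is repositioned by that amount before imaging."""
    plane = tissue_positions[0]
    residuals = []
    for tissue in tissue_positions:
        detected = tissue - plane     # stand-in for sub-volume comparison
        plane += detected             # reposition the scan plane
        residuals.append(tissue - plane)
    return residuals
```

With a drifting tissue path the residual offset remains zero each frame, which is the stabilization the method aims for; a real system would additionally be limited by the steering range of the array and the accuracy of the motion estimate.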
- A method for stabilizing imaging within a volume in medical diagnostic ultrasound imaging is also provided. A two-dimensional area is repetitively scanned with a multi-dimensional transducer array. Motion within a volume that includes the two-dimensional area is repetitively detected, and the two-dimensional area is adaptively repositioned within the volume as a function of the detected motion.
- A system for stabilizing a scan plane within a volume in medical imaging is also provided. A multi-dimensional transducer array connects with a beamformer. The beamformer is responsive to a beamformer controller and is operable to acquire data representing tissue within a data acquisition scan plane. The beamformer controller is operable to control a position of the data acquisition scan plane relative to the multi-dimensional transducer array. A processor is operable to detect motion within a volume, and the beamformer controller is operable to alter the position of the data acquisition scan plane in response to the detected motion.
- FIG. 1 is a block diagram of one embodiment of a system for stabilizing a scan plane in medical imaging.
- FIG. 2 is a flow chart of a method for stabilizing an image plane in medical imaging in one embodiment.
- FIGS. 3A-3C are graphical representations of one embodiment for implementing the method of FIG. 2 with two-dimensional motion tracking.
- FIGS. 4A-4C are graphical representations of another embodiment of the method of FIG. 2 with three-dimensional motion tracking.
- Image movement due to respiratory motion, patient motion, sonographer motion or other undesirable motion that would lead to a tissue of interest moving into and/or out of a sequence of images is avoided.
- By tracking the position of a tissue of interest, subsequent acquisitions are aligned to insonify the tissue, resulting in steadier maintenance of the image plane relative to the tissue of interest.
- Quantification is made more stable and consistent, and diagnosis may be improved since each image in the sequence is more likely to represent the tissue of interest.
- Moving tissues, such as those imaged in fetal or cardiac applications, may be more accurately monitored by maintaining the scan plane relative to the moving tissue despite its movement.
- Perfusion measurements, such as those associated with contrast agent enhancement applications, may be improved.
- FIG. 1 shows one embodiment of a medical imaging system 10 for stabilizing a scan plane within a region or volume. The system 10 includes a transducer 12, a beamformer 14, a beamformer controller 16, a processor 18, an image processor 20, a display 22 and a user interface 24. Additional, different or fewer components may be provided, such as omitting the user interface 24, image processor 20 or display 22.
- The system 10 is a medical diagnostic ultrasound system for acquiring image information using acoustic energy. Alternatively, other medical imaging systems may be used, such as computed tomography, magnetic resonance, X-ray or other now known or later developed imaging systems. In each case, the position of a subsequent scan is controlled to maintain a tissue of interest within the scan plane or acquisition region.
- The transducer 12 is a multi-dimensional array of elements. For example, a 1.5D, 1.75D or two-dimensional array of elements is provided. Annular, wobbler, or other mechanically or electrically steerable arrays may be used. "Two-dimensional array" is used broadly to include an array of N×M elements where N and M are equal or non-equal but both greater than 1. Arrays with non-square or non-rectangular element patterns may be provided in any multi-dimensional arrangement.
- The multi-dimensional transducer array 12 is steerable in two dimensions, such as along the elevation and azimuth dimensions. In alternative embodiments, the transducer 12 is a one-dimensional linear array for scanning a two-dimensional region.
- The beamformer 14 is an analog or digital ultrasound transmit and/or receive beamformer. In one embodiment, the beamformer 14 is the beamformer disclosed in U.S. Pat. Nos. 5,675,554 and 5,685,308, the disclosures of which are incorporated herein by reference. The beamformer 14 is shown generally, but in one embodiment includes separate transmit and receive beamformers.
- The transmit beamformer generates the acoustic energy along the acquisition scan plane, and the receive beamformer receives the responsive echo signals and provides them to the image processor 20 and the processor 18.
- Sufficient beamformer channels are provided on transmit and/or receive to beamform along both the azimuth and elevation dimensions. Sparse array techniques or plane wave imaging techniques may be used. The number of cables between the transducer 12 and the beamformer 14 may be reduced by time division multiplexing, allowing a greater number of channels while minimizing the size of the cable.
- Alternatively, sufficient channels are provided for beamforming along the azimuth dimension, and switchable connections between the channels and the elements of the array are used to position a linear array of elements at any of various azimuth and elevation positions on the face of the transducer 12. In this case, the acquisition scan plane is always normal to at least one dimension, but electronic steering is provided for scanning along angles in another dimension.
- The beamformer 14 includes a plurality of transmit channels connectable with one or more of the elements of the transducer 12. Each transmit channel includes a delay, an amplifier and a waveform generator; additional, different or fewer components may be provided.
- The transmit channels generate waveforms with different apodization and delay profiles relative to other waveforms for steering acoustic energy along one or more scan lines. By selecting which transmit channels connect to which elements of the transducer array 12, ultrasound scan lines are generated at any of various azimuth and elevation locations and angles.
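- A delay profile of the kind described can be sketched with a simple geometric model (an illustrative assumption, not the patent's beamformer design; the element layout, focal point and speed of sound are made up for the example). Each channel's delay compensates its element's path length to the focal point, with the farthest element firing first so all delays are non-negative.

```python
import numpy as np

SPEED_OF_SOUND = 1540.0  # m/s, nominal value for soft tissue (assumed)

def transmit_delays(element_positions, focus, c=SPEED_OF_SOUND):
    """Geometric transmit delays for elements of a 2-D array lying in
    the z = 0 plane, focusing toward a 3-D point.  element_positions is
    an (N, 2) array of x/y element coordinates in meters."""
    elems = np.column_stack([element_positions,
                             np.zeros(len(element_positions))])
    dist = np.linalg.norm(np.asarray(focus) - elems, axis=1)
    return (dist.max() - dist) / c   # farthest element gets zero delay
```

Choosing a focal point off the array's azimuth axis steers the beam in elevation as well, which is how a fully sampled two-dimensional array can place scan lines, and hence the scan plane, anywhere within its steering range.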
- The beamformer 14 is responsive to the beamformer controller 16 for positioning the acquisition scan plane. The acquisition scan plane is positioned within a two-dimensional or three-dimensional region, and acoustic energy is transmitted in any of various now known or later developed scan patterns along the scan plane for acquiring data. The acquisition scan plane is used for acquiring data for subsequent images.
- The processor 18 is a digital signal processor, general processor, application specific integrated circuit, control processor, detector or other now known or later developed processor. In one embodiment, the processor 18 is a separate component from the beamformer 14, the beamformer controller 16 and the image processor 20, such as a general system control processor connected with the user interface 24. In other embodiments, the processor 18 is a processor within the beamformer 14, the beamformer controller 16 or the image processor 20, or comprises multiple processors or circuits distributed at the same or different locations throughout the system 10.
- The processor 18 is operable to detect motion within a volume in response to acquired data; that is, the processor 18 identifies motion from the received ultrasound data. While the processor 18 is shown connected to the beamformer 14, in other embodiments the processor 18 connects to an output of the image processor 20 for processing detected data.
- The beamformer controller 16 is a general processor, application specific integrated circuit, digital signal processor or other now known or later developed controller for controlling the beamformer 14. In one embodiment, the beamformer controller 16 is the controller disclosed in U.S. Pat. Nos. 5,675,554 or 5,685,308.
- The beamformer controller 16 is operable to control the position of the data acquisition scan plane relative to the multi-dimensional transducer array 12. The beamformer controller 16 receives input from the processor 18 indicating a desired scan plane position, an amount of motion, a direction of motion, or a change. In response, the beamformer controller 16 alters the position of the data acquisition scan plane for transmit and/or receive operation.
- The beamformer controller 16 controls the apodization and delay profiles generated across the multiple channels of the transmit beamformer 14. The connection of the channels to specific elements within the array may also be controlled by the beamformer controller 16, such as by controlling a multiplexer or a transmit-and-receive switch. As a result, the scan plane may be positioned at any of various positions and angles within three-dimensional space relative to the transducer 12.
- The image processor 20 includes one or more spatial or temporal filters, one or more detectors and a scan converter; additional, different or fewer components may be provided. The image processor 20 receives data responsive to transmission along the acquisition scan plane. The data is detected and converted to a display format, and a resulting image is displayed on the display 22. The detected information or image information may alternatively or additionally be stored for later viewing or processing.
- The image processor 20 is operable to determine one or more quantities as a function of the data, such as a distance between detected data points associated with tissue features. Since the feedback between the beamformer 14, the processor 18 and the beamformer controller 16 provides for real-time or adaptive positioning of the acquisition scan plane, the resulting images generated by the image processor 20 are more likely to be images of the tissue of interest despite undesired motion.
- The user interface 24 is a keyboard, trackball, mouse, touchpad, touch screen, slider, knob, button, combinations thereof or other now known or later developed input device. The user interface 24 is shown connected with the beamformer controller 16; alternatively, it connects to the beamformer controller 16 through one or more other devices, such as the processor 18 or a system control processor.
- The user designates a region of interest within a two-dimensional or three-dimensional image using the user interface 24. The user interface 24 is operable to receive the input indicating a region of interest and to store or otherwise communicate its spatial position within the image to the beamformer controller 16 or the processor 18. For example, a plurality of two-dimensional images are generated; once the user positions the transducer 12 such that a tissue of interest is being imaged, the user indicates the position of the tissue of interest, such as by tracing the tissue or starting an automatic border detection function. Alternatively, the tissue of interest may be set automatically by the system using techniques such as automatic image segmentation.
- The system 10 tracks motion of the transducer 12 relative to the tissue of interest within a three-dimensional volume. As the region of interest moves relative to the transducer 12 due to undesired motion, the acquisition scan plane is altered to account for the movement. As a result, the acquisition scan plane continuously, or at least more reliably, passes through the region of interest.
- FIG. 2 shows one embodiment of a method for stabilizing imaging or an image plane within a volume in medical diagnostic ultrasound or other medical imaging. Different, additional or fewer acts are provided in other embodiments. The method of FIG. 2 is applicable to both two-dimensional and three-dimensional tracking.
- FIGS. 3A-3C show one embodiment of stabilizing a scan region in two dimensions, and FIGS. 4A-4C show a graphical representation of an alternative embodiment stabilizing a two-dimensional scan region within a three-dimensional volume. FIG. 2 will be described with respect to both embodiments.
- A region of interest 40 is identified in act 30. The region of interest 40 is identified from a two-dimensional image of a region using a user input and/or automated detection. The region 42 represents the two-dimensional region that the transducer is capable of scanning, and the region of interest 40 is within the region 42. An image representing the entirety of the region 42 or a subset of the region 42 is acquired.
- Frame-of-reference data, such as data representing the entirety of the region 42, is acquired in act 32. A different or lower sample density may be used in acquiring the frame of reference than for subsequent imaging.
- Motion is tracked in act 34. A plurality of sub-regions 44, such as areas associated with a plurality of scan lines spaced throughout the region 42, are acquired. The sub-regions are two-dimensional areas that extend less than the entire depth of the region 42. Using speckle tracking or tracking of features (e.g., applying gradient processing and then tracking peak gradient locations), any motion of the sub-regions or subimages 44 relative to the reference is determined. For example, any of various constant or adaptive search processes are used to provide a best match or a sufficient match of each of the sub-regions 44 to the reference frame of data.
- A translation along two dimensions and a rotation for each of the subimages 44 is determined. Alternatively, a translation along a single dimension, a translation along two dimensions without rotation, or a translation along a single dimension with rotation is used. The resulting translational and rotational vectors are combined, such as through averaging, to identify an overall motion. Since only sub-regions are scanned for tracking motion, the frame rate is increased compared to scanning the whole region for tracking.
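- The sub-region matching and vector averaging described above can be sketched with an exhaustive search (an illustrative stand-in for the constant or adaptive search processes mentioned; translation only, without rotation, and with hypothetical function names).

```python
import numpy as np

def match_subregion(reference, patch, top_left, search=4):
    """Find the displacement of one tracking sub-region by exhaustive
    sum-of-absolute-differences search around its prior position
    (top_left = row/column of the patch in the reference frame)."""
    r0, c0 = top_left
    h, w = patch.shape
    best_err, best = np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            if r < 0 or c < 0 or r + h > reference.shape[0] \
                    or c + w > reference.shape[1]:
                continue           # candidate window falls off the frame
            err = np.abs(reference[r:r + h, c:c + w] - patch).sum()
            if err < best_err:
                best_err, best = err, (dr, dc)
    return best

def overall_motion(vectors):
    """Combine per-sub-region displacement vectors by averaging."""
    return tuple(np.mean(vectors, axis=0))
```

Because only a handful of small patches are matched rather than the full frame, the tracking cost, like the tracking acquisition itself, stays small relative to full-region scanning.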
- A scan plane position is altered as a function of the tracked motion in act 36. The scan plane 46 is less than the entire spatial extent 42 scannable by the transducer, and is sized to just encompass the region of interest 40 or to include the region of interest 40 as well as additional information. As transducer movement relative to the region of interest 40 occurs, the scan plane 46 is repositioned to scan the region of interest 40 based on the tracked motion.
- In act 38, image data is acquired based on the shifted acquisition scan plane position. Since the scan plane 46 is shifted to account for motion, the region of interest 40 appears in the displayed image at the same location for each subsequent image regardless of motion between the region of interest 40 and the transducer 12. For example, the region of interest 40 shifts within the region 42 as shown in FIG. 3A as opposed to FIG. 3B. By shifting the scan plane 46, the region of interest 40 is moved by the estimated motion amount in the reverse direction and shown to the user as the region 46 of FIG. 3C. The region of interest 40 thus appears stabilized or stationary and is included within each of the images. Acts 34, 36 and 38 are repeated for subsequent images without requiring further acquisition of an entire reference frame of data. In alternative embodiments, an entire frame of reference data may be subsequently acquired.
- FIGS. 4A-4C and 2 represent a similar process for positioning a scan plane in a three-dimensional region 52 as opposed to the two-dimensional region 42 of FIGS. 3A-3C.
- the three-dimensional region 52 corresponds to a volume that is a subset or the entirety of the volume that the transducer is operable to scan.
- the volume region 52 is conical but may be pyramid-shaped, cylindrical, or other shapes in other embodiments.
- the volume region 52 corresponds to electronic steering of a two-dimensional array in one embodiment. The steering is at any of various angles, such as from normal through 45 degrees away from normal. Other angles may be used.
- the region of interest 40 is identified.
- the user moves the transducer relative to the patient or causes the system to move the scan plane 54 to find the region of interest 40 .
- a two-dimensional region is scanned with ultrasound energy.
- the two-dimensional region is the plane EFGH within the volume region 52 .
- the plane 54 is positioned at a center of the transducer array, but may be positioned anywhere in any orientation within the volume region 52.
- a three-dimensional representation is generated for identifying the region of interest 40 .
- the user inputs information designating the region of interest 40 within the region 52 .
- the user identifies the region of interest by tracing, by selecting two or more points, by selecting a point, by automatic border detection, by automatic segmentation or by other now known or later developed techniques.
- the system 10 determines the spatial location of the region of interest 40 within the volume 52 .
- reference data is acquired in act 32 .
- the entire three-dimensional volume is scanned. For example, a representative sample of the volume region 52 is obtained. A larger or smaller three-dimensional volume, such as associated with greater or lesser steering angles, may be scanned.
- the representative sample is acquired over the entire spatial extent in one embodiment, but may be acquired over lesser spatial extents in other embodiments.
- the entire spatial extent is based on an area of the two-dimensional transducer array and the steering angle. For example, the two-dimensional array used for acquiring data defines the entire spatial extent of the scan.
- the representative sample is acquired over the entire or other spatial extent with a same or different scan line density than for subsequent imaging.
- Data representing at least portions of the three-dimensional volume are acquired for positions at least partially around the region of interest.
- a lesser line density, sample density or combinations thereof may be used.
- the representative data is equally or evenly spaced throughout the volume region 52 , but unequal or variations in sample or line density may be provided.
- the data for the entire volume is acquired with a low resolution, such as using a low frequency or smaller aperture. Low resolution may result in a higher frame rate for scanning the entire spatial extent.
- a two-dimensional area is repetitively scanned with the multi-dimensional transducer array.
- the two-dimensional area such as the scan plane 54
- the two-dimensional area can be a C-plane, B-plane or any other variation of the above two planes, obtained by rotating C- or B-planes.
- the transducer may also acquire a small 3D volume enclosing the region of interest, such as with two or more spatially distinct scan planes.
- in act 34, motion within the three-dimensional volume region 52 is tracked.
- the motion within the volume is repetitively detected for generating a plurality of images. Since the volume region 52 where motion is detected includes the two-dimensional scan plane 54 and associated region of interest 40 , the detected motion indicates motion of the region of interest 40 within the volume region 52 .
- Motion is detected by comparing data acquired at different times, such as comparing each subsequently acquired set of data with the reference frame of data acquired in act 32 .
- a lesser amount of data is acquired to maintain higher frame rate.
- acoustic energy is transmitted to three sub-regions of the three-dimensional volume region 52 without acquiring data for the entire three-dimensional volume region 52 .
- two of the sub-regions are along a same set of scan lines. More than three sub-regions along the same or different scan lines may be used.
- the data is acquired using fewer scan lines than performed for acquiring the reference information in act 32. As shown in FIG. 4B, each set of sub-regions includes nine adjacent scan lines, but sets of spaced scan lines, sparse scan lines, a greater number of scan lines or a fewer number of scan lines may be used.
- each of the sets of scan lines 56 is of a same or similar scan line density, but different densities may be provided.
- the scan line density and scan line positions for each of the sets of scan lines 56 are the same density and scan lines for a sub-volume used to acquire the reference frame data, but different densities or scan line positions may be used.
- Acquisition parameters for obtaining data for motion tracking are the same or different than used for acquiring the reference information.
- acquisition of the tracking data is adaptive. For example, the size of each beam, the number of beams or other acquisition parameter is adjusted as a function of a previous motion estimate, the variance associated with the motion estimate or a measure of a tissue rigidity. For large variance motion estimates or low tissue rigidity, the beam size is increased or the number of beams is increased.
- the acquisition parameters may also be updated as a function of a change in acquisition parameters for imaging. For example, the user selects a different center frequency, aperture, F-number, depth of imaging or other imaging parameter. The same parameter is altered for obtaining the tracking data, so the same parameter is used for both tracking and imaging. In alternative or additional embodiments, different imaging parameters are used for tracking than for imaging.
- Data associated with a cubed region at two or more different depths along each of the sets of scan lines 56 is used for comparison and motion detection. As shown in FIG. 4B , six tracking regions 58 are obtained. Additional or fewer sub-regions 58 may be used. In alternative embodiments, one or three or more sub-regions within each of the sets of scan lines 56 are used. While data representing cubes are acquired in one embodiment, data representing any of other various one, two or three-dimensional shapes may be used. In another embodiment, the data along the entire depth of each of the sets of scan lines 56 is used for motion detection.
- a substantially lesser portion of the volume region 52 is scanned than is performed for acquiring the reference information or for scanning an entire volume. For example, 50 percent fewer scan lines are acquired as compared to scanning the entire volume 52 with a same density. A greater or lesser percentage may be provided. As a result, the sub-volumes also represent a substantially less total volume than the entire three-dimensional volume region 52 .
- the motion vectors 60 are determined by tracking each cube using speckle correlation.
- a high pass filter or other filtering and acquisition parameters are selected to best identify or provide speckle information.
- a spatial gradient is applied to the data to identify one or more features within each sub-region 58. Easily identified landmarks, such as cystic areas, blood vessels or highly echogenic specular targets, are tracked instead of tracking pixels within a sub-volume for speckle correlation.
- the sub-regions 58 are adaptively placed prior to acquisition by identifying features within the reference frame of data acquired in act 32. A feature pattern or a volume around an identified single feature for each sub-volume 58 is identified. Filtering or other functions may be used in addition to or as an alternative to the spatial gradient for identifying a tracking feature.
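One way to realize the spatial-gradient feature identification described above can be sketched as follows, assuming the reference data is available as a NumPy array. The frame contents and the bright landmark are invented for illustration, standing in for a cyst or specular target.

```python
import numpy as np

def gradient_magnitude(frame):
    """Spatial gradient magnitude of a 2-D frame of data."""
    gy, gx = np.gradient(frame.astype(float))
    return np.hypot(gx, gy)

# Toy reference frame with one bright, easily identified landmark.
frame = np.zeros((16, 16))
frame[8, 8] = 100.0

mag = gradient_magnitude(frame)
# Centre a tracking sub-region on the strongest gradient response,
# which lands next to the landmark.
peak = np.unravel_index(np.argmax(mag), mag.shape)
```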
- a motion vector 60 is determined for each of the sub-volumes 58 .
- a direction, a magnitude or both a direction and a magnitude of the motion are determined by comparing the data from each of the sub-volumes 58 with the reference data acquired in act 32 .
- a translation within three dimensions is determined without determining rotation.
- the amount and direction of translation of the sub-volume 58 relative to the volume region 52 indicates a motion vector 60 .
- Data responsive to the grouped sets of beams is used to determine the direction and magnitude of motion of the volume region 52 relative to the transducer.
- a minimum sum of absolute differences, cross correlation, or other now known or later developed correlation is used to match the data for the sub-volumes 58 with the reference data.
- Correlation is performed using data prior to detection, data after detection but prior to scan conversion, data after scan conversion, display image data, or other data. Any of various search patterns involving translating and/or rotating the data representing the sub-volumes 58 relative to the reference data is used to identify a best match. A coarse search followed by a fine search, a search adapted to expected motion, a size of the region to be searched adapted to previous amounts of motion, or other adaptive or efficient search techniques may be used.
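A minimal translation-only sketch of the minimum-sum-of-absolute-differences matching described here, using an exhaustive 2-D search for clarity. A real implementation would search in three dimensions and use the coarse-to-fine or motion-adapted strategies mentioned above; the reference frame and block here are invented test data.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(a.astype(float) - b.astype(float)).sum()

def match_block(reference, block):
    """Exhaustive translation-only search for `block` inside `reference`.

    Returns the (row, col) offset minimizing the SAD.
    """
    h, w = block.shape
    best_score, best_pos = None, None
    for r in range(reference.shape[0] - h + 1):
        for c in range(reference.shape[1] - w + 1):
            score = sad(reference[r:r + h, c:c + w], block)
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Toy check: a block cut from a known offset of a random "reference
# frame" is recovered at that offset, where the SAD is exactly zero.
rng = np.random.default_rng(0)
ref = rng.random((20, 20))
blk = ref[5:10, 7:12]
offset = match_block(ref, blk)   # → (5, 7)
```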
- the same reference data is used to compare to each subsequently acquired set of data representing sub-regions 58 .
- data representing a subsequently acquired sub-volume 58 is compared to data from a previously acquired sub-volume in the same general area. Given a minimal amount of motion, the motion vector may be small enough to track from one sub-volume to a subsequently acquired sub-volume without comparison to the reference frame of data.
- the sub-volumes are translated and/or rotated as a group to find a single motion vector.
- a least squares fit, an average, or other combination of the motion vectors is used to calculate a single transformation indicating motion of the transducer 12 relative to the volume region 52 .
- Rigid body motion is assumed for each sub-volume, but warping or other techniques may be used to account for tissue deformation.
- Translations in three dimensions and rotations about the three dimensions are determined using a least squares fit, such as determined from using six separate motion vectors shown in FIG. 4B .
- U.S. Pat. No. 6,306,091 discloses various techniques for identifying a rigid body transformation from a plurality of vectors.
- the motion tracking, subvector or global vector techniques disclosed in the '091 patent are extended to three-dimensional processing.
- the resulting rigid body transformation represents six degrees of freedom, such as translation in the X, Y and Z dimensions as well as rotation about each of the dimensions. In alternative embodiments, fewer degrees of freedom, or motion associated with only translation, only rotation or a subset of the six degrees of freedom, is provided.
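The least-squares rigid-body fit from multiple motion vectors can be sketched with the standard Kabsch/Procrustes solution. The source cites the '091 patent for specific techniques, so this is only a generic illustration; the six point positions and the known displacement are invented so the answer can be checked.

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid-body transform with dst ≈ R @ src + t.

    src, dst: (N, 3) corresponding sub-volume positions before and
    after motion (classic Kabsch/Procrustes solution).
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Six tracking positions (as with the six motion vectors of FIG. 4B),
# displaced here by a pure translation so the expected result is known.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
                [0, 0, 1], [1, 1, 0], [1, 0, 1]], float)
true_t = np.array([0.5, -0.2, 0.1])
dst = src + true_t

R, t = rigid_fit(src, dst)   # R ≈ identity, t ≈ true_t
```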
- one or more tracking parameters are adjusted as a function of a position of the tracking location within the region 52 .
- For example, as the speckle is positioned deeper and deeper within the region 52, diffraction results in larger speckle. The element factor or other factors may change as a function of depth, steering angle or other location within the volume region 52.
- the correlation, cross correlation, minimum sum of differences or other matching function is altered based on the position.
- a warping, such as a one-, two- or three-dimensional expansion or contraction of the data, is performed as part of the correlation operation as a function of the position of the tracking location. By spatially expanding or contracting the data, the data more likely matches the reference data.
- the position of the acquisition scan plane 54 is automatically altered relative to the transducer 12 as a function of the detected motion.
- FIG. 4C shows transformation of the acquisition scan plane 54 to account for the detected or estimated motion.
- the acquisition scan plane 54 is maintained at a position to intersect the region of interest 40 over time.
- the position of the acquisition scan plane 54 is altered within the three-dimensional volume region 52 to account for relative motion between the region of interest 40 and the transducer 12 .
- the motion tracking provides information on the position of the region of interest 40 within the volume region 52 relative to the transducer 12 .
- the acquisition scan plane (i.e., the transmission and/or reception plane) is adaptively repositioned, altered or updated.
- the two-dimensional area of the acquisition scan plane is positioned within the volume as a function of and in response to the detected motion and the region of interest 40 .
- the acquisition scan plane 54 is positioned in a plane QPRS different than the EFGH plane of FIG. 4A as a function of the motion vectors 60 shown in FIG. 4B .
- the region of interest 40 moves in a direction opposite to the detected motion.
- the acquisition scan plane 54 is translated and/or rotated, such as translating and rotating within the six degrees of freedom.
- the acquisition scan plane 54 is translated and maintained at the same angle relative to the normal to the array or not rotated.
- six degrees of freedom may be provided for positioning the acquisition scan plane 54 . Fewer degrees of freedom may be provided for other multi-dimensional or two-dimensional arrays.
- the scan plane 54 is adaptively positioned using one or more degrees of freedom to more likely scan the region of interest 40 .
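Repositioning the acquisition scan plane by the estimated rigid-body motion can be sketched as applying the same transform to the plane's corner points. The rotation angle, translation and corner coordinates below are hypothetical values for illustration.

```python
import numpy as np

# Corners E, F, G, H of the current acquisition plane 54 within the
# volume region 52 (hypothetical millimetre coordinates).
plane = np.array([[ 0.0,  0.0, 30.0],
                  [40.0,  0.0, 30.0],
                  [40.0, 40.0, 30.0],
                  [ 0.0, 40.0, 30.0]])

# Estimated rigid-body motion of the tissue relative to the transducer:
# a small rotation about the depth axis plus a translation.
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([2.0, -1.0, 0.5])

# Move the plane with the tissue: apply the same transform to each
# corner, giving a repositioned plane (QPRS) that still cuts the ROI.
new_plane = plane @ R.T + t
```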
- the reference frame of data is acquired for every N frames of data containing the region of interest, where N is a number such as 10. Other values may also be used for N.
- image data is acquired.
- Acoustic energy is electronically or mechanically steered across the acquisition scan plane 54 in any of now known or later developed formats, such as sector, vector, linear or as a plane wave.
- the data from acoustic echoes represents the tissue intersected within the acquisition scan plane 54 .
- Received data is beamformed, image processed and used to generate a two-dimensional or three-dimensional display.
- the region of interest 40 is represented in the image due to the shift in the scan plane position.
- spectral Doppler display associated with a range gate position or point, continuous wave Doppler display associated with a line, or M-mode display associated with a line are generated from a point or line within the scan plane 54 .
- the point or line is tracked and adaptively positioned.
- the motion tracking of act 34 , the acquisition scan plane position alteration of act 36 and the acquisition of image data of act 38 are repeated over time such that the two-dimensional acquisition scan plane 54 is adaptively positioned to intersect the region of interest over time.
- the adaptively positioned acquisition scan planes are repetitively scanned for generating images. Upon viewing a sequence of images, the user perceives the region of interest 40 as being stationary or stabilized. The region of interest 40 is less likely to fade out of the images due to the adapted positioning of the scan plane 54 .
- the acquisition scan plane 54 is maintained in a position to intersect the region of interest 40 during multiple acquisitions accounting for relative motion of the transducer 12 to the tissue.
- Further stabilization is provided by shifting the resulting two-dimensional images as a function of an initial position of the region of interest 40 .
- Adaptive positioning of the acquisition scan plane 54 results in the region of interest 40 being continually imaged.
- the region of interest 40 may also or alternatively be shifted within the displayed two-dimensional image by translation along one or two dimensions and/or rotation to maintain further stabilization. For example, as the transducer shifts to the left relative to the tissue, the region of interest 40 may appear to shift to the left within resulting images.
- the region of interest 40 is tracked, or the shift is accounted for, in the displayed two-dimensional image. In one embodiment, the shift occurs to the image data in range and azimuth.
- the acquisition scan plane 54 extends only over a portion of the width of the volume region 52 accessible by the transducer 12 .
- the positioning of the acquisition scan plane automatically shifts the region of interest 40 in the displayed image.
- where tissue is compressed due to additional pressure from the transducer 12, or extended due to a release of pressure, the region of interest 40 may appear to shift upwards or downwards on the image.
- the region of interest 40 is maintained in the same location on the display.
- a further shift in the acquisition scan plane position 54 may be performed as a function of time. For example, the difference in time between acquisition of the data used for tracking motion and the acquisition of data used for generating an image is considered. A velocity, acceleration or both velocity and acceleration are determined. The temporal difference is used with the velocity or acceleration information to determine an additional shift.
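The time-based additional shift can be sketched as a simple velocity extrapolation over the tracking-to-imaging latency. All values are hypothetical; as the text notes, an acceleration term could be added with a third tracked sample.

```python
# Tracked ROI position (hypothetical: millimetres along one axis) at the
# last two tracking acquisitions, with timestamps in seconds.
p_prev, t_prev = 10.0, 0.00
p_last, t_last = 10.6, 0.05

velocity = (p_last - p_prev) / (t_last - t_prev)   # mm/s

# Temporal difference between the tracking acquisition and the imaging
# acquisition that will use the result (hypothetical latency).
latency = 0.02  # s

# Extrapolate to the imaging time; the scan plane is shifted by the
# additional amount beyond the last tracked position.
predicted = p_last + velocity * latency
extra_shift = predicted - p_last
```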
- the tracking of the imaging plane is used for any B-mode, Doppler, M-mode, spectral Doppler, continuous wave Doppler, harmonic, contrast agent imaging or other imaging.
- Other applications may benefit from tracking the position of the acquisition scan plane 54 in three-dimensional volume region 52 .
- tumor perfusion using contrast agents or other radiology-based contrast quantification is performed.
- Contrast agent quantification may also be performed for myocardial perfusion or other cardiology applications.
- Triggered imaging of the heart is provided so that the resulting images are acquired at a same time during a heart cycle. Alternatively, warping or other non-rigid motion is accounted for throughout the heart cycle.
- Another application is a biopsy or surgical guidance. Better guidance may be provided by maintaining the scan plane in position relative to the region of interest.
- cardiovascular quantitative measurements, such as vascular measurements of carotid plaque assessment, pulsatility, aortic aneurysm or others.
- the acquisition scan plane is used for acquiring both Doppler and B-mode information.
- the motion is tracked using B-mode or other information. Any of various combinations of using the same or different data for tracking and imaging may be used.
- By stabilizing the scan plane position relative to the region of interest 40, more aggressive persistence for Doppler imaging may be used with no or minimal decrease in resolution. Vessel structure reconstruction may also be improved.
- Other high-persistence imaging, such as contrast agent imaging to identify microvascular structures, may be improved.
- Off-line motion tracking processing is eliminated by providing the tracking and imaging described above in real time, or while a patient is being scanned during an imaging session.
- Real time imaging is provided due to the reduced or minimal impact of acquiring motion information using sub-volumes.
- Motion tracking is performed in three dimensions without having to acquire consecutive full three-dimensional volume representations of data.
- stabilization of the acquisition scan plane is used for acquiring a three-dimensional set of data.
- the scan plane is purposefully positioned at different locations within the volume region 52 .
- motion of the transducer 12 relative to the tissue is accounted for as discussed above.
- the acquisition scan plane position is adjusted as a function of both the motion and the intended displacement for three-dimensional data acquisition.
Abstract
A medical imaging system automatically acquires two-dimensional images representing a user-defined region of interest despite motion. The plane of acquisition is updated or altered adaptively as a function of detected motion. The user-designated region of interest is then continually scanned due to the alteration in scan plane position. A multi-dimensional array is used to stabilize imaging of a region of interest in a three-dimensional volume. The user defines a region of interest for two-dimensional imaging. Motion is then detected. The position of a scan plane used to generate a subsequent two-dimensional image is then oriented as a function of the detected motion within the three-dimensional volume. By repeating the motion determination and adaptive alteration of the scan plane position, real time imaging of a same region of interest is provided while minimizing the region of interest fading into or out of the sequence of images.
Description
- The present invention relates to image stabilization in medical imaging. An imaging position is stabilized with respect to a region of interest as images are acquired over time.
- In medical diagnostic ultrasound imaging, a transducer is positioned adjacent to a patient. The sonographer attempts to maintain the transducer in a given position relative to a region of interest within the patient. Temporal variations in the transducer position due to movements by the sonographer, movements by the patient, breathing motion, heart motion or other sources of motion cause the transducer to move relative to the patient. The scan plane is typically fixed at least in the elevation dimension with respect to the transducer. The undesired or unintended motion results in scanning different tissue within the patient.
- Images may be stabilized within the scan plane. Motion between subsequent images in a sequence of images is tracked. The acquired image data is then adjusted or shifted along the azimuth or range dimensions so that a region of interest is maintained at the same location on the display. Other processes, such as contrast agent quantification, use motion tracking to reduce motion artifacts. Previously acquired data is processed or shifted as a function of the motion to reduce the artifacts. However, some motion artifacts may remain despite shifts in data. The shifted data may not optimally represent the region of interest. To provide maximum versatility, a large amount of unused image information is acquired and stored for allowing shifts. Acquiring large amounts of ultrasound information may reduce frame rates.
- Motion tracking is also used in three-dimensional and extended field of view imaging. A plurality of two-dimensional scans are performed in different positions within a same plane for extended field of view imaging. The motion between the various acquired images is determined for assembling the images together in an extended field of view. Similarly for three-dimensional imaging, a plurality of two-dimensional images are acquired for a plurality of scan planes within a three-dimensional volume. Motion tracking is performed using ultrasound data, motion sensors on the transducer or other techniques for determining the relative positions of the scan planes. An image representing three-dimensional space is then rendered from the acquired sets of ultrasound data. However, multiple images or sets of data are acquired to form the extended field of view or three-dimensional representation.
- Another motion adaptive process is disclosed in U.S. Pat. No. 5,873,830. An amount of motion between different images is detected. Where motion is not detected or minimal, the beamformer is configured to increase spatial resolution, such as by increasing line density or the number of transmit beams. Where motion is detected, the frame rate is increased by decreasing the line density or number of beams. However, changing density or number of beams as a function of detected motion may still result in desired tissue fading in or fading out of the image scan plane due to the motion.
- By way of introduction, the preferred embodiments described below include methods and systems for stabilizing a scan plane in medical imaging. A medical imaging system automatically acquires two-dimensional images representing a user-defined region of interest despite motion. The plane of acquisition is updated or altered adaptively as a function of detected motion. The user-designated region of interest is then continually scanned due to the alteration in scan plane position.
- In one embodiment, a multi-dimensional array is used to stabilize imaging of a region of interest in a three-dimensional volume. The user defines a region of interest for two-dimensional imaging. Motion is then detected for six or other number of degrees of freedom, such as translation along each of three dimensions and rotation about each of those three dimensions. The position of a scan plane used to generate a subsequent two-dimensional image is then oriented as a function of the detected motion within the three-dimensional volume. The scan plane is positioned such that the region of interest designated by the user is within the scan plane. By repeating the motion determination and adaptive alteration of the scan plane position, real time imaging of a same region of interest is provided while minimizing the region of interest fading into or out of the sequence of images.
- In a first aspect, a method for stabilizing an image plane in medical imaging is provided. Motion is tracked within a region. An acquisition scan plane position is automatically altered relative to the transducer as a function of the motion.
- In a second aspect, a method for stabilizing a scan plane within a volume in medical diagnostic ultrasound imaging is provided. A region of interest is identified. Data representing at least portions of a three-dimensional volume positioned at least partially around the region of interest is acquired. Data representing sub-volumes of the three-dimensional volume is acquired using fewer scan lines. The data representing the sub-volumes is compared with the data representing the portions of the three-dimensional volume. Motion is detected as a function of the comparison. A two-dimensional scan plane is positioned within the three-dimensional volume as a function of the region of interest and the detected motion. A two-dimensional image is then acquired using the positioned two-dimensional scan plane.
- In a third aspect, a method for stabilizing imaging within a volume in medical diagnostic ultrasound imaging is provided. A two-dimensional area is repetitively scanned with a multi-dimensional transducer array. Motion within a volume that includes the two-dimensional area is repetitively detected. The two-dimensional area is adaptively repositioned within the volume as a function of the detected motion.
- In a fourth aspect, a system for stabilizing a scan plane within a volume in medical imaging is provided. A multi-dimensional transducer array connects with a beamformer. The beamformer is responsive to a beamformer controller and is operable to acquire data representing tissue within a data acquisition scan plane. The beamformer controller is operable to control a position of the data acquisition scan plane relative to the multi-dimensional transducer array. A processor is operable to detect motion within a volume. The beamformer controller is operable to alter the position of the data acquisition scan plane in response to the detected motion.
- The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims. Further aspects and advantages of the invention are discussed below in conjunction with the preferred embodiments.
- The components and the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. Moreover, in the figures, like reference numerals designate corresponding parts throughout the different views.
-
FIG. 1 is a block diagram of one embodiment of a system for stabilizing a scan plane in medical imaging; -
FIG. 2 is a flow chart diagram of a method for stabilizing an image plane in medical imaging in one embodiment; -
FIGS. 3A-3C are graphical representations of one embodiment for implementing the method of FIG. 2 for two-dimensional motion tracking; and -
FIGS. 4A-4C are graphical representations representing another embodiment of FIG. 2 for three-dimensional motion tracking. - Image movement due to respiratory motion, patient motion, sonographer motion or other undesirable motions leading to a tissue of interest moving into and/or out of a sequence of images is avoided. By tracking a position of a tissue of interest, subsequent acquisitions are aligned to insonify the tissue, resulting in a steady or more stable maintenance of the image plane relative to the tissue of interest. In addition to ease of use and general aesthetic desirability, quantification is made more stable and consistent. Diagnosis may be improved since each image in the sequence is more likely to represent the tissue of interest. Moving tissues, such as those associated with fetal or cardiology imaging, may be more accurately monitored by maintaining a scan plane relative to the moving tissue despite the tissue movement. Perfusion measurements, such as associated with contrast agent enhancement applications, may be improved.
-
FIG. 1 shows one embodiment of a medical imaging system 10 for stabilizing a scan plane within a region or volume. The system 10 includes a transducer 12, a beamformer 14, a beamformer controller 16, a processor 18, an image processor 20, a display 22 and a user interface 24. Additional, different or fewer components may be provided, such as not having the user interface 24, image processor 20 or display 22. In one embodiment, the system 10 is a medical diagnostic ultrasound system for acquiring image information using acoustic energy. In alternative embodiments, other medical imaging systems may be used, such as computed tomography, magnetic resonance, X-ray or other now known or later developed imaging systems. Using feedback from received data to the scanner, such as the ultrasound beamformer 14 used for acquisition, the position of a subsequent scan is controlled to maintain a tissue of interest within the scan plane or acquisition region. - The
transducer 12 is a multi-dimensional array of elements. For example, a 1.5, 1.75 or two-dimensional array of elements is provided. Annular, wobbler, or other mechanically or electrically steerable arrays may be used. Two-dimensional array is used broadly to include an array of N×M elements where N and M are equal or non-equal, but both greater than 1. Arrays with non-square or non-rectangular element patterns may be provided in any multi-dimensional arrangement. The multi-dimensional transducer array 12 is steerable in two dimensions, such as along the elevation and azimuth dimensions. In alternative embodiments, the transducer 12 is a one-dimensional linear array for scanning a two-dimensional region. - The
beamformer 14 is an analog or digital ultrasound transmit and/or receive beamformer. In one embodiment, the beamformer 14 is the beamformer disclosed in U.S. Pat. Nos. 5,675,554 and 5,685,308, the disclosures of which are incorporated herein by reference. The beamformer 14 is shown in general, but in one embodiment includes both transmit and receive beamformers as separate devices. The transmit beamformer generates the acoustic energy along the acquisition scan plane. The receive beamformer receives responsive echo signals and provides them to the image processor 20 and the processor 18. - In one embodiment, sufficient beamformer channels are provided on transmit and/or receive to beamform along both the azimuth and elevation dimensions. To reduce the number of beamformer channels, sparse array techniques may be used. Alternatively, plane wave imaging techniques are provided. In yet another alternative embodiment, the number of cables between the
transducer 12 and the beamformer 14 is reduced by time division multiplexing for allowing a greater number of channels while minimizing the size of the cable. In yet another alternative embodiment, sufficient channels are provided for beamforming along an azimuth dimension, and switchable connections between the channels and elements of the arrays are used to position a linear array of elements at any of various azimuth and elevation positions on the plane of the transducer 12. As a result, an acquisition scan plane is always normal to at least one dimension, but electronic steering is provided for scanning along angles for another dimension. - In one embodiment, the
beamformer 14 includes a plurality of transmit channels connectable with one or more of the elements of the transducer 12. Each transmit channel includes a delay, an amplifier and a waveform generator. Additional, different or fewer components may be provided. The transmit channels generate waveforms with different apodization and delay profiles relative to other waveforms for steering acoustic energy along one or more scan lines. By selecting which transmit channels connect to which elements of the transducer array 12, ultrasound scan lines are generated at any of various azimuthal and elevation locations and angles. - The
beamformer 14 is responsive to the beamformer controller 16 for positioning the acquisition scan plane. Using the above-described electronic, mechanical or both electronic and mechanical steering, the acquisition scan plane is positioned within a two-dimensional or three-dimensional region. Acoustic energy is transmitted in any of various now known or later developed scan patterns along the scan plane for acquiring data. The acquisition scan plane is used for acquiring data for subsequent images. - As part of a feedback control loop, the
processor 18 is a digital signal processor, general processor, application specific integrated circuit, control processor, detector or other now known or later developed processor. In one embodiment, the processor 18 is a separate component from the beamformer 14, the beamformer controller 16 and the image processor 20. For example, the processor 18 is a general, system control processor connected with the user interface 24. In other embodiments, the processor 18 is a processor within the beamformer 14, the beamformer controller 16 or the image processor 20. In yet other embodiments, the processor 18 includes multiple processors or circuits distributed at the same or different locations throughout the system 10. The processor 18 is operable to detect motion within a volume in response to acquired data. The processor 18 identifies motion from the received ultrasound data. While the processor 18 is shown connected to the beamformer 14, in other embodiments, the processor 18 connects to an output of the image processor 20 for processing detected data. - The
beamformer controller 16 is a general processor, application specific integrated circuit, digital signal processor or other now known or later developed controller for controlling the beamformer 14. In one embodiment, the beamformer controller 16 is the controller disclosed in U.S. Pat. Nos. 5,675,554 or 5,685,308. The beamformer controller 16 is operable to control a position of the data acquisition scan plane relative to the multi-dimensional transducer array 12. The beamformer controller 16 receives input from the processor 18. The input indicates a desired scan plane position, an amount of motion, a direction of motion, or a change in position. In response to the detected motion provided by the processor 18 or calculated by the beamformer controller 16, the beamformer controller 16 is operable to alter the position of the data acquisition scan plane for transmit and/or receive operation. For example, the beamformer controller 16 controls the apodization and delay profile generated across the multiple channels of the transmit beamformer 14. The connection of the channels to specific elements within the array may also be controlled by the beamformer controller 16, such as by controlling a multiplexer or transmit and receive switch. As a result, the scan plane is positioned at any of various positions and angles within three-dimensional space relative to the transducer 12. - The
image processor 20 includes one or more spatial or temporal filters, one or more detectors and a scan converter. Additional, different or fewer components may be provided. The image processor 20 receives data responsive to transmission along the acquisition scan plane. The data is then detected and converted to a display format. A resulting image is displayed on the display 22. The detected information or image information may alternatively or additionally be stored for later viewing or processing. In one embodiment, the image processor 20 is operable to determine one or more quantities as a function of the data, such as a distance between detected data points associated with tissue features. Since the feedback between the beamformer 14, the processor 18 and the beamformer controller 16 provides for real time or adaptive positioning of the acquisition scan plane, the resulting images generated by the image processor are more likely images of a tissue of interest despite undesired motions. - The
user interface 24 is a keyboard, trackball, mouse, touchpad, touch-screen, slider, knob, button, combinations thereof or other now known or later developed input device. The user interface 24 is shown connected with the beamformer controller 16. In alternative embodiments, the user interface 24 connects to the beamformer controller 16 through one or more other devices, such as the processor 18 or a system control processor. The user designates a region of interest within a two-dimensional or three-dimensional image using the user interface 24. The user interface 24 is operable to receive the input indicating a region of interest and store or otherwise communicate the spatial position within the image to the beamformer controller 16 or processor 18. For example, a plurality of two-dimensional images are generated. Once the user positions the transducer 12 such that a tissue of interest is being imaged, the user indicates the position of the tissue of interest, such as by tracing the tissue of interest or starting an automatic border detection function. Alternatively, the tissue of interest may be automatically set by the system using techniques such as automatic image segmentation. - Once a region of interest is identified by the user, the
system 10 tracks motion of the transducer 12 relative to the tissue of interest within a three-dimensional volume. As the region of interest moves relative to the transducer 12 due to undesired motion, the acquisition scan plane is altered to account for the movement. As a result, the acquisition scan plane continuously, or at least more likely, passes through the region of interest. -
FIG. 2 shows one embodiment of a method for stabilizing imaging or an image plane within a volume in medical diagnostic ultrasound or other medical imaging. Different, additional or fewer acts are provided in other embodiments. The method of FIG. 2 is applicable to both two-dimensional and three-dimensional tracking. FIGS. 3A-3C show one embodiment of stabilizing a scan region in two dimensions. FIGS. 4A-4C show a graphic representation of an alternative embodiment of stabilizing a two-dimensional scan region within a three-dimensional volume. FIG. 2 will be described with respect to both embodiments. - Referring to
FIGS. 3A-3C and FIG. 2, a region of interest 40 is identified in act 30. The region of interest 40 is identified from a two-dimensional image of a region using a user input and/or automated detection. The region 42 represents a two-dimensional region that the transducer is capable of scanning. The region of interest 40 is within the region 42. For example, an image representing the entirety of the region 42 or a subset of the region 42 is acquired. - Frame of reference data, such as data representing the entirety of the
region 42, is acquired in act 32. A different or lower sample density may be provided in acquiring the frame of reference data than for subsequent imaging. - As represented in
FIG. 3B, motion is tracked in act 34. A plurality of sub-regions 44, such as areas associated with a plurality of scan lines spaced throughout the region 42, are acquired. In alternative embodiments, the sub-regions are two-dimensional areas that extend less than the entirety of the depth of the region 42. Using speckle tracking or tracking of features (e.g., applying gradient processing and then tracking peak gradient locations), any motion of the sub-regions or subimages 44 relative to the reference is determined. For example, any of various constant or adaptive search processes are used to provide a best match or a sufficient match of each of the sub-regions 44 to the reference frame of data. A translation along two dimensions and a rotation for each of the subimages 44 is determined. In alternative embodiments, a translation along a single dimension, a translation along two dimensions without rotation, or a translation along a single dimension with rotation is used. The resulting translational and rotational vectors are combined, such as through averaging, to identify an overall motion. Since only sub-regions are scanned for tracking motion, the frame rate is increased compared to scanning the whole volume for tracking. - As shown in
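The sub-region matching and vector averaging described above can be sketched as follows. This is a minimal illustration, assuming an exhaustive translation-only search with a sum-of-absolute-differences (SAD) match; the function names, array sizes and search radius are illustrative, not taken from this disclosure:

```python
import numpy as np

def sad_track_2d(reference, subimage, top_left, search=8):
    """Estimate the 2-D translation of one tracking sub-region.

    `reference` is the full reference frame; `subimage` is the newly
    acquired sub-region whose nominal top-left corner in the reference
    was `top_left`.  An exhaustive search over +/- `search` pixels finds
    the offset with the minimum sum of absolute differences (SAD).
    """
    h, w = subimage.shape
    r0, c0 = top_left
    best, best_sad = (0, 0), np.inf
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            # Skip candidate positions that fall outside the reference.
            if r < 0 or c < 0 or r + h > reference.shape[0] or c + w > reference.shape[1]:
                continue
            sad = np.abs(reference[r:r + h, c:c + w] - subimage).sum()
            if sad < best_sad:
                best_sad, best = sad, (dr, dc)
    return best

def overall_motion(per_region_vectors):
    """Combine per-sub-region translations into one estimate by averaging."""
    return tuple(np.mean(per_region_vectors, axis=0))
```

A usage example: tracking several sub-regions and averaging their vectors gives the single overall motion that the scan plane is then shifted by.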
FIG. 3C, a scan plane position is altered as a function of the tracked motion in act 36. The scan plane 46 covers less than the entire spatial extent 42 scannable by the transducer. The scan plane 46 is sized to just encompass the region of interest 40 or to include the region of interest 40 as well as additional information. As transducer movement relative to the region of interest 40 occurs, the scan plane 46 is positioned to scan the region of interest 40 based on the tracked motion. - In
act 38, image data is acquired based on the shifted acquisition scan plane position. Since the scan plane 46 is shifted to account for motion, the region of interest 40 appears in the displayed image at the same location for each subsequent image regardless of motion between the region of interest 40 and the transducer 12. For example, the region of interest 40 shifts within the region 42 as shown by comparing FIG. 3A with FIG. 3B. By shifting the scan plane 46, the region of interest 40 is then moved by the estimated motion amount in the reverse direction and shown to the user as the region 46 shown in FIG. 3C. The region of interest 40 appears to be stabilized or stationary and is included within each of the images. Acts 34, 36 and 38 are repeated over time. -
FIGS. 4A-4C and 2 represent a similar process for positioning a scan plane in a three-dimensional region 52 as opposed to the two-dimensional region 42 of FIGS. 3A-3C. The three-dimensional region 52 corresponds to a volume that is a subset or the entirety of the volume that the transducer is operable to scan. As shown in FIG. 4A, the volume region 52 is conical but may be pyramid-shaped, cylindrical, or other shapes in other embodiments. The volume region 52 corresponds to electronic steering of a two-dimensional array in one embodiment. The steering is at any of various angles, such as from normal through 45 degrees away from normal. Other angles may be used. - In
act 30, the region of interest 40 is identified. The user moves the transducer relative to the patient or causes the system to move the scan plane 54 to find the region of interest 40. For example, a two-dimensional region is scanned with ultrasound energy. As represented in FIG. 4A, the two-dimensional region is the plane EFGH within the volume region 52. In one embodiment, the plane 54 is positioned at a center of the transducer array, but may be positioned anywhere in any orientation within the volume region 52. In alternative embodiments, a three-dimensional representation is generated for identifying the region of interest 40. - Once an image includes the desired region of
interest 40, the user inputs information designating the region of interest 40 within the region 52. For example, the user identifies the region of interest by tracing, by selecting two or more points, by selecting a point, by automatic border detection, by automatic segmentation or by other now known or later developed techniques. Based on the position of the scan plane 54 relative to the transducer 12 and the position of tissue designated within the image, the system 10 determines the spatial location of the region of interest 40 within the volume 52. - Once the region of
interest 40 is identified, reference data is acquired in act 32. The entire three-dimensional volume is scanned. For example, a representative sample of the volume region 52 is obtained. A larger or smaller three-dimensional volume, such as associated with greater or lesser steering angles, may be scanned. The representative sample is acquired over the entire spatial extent in one embodiment, but may be acquired over lesser spatial extents in other embodiments. The entire spatial extent is based on the area of the two-dimensional transducer array and the steering angle. For example, the two-dimensional array used for acquiring data defines the entire spatial extent of the scan. - The representative sample is acquired over the entire or other spatial extent with the same or a different scan line density than for subsequent imaging. Data representing at least portions of the three-dimensional volume are acquired for positions at least partially around the region of interest. A lesser line density, sample density or combinations thereof may be used. In one embodiment, the representative data is equally or evenly spaced throughout the
volume region 52, but unequal spacing or variations in sample or line density may be provided. In one embodiment, the data for the entire volume is acquired with a low resolution, such as using a low frequency or a smaller aperture. Low resolution may result in a higher frame rate for scanning the entire spatial extent. - Once the region of
interest 40 is identified and a frame of reference data in a known spatial relationship to the region of interest 40 is acquired, a two-dimensional area is repetitively scanned with the multi-dimensional transducer array. The two-dimensional area, such as the scan plane 54, is adaptively positioned within the volume region 52 as a function of tracked or detected motion. The two-dimensional area can be a C-plane, a B-plane or any other variation of the two, obtained by rotating C- or B-planes. Instead of a two-dimensional area, the transducer may also acquire a small three-dimensional volume enclosing the region of interest, such as with two or more spatially distinct scan planes. By repositioning the two-dimensional area or the scan plane 54, the region of interest 40 is continually scanned despite relative movement between the region of interest 40 and the transducer 12. - In
act 34, motion within the three-dimensional volume region 52 is tracked. The motion within the volume is repetitively detected for generating a plurality of images. Since the volume region 52 where motion is detected includes the two-dimensional scan plane 54 and the associated region of interest 40, the detected motion indicates motion of the region of interest 40 within the volume region 52. - Motion is detected by comparing data acquired at different times, such as comparing each subsequently acquired set of data with the reference frame of data acquired in
act 32. Rather than acquiring data representing the entire volume region 52, such as performed in act 32, a lesser amount of data is acquired to maintain a higher frame rate. For example, acoustic energy is transmitted to three sub-regions of the three-dimensional volume region 52 without acquiring data for the entire three-dimensional volume region 52. In one embodiment, two of the sub-regions are along a same set of scan lines. More than three sub-regions along the same or different scan lines may be used. The data is acquired using fewer scan lines than performed for acquiring the reference information in act 32. As shown in FIG. 4B, three sets of scan lines 56 are transmitted at different angles and locations within the volume region 52. In one embodiment, each set of sub-regions includes nine adjacent scan lines, but sets of spaced scan lines, sparse scan lines, a greater number of scan lines or a fewer number of scan lines may be used. In one embodiment, each of the sets of scan lines 56 is of a same or similar scan line density, but different densities may be provided. In one embodiment, the scan line density and scan line positions for each of the sets of scan lines 56 are the same density and scan lines for a sub-volume used to acquire the reference frame data, but different densities or scan line positions may be used. - Acquisition parameters for obtaining data for motion tracking are the same as or different than used for acquiring the reference information. In an alternative embodiment, acquisition of the tracking data is adaptive. For example, the size of each beam, the number of beams or another acquisition parameter is adjusted as a function of a previous motion estimate, the variance associated with the motion estimate or a measure of tissue rigidity. For large variance motion estimates or low tissue rigidity, the beam size is increased or the number of beams is increased.
The acquisition parameters may also be updated as a function of a change in acquisition parameters for imaging. For example, the user selects a different center frequency, aperture, F number, depth of imaging or other imaging parameter. The same parameter is altered for obtaining the tracking data, so that the same parameter is used for both tracking and imaging. In alternative or additional embodiments, different imaging parameters are used for tracking than for imaging.
- Data associated with a cubed region at two or more different depths along each of the sets of
scan lines 56 is used for comparison and motion detection. As shown in FIG. 4B, six tracking regions 58 are obtained. Additional or fewer sub-regions 58 may be used. In alternative embodiments, one or three or more sub-regions within each of the sets of scan lines 56 are used. While data representing cubes is acquired in one embodiment, data representing any of other various one-, two- or three-dimensional shapes may be used. In another embodiment, the data along the entire depth of each of the sets of scan lines 56 is used for motion detection. By acquiring data in only the sub-regions 58 or along the sets of scan lines 56, a substantially lesser portion of the volume region 52 is scanned than is performed for acquiring the reference information or for scanning an entire volume. For example, 50 percent fewer scan lines are acquired as compared to scanning the entire volume 52 with a same density. A greater or lesser percentage may be provided. As a result, the sub-volumes also represent a substantially smaller total volume than the entire three-dimensional volume region 52. - The
motion vectors 60 are determined by tracking each cube using speckle correlation. A high pass filter or other filtering and acquisition parameters are selected to best identify or provide speckle information. In alternative embodiments, a spatial gradient is applied to the data to identify one or more features within each sub-region 58. Easily identified landmarks, such as cystic areas, blood vessels or highly echogenic specular targets, are tracked instead of tracking pixels within a sub-volume for speckle correlation. In another embodiment, the sub-regions 58 are adaptively placed prior to acquisition by identifying features within the reference frame of data acquired in act 32. A feature pattern or a volume around an identified single feature for each sub-volume 58 is identified. Filtering or other functions may be used in addition to or as an alternative to the spatial gradient for identifying a tracking feature. - A
motion vector 60 is determined for each of the sub-volumes 58. For example, a direction, a magnitude or both a direction and a magnitude of the motion are determined by comparing the data from each of the sub-volumes 58 with the reference data acquired in act 32. In one embodiment, a translation within three dimensions is determined without determining rotation. The amount and direction of translation of the sub-volume 58 relative to the volume region 52 indicates a motion vector 60. Data responsive to the grouped sets of beams is used to determine the direction and magnitude of motion of the volume region 52 relative to the transducer. A minimum sum of absolute differences, cross correlation, or other now known or later developed correlation is used to match the data for the sub-volumes 58 with the reference data. Correlation is performed using data prior to detection, data after detection but prior to scan conversion, data after scan conversion, display image data, or other data. Any of various search patterns involving translating and/or rotating the data representing the sub-volumes 58 relative to the reference data is used to identify a best match. A coarse search followed by a fine search, a search adapted to expected motion, a size of the region to be searched adapted to previous amounts of motion, or other adaptive or efficient search techniques may be used. - The same reference data is used to compare to each subsequently acquired set of
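The minimum-SAD matching with a coarse search followed by a fine search, mentioned above, can be illustrated for a translation-only three-dimensional case. This is a hedged sketch (function names, volume sizes and search radii are invented for illustration), not the implementation described in this disclosure:

```python
import numpy as np

def sad_3d(ref, sub, corner):
    """SAD between a sub-volume and the reference volume at `corner`;
    candidate positions outside the reference are rejected with infinity."""
    z, y, x = corner
    d, h, w = sub.shape
    if z < 0 or y < 0 or x < 0:
        return np.inf
    block = ref[z:z + d, y:y + h, x:x + w]
    if block.shape != sub.shape:
        return np.inf
    return float(np.abs(block - sub).sum())

def track_3d(ref, sub, corner, search=6, step=2):
    """Coarse search on a stride-`step` offset grid, then a fine
    +/- (step - 1) search around the coarse winner; returns the best
    (dz, dy, dx) translation relative to the nominal `corner`."""
    cz, cy, cx = corner
    cost = lambda o: sad_3d(ref, sub, (cz + o[0], cy + o[1], cx + o[2]))
    coarse = range(-search, search + 1, step)
    best = min(((dz, dy, dx) for dz in coarse for dy in coarse for dx in coarse), key=cost)
    fine = range(-(step - 1), step)
    best = min(((best[0] + dz, best[1] + dy, best[2] + dx)
                for dz in fine for dy in fine for dx in fine), key=cost)
    return best
```

The two-stage search visits far fewer candidates than an exhaustive fine search over the full range, which is the point of the coarse-then-fine strategy described above.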
data representing the sub-regions 58. In alternative embodiments, data representing a subsequently acquired sub-volume 58 is compared to data from a previously acquired sub-volume in the same general area. Given a minimal amount of motion, the motion vector may be small enough to track from one sub-volume to a subsequently acquired sub-volume without comparison to the reference frame of data. - As an alternative to finding
individual motion vectors 60 for each sub-volume, the sub-volumes are translated and/or rotated as a group to find a single motion vector. Where two or more different motion vectors are detected for a given time, a least squares fit, an average, or other combination of the motion vectors is used to calculate a single transformation indicating motion of the transducer 12 relative to the volume region 52. Rigid body motion is assumed for each sub-volume, but warping or other techniques may be used to account for tissue deformation. Translations in three dimensions and rotations about the three dimensions are determined using a least squares fit, such as determined from the six separate motion vectors shown in FIG. 4B. U.S. Pat. No. 6,306,091, the disclosure of which is incorporated herein by reference, discloses various techniques for identifying a rigid body transformation from a plurality of vectors. The motion tracking, subvector or global vector techniques disclosed in the '091 patent are extended to three-dimensional processing. The resulting rigid body transformation represents six degrees of freedom, such as translation in the X, Y and Z dimensions as well as rotation about each of the dimensions. In alternative embodiments, fewer degrees of freedom, or motion associated with only translation, only rotation or a subset of the six degrees of freedom, is provided. - Since the characteristics of the speckle may change as a function of position within the
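A least squares fit of a single rigid-body transformation to a set of point correspondences, as described above, is commonly solved with the Kabsch/Procrustes method. This standard solution is shown only as an illustrative sketch; the '091 patent describes its own techniques, and the function name here is an assumption:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid-body fit: find rotation R and translation t so
    that R @ p + t maps each sub-volume position p (rows of src) onto its
    tracked position (rows of dst)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps det(R) = +1, i.e. a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Here `dst` would be the sub-volume positions plus their individual motion vectors 60, so the six tracked cubes of FIG. 4B yield one transformation with six degrees of freedom.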
volume region 52, one or more tracking parameters are adjusted as a function of a position of the tracking location within the region 52. For example, as the speckle is positioned deeper and deeper within the region 52, diffraction results in larger speckle. The element factor or other factors may change as a function of depth, steering angle or other location within the volume region 52. The correlation, cross correlation, minimum sum of differences or other matching function is altered based on the position. For example, a warping, such as a one-, two- or three-dimensional expansion or contraction of the data, is performed as part of the correlation operation as a function of the position of the tracking location. By spatially expanding or contracting the data, the data more likely matches the reference data. Other warping may be used. Differences in thresholds for identifying a best or sufficient match, differences in an algorithm applied to track motion or other tracking parameters are altered as a function of the location. Alternatively, the tracking parameters are the same regardless of position. Other types of transformations besides a rigid body transformation may be estimated between the reference data set and the subsequent sub-regions. One such technique is image morphing as described in U.S. Pat. No. ______ (application Ser. No. 10/251,044) for Morphing Diagnostic Ultrasound Images for Perfusion Assessment, the disclosure of which is incorporated herein by reference. - In
act 36, the position of the acquisition scan plane 54 is automatically altered relative to the transducer 12 as a function of the detected motion. FIG. 4C shows transformation of the acquisition scan plane 54 to account for the detected or estimated motion. By translating and rotating the acquisition scan plane 54, subsequent acquisition along the scan plane 54 more likely scans the region of interest 40. The acquisition scan plane 54 is maintained at a position to intersect the region of interest 40 over time. The position of the acquisition scan plane 54 is altered within the three-dimensional volume region 52 to account for relative motion between the region of interest 40 and the transducer 12. The motion tracking provides information on the position of the region of interest 40 within the volume region 52 relative to the transducer 12. The acquisition scan plane (i.e., the transmission and/or reception plane) is adaptively repositioned, altered or updated. The two-dimensional area of the acquisition scan plane is positioned within the volume as a function of and in response to the detected motion and the region of interest 40. As shown in FIG. 4C, the acquisition scan plane 54 is positioned in a plane QPRS different than the EFGH plane of FIG. 4A as a function of the motion vectors 60 shown in FIG. 4B. The region of interest 40 moves in a direction opposite to the detected motion. - Using electronic or mechanical steering, the
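One way to picture the repositioning step, as a hedged sketch: apply the estimated rigid-body motion (R, t) of the tissue to the corner coordinates of the current plane (e.g., E, F, G, H) so the plane follows the region of interest. The corner representation and function name are illustrative assumptions; a real beamformer controller would convert the moved plane back into steering angles and delay profiles:

```python
import numpy as np

def reposition_plane(corners, R, t):
    """Move the scan-plane corners (rows of an N x 3 array in the
    transducer coordinate frame) with the estimated tissue motion so the
    plane keeps intersecting the region of interest."""
    return np.asarray(corners, float) @ np.asarray(R, float).T + np.asarray(t, float)
```

Applied to the plane EFGH of FIG. 4A with the fitted (R, t), this yields the corners of the new plane QPRS of FIG. 4C.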
acquisition scan plane 54 is translated and/or rotated, such as translating and rotating within the six degrees of freedom. In alternative embodiments, the acquisition scan plane 54 is translated and maintained at the same angle relative to the normal to the array, or not rotated. Where a two-dimensional transducer is used, six degrees of freedom may be provided for positioning the acquisition scan plane 54. Fewer degrees of freedom may be provided for other multi-dimensional or two-dimensional arrays. The scan plane 54 is adaptively positioned using one or more degrees of freedom to more likely scan the region of interest 40. Where motion is indicated beyond the original extent of the volume 52 or beyond the ability to acquire a sufficiently large acquisition scan plane 54, stabilized imaging is ceased, imaging without the stabilization described herein is performed, or the process returns to acquire a reference frame of data in act 32 for a new extent of the volume region 52. In another embodiment, the reference frame of data is acquired for every N frames of data containing the region of interest, where N is a number such as 10. Other values may also be used for N. - In
act 38, image data is acquired. Acoustic energy is electronically or mechanically steered across the acquisition scan plane 54 in any of various now known or later developed formats, such as sector, vector, linear or plane wave. The data from acoustic echoes represents the tissue intersected by the acquisition scan plane 54. Received data is beamformed, image processed and used to generate a two-dimensional or three-dimensional display. The region of interest 40 is represented in the image due to the shift in the scan plane position. In alternative embodiments, a spectral Doppler display associated with a range gate position or point, a continuous wave Doppler display associated with a line, or an M-mode display associated with a line is generated from a point or line within the scan plane 54. The point or line is tracked and adaptively positioned. - The motion tracking of
act 34, the acquisition scan plane position alteration of act 36 and the acquisition of image data of act 38 are repeated over time such that the two-dimensional acquisition scan plane 54 is adaptively positioned to intersect the region of interest over time. The adaptively positioned acquisition scan planes are repetitively scanned for generating images. Upon viewing a sequence of images, the user perceives the region of interest 40 as being stationary or stabilized. The region of interest 40 is less likely to fade out of the images due to the adaptive positioning of the scan plane 54. The acquisition scan plane 54 is maintained in a position to intersect the region of interest 40 during multiple acquisitions, accounting for relative motion of the transducer 12 with respect to the tissue. - By acquiring image data from the two-dimensional area of the
acquisition scan plane 54, or from a one-dimensional line or point, rapid or high frame rate imaging is provided. Since the motion tracking uses sub-volumes, the effect on frame rate is greatly reduced as opposed to tracking using the entire volume region 52. As a result, real time or substantially real time two-dimensional imaging is provided with three-dimensional motion tracking. - Further stabilization is provided by shifting the resulting two-dimensional images as a function of an initial position of the region of
interest 40. Adaptive positioning of the acquisition scan plane 54 results in the region of interest 40 being continually imaged. The region of interest 40 may also or alternatively be shifted within the displayed two-dimensional image by translation along one or two dimensions and/or rotation to provide further stabilization. For example, as the transducer shifts to the left relative to the tissue, the region of interest 40 may appear to shift to the left within resulting images. The region of interest 40 is tracked, or the shift is accounted for in the displayed two-dimensional image. In one embodiment, the shift is applied to the image data in range and azimuth. In another embodiment, the acquisition scan plane 54 extends only over a portion of the width of the volume region 52 accessible by the transducer 12. As a result, the positioning of the acquisition scan plane automatically shifts the region of interest 40 in the displayed image. Where tissue is compressed due to additional pressure from the transducer 12 or extended due to a release of pressure, the region of interest 40 may appear to shift upwards or downwards in the image. Either by changing a depth associated with the acquisition scan plane 54 or by shifting the resulting image data upwards or downwards, the region of interest 40 is maintained at the same location on the display. - A further shift in the acquisition
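The display-space translation described above can be sketched as a zero-filled two-dimensional shift of the detected image. The integer pixel shifts and the function name are illustrative assumptions; a practical system might also interpolate sub-pixel shifts and apply rotation:

```python
import numpy as np

def stabilize_display(image, dr, dc):
    """Shift a detected 2-D image by (dr, dc) pixels so the region of
    interest stays at a fixed display location; vacated pixels are
    zero-filled rather than wrapped around."""
    out = np.zeros_like(image)
    h, w = image.shape
    # Source and destination windows clipped to the image bounds.
    src_r, dst_r = slice(max(0, -dr), min(h, h - dr)), slice(max(0, dr), min(h, h + dr))
    src_c, dst_c = slice(max(0, -dc), min(w, w - dc)), slice(max(0, dc), min(w, w + dc))
    out[dst_r, dst_c] = image[src_r, src_c]
    return out
```

The shift applied here would be the negative of the residual motion in range and azimuth, so the region of interest lands at the same display position frame after frame.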
scan plane position 54 may be performed as a function of time. For example, the difference in time between acquisition of the data used for tracking motion and acquisition of the data used for generating an image is considered. A velocity, an acceleration or both a velocity and an acceleration are determined. The temporal difference is used with the velocity or acceleration information to determine an additional shift. - The tracking of the imaging plane is used for any B-mode, Doppler, M-mode, spectral Doppler, continuous wave Doppler, harmonic, contrast agent or other imaging. Other applications may benefit from tracking the position of the
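The time-compensated extra shift described above amounts to simple kinematic prediction over the tracking-to-imaging latency, s = v·dt + a·dt²/2 per axis. A minimal sketch, with invented names:

```python
def predicted_shift(velocity, acceleration, dt):
    """Additional scan-plane shift covering the latency `dt` between the
    tracking acquisition and the imaging acquisition, computed per axis
    from the estimated velocity and acceleration."""
    return tuple(v * dt + 0.5 * a * dt * dt for v, a in zip(velocity, acceleration))
```

The estimated velocity and acceleration would themselves come from differencing successive motion estimates; that estimation step is omitted here.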
acquisition scan plane 54 in the three-dimensional volume region 52. For example, tumor perfusion using contrast agents or other radiology-based contrast quantification is performed. Contrast agent quantification may also be performed for myocardial perfusion or other cardiology applications. Triggered imaging of the heart is provided so that the resulting images are acquired at the same time during a heart cycle. Alternatively, warping or other non-rigid motion is accounted for throughout the heart cycle. Another application is biopsy or surgical guidance. Better guidance may be provided by maintaining the scan plane in position relative to the region of interest. Yet other applications are cardiovascular quantitative measurements, such as vascular measurements for carotid plaque assessment, pulsatility, aortic aneurysm or others. - In one embodiment, the acquisition scan plane is used for acquiring both Doppler and B-mode information. The motion is tracked using B-mode or other information. Any of various combinations of using the same or different data for tracking and imaging may be used. By stabilizing the scan plane position relative to the region of
interest 40, more aggressive persistence for Doppler imaging may be used with no or minimal decrease in resolution. Vessel structure reconstruction may also be improved. Other high-persistence imaging, such as contrast agent imaging to identify microvascular structures, may be improved. - By reducing the artifacts due to patient or sonographer motion for two-dimensional imaging, work flow may be improved, reducing the amount of acquisition data and the amount of acquisition time. Off-line motion tracking processing is eliminated by providing for the tracking and imaging described above in real time, i.e., while a patient is being scanned during an imaging session. Real-time imaging is provided due to the reduced or minimal impact of acquiring motion information using sub-volumes. Motion tracking is performed in three dimensions without having to acquire consecutive full three-dimensional volume representations of data.
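The sub-volume comparison described above (see also claims 2 and 8) can be illustrated with a minimal sketch. This is not the patented implementation; the function name, array sizes, and search window are illustrative assumptions, and a sum-of-absolute-differences search is used as one of the comparison measures the specification names.

```python
import numpy as np

def track_subvolume_sad(reference, current, max_shift=2):
    """Estimate the 3-D translation of `current` relative to `reference`
    by minimizing the sum of absolute differences (SAD) over a small
    search window. Both arrays are small sub-volumes of sample data,
    so the search stays cheap enough for real-time use."""
    best_shift, best_sad = (0, 0, 0), np.inf
    for dz in range(-max_shift, max_shift + 1):
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                # Candidate realignment of the current sub-volume.
                shifted = np.roll(current, (dz, dy, dx), axis=(0, 1, 2))
                sad = np.abs(shifted - reference).sum()
                if sad < best_sad:
                    best_sad, best_shift = sad, (dz, dy, dx)
    return best_shift

# Toy example: speckle-like data translated by one sample along each axis.
rng = np.random.default_rng(0)
ref = rng.random((8, 8, 8))
cur = np.roll(ref, (-1, -1, -1), axis=(0, 1, 2))
print(track_subvolume_sad(ref, cur))  # (1, 1, 1): the realigning shift
```

Repeating such a search for three or more spaced-apart sub-volumes yields the per-region direction and magnitude vectors from which rotation as well as translation of the scan plane can be inferred, as claim 10 recites.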
- As an alternative to two-dimensional imaging of a region of interest, stabilization of the acquisition scan plane is used for acquiring a three-dimensional set of data. Using electronic steering, the scan plane is purposefully positioned at different locations within the
volume region 52. To provide regular spacing of the acquisition scan plane for a more uniform density of samples throughout the three-dimensional region, motion of the transducer 12 relative to the tissue is accounted for as discussed above. The acquisition scan plane position is adjusted as a function of both the motion and the intended displacement for three-dimensional data acquisition. - While the invention has been described above by reference to various embodiments, it should be understood that many changes and modifications can be made without departing from the scope of the invention. For example, the same firings and data are used for both tracking and imaging. As another example, one or more sub-volumes 58 for tracking are positioned within, as parts of, or overlapping with the region of interest. It is therefore intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and the scope of this invention.
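The combination of intended displacement and tracked motion for uniform 3-D sampling can be sketched in a few lines. The units, step size, and drift model below are illustrative assumptions, not values from the specification.

```python
def next_plane_offset(intended_step, accumulated_motion, step_index):
    """Elevation offset (illustrative units, e.g. mm) for the next
    acquisition scan plane: the regular spacing intended for 3-D
    sampling, corrected by the transducer motion tracked so far so
    that sample density stays uniform in the tissue frame."""
    return step_index * intended_step - accumulated_motion

# Sweep five planes at 1.0 mm intended spacing while the transducer
# drifts 0.2 mm per frame in elevation; electronic steering compensates.
offsets = []
drift = 0.0
for i in range(5):
    offsets.append(next_plane_offset(1.0, drift, i))
    drift += 0.2  # tracked motion accumulated between frames

print([round(o, 6) for o in offsets])  # [0.0, 0.8, 1.6, 2.4, 3.2]
```

In the tissue frame each plane lands at drift + offset = 0.0, 1.0, 2.0, 3.0, 4.0 mm, i.e. the intended uniform spacing despite the drift.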
Claims (24)
1. A method for stabilizing an image plane in medical imaging, the method comprising:
(a) tracking motion within a region; and
(b) automatically altering an acquisition scan plane position relative to a transducer as a function of the motion.
2. The method of claim 1 wherein (a) comprises performing one of a cross-correlation and a sum of absolute differences.
3. The method of claim 1 wherein (a) comprises comparing data from a first acquisition with data from a second acquisition.
4. The method of claim 1 wherein (b) comprises translating and rotating an acquisition scan plane to the acquisition scan plane position.
5. The method of claim 1 further comprising:
(c) scanning the region with ultrasound energy;
(d) receiving input designating a region of interest within the region;
wherein (b) comprises maintaining the acquisition scan plane position at the region of interest over time.
6. The method of claim 1 wherein (a) comprises tracking the motion within the region, the region being a three-dimensional volume, and wherein (b) comprises altering the acquisition scan plane position relative to the transducer, the transducer being a multi-dimensional array of elements, the alteration maintaining an acquisition scan plane at a region of interest within the three-dimensional volume over time.
7. The method of claim 6 further comprising:
(c) electronically steering acoustic energy across the acquisition scan plane;
wherein (a), (b) and (c) are repeated.
8. The method of claim 6 wherein (a) comprises transmitting acoustic energy to at least three sub-regions of the three-dimensional volume without acquiring data for the entire three-dimensional volume.
9. The method of claim 8 further comprising:
(c) scanning a representative sample of the entire three-dimensional volume;
wherein (a) comprises comparing data responsive to the acoustic energy transmitted to the at least three sub-regions with data responsive to the representative sample.
10. The method of claim 8 wherein (a) comprises:
(a1) transmitting at least three grouped sets of beams spaced apart within the three-dimensional volume;
(a2) determining a direction and a magnitude of motion from data responsive to the at least three grouped sets of beams for each of the at least three grouped sets of beams;
wherein (b) comprises altering the acquisition scan plane position as a function of the at least three directions and at least three magnitudes.
11. The method of claim 1 wherein (b) comprises adaptively altering the acquisition scan plane position in response to the motion;
further comprising:
(c) repetitively scanning the adaptively positioned acquisition scan planes; and
(d) generating two-dimensional images responsive to (c).
12. The method of claim 11 further comprising:
(e) shifting the two-dimensional images as a function of an initial position of the region of interest.
13. The method of claim 1 further comprising:
(c) identifying at least one feature within the region;
wherein (a) comprises tracking motion of the at least one feature.
14. The method of claim 1 wherein (a) comprises tracking one of speckle and a spatial gradient.
15. The method of claim 1 further comprising:
(c) adjusting a tracking parameter for (a) as a function of a position of a tracking location within the region.
16. A method for stabilizing a scan plane within a volume in medical diagnostic ultrasound imaging, the method comprising:
(a) identifying a region of interest;
(b) acquiring data representing at least portions of a three-dimensional volume positioned at least partly around the region of interest;
(c) acquiring data representing sub-volumes of the three-dimensional volume, (c) using fewer scan lines than (b);
(d) comparing the data representing the sub-volumes with the data representing at least the portions of the three-dimensional volume;
(e) detecting motion as a function of (d);
(f) positioning a two-dimensional scan plane within the three-dimensional volume as a function of the region of interest and the detected motion; and
(g) acquiring a two-dimensional image responsive to the two-dimensional scan plane.
17. The method of claim 16 further comprising:
(h) repeating (c), (d), (e), (f) and (g) over time such that the two-dimensional scan plane is adaptively positioned through the region of interest over time.
18. The method of claim 16 wherein (b) comprises acquiring data representing an entire spatial extent of the three-dimensional volume, the entire spatial extent being based on an area of a two-dimensional transducer array used for (b), (c) and (g), wherein (c) comprises acquiring the data representing sub-volumes of the three-dimensional volume, the sub-volumes together being substantially less than the three-dimensional volume.
19. A method for stabilizing imaging within a volume in medical diagnostic ultrasound imaging, the method comprising:
(a) repetitively scanning a two-dimensional area with a multi-dimensional transducer array;
(b) repetitively detecting motion within a volume including the two-dimensional area; and
(c) adaptively re-positioning the two-dimensional area within the volume as a function of the detected motion.
20. A system for stabilizing a scan plane within a volume in medical imaging, the system comprising:
a multi-dimensional transducer array;
a beamformer controller operative to control a position of a data acquisition scan plane relative to the multi-dimensional transducer array;
a beamformer connected with the multi-dimensional transducer array, the beamformer responsive to the beamformer controller and operative to acquire data representing tissue at the data acquisition scan plane; and
a processor operable to detect motion within a volume;
wherein the beamformer controller is operable to alter the position of the data acquisition scan plane in response to the detected motion.
21. The system of claim 20 wherein the multi-dimensional transducer array comprises a two-dimensional transducer array.
22. The system of claim 20 further comprising:
a user interface connected with the processor, the user interface operable to receive input indicating a region of interest; and
a display operable to display a sequence of two-dimensional images of the region of interest, the two-dimensional images responsive to the data acquisition scan plane.
23. The method of claim 1 further comprising:
(c) obtaining data for motion tracking in response to different acquisition parameters than used for imaging.
24. The method of claim 1 wherein (b) comprises automatically altering an acquisition volume position relative to a transducer as a function of the motion.
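The track-then-alter loop recited in claims 1, 11, and 16 can be summarized in a small sketch. This is a hypothetical illustration only, not the claimed apparatus: the `ScanPlane` type, coordinate convention, and per-frame motion values are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class ScanPlane:
    center: tuple  # (x, y, z) position within the volume, illustrative units
    normal: tuple  # plane orientation (held fixed in this simple sketch)

def stabilize(plane, motion):
    """Claim-1-style step (b): alter the acquisition scan plane position
    as a function of the tracked motion, translating the plane with the
    tissue so the same anatomy stays in the imaged plane."""
    cx, cy, cz = plane.center
    mx, my, mz = motion
    return ScanPlane((cx + mx, cy + my, cz + mz), plane.normal)

# Two frames of detected motion (step (a)), each followed by step (b);
# re-scanning the repositioned plane (claim 11, step (c)) would follow.
plane = ScanPlane((0.0, 0.0, 30.0), (0.0, 1.0, 0.0))
for motion in [(0.5, 0.0, 0.0), (0.2, -0.1, 0.0)]:
    plane = stabilize(plane, motion)
print(plane.center)
```

A full implementation would also rotate the plane from the multiple motion vectors of claim 10 and re-scan it electronically each iteration; only the translational bookkeeping is shown here.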
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/696,608 US20050096538A1 (en) | 2003-10-29 | 2003-10-29 | Image plane stabilization for medical imaging |
US12/240,153 US20090030316A1 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/240,078 US7998074B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/239,996 US7993272B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/696,608 US20050096538A1 (en) | 2003-10-29 | 2003-10-29 | Image plane stabilization for medical imaging |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/240,078 Division US7998074B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/240,153 Division US20090030316A1 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/239,996 Division US7993272B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
Publications (1)
Publication Number | Publication Date |
---|---|
US20050096538A1 true US20050096538A1 (en) | 2005-05-05 |
Family
ID=34550148
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/696,608 Abandoned US20050096538A1 (en) | 2003-10-29 | 2003-10-29 | Image plane stabilization for medical imaging |
US12/240,153 Abandoned US20090030316A1 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/239,996 Active 2024-05-21 US7993272B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/240,078 Active 2024-05-23 US7998074B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/240,153 Abandoned US20090030316A1 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/239,996 Active 2024-05-21 US7993272B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
US12/240,078 Active 2024-05-23 US7998074B2 (en) | 2003-10-29 | 2008-09-29 | Image plane stabilization for medical imaging |
Country Status (1)
Country | Link |
---|---|
US (4) | US20050096538A1 (en) |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080009722A1 (en) * | 2006-05-11 | 2008-01-10 | Constantine Simopoulos | Multi-planar reconstruction for ultrasound volume data |
US20080021319A1 (en) * | 2006-07-20 | 2008-01-24 | James Hamilton | Method of modifying data acquisition parameters of an ultrasound device |
US20080200808A1 (en) * | 2007-02-15 | 2008-08-21 | Martin Leidel | Displaying anatomical patient structures in a region of interest of an image detection apparatus |
US20090030316A1 (en) * | 2003-10-29 | 2009-01-29 | Chomas James E | Image plane stabilization for medical imaging |
US20100069756A1 (en) * | 2008-09-17 | 2010-03-18 | Kabushiki Kaisha Toshiba | Ultrasonic diagnostic apparatus and computer program product |
US20100086187A1 (en) * | 2008-09-23 | 2010-04-08 | James Hamilton | System and method for flexible rate processing of ultrasound data |
US20100138191A1 (en) * | 2006-07-20 | 2010-06-03 | James Hamilton | Method and system for acquiring and transforming ultrasound data |
US20100185085A1 (en) * | 2009-01-19 | 2010-07-22 | James Hamilton | Dynamic ultrasound processing using object motion calculation |
US20100272322A1 (en) * | 2007-12-19 | 2010-10-28 | Koninklijke Philips Electronics N.V. | Correction for un-voluntary respiratory motion in cardiac ct |
US20100274132A1 (en) * | 2009-04-27 | 2010-10-28 | Chul An Kim | Arranging A Three-Dimensional Ultrasound Image In An Ultrasound System |
US20100331684A1 (en) * | 2009-06-26 | 2010-12-30 | Arminas Ragauskas | Method and Apparatus For Determining The Absolute Value Of Intracranial Pressure |
US20110060226A1 (en) * | 2009-09-04 | 2011-03-10 | University Of Southern California | Fresnel-based beamforming for ultrasonic arrays |
US20110092880A1 (en) * | 2009-10-12 | 2011-04-21 | Michael Gertner | Energetic modulation of nerves |
US20110172528A1 (en) * | 2009-10-12 | 2011-07-14 | Michael Gertner | Systems and methods for treatment using ultrasonic energy |
EP2419021A1 (en) * | 2009-04-14 | 2012-02-22 | Sonosite, Inc. | Systems and methods for adaptive volume imaging |
US20120116224A1 (en) * | 2010-11-08 | 2012-05-10 | General Electric Company | System and method for ultrasound imaging |
US8295912B2 (en) | 2009-10-12 | 2012-10-23 | Kona Medical, Inc. | Method and system to inhibit a function of a nerve traveling with an artery |
US8469904B2 (en) | 2009-10-12 | 2013-06-25 | Kona Medical, Inc. | Energetic modulation of nerves |
US8517962B2 (en) | 2009-10-12 | 2013-08-27 | Kona Medical, Inc. | Energetic modulation of nerves |
WO2013088326A3 (en) * | 2011-12-12 | 2013-09-19 | Koninklijke Philips N.V. | Automatic imaging plane selection for echocardiography |
EP2682060A1 (en) * | 2012-06-11 | 2014-01-08 | Samsung Medison Co., Ltd. | Method and apparatus for displaying three-dimensional ultrasonic image and two-dimensional ultrasonic image |
WO2014076498A3 (en) * | 2012-11-15 | 2014-07-10 | Imperial Innovations Ltd | Echocardiography |
CN104202504A (en) * | 2014-08-19 | 2014-12-10 | 昆明理工大学 | Processing method of real-time electronic image stabilization circuit system based on FPGA (Field Programmable Gate Array) |
US8986211B2 (en) | 2009-10-12 | 2015-03-24 | Kona Medical, Inc. | Energetic modulation of nerves |
US8986231B2 (en) | 2009-10-12 | 2015-03-24 | Kona Medical, Inc. | Energetic modulation of nerves |
US8992447B2 (en) | 2009-10-12 | 2015-03-31 | Kona Medical, Inc. | Energetic modulation of nerves |
US9005143B2 (en) | 2009-10-12 | 2015-04-14 | Kona Medical, Inc. | External autonomic modulation |
US9275471B2 (en) | 2007-07-20 | 2016-03-01 | Ultrasound Medical Devices, Inc. | Method for ultrasound motion tracking via synthetic speckle patterns |
CN105611878A (en) * | 2013-06-28 | 2016-05-25 | 皇家飞利浦有限公司 | Rib blockage delineation in anatomically intelligent echocardiography. |
EP3053528A1 (en) * | 2015-02-05 | 2016-08-10 | Samsung Medison Co., Ltd. | Ultrasound diagnosis apparatus and operating method thereof |
US20160287214A1 (en) * | 2015-03-30 | 2016-10-06 | Siemens Medical Solutions Usa, Inc. | Three-dimensional volume of interest in ultrasound imaging |
US20170169609A1 (en) * | 2014-02-19 | 2017-06-15 | Koninklijke Philips N.V. | Motion adaptive visualization in medical 4d imaging |
CN107405134A (en) * | 2015-03-31 | 2017-11-28 | 皇家飞利浦有限公司 | Supersonic imaging device |
WO2018046455A1 (en) | 2016-09-09 | 2018-03-15 | Koninklijke Philips N.V. | Stabilization of ultrasound images |
JP2019076654A (en) * | 2017-10-27 | 2019-05-23 | ゼネラル・エレクトリック・カンパニイ | Ultrasound diagnostic apparatus and control program therefor |
CN111345845A (en) * | 2018-12-21 | 2020-06-30 | 通用电气公司 | Method and system for increasing effective linear density of volume composite ultrasonic image |
US10772681B2 (en) | 2009-10-12 | 2020-09-15 | Utsuka Medical Devices Co., Ltd. | Energy delivery to intraparenchymal regions of the kidney |
WO2020225240A1 (en) * | 2019-05-06 | 2020-11-12 | Koninklijke Philips N.V. | Systems and methods for controlling volume rate |
US10925579B2 (en) | 2014-11-05 | 2021-02-23 | Otsuka Medical Devices Co., Ltd. | Systems and methods for real-time tracking of a target tissue using imaging before and during therapy delivery |
US20210409082A1 (en) * | 2018-11-13 | 2021-12-30 | Nokia Solutions And Networks Oy | Beamforming monitoring apparatus |
US11272906B2 (en) * | 2014-12-19 | 2022-03-15 | Samsung Electronics Co., Ltd. | Ultrasonic imaging device and method for controlling same |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060058674A1 (en) * | 2004-08-31 | 2006-03-16 | General Electric Company | Optimizing ultrasound acquisition based on ultrasound-located landmarks |
KR101121301B1 (en) * | 2009-09-16 | 2012-03-22 | 삼성메디슨 주식회사 | Ultrasound system and method of performing 3-dimensional measurement |
JP5692079B2 (en) * | 2010-01-20 | 2015-04-01 | コニカミノルタ株式会社 | Displacement estimation method and displacement estimation apparatus |
JP5529568B2 (en) * | 2010-02-05 | 2014-06-25 | キヤノン株式会社 | Image processing apparatus, imaging apparatus, control method, and program |
JP5874636B2 (en) * | 2010-08-27 | 2016-03-02 | コニカミノルタ株式会社 | Diagnosis support system and program |
JP5292440B2 (en) * | 2011-06-03 | 2013-09-18 | 富士フイルム株式会社 | Ultrasonic diagnostic equipment |
KR20140086087A (en) * | 2012-12-28 | 2014-07-08 | 삼성메디슨 주식회사 | Ultrasound system and control method for the same |
US10034657B2 (en) | 2013-07-26 | 2018-07-31 | Siemens Medical Solutions Usa, Inc. | Motion artifact suppression for three-dimensional parametric ultrasound imaging |
WO2015084446A1 (en) * | 2013-12-04 | 2015-06-11 | General Electric Company | Detection of motion in dynamic medical images |
IN2013CH05587A (en) | 2013-12-04 | 2015-06-12 | Gen Electric | |
CN105939670B (en) * | 2014-01-27 | 2019-08-06 | 皇家飞利浦有限公司 | Ultrasonic image-forming system and ultrasonic imaging method |
CA2974377C (en) | 2015-01-23 | 2023-08-01 | The University Of North Carolina At Chapel Hill | Apparatuses, systems, and methods for preclinical ultrasound imaging of subjects |
RU2640298C1 (en) | 2015-10-12 | 2017-12-27 | Общество С Ограниченной Ответственностью "Яндекс" | Method for processing and storing images |
CN211884905U (en) | 2019-08-22 | 2020-11-10 | 贝克顿·迪金森公司 | Balloon dilatation catheter and balloon thereof |
Citations (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3561430A (en) * | 1967-07-20 | 1971-02-09 | William W Filler Jr | Fetal heartbeat rate instrument for monitoring fetal distress |
US4735211A (en) * | 1985-02-01 | 1988-04-05 | Hitachi, Ltd. | Ultrasonic measurement apparatus |
US5255680A (en) * | 1991-09-03 | 1993-10-26 | General Electric Company | Automatic gantry positioning for imaging systems |
US5373848A (en) * | 1993-08-09 | 1994-12-20 | Hewlett-Packard Company | Ultrasonic time-domain method for sensing fluid flow |
US5673830A (en) * | 1995-12-07 | 1997-10-07 | Matthews; Arthur T. | Belt supported pneumatic nail gun holder |
US5675554A (en) * | 1994-08-05 | 1997-10-07 | Acuson Corporation | Method and apparatus for transmit beamformer |
US5685308A (en) * | 1994-08-05 | 1997-11-11 | Acuson Corporation | Method and apparatus for receive beamformer system |
US5782766A (en) * | 1995-03-31 | 1998-07-21 | Siemens Medical Systems, Inc. | Method and apparatus for generating and displaying panoramic ultrasound images |
US5873830A (en) * | 1997-08-22 | 1999-02-23 | Acuson Corporation | Ultrasound imaging system and method for improving resolution and operation |
US5899861A (en) * | 1995-03-31 | 1999-05-04 | Siemens Medical Systems, Inc. | 3-dimensional volume by aggregating ultrasound fields of view |
US5910114A (en) * | 1998-09-30 | 1999-06-08 | Siemens Medical Systems, Inc. | System and method for correcting the geometry of ultrasonic images acquired with a moving transducer |
US5967987A (en) * | 1997-12-18 | 1999-10-19 | Acuson Corporation | Ultrasonic system and method for measurement of fluid flow |
US6132376A (en) * | 1996-02-29 | 2000-10-17 | Acuson Corporation | Multiple ultrasonic image registration system, method and transducer |
US6191862B1 (en) * | 1999-01-20 | 2001-02-20 | Lightlab Imaging, Llc | Methods and apparatus for high speed longitudinal scanning in imaging systems |
US6234968B1 (en) * | 1999-06-15 | 2001-05-22 | Acuson Corporation | 3-D diagnostic medical ultrasound imaging using a 1-D array |
US6254539B1 (en) * | 1999-08-26 | 2001-07-03 | Acuson Corporation | Transducer motion compensation in medical diagnostic ultrasound 3-D imaging |
US6299579B1 (en) * | 1999-06-30 | 2001-10-09 | Atl Ultrasound | Extended field of view ultrasonic diagnostic imaging with image reacquisition |
US6306091B1 (en) * | 1999-08-06 | 2001-10-23 | Acuson Corporation | Diagnostic medical ultrasound systems and methods utilizing estimation of 3-dimensional rigid body transformation |
US6442289B1 (en) * | 1999-06-30 | 2002-08-27 | Koninklijke Philips Electronics N.V. | Extended field of view ultrasonic diagnostic imaging |
US6464642B1 (en) * | 1999-08-20 | 2002-10-15 | Kabushiki Kaisha Toshiba | Ultrasonic diagnosis apparatus |
US20020159769A1 (en) * | 2001-04-06 | 2002-10-31 | Canon Kabushiki Kaisha | Image-shake correcting device |
US6560375B1 (en) * | 1998-08-26 | 2003-05-06 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Video image stabilization and registration |
US20030105401A1 (en) * | 2001-12-05 | 2003-06-05 | James Jago | Ultrasonic image stabilization system and method |
US6618197B1 (en) * | 1996-12-27 | 2003-09-09 | Canon Kabushiki Kaisha | Image stabilizing system |
US6659953B1 (en) * | 2002-09-20 | 2003-12-09 | Acuson Corporation | Morphing diagnostic ultrasound images for perfusion assessment |
US6684098B2 (en) * | 1996-08-16 | 2004-01-27 | Brigham And Women's Hospital, Inc. | Versatile stereotactic device and methods of use |
US20040019447A1 (en) * | 2002-07-16 | 2004-01-29 | Yehoshua Shachar | Apparatus and method for catheter guidance control and imaging |
US6733458B1 (en) * | 2001-09-25 | 2004-05-11 | Acuson Corporation | Diagnostic medical ultrasound systems and methods using image based freehand needle guidance |
US20050096589A1 (en) * | 2003-10-20 | 2005-05-05 | Yehoshua Shachar | System and method for radar-assisted catheter guidance and control |
US7150716B2 (en) * | 2003-02-20 | 2006-12-19 | Siemens Medical Solutions Usa, Inc. | Measuring transducer movement methods and systems for multi-dimensional ultrasound imaging |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60163643A (en) * | 1984-02-07 | 1985-08-26 | テルモ株式会社 | Ultrasonic measuring method and apparatus |
EP0186290B1 (en) * | 1984-11-09 | 1992-01-15 | Matsushita Electric Industrial Co., Ltd. | Ultrasonic imaging system for simultaneous display of sector-scanned multiple images |
DE69634714T2 (en) * | 1995-03-31 | 2006-01-19 | Kabushiki Kaisha Toshiba, Kawasaki | Therapeutic ultrasound device |
US6334846B1 (en) * | 1995-03-31 | 2002-01-01 | Kabushiki Kaisha Toshiba | Ultrasound therapeutic apparatus |
US6445882B1 (en) * | 1995-05-30 | 2002-09-03 | Nikon Corporation | Camera which compensates for motion by setting the time at which a movable member begins moving and which adjusts the movement of the movable member for motion originating in the camera |
US6086539A (en) * | 1996-12-04 | 2000-07-11 | Acuson Corporation | Methods and apparatus for ultrasound image quantification |
US6003216A (en) * | 1997-03-31 | 1999-12-21 | Mcneil-Ppc, Inc. | Domed compressed tampons |
US5876342A (en) * | 1997-06-30 | 1999-03-02 | Siemens Medical Systems, Inc. | System and method for 3-D ultrasound imaging and motion estimation |
US6511426B1 (en) * | 1998-06-02 | 2003-01-28 | Acuson Corporation | Medical diagnostic ultrasound system and method for versatile processing |
US6277074B1 (en) * | 1998-10-02 | 2001-08-21 | University Of Kansas Medical Center | Method and apparatus for motion estimation within biological tissue |
US6352508B1 (en) | 1998-11-20 | 2002-03-05 | Acuson Corporation | Transducer motion compensation in medical diagnostic ultrasound extended field of view imaging |
US6763032B1 (en) * | 1999-02-12 | 2004-07-13 | Broadcom Corporation | Cable modem system with sample and packet synchronization |
WO2001017135A1 (en) * | 1999-09-01 | 2001-03-08 | Motorola Inc. | Method and device for bandwidth allocation in multiple access protocols with contention-based reservation |
US6807195B1 (en) * | 1999-09-29 | 2004-10-19 | General Instrument Corp. | Synchronization arrangement for packet cable telephony modem |
US6527717B1 (en) * | 2000-03-10 | 2003-03-04 | Acuson Corporation | Tissue motion analysis medical diagnostic ultrasound system and method |
US6931011B2 (en) * | 2001-01-31 | 2005-08-16 | Telcordia Technologies, Inc. | Method and systems for bandwidth management in packet data networks |
US7356172B2 (en) * | 2002-09-26 | 2008-04-08 | Siemens Medical Solutions Usa, Inc. | Methods and systems for motion tracking |
US6824514B2 (en) * | 2002-10-11 | 2004-11-30 | Koninklijke Philips Electronics N.V. | System and method for visualizing scene shift in ultrasound scan sequence |
DE10305603B4 (en) * | 2003-02-11 | 2009-12-03 | Siemens Ag | Device for generating a three-dimensional ultrasound image |
US20050096538A1 (en) | 2003-10-29 | 2005-05-05 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
- 2003-10-29 US US10/696,608 patent/US20050096538A1/en not_active Abandoned
- 2008-09-29 US US12/240,153 patent/US20090030316A1/en not_active Abandoned
- 2008-09-29 US US12/239,996 patent/US7993272B2/en active Active
- 2008-09-29 US US12/240,078 patent/US7998074B2/en active Active
Patent Citations (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3561430A (en) * | 1967-07-20 | 1971-02-09 | William W Filler Jr | Fetal heartbeat rate instrument for monitoring fetal distress |
US4735211A (en) * | 1985-02-01 | 1988-04-05 | Hitachi, Ltd. | Ultrasonic measurement apparatus |
US5255680A (en) * | 1991-09-03 | 1993-10-26 | General Electric Company | Automatic gantry positioning for imaging systems |
US5373848A (en) * | 1993-08-09 | 1994-12-20 | Hewlett-Packard Company | Ultrasonic time-domain method for sensing fluid flow |
US5685308A (en) * | 1994-08-05 | 1997-11-11 | Acuson Corporation | Method and apparatus for receive beamformer system |
US5675554A (en) * | 1994-08-05 | 1997-10-07 | Acuson Corporation | Method and apparatus for transmit beamformer |
US5782766A (en) * | 1995-03-31 | 1998-07-21 | Siemens Medical Systems, Inc. | Method and apparatus for generating and displaying panoramic ultrasound images |
US5899861A (en) * | 1995-03-31 | 1999-05-04 | Siemens Medical Systems, Inc. | 3-dimensional volume by aggregating ultrasound fields of view |
US5673830A (en) * | 1995-12-07 | 1997-10-07 | Matthews; Arthur T. | Belt supported pneumatic nail gun holder |
US6360027B1 (en) * | 1996-02-29 | 2002-03-19 | Acuson Corporation | Multiple ultrasound image registration system, method and transducer |
US6222948B1 (en) * | 1996-02-29 | 2001-04-24 | Acuson Corporation | Multiple ultrasound image registration system, method and transducer |
US6132376A (en) * | 1996-02-29 | 2000-10-17 | Acuson Corporation | Multiple ultrasonic image registration system, method and transducer |
US6684098B2 (en) * | 1996-08-16 | 2004-01-27 | Brigham And Women's Hospital, Inc. | Versatile stereotactic device and methods of use |
US6618197B1 (en) * | 1996-12-27 | 2003-09-09 | Canon Kabushiki Kaisha | Image stabilizing system |
US5873830A (en) * | 1997-08-22 | 1999-02-23 | Acuson Corporation | Ultrasound imaging system and method for improving resolution and operation |
US6083168A (en) * | 1997-08-22 | 2000-07-04 | Acuson Corporation | Ultrasound imaging system and method for improving resolution and operation |
US5967987A (en) * | 1997-12-18 | 1999-10-19 | Acuson Corporation | Ultrasonic system and method for measurement of fluid flow |
US6560375B1 (en) * | 1998-08-26 | 2003-05-06 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Video image stabilization and registration |
US5910114A (en) * | 1998-09-30 | 1999-06-08 | Siemens Medical Systems, Inc. | System and method for correcting the geometry of ultrasonic images acquired with a moving transducer |
US6191862B1 (en) * | 1999-01-20 | 2001-02-20 | Lightlab Imaging, Llc | Methods and apparatus for high speed longitudinal scanning in imaging systems |
US6234968B1 (en) * | 1999-06-15 | 2001-05-22 | Acuson Corporation | 3-D diagnostic medical ultrasound imaging using a 1-D array |
US6442289B1 (en) * | 1999-06-30 | 2002-08-27 | Koninklijke Philips Electronics N.V. | Extended field of view ultrasonic diagnostic imaging |
US6299579B1 (en) * | 1999-06-30 | 2001-10-09 | Atl Ultrasound | Extended field of view ultrasonic diagnostic imaging with image reacquisition |
US6306091B1 (en) * | 1999-08-06 | 2001-10-23 | Acuson Corporation | Diagnostic medical ultrasound systems and methods utilizing estimation of 3-dimensional rigid body transformation |
US6464642B1 (en) * | 1999-08-20 | 2002-10-15 | Kabushiki Kaisha Toshiba | Ultrasonic diagnosis apparatus |
US6254539B1 (en) * | 1999-08-26 | 2001-07-03 | Acuson Corporation | Transducer motion compensation in medical diagnostic ultrasound 3-D imaging |
US6606456B2 (en) * | 2001-04-06 | 2003-08-12 | Canon Kabushiki Kaisha | Image-shake correcting device |
US20020159769A1 (en) * | 2001-04-06 | 2002-10-31 | Canon Kabushiki Kaisha | Image-shake correcting device |
US6733458B1 (en) * | 2001-09-25 | 2004-05-11 | Acuson Corporation | Diagnostic medical ultrasound systems and methods using image based freehand needle guidance |
US20030105401A1 (en) * | 2001-12-05 | 2003-06-05 | James Jago | Ultrasonic image stabilization system and method |
US6589176B2 (en) * | 2001-12-05 | 2003-07-08 | Koninklijke Philips Electronics N.V. | Ultrasonic image stabilization system and method |
US20040019447A1 (en) * | 2002-07-16 | 2004-01-29 | Yehoshua Shachar | Apparatus and method for catheter guidance control and imaging |
US6659953B1 (en) * | 2002-09-20 | 2003-12-09 | Acuson Corporation | Morphing diagnostic ultrasound images for perfusion assessment |
US7150716B2 (en) * | 2003-02-20 | 2006-12-19 | Siemens Medical Solutions Usa, Inc. | Measuring transducer movement methods and systems for multi-dimensional ultrasound imaging |
US20050096589A1 (en) * | 2003-10-20 | 2005-05-05 | Yehoshua Shachar | System and method for radar-assisted catheter guidance and control |
Cited By (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090030316A1 (en) * | 2003-10-29 | 2009-01-29 | Chomas James E | Image plane stabilization for medical imaging |
US7998074B2 (en) | 2003-10-29 | 2011-08-16 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
US7993272B2 (en) | 2003-10-29 | 2011-08-09 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
US20080009722A1 (en) * | 2006-05-11 | 2008-01-10 | Constantine Simopoulos | Multi-planar reconstruction for ultrasound volume data |
US20080021319A1 (en) * | 2006-07-20 | 2008-01-24 | James Hamilton | Method of modifying data acquisition parameters of an ultrasound device |
US20100138191A1 (en) * | 2006-07-20 | 2010-06-03 | James Hamilton | Method and system for acquiring and transforming ultrasound data |
US20080200808A1 (en) * | 2007-02-15 | 2008-08-21 | Martin Leidel | Displaying anatomical patient structures in a region of interest of an image detection apparatus |
US9275471B2 (en) | 2007-07-20 | 2016-03-01 | Ultrasound Medical Devices, Inc. | Method for ultrasound motion tracking via synthetic speckle patterns |
US8660313B2 (en) * | 2007-12-19 | 2014-02-25 | Koninklijke Philips N.V. | Correction for un-voluntary respiratory motion in cardiac CT |
US20100272322A1 (en) * | 2007-12-19 | 2010-10-28 | Koninklijke Philips Electronics N.V. | Correction for un-voluntary respiratory motion in cardiac ct |
US20100069756A1 (en) * | 2008-09-17 | 2010-03-18 | Kabushiki Kaisha Toshiba | Ultrasonic diagnostic apparatus and computer program product |
US8945012B2 (en) * | 2008-09-17 | 2015-02-03 | Kabushiki Kaisha Toshiba | Ultrasonic diagnostic apparatus and computer program product |
US20100086187A1 (en) * | 2008-09-23 | 2010-04-08 | James Hamilton | System and method for flexible rate processing of ultrasound data |
US20100185085A1 (en) * | 2009-01-19 | 2010-07-22 | James Hamilton | Dynamic ultrasound processing using object motion calculation |
US8805047B2 (en) | 2009-04-14 | 2014-08-12 | Fujifilm Sonosite, Inc. | Systems and methods for adaptive volume imaging |
EP2419021A4 (en) * | 2009-04-14 | 2013-12-25 | Sonosite Inc | Systems and methods for adaptive volume imaging |
EP2419021A1 (en) * | 2009-04-14 | 2012-02-22 | Sonosite, Inc. | Systems and methods for adaptive volume imaging |
US9366757B2 (en) * | 2009-04-27 | 2016-06-14 | Samsung Medison Co., Ltd. | Arranging a three-dimensional ultrasound image in an ultrasound system |
US20100274132A1 (en) * | 2009-04-27 | 2010-10-28 | Chul An Kim | Arranging A Three-Dimensional Ultrasound Image In An Ultrasound System |
US8394025B2 (en) * | 2009-06-26 | 2013-03-12 | Uab Vittamed | Method and apparatus for determining the absolute value of intracranial pressure |
US20100331684A1 (en) * | 2009-06-26 | 2010-12-30 | Arminas Ragauskas | Method and Apparatus For Determining The Absolute Value Of Intracranial Pressure |
CN102573651A (en) * | 2009-09-04 | 2012-07-11 | University of Southern California | Fresnel-based beamforming for ultrasonic arrays |
US8523774B2 (en) * | 2009-09-04 | 2013-09-03 | University Of Southern California | Fresnel-based beamforming for ultrasonic arrays |
US20110060226A1 (en) * | 2009-09-04 | 2011-03-10 | University Of Southern California | Fresnel-based beamforming for ultrasonic arrays |
US8986231B2 (en) | 2009-10-12 | 2015-03-24 | Kona Medical, Inc. | Energetic modulation of nerves |
US9358401B2 (en) | 2009-10-12 | 2016-06-07 | Kona Medical, Inc. | Intravascular catheter to deliver unfocused energy to nerves surrounding a blood vessel |
US11154356B2 (en) | 2009-10-12 | 2021-10-26 | Otsuka Medical Devices Co., Ltd. | Intravascular energy delivery |
US8556834B2 (en) | 2009-10-12 | 2013-10-15 | Kona Medical, Inc. | Flow directed heating of nervous structures |
US8512262B2 (en) | 2009-10-12 | 2013-08-20 | Kona Medical, Inc. | Energetic modulation of nerves |
US10772681B2 | 2009-10-12 | 2020-09-15 | Otsuka Medical Devices Co., Ltd. | Energy delivery to intraparenchymal regions of the kidney |
US8469904B2 (en) | 2009-10-12 | 2013-06-25 | Kona Medical, Inc. | Energetic modulation of nerves |
US8715209B2 (en) | 2009-10-12 | 2014-05-06 | Kona Medical, Inc. | Methods and devices to modulate the autonomic nervous system with ultrasound |
US9579518B2 (en) | 2009-10-12 | 2017-02-28 | Kona Medical, Inc. | Nerve treatment system |
US8374674B2 (en) | 2009-10-12 | 2013-02-12 | Kona Medical, Inc. | Nerve treatment system |
US20110092880A1 (en) * | 2009-10-12 | 2011-04-21 | Michael Gertner | Energetic modulation of nerves |
US8517962B2 (en) | 2009-10-12 | 2013-08-27 | Kona Medical, Inc. | Energetic modulation of nerves |
US9352171B2 (en) | 2009-10-12 | 2016-05-31 | Kona Medical, Inc. | Nerve treatment system |
US8295912B2 (en) | 2009-10-12 | 2012-10-23 | Kona Medical, Inc. | Method and system to inhibit a function of a nerve traveling with an artery |
US20110172528A1 (en) * | 2009-10-12 | 2011-07-14 | Michael Gertner | Systems and methods for treatment using ultrasonic energy |
US8986211B2 (en) | 2009-10-12 | 2015-03-24 | Kona Medical, Inc. | Energetic modulation of nerves |
US9199097B2 (en) | 2009-10-12 | 2015-12-01 | Kona Medical, Inc. | Energetic modulation of nerves |
US8992447B2 (en) | 2009-10-12 | 2015-03-31 | Kona Medical, Inc. | Energetic modulation of nerves |
US9005143B2 (en) | 2009-10-12 | 2015-04-14 | Kona Medical, Inc. | External autonomic modulation |
US9119952B2 (en) | 2009-10-12 | 2015-09-01 | Kona Medical, Inc. | Methods and devices to modulate the autonomic nervous system via the carotid body or carotid sinus |
US9119951B2 (en) | 2009-10-12 | 2015-09-01 | Kona Medical, Inc. | Energetic modulation of nerves |
US9125642B2 (en) | 2009-10-12 | 2015-09-08 | Kona Medical, Inc. | External autonomic modulation |
US9174065B2 (en) | 2009-10-12 | 2015-11-03 | Kona Medical, Inc. | Energetic modulation of nerves |
US20120116224A1 (en) * | 2010-11-08 | 2012-05-10 | General Electric Company | System and method for ultrasound imaging |
US9179892B2 (en) * | 2010-11-08 | 2015-11-10 | General Electric Company | System and method for ultrasound imaging |
US20150011886A1 (en) * | 2011-12-12 | 2015-01-08 | Koninklijke Philips N.V. | Automatic imaging plane selection for echocardiography |
CN103997971A (en) * | 2011-12-12 | 2014-08-20 | Koninklijke Philips N.V. | Automatic imaging plane selection for echocardiography |
EP3363365A1 (en) * | 2011-12-12 | 2018-08-22 | Koninklijke Philips N.V. | Automatic imaging plane selection for echocardiography |
WO2013088326A3 (en) * | 2011-12-12 | 2013-09-19 | Koninklijke Philips N.V. | Automatic imaging plane selection for echocardiography |
RU2642929C2 (en) * | 2011-12-12 | 2018-01-29 | Koninklijke Philips N.V. | Automatic imaging plane selection for echocardiography |
KR101501518B1 (en) * | 2012-06-11 | 2015-03-11 | Samsung Medison Co., Ltd. | Method and apparatus for displaying a two-dimensional image and a three-dimensional image |
EP2682060A1 (en) * | 2012-06-11 | 2014-01-08 | Samsung Medison Co., Ltd. | Method and apparatus for displaying three-dimensional ultrasonic image and two-dimensional ultrasonic image |
WO2014076498A3 (en) * | 2012-11-15 | 2014-07-10 | Imperial Innovations Ltd | Echocardiography |
CN105611878A (en) * | 2013-06-28 | 2016-05-25 | Koninklijke Philips N.V. | Rib blockage delineation in anatomically intelligent echocardiography |
US20170169609A1 (en) * | 2014-02-19 | 2017-06-15 | Koninklijke Philips N.V. | Motion adaptive visualization in medical 4d imaging |
CN104202504A (en) * | 2014-08-19 | 2014-12-10 | 昆明理工大学 | Processing method of real-time electronic image stabilization circuit system based on FPGA (Field Programmable Gate Array) |
US10925579B2 (en) | 2014-11-05 | 2021-02-23 | Otsuka Medical Devices Co., Ltd. | Systems and methods for real-time tracking of a target tissue using imaging before and during therapy delivery |
US11272906B2 (en) * | 2014-12-19 | 2022-03-15 | Samsung Electronics Co., Ltd. | Ultrasonic imaging device and method for controlling same |
KR102389347B1 (en) * | 2015-02-05 | 2022-04-22 | Samsung Medison Co., Ltd. | Ultrasound diagnosis apparatus and operating method thereof |
KR20160096442A (en) * | 2015-02-05 | 2016-08-16 | Samsung Medison Co., Ltd. | Ultrasound diagnosis apparatus and operating method thereof |
US20160228098A1 (en) * | 2015-02-05 | 2016-08-11 | Samsung Medison Co., Ltd. | Ultrasound diagnosis apparatus and operating method thereof |
EP3409210A1 (en) * | 2015-02-05 | 2018-12-05 | Samsung Medison Co., Ltd. | Ultrasound diagnosis apparatus and operating method thereof |
EP3053528A1 (en) * | 2015-02-05 | 2016-08-10 | Samsung Medison Co., Ltd. | Ultrasound diagnosis apparatus and operating method thereof |
US10835210B2 (en) * | 2015-03-30 | 2020-11-17 | Siemens Medical Solutions Usa, Inc. | Three-dimensional volume of interest in ultrasound imaging |
US20160287214A1 (en) * | 2015-03-30 | 2016-10-06 | Siemens Medical Solutions Usa, Inc. | Three-dimensional volume of interest in ultrasound imaging |
US20180116635A1 (en) * | 2015-03-31 | 2018-05-03 | Koninklijke Philips N.V. | Ultrasound imaging apparatus |
US11006927B2 (en) * | 2015-03-31 | 2021-05-18 | Koninklijke Philips N.V. | Ultrasound imaging apparatus |
CN107405134A (en) * | 2015-03-31 | 2017-11-28 | Koninklijke Philips N.V. | Ultrasound imaging apparatus |
CN109937370A (en) * | 2016-09-09 | 2019-06-25 | Koninklijke Philips N.V. | Stabilization of ultrasound images |
WO2018046455A1 (en) | 2016-09-09 | 2018-03-15 | Koninklijke Philips N.V. | Stabilization of ultrasound images |
US11712225B2 (en) | 2016-09-09 | 2023-08-01 | Koninklijke Philips N.V. | Stabilization of ultrasound images |
JP2019076654A (en) * | 2017-10-27 | 2019-05-23 | General Electric Company | Ultrasound diagnostic apparatus and control program therefor |
US20210409082A1 (en) * | 2018-11-13 | 2021-12-30 | Nokia Solutions And Networks Oy | Beamforming monitoring apparatus |
US11700043B2 (en) * | 2018-11-13 | 2023-07-11 | Nokia Solutions And Networks Oy | Beamforming monitoring apparatus |
CN111345845A (en) * | 2018-12-21 | 2020-06-30 | General Electric Company | Method and system for increasing effective linear density of volume composite ultrasonic image |
WO2020225240A1 (en) * | 2019-05-06 | 2020-11-12 | Koninklijke Philips N.V. | Systems and methods for controlling volume rate |
Also Published As
Publication number | Publication date |
---|---|
US20090062651A1 (en) | 2009-03-05 |
US20090054779A1 (en) | 2009-02-26 |
US7993272B2 (en) | 2011-08-09 |
US20090030316A1 (en) | 2009-01-29 |
US7998074B2 (en) | 2011-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7998074B2 (en) | Image plane stabilization for medical imaging | |
EP1697759B1 (en) | Ultrasonic diagnostic imaging method and system with an automatic control of resolution and frame rate | |
US7347820B2 (en) | Phased array acoustic system for 3D imaging of moving parts | |
JP5681623B2 (en) | Ultrasound imaging of extended field of view with 2D array probe | |
JP5508401B2 (en) | Ultrasound imaging of extended field of view by guided EFOV scanning | |
US5655535A (en) | 3-Dimensional compound ultrasound field of view | |
US6980844B2 (en) | Method and apparatus for correcting a volumetric scan of an object moving at an uneven period | |
US10194888B2 (en) | Continuously oriented enhanced ultrasound imaging of a sub-volume | |
US20110144495A1 (en) | Perfusion Imaging of a Volume in Medical Diagnostic Ultrasound | |
US20100130855A1 (en) | Systems and methods for active optimized spatio-temporal sampling | |
WO1997034529A1 (en) | An improved two-dimensional ultrasound display system | |
JP2009535152A (en) | Extended volume ultrasonic data display and measurement method | |
US9179892B2 (en) | System and method for ultrasound imaging | |
US20070276237A1 (en) | Volumetric Ultrasound Imaging System Using Two-Dimensional Array Transducer | |
WO2015068073A1 (en) | Multi-plane target tracking with an ultrasonic diagnostic imaging system | |
CN109073751B (en) | Probe, system and method for acoustic registration | |
US20050131295A1 (en) | Volumetric ultrasound imaging system using two-dimensional array transducer | |
CN113573645A (en) | Method and system for adjusting field of view of ultrasound probe | |
CN109982643A (en) | Triple-mode ultrasound imaging of anatomical structure, function and hemodynamics | |
US20220386999A1 (en) | Super Resolution Ultrasound Imaging | |
Perez et al. | Feasibility of Optical Tracking for Swept Synthetic Aperture Imaging | |
WO2022253673A1 (en) | Super resolution ultrasound imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SIEMENS MEDICAL SOLUTIONS USA, INC., PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHOMAS, JAMES E.;USTUNER, KUTAY F.;SUMANAWEERA, THILAKA S.;REEL/FRAME:014658/0375 Effective date: 20031021 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |