WO2007101346A1 - Ultrasound simulator and method of simulating an ultrasound examination - Google Patents


Info

Publication number
WO2007101346A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
doppler
image
data
dus
Prior art date
Application number
PCT/CA2007/000370
Other languages
French (fr)
Inventor
David Steinman
Samira Hirji
David Holdsworth
Original Assignee
Robarts Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robarts Research Institute filed Critical Robarts Research Institute
Publication of WO2007101346A1 publication Critical patent/WO2007101346A1/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00 Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/22 Details, e.g. general constructional or apparatus details
    • G01N29/30 Arrangements for calibrating or comparing, e.g. with standard objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00 Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/04 Analysing solids
    • G01N29/06 Visualisation of the interior, e.g. acoustic microscopy
    • G01N29/0654 Imaging
    • G01N29/0672 Imaging by acoustic tomography
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00 Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/22 Details, e.g. general constructional or apparatus details
    • G01N29/26 Arrangements for orientation or scanning by relative movement of the head and the sensor
    • G01N29/265 Arrangements for orientation or scanning by relative movement of the head and the sensor by moving the sensor relative to a stationary material
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00 Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/44 Processing the detected response signal, e.g. electronic circuits specially adapted therefor
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00 Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/44 Processing the detected response signal, e.g. electronic circuits specially adapted therefor
    • G01N29/4463 Signal correction, e.g. distance amplitude correction [DAC], distance gain size [DGS], noise filtering
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/58 Testing, adjusting or calibrating the diagnostic device
    • A61B8/587 Calibration phantoms

Definitions

  • the present invention relates generally to the field of ultrasound and in particular, to an ultrasound simulator and method of simulating an ultrasound examination.
  • Duplex ultrasound is among the most accessible imaging modalities available today for the diagnosis of carotid disease, a common precursor to the incidence of stroke.
  • Duplex ultrasound provides a view of the anatomy under examination overlaid with blood flow velocity information by combining both B- mode and Doppler ultrasound.
  • Doppler ultrasound detects the velocity of blood travelling through an individual's artery, by transmitting a high frequency signal into the body and detecting shifts in the frequency of returned signals. The detected velocity in turn can be used to approximate the degree of occlusion in the artery due to plaque build-up, or atherosclerosis.
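The frequency shift described above follows the classical pulsed-wave Doppler relation. The following sketch (not from the patent; the transmit frequency and tissue sound speed are typical assumed values) shows the conversion in both directions:

```python
# Classical Doppler relation: f_d = 2 * f0 * v * cos(theta) / c,
# where f0 is the transmit frequency, v the blood speed, theta the
# insonation angle, and c the speed of sound in tissue (~1540 m/s).
import math

def doppler_shift_hz(f0_hz: float, v_m_s: float, theta_deg: float, c_m_s: float = 1540.0) -> float:
    """Return the Doppler frequency shift for a given beam/flow geometry."""
    return 2.0 * f0_hz * v_m_s * math.cos(math.radians(theta_deg)) / c_m_s

def velocity_from_shift(fd_hz: float, f0_hz: float, theta_deg: float, c_m_s: float = 1540.0) -> float:
    """Invert the relation to recover blood speed from a measured shift."""
    return fd_hz * c_m_s / (2.0 * f0_hz * math.cos(math.radians(theta_deg)))
```

Note the cosine dependence: at a 90-degree insonation angle no shift is detected, which is why angle correction matters clinically.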
  • the degree of narrowing in the artery, or its stenosis severity is a known indicator of an individual's risk of stroke.
  • B-mode ultrasound provides information relating to tissue properties.
  • Duplex ultrasound combines the blood flow velocity information extracted from Doppler ultrasound with the tissue property information extracted from B-mode ultrasound, thereby to enable blood flow and anatomical visualization of the anatomy under examination.
  • DUS is increasingly becoming the sole imaging modality used to determine the appropriate treatment and management steps required for patients with carotid atherosclerotic disease. This is due to the fact that DUS holds many advantages over other imaging modalities, including its capability to display in vivo images non-invasively and in real-time. Ultrasound also remains the least expensive diagnostic imaging tool to purchase, operate and maintain as compared to X-ray computed tomography (CT) or magnetic resonance imaging (MRI). Furthermore, ultrasound machines are comparatively smaller in size than CT or MRI machines making them more portable and convenient for use in clinics.
  • Computer-based DUS simulators that make use of pre-recorded clinical data are also available, such as that sold by MedSim Ltd. of Fort Lauderdale, Florida under the name Ultrasim®.
  • the Ultrasim ® simulator couples a motion-tracked mock probe to pre-recorded, three-dimensional ultrasound clinical datasets virtually placed within an anthropomorphic mannequin.
  • the Ultrasim ® simulator offers a single DUS module based on pre-recorded Doppler audio clips sampled at a few points on a plane through an artery, under the assumption that blood flow dynamics are symmetric around the artery axis.
  • U.S. Patent No. 5,609,485 to Bergman et al. discloses a computer- based interactive reproduction system device designed to be used by physicians and technicians in medical training and diagnosis using medical systems such as ultrasound machines. Biological data is collected from a living body and stored in memory. An operator manipulates a simulated sensor over a transmitter which may be attached to a simulated body. The transmitter transmits position data to a receiver in the sensor. A reproduction unit processes the pre-recorded biological data and displays data corresponding to the position of the sensor with respect to the transmitter.
  • U.S. Patent No. 6,117,078 to Lysyansky et al. discloses a method and apparatus for providing a virtual volumetric ultrasound phantom to construct an ultrasound training system from any ultrasound system.
  • the ultrasound system and method retrieve and display previously stored ultrasound data to simulate an ultrasound scanning session.
  • a real ultrasound system acquires an image of an ultrasound phantom.
  • the ultrasound image comprises ultrasound echo data for an image/scan plane representing a cross-section or partial volume of the ultrasound phantom.
  • the ultrasound image is analyzed to identify image attributes that are unique for each image/scan plane. A portion of the previously stored data that corresponds to the image attributes is retrieved and displayed.
  • actual position and orientation of the acquired image/scan plane with respect to a known structure within the ultrasound phantom are determined by processing the image/scan plane to obtain a number of geometrical image parameters. Position and orientation of the image/scan plane are calculated from the image parameters using formulas based on a known three-dimensional structure within the phantom. The determination of actual image/scan plane position and orientation is enhanced using image de-correlation techniques. Retrieval of the stored data is based upon the calculated position and orientation or on the obtained image parameters.
  • U.S. Patent No. 6,210,168 to Aiger et al. discloses a method and system for simulating, on a B-mode ultrasound simulator, a D-mode and C-mode Doppler ultrasound examination.
  • Velocity and sound data describing blood flow at selected locations within blood vessels of a patient are gathered during an actual Doppler ultrasound examination.
  • the gathered data are processed off-line to generate the data stored in memory for the virtual frame buffer.
  • Doppler simulation at a designated location on a B-mode image generated from the virtual frame buffer is achieved by performing bilinear interpolation, at the time of simulation, from the data stored in the memory, so as to determine blood flow velocity and sound values for all designated virtual frame buffer voxels.
  • the interpolated blood flow velocity values are depicted as either a gray scale Doppler spectral waveform or a color scale flow map on the screen of the B-mode ultrasound simulator.
  • the sound values are depicted as an audible signal simulating a Doppler sound waveform.
  • U.S. Patent Application Publication No. 2005/0283075 to Ma et al. discloses a three-dimensional, fly-through ultrasound system.
  • a volume is represented using high spatial resolution ultrasound data.
  • the same set of ultrasound data is used for identifying a boundary, for placing the perspective position within the volume and rendering from the perspective position. The identification of the boundary and rendering are automated and performed by a processor.
  • Ultrasound data is used to generate three-dimensional, fly-through representations allowing for virtual endoscopy or other diagnostically useful views of structure or fluid flow channels.
  • the computer-based ultrasound simulators described above all make use of pre-recorded clinical data in order to generate audio and/or video ultrasound images.
  • relying on pre-recorded clinical data reduces the effectiveness of the ultrasound simulators as training tools.
  • because displayed images are based on pre-recorded clinical data and are not generated on-the-fly, operators are unable to learn about the many important operating parameters that influence the representation, and hence the interpretation, of the returned Doppler signal.
  • improvements in Doppler ultrasound simulators are desired. It is therefore an object of the present invention to provide a novel ultrasound simulator and method of simulating an ultrasound examination.

Summary of the Invention
  • a method of simulating an ultrasound examination comprising: synthesizing ultrasound data using a computational phantom; and coupling the synthesized ultrasound data to motion of a sensor manipulated over a target volume thereby to simulate said ultrasound examination.
  • the synthesized ultrasound data comprises both synthesized Doppler ultrasound data and B-mode data and the computational phantom comprises a computational fluid dynamics (CFD) model.
  • the synthesizing comprises interrogating the CFD model at points within a sample volume; for each point, determining from the CFD model, a velocity vector in the direction of the sensor and converting the velocity vector into a frequency; and summing the frequencies to yield Doppler ultrasound data.
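The interrogate-project-convert-sum loop above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: `velocity_at` stands in for CFD-mesh interpolation, and the transmit frequency and bin width are invented parameters.

```python
C = 1540.0  # assumed speed of sound in tissue (m/s)

def synthesize_doppler(sample_points, velocity_at, beam_dir, f0_hz=5e6):
    """For each point in the sample volume, project the CFD velocity onto
    the beam direction, convert the result to a Doppler frequency, and
    accumulate the frequencies into a simple binned spectrum."""
    spectrum = {}
    for p in sample_points:
        v = velocity_at(p)                                    # 3-D velocity from the CFD model
        v_beam = sum(vi * bi for vi, bi in zip(v, beam_dir))  # component along the beam
        f_d = 2.0 * f0_hz * v_beam / C                        # Doppler frequency
        bin_hz = round(f_d / 50.0) * 50.0                     # collect into 50 Hz bins
        spectrum[bin_hz] = spectrum.get(bin_hz, 0) + 1
    return spectrum
```

A uniform 0.5 m/s flow along the beam, for example, places every sampled point in the same frequency bin near 3.25 kHz.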
  • the synthesized ultrasound data is displayed thereby to render an image selected from the group comprising a spectrograph, a colour Doppler image, a power Doppler image, a B-mode image, a duplex ultrasound (DUS) image, a colour DUS image and a power Doppler DUS image.
  • the image is displayed at clinical frame rates in response to manipulation of the sensor. Frames of the image are manipulated to maintain synchronism between the displayed image and the cycle of anatomy encompassed by the target volume.
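The frame-synchronization step above can be sketched as a phase-locked frame selector. The patent's actual frame-dropping algorithm appears only in its Figures 7 and 8, so the version below is an assumed minimal form: slower rendering simply skips intermediate frames so the displayed frame stays in phase with the cardiac cycle.

```python
def select_frame(elapsed_s: float, cycle_period_s: float, n_frames: int) -> int:
    """Pick which precomputed cardiac-cycle frame to display for the
    current wall-clock time. Frames the renderer cannot keep up with
    are implicitly dropped, keeping display and cycle in synchronism."""
    phase = (elapsed_s % cycle_period_s) / cycle_period_s  # position in cycle, 0..1
    return int(phase * n_frames) % n_frames
```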
  • a Doppler audio signal is also generated and is represented by a summation of signals generated for each sampled point within the target volume.
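The audio summation described above might be sketched as a sum of sinusoids, one per sampled point's Doppler frequency. Sampling rate, amplitudes, phases, and normalization here are assumptions for illustration:

```python
import math

def doppler_audio(frequencies_hz, duration_s=0.01, rate_hz=44100):
    """Synthesize a Doppler audio snippet as the normalized sum of one
    sinusoid per sampled point within the target volume."""
    n = int(duration_s * rate_hz)
    return [
        sum(math.sin(2.0 * math.pi * f * t / rate_hz) for f in frequencies_hz)
        / max(len(frequencies_hz), 1)
        for t in range(n)
    ]
```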
  • an ultrasound simulator comprising a motion tracking device outputting position data when moved over a target volume; and processing structure communicating with said motion tracking device, said processing structure synthesizing ultrasound data using a computational phantom and said position data thereby to simulate said ultrasound examination.
  • the ultrasound simulator provides the operator with a realistic experience in the operation of a true clinical DUS system.
  • the library of carotid CFD models allows a variety of carotid conditions to be simulated, including hazardous high-grade stenosis, thereby enabling the operator to gain both practical and theoretical experience in assessing the health of a carotid artery via DUS.
  • because Doppler ultrasound data, B-mode data or both Doppler ultrasound and B-mode data are synthesized on-the-fly, the subjectivity, machine parameters and measurement errors associated with using pre-recorded clinical data are avoided. Also, the development, capital and on-going costs associated with tissue-mimicking physical phantoms are avoided.
  • Figure 1 shows a duplex ultrasound simulator comprising a motion tracker, a mannequin and a computer;
  • Figure 2 shows a sensor and a control unit forming part of the motion tracker, a portion of the mannequin and conversion of motion tracker coordinate values from the motion tracker coordinate system to a computational fluid dynamics (CFD) model coordinate system via the transformations T_FoB←sensor and T_CFD←FoB;
  • Figure 3 shows the general steps performed during Doppler ultrasound simulation;
  • Figure 4 shows information processed and generated during ultrasound simulation;
  • Figure 5 shows calibration of the DUS simulator;
  • Figure 6 is a time-domain waveform for synthesized Doppler audio;
  • Figure 7 shows a frame dropping algorithm employed by the DUS simulator;
  • Figure 8 shows three cases handled by the frame dropping algorithm;
  • Figure 9 shows B-mode ultrasound data simulation via texture mapping;
  • Figure 10 shows the union and intersection of a sphere and a plane achieved via CSG using a stencil buffer;
  • Figure 11 shows a DUS simulator image;
  • Figure 12 shows a DUS simulator image with no beam steering and an insonation angle that is perpendicular to blood flow;
  • Figure 13 shows the effects of positive and negative beam steering on the DUS simulator image; and
  • Figure 14 shows the effects of sample volume size on DUS simulator images.
  • the DUS simulator employs Doppler physics and clinically relevant carotid geometries and blood flow velocity fields together with B-mode physics and clinically relevant anatomical structures to simulate accurately a DUS system allowing colour DUS, power Doppler DUS, DUS, B-mode, colour Doppler, power Doppler and spectrograph images to be rendered at clinically relevant frame rates.
  • the DUS simulator also simulates the Doppler audio signal which sonographers listen to in order to detect abnormalities in blood flow. Embodiments of the DUS simulator will now be described with particular reference to Figures 1 to 14.
  • the DUS simulator 20 comprises a computer 22, external DUS controls 24 connected to the computer 22, a mannequin 26 in the shape of a torso upon which DUS examinations are performed and a motion tracker 27 such as that manufactured by Ascension Technology Corporation under the name of Flock of Birds ® (FoB).
  • the motion tracker 27 comprises a transmitter 28 embedded within the mannequin 26, a hand-held magnetically tracked, motion sensor 30 embedded in a shell resembling a conventional linear array ultrasound transducer and a control unit 32.
  • the external DUS controls 24 comprise a trackball 40 to provide the operator with fine control over sample volume placement and a programmable keypad 42 to provide the operator with dedicated control over Doppler parameters.
  • the trackball 40 or keypad 42 can be manipulated to enable the operator to steer the virtual beam, rotate an angle correction marker, adjust spectral gain, move and resize the colour box, change pulse repetition frequency (PRF) etc.
  • Other Doppler and B- mode settings may be adjusted using the external DUS controls 24.
  • the computer 22 executes software that enables realistic ultrasound images as well as Doppler audio to be generated in real-time and on-the-fly in response to movement of the motion sensor 30 over the mannequin 26 as will be described.
  • the computer 22 also stores a library of computational fluid dynamics (CFD) models representing a variety of carotid conditions including, for example, hazardous high-grade stenosis.
  • fluid flow is governed by a set of partial differential equations, the Navier-Stokes equations.
  • the complex domain (here, an artery or vein) must be divided into a volumetric "mesh" of contiguous, regular, three-dimensional shapes (finite elements), connected to each other at their corners (nodes).
  • a set of algebraic equations can be formulated, and solved simultaneously to obtain nodal flow velocities and pressures.
  • the nodal flow velocities are written to a series of datafiles, with each datafile corresponding to one time point during a cardiac cycle.
  • a separate datafile provides information about how the nodes are connected together (i.e., the mesh).
  • Each CFD model is therefore made up of CFD mesh and flow velocity datafiles. Together the CFD mesh and velocity datafiles allow the DUS simulator 20 to identify the flow velocity at any point in the artery or vein of interest.
  • a separate, structural mesh datafile describes the tissue surrounding the artery or vein. Because this surrounding tissue is not moving, velocity information is not required.
  • each structural mesh element has assigned ultrasound properties, such as for example acoustic impedances and attenuation coefficients, that are used by the DUS simulator 20 to determine the intensity of simulated B-mode ultrasound data.
  • the operator manipulates the hand-held motion sensor 30 over the mannequin 26.
  • the transmitter 28 in the mannequin 26 transmits position data which is received by the sensor 30.
  • the received position data is then conveyed to the control unit 32.
  • the control unit 32 continuously tracks the relative position and orientation of the sensor 30 as the sensor 30 is manipulated over the mannequin 26 and generates (x,y) coordinate values that are transformed to the motion tracker's coordinate system.
  • the x-vector of the sensor 30 represents the direction in which the virtual beam of the sensor is emitted (the axial direction on an image) and the y- vector represents the lateral direction of the virtual beam or the direction that the virtual beam is swept.
  • the motion tracker 27 in turn conveys the coordinate values to the computer 22.
  • the computer 22 upon receipt of the coordinate data further processes the received data to generate either a colour DUS, power Doppler DUS, DUS, B-mode, colour Doppler, power Doppler or spectrograph image depending on the selected mode of operation as well as to simulate Doppler audio.
  • the generated image is displayed on the display screen of the computer 22 on a graphical user interface that resembles a clinical DUS system thereby to provide the operator with a realistic simulation experience. Further specifics concerning the DUS simulator 20 will now be described.
  • in order to simulate Doppler ultrasound, the computer 22 combines a model of Doppler physics and a pulsatile velocity field that is derived from one of the CFD models, where nodal velocities throughout the mesh are solved using equations governing pulsatile blood flow.
  • Figure 3 shows the general steps performed by the computer 22 during Doppler ultrasound simulation. Initially, the sample volume is placed by the operator, via manipulation of the trackball 40, at a known location within the CFD model. The sample volume comprises between 100 and 1000 sampling points which are distributed uniformly and randomly throughout a spherical volume. This number of sampling points has been found to produce realistic spectrograph images.
  • each sampling point, or scatterer is weighted according to its distance from the centre of the sample volume via a Gaussian power distribution, to account for non-uniform virtual beam intensity. Efficient velocity interpolation of each sampled point within the sample volume is made possible via the application of a fast, geometric searching algorithm. Once the true velocity value is determined, the scalar velocity component along the direction of the virtual beam is computed. The specified Doppler angle is then applied to solve for the corrected Doppler frequency or velocity.
  • the effects of spectral broadening due to both geometric and transit-time broadening are empirically applied by convolving the Doppler frequency with an intrinsic spectral broadening function. This serves to give the spectrographs a realistic, "smeared" appearance. These steps are performed to produce a power versus velocity spectrum for each time step within a full cardiac cycle. From this, a spectrograph is produced, and characteristic information such as mean and peak velocities are derived.
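The scatterer placement and Gaussian weighting described above can be sketched as follows. This is an illustrative assumption rather than the patent's code; in particular `sigma_frac`, the fraction of the sample-volume radius used as the Gaussian width, is an invented beam-profile parameter.

```python
import math, random

def make_sample_volume(center, radius, n=500, sigma_frac=0.5, rng=None):
    """Distribute n scatterers uniformly at random inside a spherical
    sample volume and weight each one by a Gaussian falloff with distance
    from the centre, approximating non-uniform virtual beam intensity."""
    rng = rng or random.Random(0)
    cx, cy, cz = center
    sigma = sigma_frac * radius
    points = []
    while len(points) < n:
        # rejection-sample a point inside the unit sphere, then scale/shift
        x, y, z = (rng.uniform(-1, 1) for _ in range(3))
        r2 = x * x + y * y + z * z
        if r2 <= 1.0:
            d = math.sqrt(r2) * radius
            w = math.exp(-d * d / (2.0 * sigma * sigma))
            points.append(((cx + x * radius, cy + y * radius, cz + z * radius), w))
    return points
```

Rejection sampling keeps the spatial distribution uniform over the sphere, which a naive spherical-coordinate draw would not.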
  • Figure 4 shows information processed and generated by the computer 22 during ultrasound simulation.
  • the computer 22 uses the CFD mesh, flow velocity and structural mesh datafiles as well as the motion tracker coordinate value output.
  • the output coordinate values and datafiles are used to compute spectral data at the sensor's location as described above.
  • the DUS simulator outputs either a B- mode image, a colour DUS image, a DUS image, a power Doppler DUS image, a power Doppler image, a colour Doppler image or a spectrograph that is rendered in real-time on the display of the computer 22.
  • Real-time Doppler audio is also produced at the desired location.
  • the position and orientation of the sensor 30 in terms of the CFD model and structural mesh coordinates is required. This involves performing two coordinate transformations to map the coordinate systems of the motion tracker 27 and the CFD model, as shown in Figure 2.
  • One mapping determines the position and orientation of the sensor 30 relative to the transmitter 28, and is established via the transformation matrix T_FoB←sensor. This mapping is performed by the control unit 32.
  • a second transformation matrix, T_CFD←FoB, is applied to complete the conversion to CFD model coordinates. Calibration of the CFD model over a sample volume representing the anatomy under examination is however required.
  • the control unit 32 of the motion tracker 27 outputs coordinate values representing the positions of the fiducial markers in the motion tracker's coordinate system or space. As the positions of these fiducial markers in the CFD model coordinate system are known, from these two sets of coordinate values a mapping of the motion tracker and CFD model spaces can be obtained.
  • Figure 5 shows placement of the fiducial markers on a neck mannequin.
  • the fiducial markers are chosen to be noncollinear so that a similarity transformation can be performed, where a translation, scaling and rotation is applied so that all angles and changes in distances are preserved between the CFD model and motion tracker coordinate systems. This is achieved using a least-squares approximation, although those of skill in the art will appreciate that other approximations may be used.
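The least-squares similarity transformation described above can be computed with Umeyama's SVD-based method, a common choice for this registration problem (the patent does not name its exact formulation, so this is a hedged sketch):

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    mapping src points onto dst, so that dst ~ s * R @ src + t. Suitable for
    registering motion-tracker space to CFD-model space from noncollinear
    fiducial markers (Umeyama's method)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    S, D = src - mu_s, dst - mu_d
    # covariance between centred point sets, then SVD
    U, sing, Vt = np.linalg.svd(D.T @ S / len(src))
    d = np.ones(src.shape[1])
    d[-1] = np.sign(np.linalg.det(U @ Vt))   # guard against reflections
    R = U @ np.diag(d) @ Vt
    var_s = (S ** 2).sum() / len(src)
    s = (sing * d).sum() / var_s
    t = mu_d - s * R @ mu_s
    return s, R, t
```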
  • three fiducial markers are placed on the surface of the mannequin for virtual examination of the left carotid artery.
  • the first fiducial marker is placed on the left side of the neck mannequin on the same plane as the bifurcation apex to position correctly the CFD model when the first marker is probed with the sensor 30.
  • the second fiducial marker is placed at a known distance below the first fiducial marker to orient the long axis and scale the CFD model when the second marker is probed with the sensor 30.
  • the third fiducial marker is placed at a point that is one quarter of the neck's perimeter from the first fiducial marker in the clockwise direction, along the circumference of the neck mannequin, to correctly rotate the CFD model when the third marker is probed with the sensor 30.
  • the colour DUS image is composed of an array of sample volumes whose corresponding pixels characterize the spectral mean velocity of that sample volume.
  • a colour Doppler image is superimposed over a simulated grayscale B-mode image.
  • the spectral mean at a time interval is defined by the following equation as:
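The referenced equation is missing from the extracted text. A standard definition, consistent with the power-versus-velocity spectra described earlier, is the power-weighted average over the spectral bins; this form is a plausible reconstruction, not the patent's verbatim equation:

```latex
\bar{v}(t) = \frac{\sum_{i} P_i(t)\, v_i}{\sum_{i} P_i(t)}
```

where \(P_i(t)\) is the spectral power in velocity bin \(v_i\) during time interval \(t\).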
  • the colour Doppler image is defined as the cross-product of the axial and lateral directions of the motion sensor 30, restricted to the height and width of the colour box.
  • Each sample volume is colour encoded using a lookup table consisting of one hundred and seven (107) colour shades that was extracted from a present-day ultrasound machine.
  • Red and yellow hues indicate blood flow towards the sensor, and by convention represent blood flow towards the brain.
  • Blue and cyan hues indicate blood flow away from the sensor, and by convention represent blood flow away from the brain. Darker colours characterize slower blood flow (red or blue), while brighter colours indicate faster blood flow (yellow or cyan).
  • a power Doppler DUS image is simulated in a similar manner; however, power information is displayed in place of velocities.
  • the spectral analyzer of an ultrasound system collects short periods of a signal, and via a fast Fourier transform (FFT), extracts the relative frequencies and amplitudes contributing to that signal.
  • FFT fast Fourier transform
  • Each block of analyzed data shows up as a vertical line on the spectral display, and comprises a number of frequency bins that depend on the chosen FFT size.
  • the power contained in each frequency bin is encoded as the intensity of the corresponding pixels.
  • Temporal resolutions for spectral displays vary from 20 to 200 Hz, where each block of data would have typically been analyzed via a 128 or 256 point FFT.
  • the computer 22 generates spectrographs by mimicking a real spectral display: a two-dimensional image is created consisting of 256 frequency bins along the vertical axis and 400 time intervals along the horizontal axis. The 400 time intervals extend over four seconds, which is long enough to display approximately four cardiac cycles.
  • a grey-scale lookup table consisting of 128 entries of black, white and shades of grey, is applied to linearly encode the power values in the spectrograph.
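One vertical line of the spectral display described above can be sketched as the per-bin power of a short signal block. A plain DFT is used here for self-containment (a real implementation would use a windowed FFT at the 20-200 Hz column rates mentioned earlier); the block length of 256 matches the FFT sizes cited above.

```python
import cmath, math

def spectro_column(signal, n_fft=256):
    """Compute one vertical line of the spectral display: the power in
    each frequency bin of a short block of Doppler signal, via a DFT."""
    block = signal[:n_fft]
    powers = []
    for k in range(n_fft):
        acc = sum(x * cmath.exp(-2j * math.pi * k * t / n_fft)
                  for t, x in enumerate(block))
        powers.append(abs(acc) ** 2 / n_fft)
    return powers
```

A pure tone at bin 8, for instance, concentrates its power in that bin (and its mirror), which then maps to a bright pixel via the grey-scale lookup table.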
  • Synthesizing B-mode ultrasound images in the DUS simulator 20 involves use of a semi-empirical acoustic field model which is applied onto a number of randomly distributed scatterers located within a precomputed, anthropomorphic, computational volume mesh.
  • ultrasound waves are transmitted from the transducer and towards the body, known as the axial direction. These waves are reflected back at varying intensities depending on the acoustic impedances encountered, attenuation coefficient of the encountered tissue, and depth from which the echo originated. The time at which echoes are returned to the transducer determines the depth from which the echoes were produced.
  • As the ultrasound beam is swept laterally across the elements of the transducer face, the corresponding columns of pixels in the frame buffer are rendered, until a two-dimensional image is produced.
  • the motion tracker 27 continuously outputs positional and orientation information as the sensor 30 is manipulated, from which a virtual beam is computed. Scatterers within the structural mesh phantom are then sampled at discrete locations along each virtual beam. Tissue properties such as acoustic impedance and attenuation coefficient are known at each of the scatterer locations throughout the structural mesh. The arrangement of acoustic impedances and attenuation coefficients along the axial direction of the virtual beam, along with the depth of the sampled scatterers, determine the signal intensity of the corresponding pixel.
  • I(d) is the intensity; a is the attenuation coefficient; f is the transmit frequency; and d is the round-trip distance of the virtual ultrasound wave.
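The equation those symbols belong to did not survive extraction. A standard attenuation model consistent with the listed symbols is exponential decay over the round-trip distance, with the coefficient a expressed in dB/(cm·MHz); this is a hedged reconstruction, not the patent's verbatim formula:

```python
def echo_intensity(i0, atten_db_cm_mhz, f_mhz, d_cm):
    """Hedged reconstruction of the elided intensity equation:
    I(d) = I0 * 10**(-(a * f * d) / 10), with a in dB/(cm*MHz),
    f the transmit frequency in MHz, and d the round-trip distance in cm."""
    return i0 * 10.0 ** (-(atten_db_cm_mhz * f_mhz * d_cm) / 10.0)
```

For example, at a = 0.5 dB/(cm·MHz) and f = 5 MHz, a 4 cm round trip attenuates the echo by 10 dB.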
  • Speckle is modeled by applying Rician noise along the virtual beam paths. As speckle is spatially-correlated, this noise is spatially convolved with a Gaussian distribution in the lateral direction.
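The speckle model above can be sketched as follows: Rician noise (the magnitude of the signal plus complex Gaussian noise) is applied along each beam line, then smoothed across the lateral direction. The noise level `sigma` and the small three-tap kernel are assumed parameters for illustration.

```python
import math, random

def add_speckle(beam_lines, sigma=0.1, kernel=(0.25, 0.5, 0.25), rng=None):
    """Apply Rician-distributed noise along each beam line, then convolve
    across the lateral (between-line) direction with a Gaussian-like
    kernel so the speckle is spatially correlated."""
    rng = rng or random.Random(0)
    # Rician noise: magnitude of signal plus complex Gaussian noise
    noisy = [
        [math.hypot(v + rng.gauss(0, sigma), rng.gauss(0, sigma)) for v in line]
        for line in beam_lines
    ]
    # lateral convolution across neighbouring beam lines
    n_lines = len(noisy)
    out = []
    for i in range(n_lines):
        line = []
        for j in range(len(noisy[i])):
            acc = w_sum = 0.0
            for k, w in zip((i - 1, i, i + 1), kernel):
                if 0 <= k < n_lines:
                    acc += w * noisy[k][j]
                    w_sum += w
            line.append(acc / w_sum)
        out.append(line)
    return out
```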
  • Real-time rendering of the simulated B-mode images is accomplished by exploiting the graphics processing unit (GPU) of the computer 22 to compute and write pixel-values directly. This is beneficial since current graphics cards contain multiple processors to render pixels in parallel, and bandwidth limitations between computer memory and the memory of the graphics card are circumvented. These processors are accessed via a pixel shader program, which is a function that computes effects on a per-pixel basis. Computation of intensity values and rendering of each pixel is executed largely in parallel.
  • texture mapping can be employed.
  • pre-acquired three-dimensional textures of various tissues, and an anthropomorphic computational volume mesh are texture mapped.
  • texture mapping is often applied when modelling a detailed scene where the number of polygons required to produce an image becomes too great and impractical. Similar to pasting wallpaper onto a white wall rather than drawing out an exact pattern by hand, texture mapping allows for a digital image to be pasted onto a single polygon rather than having to model the image explicitly.
  • a three-dimensional volume mesh whose nodal values contain tissue information such as for example, that of blood, fat, or a calcified vessel, at their respective locations, is provided.
  • the sensor 30 supplies the plane normal of the anatomical slice that is displayed on the computer screen.
  • a set of pre-recorded three-dimensional textures, pre-acquired from a real ultrasound system, is also supplied. These textures contain the B-mode representations of various tissues found in the anatomy. Each of these textures comprises a volume encompassing the entire mesh.
  • the smallest unit within a texture is known as a "texel", analogous to a pixel being the smallest element within an image. Every texel within a texture has an associated weighting, which indicates the amount of tissue present at that location. The assignment of weights is described in further detail below. In order to achieve real-time performance, speedy access to the weights is desired. This is achieved by storing the weighting information on the graphics card of the computer 22 in RGBA format, which consists of a red, green, and blue channel and an additional transparency channel (known as alpha). These channels are used to store the weightings of each tissue texture so that quick access to these weightings is achieved.
  • the center point of every texel within each of the textures is firstly probed.
  • the element that contains this point is found within the volume mesh using an efficient geometric search algorithm.
  • the interpolated nodal values indicate the type of tissue present at the sampled point location, and the tissue type is used to assign the weightings at that texel. For instance, if the point corresponds to fat tissue, then the respective texel within the texture corresponding to fat tissue will record a full weighting of 1.0, while the other textures will record a weighting of 0.0.
  • this pre-processing step serves to record the tissue information, as weights in the textures.
  • Figure 9 illustrates this process.
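The texel pre-processing described above can be sketched in code. The following Python sketch is illustrative only: the tissue-lookup function stands in for the mesh search and interpolation step, and the texture dimensions and toy tissue rule are arbitrary.

```python
import numpy as np

# Hypothetical stand-in for the mesh search: returns a tissue index
# (0 = blood, 1 = fat, ...) at a three-dimensional point. In the simulator
# this would interpolate nodal values of the volume mesh.
def tissue_at(point):
    return int(point[2] > 0.5)  # toy rule: fat above mid-plane, blood below

def build_weight_texture(shape=(4, 4, 4)):
    """Store one tissue weight per RGBA channel for every texel centre.

    The texel whose centre falls in a given tissue records a full weighting
    of 1.0 in that tissue's channel; all other channels record 0.0.
    """
    tex = np.zeros(shape + (4,), dtype=np.float32)  # R, G, B, A channels
    nz, ny, nx = shape
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                centre = ((i + 0.5) / nx, (j + 0.5) / ny, (k + 0.5) / nz)
                tex[k, j, i, tissue_at(centre)] = 1.0  # full weighting
    return tex
```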
  • the coordinates of the ultrasound image plane are determined.
  • the resolution of the image is set equivalent to that of an actual B-mode image. Every pixel within this image is then probed for its mesh coordinate value, which serves as an index into each of the textures. Each texture returns its weighting at the probed texel, and the resulting shade is applied. Once all pixels have been colour-coded, the scene is rendered and displayed yielding a realistic B-mode simulation requiring little computation.
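The per-pixel shading pass then reduces to a weighted blend of the tissue texture samples; a minimal sketch (array shapes and values are illustrative):

```python
import numpy as np

def shade_plane(weight_maps, tissue_textures):
    """Blend per-pixel tissue weights with the corresponding B-mode texture
    samples. weight_maps and tissue_textures: (n_tissues, H, W) arrays, one
    weight map and one sampled texture slice per tissue type."""
    w = np.asarray(weight_maps, float)
    tex = np.asarray(tissue_textures, float)
    return (w * tex).sum(axis=0)   # per-pixel weighted sum over tissues
```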
  • constructive solid geometry is used to synthesize and render B-mode ultrasound.
  • constructive solid geometry (CSG) objects are created via a stencil buffer, to efficiently outline various anatomical structures present in the ultrasound image plane. Pixels on the plane are colour-coded using pre-acquired textures derived from true B-mode ultrasound images.
  • CSG is a technique which allows for the combination of objects (where objects refer to closed polygonal surfaces), using Boolean set operators, such as union, difference and intersection, to create more complex objects.
  • an intersection operation is performed between the ultrasound image plane and a number of anthropomorphic surface meshes which model the morphology of all tissues found in the region.
  • the image plane is texture mapped with realistic B-mode images which are preacquired from an actual ultrasound machine. The normal and position of the ultrasound image plane are provided by the output of the sensor.
  • the intersection operation described above is performed by means of the stencil buffer, which is among the several buffers that reside on the computer graphics card.
  • the stencil buffer can be employed in an analogous manner as a real world stencil, or outline.
  • a stencil test compares the value in the stencil buffer to a reference value and determines whether a pixel is eliminated or not, hence acting as a mask. This test is first set up by disabling colour bits from being written to the frame buffer, so that draw calls are not displayed on the screen. The front face of the intersecting plane is then drawn, or written to the frame buffer.
  • the next draw call draws the front face of the surface mesh, where the first stencil test is performed: glEnable(GL_STENCIL_TEST); glStencilFunc(GL_ALWAYS, 0, 0); glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);
  • bits in the stencil buffer are incremented wherever the front face of the surface mesh is drawn.
  • the next call decrements values in the stencil buffer wherever the back face of the surface mesh is drawn: glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
  • FIG. 10 shows an example of the stencilling procedure described above used to display the intersection between a plane and a sphere.
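The increment/decrement stencil logic can be mimicked on the CPU for a single pixel: front faces drawn in front of the image plane add one, back faces subtract one, and a non-zero count marks the intersection region. A sketch for the plane-and-sphere example, assuming the image plane sits at z = 0 with the viewer looking along +z (geometry and names are illustrative):

```python
import math

def stencil_count(px, py, centre, radius):
    """Emulate the stencil passes for one pixel of a plane-sphere intersection.

    The sphere's front face increments the count where it lies in front of
    the z = 0 image plane (GL_INCR); its back face decrements where it lies
    in front of the plane (GL_DECR). A non-zero result means the plane
    slices through the sphere at this pixel.
    """
    cx, cy, cz = centre
    d2 = (px - cx) ** 2 + (py - cy) ** 2
    if d2 > radius ** 2:
        return 0                      # view ray misses the sphere entirely
    half = math.sqrt(radius ** 2 - d2)
    count = 0
    if cz - half < 0:                 # front face drawn in front of the plane
        count += 1                    # analogous to glStencilOp(..., GL_INCR)
    if cz + half < 0:                 # back face drawn in front of the plane
        count -= 1                    # analogous to glStencilOp(..., GL_DECR)
    return count
```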
  • the virtual DUS simulator involves the intersection of the provided anthropomorphic surface meshes and the ultrasound image plane.
  • the surface meshes provided represent surfaces of the tissues occurring throughout the anatomical region. Each surface mesh belongs to a certain tissue type such as for example, fat.
  • the above stencilling procedure is then executed on all surface meshes that belong to a particular tissue type.
  • the CFD model velocity field information is contained in a number of time-steps that make up one complete heart cycle period. Hence, new velocities that are rendered at every frame update are derived from a particular time-step from the velocity field data.
  • a frame-dropping algorithm is employed. As will be appreciated, render speeds and thus frame rates, may fluctuate depending on computation times and therefore might not be in sync with the hypothetical heart rate of the CFD model.
  • the frame-dropping algorithm ensures that frames are rendered in real-time irrespective of variations in the underlying computation time.
  • the frame dropping algorithm either discards or appends frames that will cause the display to become out of sync depending on whether the software is ahead of or behind "schedule".
  • Figures 7 and 8 illustrate an example of what is meant by ahead of and behind schedule.
  • “Elapsed time” is the true time, or the time dictated by the actual computer CPU wall time that has passed (i.e. the physical time that has passed, as opposed to the number of CPU clock cycles) and “theoretical time” is the time dictated by the program's next scheduled frame number.
  • if frame number seven is due, but enough CPU wall time has passed that frame number eight should now be rendered, then the program is behind schedule, and in this case frame number seven is discarded.
  • the last rendered frame will be conserved until it is time for the next frame.
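A minimal sketch of such a frame-dropping policy (the function and variable names are illustrative, not taken from the disclosure):

```python
def frame_to_render(last_rendered, elapsed_time, frame_period):
    """Pick the frame to display so playback stays locked to wall-clock time.

    'elapsed_time' is the CPU wall time since playback began and
    'frame_period' is the time each CFD time-step should occupy on screen.
    """
    due = int(elapsed_time / frame_period)  # frame that wall time says is due
    if due > last_rendered + 1:
        return due                  # behind schedule: intervening frames dropped
    if due <= last_rendered:
        return last_rendered        # ahead of schedule: hold the last frame
    return last_rendered + 1        # on schedule: advance normally
```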
  • CFD model velocities at a cloud of randomly distributed points are sampled within a sample volume having a predefined power distribution.
  • Each velocity is converted to a Doppler frequency via the Doppler equation, weighted according to the power at its sample volume location, and convolved with an intrinsic spectral broadening function.
  • Spectra are constructed at discrete times, with the velocity sampling points randomly distributed within a nominal temporal window Δt.
  • each point within the CFD velocity field is sampled at some time t₀ at a random location within the sample volume, and a velocity, v, is returned which is converted to a Doppler frequency, f. From this, the audio waveform basis function expressed below is constructed, a sinusoid of frequency f sustained over the transit interval: b(t) = cos(2πf(t − t₀)), for t₀ ≤ t ≤ t₀ + T.
  • T is the time required to traverse the sample volume, which is assumed to be the nominal sample volume diameter divided by the velocity.
  • waveform basis functions are generated at each broadened frequency and their associated powers are summed together.
  • a continuous audio signal is built representing the summation of signal contributions from every point within the sample volume.
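Under the scheme above, the audio signal is a sum of power-weighted basis functions, each lasting the transit time through the sample volume. The sketch below assumes a rectangular transit-time window, a 5 MHz transmit frequency, a 60° Doppler angle and a 1.5 mm sample-volume diameter, all illustrative values:

```python
import numpy as np

def synthesize_audio(velocities, powers, f0=5e6, theta=np.deg2rad(60),
                     c=1540.0, sv_diameter=1.5e-3, fs=44100, duration=0.02):
    """Sum one windowed sinusoid per sampled scatterer.

    Each velocity v is converted to a Doppler frequency via
    f = 2 * f0 * v * cos(theta) / c, and its basis function lasts
    T = sv_diameter / |v| (the transit time through the sample volume).
    """
    t = np.arange(int(fs * duration)) / fs
    signal = np.zeros_like(t)
    for v, p in zip(velocities, powers):
        f = 2.0 * f0 * v * np.cos(theta) / c          # Doppler equation
        T = sv_diameter / max(abs(v), 1e-6)           # transit-time window
        window = (t < T).astype(float)                # crude rectangular window
        signal += p * np.cos(2 * np.pi * f * t) * window
    return signal
```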
  • the DUS simulator 20 allows the operator to steer the virtual beam at three angles, +20°, 0°, -20°. Multiple steering angles are incorporated into the application by rotating the "axial direction" vector of the sensor 30 (i.e. its x-axis) about the slicing plane normal, by the steering angle as shown in Figures 12 and 13. The new vector is used to calculate the blood velocity component along the axial direction. As can be seen in Figure 13, both the colour DUS and spectrograph images are correctly updated.
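Rotating the axial vector about the slicing-plane normal can be expressed with the Rodrigues rotation formula (the disclosure does not name a specific formula; this is one standard way):

```python
import numpy as np

def steer_axial(axial, plane_normal, steering_deg):
    """Rotate the sensor's axial (x-axis) vector about the slicing-plane
    normal by the steering angle, via the Rodrigues rotation formula."""
    k = np.asarray(plane_normal, float)
    k = k / np.linalg.norm(k)
    v = np.asarray(axial, float)
    a = np.deg2rad(steering_deg)
    return (v * np.cos(a) + np.cross(k, v) * np.sin(a)
            + k * np.dot(k, v) * (1 - np.cos(a)))
```

The steered vector then replaces the original axial direction when computing the blood velocity component along the beam.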
  • the DUS simulator 20 starts with a 60° default angle between the virtual beam and angle correction marker but allows the angle correction marker to be rotated at increments of 2° from -70° to +70°, as is permitted in conventional ultrasound systems. Adjustment of the angle correction marker then alters the Doppler angle which is used for the derivation of velocities displayed on the spectrogram.
  • the system can support any starting angle and incremental rotation.
  • the spectrograph utilizes a grey scale lookup table. Computed power values serve as indices into the lookup table. By adjusting the grey scale levels, or the scalar range to which the colours are mapped, the basic use of the spectral gain feature (i.e. varying the strength of the backscattered signal) can be mimicked.
  • the operator is allowed to move and resize the colour box. In doing so, the frame rate and pulse repetition frequency (PRF) may be affected, since each pulse must complete the round trip to the deepest point of the colour box before the next pulse is emitted. The maximum PRF is therefore limited to PRF_max = c / (2d), where PRF is the pulse repetition frequency, c represents the speed of sound in tissue, and d is the maximum imaging depth of the colour box.
  • the DUS simulator 20 allows for interactivity via the keyboard and trackball and detects certain keyboard and trackball movement events. As an example, Figure 11 shows the list of keyboard events and their associated functions.
  • the sample volume marker may be moved about the display screen via the trackball and keyboard for selection of a region to be viewed in spectrogram mode.
  • the gate size of the sample volume may also be increased or decreased as shown in Figure 14.
  • the left side shows a 1 mm sample volume that yields a clean spectrograph with little spectral broadening.
  • the right side shows an enlarged sample volume that provides for broader spectra. Since in ultrasonography the operator only has control of the axial size of the sample volume, i.e. along the beam direction, an oblate-shaped sample volume with radii equivalent to the standard deviation of the Gaussian power distribution along the respective axes is employed.
  • the PRF of a Doppler ultrasound system is primarily associated with the velocity or frequency scale of the colour flow map or spectrogram. Depending on the system's scanhead, various sets of PRF ranges are available for the sonographer. For both CDUS and spectrogram images, various PRFs are provided for and in this example include 3500 Hz, 5000 Hz, 6250 Hz, 8333 Hz, 10000 Hz, 11950 Hz, and
  • the maximum Doppler frequency that can be displayed without aliasing is limited by the Nyquist criterion to f_Dmax = PRF / 2, which through the Doppler equation corresponds to a maximum velocity of V_max = (c · PRF) / (4 · f₀ · cos θ), where f_Dmax is the maximum Doppler frequency, f₀ is the transmit frequency, V_max is the maximum velocity limit seen on the velocity scale, θ is the Doppler angle, and c is the speed of ultrasound in blood.
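The PRF/velocity-scale relationship can be evaluated directly; the 5 MHz transmit frequency and 60° Doppler angle below are illustrative defaults:

```python
import math

def vmax_from_prf(prf, f0=5e6, theta_deg=60.0, c=1540.0):
    """Maximum unaliased velocity: the Nyquist limit f_Dmax = PRF / 2
    inverted through the Doppler equation, V_max = c*PRF / (4*f0*cos(theta))."""
    return c * prf / (4.0 * f0 * math.cos(math.radians(theta_deg)))
```

For example, at a PRF of 5000 Hz these values give a maximum unaliased velocity of about 0.77 m/s.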
  • Settings that modify B-mode ultrasound on typical ultrasound systems are mimicked. This includes, but is not limited to i) gain level, ii) time-gain compensation, and iii) dynamic range.
  • the DUS simulator provides a strong tool for the advancement of current diagnosis protocols for the widespread problem of carotid disease. By realistically simulating a DUS examination, operators are able to gain useful experience that translates directly to real life DUS examinations. Signal processing techniques for improving and expanding the realm of information obtainable from Doppler ultrasound can be tested and analyzed. The DUS simulator also opens the door for the discovery of new and better risk indicators for stroke using Doppler ultrasound.

Abstract

A method of simulating an ultrasound examination comprises synthesizing ultrasound data using a computational phantom and coupling the simulated ultrasound data to motion of a sensor manipulated over a target volume thereby to simulate the ultrasound examination.

Description

ULTRASOUND SIMULATOR AND METHOD OF SIMULATING AN ULTRASOUND EXAMINATION
Field of the Invention
The present invention relates generally to the field of ultrasound and in particular, to an ultrasound simulator and method of simulating an ultrasound examination.
Background of the Invention
Duplex ultrasound (DUS) is among the most accessible imaging modalities available today for the diagnosis of carotid disease, a common precursor to the incidence of stroke. Duplex ultrasound provides a view of the anatomy under examination overlaid with blood flow velocity information by combining both B-mode and Doppler ultrasound. As is known, Doppler ultrasound detects the velocity of blood travelling through an individual's artery, by transmitting a high frequency signal into the body and detecting shifts in the frequency of returned signals. The detected velocity in turn can be used to approximate the degree of occlusion in the artery due to plaque build-up, or atherosclerosis. The degree of narrowing in the artery, or its stenosis severity, is a known indicator of an individual's risk of stroke. Because the returned frequency shifts are in the audible range, flowing blood can be heard, which provides useful auditory feedback. B-mode ultrasound provides information relating to tissue properties. Duplex ultrasound combines the blood flow velocity information extracted from Doppler ultrasound, with the tissue property information extracted from B-mode ultrasound thereby to enable blood flow and anatomical visualization of the anatomy under examination.
DUS is increasingly becoming the sole imaging modality used to determine the appropriate treatment and management steps required for patients with carotid atherosclerotic disease. This is due to the fact that DUS holds many advantages over other imaging modalities, including its capability to display in vivo images non-invasively and in real-time. Ultrasound also remains the least expensive diagnostic imaging tool to purchase, operate and maintain as compared to X-ray computed tomography (CT) or magnetic resonance imaging (MRI). Furthermore, ultrasound machines are comparatively smaller in size than CT or MRI machines making them more portable and convenient for use in clinics. Although DUS is advantageous in many respects, DUS unfortunately suffers a drawback. The presence of complex blood flow patterns that are normally indicative of a diseased artery will often obscure the interpretation of DUS data. Also, the interpretation of Doppler blood flow images is highly dependent on several technical and physical factors that are typically encountered during an ultrasound examination. The above in conjunction with image artefacts make DUS one of the most difficult sonography techniques to master and interpret. For this reason, sonographers require considerable experience to achieve familiarity with the scenarios and factors that impact DUS imaging. In fact, sonographers are required to train for hundreds of hours and must be exposed to a wide range of vascular pathologies before certification. Notwithstanding these certification requirements, it has been suggested that, for dependable grading of carotid artery occlusion using DUS, a sonographer should have previously performed at least eight-hundred (800) examinations.
In order to help better understand and interpret DUS images, simulation models play an important role. Physical models such as in vitro phantoms that imitate human vasculature have been considered. Although such phantoms are useful for acquiring accurate DUS data throughout the phantom, these phantoms are generally very costly and time-consuming to build, and lack flexibility. Theoretical models of DUS have also been created; however these theoretical models are normally too computationally intensive to be run in real-time, or involve conditions that are not relevant in a clinical setting.
Computer-based DUS simulators that make use of pre-recorded clinical data are also available such as that sold by MedSim Ltd. of Fort Lauderdale, Florida under the name Ultrasim®. The Ultrasim® simulator couples a motion-tracked mock probe to pre-recorded, three-dimensional ultrasound clinical datasets virtually placed within an anthropomorphic mannequin. The Ultrasim® simulator offers a single DUS module based on pre-recorded Doppler audio clips sampled at a few points on a plane through an artery, under the assumption that blood flow dynamics are symmetric around the artery axis. The pre-recorded audio clips together with pre-recorded blood flow velocity data are interpolated for audio and visual playback in response to the operator-placed mock probe. U.S. Patent No. 5,609,485 to Bergman et al. discloses a computer-based interactive reproduction system device designed to be used by physicians and technicians in medical training and diagnosis using medical systems such as ultrasound machines. Biological data is collected from a living body and stored in memory. An operator manipulates a simulated sensor over a transmitter which may be attached to a simulated body. The transmitter transmits position data to a receiver in the sensor. A reproduction unit processes the pre-recorded biological data and displays data corresponding to the position of the sensor with respect to the transmitter. U.S. Patent No. 6,117,078 to Lysyansky et al. discloses a method and apparatus for providing a virtual volumetric ultrasound phantom to construct an ultrasound training system from any ultrasound system. The ultrasound system and method retrieve and display previously stored ultrasound data to simulate an ultrasound scanning session. A real ultrasound system acquires an image of an ultrasound phantom. The ultrasound image comprises ultrasound echo data for an image/scan plane representing a cross-section or partial volume of the ultrasound phantom.
The ultrasound image is analyzed to identify image attributes that are unique for each image/scan plane. A portion of the previously stored data that corresponds to the image attributes is retrieved and displayed. In one embodiment, actual position and orientation of the acquired image/scan plane with respect to a known structure within the ultrasound phantom are determined by processing the image/scan plane to obtain a number of geometrical image parameters. Position and orientation of the image/scan plane are calculated from the image parameters using formulas based on a known three-dimensional structure within the phantom. The determination of actual image/scan plane position and orientation is enhanced using image de-correlation techniques. Retrieval of the stored data is based upon the calculated position and orientation or on the obtained image parameters.
U.S. Patent No. 6,210,168 to Aiger et al. discloses a method and system for simulating, on a B-mode ultrasound simulator, a D-mode and C-mode Doppler ultrasound examination. Velocity and sound data describing blood flow at selected locations within blood vessels of a patient are gathered during an actual Doppler ultrasound examination. The gathered data are processed off-line to generate sets of blood flow velocity and sound values, which describe blood flow at selected locations in a virtual B-mode frame buffer, and are stored in memory. Doppler simulation at a designated location on a B-mode image generated from the virtual frame buffer is achieved by performing bilinear interpolation, at the time of simulation, from the data stored in the memory, so as to determine blood flow velocity and sound values for all designated virtual frame buffer voxels. The interpolated blood flow velocity values are depicted as either a gray scale Doppler spectral waveform or a color scale flow map on the screen of the B-mode ultrasound simulator. The sound values are depicted as an audible signal simulating a Doppler sound waveform.
U.S. Patent Application Publication No. 2005/0283075 to Ma et al. discloses a three-dimensional, fly-through ultrasound system. A volume is represented using high spatial resolution ultrasound data. By modulating B-mode data with Doppler or blood flow information, the spatial resolution or contrast of the B-mode data may be enhanced. The same set of ultrasound data is used for identifying a boundary, for placing the perspective position within the volume and rendering from the perspective position. The identification of the boundary and rendering are automated and performed by a processor. Ultrasound data is used to generate three-dimensional, fly-through representations allowing for virtual endoscopy or other diagnostically useful views of structure or fluid flow channels.
As will be appreciated, the computer-based ultrasound simulators described above all make use of pre-recorded clinical data in order to generate audio and/or video ultrasound images. Unfortunately, relying on pre-recorded clinical data reduces the effectiveness of the ultrasound simulators as training tools. Since displayed images are based on pre-recorded clinical data and are not generated on-the-fly, operators are unable to learn about the many important operating parameters that influence the representation, and hence the interpretation, of the returned Doppler signal. As will be appreciated, improvements in Doppler ultrasound simulators are desired. It is therefore an object of the present invention to provide a novel ultrasound simulator and method of simulating an ultrasound examination.
Summary of the Invention
Accordingly, in one aspect there is provided a method of simulating an ultrasound examination comprising: synthesizing ultrasound data using a computational phantom; and coupling the simulated ultrasound data to motion of a sensor manipulated over a target volume thereby to simulate said ultrasound examination.
In one embodiment, the synthesized ultrasound data comprises both synthesized Doppler ultrasound data and B-mode data and the computational phantom comprises a computational fluid dynamics (CFD) model. The synthesizing comprises interrogating the CFD model at points within a sample volume; for each point, determining from the CFD model, a velocity vector in the direction of the sensor and converting the velocity vector into a frequency; and summing the frequencies to yield Doppler ultrasound data.
The synthesized ultrasound data is displayed thereby to render an image selected from the group comprising a spectrograph, a colour Doppler image, a power Doppler image, a B-mode image, a duplex ultrasound (DUS) image, a colour DUS image and a power Doppler DUS image. The image is displayed at clinical frame rates in response to manipulation of the sensor. Frames of the image are manipulated to maintain synchronism between the displayed image and the cycle of anatomy encompassed by the target volume.
A Doppler audio signal is also generated and is represented by a summation of signals generated for each sampled point within the target volume.
According to another aspect there is provided an ultrasound simulator comprising a motion tracking device outputting position data when moved over a target volume; and processing structure communicating with said motion tracking device, said processing structure synthesizing ultrasound data using a computational phantom and said position data thereby to simulate said ultrasound examination.
The ultrasound simulator provides the operator with a realistic experience in the operation of a true clinical DUS system. The library of carotid CFD models allows a variety of carotid conditions to be simulated, including hazardous high-grade stenosis, thereby enabling the operator to gain both practical and theoretical experience in assessing the health of a carotid artery via DUS. As Doppler ultrasound data, B-mode data or both Doppler ultrasound and B-mode data is synthesized on-the-fly, the subjectivity, machine parameters and measurement errors associated with using pre-recorded clinical data are avoided. Also, the development, capital and on-going costs associated with tissue-mimicking physical phantoms are avoided.
Brief Description of the Drawings
Embodiments will now be described more fully with reference to the accompanying drawings in which:
Figure 1 shows a duplex ultrasound simulator comprising a motion tracker, a mannequin and a computer;
Figure 2 shows a sensor and a control unit forming part of the motion tracker, a portion of the mannequin and conversion of motion tracker coordinate values from the motion tracker coordinate system to a computational fluid dynamics (CFD) model coordinate system via transformations T_FoB←sensor and T_CFD←FoB;
Figure 3 shows the general steps performed during Doppler ultrasound simulation;
Figure 4 shows information processed and generated during ultrasound simulation;
Figure 5 shows calibration of the DUS simulator;
Figure 6 is a time-domain waveform for synthesized Doppler audio;
Figure 7 shows a frame dropping algorithm employed by the DUS simulator;
Figure 8 shows three cases handled by the frame dropping algorithm;
Figure 9 shows B-mode ultrasound data simulation via texture mapping;
Figure 10 shows union of a sphere and plane and intersection of a sphere and plane achieved via CSG using a stencil buffer;
Figure 11 shows a DUS simulator image;
Figure 12 shows a DUS simulator image with no beam steering and an insonation angle that is perpendicular to blood flow;
Figure 13 shows the effects of positive and negative beam steering on the DUS simulator image; and
Figure 14 shows the effects of sample volume size on DUS simulator images.
Detailed Description of the Embodiments
A real-time and interactive duplex ultrasound (DUS) simulator and ultrasound examination simulation method are described herein. The DUS simulator employs Doppler physics and clinically relevant carotid geometries and blood flow velocity fields together with B-mode physics and clinically relevant anatomical structures to simulate accurately a DUS system allowing colour DUS, power Doppler DUS, DUS, B-mode, colour Doppler, power Doppler and spectrograph images to be rendered at clinically relevant frame rates. The DUS simulator also simulates the Doppler audio signal which sonographers listen to in order to detect abnormalities in blood flow. Embodiments of the DUS simulator will now be described with particular reference to Figures 1 to 14.
Turning now to Figure 1 , a real-time and interactive duplex ultrasound (DUS) simulator is shown and is generally identified by reference numeral 20. As can be seen, the DUS simulator 20 comprises a computer 22, external DUS controls 24 connected to the computer 22, a mannequin 26 in the shape of a torso upon which DUS examinations are performed and a motion tracker 27 such as that manufactured by Ascension Technology Corporation under the name of Flock of Birds® (FoB). The motion tracker 27 comprises a transmitter 28 embedded within the mannequin 26, a hand-held magnetically tracked, motion sensor 30 embedded in a shell resembling a conventional linear array ultrasound transducer and a control unit 32. The external DUS controls 24 comprise a trackball 40 to provide the operator with fine control over sample volume placement and a programmable keypad 42 to provide the operator with dedicated control over Doppler parameters. For example, the trackball 40 or keypad 42 can be manipulated to enable the operator to steer the virtual beam, rotate an angle correction marker, adjust spectral gain, move and resize the colour box, change pulse repetition frequency (PRF) etc. Of course, other Doppler and B- mode settings may be adjusted using the external DUS controls 24. The computer 22 executes software that enables realistic ultrasound images as well as Doppler audio to be generated in real-time and on-the-fly in response to movement of the motion sensor 30 over the mannequin 26 as will be described. The computer 22 also stores a library of computational fluid dynamic
(CFD) models representing a variety of carotid conditions including for example, hazardous high-grade stenosis. As is known, fluid flow is governed by a set of partial differential equations, the Navier-Stokes equations. To solve these equations for all but the most trivial cases, the complex domain (here, an artery or vein) must be divided into a volumetric "mesh" of contiguous, regular, three-dimensional shapes (finite elements), connected to each other at their corners (nodes). By assuming the shape of the velocity field over an element, a set of algebraic equations can be formulated, and solved simultaneously to obtain nodal flow velocities and pressures. The nodal flow velocities are written to a series of datafiles, with each datafile corresponding to one time point during a cardiac cycle. A separate datafile provides information about how the nodes are connected together (i.e., the mesh). Each CFD model is therefore made up of CFD mesh and flow velocity datafiles. Together the CFD mesh and velocity datafiles allow the DUS simulator 20 to identify the flow velocity at any point in the artery or vein of interest. A separate, structural mesh datafile describes the tissue surrounding the artery or vein. Because this surrounding tissue is not moving, velocity information is not required. Instead, each structural mesh element has assigned ultrasound properties such as for example acoustic impedances and attenuation coefficients that are used by the DUS simulator 20 to determine the intensity of simulated B-mode ultrasound data. During an ultrasound examination simulation, the operator manipulates the hand-held motion sensor 30 over the mannequin 26. As this occurs, the transmitter 28 in the mannequin 26 transmits position data which is received by the sensor 30. The received position data is then conveyed to the control unit 32.
The control unit 32 continuously tracks the relative position and orientation of the sensor 30 as the sensor 30 is manipulated over the mannequin 26 and generates (x,y) coordinate values that are transformed to the motion tracker's coordinate system. The x-vector of the sensor 30 represents the direction in which the virtual beam of the sensor is emitted (the axial direction on an image) and the y-vector represents the lateral direction of the virtual beam or the direction that the virtual beam is swept. The motion tracker 27 in turn conveys the coordinate values to the computer 22. The computer 22 upon receipt of the coordinate data further processes the received data to generate either a colour DUS, power Doppler DUS, DUS, B-mode, colour Doppler, power Doppler or spectrograph image depending on the selected mode of operation as well as to simulate Doppler audio. The generated image is displayed on the display screen of the computer 22 on a graphical user interface that resembles a clinical DUS system thereby to provide the operator with a realistic simulation experience. Further specifics concerning the DUS simulator 20 will now be described.
In this embodiment, in order to simulate Doppler ultrasound, the computer 22 combines a model of Doppler physics and a pulsatile velocity field that is derived from one of the CFD models, where nodal velocities throughout the mesh are solved using equations governing pulsatile blood flow. Figure 3 shows the general steps performed by the computer 22 during Doppler ultrasound simulation. Initially, the sample volume is placed by the operator via manipulation of the trackball 40 at a known location within the CFD model. The sample volume comprises between 100 to 1000 sampling points which are distributed uniformly and randomly throughout a spherical volume. This number of sampling points has been found to produce realistic spectrograph images. The contribution of each sampling point, or scatterer, is weighted according to its distance from the centre of the sample volume via a Gaussian power distribution, to account for non-uniform virtual beam intensity. Efficient velocity interpolation of each sampled point within the sample volume is made possible via the application of a fast, geometric searching algorithm. Once the true velocity value is determined, the scalar velocity component along the direction of the virtual beam is computed. The specified Doppler angle is then applied to solve for the corrected Doppler frequency or velocity. The effects of spectral broadening due to both geometric and transit-time broadening are empirically applied by convolving the Doppler frequency with an intrinsic spectral broadening function. This serves to give the spectrographs a realistic, "smeared" appearance. These steps are performed to produce a power versus velocity spectrum for each time step within a full cardiac cycle. From this, a spectrograph is produced, and characteristic information such as mean and peak velocities are derived.
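The sampling and weighting steps above can be sketched as follows; the number of points, the Gaussian width relative to the sample-volume radius, and the histogram binning are illustrative choices:

```python
import numpy as np

def sample_volume_points(centre, radius, n=500, seed=0):
    """Uniform random points in a spherical sample volume, each carrying a
    Gaussian power weight that falls off with distance from the centre."""
    rng = np.random.default_rng(seed)
    pts = []
    while len(pts) < n:                       # rejection sampling in a cube
        p = rng.uniform(-radius, radius, 3)
        if np.dot(p, p) <= radius ** 2:
            pts.append(p)
    pts = np.array(pts)
    d2 = np.sum(pts ** 2, axis=1)
    sigma = radius / 2.0                      # illustrative Gaussian width
    weights = np.exp(-d2 / (2.0 * sigma ** 2))
    return pts + np.asarray(centre), weights

def doppler_spectrum(velocities, weights, beam_dir, f0, theta_deg, c=1540.0,
                     bins=64):
    """Power-versus-frequency histogram built from the scalar velocity
    components along the virtual beam direction."""
    beam = np.asarray(beam_dir, float)
    beam = beam / np.linalg.norm(beam)
    v_axial = velocities @ beam               # component along the beam
    f = 2.0 * f0 * v_axial * np.cos(np.radians(theta_deg)) / c
    power, edges = np.histogram(f, bins=bins, weights=weights)
    return power, edges
```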
Figure 4 shows information processed and generated by the computer 22 during ultrasound simulation. As can be seen, the computer 22 uses the CFD mesh, flow velocity and structural mesh datafiles as well as the motion tracker coordinate value output. The output coordinate values and datafiles are used to compute spectral data at the sensor's location as described above. In this embodiment, depending on the selected mode, the DUS simulator outputs either a B-mode image, a colour DUS image, a DUS image, a power Doppler DUS image, a power Doppler image, a colour Doppler image or a spectrograph that is rendered in real-time on the display of the computer 22. Real-time Doppler audio is also produced at the desired location.
In order to display a slice of the "virtual patient" relative to the sensor 30 within the sample volume, the position and orientation of the sensor 30 in terms of the CFD model and structural mesh coordinates is required. This involves performing two coordinate transformations to map the coordinate systems of the motion tracker 27 and the CFD model, as shown in Figure 2. One mapping determines the position and orientation of the sensor 30 relative to the transmitter 28, and is established via the transformation matrix T_FOB←sensor. This mapping is performed by the control unit 32. A second transformation matrix, T_CFD←FOB, is applied to complete the conversion to CFD model coordinates. Calibration of the CFD model over a sample volume representing the anatomy under examination is however required.
During calibration, fiducial points whose coordinates are known in the CFD model space are marked on the sample volume. The operator is then prompted to place the sensor 30 sequentially at each of these fiducial markers. In response, the control unit 32 of the motion tracker 27 outputs coordinate values representing the positions of the fiducial markers in the motion tracker's coordinate system or space. As the positions of these fiducial markers in the CFD model coordinate system are known, a mapping between the motion tracker and CFD model spaces can be obtained from these two sets of coordinate values.
Figure 5 shows placement of the fiducial markers on a neck mannequin. The fiducial markers are chosen to be noncollinear so that a similarity transformation can be performed, where a translation, scaling and rotation are applied such that all angles, and ratios of distances, are preserved between the CFD model and motion tracker coordinate systems. This is achieved using a least-squares approximation, although those of skill in the art will appreciate that other approximations may be used. As can be seen, three fiducial markers are placed on the surface of the mannequin for virtual examination of the left carotid artery. In this example, the first fiducial marker is placed on the left side of the neck mannequin on the same plane as the bifurcation apex to correctly position the CFD model when the first marker is probed with the sensor 30. The second fiducial marker is placed at a known distance below the first fiducial marker to orient the long axis and scale the CFD model when the second marker is probed with the sensor 30. The third fiducial marker is placed at a point that is one quarter of the neck's perimeter from the first fiducial marker in the clockwise direction, along the circumference of the neck mannequin, to correctly rotate the CFD model when the third marker is probed with the sensor 30.
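A similarity transform (translation, uniform scale, rotation) can be recovered in closed form from three noncollinear fiducials. The patent describes a least-squares fit; the exact three-point construction below, with hypothetical helper names, is offered only as a sketch of the same tracker-to-model mapping.

```python
import math

def _sub(a, b):   return [a[i] - b[i] for i in range(3)]
def _dot(a, b):   return sum(a[i] * b[i] for i in range(3))
def _cross(a, b): return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]
def _unit(a):
    n = math.sqrt(_dot(a, a)); return [x / n for x in a]
def _dist(a, b):  return math.sqrt(_dot(_sub(a, b), _sub(a, b)))

def _frame(p):
    """Orthonormal frame (rows) built from three noncollinear points."""
    u = _unit(_sub(p[1], p[0]))
    w = _unit(_cross(u, _sub(p[2], p[0])))
    v = _cross(w, u)
    return [u, v, w]

def similarity_from_fiducials(tracker_pts, model_pts):
    """Return a function mapping tracker-space points into CFD-model space
    (translation + uniform scale + rotation; exact for three fiducials)."""
    s = _dist(model_pts[1], model_pts[0]) / _dist(tracker_pts[1], tracker_pts[0])
    Ft, Fm = _frame(tracker_pts), _frame(model_pts)
    # rotation: re-express tracker-frame coordinates in the model frame
    R = [[sum(Fm[k][i] * Ft[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
    t0, m0 = tracker_pts[0], model_pts[0]
    def apply(p):
        d = _sub(p, t0)
        rd = [sum(R[i][j] * d[j] for j in range(3)) for i in range(3)]
        return [m0[i] + s * rd[i] for i in range(3)]
    return apply
```

With more than three markers, an over-determined least-squares fit (as the patent suggests) would average out probing error.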
Various imaging modes of the DUS simulator 20 will now be described.
Colour DUS Imaging
The colour DUS image is composed of an array of sample volumes whose corresponding pixels characterize the spectral mean velocity of that sample volume. To generate the colour DUS image, a colour Doppler image is superimposed over a simulated grayscale B-mode image. The spectral mean velocity at a time interval is defined by the following equation:

v̄ = Σ_i (P_i · v_i) / Σ_i P_i

where: i is the velocity bin number; v_i is the velocity associated with bin i; and P_i is the power contained in that velocity bin.
The colour Doppler image is defined as the cross-product of the axial and lateral directions of the motion sensor 30, restricted to the height and width of the colour box. Each sample volume is colour encoded using a lookup table consisting of one hundred and seven (107) colour shades that was extracted from a present-day ultrasound machine. Red and yellow hues indicate blood flow towards the sensor, and by convention represent blood flow towards the brain. Blue and cyan hues indicate blood flow away from the sensor, and by convention represent blood flow away from the brain. Darker colours (red or blue) characterize slower blood flow, while brighter colours (yellow or cyan) indicate faster blood flow. A power Doppler DUS image is simulated in a similar manner; however, power information is displayed in place of velocities.
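The mapping from a sample volume's spectrum to a colour-LUT entry can be sketched as below. The symmetric velocity scale and the linear mapping onto the 107-entry table are illustrative assumptions; only the spectral-mean formula and the table size come from the text.

```python
def colour_encode(powers, velocities, v_scale=1.0, lut_size=107):
    """Map a sample volume's spectral mean velocity to a colour-LUT index.

    Indices in the upper half stand in for red/yellow hues (flow towards the
    sensor); the lower half for blue/cyan hues (flow away from the sensor).
    """
    total = sum(powers)
    if total == 0.0:
        return None                      # no flow signal: leave the B-mode pixel
    # spectral mean: sum_i P_i * v_i / sum_i P_i
    v_mean = sum(p * v for p, v in zip(powers, velocities)) / total
    v_clamped = max(-v_scale, min(v_scale, v_mean))
    return int((v_clamped + v_scale) / (2.0 * v_scale) * (lut_size - 1))
```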
Spectrograph Imaging

As is known, the spectral analyzer of an ultrasound system collects short periods of a signal and, via a fast Fourier transform (FFT), extracts the relative frequencies and amplitudes contributing to that signal. Each block of analyzed data shows up as a vertical line on the spectral display, and comprises a number of frequency bins that depends on the chosen FFT size. The power contained in each frequency bin is encoded as the intensity of the corresponding pixels. Hence, a two-dimensional image is formed that scrolls from left to right in time as vertical lines are appended onto a graph. Temporal resolutions for spectral displays vary from 20 to 200 Hz, where each block of data would typically have been analyzed via a 128 or 256 point FFT. The computer 22 generates spectrographs using a combination of
the Doppler physics and CFD approach described above. A real spectral display is mimicked by creating a two-dimensional image consisting of 256 frequency bins along the vertical axis and 400 time intervals along the horizontal axis. The 400 time intervals extend over four seconds, which is long enough to display approximately four cardiac cycles. A grey-scale lookup table consisting of 128 entries of black, white and shades of grey is applied to linearly encode the power values in the spectrograph. Although specific values for the number of frequency bins, time intervals and lookup table size are described above, those of skill in the art will appreciate that the number of frequency bins, time intervals and size of the lookup table can be modified without affecting performance of the DUS simulator.

Synthesized B-mode Imaging
Synthesizing B-mode ultrasound images in the DUS simulator 20 involves use of a semi-empirical acoustic field model which is applied to a number of randomly distributed scatterers located within a precomputed, anthropomorphic, computational volume mesh. In true B-mode ultrasound, ultrasound waves are transmitted from the transducer towards the body, along what is known as the axial direction. These waves are reflected back at varying intensities depending on the acoustic impedances encountered, the attenuation coefficient of the encountered tissue, and the depth from which the echo originated. The time at which echoes are returned to the transducer determines the depth from which the echoes were produced. As the ultrasound beam is swept laterally across the elements of the transducer face, the corresponding column of pixels in the frame buffer is rendered, until a two-dimensional image is produced.
In the DUS simulator 20, a similar approach is taken to mimic the mechanisms of true B-mode ultrasound. The motion tracker 27, as mentioned above, continuously outputs position and orientation information as the sensor 30 is manipulated, from which a virtual beam is computed. Scatterers within the structural mesh phantom are then sampled at discrete locations along each virtual beam. Tissue properties such as acoustic impedance and attenuation coefficient are known at each of the scatterer locations throughout the structural mesh. The arrangement of acoustic impedances and attenuation coefficients along the axial direction of the virtual beam, along with the depth of the sampled scatterers, determines the signal intensity of the corresponding pixel. For instance, scatterers whose acoustic impedance varies greatly compared to those of neighbouring scatterers produce more specular-like reflections, and the computed intensity of the corresponding pixel is higher, causing it to appear brighter. To mimic the commonly encountered shadowing artifact, pixels behind hyperechoic tissue (such as a calcified plaque) have their intensities annulled. In reality, attenuation of ultrasound waves also increases at greater depths. Hence, in the DUS simulator, pixel intensities are also affected by the depth of the sampled scatterers. The following formula is applied to mimic the dependence of signal intensity on the depth of ultrasound insonation, ultrasound transmit frequency, and attenuation coefficient:

I(d) = −α · f · d

where: I(d) is the intensity (in decibels, relative to the transmitted intensity); α is the attenuation coefficient; f is the transmit frequency; and d is the round-trip distance of the virtual ultrasound wave. Speckle is modeled by applying Rician noise along the virtual beam paths. As speckle is spatially correlated, this noise is spatially convolved with a Gaussian distribution in the lateral direction. Real-time rendering of the simulated B-mode images is accomplished by exploiting the graphics processing unit (GPU) of the computer 22 to compute and write pixel values directly. This is beneficial since current graphics cards contain multiple processors to render pixels in parallel, and bandwidth limitations between computer memory and the memory of the graphics card are circumvented. These processors are accessed via a pixel shader program, which is a function that computes effects on a per-pixel basis. Computation of intensity values and rendering of each pixel is executed largely in parallel.
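The depth-attenuation rule and a crude speckle term can be sketched as follows. The dB convention, the 0.5 dB/(cm·MHz) default coefficient, and the two-Gaussian "Rician-like" noise are illustrative assumptions standing in for the patent's acoustic field model.

```python
import math, random

def attenuation_db(depth_cm, f_mhz, alpha=0.5):
    """I(d) = -alpha * f * d in decibels, with d the round-trip distance
    (twice the depth) and alpha in dB/(cm*MHz)."""
    return -alpha * f_mhz * (2.0 * depth_cm)

def pixel_amplitude(echo, depth_cm, f_mhz, speckle_sigma=0.1, rng=random):
    """Depth-attenuated echo amplitude with a crude Rician-like speckle term
    (magnitude of two independent Gaussian components)."""
    gain = 10.0 ** (attenuation_db(depth_cm, f_mhz) / 20.0)  # amplitude gain
    re = echo * gain + rng.gauss(0.0, speckle_sigma)
    im = rng.gauss(0.0, speckle_sigma)
    return math.hypot(re, im)
```

The lateral Gaussian convolution described in the text would then be applied across neighbouring beam lines to spatially correlate the speckle.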
Alternatively, rather than using the acoustic field model to synthesize and render B-mode ultrasound images, texture mapping can be employed. In this embodiment, pre-acquired three-dimensional textures of various tissues, and an anthropomorphic computational volume mesh are texture mapped. As is known, texture mapping is often applied when modelling a detailed scene where the number of polygons required to produce an image becomes too great and impractical. Similar to pasting wallpaper onto a white wall rather than drawing out an exact pattern by hand, texture mapping allows for a digital image to be pasted onto a single polygon rather than having to model the image explicitly.
In the DUS simulator 20, a three-dimensional volume mesh whose nodal values contain tissue information, such as for example that of blood, fat, or a calcified vessel, at their respective locations, is provided. The sensor 30 supplies the plane normal of the anatomical slice that is displayed on the computer screen. A set of pre-recorded three-dimensional textures, which are preacquired from a real ultrasound system, is also supplied. These textures contain the B-mode representations of various tissues found in the anatomy. Each of these textures comprises a volume encompassing the entire mesh.
The smallest unit within a texture is known as a "texel", analogous to a pixel being the smallest element within an image. Every texel within a texture has an associated weighting, which indicates the amount of tissue present at that location. The assignment of weights is described in further detail below. In order to achieve real-time performance, speedy access to the weights is desired. This is achieved by storing the weighting information on the graphics card of the computer 22 in RGBA format, which consists of a red, green and blue channel and an additional transparency channel (known as alpha). These channels are used to store the weightings of each tissue texture so that quick access to these weightings is achieved. As a pre-processing step, the centre point of every texel within each of the textures is first probed. Upon determining a texel's location in mesh coordinates, the element that contains this point is found within the volume mesh using an efficient geometric search algorithm. The interpolated nodal values indicate the type of tissue present at the sampled point location, and the tissue type is used to assign the weightings at that texel. For instance, if the point corresponds to fat tissue, then the respective texel within the texture corresponding to fat tissue will record a full weighting of 1.0, while the other textures will record a weighting of 0.0. Hence, this pre-processing step serves to record the tissue information as weights in the textures. Figure 9 illustrates this process.
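The one-hot weighting pass can be sketched as below. The tissue channel names and the tissue_at probe (standing in for the geometric search and nodal interpolation) are hypothetical; the RGBA packing on the graphics card is represented here by a plain per-texel dictionary.

```python
def assign_texel_weights(texel_centres, tissue_at,
                         channels=("blood", "fat", "plaque", "other")):
    """Pre-processing pass: probe the volume mesh at each texel centre and
    record a one-hot weight per tissue channel.

    tissue_at(centre) stands in for the geometric search + interpolation
    that returns the tissue type at a mesh-coordinate point.
    """
    weights = []
    for centre in texel_centres:
        tissue = tissue_at(centre)
        weights.append({ch: (1.0 if ch == tissue else 0.0) for ch in channels})
    return weights
```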
Once a cross-section through the mesh is selected, the coordinates of the ultrasound image plane are determined. The resolution of the image is set equivalent to that of an actual B-mode image. Every pixel within this image is then probed for its mesh coordinate value, which serves as an index into each of the textures. Each texture returns its weighting at the probed texel, and the resulting shade is applied. Once all pixels have been colour-coded, the scene is rendered and displayed, yielding a realistic B-mode simulation requiring little computation.
In yet another embodiment, constructive solid geometry is used to synthesize and render B-mode ultrasound. In this embodiment constructive solid geometry (CSG) objects are created via a stencil buffer, to efficiently outline various anatomical structures present in the ultrasound image plane. Pixels on the plane are colour-coded using pre-acquired textures derived from true B-mode ultrasound images.
CSG is a technique which allows for the combination of objects (where objects refer to closed polygonal surfaces), using Boolean set operators such as union, difference and intersection, to create more complex objects. In the DUS simulator, an intersection operation is performed between the ultrasound image plane and a number of anthropomorphic surface meshes which model the morphology of all tissues found in the region. In order to generate realistic images, the image plane is texture mapped with realistic B-mode images which are preacquired from an actual ultrasound machine. The normal and position of the ultrasound image plane are provided by the output of the sensor.
The intersection operation described above is performed by means of the stencil buffer, which is among the several buffers that reside on the computer graphics card. The stencil buffer can be employed in an analogous manner to a real-world stencil, or outline. A stencil test compares the value in the stencil buffer to a reference value and determines whether a pixel is eliminated or not, hence acting as a mask. This test is first set up by disabling colour bits from being written to the frame buffer, so that draw calls are not displayed on the screen. The front face of the intersecting plane is then drawn, or written to the frame buffer. The following calls then set up the next draw call, which is to the front face of the surface mesh, where the first stencil test is performed:

    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, 0);
    glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);

Here, bits in the stencil buffer are incremented wherever the front face of the surface mesh is drawn. The next call decrements values in the stencil buffer wherever the back face of the surface mesh is drawn:

    glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
Finally, colour bits are once again enabled to write to the frame buffer and the front face of the plane is drawn to the display, but only where the plane intersects with the surface mesh:

    glStencilFunc(GL_NOTEQUAL, 0, 1);

Figure 10 shows an example of the stencilling procedure described above used to display the intersection between a plane and a sphere. As described previously however, the virtual DUS simulator involves the intersection of the provided anthropomorphic surface meshes and the ultrasound image plane. As previously described, the surface meshes provided represent surfaces of the tissues occurring throughout the anatomical region. Each surface mesh belongs to a certain tissue type such as, for example, fat. In a single pass, the above stencilling procedure is then executed on all surface meshes that belong to a particular tissue type. This set of surface meshes is intersected with a plane that is texture mapped with the corresponding B-mode tissue texture. This is repeated for all tissue types (i.e. blood, fat, plaque, etc.) and the resulting traces are summed together into a single frame buffer. The result is a plane that is completely filled with realistic B-mode textures and, as in a true B-mode image, illustrates the various tissue cross-sections throughout the slice.
Real-Time Image Display
The CFD model velocity field information is contained in a number of time-steps that make up one complete heart cycle period. Hence, new velocities that are rendered at every frame update are derived from a particular time-step from the velocity field data. In order to ensure that frames are being rendered synchronous to the cardiac cycle of the CFD model, a frame-dropping algorithm is employed. As will be appreciated, render speeds and thus frame rates, may fluctuate depending on computation times and therefore might not be in sync with the hypothetical heart rate of the CFD model. The frame-dropping algorithm ensures that frames are rendered in real-time irrespective of variations in the underlying computation time.
To achieve the above, the frame dropping algorithm either discards or appends frames that will cause the display to become out of sync depending on whether the software is ahead of or behind "schedule". Figures 7 and 8 illustrate an example of what is meant by ahead of and behind schedule. "Elapsed time" is the true time, or the time dictated by the actual computer CPU wall time that has passed (i.e. the physical time that has passed, as opposed to the number of CPU clock cycles) and "theoretical time" is the time dictated by the program's next scheduled frame number. Hence, if according to theoretical time, frame number seven is due, but enough CPU wall time has passed that frame number eight should now be rendered, then the program is behind schedule, and in this case frame number seven is discarded. On the other hand, if not enough time has passed and only frame six is due, then the last rendered frame will be conserved until it is time for the next frame.
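The ahead-of/behind-schedule decision can be sketched as a small scheduling function. The function name and the integer "frame due" rule are illustrative; the behaviour mirrors the drop/hold logic described above.

```python
def next_frame(elapsed_s, last_frame, frame_period_s):
    """Return the frame to render given the elapsed wall-clock time.

    Behind schedule   -> skip (drop) straight to the frame now due.
    Ahead of schedule -> hold (conserve) the last rendered frame.
    On schedule       -> advance by one frame.
    """
    due = int(elapsed_s / frame_period_s)   # frame dictated by elapsed time
    if due > last_frame + 1:
        return due                          # behind: drop intermediate frames
    if due <= last_frame:
        return last_frame                   # ahead: conserve the last frame
    return last_frame + 1                   # on schedule
```

In the example of Figures 7 and 8: if frame seven is scheduled but enough wall time has passed for frame eight, frame seven is dropped; if only frame six is due, the last rendered frame is held.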
Synthesized Doppler Audio
As mentioned earlier, CFD model velocities at a cloud of randomly distributed points are sampled within a predefined sample volume power distribution. Each velocity is converted to a Doppler frequency via the Doppler equation, weighted according to the power at its sample volume location, and convolved with an intrinsic spectral broadening function. Spectra are constructed at discrete times, with the velocity sampling points randomly distributed within a nominal temporal window Δt. In order to generate Doppler audio, each point within the CFD velocity field is sampled at some time t0 at a random location within the sample volume, and a velocity, v, is returned which is converted to a Doppler frequency, f. From this, the audio waveform basis function expressed below:

√A · sin(π(t − t0)/T) · cos(2πf(t − t0)),

as shown in Figure 6, is generated, where: A is the sample volume power at the point location; and T is the time required to traverse the sample volume, which is assumed to be the nominal sample volume diameter divided by the velocity.

In order to account for spectral broadening, waveform basis functions are generated at each broadened frequency and their associated powers are summed together. By repeating this process for points sampled sequentially in time (e.g., t0 = 0 for point 1, t0 = Δt·i/N for point i of N points, etc.), a continuous audio signal is built representing the summation of signal contributions from every point within the sample volume.
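The summation of basis functions can be sketched as below, reading the basis function as √A·sin(π(t − t0)/T)·cos(2πf(t − t0)), a sine-windowed cosine at the Doppler frequency. The function signature, sample rate, and sample tuple layout are illustrative assumptions.

```python
import math

def doppler_audio(samples, duration_s, sample_rate=8000):
    """Sum basis functions sqrt(A) * sin(pi*(t - t0)/T) * cos(2*pi*f*(t - t0)),
    each live for t0 <= t < t0 + T, into one audio buffer.

    samples : list of (t0, doppler_frequency_hz, power_A, transit_time_T)
    """
    n = int(duration_s * sample_rate)
    signal = [0.0] * n
    for t0, f, power, T in samples:
        i0 = max(0, int(t0 * sample_rate))
        i1 = min(n, int((t0 + T) * sample_rate))
        amp = math.sqrt(power)              # amplitude from sample volume power
        for i in range(i0, i1):
            t = i / sample_rate - t0
            signal[i] += amp * math.sin(math.pi * t / T) * math.cos(2.0 * math.pi * f * t)
    return signal
```

Feeding in points with t0 staggered by Δt·i/N, and duplicating each point at its broadened frequencies, yields the continuous audio signal described above.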
DUS Parameter Control

The DUS simulator 20 allows the operator to steer the virtual beam at three angles: +20°, 0° and −20°. Multiple steering angles are incorporated into the application by rotating the "axial direction" vector of the sensor 30 (i.e. its x-axis) about the slicing plane normal by the steering angle, as shown in Figures 12 and 13. The new vector is used to calculate the blood velocity component along the axial direction. As can be seen in Figure 13, both the colour DUS and spectrograph images are correctly updated.
The DUS simulator 20 starts with a 60° default angle between the virtual beam and angle correction marker but allows the angle correction marker to be rotated at increments of 2° from -70° to +70°, as is permitted in conventional ultrasound systems. Adjustment of the angle correction marker then alters the Doppler angle which is used for the derivation of velocities displayed on the spectrogram. The system can support any starting angle and incremental rotation.
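The angle-corrected velocity follows directly from the Doppler equation. A minimal sketch (the 1540 m/s default for the speed of sound in soft tissue is a standard assumption, not a value stated in the text):

```python
import math

def corrected_velocity(f_doppler, f_transmit, theta_deg, c=1540.0):
    """Doppler equation solved for velocity with angle correction:
    v = f_d * c / (2 * f_t * cos(theta))."""
    return f_doppler * c / (2.0 * f_transmit * math.cos(math.radians(theta_deg)))
```

Rotating the angle correction marker changes theta_deg in 2° steps, which rescales every velocity shown on the spectrogram.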
As mentioned above, the spectrograph utilizes a grey-scale lookup table. Computed power values serve as indices into the lookup table. By adjusting the grey-scale levels, or the scalar range to which the colours are mapped, the basic use of the spectral gain feature, i.e. varying the strength of the backscattered signal, can be mimicked.
The operator is allowed to move and resize the colour box. In doing so, the frame rate and pulse repetition frequency (PRF) may be affected. To simulate the relationship between colour box size, PRF and frame rate, the relationships

PRF = c / (2 · d)  and  FR ∝ PRF / W

are applied, where c represents the speed of sound in tissue, d is the one-way distance that the pulse has to travel, and W is the width of the colour box. The DUS simulator 20 allows for interactivity via the keyboard and trackball and detects certain keyboard and trackball movement events. As an example, Figure 11 shows the list of keyboard events and their associated functions.
The sample volume marker may be moved about the display screen via the trackball and keyboard for selection of a region to be viewed in spectrogram mode. The gate size of the sample volume may also be increased or decreased as shown in Figure 14. The left side shows a 1 mm sample volume that yields a clean spectrograph with little spectral broadening. The right side shows an enlarged sample volume that produces a broader spectrum. Since, in ultrasonography, the operator only has control of the axial size of the sample volume, i.e. along the beam direction, an oblate-shaped sample volume with radii equivalent to the standard deviation of the Gaussian power distribution along the respective axes is employed.

The PRF of a Doppler ultrasound system is primarily associated with the velocity or frequency scale of the colour flow map or spectrogram. Depending on the system's scanhead, various sets of PRF ranges are available to the sonographer. For both CDUS and spectrogram images, various PRFs are provided, and in this example include 3500 Hz, 5000 Hz, 6250 Hz, 8333 Hz, 10000 Hz, 11950 Hz, and 16667 Hz. The formula

PRF = 2 · f_Dmax = (4 · f_T · V_max · cos Θ) / c

is applied to determine the various velocity ranges that are allowable for a PRF setting, where f_Dmax is the maximum Doppler frequency, f_T is the transmit frequency, V_max is the maximum velocity limit seen on the velocity scale, Θ is the Doppler angle and c is the speed of ultrasound in blood. In colour Doppler mode, if V_max > (PRF · c) / (4 · f_T), then the DUS simulator is reset to employing the next highest PRF available. Similarly, in spectrogram mode, if V_max > (PRF · c) / (4 · f_T · cos Θ), then the current PRF setting is again updated. The Doppler angle is applied only for the spectrogram case. Also, if the user is in spectrogram mode, then the PRF will not be permitted to exceed c / (2 · d), where d is the depth of interest.
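The PRF selection logic above can be sketched as follows. The function names and the ascending-search strategy are illustrative; the PRF set, the aliasing limit V_max = PRF·c/(4·f_T·cos Θ), and the depth limit PRF ≤ c/(2d) come from the text.

```python
import math

PRF_SET = [3500.0, 5000.0, 6250.0, 8333.0, 10000.0, 11950.0, 16667.0]  # Hz

def v_max_for_prf(prf, f_t, theta_deg=0.0, c=1540.0):
    """Aliasing limit of the velocity scale: v_max = PRF * c / (4 * f_t * cos(theta))."""
    return prf * c / (4.0 * f_t * math.cos(math.radians(theta_deg)))

def select_prf(v_needed, f_t, theta_deg=0.0, depth_m=None, c=1540.0):
    """Lowest available PRF whose velocity scale covers v_needed, respecting
    the depth limit PRF <= c / (2 * d) when a depth of interest is given."""
    for prf in PRF_SET:
        if depth_m is not None and prf > c / (2.0 * depth_m):
            break                          # deeper gates forbid higher PRFs
        if v_max_for_prf(prf, f_t, theta_deg) >= v_needed:
            return prf
    return None                            # no available PRF covers the request
```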
Settings that modify B-mode ultrasound on typical ultrasound systems are mimicked. These include, but are not limited to: i) gain level, ii) time-gain compensation, and iii) dynamic range.
As will be appreciated, the DUS simulator provides a powerful tool for the advancement of current diagnosis protocols for the widespread problem of carotid disease. By realistically simulating a DUS examination, operators are able to gain useful experience that translates directly to real-life DUS examinations. Signal processing techniques for improving and expanding the realm of information obtainable from Doppler ultrasound can be tested and analyzed. The DUS simulator also opens the door to the discovery of new and better risk indicators for stroke using Doppler ultrasound.
Although embodiments have been described above with reference to the drawings, those of skill in the art will appreciate that variations and modifications may be made without departing from the spirit and scope thereof as defined by the appended claims.

Claims

What is claimed is:
1. A method of simulating an ultrasound examination comprising: synthesizing ultrasound data using a computational phantom; and coupling the simulated ultrasound data to motion of a sensor manipulated over a target volume thereby to simulate said ultrasound examination.
2. The method of claim 1 wherein said synthesized ultrasound data comprises at least one of synthesized Doppler ultrasound and synthesized B-mode data.
3. The method of claim 2 wherein said synthesized ultrasound data comprises both synthesized Doppler ultrasound and synthesized B-mode data.
4. The method of claim 3 wherein said computational phantom comprises a computational fluid dynamics (CFD) model.
5. The method of claim 4 wherein said synthesizing comprises: interrogating the CFD model at points within a sample volume; for each point, determining from the CFD model, a velocity vector in the direction of the sensor and converting the velocity vector into a frequency; and summing the frequencies to yield Doppler ultrasound data.
6. The method of claim 5 wherein uniform and randomly selected points within said sample volume are interrogated.
7. The method of claim 6 further comprising prior to said summing, weighting each frequency.
8. The method of claim 7 wherein the weighting is based on defined acoustic power and intrinsic broadening distribution properties.
9. The method of claim 3 further comprising displaying said synthesized ultrasound data.
10. The method of claim 9 wherein said displaying comprises rendering an image selected from the group comprising a spectrograph, a colour Doppler image, a power Doppler image, a B-mode image, a duplex ultrasound (DUS) image, a colour DUS image and a power Doppler DUS image.
11. The method of claim 10 wherein said image is displayed at clinical frame rates in response to manipulation of said sensor.
12. The method of claim 11 further comprising manipulating frames of said image to maintain synchronism between said displayed image and the cycle of anatomy encompassed by said target volume.
13. The method of claim 12 wherein said manipulating comprises one of dropping frames and suspending frames.
14. The method of claim 5 further comprising generating a Doppler audio signal.
15. The method of claim 14 wherein said Doppler audio signal is a function of said frequencies.
16. The method of claim 15 wherein said Doppler audio signal is represented by a summation of signals generated for each sampled point within said target volume.
17. The method of claim 3 wherein said B-mode data is synthesized using an acoustic field model.
18. The method of claim 3 wherein said B-mode data is synthesized using texture mapping.
19. The method of claim 3 wherein said B-mode data is synthesized using constructive solid geometry.
20. The method of claim 5 wherein said computational phantom comprises a plurality of CFD models, each representing a different vascular condition.
21. An ultrasound simulator comprising: a motion tracking device outputting position data when moved over a target volume; and processing structure communicating with said motion tracking device, said processing structure synthesizing ultrasound data using a computational phantom and said position data thereby to simulate said ultrasound examination.
22. An ultrasound simulator according to claim 21 wherein said processing structure synthesizes at least one of Doppler ultrasound data and B-mode data.
23. An ultrasound simulator according to claim 22 wherein said processing structure synthesizes both Doppler ultrasound data and B-mode data.
24. An ultrasound simulator according to claim 23 wherein said computational phantom comprises at least one computational fluid dynamics (CFD) model.
25. An ultrasound simulator according to claim 23 wherein said processing structure further generates a Doppler audio signal.
26. An ultrasound simulator according to claim 23 further comprising a library of CFD models.
PCT/CA2007/000370 2006-03-07 2007-03-07 Ultrasound simulator and method of simulating an ultrasound examination WO2007101346A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US77941806P 2006-03-07 2006-03-07
US60/779,418 2006-03-07

Publications (1)

Publication Number Publication Date
WO2007101346A1 true WO2007101346A1 (en) 2007-09-13

Family

ID=38474572

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2007/000370 WO2007101346A1 (en) 2006-03-07 2007-03-07 Ultrasound simulator and method of simulating an ultrasound examination

Country Status (1)

Country Link
WO (1) WO2007101346A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014134289A1 (en) * 2013-03-01 2014-09-04 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
US10573200B2 (en) 2017-03-30 2020-02-25 Cae Healthcare Canada Inc. System and method for determining a position on an external surface of an object
EP3071113B1 (en) * 2013-11-21 2020-07-29 Samsung Medison Co., Ltd. Method and apparatus for displaying ultrasound image

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5230339A (en) * 1991-06-13 1993-07-27 Array Tech, Inc. Performance evaluation of ultrasonic examination equipment
US6117075A (en) * 1998-09-21 2000-09-12 Meduck Ltd. Depth of anesthesia monitor
US6193657B1 (en) * 1998-12-31 2001-02-27 Ge Medical Systems Global Technology Company, Llc Image based probe position and orientation detection
US6210168B1 (en) * 1998-03-16 2001-04-03 Medsim Ltd. Doppler ultrasound simulator


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014134289A1 (en) * 2013-03-01 2014-09-04 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
US9449146B2 (en) 2013-03-01 2016-09-20 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
US10390885B2 (en) 2013-03-01 2019-08-27 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
US11185368B2 (en) 2013-03-01 2021-11-30 Heartflow, Inc. Method and system for image processing to determine blood flow
US11564746B2 (en) 2013-03-01 2023-01-31 Heartflow, Inc. Method and system for image processing to determine blood flow
US11869669B2 (en) 2013-03-01 2024-01-09 Heartflow, Inc. Method and system for image processing to model vasculasture
EP3071113B1 (en) * 2013-11-21 2020-07-29 Samsung Medison Co., Ltd. Method and apparatus for displaying ultrasound image
US10573200B2 (en) 2017-03-30 2020-02-25 Cae Healthcare Canada Inc. System and method for determining a position on an external surface of an object


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07710704

Country of ref document: EP

Kind code of ref document: A1