EP1458294A1 - Ultrasound imaging system and method - Google Patents

Ultrasound imaging system and method

Info

Publication number
EP1458294A1
Authority
EP
European Patent Office
Prior art keywords
ultrasound
image
beamformer
scanhead
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP02783462A
Other languages
German (de)
French (fr)
Other versions
EP1458294B1 (en)
Inventor
Daniel C. Schmiesing
Cedric Chenal
Lars J. Olsson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV
Publication of EP1458294A1
Application granted
Publication of EP1458294B1
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • A61B 8/467: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient, characterised by special input means
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/461: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient; displaying means of special interest
    • G01S 7/52046: Systems particularly adapted to short-range imaging; techniques for image enhancement involving transmitter or receiver
    • G01S 7/5206: Display arrangements; two-dimensional coordinated display of distance and direction; B-scan display
    • G01S 7/52084: Constructional features related to particular user interfaces
    • A61B 8/4405: Constructional features of the diagnostic device; device being mounted on a trolley

Definitions

  • This invention relates to diagnostic ultrasound imaging, and, more particularly, to a system and method for automatically selecting the position of one or more focal positions of a transmitted ultrasound beam.
  • Ultrasound can be used to image tissues and vessels using a variety of imaging modalities.
  • B-mode scanning can be used to image tissues by portraying the tissues in a gray scale in which the brightness of each region of the image is a function of the intensity of ultrasound returns from corresponding regions of the tissues.
  • B-mode scanning can be used to visualize the shapes of organs and vessels, and to detect the presence of masses, such as tumors, in tissues.
  • Doppler scanning can be used to provide images showing the velocity of moving sound reflectors, such as blood flowing through an artery or vein. Using Doppler scanning to image the flow pattern of blood through a vessel allows the internal shape of the vessel to be inferred. As a result, partial obstructions in blood vessels can be detected.
  • a conventional diagnostic ultrasound imaging system 10 is shown in Figure 1.
  • the ultrasound imaging system 10 includes a scanhead 20 having a transducer face that is placed in contact with a target area containing tissues, organs or blood vessels of interest.
  • the scanhead 20 includes an array of transducer elements 24 each of which transforms a transmit signal into a component of an ultrasound beam and transforms an ultrasound reflection into a respective receive signal.
  • These signals are coupled between the scanhead 20 and an imaging unit 30 through a cable 26.
  • the imaging unit 30 is shown mounted on a cart 34.
  • the imaging system also includes a control panel 38 for allowing a user to interface with the system 10.
  • a display monitor 40 having a viewing screen 44 is placed on an upper surface of the imaging unit 30.
  • the transducer elements 24 in the scanhead 20 collectively transmit a beam 50 of ultrasound energy as shown in Figure 2.
  • Respective electrical signals, typically at a frequency of 1-20 MHz, are applied to all or some of the transducer elements 24.
  • the number of transducer elements 24 to which electrical signals are applied determines the size of the transmit aperture.
  • the size of the aperture affects the size of the imaging field and resolution, as explained below. In practice, the phases of the electrical signals applied to the transducer elements 24 are adjusted so that the beam 50 is focused at a focal position 52.
  • the depth to the focal position 52 beneath the transducer face is controlled by the magnitude of the differences in phase of the electrical signals applied to the transducer elements 24.
  • the focal length, which corresponds to the effective length of the focal position 52, is determined by the size and gain of the transmit aperture, i.e., the number of transducer elements 24 used to form the beam 50.
  • the focal position 52 should ideally be positioned where features of maximum interest are located so that these features will be in the best attainable focus.
  • the focal position 52 is shown for illustrative purposes in Figure 2 as being considerably "sharper" than typical in practice.
  • the ultrasound from the individual transducer elements 24 is normally diffracted by tissues so that the effective length of the focal position 52 is actually more of an area where the beam 50 is narrowed rather than a location where the beam 50 comes to a point.
  • the transducer elements 24 are also used to receive ultrasound reflections and generate corresponding electrical signals. As shown in Figure 3, the phase and gain of the received signals are also adjusted to effectively generate a receive beam 56 that is focused to a focal position 58 corresponding to the phase differences between the signals coupled from the transducer elements 24. (In the interest of clarity, beam components for only two transducer elements 24 are shown, although it will be understood that beam components would exist for all active transducer elements).
  • the receive beam 56 can also be "steered," i.e., offset from an axis that is perpendicular to the transducer face, by adjusting the phase differences between the signals coupled from the transducer elements 24.
  • the phase differences between these signals are adjusted as a function of time delay from each ultrasound transmission so that the focal position 58 dynamically varies with depth from a relatively deep position 60 to a relatively shallow position 62 from where the ultrasound is reflected.
  • the focal position 58 for the receive beam 56 varies dynamically with the depth from where the ultrasound is reflected.
  • the disclosed invention relates to the locations of the focal position 52 for the transmit beam 50 rather than the locations of the focal position 58 for the receive beam 56.
  • a typical B-mode ultrasound image 64 is displayed on the viewing screen 44 as shown in Figure 4.
  • the ultrasound image 64 shows a number of anatomical features, such as tissues 66 and a blood vessel 68.
  • the area of interest to the medical practitioner is the vessel 68.
  • the focal position of the transmit beam should ideally be located at the depth of the vessel 68.
  • the conventional ultrasound imaging system 10 (Figure 1) has the ability to adjust the location of the transmit beam focal position.
  • the location of the focal position along the depth axis of the image 64 is indicated by a cursor 70 on the right hand side of the viewing screen 44.
  • the location of the focal position is adjusted by suitable means, such as by manipulating a control on the control panel 38 (Figure 1).
  • the viewing screen 44 shows a B-mode image 80 showing tissues 82 containing a relatively large blood vessel 84. A single focal region may be too small to optimally image the vessel 84.
  • a medical practitioner has the option of selecting a number of transmit focal positions, e.g., two focal positions as indicated by the cursors 86, 88 on the right hand side of the viewing screen 44, as shown in Figure 5.
  • the positions of the focal positions are adjusted by suitable means, such as by manipulating a control of the control panel 38.
  • the two transmit focal positions are used by first transmitting a beam of ultrasound focused at the first focal position. Ultrasound reflections are then obtained as explained above, and a first set of data corresponding thereto are stored by suitable means. A second beam of ultrasound focused at the second focal position is then transmitted, and ultrasound reflections are then also obtained and a second set of data corresponding thereto are stored. The image 80 is then formed using both sets of stored data, with the portion of the image in the first focal position predominantly derived from the first set of data and the portion of the image in the second focal position predominantly derived from the second set of data.
  • a preferred way to employ multiple transmit focal regions is described in U.S. Patent 6,315,723.
  • the system 10 has been explained with reference to the B-mode images shown in Figures 4 and 5. However, it will be understood that the same principles apply to other types of images, such as Doppler images.
  • Although the system 10 can be operated as explained with reference to Figures 4 and 5 to optimally position the transmit focal position(s), it nevertheless has its limitations and problems. For example, it can be fairly time consuming to place the transmit focal positions in the correct position. Additionally, it can require an extraordinary level of expertise to select the proper number of focal positions and correctly position each of the focal positions at the optimal location. For these and other reasons, the focal position(s) are often not positioned in the optimal location, and, in many instances, practitioners do not even attempt to optimally position focal positions.
  • An ultrasound diagnostic imaging system and method uses an image processor that automatically sets the location of a focal position of the beam of ultrasound transmitted by an ultrasound scanhead based on an analysis of an ultrasound image displayed on an ultrasound display.
  • the image processor may analyze the image to automatically identify an area of interest, or the area of interest may be selected manually by a user.
  • the image processor may automatically identify the area of interest by analyzing a characteristic of the image, such as the quality of the image.
  • the image processor may set the location of the focal position to correspond to the position of an area of interest, to maximize the quality of the ultrasound image in the area of interest, or by some other means.
  • the image processor may also select the number of ultrasound transmissions and the locations of respective focal positions based on an analysis of the ultrasound image.
  • the image processor may also dynamically vary the position of a focal position by varying the location of the focal position along a depth axis as a function of the locations of areas of interest along an azimuth axis of the ultrasound display.
  • the act of automatically analyzing the ultrasound image to identify an area of interest in the ultrasound image may comprise automatically analyzing the ultrasound image to identify a predetermined image characteristic
  • the act of automatically analyzing the ultrasound image to identify a predetermined image characteristic may comprise analyzing image data by automated border detection.
  • the act of analyzing image data by automated border detection may comprise analyzing the image data of temporally different images, and wherein the act of automatically setting the location of the focal positions of the beams comprises updating the location of the focal positions of the beams at least periodically.
  • the act of automatically analyzing the ultrasound image to identify an area of interest in the ultrasound image may comprise analyzing the quality of the image in a predetermined area of the image.
  • the act of automatically setting the locations of the focal positions of the beams of transmitted ultrasound may comprise setting the locations of the focal positions to optimize the quality of the image in the predetermined area.
  • a method of setting focal positions of beams of transmitted ultrasound in an ultrasound imaging system having a display on which an ultrasound image is generated, the ultrasound display having a depth axis and an azimuth axis may then comprise viewing the ultrasound image; selecting an area of interest in the ultrasound image; and automatically setting the location of the focal positions of beams of transmitted ultrasound based on a characteristic of the selected area of interest of the ultrasound image.
  • the act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on the location of the selected area of interest may comprise automatically setting the focal positions to a plurality of locations along the depth axis of the ultrasound display for respective locations along the azimuth axis of the ultrasound display.
  • the act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on the location of the selected area of interest may comprise setting respective locations of at least two focal positions along the depth axis of the ultrasound display.
  • the act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on a characteristic of the selected area of interest may comprise automatically setting the locations of the focal positions to maximize the quality of the image in the selected area.
  • the act of selecting an area of interest in the ultrasound image may comprise placing an identifying marking on the ultrasound display at a location corresponding to the location of the selected area of interest which segments the selected area of interest.
  • the act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on a characteristic of the selected area of interest may comprise automatically setting the number of focal positions and the location of each focal position based on a characteristic of the selected area of interest.
  • the ultrasound transducers in the scanhead may be arranged in a linear array.
  • the control line may be operable to apply signals to the beamformer to cause the beamformer to transmit at least two beams of ultrasound having respective focal positions, the positions of the focal positions being controlled by signals applied to the beamformer by the control line.
  • the ultrasound imaging system of the invention may further comprise a user interface device structured to allow a user to enter information identifying the plurality of areas of interest.
  • Figure 1 is an isometric view of a conventional diagnostic ultrasound imaging system.
  • Figure 2 is a schematic diagram illustrating the manner in which an ultrasound scanhead used in the system of Figure 1 transmits a beam of ultrasound energy responsive to electrical signals.
  • Figure 3 is a schematic diagram illustrating the manner in which an ultrasound scanhead used in the system of Figure 1 receives a beam of ultrasound energy and generates corresponding electrical signals.
  • Figure 4 is a schematic illustration of an ultrasound image shown on the viewing screen of a display in a conventional ultrasound imaging system.
  • Figure 5 is a schematic illustration of an ultrasound image shown on the viewing screen of a display in a conventional ultrasound imaging system in which the ultrasound image is generated using two separate focal positions.
  • Figure 6 is an isometric view of one embodiment of a diagnostic ultrasound imaging system according to the present invention.
  • Figure 7 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with one embodiment of the invention.
  • Figure 8 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with another embodiment of the invention.
  • Figure 9 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with a further embodiment of the invention.
  • Figure 10 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with still another embodiment of the invention.
  • Figure 11 is a block diagram of one embodiment of an imaging unit according to the present invention that may be used in the ultrasound imaging system of Figure 6.
  • One embodiment of an ultrasound imaging system 100 according to the present invention is shown in Figure 6.
  • the system 100 uses many of the same components used in the imaging system 10 of Figure 1. Therefore, in the interests of brevity, these components have been provided with the same reference numerals, and an explanation of their function and operation will not be repeated.
  • the system 100 differs from the system 10 shown in Figure 1 primarily by using an imaging unit 106 that is different from the imaging unit 30 used in the system 10. The components used in the imaging unit 106 will be explained below with reference to Figure 11.
  • the system 100 can generate an ultrasound image 110 in accordance with one embodiment of the invention, as shown in Figure 7.
  • the system 100 generates the image 110 by automatically analyzing the image 110 to locate an area of interest 112 by suitable means.
  • the system 100 may perform this analysis using presently existing algorithms for analyzing ultrasound images, although subsequently developed algorithms for analyzing ultrasound images may also be used.
  • the algorithm analyzes the image 110 to locate the brightest area, and this area is selected as the area of interest.
  • tissue specific imaging is used in which the characteristics of expected areas of interest are defined by the type of image being obtained.
  • with tissue specific imaging, available on the Philips HDI 5000 ultrasound system, the user selects the anatomy to be imaged and the ultrasound system automatically initializes itself with preferred parameters for scanning the anatomy. For example, the user may select "obstetric imaging" or "cardiac imaging." In obstetric scanning, the expected areas of interest will be in relatively bright areas of the image.
  • the area of interest can be defined by image characteristics of specific areas of the heart, such as the left ventricle or a mitral valve.
  • the system 100 determines the number of focal positions that should optimally be used to generate the image 110.
  • the system 100 places cursors 114, 116 on the right hand side of the viewing screen 44 to indicate the position of each transmit focal position.
  • the user may manually adjust these automatically determined positions by suitable means, such as by manipulating a control on the control panel 38.
  • the system 100 allows a user to designate an area of interest 120 in an image 122 by suitable means, such as by using a pointing device like a mouse, trackball, or light pen, or by touching the displayed area of interest on a touch-panel display.
  • the system 100 determines from this user input the location of the transmit focal position, or the number and locations of multiple transmit focal positions, that will be used to generate a subsequent version of the image 122.
  • the system also preferably delineates the area of interest 120 selected by the user with a segmenting border 126 or other indicia, and places a cursor 128 on the right hand side of the viewing screen 44 to show the position of the focal position along the depth axis.
  • the image data of a structure such as the heart is analyzed to identify specific anatomical features, such as the location of the mitral valve plane and the boundaries of the myocardium. If diagnosis of the mitral valve performance is of interest to the clinician, for instance, the identification of the mitral valve plane will identify the depth of the mitral valve in the ultrasound image.
  • Automated border detection may be used to identify other structures in the body, such as anatomy of the fetus in an obstetrical patient.
  • An embodiment of the present invention using automated border detection identifies the anatomy of interest, then the depth of the identified anatomy is used to set the number and/or location of the focal position(s) in an ultrasound image.
  • the technique can be employed to analyze image data periodically or continuously, and therefore can track anatomical features over time.
  • This information can be used to update the focal position periodically or continuously, thus constantly optimizing the focus even in the presence of scanhead or anatomical motion.
  • some form of hysteresis or integration is used so that the focal position is not changed and does not appear to jitter for small motional changes.
  • the frame rate of the system 100 decreases with an increase in the number of focal positions used.
  • the user may limit the number of focal positions which the system may use, specify the minimum acceptable frame rate, or select the degree to which the automatic selection of multiple transmit focal positions is discouraged.
  • the system 100 may select the use of multiple focal positions, but it will do so only where the use of multiple transmit focal positions is very important to the quality of the image 110.
  • the system 100 may automatically select the use of multiple focal positions whenever multiple focal positions would noticeably improve the quality of the image.
  • the system 100 either automatically selects an area of interest 130 in an image 132 or allows a user to designate the area of interest 130 by suitable means, such as by using a pointing device or by delineating the area with a segmenting border 134.
  • the system 100 then automatically analyzes the quality of the image 132 in the area of interest 130 by a conventional or hereinafter developed algorithm, such as by analyzing the sharpness of image detail.
  • the system 100 alters the position of the focal position and/or the number of focal positions used until the optimum position of the focal position is determined.
  • the system 100 also preferably places cursors 136, 138 at the right hand edge of the viewing screen to indicate the number of focal positions used to create the image 132 and their respective positions. Also, techniques can be used to limit the amount of processing needed to determine the location of focal position(s) by, for example, limiting the frequency at which the focal position(s) are calculated or using hysteresis or thresholding so that a new focal position is calculated only for relatively large changes in factors used to determine the position of the focal position(s).
  • the system 100 solves the aforementioned problem of selecting the location of a transmit focal position where two areas of interest are located at different depths.
  • the areas of interest may be located either automatically or through user input, as explained above.
  • an image 140 on the viewing screen 44 includes tissues 142 containing a first blood vessel 144 and a second blood vessel 146.
  • the first blood vessel 144 is positioned at a first location along an azimuth axis 150, i.e., at the left hand side of the viewing screen 44, and at a first location along a depth axis 152, i.e., near the bottom of the viewing screen 44.
  • the second blood vessel 146 is positioned at a second location along the azimuth axis 150, i.e., at the right hand side of the viewing screen 44, and at a second location along the depth axis 152, i.e., toward the top of the viewing screen 44.
  • the different locations of the vessels 144, 146 along the depth axis 152 preclude any single location of a focal position from being optimum to image both vessels 144, 146.
  • Two different focal positions may be used, one located at the depth of the first blood vessel 144 and the other located at the depth of the second blood vessel 146. However, this approach would reduce the frame rate of the imaging system 100.
  • the system 100 automatically alters the position of one or more focal positions along the depth axis 152 as a function of the position of each area of interest along the azimuth axis 150.
  • the system 100 selects a location for the focal position that corresponds to the position of the blood vessel 144 along the depth axis of the image 140.
  • When the system 100 is imaging the blood vessel 146 on the right hand side of the image 140, the system 100 selects a location for the focal position that corresponds to the position of the blood vessel 146 along the depth axis of the image 140. As a result, the position of the focal position is dynamically variable along the depth axis 152, which can even be accomplished within a common image frame, a benefit made possible when the areas of interest occupy different lateral areas of the image.
  • each of the areas of interest in the image 140 may be selected by automatically analyzing the image 140 as explained above with reference to Figure 7 or by allowing a user to designate the areas of interest as explained above with reference to Figure 8.
  • the dynamically varying locations of the focal position may also be set using the techniques explained above, such as by optimizing the quality of the image 140 in predetermined areas as explained above with reference to Figure 9.
  • Other combinations and alternatives will be apparent to one skilled in the art.
  • the user may manually initiate setting the location of the one or more focal positions by suitable means, such as by pressing a key on the control panel 38 (Figure 1).
  • the system 100 may operate in the background to periodically set the location of the one or more focal positions.
  • Other means of initiating the setting of the focal positions can also be used.
  • the imaging unit 106 includes an acquisition subsystem 200 that includes a beamformer 204 coupled to the scanhead 20 through the cable 26. Electrical signals from the transducer elements of scanhead 20 are applied to the beamformer 204, which processes signals corresponding to echoes of each acquired scanline into a beam. As explained above, the beamformer 204 applies electrical signals to the transducer elements 24 in the scanhead 20 to cause the scanhead to transmit a beam of ultrasound energy. The beamformer 204 controls the respective delays of the signals applied to the transducer elements of scanhead 20 to focus the transmit beam to a specific depth. The location of the focal position is controlled by control data applied through a control line 210.
  • the received signals from the beamformer 204 are applied to a Signal & Image Processing subsystem 220, which includes a conventional signal processor 224 and a conventional scan converter 226.
  • the signal processor 224 receives the signals from the beamformer 204 to generate image data corresponding to an image, such as the images shown in Figures 7-10.
  • the signal processor 224 may also analyze the image data corresponding to a predetermined portion of the image to determine the optimum location of the focal position or the optimum number of focal positions and their optimum locations, as explained above with reference to Figures 7-10.
  • the signal processor 224 also interfaces with the control panel 38 (not shown) to receive user input, such as a command to initiate the focal position adjustment process or information designating an image area that should be analyzed for quality, as explained above.
  • the signal processor 224 After the signal processor 224 determines the optimum location of the focal position or the optimum number of focal positions and their optimum locations, it applies appropriate control data to the beamformer over control line 210 to control the location of the focal position(s).
  • the signal processor 224 can also couple other data to the beamformer 204, such as data controlling the sizes of the transmit and receive apertures.
  • the image data from the signal processor are then applied to the scan converter 226, which arranges the image data into respective ultrasonic image frame data of the desired image format.
  • the image frame data from the Signal & Image Processing subsystem 220 are then transferred to a display subsystem 230, which includes a video processor 232 and the display monitor 40.
  • the video processor 232 converts the image frame data from the scan converter 226 into appropriate video signals, such as NTSC or SVGA video signals, for use by the display monitor 40.

Abstract

An ultrasound imaging system includes an ultrasound scanhead coupled to an image processor that causes ultrasound images to be generated on the viewing screen of a display. The image processor includes a beamformer generating transmit and receive beams, a signal processor and a conventional scan converter. The signal processor receives signals from the beamformer to generate image data corresponding to an image, analyzes the image data and, based on the analysis, determines the number of transmit focal positions that should be used as well as the optimum location for each focal position. The signal processor then couples data to the beamformer to set the location of the focal position(s).

Description

ULTRASOUND IMAGING SYSTEM AND METHOD
TECHNICAL FIELD
This invention relates to diagnostic ultrasound imaging, and, more particularly, to a system and method for automatically selecting the position of one or more focal positions of a transmitted ultrasound beam.
BACKGROUND OF THE INVENTION
Ultrasound can be used to image tissues and vessels using a variety of imaging modalities. For example, B-mode scanning can be used to image tissues by portraying the tissues in a gray scale in which the brightness of each region of the image is a function of the intensity of ultrasound returns from corresponding regions of the tissues. B-mode scanning can be used to visualize the shapes of organs and vessels, and to detect the presence of masses, such as tumors, in tissues. Doppler scanning can be used to provide images showing the velocity of moving sound reflectors, such as blood flowing through an artery or vein. Using Doppler scanning to image the flow pattern of blood through a vessel allows the internal shape of the vessel to be inferred. As a result, partial obstructions in blood vessels can be detected.
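As an illustration beyond the original text, the following minimal Python sketch shows one common way such a gray-scale mapping can be realized, using log compression of echo intensity to 8-bit brightness; the 60 dB dynamic range and the function name are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def bmode_gray(echo_amplitude, dynamic_range_db=60.0):
    """Map echo amplitudes to 8-bit gray levels by log compression.

    Brightness grows with the logarithm of echo intensity, clipped to a
    fixed dynamic range, so strong reflectors appear bright and weak
    reflectors appear dark (illustrative values only).
    """
    amp = np.abs(np.asarray(echo_amplitude, dtype=float))
    amp = np.maximum(amp, 1e-12)                      # avoid log of zero
    db = 20.0 * np.log10(amp / amp.max())             # dB relative to the peak echo
    db = np.clip(db, -dynamic_range_db, 0.0)          # keep the displayable range
    return np.uint8(255 * (db + dynamic_range_db) / dynamic_range_db)

# Strong reflectors map near 255, weak ones toward 0.
print(bmode_gray([1.0, 0.1, 0.001]))
```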
A conventional diagnostic ultrasound imaging system 10 is shown in Figure 1. The ultrasound imaging system 10 includes a scanhead 20 having a transducer face that is placed in contact with a target area containing tissues, organs or blood vessels of interest. As explained below, the scanhead 20 includes an array of transducer elements 24 each of which transforms a transmit signal into a component of an ultrasound beam and transforms an ultrasound reflection into a respective receive signal. These signals are coupled between the scanhead 20 and an imaging unit 30 through a cable 26. The imaging unit 30 is shown mounted on a cart 34. The imaging system also includes a control panel 38 for allowing a user to interface with the system 10. A display monitor 40 having a viewing screen 44 is placed on an upper surface of the imaging unit 30.
In operation, the transducer elements 24 in the scanhead 20 collectively transmit a beam 50 of ultrasound energy as shown in Figure 2. Respective electrical signals, typically at a frequency of 1-20 MHz, are applied to all or some of the transducer elements 24. The number of transducer elements 24 to which electrical signals are applied determines the size of the transmit aperture. The size of the aperture affects the size of the imaging field and resolution, as explained below. In practice, the phases of the electrical signals applied to the transducer elements 24 are adjusted so that the beam 50 is focused at a focal position 52. The depth to the focal position 52 beneath the transducer face is controlled by the magnitude of the differences in phase of the electrical signals applied to the transducer elements 24. The focal length, which corresponds to the effective length of the focal position 52, is determined by the size and gain of the transmit aperture, i.e., the number of transducer elements 24 used to form the beam 50. The focal position 52 should ideally be positioned where features of maximum interest are located so that these features will be in the best attainable focus. The focal position 52 is shown for illustrative purposes in Figure 2 as being considerably "sharper" than typical in practice. The ultrasound from the individual transducer elements 24 is normally diffracted by tissues so that the effective length of the focal position 52 is actually more of an area where the beam 50 is narrowed rather than a location where the beam 50 comes to a point.
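The delay-based transmit focusing described above can be sketched as follows. This is not the patent's implementation; the linear-array geometry, element pitch, and sound speed are illustrative assumptions.

```python
import numpy as np

SOUND_SPEED = 1540.0  # m/s, a typical soft-tissue value

def transmit_focus_delays(n_elements, pitch_m, focal_depth_m):
    """Per-element transmit delays that focus the beam at focal_depth_m.

    Each element is delayed so that all wavefronts arrive at the focal
    point at the same time; elements far from the array centre, which have
    the longest path to the focus, fire first.
    """
    x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch_m
    path = np.sqrt(x**2 + focal_depth_m**2)       # element-to-focus distance
    time_of_flight = path / SOUND_SPEED
    return time_of_flight.max() - time_of_flight  # non-negative delays in seconds

# 64-element aperture, 0.3 mm pitch, focus at 40 mm depth.
delays = transmit_focus_delays(64, 0.3e-3, 40e-3)
print(delays[0], delays[31])   # edge element fires first (delay near 0), centre element last
```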
As previously mentioned, the transducer elements 24 are also used to receive ultrasound reflections and generate corresponding electrical signals. As shown in Figure 3, the phase and gain of the received signals are also adjusted to effectively generate a receive beam 56 that is focused to a focal position 58 corresponding to the phase differences between the signals coupled from the transducer elements 24. (In the interest of clarity, beam components for only two transducer elements 24 are shown, although it will be understood that beam components would exist for all active transducer elements). The receive beam 56 can also be "steered," i.e., offset from an axis that is perpendicular to the transducer face, by adjusting the phase differences between the signals coupled from the transducer elements 24. In practice, the phase differences between these signals are adjusted as a function of time delay from each ultrasound transmission so that the focal position 58 dynamically varies with depth from a relatively deep position 60 to a relatively shallow position 62 from where the ultrasound is reflected. Thus, in contrast to the constant position of focal position 52 for the transmit beam 50, the focal position 58 for the receive beam 56 varies dynamically with the depth from where the ultrasound is reflected. As explained below, the disclosed invention relates to the locations of the focal position 52 for the transmit beam 50 rather than the locations of the focal position 58 for the receive beam 56.
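Dynamic receive focusing can be sketched in the same style: the delay profile is recomputed as echoes return from progressively greater depths. Again this is only an illustrative model under the same assumed geometry, not the disclosed beamformer.

```python
import numpy as np

SOUND_SPEED = 1540.0  # m/s

def receive_focus_delays(n_elements, pitch_m, depth_m):
    """Receive delays for a focal point at depth_m on the array axis.

    Recomputed for each depth sample, so the receive focus tracks the
    returning echoes, in contrast to the fixed transmit focus.
    """
    x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch_m
    path = np.sqrt(x**2 + depth_m**2)
    return (path - depth_m) / SOUND_SPEED   # extra delay relative to the centre element

# The receive focal position sweeps from shallow to deep as time after transmit increases.
for depth in (10e-3, 30e-3, 60e-3):
    print(depth, receive_focus_delays(64, 0.3e-3, depth)[0])
```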
A typical B-mode ultrasound image 64 is displayed on the viewing screen 44 as shown in Figure 4. The ultrasound image 64 shows a number of anatomical features, such as tissues 66 and a blood vessel 68. In the specific case shown in Figure 4, the area of interest to the medical practitioner is the vessel 68. As a result, the focal position of the transmit beam should ideally be located at the depth of the vessel 68. The conventional ultrasound imaging system 10 (Figure 1) has the ability to adjust the location of the transmit beam focal position. As shown in Figure 4, the location of the focal position along the depth axis of the image 64 is indicated by a cursor 70 on the right hand side of the viewing screen 44. The location of the focal position is adjusted by suitable means, such as by manipulating a control on the control panel 38 (Figure 1). As a result, a medical practitioner can place the focal position of the transmit beam at the area of greatest interest in the ultrasound image 64. It is possible for objects of interest to be larger than can be effectively focused by a single focal position, or for multiple objects to lie at different depths of field that cannot be adequately focused by a single focal position. One solution to this problem is provided by the conventional ultrasound imaging system 10 generating an image using two or more transmit focal positions, as shown in Figure 5. The viewing screen 44 shows a B-mode image 80 showing tissues 82 containing a relatively large blood vessel 84. A single focal region may be too small to optimally image the vessel 84. For this reason, a medical practitioner has the option of selecting a number of transmit focal positions, e.g., two focal positions as indicated by the cursors 86, 88 on the right hand side of the viewing screen 44, as shown in Figure 5. The positions of the focal positions are adjusted by suitable means, such as by manipulating a control of the control panel 38.
The two transmit focal positions are used by first transmitting a beam of ultrasound focused at the first focal position. Ultrasound reflections are then obtained as explained above, and a first set of data corresponding thereto are stored by suitable means. A second beam of ultrasound focused at the second focal position is then transmitted, and ultrasound reflections are then also obtained and a second set of data corresponding thereto are stored. The image 80 is then formed using both sets of stored data, with the portion of the image in the first focal position predominantly derived from the first set of data and the portion of the image in the second focal position predominantly derived from the second set of data. A preferred way to employ multiple transmit focal regions is described in U.S. Patent 6,315,723.
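A minimal sketch of how two such acquisitions might be composited is shown below. The midpoint boundary between the zones and all names are illustrative assumptions, not the method of U.S. Patent 6,315,723.

```python
import numpy as np

def composite_two_zones(frame_shallow, frame_deep, depth_axis_m,
                        shallow_focus_m, deep_focus_m):
    """Blend two frames, each acquired with a different transmit focus.

    Image rows (depths) closer to the shallow focus are taken from
    frame_shallow, rows closer to the deep focus from frame_deep.
    """
    boundary = 0.5 * (shallow_focus_m + deep_focus_m)
    use_shallow = depth_axis_m < boundary             # one flag per image row
    return np.where(use_shallow[:, None], frame_shallow, frame_deep)

depths = np.linspace(0, 80e-3, 256)                   # 256 rows spanning 0-80 mm
shallow = np.zeros((256, 128))
deep = np.ones((256, 128))
img = composite_two_zones(shallow, deep, depths, 20e-3, 60e-3)
print(img[0, 0], img[-1, -1])   # 0.0 from the shallow frame, 1.0 from the deep frame
```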
The operation of the system 10 has been explained with reference to the B-mode images shown in Figures 4 and 5. However, it will be understood that the same principles apply to other types of images, such as Doppler images. Although the system 10 can be operated as explained with reference to Figures 4 and 5 to optimally position the transmit focal position(s), it nevertheless has its limitations and problems. For example, it can be fairly time consuming to place the transmit focal positions in the correct position. Additionally, it can require an extraordinary level of expertise to select the proper number of focal positions and correctly position each of the focal positions at the optimal location. For these and other reasons, the focal position(s) are often not positioned in the optimal location, and, in many instances, practitioners do not even attempt to optimally position focal positions. In fact, practitioners are sometimes not even aware that the position of the focal position can be adjusted or that multiple focal positions can be used. There is therefore a need for a system and method that can quickly and easily select the optimal number of focal positions and their optimum positions without the need for extraordinary operating expertise.
SUMMARY OF THE INVENTION
An ultrasound diagnostic imaging system and method uses an image processor that automatically sets the location of a focal position of the beam of ultrasound transmitted by an ultrasound scanhead based on an analysis of an ultrasound image displayed on an ultrasound display. The image processor may analyze the image to automatically identify an area of interest, or the area of interest may be selected manually by a user. The image processor may automatically identify the area of interest by analyzing a characteristic of the image, such as the quality of the image. The image processor may set the location of the focal position to correspond to the position of an area of interest, to maximize the quality of the ultrasound image in the area of interest, or by some other means. The image processor may also select the number of ultrasound transmissions and the locations of respective focal positions based on an analysis of the ultrasound image. The image processor may also dynamically vary the position of a focal position by varying the location of the focal position along a depth axis as a function of the locations of areas of interest along an azimuth axis of the ultrasound display.
EMBODIMENTS OF THE INVENTION
According to the different embodiments of the invention, the act of automatically analyzing the ultrasound image to identify an area of interest in the ultrasound image may comprise automatically analyzing the ultrasound image to identify a predetermined image characteristic, and the act of automatically analyzing the ultrasound image to identify a predetermined image characteristic may comprise analyzing image data by automated border detection.
The act of analyzing image data by automated border detection may comprise analyzing the image data of temporally different images, and wherein the act of automatically setting the location of the focal positions of the beams comprises updating the location of the focal positions of the beams at least periodically.
The act of automatically analyzing the ultrasound image to identify an area of interest in the ultrasound image may comprise analyzing the quality of the image in a predetermined area of the image. The act of automatically setting the locations of the focal positions of the beams of transmitted ultrasound may comprise setting the locations of the focal positions to optimize the quality of the image in the predetermined area.
According to the invention, a method of setting focal positions of beams of transmitted ultrasound in an ultrasound imaging system having a display on which an ultrasound image is generated, the ultrasound display having a depth axis and an azimuth axis, may then comprise viewing the ultrasound image; selecting an area of interest in the ultrasound image; and automatically setting the location of the focal positions of beams of transmitted ultrasound based on a characteristic of the selected area of interest of the ultrasound image.
The act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on the location of the selected area of interest may comprise automatically setting the focal positions to a plurality of locations along the depth axis of the ultrasound display for respective locations along the azimuth axis of the ultrasound display.
The act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on the location of the selected area of interest may comprise setting respective locations of at least two focal positions along the depth axis of the ultrasound display. The act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on a characteristic of the selected area of interest may comprise automatically setting the locations of the focal positions to maximize the quality of the image in the selected area. The act of selecting an area of interest in the ultrasound image may comprise placing an identifying marking on the ultrasound display at a location corresponding to the location of the selected area of interest which segments the selected area of interest.
The act of automatically setting the location of the focal positions of the beams of transmitted ultrasound based on a characteristic of the selected area of interest may comprise automatically setting the number of focal positions and the location of each focal position based on a characteristic of the selected area of interest.
In an ultrasound imaging system of the invention the ultrasound transducers in the scanhead may be arranged in a linear array. In an ultrasound imaging system of the invention, the control line may be operable to apply signals to the beamformer to cause the beamformer to transmit at least two beams of ultrasound having respective focal positions, the positions of the focal positions being controlled by signals applied to the beamformer by the control line.
The ultrasound imaging system of the invention may further comprise a user interface device structured to allow a user to enter information identifying the plurality of areas of interest.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is an isometric view of a conventional diagnostic ultrasound imaging system.
Figure 2 is a schematic diagram illustrating the manner in which an ultrasound scanhead used in the system of Figure 1 transmits a beam of ultrasound energy responsive to electrical signals.
Figure 3 is a schematic diagram illustrating the manner in which an ultrasound scanhead used in the system of Figure 1 receives a beam of ultrasound energy and generates corresponding electrical signals.
Figure 4 is a schematic illustration of an ultrasound image shown on the viewing screen of a display in a conventional ultrasound imaging system.
Figure 5 is a schematic illustration of an ultrasound image shown on the viewing screen of a display in a conventional ultrasound imaging system in which the ultrasound image is generated using two separate focal positions.
Figure 6 is an isometric view of one embodiment of a diagnostic ultrasound imaging system according to the present invention.
Figure 7 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with one embodiment of the invention.
Figure 8 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with another embodiment of the invention.
Figure 9 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with a further embodiment of the invention.
Figure 10 is a schematic illustration of an ultrasound image shown on the viewing screen of the ultrasound imaging system of Figure 6 in accordance with still another embodiment of the invention.
Figure 11 is a block diagram of one embodiment of an imaging unit according to the present invention that may be used in the ultrasound imaging system of Figure 6.
DETAILED DESCRIPTION OF THE INVENTION
One embodiment of an ultrasound imaging system 100 according to the present invention is shown in Figure 6. The system 100 uses many of the same components used in the imaging system 10 of Figure 1. Therefore, in the interests of brevity, these components have been provided with the same reference numerals, and an explanation of their function and operation will not be repeated. The system 100 differs from the system 10 shown in Figure 1 primarily by using an imaging unit 106 that is different from the imaging unit 30 used in the system 10. The components used in the imaging unit 106 will be explained below with reference to Figure 11. The system 100 can generate an ultrasound image 110 in accordance with one embodiment of the invention, as shown in Figure 7. The system 100 generates the image 110 by automatically analyzing the image 110 to locate an area of interest 112 by suitable means. The system 100 may perform this analysis using presently existing algorithms for analyzing ultrasound images, although subsequently developed algorithms for analyzing ultrasound images may also be used. In one embodiment, the algorithm analyzes the image 110 to locate the brightest area, and this area is selected as the area of interest. In another embodiment, "tissue specific imaging" is used in which the characteristics of expected areas of interest are defined by the type of image being obtained. With "tissue specific imaging," available on the Philips HDI 5000 ultrasound system, the user selects the anatomy to be imaged and the ultrasound system automatically initializes itself with preferred parameters for scanning the anatomy. For example, the user may select "obstetric imaging" or "cardiac imaging." In obstetric scanning, the expected areas of interest will be in relatively bright areas of the image. For cardiac imaging, the area of interest can be defined by image characteristics of specific areas of the heart, such as the left ventricle or a mitral valve.
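As one hedged example of such a "brightest area" analysis (the patent does not specify a particular algorithm), the sketch below averages brightness over azimuth for each depth row and returns the depth of the strongest band; the window size and names are illustrative.

```python
import numpy as np

def brightest_region_depth(image, depth_axis_m, window_rows=16):
    """Pick an area of interest as the depth band with the highest mean brightness.

    Average brightness over azimuth for each depth row, smooth with a
    sliding window, and return the depth of the strongest band.
    """
    row_brightness = image.mean(axis=1)                       # mean over azimuth
    kernel = np.ones(window_rows) / window_rows
    smoothed = np.convolve(row_brightness, kernel, mode="same")
    return depth_axis_m[int(np.argmax(smoothed))]

img = np.random.rand(256, 128)
img[120:140, :] += 2.0                                        # simulate a bright structure
depths = np.linspace(0, 80e-3, 256)
print(brightest_region_depth(img, depths))                    # roughly the depth of rows 120-140
```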
Based on factors such as the size of the area of interest, the location of bright objects in the image, the identity of the anatomy of interest such as a fetus in obstetric imaging or the heart in cardiac imaging, statistical analysis (homogeneity, distinctiveness of features, etc.) or other factors such as the current depth of focus of the transmit beam, the system 100 determines the number of focal positions that should optimally be used to generate the image 110. The system 100 then places cursors 114, 116 on the right hand side of the viewing screen 44 to indicate the position of each transmit focal position. In one embodiment of the invention, the user may manually adjust these automatically determined positions by suitable means, such as by manipulating a control on the control panel 38. In another embodiment of the invention shown in Figure 8, the system 100 allows a user to designate an area of interest 120 in an image 122 by suitable means, such as by using a pointing device like a mouse, trackball, or light pen, or by touching the displayed area of interest on a touch-panel display. The system 100 then determines from this user input the location of the transmit focal position, or the number and locations of multiple transmit focal positions, that will be used to generate a subsequent version of the image 122. The system also preferably delineates the area of interest 120 selected by the user with a segmenting border 126 or other indicia, and places a cursor 128 on the right hand side of the viewing screen 44 to show the position of the focal position along the depth axis.
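One simple way to turn an area of interest into a focal-zone plan is sketched below; the per-zone depth of field, the cap on the number of zones, and the even spacing are assumptions for illustration, not values from the disclosure.

```python
import numpy as np

def plan_focal_zones(roi_top_m, roi_bottom_m,
                     zone_depth_of_field_m=15e-3, max_zones=4):
    """Choose how many transmit focal zones to use and where to place them.

    The depth span of the area of interest is divided into as many zones
    as needed for each zone's depth of field to cover its share, capped
    at max_zones, and one focus is centred in each sub-span.
    """
    span = roi_bottom_m - roi_top_m
    n_zones = min(max_zones, max(1, int(np.ceil(span / zone_depth_of_field_m))))
    edges = np.linspace(roi_top_m, roi_bottom_m, n_zones + 1)
    return 0.5 * (edges[:-1] + edges[1:])

# A vessel spanning 30-65 mm with roughly 15 mm usable depth of field per zone -> 3 zones.
print(plan_focal_zones(30e-3, 65e-3))
```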
A preferred technique for automated border segmentation is described in U.S. patent application serial number 09/732,613, entitled "Automated Border Detection in Ultrasonic Diagnostic Images." As described therein, the image data of a structure such as the heart is analyzed to identify specific anatomical features, such as the location of the mitral valve plane and the boundaries of the myocardium. If diagnosis of the mitral valve performance is of interest to the clinician, for instance, the identification of the mitral valve plane will identify the depth of the mitral valve in the ultrasound image. Automated border detection may be used to identify other structures in the body, such as anatomy of the fetus in an obstetrical patient. An embodiment of the present invention using automated border detection identifies the anatomy of interest, then the depth of the identified anatomy is used to set the number and/or location of the focal position(s) in an ultrasound image. Since automated border detection can operate on time sequential images, the technique can be employed to analyze image data periodically or continuously, and therefore can track anatomical features over time. This information can be used to update the focal position periodically or continuously, thus constantly optimizing the focus even in the presence of scanhead or anatomical motion. Preferably some form of hysteresis or integration is used so that the focal position is not changed and does not appear to jitter for small motional changes.
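The hysteresis and integration mentioned above can be sketched as a small tracker that only commits a new focal depth when the detected anatomy has moved by more than a threshold; the threshold and smoothing factor below are illustrative assumptions, not disclosed values.

```python
class FocusTracker:
    """Update the transmit focal depth only when the tracked anatomy has moved
    by more than a threshold, so small motions do not make the focus jitter."""

    def __init__(self, initial_depth_m, threshold_m=3e-3, smoothing=0.3):
        self.focus_m = initial_depth_m
        self.threshold_m = threshold_m
        self.smoothing = smoothing            # exponential integration factor

    def update(self, detected_depth_m):
        # Integrate the border-detection result to suppress frame-to-frame noise.
        blended = (1 - self.smoothing) * self.focus_m + self.smoothing * detected_depth_m
        # Hysteresis: commit a new focus only for a sufficiently large change.
        if abs(blended - self.focus_m) > self.threshold_m:
            self.focus_m = blended
        return self.focus_m

tracker = FocusTracker(initial_depth_m=50e-3)
for detected in (50.5e-3, 51e-3, 70e-3, 70e-3):
    print(round(tracker.update(detected) * 1e3, 1), "mm")   # holds at 50 until the move is large
```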
As previously mentioned, the frame rate of the system 100 decreases with an increase in the number of focal positions used. In one embodiment, the user may limit the number of focal positions which the system may use, specify the minimum acceptable frame rate, or select the degree to which the automatic selection of multiple transmit focal positions is discouraged. In this embodiment, the system 100 may select the use of multiple focal positions, but it will do so only where the use of multiple transmit focal positions is very important to the quality of the image 110. In other embodiments where frame rate performance limitations do not exist, the system 100 may automatically select the use of multiple focal positions whenever multiple focal positions would noticeably improve the quality of the image.
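A hedged sketch of this frame-rate trade-off follows: with N transmit focal zones each scanline is fired roughly N times, so the affordable number of zones can be capped from a user-specified minimum frame rate. The numbers and names are illustrative only.

```python
def max_zones_for_frame_rate(single_zone_frame_rate_hz, min_frame_rate_hz,
                             requested_zones):
    """Cap the number of transmit focal zones so the frame rate stays acceptable.

    With N zones each scanline is transmitted roughly N times, so the frame
    rate is about the single-zone rate divided by N (other overheads ignored).
    """
    affordable = max(1, int(single_zone_frame_rate_hz // min_frame_rate_hz))
    return min(requested_zones, affordable)

# 40 Hz at a single zone, user accepts no less than 15 Hz -> at most 2 zones.
print(max_zones_for_frame_rate(40.0, 15.0, requested_zones=3))
```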
In another embodiment of the invention shown in Figure 9, the system 100 either automatically selects an area of interest 130 in an image 132 or allows a user to designate the area of interest 130 by suitable means, such as by using a pointing device or by delineating the area with a segmenting border 134. The system 100 then automatically analyzes the quality of the image 132 in the area of interest 130 by a conventional or hereinafter developed algorithm, such as by analyzing the sharpness of image detail. After each analysis of the area of interest 130, the system 100 alters the position of the focal position and/or the number of focal positions used until the optimum position of the focal position is determined. The system 100 also preferably places cursors 136, 138 at the right hand edge of the viewing screen to indicate the number of focal positions used to create the image 132 and their respective positions. Also, techniques can be used to limit the amount of processing needed to determine the location of focal position(s) by, for example, limiting the frequency at which the focal position(s) are calculated or using hysteresis or thresholding so that a new focal position is calculated only for relatively large changes in factors used to determine the position of the focal position(s).
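One way such a quality analysis could work is sketched below, using mean gradient magnitude in the region of interest as a stand-in sharpness metric and a hypothetical acquire_frame(depth) callback that returns a frame acquired with the transmit focus at that depth; both the metric and the callback are assumptions, not the patent's algorithm.

```python
import numpy as np

def roi_sharpness(image, roi_rows, roi_cols):
    """Mean gradient magnitude inside the region of interest, used here as a
    simple stand-in for 'sharpness of image detail'."""
    roi = image[roi_rows[0]:roi_rows[1], roi_cols[0]:roi_cols[1]].astype(float)
    gy, gx = np.gradient(roi)
    return float(np.hypot(gx, gy).mean())

def best_focal_depth(acquire_frame, candidate_depths_m, roi_rows, roi_cols):
    """Acquire a frame at each candidate transmit focal depth, score it in the
    area of interest, and return the depth giving the sharpest detail."""
    scores = [roi_sharpness(acquire_frame(d), roi_rows, roi_cols)
              for d in candidate_depths_m]
    return candidate_depths_m[int(np.argmax(scores))]

# Dummy acquisition: detail contrast is best when the focus is near 40 mm.
rng = np.random.default_rng(0)
acquire = lambda d: rng.random((256, 128)) * (1.0 - abs(d - 40e-3) / 40e-3)
print(best_focal_depth(acquire, [20e-3, 40e-3, 60e-3], (100, 150), (40, 90)))
```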
In still another embodiment of the invention shown in Figure 10, the system 100 solves the aforementioned problem of selecting the location of a transmit focal position where two areas of interest are located at different depths. The areas of interest may be located either automatically or through user input, as explained above. As shown in Figure 10, an image 140 on the viewing screen 44 includes tissues 142 containing a first blood vessel 144 and a second blood vessel 146. The first blood vessel 144 is positioned at a first location along an azimuth axis 150, i.e., at the left hand side of the viewing screen 44, and at a first location along a depth axis 152, i.e., near the bottom of the viewing screen 44. The second blood vessel 146 is positioned at a second location along the azimuth axis 150, i.e., at the right hand side of the viewing screen 44, and at a second location along the depth axis 152, i.e., toward the top of the viewing screen 44. The different locations of the vessels 144, 146 along the depth axis 152 preclude any single location of a focal position from being optimum for imaging both vessels 144, 146. Two different focal positions may be used, one located at the depth of the first blood vessel 144 and the other located at the depth of the second blood vessel 146. However, this approach would reduce the frame rate of the imaging system 100. According to one embodiment of the invention, after the areas of interest, e.g., the blood vessels 144, 146, are selected either automatically or manually, as previously described, the system 100 automatically alters the position of one or more focal positions along the depth axis 152 as a function of the position of each area of interest along the azimuth axis 150. When the system 100 is imaging the blood vessel 144 on the left hand side of the image 140, the system 100 selects a location for the focal position that corresponds to the position of the blood vessel 144 along the depth axis of the image 140. When the system 100 is imaging the blood vessel 146 on the right hand side of the image 140, the system 100 selects a location for the focal position that corresponds to the position of the blood vessel 146 along the depth axis of the image 140. As a result, the position of the focal position is dynamically variable along the depth axis 152, which can even be accomplished within a common image frame, a benefit made possible when the areas of interest occupy different lateral areas of the image.
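The mapping from lateral position to focal depth can be illustrated with the following sketch, which is not part of the original disclosure; the region boundaries and depths are assumed values chosen only to mirror the two-vessel example of Figure 10.

# Illustrative sketch: choose a transmit focal depth per scanline so that lines
# passing through the left-hand (deep) vessel focus deep and lines through the
# right-hand (shallow) vessel focus shallow, within the same frame.
def focal_depth_for_scanline(azimuth_mm, regions, default_depth_mm):
    # regions: list of (azimuth_min_mm, azimuth_max_mm, focal_depth_mm)
    for az_min, az_max, depth in regions:
        if az_min <= azimuth_mm <= az_max:
            return depth
    return default_depth_mm

regions = [(-20.0, -5.0, 60.0),    # left-hand vessel, near the bottom of the image
           (5.0, 20.0, 25.0)]      # right-hand vessel, toward the top of the image
line_depths = [focal_depth_for_scanline(az, regions, default_depth_mm=40.0)
               for az in range(-20, 21, 5)]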
The embodiment of the invention explained with reference to Figure 10 may also be combined with other embodiments. For example, each of the areas of interest in the image 140 may be selected by automatically analyzing the image 140 as explained above with reference to Figure 7 or by allowing a user to designate the areas of interest as explained above with reference to Figure 8. The dynamically varying locations of the focal position may also be set using the techniques explained above, such as by optimizing the quality of the image 140 in predetermined areas as explained above with reference to Figure 9. Other combinations and alternatives will be apparent to one skilled in the art.
In operation, the user may manually initiate setting the location of the one or more focal positions by suitable means, such as by pressing a key on the control panel 38 (Figure 1). Alternatively, the system 100 may operate in the background to periodically set the location of the one or more focal positions. Other means of initiating the setting of the focal positions can also be used.
One embodiment of the imaging unit 106 used in the system of Figure 6 is shown in Figure 11. The imaging unit 106 includes an acquisition subsystem 200 that includes a beamformer 204 coupled to the scanhead 20 through the cable 26. Electrical signals from the transducer elements of the scanhead 20 are applied to the beamformer 204, which processes the signals corresponding to the echoes of each acquired scanline into a beam. As explained above, the beamformer 204 also applies electrical signals to the transducer elements of the scanhead 20 to cause the scanhead to transmit a beam of ultrasound energy. The beamformer 204 controls the respective delays of the signals applied to the transducer elements of the scanhead 20 to focus the transmit beam to a specific depth. The location of the focal position is controlled by control data applied through a control line 210.
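For readers unfamiliar with transmit focusing, the following sketch (an idealized textbook formulation, not an implementation of the beamformer 204 itself) shows how per-element delays can be computed so that a linear array focuses at a chosen depth directly ahead of the aperture center; the element pitch and sound speed are assumed values.

# Illustrative sketch: per-element transmit delays for focusing a linear array
# at a given depth on the array axis.
import numpy as np

def transmit_focus_delays(num_elements, pitch_m, focal_depth_m, c_m_per_s=1540.0):
    # Element x-positions, centered on the aperture.
    x = (np.arange(num_elements) - (num_elements - 1) / 2.0) * pitch_m
    # Path length from each element to the focal point.
    path = np.sqrt(focal_depth_m ** 2 + x ** 2)
    # Outer elements have longer paths and must fire first, so the center
    # elements receive the largest delays; shift so the minimum delay is zero.
    return (path.max() - path) / c_m_per_s

delays_s = transmit_focus_delays(num_elements=128, pitch_m=0.3e-3, focal_depth_m=0.04)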
The received signals from the beamformer 204 are applied to a Signal & Image Processing subsystem 220, which includes a conventional signal processor 224 and a conventional scan converter 226. The signal processor 224 receives the signals from the beamformer 204 to generate image data corresponding to an image, such as the images shown in Figures 7-10. The signal processor 224 may also analyze the image data corresponding to a predetermined portion of the image to determine the optimum location of the focal position or the optimum number of focal positions and their optimum locations, as explained above with reference to Figures 7-10. The signal processor 224 also interfaces with the control panel 38 (not shown) to receive user input, such as a command to initiate the focal position adjustment process or information designating an image area that should be analyzed for quality, as explained above. After the signal processor 224 determines the optimum location of the focal position or the optimum number of focal positions and their optimum locations, it applies appropriate control data to the beamformer over the control line 210 to control the location of the focal position(s). The signal processor 224 can also couple other data to the beamformer 204, such as data controlling the sizes of the transmit and receive apertures. The image data from the signal processor are then applied to the scan converter 226, which arranges the image data into respective ultrasonic image frame data of the desired image format.
The image frame data from the Signal & Image Processing subsystem 220 are then transferred to a display subsystem 230, which includes a video processor 232 and the display monitor 40. The video processor 232 converts the image frame data from the scan converter 226 into appropriate video signals, such as NTSC or SVGA video signals, for use by the display monitor 40.
From the foregoing it will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without deviating from the spirit and scope of the invention. Accordingly, the invention is not limited except as by the appended claims.

Claims

CLAIMS:
1. A method of automatically setting focal positions of a set of beams of transmitted ultrasound in an ultrasound imaging system having a display on which an ultrasound image is generated, the ultrasound display having a depth axis and an azimuth axis, the method comprising: automatically analyzing the ultrasound image to identify an area of interest in the ultrasound image and/or selecting an area of interest in the ultrasound image; and automatically setting the location of the focal positions of the beams of transmitted ultrasound to correspond to the location of the area of interest of the ultrasound image.
2. The method of claim 1 wherein the act of automatically setting the location of the focal positions of the beams of transmitted ultrasound comprises automatically setting respective locations of at least two focal positions along the depth axis of the ultrasound display.
3. The method of claim 2 wherein the act of setting respective locations of at least two focal positions along the depth axis of the ultrasound display comprises setting the respective locations of the focal positions along the depth axis of the ultrasound display for respective locations along the azimuth axis of the ultrasound display.
4. The method of claim 1 wherein the act of automatically setting the location of the focal positions of the beams of transmitted ultrasound to correspond to the location of the area of interest of the ultrasound image comprises automatically setting the number of focal positions and the location of each focal position.
5. The method of claim 1 wherein the act of selecting an area of interest comprises selecting a plurality of areas of interest in the ultrasound image along both the depth axis and the azimuth axis, and wherein the act of automatically setting the location of the focal positions of the beams of transmitted ultrasound comprises automatically setting the location of a respective focal position of the beams of transmitted ultrasound based on a characteristic of each of the selected areas of interest of the ultrasound image.
6. An ultrasound imaging system comprising: a scanhead having an array of ultrasound transducers; a beamformer coupled to the scanhead, the beamformer applying electrical signals to the scanhead to cause the scanhead to transmit beams of ultrasound and receiving electrical signals from the scanhead responsive to ultrasound echoes received by the scanhead, the beams of ultrasound transmitted by the scanhead being focused in a focal position determined by a control signal applied to the beamformer; an image processor coupled to receive signals from the beamformer, the image processor converting the signals to image data corresponding to an ultrasound image, the image processor being operable to analyze the image data to identify an area of interest in an ultrasound image corresponding to the image data, the image processor further being operable to generate the control signal to set the location of the focal position of the beams of transmitted ultrasound to correspond to the location of the area of interest of the ultrasound image; and a display coupled to the image processor, the display generating an ultrasound image corresponding to the image data.
7. The ultrasound imaging system of claim 6 wherein the image processor is further operable to apply signals to the beamformer to cause the beamformer to transmit at least two beams of ultrasound having respective focal positions, the positions of the focal positions being controlled by signals applied to the beamformer by the image processor.
8. The ultrasound imaging system of claim 6 wherein the image processor is further operable to apply signals to the beamformer to set the respective locations of the focal positions along a depth axis of the ultrasound display.
9. The ultrasound imaging system of claim 6 wherein the image processor is operable to analyze the ultrasound image to identify a predetermined image characteristic.
10. The ultrasound imaging system of claim 9, wherein the image processor comprises an automated border detection processor which is operable to automatically identify a predetermined image feature.
11. An ultrasound imaging system comprising: a scanhead having an array of ultrasound transducers; a beamformer coupled to the scanhead, the beamformer applying electrical signals to the scanhead to cause the scanhead to transmit a beam of ultrasound and receiving electrical signals from the scanhead responsive to ultrasound echoes received by the scanhead, the beam of ultrasound transmitted by the scanhead being focused in a focal position determined by a control signal applied to the beamformer; a user interface device structured to allow a user to enter information, including information identifying an area of interest in an ultrasound image along both a depth axis and an azimuth axis; an image processor coupled to receive signals from the beamformer and from the user interface, the image processor converting the signals to image data corresponding to an ultrasound image; a control line, responsive to the user interface device, the control line further being operable to couple the control signal to set the location of the focal position of the beam of transmitted ultrasound based on the entered information; and a display coupled to the image processor, the display generating an ultrasound image corresponding to the image data.
12. The ultrasound imaging system of claim 11 wherein the user interface device is structured to allow a user to enter information identifying a plurality of areas of interest in an ultrasound image, and wherein the control line is operable to apply signals to the beamformer to cause the beamformer to set the locations of respective focal positions of beams of transmitted ultrasound.
13. An ultrasound imaging system comprising: a scanhead having an array of ultrasound transducers; a beamformer coupled to the scanhead, the beamformer applying electrical signals to the scanhead to cause the scanhead to transmit a beam of ultrasound and receiving electrical signals from the scanhead responsive to ultrasound echoes received by the scanhead, the beam of ultrasound transmitted by the scanhead being focused in a focal position determined by a control signal applied to the beamformer; an image processor coupled to receive signals from the beamformer and from the user interface, the image processor converting the signals to image data corresponding to an ultrasound image, the image processor further being operable to generate the control signal to set the location of the focal position of the beam of transmitted ultrasound to correspond to the locations of a plurality of areas of interest along the depth axis as a function of the locations of the areas of interest along the azimuth axis; and a display coupled to the image processor, the display generating an ultrasound image corresponding to the image data.
14. The ultrasound imaging system of claim 13 wherein the image processor is operable to analyze the ultrasound image and to identify the plurality of areas of interest based on the analysis.
15. The ultrasound imaging system of claim 13 wherein the image processor is operable to apply signals to the beamformer to cause the beamformer to set respective locations of at least two focal positions along the depth axis of the ultrasound display.
EP20020783462 2001-12-14 2002-11-26 Ultrasound imaging system and method Expired - Lifetime EP1458294B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/023,080 US6544179B1 (en) 2001-12-14 2001-12-14 Ultrasound imaging system and method having automatically selected transmit focal positions
US23080 2001-12-14
PCT/IB2002/005025 WO2003051202A1 (en) 2001-12-14 2002-11-26 Ultrasound imaging system and method

Publications (2)

Publication Number Publication Date
EP1458294A1 true EP1458294A1 (en) 2004-09-22
EP1458294B1 EP1458294B1 (en) 2011-01-12

Family

ID=21813014

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20020783462 Expired - Lifetime EP1458294B1 (en) 2001-12-14 2002-11-26 Ultrasound imaging system and method

Country Status (8)

Country Link
US (1) US6544179B1 (en)
EP (1) EP1458294B1 (en)
JP (1) JP2005511235A (en)
CN (1) CN100346749C (en)
AT (1) ATE494838T1 (en)
AU (1) AU2002347530A1 (en)
DE (1) DE60238942D1 (en)
WO (1) WO2003051202A1 (en)

Families Citing this family (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080262356A1 (en) * 2002-06-07 2008-10-23 Vikram Chalana Systems and methods for ultrasound imaging using an inertial reference unit
US20060025689A1 (en) * 2002-06-07 2006-02-02 Vikram Chalana System and method to measure cardiac ejection fraction
US7819806B2 (en) * 2002-06-07 2010-10-26 Verathon Inc. System and method to identify and measure organ wall boundaries
US20070276247A1 (en) * 2002-06-07 2007-11-29 Vikram Chalana Systems and methods for ultrasound imaging using an inertial reference unit
US20090112089A1 (en) * 2007-10-27 2009-04-30 Bill Barnard System and method for measuring bladder wall thickness and presenting a bladder virtual image
US20100036252A1 (en) * 2002-06-07 2010-02-11 Vikram Chalana Ultrasound system and method for measuring bladder wall thickness and mass
US20040127797A1 (en) * 2002-06-07 2004-07-01 Bill Barnard System and method for measuring bladder wall thickness and presenting a bladder virtual image
US7520857B2 (en) * 2002-06-07 2009-04-21 Verathon Inc. 3D ultrasound-based instrument for non-invasive measurement of amniotic fluid volume
US8221321B2 (en) 2002-06-07 2012-07-17 Verathon Inc. Systems and methods for quantification and classification of fluids in human cavities in ultrasound images
US8221322B2 (en) * 2002-06-07 2012-07-17 Verathon Inc. Systems and methods to improve clarity in ultrasound images
US20090062644A1 (en) * 2002-06-07 2009-03-05 Mcmorrow Gerald System and method for ultrasound harmonic imaging
GB2391625A (en) * 2002-08-09 2004-02-11 Diagnostic Ultrasound Europ B Instantaneous ultrasonic echo measurement of bladder urine volume with a limited number of ultrasound beams
US6669639B1 (en) * 2002-10-08 2003-12-30 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system with articulating display
US6692441B1 (en) * 2002-11-12 2004-02-17 Koninklijke Philips Electronics N.V. System for identifying a volume of interest in a volume rendered ultrasound image
EP1631895A2 (en) * 2003-05-27 2006-03-08 Koninklijke Philips Electronics N.V. Diagnostic imaging system control with multiple control functions
WO2005050571A2 (en) * 2003-11-21 2005-06-02 Koninklijke Philips Electronics, N.V. Ultrasound imaging system and method having adaptive selection of image frame rate and/or number of echo samples averaged
JP4643172B2 (en) * 2004-04-21 2011-03-02 株式会社東芝 Portable diagnostic imaging equipment
EP1779784B1 (en) * 2004-06-07 2015-10-14 Olympus Corporation Electrostatic capacity type ultrasonic transducer
EP1621897B1 (en) * 2004-07-28 2010-07-28 Medison Co., Ltd. Ultrasound imaging apparatus having a function of selecting transmit focal points and method thereof
US7831081B2 (en) * 2005-08-15 2010-11-09 Boston Scientific Scimed, Inc. Border detection in medical image analysis
CN100484479C (en) 2005-08-26 2009-05-06 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic image enhancement and spot inhibition method
WO2007072362A2 (en) * 2005-12-19 2007-06-28 Koninklijke Philips Electronics, N.V. Automatic ultrasound scanning initiated by protocol stage
US8177718B2 (en) * 2006-03-01 2012-05-15 Koninklijke Philips Electronics N.V. Linear array ultrasound transducer with variable patch boundaries
JP5348829B2 (en) * 2006-06-08 2013-11-20 株式会社東芝 Ultrasonic diagnostic apparatus, ultrasonic image display method, and ultrasonic image display program
US8167803B2 (en) * 2007-05-16 2012-05-01 Verathon Inc. System and method for bladder detection using harmonic imaging
US8225998B2 (en) * 2008-07-11 2012-07-24 Es&S Innovations Llc Secure ballot box
JP5658151B2 (en) * 2008-08-07 2015-01-21 ベラソン インコーポレイテッドVerathon Inc. Apparatus, system and method for measuring the diameter of an abdominal aortic aneurysm
CN102078202A (en) * 2009-11-30 2011-06-01 Ge医疗系统环球技术有限公司 Method and ultrasonic imaging device for distinguishing artery and vein
US20110245641A1 (en) * 2010-03-31 2011-10-06 Nellcor Puritan Bennett Llc Monitor With Multi-Position Base
JP5560134B2 (en) 2010-08-03 2014-07-23 富士フイルム株式会社 Ultrasonic image generator
JP2013090827A (en) * 2011-10-26 2013-05-16 Fujifilm Corp Ultrasonic diagnostic equipment and ultrasonic image generation method
JP6049371B2 (en) * 2011-11-09 2016-12-21 東芝メディカルシステムズ株式会社 Ultrasound diagnostic system
CN102793564B (en) * 2012-07-30 2015-06-10 飞依诺科技(苏州)有限公司 Parameter automatic optimization method of multi-angle M model for ultrasonic imaging
WO2014033584A1 (en) * 2012-08-30 2014-03-06 Koninklijke Philips N.V. Coupled segmentation in 3d conventional ultrasound and contrast-enhanced ultrasound images
KR101250481B1 (en) 2012-09-14 2013-04-03 주식회사 오스테오시스 Method for driving ultrasonic apparatus
JP2014124429A (en) * 2012-12-27 2014-07-07 Seiko Epson Corp Ultrasonic measurement device, program and ultrasonic measurement method
KR101654674B1 (en) * 2013-11-28 2016-09-06 삼성전자주식회사 Method and ultrasound apparatus for providing ultrasound elastography
CN105640587A (en) * 2014-11-12 2016-06-08 Ge医疗系统环球技术有限公司 Method and device enhancing intervention apparatus in ultrasonic image
KR20160066928A (en) * 2014-12-03 2016-06-13 삼성전자주식회사 Apparatus and method for computer aided diagnosis, Apparatus for controlling ultrasonic transmission pattern of probe
US20200037984A1 (en) * 2017-02-14 2020-02-06 Koninklijke Philips N.V. Focus tracking in ultrasound system for device tracking
JP7167045B2 (en) * 2017-03-10 2022-11-08 コーニンクレッカ フィリップス エヌ ヴェ Location devices and systems for positioning acoustic sensors
CN107037130B (en) * 2017-06-09 2019-09-20 长春理工大学 Monocular vision three-D ultrasonic nondestructive detection system and detection method
EP3469993A1 (en) 2017-10-16 2019-04-17 Koninklijke Philips N.V. An ultrasound imaging system and method
JP7033430B2 (en) * 2017-10-19 2022-03-10 富士フイルムヘルスケア株式会社 Ultrasound imaging device and ultrasonic imaging method
US11607194B2 (en) * 2018-03-27 2023-03-21 Koninklijke Philips N.V. Ultrasound imaging system with depth-dependent transmit focus
CN109044398B (en) * 2018-06-07 2021-10-19 深圳华声医疗技术股份有限公司 Ultrasound system imaging method, device and computer readable storage medium
KR102631789B1 (en) * 2018-07-30 2024-02-01 삼성메디슨 주식회사 Ultrasound imaging apparatus and controlling method thereof
CN110465008B (en) * 2019-08-28 2021-02-12 黄晶 Focused ultrasound treatment system
CN114431892A (en) * 2020-10-30 2022-05-06 通用电气精准医疗有限责任公司 Ultrasonic imaging system and ultrasonic imaging method
CN112472139B (en) * 2020-12-18 2023-01-03 深圳市德力凯医疗设备股份有限公司 Imaging parameter configuration method, storage medium and ultrasonic equipment

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH064074B2 (en) * 1983-02-14 1994-01-19 株式会社日立製作所 Ultrasonic diagnostic device and sound velocity measuring method using the same
US5072735A (en) 1988-06-21 1991-12-17 Kabushiki Kaisha Toshiba Ultrasonic imaging apparatus
EP0355176B1 (en) * 1988-08-17 1992-12-16 Siemens Aktiengesellschaft Device for the contactless desintegration of concrements in a living thing body
JPH02193065A (en) * 1989-01-20 1990-07-30 Canon Inc Ultrasonic apparatus
JPH04116458A (en) 1990-09-07 1992-04-16 Olympus Optical Co Ltd Scanning acoustic microscope
US5357962A (en) 1992-01-27 1994-10-25 Sri International Ultrasonic imaging system and method wtih focusing correction
JPH10507936A (en) 1994-08-05 1998-08-04 アキュソン コーポレイション Method and apparatus for a transmit beam generator system
US5551433A (en) 1994-08-05 1996-09-03 Acuson Corporation Method and apparatus for a geometric aberration transform in an adaptive focusing ultrasound beamformer system
US5581517A (en) 1994-08-05 1996-12-03 Acuson Corporation Method and apparatus for focus control of transmit and receive beamformer systems
JP3327129B2 (en) 1996-07-22 2002-09-24 松下電器産業株式会社 Ultrasound diagnostic equipment
US5740805A (en) 1996-11-19 1998-04-21 Analogic Corporation Ultrasound beam softening compensation system
US6106465A (en) 1997-08-22 2000-08-22 Acuson Corporation Ultrasonic method and system for boundary detection of an object of interest in an ultrasound image
JP4116143B2 (en) * 1998-04-10 2008-07-09 株式会社東芝 Ultrasonic diagnostic equipment
US6168564B1 (en) 1998-10-02 2001-01-02 Sci-Med Life Systems, Inc. Steerable transducer array for intracardial ultrasonic imaging
US6077226A (en) * 1999-03-30 2000-06-20 General Electric Company Method and apparatus for positioning region of interest in image
US6193660B1 (en) * 1999-03-31 2001-02-27 Acuson Corporation Medical diagnostic ultrasound system and method for region of interest determination
US6315723B1 (en) 1999-10-08 2001-11-13 Atl Ultrasound Ultrasonic diagnostic imaging system with synthesized transmit focus
US6217516B1 (en) 1999-11-09 2001-04-17 Agilent Technologies, Inc. System and method for configuring the locus of focal points of ultrasound beams

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO03051202A1 *

Also Published As

Publication number Publication date
DE60238942D1 (en) 2011-02-24
US6544179B1 (en) 2003-04-08
CN1604756A (en) 2005-04-06
WO2003051202A1 (en) 2003-06-26
ATE494838T1 (en) 2011-01-15
AU2002347530A1 (en) 2003-06-30
EP1458294B1 (en) 2011-01-12
CN100346749C (en) 2007-11-07
JP2005511235A (en) 2005-04-28

Similar Documents

Publication Publication Date Title
US6544179B1 (en) Ultrasound imaging system and method having automatically selected transmit focal positions
US6629929B1 (en) Method and apparatus for automatically setting the transmit aperture and apodization of an ultrasound transducer array
KR102223048B1 (en) Region of interest placement for quantitative ultrasound imaging
US6464641B1 (en) Method and apparatus for automatic vessel tracking in ultrasound imaging
EP1715360B1 (en) Ultrasound diagnostic apparatus and ultrasound image processing program
EP1041395A2 (en) Method and apparatus for positioning region of interest in image
EP2633818B1 (en) Ultrasonic diagnostic apparatus
WO2009074948A1 (en) Robotic ultrasound system with microadjustment and positioning control using feedback responsive to acquired image data
JP2007513726A (en) Ultrasound imaging system with automatic control of penetration, resolution and frame rate
US20100099987A1 (en) Ultrasonic diagnostic device, ultrasonic image processing apparatus, ultrasonic image acquiring method and ultrasonic diagnosis display method
US5538003A (en) Quick method and apparatus for identifying a region of interest in an ultrasound display
CN101467892B (en) Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image processing method
JP7325544B2 (en) Method and system for guiding the acquisition of cranial ultrasound data
KR20050013602A (en) Ultrasound quantification in real-time using acoustic data in more than two dimensions
US7108658B2 (en) Method and apparatus for C-plane volume compound imaging
EP0414261B1 (en) Ultrasonic diagnosing apparatus
US20210338203A1 (en) Systems and methods for guiding the acquisition of an ultraound image
US11627943B2 (en) Ultrasound imaging system and method for deriving depth and identifying anatomical features associated with user identified point or region
EP3941356B1 (en) Methods and systems for adjusting the field of view of an ultrasound probe
CN110575198A (en) Analysis device and analysis method
EP4331499A1 (en) Ultrasound imaging systems and methods
JP2023551705A (en) Analysis of ultrasound image data of rectus abdominis muscle
JP2000005175A (en) Ultrasonic diagnostic device
JP2007044354A (en) Ultrasonic diagnostic equipment and ultrasonic diagnostic equipment control program

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040714

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

17Q First examination report despatched

Effective date: 20070530

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 60238942

Country of ref document: DE

Date of ref document: 20110224

Kind code of ref document: P

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 60238942

Country of ref document: DE

Effective date: 20110224

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20110112

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110512

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110413

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110423

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110412

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

26N No opposition filed

Effective date: 20111013

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 60238942

Country of ref document: DE

Effective date: 20111013

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20111130

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20111130

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20111130

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20120731

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20111126

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20111130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20111126

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110112

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 60238942

Country of ref document: DE

Representative's name: VOLMER, GEORG, DIPL.-ING., DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 60238942

Country of ref document: DE

Representative's name: MEISSNER BOLTE PATENTANWAELTE RECHTSANWAELTE P, DE

Effective date: 20140328

Ref country code: DE

Ref legal event code: R082

Ref document number: 60238942

Country of ref document: DE

Representative's name: MEISSNER, BOLTE & PARTNER GBR, DE

Effective date: 20140328

Ref country code: DE

Ref legal event code: R082

Ref document number: 60238942

Country of ref document: DE

Representative's name: VOLMER, GEORG, DIPL.-ING., DE

Effective date: 20140328

Ref country code: DE

Ref legal event code: R081

Ref document number: 60238942

Country of ref document: DE

Owner name: KONINKLIJKE PHILIPS N.V., NL

Free format text: FORMER OWNER: KONINKLIJKE PHILIPS ELECTRONICS N.V., EINDHOVEN, NL

Effective date: 20140328

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 60238942

Country of ref document: DE

Representative's name: MEISSNER BOLTE PATENTANWAELTE RECHTSANWAELTE P, DE

Ref country code: DE

Ref legal event code: R082

Ref document number: 60238942

Country of ref document: DE

Representative's name: MEISSNER, BOLTE & PARTNER GBR, DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20211129

Year of fee payment: 20

Ref country code: GB

Payment date: 20211123

Year of fee payment: 20

REG Reference to a national code

Ref country code: DE

Ref legal event code: R071

Ref document number: 60238942

Country of ref document: DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20221125

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20221125