US20030007074A1 - Vehicle zone monitoring apparatus

Vehicle zone monitoring apparatus

Info

Publication number
US20030007074A1
Authority
US
United States
Prior art keywords
image
vehicle
straight line
line segment
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/171,007
Inventor
Nobuharu Nagaoka
Takayuki Tsuji
Masahito Watanabe
Hiroshi Hattori
Kouzou Simamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Arriver Software AB
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Assigned to HONDA GIKEN KOGYO KABUSHIKI KAISHA reassignment HONDA GIKEN KOGYO KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HATTORI, HIROSHI, NAGAOKA, NOBUHARU, SIMAMURA, KOUZOU, TSUJI, TAKAYUKI, WATANABE, MASAHITO
Publication of US20030007074A1
Priority claimed by application US12/287,433 (issued as US8144195B2)
Assigned to ARRIVER SOFTWARE AB reassignment ARRIVER SOFTWARE AB ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: VEONEER SWEDEN AB

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Definitions

  • the present invention relates to a vehicle zone monitoring apparatus that detects physical bodies such as other vehicles, pedestrians, and animals that are present in the vicinity of the vehicle.
  • Japanese Unexamined Patent Application, First Publication, No. 2001-6069 is a known example of a zone monitoring apparatus that detects physical bodies present in the vicinity of a vehicle that move, such as pedestrians and animals.
  • This zone monitoring apparatus calculates the distance between the objects in the vicinity of the vehicle and the vehicle from images obtained by two infrared cameras, and then calculates the motion vector of the objects from the position data of the objects found in a time sequence.
  • the apparatus detects the objects having a high possibility of colliding with the vehicle from the relationship between the direction of progress of the vehicle and the movement vector of the object.
  • Japanese Unexamined Patent Application, First Publication, No. 2001-108758 discloses technology in which objects are detected by eliminating zones that exhibit temperatures clearly different from the physical body temperature of a pedestrian from an infrared image photographed by a photographing device provided on the vehicle.
  • the apparatus determines whether or not the object is a pedestrian by further identifying the aspect ratio of the object.
  • a second aspect of the present invention is characterized in comprising an artificial structure eliminating device (for example, step S 30 in the embodiments) that eliminates the objects identified to be artificial structures using the artificial structure identifying device from the objects extracted by the object extracting device.
  • a third aspect of the present invention is characterized in that said reference images include an image representing a line segment, and said artificial structure identifying device identifies objects that include a line segment.
  • a fourth aspect of the invention is characterized in that said artificial structure identifying device comprises a reference image dimension altering device (for example, steps S 32 , S 52 , and S 72 in the embodiment) that alters the size of said reference image so as to conform to the distance between said vehicle and said object.
  • FIG. 1 is a block diagram showing the structure of the vehicle zone monitoring apparatus according to the embodiment of the present invention.
  • FIG. 2 is a drawing showing the installation positions of the infrared cameras, sensors, display and the like in the vehicle.
  • FIG. 3 is a flowchart showing the processing sequence for all operations in the image processing unit of the vehicle zone monitoring apparatus of this embodiment.
  • FIG. 4A and FIG. 4B are drawings showing the gray scale image obtained by the infrared camera and the binary image thereof.
  • FIG. 5A, FIG. 5B, and FIG. 5C are drawings showing the conversion processing and labeling for the run length data.
  • FIGS. 6A and 6B are drawings showing the time trace of the object.
  • FIG. 7 is a drawing showing the turning angle compensation of the object image.
  • FIGS. 8A and 8B are drawings showing the search image in the right image and the search zone set in the left image.
  • FIG. 9 is a drawing showing the correlation calculation processing that uses the search zone as an object.
  • FIGS. 10A and 10B are drawings showing the calculation method for object parallax in the distance calculation of the object.
  • FIG. 11 is a drawing showing the offset of the object position in the image generated by the turning of the vehicle.
  • FIG. 12 is a flowchart showing the details of the warning determining processing in the image processing unit of the vehicle zone monitoring apparatus of this embodiment.
  • FIG. 13 is a drawing showing the zone partition in the forward direction of the vehicle.
  • FIG. 14 is a drawing showing the case in which a collision can occur easily.
  • FIG. 15 is a flowchart showing the details of the detection processing of a vertical linear part in the image processing unit of the vehicle zone monitoring apparatus of the embodiment.
  • FIG. 16 is a flowchart showing the details of the process of detecting the horizontal linear part in the image processing unit of the vehicle zone monitoring apparatus of this embodiment.
  • FIG. 17 is a flowchart showing the details of the process of detecting the quadrangle part in the image processing unit of the vehicle zone monitoring apparatus of this embodiment.
  • FIGS. 18A and 18B are drawings showing the details of the vertical linear part extracted pattern in the search of the image.
  • FIG. 19 is a drawing showing the search of the reference pattern for the vertical part search.
  • FIGS. 20A and 20B are drawings showing the details of the vertical linear part extracted pattern in the search of the image.
  • FIGS. 21A and 21B are drawings showing the details of the quadrangle part extracted pattern in the search of the image.
  • FIG. 22 is a drawing showing an example of a highway structure obtained by the infrared camera.
  • FIG. 23 is a flowchart showing the details of the process of detecting identical shapes in the image processing unit of the vehicle zone monitoring apparatus of this embodiment.
  • FIGS. 24A and 24B are drawings showing the search of the object pattern for detecting identical shapes.
  • FIG. 1 is a block diagram showing the structure of the vehicle zone monitoring apparatus according to the embodiment of the present invention.
  • reference numeral 1 is an image processing unit that provides a CPU (Central Processing Unit) that controls the vehicle zone monitoring apparatus of this embodiment, and has connected thereto two infrared cameras 2R and 2L that can detect infrared light, a yaw rate sensor 3 that detects the yaw rate of the vehicle body, a velocity sensor 4 that detects the traveling velocity (vehicle velocity) of this vehicle, and a brake sensor 5 for detecting the operation of the brake.
  • the image processing unit 1 detects an object that moves, such as a pedestrian or animal, in front of the vehicle from signals representing the infrared image in the vicinity of the vehicle and the travel state of the vehicle, and determines when the possibility of a collision is high.
  • Connected to the image processing unit 1 are a speaker 6 that issues a warning by voice, and an image display apparatus 7 including, for example, a meter display integrated with a meter that numerically represents the travel state of the vehicle, a NAVIDisplay disposed on the console of the vehicle, and a HUD (Head Up Display) 7a.
  • the image processing unit 1 comprises an A/D conversion circuit that converts an input analog signal into a digital signal, an image memory that stores the digitized image signals, a CPU (Central Processing Unit) that carries out each type of operation processing, RAM (Random Access Memory) used to store data that the CPU is currently processing, ROM (Read Only Memory) that stores programs executed by the CPU, tables, maps, and the like, and an output circuit that outputs a driving signal for the speaker 6, display signals for the HUD 7a, and the like; it is structured such that the output signals of the infrared cameras 2R and 2L, the yaw rate sensor 3, the velocity sensor 4, and the brake sensor 5 are converted to digital signals and input into the CPU.
  • the infrared cameras 2 R and 2 L are disposed at substantially symmetrical positions with respect to the center part in the transverse direction of the vehicle 10 , and the optical axes of the two infrared cameras 2 R and 2 L are parallel to each other.
  • the height of both cameras from the road surface is fixed so as to be equal.
  • the infrared cameras 2 R and 2 L have the property that the output signal level increases (the brightness increases) as the temperature of the object increases.
  • the HUD 7 a is provided so that the display image is displayed at a position on the front window of the vehicle 10 that does not block the forward visual horizon of the driver.
  • FIG. 3 is a flowchart showing the processing sequence in the image processing unit 1 of the vehicle zone monitoring apparatus of this embodiment.
  • the image processing unit 1 obtains an infrared image, which is the output signal of the infrared cameras 2R and 2L (step S1), carries out A/D conversion thereon (step S2), and stores the gray scale image in the image memory (step S3). Moreover, the right image from the infrared camera 2R is obtained first, and then the left image from the infrared camera 2L is obtained. In addition, in the right image and the left image, the horizontal position of the same object on the display screen is displayed with an offset, and thus, from this offset (parallax), the distance to the object can be calculated.
  • the right image obtained by the infrared camera 2R serves as the reference image
  • binary processing of this image signal is carried out, that is, processing in which zones brighter than a brightness threshold ITH are assigned "1" (white) and darker zones are assigned "0" (black) (step S4).
  • FIG. 4A shows the gray scale image obtained from the infrared camera 2 R, and by carrying out binary processing thereon, the image in FIG. 4B is obtained. Moreover, in FIG. 4B, the physical bodies surrounded by the frames P 1 to P 4 are set as the objects displayed in white on the display screen (below, referred to as the “high brightness zone”).
  • step S 5 When the image data that has undergone binary processing has been obtained from the infrared camera, processing is carried out in which the binary image data is converted to run length data (step S 5 ).
  • FIG. 5A is a drawing to explain this, and in this figure the zone that has become white due to binary conversion is shown as the lines L 1 to L 8 .
  • Lines L 1 to L 8 all have a width of 1 pixel in the y direction, and while they are actually arranged without a space between them in the y direction, they have been separated for the sake of the explanation.
  • the lines L 1 to L 8 respectively have the lengths 2 pixels, 2 pixels, 3 pixels, 8 pixels, 7 pixels, 8 pixels, 8 pixels, and 8 pixels.
  • the run length data is shown by the coordinates of the start point of each of the lines (the point on the left end of each line) and the length (number of pixels) from the start point to the end point (the point on the right end of each line).
  • line L 3 comprises the 3 pixels (x 3 , y 5 ), (x 4 , y 5 ), and (x 5 , y 5 ), and thus (x 3 , y 5 , 3 ) becomes the run length data.
  • step S 7 the processing in which the object is extracted is carried out. That is, as shown in FIG. 5B, among the lines L 1 to L 8 that have been converted to run length data, the lines L 1 to L 3 , which are the parts overlapping in the y direction, are treated as one object 1 , lines L 4 to L 8 are treated as one object 2 , and the object labels 1 and 2 are added to the run length data.
  • the high brightness zones shown in FIG. 4B are respectively recognized as objects 1 through 4 .
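  • As a rough illustration of the run length encoding and labeling of steps S5 to S7, the following sketch (in Python; the helper names and the simple union-find grouping are illustrative assumptions, not the patent's implementation) encodes each row of the binary image as (x_start, y, length) runs and groups runs on adjacent rows whose x ranges overlap into one labeled object.

```python
def to_run_length(binary):
    """Encode each row of a 0/1 image as (x_start, y, length) runs (step S5)."""
    runs = []
    for y, row in enumerate(binary):
        x = 0
        while x < len(row):
            if row[x] == 1:
                start = x
                while x < len(row) and row[x] == 1:
                    x += 1
                runs.append((start, y, x - start))  # e.g. (x3, y5, 3) for line L3
            else:
                x += 1
    return runs


def label_objects(runs):
    """Group runs on adjacent rows whose x ranges overlap into objects (steps S6-S7)."""
    labels = list(range(len(runs)))  # each run starts as its own label

    def find(i):
        while labels[i] != i:
            i = labels[i]
        return i

    for i, (xi, yi, li) in enumerate(runs):
        for j, (xj, yj, lj) in enumerate(runs):
            if yj == yi + 1 and xi < xj + lj and xj < xi + li:  # adjacent rows, overlapping x
                labels[find(j)] = find(i)
    return [find(k) for k in range(len(runs))]


binary = [[0, 0, 1, 1, 0],
          [0, 1, 1, 1, 0]]
runs = to_run_length(binary)
print(runs, label_objects(runs))  # the two overlapping runs receive the same label
```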
  • step S 8 When the extraction of the objects has completed, as shown in FIG. 5C, next the center of gravity G, surface area S, and the aspect ratio ASPECT of the circumscribed quadrangle represented by the broken lines is calculated (step S 8 ).
  • the surface area S is calculated by adding the lengths of the run length data for the same object.
  • the coordinate of the center of gravity G is calculated as the x coordinate of the line that bisects the surface area S in the x direction, and the y coordinate of the line that bisects it in the y direction.
  • the aspect ratio ASPECT is calculated as the Dy/Dx ratio of Dy and Dx shown in FIG. 5C.
  • the position of the center of gravity G can be substituted for by the position of the center of gravity of the circumscribed quadrangle.
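  • The sketch below illustrates step S8 using the run length representation above; the center of gravity is approximated here by the pixel centroid of the runs, whereas the text describes the lines that bisect the surface area, and the function name is illustrative.

```python
def object_features(runs):
    """runs: list of (x_start, y, length) belonging to one labeled object."""
    S = sum(length for _, _, length in runs)  # surface area = sum of run lengths

    # center of gravity G, approximated by the centroid of all object pixels
    gx = sum((x + (length - 1) / 2.0) * length for x, _, length in runs) / S
    gy = sum(y * length for _, y, length in runs) / S

    # circumscribed quadrangle and aspect ratio ASPECT = Dy / Dx
    xs = [x for x, _, _ in runs] + [x + length - 1 for x, _, length in runs]
    ys = [y for _, y, _ in runs]
    Dx = max(xs) - min(xs) + 1
    Dy = max(ys) - min(ys) + 1
    return S, (gx, gy), Dy / Dx
```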
  • step S9 next, recognition of the same object over the time trace, that is, at each sampling cycle, is carried out (step S9).
  • k denotes the discretized time obtained by sampling the analog time t at the sampling cycle, and as shown in FIG. 6A, in the case that objects A and B are extracted at time k, objects C and D extracted at time (k+1) are determined to be identical to objects A and B.
  • objects A and B are determined to be identical to objects C and D
  • objects C and D have their labels changed respectively to objects A and B.
  • the object A and the object C satisfy the conditions for the identification of identity described above and the object B and the object D satisfy the conditions for the identification of identity described above, and thus the objects C and D are respectively recognized to be the objects A and B.
  • the position coordinates (of the center of gravity) of each of the recognized objects are stored in the memory as time series position data to be used in later calculation processing.
  • step S10 the velocity VCAR detected by the velocity sensor 4 and the yaw rate YR detected by the yaw rate sensor 3 are read, and as shown in FIG. 7, the turning angle θr of the vehicle 10 is calculated by integrating the yaw rate YR with respect to time (step S10).
  • step S9 and step S10 are carried out in parallel with steps S11 through S13, in which processing that calculates the distance z between the object and the vehicle 10 is carried out. Because this calculation requires a longer time than step S9 and step S10, steps S11 to S13 are executed at a longer cycle than steps S9 and S10 (for example, at a cycle about three times the execution cycle of steps S1 to S10).
  • a search zone for finding the image corresponding to the searched image (below, referred to as the "corresponding image") is set in the left image, and the corresponding image is extracted by executing the correlation calculation (step S12).
  • the searched zone R 2 in the left image is set, and the brightness difference total value C (a, b), which indicates the degree of the correlation with the searched image R 1 in the searched zone R 2 , is calculated by the Eq. 1 shown below, and the zone in which this total value C (a, b) becomes minimum is extracted as the corresponding image.
  • this correlation calculation is carried out using the gray scale image, not the binary image.
  • a zone R 2 a (shown by the broken line in FIG. 8B) that is narrower than the searched zone R 2 is set to serve as the search zone.
  • IR(m, n) is the brightness value at the position of the coordinates (m, n) in the searched image R1 shown in FIG. 9
  • IL(a+m−M, b+n−N) is the brightness value at the position of the coordinates (m, n) in the local zone R3, which has the same shape as the searched image R1, where the coordinates (a, b) in the search zone serve as the base point.
  • the position of the corresponding image is defined by finding the position at which the total value C (a, b) of the brightness difference is minimized by changing the coordinates (a, b) of the base point.
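  • Eq. 1 itself is not reproduced in this text; the sketch below assumes it is a sum of absolute brightness differences (SAD) between the searched image R1 and a local zone R3 of the same shape in the left gray scale image, minimized over the base point (a, b) inside the search zone. The function name and the NumPy formulation are illustrative.

```python
import numpy as np


def find_corresponding_image(IR, IL, search_zone):
    """IR: searched image R1 (gray scale patch); IL: left gray scale image;
    search_zone: (x0, y0, x1, y1) region R2a to scan. Returns the base point
    (a, b) minimizing the brightness difference total value C(a, b)."""
    h, w = IR.shape
    x0, y0, x1, y1 = search_zone
    best_C, best_ab = None, None
    for b in range(y0, y1 - h + 1):
        for a in range(x0, x1 - w + 1):
            C = np.abs(IL[b:b + h, a:a + w].astype(int) - IR.astype(int)).sum()
            if best_C is None or C < best_C:
                best_C, best_ab = C, (a, b)
    return best_ab, best_C
```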
  • step S 12 Due to the processing in step S 12 , as shown in FIG. 10A and FIG. 10B, because the searched image R 1 and the corresponding image R 4 corresponding to this object are extracted, next the distance dR (number of pixels) between the position of the center of gravity of the searched image R 1 and the image center line LCTR and the distance dL (number of pixels) between the position of the center of gravity of the corresponding image R 4 and the image center line LCTR are found, and by applying the following Eq. 2, the distance z between the vehicle 10 and the object is calculated (step S 13 ).
  • z = (B × F) / ((dL + dR) × p) = (B × F) / (Δd × p)   Eq. 2
  • B is the base line length, that is, the distance in the horizontal direction between the center position of the photographic element of the infrared camera 2 R and the center position of the photographic element of the infrared camera 2 L (the separation of the light beam axis of both infrared cameras);
  • F is the focal distance of the lenses of the infrared cameras 2 R and 2 L,
  • p is the pixel separation in the photographic elements of the infrared cameras 2R and 2L, and Δd (= dL + dR) is the parallax.
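  • A minimal sketch of the distance calculation of Eq. 2, with illustrative parameter names; it simply evaluates z = (B × F) / (Δd × p) with the parallax Δd = dR + dL measured in pixels.

```python
def object_distance(dR, dL, B, F, p):
    """dR, dL: pixel offsets of the searched/corresponding images from the image
    center line LCTR; B: base line length; F: focal length; p: pixel pitch
    (B, F and p in the same length unit)."""
    dd = dR + dL               # parallax in pixels
    return (B * F) / (dd * p)  # Eq. 2: distance z between the vehicle 10 and the object
```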
  • step S10 When the calculation of the turning angle θr in step S10 and the calculation of the distance to the object in step S13 have completed, the coordinates (x, y) in the image and the distance z calculated by Eq. 2 are applied to the following Eq. 3, and converted to real spatial coordinates (X, Y, Z) (step S14).
  • the real spatial coordinates (X, Y, Z) have as their origin O the position of the center point of the installation position of the infrared cameras 2 R and 2 L (the position at which they are fixed on the vehicle 10 ), they are fixed as shown in the figure, and the coordinates in the image are determined by x in the horizontal direction and y in the vertical direction, where the center of the image serves as the origin.
  • [X, Y, Z] = [xc × z / f, yc × z / f, z]   Eq. 3
  • (xc, yc) are the coordinates (x, y) of the right image that have been converted to the coordinates of a virtual image in which the real spatial origin O and the center of the image have been made to coincide, based on the relative positional relationship between the installation position of the infrared camera 2R and the real spatial origin O.
  • f is the ratio of the focal length F to the pixel interval p (f = F / p).
  • turning angle compensation is carried out in order to compensate for the positional shift in the image caused by the turning of the vehicle 10 (step S15).
  • [Xr, Yr, Zr] = [[cos θr, 0, −sin θr], [0, 1, 0], [sin θr, 0, cos θr]] × [X, Y, Z]   Eq. 4
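  • The following sketch combines Eq. 3 and Eq. 4: image coordinates (xc, yc) and the distance z are mapped to real spatial coordinates (X, Y, Z) with f = F/p, and the result is then rotated about the Y axis by the turning angle θr. Function names are illustrative.

```python
import math


def to_real_space(xc, yc, z, f):
    """Eq. 3: virtual image coordinates (xc, yc) and distance z to (X, Y, Z)."""
    return (xc * z / f, yc * z / f, z)


def turning_angle_compensation(X, Y, Z, theta_r):
    """Eq. 4: rotate (X, Y, Z) about the Y axis by the turning angle theta_r."""
    Xr = math.cos(theta_r) * X - math.sin(theta_r) * Z
    Yr = Y
    Zr = math.sin(theta_r) * X + math.cos(theta_r) * Z
    return (Xr, Yr, Zr)
```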
  • u is a parameter that takes an arbitrary value
  • Xav, Yav, and Zav are respectively the average values of the X coordinate, Y coordinate, and Z coordinate of the real spatial position data sequence.
  • the numerical value in the parentheses added to P, which denotes the coordinates of each data point, indicates that the larger the value, the older the data.
  • P(0) denotes the most recent position coordinate
  • P(1) denotes the position coordinate of one sample cycle back
  • P(2) denotes the position coordinate two sample cycles back.
  • the relative motion vector is found as the vector from the position coordinate Pv(N−1) calculated in Eq. 8 towards Pv(0).
  • step S 16 when the relative motion vector has been found, next the possibility of a collision with the detected object is determined, and a warning determination process, which issues a warning when the possibility is high, is executed (step S 17 ).
  • step S 17 of the flowchart shown in FIG. 3 will be explained with reference to the flowchart shown in FIG. 12.
  • the image processing unit 1 calculates the relative velocity Vs in the Z direction using the following Eq. 7 from the fact that the animal 20 has approached from the distance Zv(N−1) to the distance Zv(0) during the time ΔT, and carries out collision determination processing (step S21).
  • the collision determination processing is processing that determines that there is a possibility of a collision when both of the following Equations 8 and 9 are satisfied.
  • step S 21 in the case it has been determined that there is a possibility of a collision with the animal 20 (YES in step S 21 ), the flow proceeds next to step S 22 .
  • step S 21 when Eq. 8 and/or Eq. 9 are not satisfied, it is determined that there is no possibility of a collision with the animal 20 (NO in step S 21 ), and the warning determination processing completes.
  • Vs = (Zv(N−1) − Zv(0)) / ΔT   Eq. 7
  • Zv(0) is the most recent distance detection value (v is attached in order to indicate that this is data after compensation using the approximated straight line LMV, while the Z coordinate is a value identical to that before compensation), and Zv(N−1) is the detected distance value before the time ΔT.
  • T is an allowable time and signifies that the possibility of a collision is determined time T before the predicted collision time, and is about 2 to 5 seconds, for example.
  • H is a predetermined height that defines the range of the Y direction, that is the height direction, and is set, for example, to about twice the height of the vehicle 10 .
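  • Equations 8 and 9 are not reproduced in this text; the sketch below assumes they state that the predicted time to collision Zv(0)/Vs is at most the allowance time T and that the height coordinate Yv(0) is within the range H, with Vs given by Eq. 7. Names are illustrative.

```python
def collision_possible(Zv0, ZvN1, dT, T, H, Yv0):
    """Zv0: most recent compensated distance Zv(0); ZvN1: distance Zv(N-1) a time dT earlier."""
    Vs = (ZvN1 - Zv0) / dT  # Eq. 7: relative (closing) velocity in the Z direction
    if Vs <= 0:
        return False        # the object is not approaching
    # assumed forms of Eq. 8 (time to collision within T) and Eq. 9 (height within H)
    return Zv0 / Vs <= T and abs(Yv0) <= H
```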
  • step S22 next it is determined whether or not the object is within the approach determination zone (step S22).
  • AR1 is the zone corresponding to the range obtained by adding the allowance β (for example, about 50 to 100 cm) to both sides of the width α of the vehicle 10, or in other words, the zone having a width (α/2 + β) on both sides of the central axis of the vehicle 10 in the width direction; if the object continues to be present there, the possibility of a collision is extremely high.
  • these zones are called approach determination zones.
  • the zones AR2 and AR3 are zones (laterally outside the approach determination zone) in which the absolute value of the X coordinate is larger than that of the approach determination zone; because an invasive collision determination, described below, is made for objects inside these zones, they are called the invasive determination zones.
  • these zones have a predetermined height H in the Y direction, as shown in the above Eq. 9.
  • step S 21 The answer in the above step S 21 becomes affirmative (YES) in the case that an object is present in either the approaching determination zone AR 1 or the invasive determination zones AR 2 and AR 3 .
  • step S 22 it is determined whether or not the object is in the approaching determination zone AR 1 , and in the case that it is determined that the object is in the approaching determination zone AR 1 (YES in step S 22 ), the flow proceeds directly to step S 24 . In contrast, in the case that it is determined the object is not in the approaching determination zone AR 1 (NO in step S 22 ), invasive collision determination processing is carried out (step S 23 ).
  • the invasive collision determination processing in step S23 distinguishes whether or not the difference between xc(0), which is the most recent x coordinate on the image (the character c, as will be explained below, is attached in order to signify that it is a coordinate on which compensation has been carried out that makes the center position of the image align with the real spatial origin point O), and xc(N−1), which is the x coordinate before the time ΔT, satisfies the following Eq. 10, and in the case that it is satisfied, it is determined that the possibility of a collision is high.
  • step S 23 in the case that it has been determined that the possibility of a collision is high (YES in step S 23 ), the flow proceeds to step S 24 . In contrast, in the case that it has been determined that the possibility of a collision is low (NO in step S 23 ), the warning determination processing completes.
  • step S24 a warning output determination process is carried out, that is, it is determined whether or not to carry out a warning output (step S24).
  • the warning output determination process determines whether or not the driver of the vehicle 10 is carrying out a braking action from the output BR of the brake sensor 5 .
  • the predetermined threshold value GTH is determined by the following Eq. 11. This is the value corresponding to the condition in which the vehicle 10 stops at a running distance equal to or less than the distance Zv(0) in the case that the acceleration Gs during the braking action is maintained as-is.
  • GTH = Vs² / (2 × Zv(0))   Eq. 11
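  • A minimal sketch of the warning output determination of step S24 combined with Eq. 11; the exact comparison logic is an assumption based on the surrounding description (no warning is output when the driver is already braking hard enough to stop within Zv(0)).

```python
def warning_should_be_output(braking, Gs, Vs, Zv0):
    """braking: whether the brake sensor 5 reports a braking action;
    Gs: current deceleration; Vs: relative velocity; Zv0: most recent distance Zv(0)."""
    if not braking:
        return True                  # no braking action: issue the warning
    GTH = Vs ** 2 / (2.0 * Zv0)      # Eq. 11: deceleration needed to stop within Zv(0)
    return Gs <= GTH                 # braking is insufficient: issue the warning
```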
  • step S25 in the identification of the shape of the object, there are the following steps: identifying whether or not a part indicating a straight line segment is included in the image of the object (step S25); whether or not an angle in the image of the object is a right angle (step S26); whether or not the image of the object conforms to the shape of a pre-registered artificial structure (step S27); and whether or not a plurality of identical shapes are included in the image of the object (step S28).
  • step S 25 it is identified whether or not a part indicating a straight line segment is included in the image of the object.
  • step S 25 in the case that a part indicating a straight line segment is not included in the image of the object (NO in step S 25 ), it is identified whether or not an angle in the image of the object is a right angle (step S 26 ).
  • step S 26 in the case that an angle in the image of the object is not a right angle (NO in step S 26 ), it is identified whether or not the image of the object conforms to the shape of a pre-registered artificial structure (step S 27 ).
  • step S 27 in the case that the image of the object does not conform to the shape of a pre-registered artificial structure (NO in step S 27 ), whether or not pluralities of identical shapes are included in the image of the object is identified (step S 28 ).
  • step S28 in the case that a plurality of identical shapes are not included in the image of the object (NO in step S28), the possibility that the object is a pedestrian or an animal is high, and thus a warning is issued by voice through the speaker 6, and at the same time, by the image display apparatus 7, for example, the image obtained by the infrared camera 2R is displayed, and the approaching object is given a highlighted display (for example, highlighted by being surrounded by a frame) (step S29).
  • step S25 in the case that a part indicating a straight line segment is included in the image of the object (YES in step S25), or in step S26, in the case that an angle in the image of the object is a right angle (YES in step S26), or in step S27, in the case that the image of the object conforms to the shape of a pre-registered artificial structure (YES in step S27), or further, in step S28, in the case that a plurality of identical shapes are included in the image of the object (YES in step S28), the object is treated as an artificial structure and is eliminated from the objects extracted in step S7 of FIG. 3 (step S30), no warning is issued, and the warning determination processing is completed (a sketch of this decision cascade follows below).
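  • A sketch of the decision cascade of steps S25 to S30, with the four shape tests passed in as placeholder predicates standing for the processing of FIGS. 15 to 17 and FIG. 23; it is illustrative, not the patent's implementation.

```python
def warning_determination(obj, shape_tests):
    """shape_tests: predicates for step S25 (straight line segment), S26 (right angle),
    S27 (registered artificial structure shape) and S28 (identical shapes), in order."""
    if any(test(obj) for test in shape_tests):
        return "artificial structure: excluded, no warning (step S30)"
    return "warn and highlight the object (step S29)"


# usage with dummy tests: only the identical-shape test fires
print(warning_determination("object",
                            [lambda o: False, lambda o: False,
                             lambda o: False, lambda o: True]))
```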
  • step S 25 the method for the identification of the shape of the object in FIG. 12 described above, and in particular, the search processing for straight line segments and right angle segments in step S 25 , step S 26 , step S 28 , and step S 30 will be explained with reference to the figures.
  • FIG. 15, FIG. 16, FIG. 17, and FIG. 23 are flowcharts showing in further detail the processing in step S 25 , step S 26 , step S 28 , and a part of the processing of step S 30 .
  • FIG. 15 is a flowchart showing the vertical straight line segment identification.
  • step S 31 the image of the object and the right straight line segment image pattern, which is a reference image for carrying out correlation calculation, are selected (step S 31 ), and depending on the distance between the vehicle 10 and the object found in step S 13 of the flowchart shown in FIG. 3, the pattern size of the reference image is determined so as to be in proportion to the size of the image of the real space projected onto the display image (step S 32 ).
  • step S 33 next the search zone in proximity to the object is set.
  • the setting of the search zone is carried out as follows. Specifically, as shown in FIG. 19, the outline that has been subject to binary extraction (the binary object image 100) does not necessarily correctly represent the outline of the object 101. Therefore, ranges of a [pixels] are set above, below, to the left, and to the right with respect to the center 102 of the circumscribed quadrangle of the binary object image (OBJ) 100, and the search range 103 for the correlation calculation comprises these four a × a [pixel] ranges.
  • step S 35 it is identified whether or not a part having a high correlation with the right straight line segment extraction pattern “Pat_Line_R” is present (step S 35 ).
  • step S 35 in the case that a part having a high correlation with the right straight line segment extraction pattern “Pat_Line_R” is present (YES in step S 35 ), in order to determine whether or not the part having a high correlation and the object 101 are identical physical bodies, the distance of OBJ_Pat 104 is calculated in the same manner as the calculation of the distance of the object by the above Eq. 2 (step S 36 ).
  • in the case that the object 101 and OBJ_Pat 104 are identical physical bodies, their distances from the vehicle 10 should coincide; thus, by comparing the calculated parallaxes Δd and Δd_P instead of comparing distances, it can be identified whether or not the object 101 and OBJ_Pat 104 are identical physical bodies (step S37). Specifically, using the following Eq. 14, it is determined whether or not the parallax error is smaller than an allowable value TH (see the sketch below).
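  • Eq. 14 is not reproduced in this text; the check below assumes it simply compares the parallax Δd of the object with the parallax Δd_P of the matched part OBJ_Pat against the allowable value TH.

```python
def same_physical_body(dd, dd_P, TH):
    """Assumed form of Eq. 14: the parallax error between the object (dd) and the
    matched pattern part OBJ_Pat (dd_P) must be smaller than the allowable value TH."""
    return abs(dd - dd_P) < TH
```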
  • step S37 in the case that it is identified that the object 101 and the OBJ_Pat 104 are identical physical bodies (YES in step S37), it is determined that there is a vertical straight line segment in the object 101 (step S38); because it has a vertical straight line segment, the object is treated as an artificial highway structure (step S39), and the vertical straight line segment determination completes.
  • step S 35 in the case that a part having a high correlation with the right straight line segment extraction pattern “Pat_Line_R” is not present (NO in step S 35 ), or in step S 37 , in the case that the object 101 and the OBJ_Pat 104 are not identified as identical physical bodies (NO in step S 37 ), the flow proceeds to step S 40 , and it is identified whether or not the reference pattern used in the correlation calculation is a left straight line segment image pattern (step S 40 ).
  • step S 40 in the case that the reference pattern used in the correlation calculation was not a left straight line segment image pattern (NO in step S 40 ), the left straight line segment image pattern prepared in advance is selected (step S 41 ), and the flow returns to step S 32 .
  • step S32 and step S33 described above the same action is carried out on the left straight line segment image pattern as the action carried out on the right straight line segment image pattern, and the a × b [pixel] left straight line segment extraction pattern "Pat_Line_L" extracted from the left straight line segment image pattern shown in FIG. 18B serves as the reference pattern.
  • step S 34 from inside the search range 103 in proximity to the object, the part (OBJ_Pat) having a high correlation with the left straight line segment extraction pattern “Pat_Line_L” is searched for using correlation calculation.
  • step S 35 to step S 39 As a result of the correlation calculation using the left straight line segment extraction pattern, the actions from step S 35 to step S 39 described above are carried out, and when a vertical straight line segment is identified to be present in the object 101 , the object 101 is treated as an artificial road structure, and the vertical straight line segment determination completes.
  • step S 40 when the flow proceeds to the determination of step S 40 again, the search of the vertical straight line segment by both the right straight line segment extraction pattern and the left straight line segment extraction pattern has already completed (YES in step S 40 ), no vertical straight line segment is identified as being present (step S 42 ), and the flow proceeds to the horizontal straight line segment identification.
  • the reason that a correlation calculation is carried out using both the right straight line segment extraction pattern and the left straight line segment extraction pattern, and that the distance between the vehicle 10 and the respective parts having high correlations is compared to the distance between the object and the vehicle 10, is that, in the case that a plurality of objects overlap and are recognized as one object, there is the possibility that the right or left straight line segments detected in the vertical straight line segment identification are not parts of the object subject to collision determination. Therefore, the distance between the object and the vehicle 10 is compared to the distance between the vehicle 10 and the detected right or left straight line segment, and it is identified whether both belong to the same physical body (the overall flow is sketched below).
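  • A rough sketch of the flow of FIG. 15 (steps S31 to S42) under the assumptions above: the right pattern is tried first and then the left pattern, and a candidate counts only if its parallax agrees with that of the object. The search callable stands in for steps S32 to S36 and is hypothetical.

```python
def has_vertical_straight_line(obj_parallax, search, TH):
    """search(pattern_name) returns the parallax of the best correlated part near the
    object, or None when no part with a high correlation is found (steps S32-S36)."""
    for pattern in ("Pat_Line_R", "Pat_Line_L"):   # steps S31 and S40-S41
        cand_parallax = search(pattern)
        if cand_parallax is not None and abs(obj_parallax - cand_parallax) < TH:
            return True                            # steps S37-S39: artificial structure
    return False                                   # step S42: no vertical line segment


# usage: a dummy search in which only the left pattern produces a consistent match
print(has_vertical_straight_line(10.0, lambda p: 9.8 if p == "Pat_Line_L" else None, 0.5))
```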
  • an upper edge straight line segment image pattern which is the reference image for carrying out correlation calculation on the image of the object, is selected (step S 51 ), and depending on the distance between the vehicle 10 and the object found in step S 13 in the flowchart shown in FIG. 3, the pattern size of the reference image is determined so as to be in proportion to the size of the image in real space projected on the display screen (step S 52 ).
  • the b × a [pixel] straight line segment pattern is extracted from the upper edge straight line segment image pattern prepared in advance, and the upper edge straight line segment extraction pattern "Pat_Line_U" serves as the reference pattern.
  • the b × a [pixel] lower edge straight line segment extraction pattern "Pat_Line_D" extracted from the lower edge straight line segment image pattern prepared in advance is shown in FIG. 20B.
  • step S 53 next the search zone in proximity to the object is set.
  • the setting of the search zone is also carried out similarly to the vertical straight line segment identification described above. That is, ranges of a [pixels] are set above, below, to the left, and to the right with respect to the center of the circumscribed quadrangle of the binary object image (OBJ), and the search range 103 for the correlation calculation comprises these four a × a [pixel] ranges.
  • step S55 it is identified whether or not a part having a high correlation with the upper edge straight line segment extraction pattern "Pat_Line_U" is present (step S55).
  • step S55 in the case that a part having a high correlation with the upper edge straight line segment extraction pattern "Pat_Line_U" is present (YES in step S55), it is identified that there is a horizontal straight line segment in the object (step S56); because it has a horizontal straight line segment, the object is treated as an artificial highway structure (step S57), and the horizontal straight line segment determination completes.
  • step S 55 in the case that a part having a high correlation with the upper edge straight line segment extracting pattern “Pat_Line_U” is not present (NO in step S 55 ), it is identified whether or not the reference pattern used in the correlation calculation is a lower edge straight line segment image pattern (step S 58 ).
  • step S 58 in the case that the reference pattern used in the correlation calculation is not the lower edge straight line segment image pattern (NO in step S 58 ), a lower edge straight line segment image pattern prepared in advance is selected (step S 59 ), and the flow returns to step S 52 .
  • step S 52 and step S 53 described above the same actions carried out on the upper edge straight line segment image pattern are carried out on the lower edge straight line segment image pattern, and the b ⁇ a [pixel] lower edge straight line segment extraction pattern “Pat_Line_D” extracted from the lower edge straight line segment image pattern shown in FIG. 20B serves as the reference pattern.
  • step S 54 the part (OBJ_Pat) having a high correlation with the lower edge straight line segment extraction pattern “Pat_Line_D” is found using the correlation calculation from within the search range in proximity to the object.
  • step S 55 to step S 57 the actions from step S 55 to step S 57 described above are carried out, and when it is identified that a horizontal straight line segment is present in the image, the object is treated as an artificial structure, and the horizontal straight line segment identification completes.
  • step S 58 when the flow proceeds to the identification of step S 58 again, because the search for horizontal straight line segments using both the upper edge straight line segment extraction pattern and the lower edge straight line segment extraction pattern have already completed (YES in step S 58 ), no horizontal straight line segments are identified as being present (step S 60 ), and the flow proceeds to the right angle segment identification.
  • the reason that the distance between the respective parts having a high correlation and the vehicle 10 is not found after carrying out the correlation calculation using both the upper edge straight line segment extraction pattern and the lower edge straight line segment extraction pattern is that, based on the principle of binocular vision using the left and right cameras, the distance of a horizontal straight line segment cannot be calculated. Therefore, unlike the case of the vertical straight line segment identification, the horizontal straight line segment identification is carried out based only on the correlation with the straight line pattern.
  • the image of the object and an upper-right right angle segment image pattern which is the reference image for carrying out the correlation calculation, are selected (step S 71 ).
  • the pattern size of the reference image is determined so as to be in proportion to the size of the image in real space projected on the display screen (step S 72 ).
  • FIG. 21A From the upper-right right angle segment image pattern prepared in advance, for example, the a × a [pixel] right angle segment pattern is extracted, and the upper-right right angle segment extraction pattern "Pat_Corner_R" serves as the reference pattern. Similarly, the a × a [pixel] upper-left right angle segment extraction pattern "Pat_Corner_L" extracted from the upper-left right angle segment image pattern prepared in advance is shown in FIG. 21B.
  • next the search zone in proximity to the object is set (step S 73 ).
  • the setting of the search zone is also carried out similarly to the vertical straight line segment identification and the horizontal straight line segment identification described above. That is, ranges of a [pixels] are set above, below, to the left, and to the right with respect to the center of the circumscribed quadrangle of the binary object image (OBJ), and these serve as the search range for the correlation calculation.
  • step S 75 it is determined whether or not a part having a high correlation with the upper-right right angle segment extraction pattern “Pat_Corner_R” is present.
  • step S75 in the case that a part having a high correlation with the upper-right right angle segment extraction pattern "Pat_Corner_R" is present (YES in step S75), the distance of the OBJ_Pat is calculated similarly to the distance calculation of the object using Eq. 2 above, in order to identify whether or not the part with a high correlation and the object are identical physical bodies (step S76).
  • in the case that the object and OBJ_Pat are identical physical bodies, their distances should coincide; thus, by comparing the detected parallaxes Δd and Δd_P instead of comparing the distances, it can be identified whether or not the object and OBJ_Pat are identical physical bodies (step S77). Specifically, using the above Eq. 14, it is identified whether or not the parallax error is smaller than an allowable value TH.
  • step S77 in the case that the object and OBJ_Pat are identified as being identical physical bodies (YES in step S77), a right angle segment is identified to be present in the object (step S78); because it has a right angle segment, the object is treated as an artificial highway structure (step S79), and the right angle segment identification completes.
  • step S 75 in the case that a part having a high correlation with the upper-right right angle segment extraction pattern “Pat_Corner_R” is not present (NO in step S 75 ), or in step S 77 , in the case that the object and OBJ_Pat are not identified as identical physical bodies (NO in step S 77 ), the flow proceeds to step S 80 , and it is identified whether or not the reference pattern used in the correlation calculation is an upper-left right angle segment image pattern (step S 80 ).
  • step S80 in the case that the reference pattern used in the correlation calculation is not the upper-left right angle segment image pattern (NO in step S80), the upper-left right angle segment image pattern prepared in advance is selected (step S81), and the flow returns to step S72.
  • step S72 and step S73 described above the same action carried out for the upper-right right angle segment image pattern is carried out for the upper-left right angle segment image pattern, and the a × a [pixel] upper-left right angle segment extraction pattern "Pat_Corner_L" extracted from the upper-left right angle segment image pattern shown in FIG. 21B serves as the reference pattern.
  • step S 74 the part having a high correlation with the upper-left right angle segment extraction pattern “Pat_Corner_L” is searched for using the correlation calculation from within the search zone in proximity to the object.
  • step S 75 to step S 79 are carried out, and when a right angle segment is identified as being present in the object, the object 101 is treated as an artificial highway structure, and the right angle segment identification completes.
  • step S 80 when the flow proceeds to the identification of step S 80 again, the search for right angle segments using both the upper-right right angle segment extraction pattern and the upper-left right angle segment extraction pattern has already completed (YES in step S 80 ), and thus no right angle segment is identified as being present (step S 82 ).
  • step S 83 it is determined that the object is not an artificial highway structure (step S 83 ), the right angle segment determination completes, and the processing in step S 27 to determine the shape of the object in FIG. 12 described above is executed.
  • the reason that the correlation calculation is carried out using both the upper-right right angle segment extraction pattern and the upper-left right angle segment extraction pattern, and that the distance between the parts having respective high correlations and the vehicle 10 is compared to the distance between the object and the vehicle 10, is the same as in the case of the vertical straight line segment identification.
  • the identification of identical shapes is a process in which a highway structure 50 (for example, an upper and lower round lens disposed in a traffic signal) structured from a plurality of physical bodies having an identical shape is searched for from among the infrared images obtained by the infrared cameras.
  • step S 91 the image of the object and an object pattern “Pat”, which is a reference image for carrying out the correlation calculation, are set (step S 91 ).
  • the object pattern “Pat” is the reference image that sets the zone one size larger than the binary object image (OBJ) 200 , as shown in FIG. 24B, in the case, for example, that the part of the lens in the highway structure 50 that emits heat is extracted as the binary object image (OBJ) 200 as shown in FIG. 24A.
  • step S 92 the search zone in proximity to the object is set.
  • the setting of the search zone is carried out as follows. Specifically, as shown in FIG. 24A, the search zone is set so that it extends a [pixels] in height above and below the binary object image 200 and b/2 [pixels] in width to the left and right of the center of the binary object image 200; these serve respectively as the upper search range 202 and the lower search range 203 for the correlation calculation.
  • a part (OBJ_Pat) having a high correlation with the object pattern “Pat” is searched for using the correlation calculation from within the upper search range 202 and the lower search range 203 in proximity to the object (step S 93 ).
  • step S 94 it is identified whether or not a part having a high correlation with the object pattern “Pat” is present.
  • step S 94 in the case that a part having a high correlation with the object pattern “Pat” is present (YES in step S 94 ), a shape identical to the object is identified as being present (step S 95 ), having an identical shape is treated as being an artificial highway structure (step S 96 ), and the identical shape identification completes. Moreover, in the example in FIG. 22, from the center of the infrared image, a highway structure (traffic signal) having a plurality (2) of identical objects (round lenses) is detected.
  • step S 94 in the case that a part having a high correlation with the object pattern “Pat” is not present (NO in step S 94 ), no shape identical to the object is identified as being present (step S 97 ), having no identical shape is treated as not being an artificial highway structure (step S 98 ), and the identical shape identification completes.
  • in this embodiment, the search zone for the object pattern was set in the vertical direction of the binary object image (OBJ) 200; however, because physical bodies having an identical shape may also be arranged left to right, after searching in the vertical direction, the search zone can be set to the left and right and the object pattern searched for again (a sketch of the vertical search follows below).
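  • A sketch of the identical shape search of FIG. 23 (steps S91 to S98); for brevity only one candidate position per search range is tested instead of scanning the full upper and lower search ranges 202 and 203, and the SAD criterion and threshold are assumptions.

```python
import numpy as np


def has_identical_shape(image, pat, upper_pos, lower_pos, sad_threshold):
    """pat: object pattern "Pat" (2-D array); upper_pos/lower_pos: (x, y) top-left
    corners of the candidate windows above and below the binary object image."""
    h, w = pat.shape
    for x, y in (upper_pos, lower_pos):
        window = image[y:y + h, x:x + w]
        if window.shape != pat.shape:
            continue                                  # window falls outside the image
        sad = np.abs(window.astype(int) - pat.astype(int)).sum()
        if sad < sad_threshold:                       # high correlation found (step S94)
            return True                               # steps S95-S96
    return False                                      # steps S97-S98
```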
  • the image processing unit 1 comprises an object extraction device, an artificial structure identification device, an artificial structure elimination device, and a reference image dimension altering device. More concretely, steps S1 to S7 in FIG. 3 correspond to the object extraction device, steps S25 to S28 in FIG. 12 correspond to the artificial structure identification device, and step S30 in FIG. 12 corresponds to the artificial structure elimination device. Furthermore, step S32 in FIG. 15, step S52 in FIG. 16, and step S72 in FIG. 17 correspond to the reference image dimension altering device.
  • the result of monitoring the environment in the vicinity of the vehicle is classified into moving physical bodies, such as pedestrians and animals, and artificial highway structures; thus, for example, in the case that the environment in the vicinity of the vehicle is displayed to the driver, the method of displaying these objects can be varied, and the driver can be appropriately notified of physical bodies to which more careful attention should be paid.
  • the image of a plurality of objects that emit heat present in an infrared image photographed by a photographing device is compared with a reference image, and it becomes possible to distinguish whether this physical body is an artificial structure having a determined shape or a moving physical body such as a pedestrian or animal.
  • according to a fourth aspect of the present invention, by compensating for differences in size between the object image and the reference image produced by the distance between the object and the vehicle, and comparing both at an appropriate size, the effect is attained that the precision in detecting whether or not the object is an artificial structure is improved.
  • the information about these physical bodies can be used in the vehicle control, and in the case that this information is displayed as information or warnings to the driver of the vehicle, it can be used as material for determining how to alter the display method of the information and warnings, or the control method of the vehicle, depending on the content and importance of the object.

Abstract

The present invention provides a vehicle zone monitoring apparatus that eliminates artificial structures based on shape identification using a reference image from an infrared image photographed by a photographing device provided in the vehicle, and detects remaining objects as physical bodies that move, such as pedestrians and animals. The vehicle zone monitoring apparatus that detects the physical bodies present in the vicinity of the vehicle from infrared images photographed by infrared cameras 2R and 2L provided on the vehicle comprises an object extracting device that extracts object images that emit infrared radiation from the infrared image and an artificial structure identifying device that identifies whether or not an object is an artificial structure by comparing the object image extracted by the object image extracting device to a reference image that comprises a straight line pattern or a right angle pattern.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a vehicle zone monitoring apparatus that detects physical bodies such as other vehicles, pedestrians, and animals that are present in the vicinity of the vehicle. [0002]
  • 2. Description of the Related Art [0003]
  • Japanese Unexamined Patent Application, First Publication, No. 2001-6069, is a known example of a zone monitoring apparatus that detects physical bodies present in the vicinity of a vehicle that move, such as pedestrians and animals. This zone monitoring apparatus calculates the distance between the objects in the vicinity of the vehicle and the vehicle from images obtained by two infrared cameras, and then calculates the motion vector of the objects from the position data of the objects found in a time sequence. In addition, the apparatus detects the objects having a high possibility of colliding with the vehicle from the relationship between the direction of progress of the vehicle and the movement vector of the object. [0004]
  • In addition, Japanese Unexamined Patent Application, First Publication, No. 2001-108758, discloses technology in which objects are detected by eliminating zones that exhibit temperatures clearly different from the physical body temperature of a pedestrian from an infrared image photographed by a photographing device provided on the vehicle. In this technology, for objects extracted from portions that eliminate the zone exhibiting a temperature clearly different from the physical body temperature of a pedestrian, the apparatus determines whether or not the object is a pedestrian by further identifying the aspect ratio of the object. [0005]
  • However, in the conventional zone monitoring apparatuses described above, although objects that emit infrared light can be detected, they also detect objects besides pedestrians and animals, which is a problem. These objects include ones that emit heat themselves, such as vending machines, and telephone poles and lamp poles that have been heated by exposure to the sun during the day, and they have a low importance in terms of vehicle travel. [0006]
  • In particular, there is the problem that physical bodies that have a temperature approximately the same as the physical body temperature of a pedestrian or have an oblong shape that is the same as that of a pedestrian cannot be distinguished at all from pedestrians. [0007]
  • Furthermore, when pedestrians and animals having indefinite shapes are extracted from the objects by identifying their shape, there is the problem that improving the precision of the detection is difficult. [0008]
  • In consideration of the above problems, it is an object of the present invention to provide a vehicle zone monitoring apparatus that eliminates artificial structures by shape identification using reference images from the infrared image photographed by a photographing device provided on the vehicle, and detects the remaining objects as physical bodies that move, such as pedestrians and animals. [0009]
  • SUMMARY OF THE INVENTION
  • In order to solve the problems described above, a first aspect of the present invention is a vehicle zone monitoring apparatus that detects physical bodies present in the vicinity of the vehicle from infrared images photographed by a photographing device, comprising an object extracting device (for example, steps S1 to S7 in the embodiments) that extracts objects that emit infrared light from the infrared images, and an artificial structure identifying device (for example, steps S25 to S28 in the embodiments) that compares the image of the object extracted by the object extracting device to a reference image that is an element that defines an artificial structure and identifies whether or not said object is an artificial structure. [0010]
  • Due to the structure described above, for a plurality of heat-emitting physical bodies present in the infrared images photographed by the photographing device, an image of this physical body and a reference image are compared, and distinguishing whether the physical body is an artificial structure having a determined shape or a physical body other than this, for example, one that moves, such as a pedestrian or animal becomes possible. [0011]
  • In the vehicle zone monitoring apparatus of the first aspect, a second aspect of the present invention is characterized in comprising an artificial structure eliminating device (for example, step S30 in the embodiments) that eliminates the objects identified to be artificial structures using the artificial structure identifying device from the objects extracted by the object extracting device. [0012]
  • Due to the structure described above, in order to extract objects other than artificial structures, which should receive attention, artificial structures are eliminated from the objects extracted from the infrared images, and the remaining objects can be recognized as moving physical bodies. [0013]
  • In the vehicle zone monitoring apparatus of the first and second aspects, a third aspect of the present invention is characterized in that said reference images include an image representing a straight line segment, and said artificial structure identifying device identifies objects that include a straight line segment. [0014]
  • Due to the structure described above, by identifying whether or not there is a straight line segment, which easily characterizes artificial structures in the objects, objects having straight line segments can be eliminated as artificial structures, and objects other than artificial objects can be recognized. [0015]
  • In the vehicle zone monitoring apparatus of the first through third aspects, a fourth aspect of the invention is characterized in that said artificial structure identifying device comprises a reference image dimension altering device (for example, steps S32, S52, and S72 in the embodiment) that alters the size of said reference image so as to conform to the distance between said vehicle and said object. [0016]
  • Due to the structure described above, by compensating for the differences in size between the object image and the reference image that occur due to the distance between the object and the vehicle, and comparing both at an appropriate size, the precision in detecting whether or not an object is an artificial structure can be improved. [0017]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing the structure of the vehicle zone monitoring apparatus according to the embodiment of the present invention. [0018]
  • FIG. 2 is a drawing showing the installation positions of the infrared cameras, sensors, display and the like in the vehicle. [0019]
  • FIG. 3 is a flowchart showing the processing sequence for all operations in the image processing unit of the vehicle zone monitoring apparatus of this embodiment. [0020]
  • FIGS. 4A and 4B are drawings showing the gray scale image obtained by the infrared camera and the binary image thereof. [0021]
  • FIG. 5A, FIG. 5B, and FIG. 5C are drawings showing the conversion processing and labeling for the run length data. [0022]
  • FIGS. 6A and 6B are drawings showing the time trace of the object. [0023]
  • FIG. 7 is a drawing showing the turning angle compensation of the object image. [0024]
  • FIGS. 8A and 8B are drawings showing the search image in the right image and the search zone set in the left image. [0025]
  • FIG. 9 is a drawing showing the correlation calculation processing that uses the search zone as an object. [0026]
  • FIGS. 10A and 10B are drawings showing the calculation method for object parallax in the distance calculation of the object. [0027]
  • FIG. 11 is a drawing showing the offset of the object position in the image generated by the turning of the vehicle. [0028]
  • FIG. 12 is a flowchart showing the details of the warning determining processing in the image processing unit of the vehicle zone monitoring apparatus of this embodiment. [0029]
  • FIG. 13 is a drawing showing the zone partition in the forward direction of the vehicle. [0030]
  • FIG. 14 is a drawing showing the case in which a collision can occur easily. [0031]
  • FIG. 15 is a flowchart showing the details of the detection processing of a vertical linear part in the image processing unit of the vehicle zone monitoring apparatus of the embodiment. [0032]
  • FIG. 16 is a flowchart showing the details of the process of detecting the horizontal linear part in the image processing unit of the vehicle zone monitoring apparatus of this embodiment. [0033]
  • FIG. 17 is a flowchart showing the details of the process of detecting the quadrangle part in the image processing unit of the vehicle zone monitoring apparatus of this embodiment. [0034]
  • FIGS. 18A and 18B are drawings showing the details of the vertical linear part extracted pattern in the search of the image. [0035]
  • FIG. 19 is a drawing showing the search of the reference pattern for the vertical part search. [0036]
  • FIGS. 20A and 20B are drawings showing the details of the horizontal linear part extracted pattern in the search of the image. [0037]
  • FIGS. 21A and 21B are drawings showing the details of the quadrangle part extracted pattern in the search of the image. [0038]
  • FIG. 22 is a drawing showing an example of a highway structure obtained by the infrared camera. [0039]
  • FIG. 23 is a flowchart showing the details of the process of detecting identical shapes in the image processing unit of the vehicle zone monitoring apparatus of this embodiment. [0040]
  • FIGS. 24A and 24B are drawings showing the search of the object pattern for detecting identical shapes. [0041]
  • DETAILED DESCRIPTION OF THE INVENTION
  • Below, an embodiment of the present invention will be explained with reference to the figures. [0042]
  • FIG. 1 is a block diagram showing the structure of the vehicle zone monitoring apparatus according to the embodiment of the present invention. [0043]
  • In FIG. 1, reference numeral 1 is an image processing unit that provides a CPU (Central Processing Unit) that controls the vehicle zone monitoring apparatus of this embodiment, and has connected thereto two infrared cameras 2R and 2L that can detect infrared light, a yaw rate sensor 3 that detects the yaw rate of the vehicle body, a velocity sensor 4 that detects the traveling velocity (vehicle velocity) of the vehicle, and a brake sensor 5 for detecting the operation of the brake. Thereby, the image processing unit 1 detects an object that moves, such as a pedestrian or animal, in front of the vehicle from signals representing the infrared image of the vicinity of the vehicle and the travel state of the vehicle, and determines when the possibility of a collision is high. [0044]
  • Connected to the image processing unit 1 are a speaker 6 that issues a warning by voice and a display for notifying the driver of the vehicle about objects with which the danger of a collision is high, for example, a meter display integrated with meters that numerically represent the travel state of the vehicle, a NAVIDisplay disposed on the console of the vehicle, or a HUD (Head Up Display) 7a that displays information on the front window at a position that does not interfere with the forward view of the driver. [0045]
  • In addition, the image processing unit 1 comprises an A/D conversion circuit that converts an input analog signal into a digital signal, an image memory that stores the digitized image signal, a CPU (Central Processing Unit) that carries out each type of operation processing, RAM (Random Access Memory) used to store data that the CPU is currently processing, ROM (Read Only Memory) that stores programs executed by the CPU as well as tables, maps, and the like, and an output circuit that outputs a driving signal for the speaker 6, display signals for the HUD 7a, and the like. The image processing unit 1 is structured such that the output signals of the infrared cameras 2R and 2L, the yaw rate sensor 3, the velocity sensor 4, and the brake sensor 5 are converted to digital signals and input into the CPU. [0046]
  • In addition, as shown in FIG. 2, on the front of the vehicle 10, the infrared cameras 2R and 2L are disposed at substantially symmetrical positions with respect to the center part in the transverse direction of the vehicle 10, the optical axes of the two infrared cameras 2R and 2L are parallel to each other, and the heights of both cameras from the road surface are fixed so as to be equal. Moreover, the infrared cameras 2R and 2L have the property that the output signal level increases (the brightness increases) as the temperature of the object increases. [0047]
  • In addition, the HUD 7a is provided so that the display image is displayed at a position on the front window of the vehicle 10 that does not block the forward field of view of the driver. [0048]
  • Next, the operation of this embodiment will be explained with reference to the figures. [0049]
  • FIG. 3 is a flowchart showing the processing sequence in the [0050] image processing unit 1 of the vehicle zone monitoring apparatus of this embodiment.
  • First, the image processing unit 1 obtains the infrared images, which are the output signals of the infrared cameras 2R and 2L (step S1), carries out A/D conversion thereon (step S2), and stores the gray scale images in the image memory (step S3). Moreover, the right image is obtained from the infrared camera 2R, and then the left image is obtained from the infrared camera 2L. In addition, in the right image and the left image, the horizontal position of the same object on the display screen is displayed with an offset, and thus, from this offset (parallax), the distance to this object can be calculated. [0051]
  • Next, the right image obtained by the infrared camera 2R serves as the reference image, and binary processing of this image signal, that is, processing in which the zones brighter than a brightness threshold ITH are assigned “1” (white) and the darker zones are assigned “0” (black), is carried out (step S4). [0052]
  • FIG. 4A shows the gray scale image obtained from the [0053] infrared camera 2R, and by carrying out binary processing thereon, the image in FIG. 4B is obtained. Moreover, in FIG. 4B, the physical bodies surrounded by the frames P1 to P4 are set as the objects displayed in white on the display screen (below, referred to as the “high brightness zone”).
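  • The binary processing of step S4 can be illustrated by a minimal sketch such as the following (Python with NumPy is used purely for illustration; the threshold name ITH follows the text, while the function and array names are assumptions):

```python
import numpy as np

def binarize(gray_image: np.ndarray, ith: int) -> np.ndarray:
    """Assign 1 (white) to pixels brighter than the threshold ITH, 0 (black) otherwise."""
    return (gray_image > ith).astype(np.uint8)

# Example: an 8-bit gray scale frame from the reference (right) camera.
gray = np.random.randint(0, 256, size=(240, 320), dtype=np.uint8)
binary = binarize(gray, ith=200)  # high brightness zones become 1
```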
  • When the image data that has undergone binary processing has been obtained from the infrared camera, processing is carried out in which the binary image data is converted to run length data (step S [0054] 5).
  • FIG. 5A is a drawing to explain this, and in this figure the zone that has become white due to binary conversion is shown as the lines L[0055] 1 to L8. Lines L1 to L8 all have a width of 1 pixel in the y direction, and while they are actually arranged without a space between them in the y direction, they have been separated for the sake of the explanation. In addition, the lines L1 to L8 respectively have the lengths 2 pixels, 2 pixels, 3 pixels, 8 pixels, 7 pixels, 8 pixels, 8 pixels, and 8 pixels. The run length data is shown by the coordinates of the start point of each of the lines (the point on the left end of each line) and the length (number of pixels) from the start point to the end point (the point on the right end of each line). For example, line L3 comprises the 3 pixels (x3, y5), (x4, y5), and (x5, y5), and thus (x3, y5, 3) becomes the run length data.
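  • As a rough sketch of the conversion to run length data in step S5, each image row can be scanned for runs of white pixels and each run recorded as (start x, y, length); the function name below is an assumption:

```python
import numpy as np

def to_run_length(binary: np.ndarray):
    """Convert a binary image into run length data: a list of (x_start, y, length) tuples."""
    runs = []
    for y in range(binary.shape[0]):
        row = binary[y]
        x = 0
        while x < row.size:
            if row[x] == 1:
                start = x
                while x < row.size and row[x] == 1:
                    x += 1
                runs.append((start, y, x - start))  # e.g. line L3 -> (x3, y5, 3)
            else:
                x += 1
    return runs
```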
  • Next, from the image data converted into run length data, by labeling the object (step S [0056] 6), the processing in which the object is extracted is carried out (step S7). That is, as shown in FIG. 5B, among the lines L1 to L8 that have been converted to run length data, the lines L1 to L3, which are the parts overlapping in the y direction, are treated as one object 1, lines L4 to L8 are treated as one object 2, and the object labels 1 and 2 are added to the run length data. By this processing, for example, the high brightness zones shown in FIG. 4B are respectively recognized as objects 1 through 4.
  • When the extraction of the objects has been completed, as shown in FIG. 5C, next the center of gravity G, the surface area S, and the aspect ratio ASPECT of the circumscribed quadrangle represented by the broken lines are calculated (step S8). [0057]
  • Here, the surface area S is calculated by adding the lengths of the run length data for the same object. In addition, the coordinate of the center of gravity G is calculated as the x coordinate of the line that bisects the surface area S in the x direction, and the y coordinate of the line that bisects it in the y direction. Furthermore, the aspect ratio ASPECT is calculated as the Dy/Dx ratio of Dy and Dx shown in FIG. 5C. Moreover, the position of the center of gravity G can be substituted for by the position of the center of gravity of the circumscribed quadrangle. [0058]
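  • For one labeled object, these features can be computed from its run length data roughly as follows (a sketch under the simplifying assumptions that the runs of the object have already been grouped by the labeling of step S6, and that the mean position is used as a stand-in for the bisecting-line definition of the center of gravity given in the text):

```python
def object_features(runs):
    """Compute surface area S, center of gravity G, and aspect ratio ASPECT for one
    object given its run length data [(x_start, y, length), ...]."""
    s = sum(length for _, _, length in runs)                   # surface area S
    xg = sum((x + length / 2.0) * length for x, _, length in runs) / s
    yg = sum(y * length for _, y, length in runs) / s          # G = (xg, yg), mean approximation
    x_min = min(x for x, _, _ in runs)
    x_max = max(x + length for x, _, length in runs)
    y_min = min(y for _, y, _ in runs)
    y_max = max(y for _, y, _ in runs) + 1
    aspect = (y_max - y_min) / float(x_max - x_min)            # ASPECT = Dy / Dx (FIG. 5C)
    return s, (xg, yg), aspect
```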
  • When the center of gravity, the surface area, and the aspect ratio of the circumscribed quadrangle have been calculated, next, recognition of the time trace of the same object, that is, recognition at each sampling cycle, is carried out (step S9). In the time trace, k denotes the discrete time obtained by sampling the analog time t at the sampling cycle, and, as shown in FIG. 6A, in the case that objects A and B are extracted at time k, it is determined whether objects C and D extracted at time (k+1) are identical to objects A and B. Specifically, when the following identity determination conditions 1 to 3 are satisfied, objects C and D are determined to be identical to objects A and B, and objects C and D have their labels changed respectively to objects A and B. [0059]
  • 1) When the position coordinates of the center of gravity in the image of the object i (=A, B) at time k are (xi(k), yi(k)) and the position coordinates of the center of gravity in the image of the object j (=C, D) at time (k+1) are (xj(k+1), yj(k+1)), then |xj(k+1)−xi(k)| < Δx and |yj(k+1)−yi(k)| < Δy, where Δx and Δy denote the allowable values of the amount of movement in the image in the x direction and the y direction, respectively. [0060]
  • 2) When the surface area of the object i (=A, B) in the image at time k is Si(k) and the surface area of the object j (=C, D) in the image at time (k+1) is Sj(k+1), then Sj(k+1)/Si(k) < 1 ± ΔS, where ΔS denotes the allowable value of the change in area. [0061]
  • 3) When the aspect ratio of the circumscribed quadrangle of the object i (=A, B) at time k is ASPECT i(k) and the aspect ratio of the circumscribed quadrangle of the object j (=C, D) at time (k+1) is ASPECT j(k+1), then ASPECT j(k+1)/ASPECT i(k) < 1 ± ΔASPECT, where ΔASPECT denotes the allowable value of the change in the aspect ratio. [0062]
  • For example, when comparing FIG. 6A and FIG. 6B, although the size of each of the objects in the image becomes larger, the object A and the object C satisfy the identity determination conditions described above, and the object B and the object D satisfy the identity determination conditions described above, and thus the objects C and D are recognized to be the objects A and B, respectively. In this manner, the position coordinates (of the center of gravity) of each recognized object are stored in the memory as time series position data to be used in later calculation processing. [0063]
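  • A minimal sketch of identity determination conditions 1 to 3, assuming each object is represented by its center of gravity, surface area, and aspect ratio at one sampling time (the tuple layout and threshold names are assumptions):

```python
def is_same_object(obj_k, obj_k1, dx_max, dy_max, ds, daspect):
    """Apply identity determination conditions 1 to 3 between an object at time k
    and a candidate at time (k+1). Each object is ((xg, yg), S, ASPECT)."""
    (xi, yi), si, aspect_i = obj_k
    (xj, yj), sj, aspect_j = obj_k1
    cond1 = abs(xj - xi) < dx_max and abs(yj - yi) < dy_max   # movement within allowance
    cond2 = abs(sj / si - 1.0) < ds                           # Sj(k+1)/Si(k) within 1 +/- dS
    cond3 = abs(aspect_j / aspect_i - 1.0) < daspect          # aspect ratio within allowance
    return cond1 and cond2 and cond3
```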
  • Moreover, the processing in steps S [0064] 4 to S 9 explained above is carried out on a binary reference image (in this embodiment, the right image).
  • Next, the velocity VCAR detected by the [0065] velocity sensor 4 and the yaw rate YR detected by the yaw rate sensor 3 are read, and as shown in FIG. 7, the turning angle θr of the vehicle 10 is calculated by integrating the yaw rate YR with respect to time (step S 10).
  • In parallel with the processing of step S9 and step S10, the processing of steps S11 through S13, which calculates the distance z between the object and the vehicle 10, is carried out. Because this calculation requires a longer time than step S9 and step S10, it is executed at a longer cycle than steps S9 and S10 (for example, at a cycle about three times the execution cycle of steps S1 to S10). [0066]
  • First, one of the objects tracked in the binary image of the reference image (the right image) is selected, and, as shown in FIG. 8A, the search image R1 (here, the entire zone surrounded by the circumscribed quadrangle serves as the search image) is extracted from the right image (step S11). [0067]
  • Next, a search zone for finding the image corresponding to the search image (below, referred to as the “corresponding image”) is set in the left image, and the corresponding image is extracted by executing a correlation calculation (step S12). Specifically, as shown in FIG. 8B, depending on each of the peak coordinates of the search image R1, the search zone R2 in the left image is set, the brightness difference total value C(a, b), which indicates the degree of correlation with the search image R1 within the search zone R2, is calculated by Eq. 1 shown below, and the zone in which this total value C(a, b) becomes minimum is extracted as the corresponding image. Note that this correlation calculation is carried out using the gray scale image, not the binary image. [0068]
  • In addition, when there is past position data for the identical physical body, based on this position data, a zone R2a (shown by the broken line in FIG. 8B) that is narrower than the search zone R2 is set to serve as the search zone. [0069]
  • C(a, b) = Σ_{n=0}^{N−1} Σ_{m=0}^{M−1} |IL(a+m−M, b+n−N) − IR(m, n)|   Eq. 1
  • Here, IR(m, n) is the brightness value at the position of the coordinate (m, n) in the search image R1 shown in FIG. 9, and IL(a+m−M, b+n−N) is the brightness value at the position of the coordinate (m, n) in the local zone R3, which has the same shape as the search image R1 and whose base point is the coordinate (a, b) in the search zone. The position of the corresponding image is defined by finding the position at which the total value C(a, b) of the brightness difference is minimized by changing the coordinates (a, b) of the base point. [0070]
  • Due to the processing in step S12, as shown in FIG. 10A and FIG. 10B, the search image R1 and the corresponding image R4 corresponding to this object are extracted. Next, the distance dR (number of pixels) between the position of the center of gravity of the search image R1 and the image center line LCTR and the distance dL (number of pixels) between the position of the center of gravity of the corresponding image R4 and the image center line LCTR are found, and by applying the following Eq. 2, the distance z between the vehicle 10 and the object is calculated (step S13). [0071]
  • z = (B × F) / ((dL + dR) × p) = (B × F) / (Δd × p)   Eq. 2
  • Here, B is the base line length, that is, the distance in the horizontal direction between the center position of the photographic element of the infrared camera 2R and the center position of the photographic element of the infrared camera 2L (the separation of the optical axes of the two infrared cameras); F is the focal distance of the lenses of the infrared cameras 2R and 2L; p is the pixel separation in the photographic elements of the infrared cameras 2R and 2L; and Δd (=dR+dL) is the amount of parallax. [0072]
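  • As an illustration of the correlation calculation of Eq. 1 and the distance calculation of Eq. 2, the following sketch searches a left-image zone for the base point minimizing the sum of absolute brightness differences and then converts the resulting parallax to a distance (the function names are assumptions, and the base point convention of Eq. 1 is simplified to the top-left corner of the local zone):

```python
import numpy as np

def find_corresponding_image(search_img: np.ndarray, left_zone: np.ndarray):
    """Slide the search image over the left-image search zone and return the base
    point (a, b) minimizing the brightness difference total value C(a, b) of Eq. 1."""
    n_rows, m_cols = search_img.shape
    best, best_ab = None, (0, 0)
    for b in range(left_zone.shape[0] - n_rows + 1):
        for a in range(left_zone.shape[1] - m_cols + 1):
            local = left_zone[b:b + n_rows, a:a + m_cols].astype(np.int32)
            c = np.abs(local - search_img.astype(np.int32)).sum()
            if best is None or c < best:
                best, best_ab = c, (a, b)
    return best_ab, best

def distance_from_parallax(d_r: float, d_l: float, base_line: float,
                           focal_len: float, pixel_pitch: float) -> float:
    """Eq. 2: z = B * F / (delta_d * p), with delta_d = dR + dL expressed in pixels."""
    return base_line * focal_len / ((d_r + d_l) * pixel_pitch)
```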
  • When the calculation of the turning angle θr in step S10 and the calculation of the distance to the object in step S13 have been completed, the coordinates (x, y) in the image and the distance z calculated by Eq. 2 are applied to the following Eq. 3 and converted to real spatial coordinates (X, Y, Z) (step S14). [0073]
  • Here, as shown in FIG. 2, the real spatial coordinates (X, Y, Z) have as their origin O the position of the center point of the installation positions of the infrared cameras 2R and 2L (the position at which they are fixed on the vehicle 10), with the axes fixed as shown in the figure, and the coordinates in the image are determined by x in the horizontal direction and y in the vertical direction, where the center of the image serves as the origin. [0074]
  • [X, Y, Z]^T = [xc × z/f, yc × z/f, z]^T   Eq. 3
  • where f=F/p. [0075]
  • Here, (xc, yc) are the coordinates (x, y) of the right image that have been converted to the coordinates of a virtual image in which the real spatial origin O and the center of the image have been made to coincide, based on the relative positional relationship between the installation position of the infrared camera 2R and the real spatial origin O. In addition, f is the ratio of the focal distance F and the pixel interval p. [0076]
  • In addition, when the real spatial coordinates have been found, turning angle compensation is carried out in order to compensate for the positional shift in the image caused by the turning of the vehicle 10 (step S15). [0077]
  • As shown in FIG. 7, when the vehicle turns, for example, at a turning angle θr in the left direction during the time interval from time k to (k+1), a shift in the x direction by an amount equivalent to Δx, as shown in FIG. 11, occurs in the image obtained by the camera, and the turning angle compensation is a process to compensate for this. Specifically, the real spatial coordinates (X, Y, Z) are applied to the following Eq. 4, and the compensated coordinates (Xr, Yr, Zr) are calculated. The calculated real spatial position data (Xr, Yr, Zr) is associated with each object and stored in memory. Moreover, in the following explanation, the coordinates after turning angle compensation are denoted (X, Y, Z). [0078]
  • [Xr, Yr, Zr]^T = [[cos θr, 0, −sin θr], [0, 1, 0], [sin θr, 0, cos θr]] · [X, Y, Z]^T   Eq. 4
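  • The conversion of Eq. 3 and the turning angle compensation of Eq. 4 can be sketched as follows (NumPy is used for the rotation about the Y axis; the function names are assumptions):

```python
import numpy as np

def image_to_real_space(xc: float, yc: float, z: float, f: float) -> np.ndarray:
    """Eq. 3: convert virtual-image coordinates (xc, yc) and distance z to (X, Y, Z),
    where f = F / p is the focal distance expressed in pixels."""
    return np.array([xc * z / f, yc * z / f, z])

def compensate_turning(xyz: np.ndarray, theta_r: float) -> np.ndarray:
    """Eq. 4: rotate the real spatial coordinates by the turning angle theta_r
    (a rotation about the vertical Y axis) to obtain (Xr, Yr, Zr)."""
    c, s = np.cos(theta_r), np.sin(theta_r)
    rot = np.array([[c, 0.0, -s],
                    [0.0, 1.0, 0.0],
                    [s, 0.0, c]])
    return rot @ xyz
```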
  • When the turning angle compensation for the real spatial coordinates has been completed, next, the approximately straight line LMV corresponding to the relative motion vector between the object and the vehicle 10 is found from N items of real spatial position data (for example, N=10) obtained after turning angle compensation during the monitoring period ΔT for one and the same object, that is, from the time series data (step S16). [0079]
  • Concretely, when the direction vector L, which denotes the direction of the approximately straight line LMV, is L = (lx, ly, lz) (where |L| = 1), the straight line represented by the following Eq. 5 is found. [0080]
  • X = u·lx + Xav, Y = u·ly + Yav, Z = u·lz + Zav,
    where Xav = Σ_{j=0}^{N−1} X(j)/N, Yav = Σ_{j=0}^{N−1} Y(j)/N, Zav = Σ_{j=0}^{N−1} Z(j)/N   Eq. 5
  • Here, u is a parameter that takes an arbitrary value, and Xav, Yav, and Zav are respectively the average values of the X coordinate, Y coordinate, and Z coordinate of the real spatial position data sequence. [0081]
  • Moreover, when the parameter u is eliminated, Eq. 5 becomes Eq. 5a: [0082]
  • (X−Xav)/lx = (Y−Yav)/ly = (Z−Zav)/lz   Eq. 5a
  • In addition, in the case, for example, that P(0), P(1), P(2), . . . , P(N−2), P(N−1) denote the time series data after turning angle compensation, the approximately straight line LMV passes through the average position coordinate Pav = (Xav, Yav, Zav) of the time series data, and is found as the straight line characterized in that the average value of the square of the distance from each of the data points is minimal. [0083]
  • Here, the numerical value in the parentheses added to P, which denotes the coordinates of each of the data points, indicates that the larger the value, the older the data. For example, P(0) denotes the most recent position coordinate, P(1) denotes the position coordinate of one sample cycle back, and P(2) denotes the position coordinate two sample cycles back. [0084]
  • Next, the most recent position coordinate P(0) = (X(0), Y(0), Z(0)) and the position coordinate P(N−1) = (X(N−1), Y(N−1), Z(N−1)) of the (N−1)th sample back (before the time ΔT) are compensated to positions on the approximately straight line LMV. Concretely, by applying the Z coordinates Z(0) and Z(N−1) to the above Eq. 5a, that is, to the following Eq. 6, the position coordinates after compensation Pv(0) = (Xv(0), Yv(0), Zv(0)) and Pv(N−1) = (Xv(N−1), Yv(N−1), Zv(N−1)) are found. [0085]
  • Xv(j) = (Z(j) − Zav) × lx/lz + Xav, Yv(j) = (Z(j) − Zav) × ly/lz + Yav, Zv(j) = Z(j), for j = 0, N−1   Eq. 6
  • The relative motion vector is found as the vector from the position coordinate Pv(N−1) calculated by Eq. 6 towards Pv(0). [0086]
  • By finding the relative motion vector by calculating the approximately straight line that approximates the relative motion locus of the object with respect to the [0087] vehicle 10 from a plurality (N) of data within the monitoring period ΔT in this manner, the influence of position detection error can be reduced, and the possibility of a collision with the object can be more correctly predicted.
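  • A minimal sketch of step S16 under the assumption that the approximately straight line LMV is obtained as the least-squares line through the N compensated points (here via the principal direction of the centered data, which minimizes the mean squared distance to the line); the function name and point ordering are assumptions:

```python
import numpy as np

def relative_motion_vector(points: np.ndarray):
    """points: (N, 3) array of real spatial positions after turning angle compensation,
    ordered from the oldest P(N-1) in row 0 to the most recent P(0) in the last row.
    Returns the direction L of the line LMV and the vector Pv(N-1) -> Pv(0)."""
    pav = points.mean(axis=0)                       # average position Pav = (Xav, Yav, Zav)
    centered = points - pav
    # Principal direction of the centered data = least-squares line direction.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    l = vt[0]                                       # direction vector L = (lx, ly, lz), |L| = 1

    def on_line(z):                                 # Eq. 6: project a Z coordinate onto LMV
        return np.array([(z - pav[2]) * l[0] / l[2] + pav[0],
                         (z - pav[2]) * l[1] / l[2] + pav[1],
                         z])

    pv_old = on_line(points[0, 2])                  # Pv(N-1), oldest sample
    pv_new = on_line(points[-1, 2])                 # Pv(0), most recent sample
    return l, pv_new - pv_old                       # relative motion vector Pv(N-1) -> Pv(0)
```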
  • In addition, in [0088] step S 16, when the relative motion vector has been found, next the possibility of a collision with the detected object is determined, and a warning determination process, which issues a warning when the possibility is high, is executed (step S 17).
  • Moreover, when the warning determination process has been completed, the flow returns to step S[0089] 1, and the above processing is repeated.
  • Next, the warning determination process in step S [0090] 17 of the flowchart shown in FIG. 3 will be explained with reference to the flowchart shown in FIG. 12.
  • Here, as shown in FIG. 14, the case in which there is an [0091] animal 20 progressing at a velocity Vp in a direction that is at an angle of approximately 90° with respect to the direction of progress of the vehicle 10 will be used as an example to be explained.
  • First, the image processing unit 1 calculates the relative velocity Vs in the Z direction using the following Eq. 7, based on the fact that the animal 20 approached from the distance Zv(N−1) to the distance Zv(0) during the time ΔT, and carries out collision determination processing (step S21). The collision determination processing determines that there is a possibility of a collision when the following Eq. 8 and Eq. 9 are satisfied. [0092]
  • In [0093] step S 21, in the case it has been determined that there is a possibility of a collision with the animal 20 (YES in step S 21), the flow proceeds next to step S 22.
  • In addition, in [0094] step S 21, when Eq. 8 and/or Eq. 9 are not satisfied, it is determined that there is no possibility of a collision with the animal 20 (NO in step S 21), and the warning determination processing completes.
  • Vs=(Zv(N−1)−Zv(0))/ΔT   Eq. 7
  • Zv(0)/Vs≦T   Eq. 8
  • |Yv(0)|≦H   Eq. 9
  • Here, Zv(0) is the most recent detected distance value (v is attached in order to indicate that this is data after compensation using the approximately straight line LMV, while the Z coordinate is a value identical to that before compensation), and Zv(N−1) is the detected distance value before the time ΔT. In addition, T is an allowance time and signifies that the possibility of a collision is determined a time T before the predicted collision time; T is about 2 to 5 seconds, for example. In addition, H is a predetermined height that defines the range of the Y direction, that is, the height direction, and is set, for example, to about twice the height of the vehicle 10. [0095]
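  • A sketch of the collision determination of step S21 using Eq. 7 to Eq. 9 (the parameter names mirror the text; the function name is an assumption):

```python
def collision_possible(zv_old: float, zv_new: float, yv_new: float,
                       dt: float, t_allow: float, h_limit: float) -> bool:
    """Eq. 7-9: relative velocity from the approach during dt, then check whether the
    predicted time to collision is within the allowance T and the height is within H."""
    vs = (zv_old - zv_new) / dt          # Eq. 7: Vs = (Zv(N-1) - Zv(0)) / dT
    if vs <= 0:                          # object is not approaching; no collision predicted
        return False
    return (zv_new / vs <= t_allow       # Eq. 8: Zv(0)/Vs <= T
            and abs(yv_new) <= h_limit)  # Eq. 9: |Yv(0)| <= H
```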
  • When the collision determination processing has been completed, next it is determined whether or not the object is within the approach determination zone (step S22). For example, in FIG. 13, the zone that can be monitored by the infrared cameras 2R and 2L is indicated by the area AR0 inside the triangle indicated by the bold solid line, and the zones AR1, AR2, and AR3 within the zone AR0, which are closer to the vehicle 10 than Z1 = Vs × T, serve as the warning zones. [0096]
  • Here, AR1 is the zone corresponding to the range obtained by adding the allowance β (for example, about 50 to 100 cm) to both sides of the width α of the vehicle 10, or in other words, the zone having a width of (α/2+β) on both sides of the axis passing through the center part in the width direction of the vehicle 10; if the object continues to be present there as-is, the possibility of a collision is extremely high, and thus this zone is called the approach determination zone. The zones AR2 and AR3 are zones (on the outside of the approach determination zone in the transverse direction) in which the absolute value of the X coordinate is larger than in the approach determination zone; the invasive collision determination described below is made for an object inside these zones, and thus they are called the invasive determination zones. Moreover, these zones have the predetermined height H in the Y direction, as shown in the above Eq. 9. [0097]
  • The answer in the [0098] above step S 21 becomes affirmative (YES) in the case that an object is present in either the approaching determination zone AR1 or the invasive determination zones AR2 and AR3.
  • Next, in [0099] step S 22, it is determined whether or not the object is in the approaching determination zone AR1, and in the case that it is determined that the object is in the approaching determination zone AR1 (YES in step S 22), the flow proceeds directly to step S 24. In contrast, in the case that it is determined the object is not in the approaching determination zone AR1 (NO in step S 22), invasive collision determination processing is carried out (step S 23).
  • Concretely, the invasive collision determination processing in step S23 distinguishes whether or not the difference between xc(0), which is the most recent x coordinate on the image (the character c, as will be explained below, is attached in order to signify that it is a coordinate on which compensation has been carried out so that the center position of the image aligns with the real spatial origin O), and xc(N−1), which is the x coordinate before the time ΔT, satisfies the following Eq. 10, and in the case that it is satisfied, it is determined that the possibility of a collision is high. [0100]
  • −(α·f/2) × (1/Zv(0) − 1/Zv(N−1)) ≦ xc(0) − xc(N−1) ≦ (α·f/2) × (1/Zv(0) − 1/Zv(N−1))   Eq. 10
  • Moreover, as shown in FIG. 14, in the case that there is an animal progressing in a direction that is at an angle of approximately 90° with respect to the direction of progress of the vehicle 10, when Xv(N−1)/Zv(N−1) = Xv(0)/Zv(0), or in other words, when the ratio of the velocity Vp of the animal and the relative velocity Vs is Vp/Vs = Xv(N−1)/Zv(N−1), the bearing θd at which the animal 20 is viewed from the vehicle 10 becomes constant, and the possibility of a collision becomes high. Eq. 10 determines this possibility taking into account the width α of the vehicle 10. [0101]
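  • A sketch of the invasive collision determination of Eq. 10 (f denotes the focal distance in pixels as above; the function name is an assumption):

```python
def invasive_collision_likely(xc_new: float, xc_old: float,
                              zv_new: float, zv_old: float,
                              alpha: float, f: float) -> bool:
    """Eq. 10: the possibility of a collision is judged high when the change of the
    image x coordinate stays within the bound derived from the vehicle width alpha."""
    bound = (alpha * f / 2.0) * (1.0 / zv_new - 1.0 / zv_old)
    return -bound <= (xc_new - xc_old) <= bound
```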
  • In [0102] step S 23, in the case that it has been determined that the possibility of a collision is high (YES in step S 23), the flow proceeds to step S 24. In contrast, in the case that it has been determined that the possibility of a collision is low (NO in step S 23), the warning determination processing completes.
  • In step S24, a warning output determination process is carried out, that is, it is determined whether or not to output a warning (step S24). [0103]
  • The warning output determination process determines whether or not the driver of the [0104] vehicle 10 is carrying out a braking action from the output BR of the brake sensor 5.
  • In the case that the driver of the [0105] vehicle 10 is carrying out a braking action, the acceleration Gs (positive in the deceleration direction) generated thereby is calculated, and when this acceleration Gs is larger than a predetermined threshold value GTH, it is determined that a collision can be avoided by the braking action, and the warning determination processing completes. Thereby, when an appropriate braking action is carried out, no warning is issued, and the driver will not be excessively annoyed.
  • In addition, when the acceleration Gs is equal to or less than the predetermined threshold value GTH, or when the driver of the vehicle 10 is not carrying out a braking action, the flow immediately proceeds to the processing in step S25 and following, and the shape identification of the object is carried out. [0106]
  • Moreover, the predetermined threshold value GTH is determined by the following Eq. 11. This is the value corresponding to the condition in which the vehicle 10 stops at a running distance equal to or less than the distance Zv(0) in the case that the acceleration Gs during the braking action is maintained as-is. [0107]
  • GTH = Vs² / (2 × Zv(0))   Eq. 11
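  • A sketch of the braking check used in the warning output determination, assuming the braking deceleration Gs has already been estimated (the function name is an assumption):

```python
def braking_avoids_collision(gs: float, vs: float, zv_new: float) -> bool:
    """Eq. 11: the warning is suppressed when the braking deceleration Gs exceeds
    GTH = Vs^2 / (2 * Zv(0)), i.e. the vehicle stops within the distance Zv(0)."""
    gth = vs * vs / (2.0 * zv_new)
    return gs > gth
```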
  • In the shape identification of the object in step S [0108] 25 and following, there are the following steps: identifying whether or not a part indicating a straight line segment is included in the image of the object (step S 25); whether or not an angle in the image of the object is a right angle (step S 26); whether or not the image of the object conforms to the shape of a pre-registered artificial structure (step S 27); and whether or not a plurality of identical shapes are included in the image of the object (step S 28).
  • First, it is identified whether or not a part indicating a straight line segment is included in the image of the object (step S [0109] 25).
  • In [0110] step S 25, in the case that a part indicating a straight line segment is not included in the image of the object (NO in step S 25), it is identified whether or not an angle in the image of the object is a right angle (step S 26).
  • In step S [0111] 26, in the case that an angle in the image of the object is not a right angle (NO in step S 26), it is identified whether or not the image of the object conforms to the shape of a pre-registered artificial structure (step S 27).
  • In step S [0112] 27, in the case that the image of the object does not conform to the shape of a pre-registered artificial structure (NO in step S 27), whether or not pluralities of identical shapes are included in the image of the object is identified (step S 28).
  • In addition, in step S28, in the case that a plurality of identical shapes are not included in the image of the object (NO in step S28), the possibility that the object is a pedestrian or an animal is high, and thus a warning is issued by voice through the speaker 6, and at the same time, the image obtained by the infrared camera 2R, for example, is displayed on the image display apparatus 7, and the approaching object is given a highlighted display (for example, highlighted by being surrounded by a frame) (step S29). [0113]
  • In contrast, in step S25, in the case that a part indicating a straight line segment is included in the image of the object (YES in step S25), or in step S26, in the case that an angle in the image of the object is a right angle (YES in step S26), or in step S27, in the case that the image of the object conforms to the shape of a pre-registered artificial structure (YES in step S27), or further, in step S28, in the case that a plurality of identical shapes are included in the image of the object (YES in step S28), the object is treated as an artificial structure, the object extracted in step S7 of FIG. 3 is eliminated (step S30), no warning is issued, and the warning determination processing is completed. [0114]
  • Next, the method for the identification of the shape of the object in FIG. 12 described above, and in particular, the search processing for straight line segments and right angle segments in [0115] step S 25, step S 26, step S 28, and step S 30 will be explained with reference to the figures.
  • FIG. 15, FIG. 16, FIG. 17, and FIG. 23 are flowcharts showing in further detail the processing in [0116] step S 25, step S 26, step S 28, and a part of the processing of step S 30.
  • In the search for a straight line segment, the [0117] image processing unit 1 starts from the detection of vertical line segments (vertical straight line segment identification). FIG. 15 is a flowchart showing the vertical straight line segment identification.
  • Therefore, first, in order to search for vertical straight line segments, the image of the object and the right straight line segment image pattern, which is a reference image for carrying out the correlation calculation, are selected (step S31), and, depending on the distance between the vehicle 10 and the object found in step S13 of the flowchart shown in FIG. 3, the pattern size of the reference image is determined so as to be in proportion to the size of the real space object projected onto the display screen (step S32). [0118]
  • Here, the determination of the pattern size of the reference image is carried out as follows. Specifically, in the case that the distance between the vehicle 10 and the object is calculated as z = L[m] by using the above Eq. 2, an object having a height A[m] and width B[m] at a position at the distance L[m] in real space is projected at a size of a×b [pixels] on the display screen, according to Eq. 12 and Eq. 13 below. [0119]
  • a=f×A/L   Eq. 12
  • b=f×B/L   Eq.13
  • Therefore, as shown in FIG. 18A, from the right straight line segment image pattern prepared in advance, for example, the a×b [pixel] straight line segment pattern is extracted, and the right straight line segment extraction pattern “Pat_Line_R” serves as the reference pattern. Similarly, the a×b [pixel] left straight line segment extraction pattern “Pat_Line_L” extracted from the left straight line segment image pattern prepared in advance is shown in FIG. 18B. [0120]
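  • The reference pattern sizing of step S32 (Eq. 12 and Eq. 13) can be sketched as follows, where the straight line segment image pattern prepared in advance is simply cropped to a×b pixels (the function and variable names are assumptions):

```python
def reference_pattern_size(height_m: float, width_m: float,
                           distance_m: float, f: float):
    """Eq. 12 and Eq. 13: an object of A[m] x B[m] at distance L[m] projects to
    a x b pixels on the display screen, with a = f*A/L and b = f*B/L."""
    a = int(round(f * height_m / distance_m))
    b = int(round(f * width_m / distance_m))
    return a, b

def extract_reference_pattern(image_pattern, a: int, b: int):
    """Crop an a x b [pixel] extraction pattern (e.g. Pat_Line_R) from the
    straight line segment image pattern prepared in advance."""
    return image_pattern[:a, :b]
```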
  • When the reference size for the correlation calculation has been found, next the search zone in proximity to the object is set (step S [0121] 33).
  • Here, the setting of the search zone is carried out as follows. Specifically, as shown in FIG. 19, the outline extracted by the binary processing (the binary object image 100) does not necessarily correctly represent the outline of the object 101. Therefore, with respect to the center 102 of the circumscribed quadrangle of the binary object image (OBJ) 100, ranges of a [pixels] are set above, below, to the left of, and to the right of the binary object image, and these ranges serve as the search range 103 for the correlation calculation. [0122]
  • Next, from within the [0123] search zone 103 in proximity to the object, a section (OBJ_Pat) 104 having a high correlation with the right straight line segment extraction pattern “Pat_Line_R” is found by the correlation calculation (step S 34).
  • In addition, it is identified whether or not a part having a high correlation with the right straight line segment extraction pattern “Pat_Line_R” is present (step S [0124] 35).
  • In [0125] step S 35, in the case that a part having a high correlation with the right straight line segment extraction pattern “Pat_Line_R” is present (YES in step S 35), in order to determine whether or not the part having a high correlation and the object 101 are identical physical bodies, the distance of OBJ_Pat 104 is calculated in the same manner as the calculation of the distance of the object by the above Eq. 2 (step S 36).
  • Moreover, in the case that the actual distance between the vehicle 10 and the object 101 is equal to the distance between the vehicle 10 and the OBJ_Pat 104 having a high correlation, the object 101 and OBJ_Pat 104 can be identified to be identical physical bodies, and thus, by comparing the calculated parallaxes Δd and Δd_P instead of comparing the distances, it can be identified whether or not the object 101 and OBJ_Pat 104 are identical physical bodies (step S37). Specifically, using the following Eq. 14, it is determined whether or not the parallax error is smaller than an allowable value TH. [0126]
  • |Δd − Δd_P| < TH   Eq. 14
  • In step S37, in the case that it is identified that the object 101 and the OBJ_Pat 104 are identical physical bodies (YES in step S37), it is determined that there is a vertical straight line segment in the object 101 (step S38), the object 101, having a vertical straight line segment, is treated as an artificial highway structure (step S39), and the vertical straight line segment identification completes. [0127]
  • In contrast, in [0128] step S 35, in the case that a part having a high correlation with the right straight line segment extraction pattern “Pat_Line_R” is not present (NO in step S 35), or in step S 37, in the case that the object 101 and the OBJ_Pat 104 are not identified as identical physical bodies (NO in step S 37), the flow proceeds to step S 40, and it is identified whether or not the reference pattern used in the correlation calculation is a left straight line segment image pattern (step S 40).
  • In [0129] step S 40, in the case that the reference pattern used in the correlation calculation was not a left straight line segment image pattern (NO in step S 40), the left straight line segment image pattern prepared in advance is selected (step S 41), and the flow returns to step S 32.
  • In addition, in step S [0130] 32 and step S 33 described above, the same action is carried out on the left straight line segment image pattern as the action carried out on the right straight line segment image pattern, and the a×b [pixel] left straight line segment extraction pattern “Pat_Line_L” extracted from the left straight line segment image pattern shown in FIG. 18B serves as the reference pattern. Furthermore, in step S 34, from inside the search range 103 in proximity to the object, the part (OBJ_Pat) having a high correlation with the left straight line segment extraction pattern “Pat_Line_L” is searched for using correlation calculation.
  • As a result of the correlation calculation using the left straight line segment extraction pattern, the actions from step S [0131] 35 to step S 39 described above are carried out, and when a vertical straight line segment is identified to be present in the object 101, the object 101 is treated as an artificial road structure, and the vertical straight line segment determination completes.
  • In addition, as a result of the correlation calculation using the left straight line segment extraction pattern, when the flow proceeds to the determination of step S [0132] 40 again, the search of the vertical straight line segment by both the right straight line segment extraction pattern and the left straight line segment extraction pattern has already completed (YES in step S 40), no vertical straight line segment is identified as being present (step S 42), and the flow proceeds to the horizontal straight line segment identification.
  • Moreover, in the vertical straight line segment identification described above, the reason that the correlation calculation is carried out using both the right straight line segment extraction pattern and the left straight line segment extraction pattern, and that the distance between the vehicle 10 and each part having a high correlation is compared to the distance between the object and the vehicle 10, is that, in the case that a plurality of objects overlap and are recognized as one object, there is the possibility that the right or left straight line segment detected in the vertical straight line segment identification is not part of the object subject to the collision determination. Therefore, the distance between the object and the vehicle 10 is compared to the distance between the vehicle 10 and the detected right or left straight line segment, and it is identified whether both are identical physical bodies. [0133]
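  • A condensed sketch of the vertical straight line segment identification (steps S31 to S41), reusing an Eq. 1-style correlation search and the parallax comparison of Eq. 14; correlation_search and parallax_of are assumed helper callables (for example, the find_corresponding_image sketch above and a parallax lookup for the matched part), and the remaining names are assumptions:

```python
def has_vertical_straight_line(search_zone, patterns, correlation_search,
                               parallax_of, obj_parallax: float,
                               corr_threshold: float, th: float) -> bool:
    """Try the right and left straight line segment extraction patterns in turn;
    a vertical straight line segment is present when a part with a sufficiently high
    correlation (small SAD) is found and its parallax agrees with the object's
    parallax within the allowable value TH (Eq. 14)."""
    for pat in patterns:                               # e.g. [Pat_Line_R, Pat_Line_L]
        (a, b), sad_value = correlation_search(pat, search_zone)
        if sad_value >= corr_threshold:                # no part with a high correlation
            continue
        if abs(obj_parallax - parallax_of(a, b)) < th:  # Eq. 14: identical physical bodies
            return True                                # treat the object as an artificial structure
    return False
```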
  • Next, the horizontal straight line segment determination will be explained with reference to the flowchart shown in FIG. 16. [0134]
  • In the horizontal straight line segment determination, first, in order to search for a horizontal straight line segment, an upper edge straight line segment image pattern, which is the reference image for carrying out correlation calculation on the image of the object, is selected (step S [0135] 51), and depending on the distance between the vehicle 10 and the object found in step S 13 in the flowchart shown in FIG. 3, the pattern size of the reference image is determined so as to be in proportion to the size of the image in real space projected on the display screen (step S 52).
  • Here, the determination of the pattern size of the reference image is carried out in the same manner as in the vertical straight line segment identification described above. That is, in the case that the distance between the vehicle 10 and the object is calculated as z = L[m] by the above Eq. 2, an object having a height B[m] and width A[m] at a position at the distance L[m] in real space is projected at a size of b×a [pixels] on the display screen. [0136]
  • b=f×B/L   Eq. 15
  • a=f×A/L   Eq. 16
  • Therefore, as shown in FIG. 20A, for example, the b×a [pixel] straight line segment pattern is extracted from the upper edge straight line segment image pattern prepared in advance, and the upper edge straight line segment extraction pattern “Pat_Line_U” serves as the reference pattern. Similarly, the b×a [pixel] lower edge straight line segment extraction pattern “Pat_Line_D” extracted from the lower edge straight line segment image pattern prepared in advance is shown in FIG. 20B. [0137]
  • When the reference pattern size for correlation calculation has been found, next the search zone in proximity to the object is set (step S [0138] 53).
  • Moreover, the setting of the search zone is also carried out similarly to the vertical straight line segment identification described above. That is, with respect to the center of the circumscribed quadrangle of the binary object image (OBJ), ranges of a [pixels] are set above, below, to the left of, and to the right of the binary object image, and these serve as the search range 103 for the correlation calculation. [0139]
  • Next, from within the search range in proximity to the object, using the correlation calculation a part (OBJ_Pat) having a high correlation with the upper edge straight line segment extraction pattern “Pat_Line_U” is searched for (step S [0140] 54).
  • In addition, it is identified whether or not a part having a high correlation with the upper edge straight line segment extraction pattern “Pat_Line_U” is present (step S55). [0141]
  • In step S55, in the case that a part having a high correlation with the upper edge straight line segment extraction pattern “Pat_Line_U” is present (YES in step S55), it is determined that there is a horizontal straight line segment in the object (step S56), the object, having a horizontal straight line segment, is treated as an artificial highway structure (step S57), and the horizontal straight line segment determination completes. [0142]
  • In contrast, in [0143] step S 55, in the case that a part having a high correlation with the upper edge straight line segment extracting pattern “Pat_Line_U” is not present (NO in step S 55), it is identified whether or not the reference pattern used in the correlation calculation is a lower edge straight line segment image pattern (step S 58).
  • In [0144] step S 58, in the case that the reference pattern used in the correlation calculation is not the lower edge straight line segment image pattern (NO in step S 58), a lower edge straight line segment image pattern prepared in advance is selected (step S 59), and the flow returns to step S 52.
  • In addition, in step S [0145] 52 and step S 53 described above, the same actions carried out on the upper edge straight line segment image pattern are carried out on the lower edge straight line segment image pattern, and the b×a [pixel] lower edge straight line segment extraction pattern “Pat_Line_D” extracted from the lower edge straight line segment image pattern shown in FIG. 20B serves as the reference pattern. Furthermore, in step S 54, the part (OBJ_Pat) having a high correlation with the lower edge straight line segment extraction pattern “Pat_Line_D” is found using the correlation calculation from within the search range in proximity to the object.
  • As a result of the correlation calculation using the lower edge straight line segment extraction pattern, the actions from step S [0146] 55 to step S 57 described above are carried out, and when it is identified that a horizontal straight line segment is present in the image, the object is treated as an artificial structure, and the horizontal straight line segment identification completes.
  • In addition, as a result of the correlation calculation using the lower edge straight line segment extraction pattern, when the flow proceeds to the identification of step S [0147] 58 again, because the search for horizontal straight line segments using both the upper edge straight line segment extraction pattern and the lower edge straight line segment extraction pattern have already completed (YES in step S 58), no horizontal straight line segments are identified as being present (step S 60), and the flow proceeds to the right angle segment identification.
  • Moreover, in the horizontal straight line segment identification described above, the reason that the distance between the vehicle 10 and the part having a high correlation is not calculated after carrying out the correlation calculation using the upper edge straight line segment extraction pattern and the lower edge straight line segment extraction pattern is that, based on the principle of binocular vision using the left and right cameras, the distance of a horizontal straight line segment cannot be calculated. Therefore, unlike the case of the vertical straight line segment identification, in the horizontal straight line segment identification, the identification is carried out based only on the correlation with the straight line pattern. [0148]
  • Next, the right angle segment determination will be explained with reference to the flowchart shown in FIG. 17. [0149]
  • In the right angle segment determination, first, in order to search for a right angle segment, the image of the object and an upper-right right angle segment image pattern, which is the reference image for carrying out the correlation calculation, are selected (step S [0150] 71). Depending on the distance between the vehicle 10 and the object found in step S 13 of the flowchart shown in FIG. 3, the pattern size of the reference image is determined so as to be in proportion to the size of the image in real space projected on the display screen (step S 72).
  • Here, the determination of the pattern size of the reference image is carried out in the same manner as that of the vertical straight line segment identification and the horizontal straight line segment identification. Specifically, in the case that the distance between the [0151] vehicle 10 and the object is calculated as z=L[m] by using the above Eq. 2, the object having a height A[m] and width A[m] at a position at distance L[m] in real space is projected at an a×a[pixel] size on the display screen.
  • a=f×A/L   Eq. 17
  • Therefore, as shown in FIG. 21A, from the upper-right right angle segment image pattern prepared in advance, for example, the a×a [pixel] right angle segment pattern is extracted, and the upper-right right angle segment extraction pattern “Pat_Corner_R” serves as the reference pattern. Similarly, the a×a [pixel] upper-left right angle segment extraction pattern “Pat_Corner_L” extracted from the upper-left right angle segment image pattern prepared in advance is shown in FIG. 21B. [0152]
  • When the reference size for the correlation calculation has been found, next the search zone in proximity to the object is set (step S [0153] 73).
  • Moreover, the setting of the search zone is also carried out similarly to the vertical straight line segment identification and the horizontal straight line segment identification described above. That is, with respect to the center of the circumscribed quadrangle of the binary object image (OBJ), ranges of a [pixels] are set above, below, to the left of, and to the right of the binary object image, and these serve as the search range for the correlation calculation. [0154]
  • Next, from within the search range in proximity to the object, using the correlation calculation, a part (OBJ_Pat) having a high correlation with the upper-right right angle segment extraction pattern “Pat_Corner_R” is searched for (step S [0155] 74).
  • In addition, it is determined whether or not a part having a high correlation with the upper-right right angle segment extraction pattern “Pat_Corner_R” is present (step S [0156] 75).
  • In step S75, in the case that a part having a high correlation with the upper-right right angle segment extraction pattern “Pat_Corner_R” is present (YES in step S75), the distance of the OBJ_Pat is calculated in the same manner as the distance calculation of the object using Eq. 2 above, in order to identify whether or not the part with a high correlation and the object are identical physical bodies (step S76). [0157]
  • Moreover, in the case that the actual distance between the vehicle 10 and the object is equal to the distance between the vehicle 10 and the part OBJ_Pat having a high correlation, the object and OBJ_Pat can be identified as being identical physical bodies, and thus, by comparing the detected parallaxes Δd and Δd_P instead of comparing the distances, it can be identified whether or not the object and OBJ_Pat are identical physical bodies (step S77). Specifically, using the above Eq. 14, it is identified whether or not the parallax error is smaller than the allowable value TH. [0158]
  • In step S77, in the case that the object and OBJ_Pat are identified as being identical physical bodies (YES in step S77), a right angle segment is identified as being present in the object (step S78), the object, having a right angle segment, is treated as an artificial highway structure (step S79), and the right angle segment identification completes. [0159]
  • In contrast, in [0160] step S 75, in the case that a part having a high correlation with the upper-right right angle segment extraction pattern “Pat_Corner_R” is not present (NO in step S 75), or in step S 77, in the case that the object and OBJ_Pat are not identified as identical physical bodies (NO in step S 77), the flow proceeds to step S 80, and it is identified whether or not the reference pattern used in the correlation calculation is an upper-left right angle segment image pattern (step S 80).
  • In step S80, in the case that the reference pattern used in the correlation calculation is not the upper-left right angle segment image pattern (NO in step S80), the upper-left right angle segment image pattern prepared in advance is selected (step S81), and the flow returns to step S72. [0161]
  • In addition, in step S [0162] 72 and step S 73 described above, the same action carried out for the upper-right right angle segment image pattern is carried out for the upper-left right angle segment image pattern, and the a×a [pixel] upper-left right angle segment extraction pattern “Pat_Corner_L” extracted from the upper-left right angle segment image pattern shown in FIG. 21B serves as the reference pattern. Furthermore, in step S 74, the part having a high correlation with the upper-left right angle segment extraction pattern “Pat_Corner_L” is searched for using the correlation calculation from within the search zone in proximity to the object.
  • As a result of the correlation calculation using the upper-left right angle segment extraction pattern, the actions from step S [0163] 75 to step S 79 are carried out, and when a right angle segment is identified as being present in the object, the object 101 is treated as an artificial highway structure, and the right angle segment identification completes.
  • In addition, as a result of the correlation calculation using the upper-left right angle segment extraction pattern, when the flow proceeds to the identification of step S [0164] 80 again, the search for right angle segments using both the upper-right right angle segment extraction pattern and the upper-left right angle segment extraction pattern has already completed (YES in step S 80), and thus no right angle segment is identified as being present (step S 82).
  • Therefore, it is determined that the object is not an artificial highway structure (step S [0165] 83), the right angle segment determination completes, and the processing in step S 27 to determine the shape of the object in FIG. 12 described above is executed.
  • Moreover, in the right angle segment identification described above, the reason that the correlation calculation is carried out using both the upper-right right angle segment extraction pattern and the upper-left right angle segment extraction pattern, and that the distance between the vehicle 10 and each part having a high correlation is compared to the distance between the object and the vehicle 10, is the same as in the case of the vertical straight line segment identification. [0166]
  • Next, the determination of identical shapes will be explained with reference to the figures. [0167]
  • As shown in FIG. 22, the identification of identical shapes is a process in which a highway structure 50 (for example, upper and lower round lenses disposed in a traffic signal) composed of a plurality of physical bodies having an identical shape is searched for from among the infrared images obtained by the infrared cameras. [0168]
  • The flowchart shown in FIG. 23 will be used in the explanation. First, in order to search for identical shapes, the image of the object and an object pattern “Pat”, which is a reference image for carrying out the correlation calculation, are set (step S91). [0169]
  • Here, the object pattern “Pat” is the reference image set to a zone one size larger than the binary object image (OBJ) 200, as shown in FIG. 24B, in the case, for example, that the part of the lens in the highway structure 50 that emits heat is extracted as the binary object image (OBJ) 200, as shown in FIG. 24A. [0170]
  • When the object pattern for the correlation calculation has been found, next, the search zone in proximity to the object is set (step S92). [0171]
  • Here, the setting of the search zone is carried out as follows. Specifically, as shown in FIG. 24A, the range of the search zone is set such that there is a height of a [pixel] above and below the binary object image 200 and a width of b/2 [pixel] to the right and left with respect to the center of the binary object image 200, and these serve as the upper search range 202 and the lower search range 203, respectively, used in the correlation calculation. [0172]
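The upper search range 202 and lower search range 203 of step S92 can be written down directly, as in the sketch below. The top-left-origin coordinate convention, the (top, left, height, width) zone representation, and the clamping to the frame bounds are assumptions added for illustration.

# Hedged sketch of the search-zone setting of step S92 (FIG. 24A): a zone of
# height a [pixel] directly above and below the binary object image 200, and
# width b/2 [pixel] to the left and right of its center (total width b).

def identical_shape_search_zones(obj_top, obj_bottom, obj_center_col,
                                 a, b, img_height, img_width):
    """Return (upper_zone, lower_zone), each as (top, left, height, width).
    obj_bottom is the first row below the binary object image."""
    left = max(0, obj_center_col - b // 2)
    width = min(b, img_width - left)
    upper_zone = (max(0, obj_top - a), left, min(a, obj_top), width)
    lower_zone = (obj_bottom, left, min(a, img_height - obj_bottom), width)
    return upper_zone, lower_zone

# Example: object occupies rows 100..139 (obj_bottom exclusive), centered on
# column 320, with a = 40 and b = 60, in a 480 x 640 frame
print(identical_shape_search_zones(100, 140, 320, 40, 60, 480, 640))
# -> ((60, 290, 40, 60), (140, 290, 40, 60))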
  • Next, a part (OBJ_Pat) having a high correlation with the object pattern “Pat” is searched for using the correlation calculation from within the upper search range 202 and the lower search range 203 in proximity to the object (step S93). [0173]
  • In addition, it is identified whether or not a part having a high correlation with the object pattern “Pat” is present (step S94). [0174]
  • In step S94, in the case that a part having a high correlation with the object pattern “Pat” is present (YES in step S94), a shape identical to the object is identified as being present (step S95), an object having an identical shape nearby is treated as being an artificial highway structure (step S96), and the identical shape identification completes. Moreover, in the example in FIG. 22, from the center of the infrared image, a highway structure (traffic signal) having a plurality (2) of identical objects (round lenses) is detected. [0175]
  • In contrast, in step S94, in the case that a part having a high correlation with the object pattern “Pat” is not present (NO in step S94), no shape identical to the object is identified as being present (step S97), an object having no identical shape nearby is treated as not being an artificial highway structure (step S98), and the identical shape identification completes. [0176]
  • Moreover, in the identical shape identification described above, the search zone in which the object pattern is searched for was set in the vertical direction of the binary object image (OBJ) 200; however, because physical bodies having an identical shape may also be arranged left to right, after searching in the vertical direction, the search zone can also be set to the left and right, and the object pattern searched for there. [0177]
  • In addition, in the present embodiment, the image processing unit 1 comprises an object extraction device, an artificial structure identification device, an artificial structure elimination device, and a reference image dimension altering device. More concretely, S1 to S7 in FIG. 3 correspond to the object extraction device, S25 to S28 in FIG. 12 correspond to the artificial structure identification device, and S30 in FIG. 12 corresponds to the artificial structure elimination device. Furthermore, S32 in FIG. 15, S52 in FIG. 16, and S72 in FIG. 17 correspond to the reference image dimension altering device. [0178]
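The reference image dimension altering device (S32, S52, S72) can be illustrated by rescaling the reference pattern to the apparent size the structure would have at the measured distance. The pinhole relation used below (pixel size proportional to focal length times real size divided by distance) and the nearest-neighbour resampling are assumptions for the sketch, not the equations of the patent.

import numpy as np

# Hedged sketch of reference image dimension altering: the apparent size of a
# structure of fixed real size shrinks with distance, so the reference pattern
# is rescaled before the correlation calculation. The relation
# size_px ~ f * real_size / (distance * pixel_pitch) is an assumed pinhole model.

def altered_pattern_size(real_size_m, distance_m, f_m, pixel_pitch_m):
    """Apparent side length, in pixels, of a structure of known real size."""
    return max(1, round(f_m * real_size_m / (distance_m * pixel_pitch_m)))

def resize_nearest(pattern, new_size):
    """Nearest-neighbour resize of a square reference pattern to new_size x new_size."""
    old = pattern.shape[0]
    idx = (np.arange(new_size) * old / new_size).astype(int)
    return pattern[np.ix_(idx, idx)]

# Example: a 0.5 m wide element seen at 20 m with an assumed 8 mm lens and 12 um pixels
size = altered_pattern_size(0.5, 20.0, 8e-3, 12e-6)
pattern = resize_nearest(np.ones((32, 32)), size)
print(size, pattern.shape)  # -> 17 (17, 17)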
  • In addition, in the embodiment described above, the case in which infrared cameras 2R and 2L were used as the photographing device was described, but a television camera that can detect only normal visible light, such as that disclosed in Japanese Unexamined Patent Application, First Publication, No. Hei 2-26490, can also be used. However, by using an infrared camera, the extraction processing of animals, other traveling vehicles, or the like can be shortened, and thus the processing can be realized with a relatively low computational capacity of the computation apparatus. In addition, in the embodiment described above, the example of monitoring in front of the vehicle was provided, but the back of the vehicle or any other direction can be monitored. [0179]
  • As explained above, the result of monitoring the environment in the vicinity of the vehicle is classified into moving physical bodies, such as pedestrians and animals, and artificial highway structures; thus, for example, in the case that the environment in the vicinity of the vehicle is displayed to the driver of the vehicle, the method of displaying these objects can be varied, and the driver can be appropriately notified of physical bodies to which more careful attention should be paid. [0180]
  • In addition, for example, in the case that the information about these physical bodies is used in vehicle control, depending on the classification and importance of the physical bodies, they can be used as determination material for altering the order of the vehicle control. [0181]
  • As described above, according to a first aspect of the present invention, the images of a plurality of objects that emit heat, present in an infrared image photographed by a photographing device, are compared with a reference image, and it becomes possible to distinguish whether each physical body is an artificial structure having a determined shape or a moving physical body such as a pedestrian or animal. [0182]
  • Therefore, by classifying objects extracted from the infrared image into artificial structures and natural structures, there is the effect that physical bodies that are important and to which more careful attention should be paid in relation to the travel of the vehicle can be reliably recognized. [0183]
  • In addition, compared to the case in which pedestrians and animals having indefinite shapes are extracted by identifying the shape of the object itself, physical bodies having determined shapes are detected, so the effect is obtained that recognition of the object is carried out with less computation and higher detection precision. [0184]
  • According to a second aspect of the present invention, in order to extract objects excluding artificial structures to which attention should be paid, artificial structures are eliminated from the objects extracted from the infrared image, and the remaining objects can be recognized as moving physical bodies. [0185]
  • Therefore, by excluding artificial structures and treating only objects that are not artificial structures extracted from the infrared image, the effect is attained that the recognition of important physical bodies can be improved. [0186]
  • In addition, according to a third aspect of the present invention, by identifying whether or not there is a straight line segment that simply characterizes artificial structures in the image, objects having straight line segments can be excluded as artificial structures, and objects that are not artificial structures can be recognized. [0187]
  • Therefore, highway structures can be removed from the infrared image comparatively easily, and the effect can be attained that the detection precision of pedestrians and animals that have an indefinite shape can be improved. [0188]
  • According to a fourth aspect of the present invention, by compensating for differences in the sizes between the object image and reference image produced by the distance between the object and vehicle and comparing both with an appropriate size, the effect is attained that the precision in detecting whether or not the object is an artificial structure is improved. [0189]
  • Therefore, there are the effects that detection errors due to the distance between the vehicle and the object can be avoided and environmental monitoring in the vicinity of the vehicle can be carried out over a wide area. [0190]
  • In this manner, because physical bodies that move, such as pedestrians and animals, are recognized as distinct from artificial highway structures, the information about these physical bodies can be used in vehicle control, and in the case that this information is displayed as information or warnings to the driver of the vehicle, it can be used as material for determining whether to alter the display method of the information and warnings, or the control method of the vehicle, depending on the content and importance of the object. [0191]

Claims (4)

What is claimed is:
1. A vehicle zone monitoring apparatus that detects a physical body present in the vicinity of the vehicle from infrared images photographed by a photographing device, comprising:
an object extraction device that extracts an object that emits infrared radiation from said infrared images; and
an artificial structure identifying device that compares an image of an object extracted by said object extraction device to a reference image of an element that defines an artificial structure and identifies whether or not said object is an artificial structure.
2. A vehicle zone monitoring apparatus according to claim 1 that comprises an artificial structure eliminating device that eliminates objects identified to be artificial structures by said artificial structure identifying device from the objects extracted by said object extraction device.
3. A vehicle zone monitoring apparatus according to claim 1 or claim 2 wherein:
said reference image includes an image that represents a straight line segment; and
said artificial structure identifying device identifies objects that include a straight line segment as an artificial structure.
4. A vehicle zone monitoring apparatus according to any of claim 1 to claim 3 wherein:
said artificial structure identifying device comprises a reference image dimension altering device that alters the size of said reference image so as to be in proportion to the distance between said vehicle and said object.
US10/171,007 2001-06-28 2002-06-12 Vehicle zone monitoring apparatus Abandoned US20030007074A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/287,433 US8144195B2 (en) 2001-06-28 2008-10-09 Vehicle zone monitoring apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2001-197312 2001-06-28
JP2001197312A JP2003016429A (en) 2001-06-28 2001-06-28 Vehicle periphery monitor device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/287,433 Continuation US8144195B2 (en) 2001-06-28 2008-10-09 Vehicle zone monitoring apparatus

Publications (1)

Publication Number Publication Date
US20030007074A1 true US20030007074A1 (en) 2003-01-09

Family

ID=19034940

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/171,007 Abandoned US20030007074A1 (en) 2001-06-28 2002-06-12 Vehicle zone monitoring apparatus
US12/287,433 Active 2024-04-30 US8144195B2 (en) 2001-06-28 2008-10-09 Vehicle zone monitoring apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/287,433 Active 2024-04-30 US8144195B2 (en) 2001-06-28 2008-10-09 Vehicle zone monitoring apparatus

Country Status (3)

Country Link
US (2) US20030007074A1 (en)
JP (1) JP2003016429A (en)
DE (1) DE10228638B4 (en)

Families Citing this family (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10348109A1 (en) 2003-10-16 2005-05-19 Bayerische Motoren Werke Ag Method and device for visualizing a vehicle environment
JP3922245B2 (en) * 2003-11-20 2007-05-30 日産自動車株式会社 Vehicle periphery monitoring apparatus and method
DE10355474A1 (en) * 2003-11-27 2005-06-23 Daimlerchrysler Ag Motor vehicle distance warning system has an additional classification unit for classifying detected obstacles into at least living and non-living classes
DE102004010197B4 (en) * 2004-03-02 2015-04-16 Sick Ag Method for checking the function of a position-finding or environment detection device of a vehicle or for controlling a digital map
JP3987057B2 (en) 2004-06-14 2007-10-03 本田技研工業株式会社 Vehicle periphery monitoring device
DE102004054072A1 (en) * 2004-11-09 2006-05-11 Siemens Ag Sensor system for detecting a pedestrian impact
JP4611919B2 (en) * 2006-03-23 2011-01-12 本田技研工業株式会社 Pedestrian recognition device
JP4171501B2 (en) 2006-04-25 2008-10-22 本田技研工業株式会社 Vehicle periphery monitoring device
JP4813304B2 (en) * 2006-09-19 2011-11-09 本田技研工業株式会社 Vehicle periphery monitoring device
DE102007021643B4 (en) 2007-05-09 2017-11-30 Docter Optics Se Camera for digital image processing, control or support device with such a camera and vehicle with such a camera
DE102007025108A1 (en) 2007-05-30 2008-12-11 Docter Optics Gmbh Lens especially for a driver assistance system
JP5108605B2 (en) * 2008-04-23 2012-12-26 三洋電機株式会社 Driving support system and vehicle
JP4864953B2 (en) * 2008-10-08 2012-02-01 本田技研工業株式会社 Vehicle periphery monitoring device
EP2364575B1 (en) 2008-11-17 2016-01-27 Express Imaging Systems, LLC Electronic control to regulate power for solid-state lighting and methods thereof
CN201402328Y (en) * 2009-03-30 2010-02-10 德尔福技术有限公司 Safety detection device for distance from vehicle
KR20120032472A (en) * 2009-05-01 2012-04-05 익스프레스 이미징 시스템즈, 엘엘씨 Gas-discharge lamp replacement with passive cooling
WO2010135582A2 (en) 2009-05-20 2010-11-25 Express Imaging Systems, Llc Apparatus and method of energy efficient illumination
US8872964B2 (en) 2009-05-20 2014-10-28 Express Imaging Systems, Llc Long-range motion detection for illumination control
TW201146015A (en) * 2010-06-14 2011-12-16 Hon Hai Prec Ind Co Ltd Monitoring system and method
WO2012053102A1 (en) * 2010-10-22 2012-04-26 パイオニア株式会社 Terminal device, image processing method and image processing program executed by terminal device
EP2642463A4 (en) 2010-11-16 2014-06-25 Honda Motor Co Ltd Peripheral monitoring device for vehicle
JP5533766B2 (en) * 2011-04-05 2014-06-25 株式会社デンソー Vehicle display device
US8901825B2 (en) 2011-04-12 2014-12-02 Express Imaging Systems, Llc Apparatus and method of energy efficient illumination using received signals
US9360198B2 (en) 2011-12-06 2016-06-07 Express Imaging Systems, Llc Adjustable output solid-state lighting device
US9497393B2 (en) * 2012-03-02 2016-11-15 Express Imaging Systems, Llc Systems and methods that employ object recognition
JP5480925B2 (en) * 2012-03-05 2014-04-23 本田技研工業株式会社 Vehicle periphery monitoring device
US9210751B2 (en) 2012-05-01 2015-12-08 Express Imaging Systems, Llc Solid state lighting, drive circuit and method of driving same
US9204523B2 (en) 2012-05-02 2015-12-01 Express Imaging Systems, Llc Remotely adjustable solid-state lamp
US9131552B2 (en) 2012-07-25 2015-09-08 Express Imaging Systems, Llc Apparatus and method of operating a luminaire
US8896215B2 (en) 2012-09-05 2014-11-25 Express Imaging Systems, Llc Apparatus and method for schedule based operation of a luminaire
US9301365B2 (en) 2012-11-07 2016-03-29 Express Imaging Systems, Llc Luminaire with switch-mode converter power monitoring
US9210759B2 (en) 2012-11-19 2015-12-08 Express Imaging Systems, Llc Luminaire with ambient sensing and autonomous control capabilities
US9288873B2 (en) 2013-02-13 2016-03-15 Express Imaging Systems, Llc Systems, methods, and apparatuses for using a high current switching device as a logic level sensor
US9466443B2 (en) 2013-07-24 2016-10-11 Express Imaging Systems, Llc Photocontrol for luminaire consumes very low power
JP2015024713A (en) * 2013-07-25 2015-02-05 トヨタ自動車株式会社 Collision determination device
US9414449B2 (en) 2013-11-18 2016-08-09 Express Imaging Systems, Llc High efficiency power controller for luminaire
WO2015116812A1 (en) 2014-01-30 2015-08-06 Express Imaging Systems, Llc Ambient light control in solid state lamps and luminaires
US9572230B2 (en) 2014-09-30 2017-02-14 Express Imaging Systems, Llc Centralized control of area lighting hours of illumination
WO2016064542A1 (en) 2014-10-24 2016-04-28 Express Imaging Systems, Llc Detection and correction of faulty photo controls in outdoor luminaires
US9462662B1 (en) 2015-03-24 2016-10-04 Express Imaging Systems, Llc Low power photocontrol for luminaire
US9538612B1 (en) 2015-09-03 2017-01-03 Express Imaging Systems, Llc Low power photocontrol for luminaire
US9924582B2 (en) 2016-04-26 2018-03-20 Express Imaging Systems, Llc Luminaire dimming module uses 3 contact NEMA photocontrol socket
DE102016215058A1 (en) 2016-08-12 2018-02-15 Bayerische Motoren Werke Aktiengesellschaft Improved control of the interior temperature
US10230296B2 (en) 2016-09-21 2019-03-12 Express Imaging Systems, Llc Output ripple reduction for power converters
US9985429B2 (en) 2016-09-21 2018-05-29 Express Imaging Systems, Llc Inrush current limiter circuit
US10098212B2 (en) 2017-02-14 2018-10-09 Express Imaging Systems, Llc Systems and methods for controlling outdoor luminaire wireless network using smart appliance
US10904992B2 (en) 2017-04-03 2021-01-26 Express Imaging Systems, Llc Systems and methods for outdoor luminaire wireless control
US11375599B2 (en) 2017-04-03 2022-06-28 Express Imaging Systems, Llc Systems and methods for outdoor luminaire wireless control
US10219360B2 (en) 2017-04-03 2019-02-26 Express Imaging Systems, Llc Systems and methods for outdoor luminaire wireless control
US10568191B2 (en) 2017-04-03 2020-02-18 Express Imaging Systems, Llc Systems and methods for outdoor luminaire wireless control
CN107791952A (en) * 2017-09-25 2018-03-13 长安大学 One kind driving blind area living body detection warning system and method
WO2019177019A1 (en) * 2018-03-14 2019-09-19 パナソニックIpマネジメント株式会社 Traffic signal recognizing device, traffic signal recognition method, and program
US11234304B2 (en) 2019-05-24 2022-01-25 Express Imaging Systems, Llc Photocontroller to control operation of a luminaire having a dimming line
US10915766B2 (en) * 2019-06-28 2021-02-09 Baidu Usa Llc Method for detecting closest in-path object (CIPO) for autonomous driving
US11212887B2 (en) 2019-11-04 2021-12-28 Express Imaging Systems, Llc Light having selectively adjustable sets of solid state light sources, circuit and method of operation thereof, to provide variable output characteristics

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5414439A (en) * 1994-06-09 1995-05-09 Delco Electronics Corporation Head up display with night vision enhancement
JPH09119982A (en) 1995-10-26 1997-05-06 Mitsubishi Heavy Ind Ltd Missile guiding system
JP3727400B2 (en) 1996-02-22 2005-12-14 株式会社日本自動車部品総合研究所 Crossing detection device
US5937077A (en) * 1996-04-25 1999-08-10 General Monitors, Incorporated Imaging flame detection system
JP3515926B2 (en) 1999-06-23 2004-04-05 本田技研工業株式会社 Vehicle periphery monitoring device
JP2001108758A (en) 1999-10-06 2001-04-20 Matsushita Electric Ind Co Ltd Human detector

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5341142A (en) * 1987-07-24 1994-08-23 Northrop Grumman Corporation Target acquisition and tracking system
US5461357A (en) * 1992-01-29 1995-10-24 Mazda Motor Corporation Obstacle detection device for vehicle
US5410346A (en) * 1992-03-23 1995-04-25 Fuji Jukogyo Kabushiki Kaisha System for monitoring condition outside vehicle using imaged picture by a plurality of television cameras
US5488674A (en) * 1992-05-15 1996-01-30 David Sarnoff Research Center, Inc. Method for fusing images and apparatus therefor
US6411328B1 (en) * 1995-12-01 2002-06-25 Southwest Research Institute Method and apparatus for traffic incident detection
US5764136A (en) * 1996-03-27 1998-06-09 Her Majesty The Queen In Right Of Canada As Represented By Communications Research Centre Ultrasonic detection system for safety of vehicle passengers
US5880777A (en) * 1996-04-15 1999-03-09 Massachusetts Institute Of Technology Low-light-level imaging and image processing
US5831669A (en) * 1996-07-09 1998-11-03 Ericsson Inc Facility monitoring system with image memory and correlation
US6720920B2 (en) * 1997-10-22 2004-04-13 Intelligent Technologies International Inc. Method and arrangement for communicating between vehicles
US6151539A (en) * 1997-11-03 2000-11-21 Volkswagen Ag Autonomous vehicle arrangement and method for controlling an autonomous vehicle
US6570608B1 (en) * 1998-09-30 2003-05-27 Texas Instruments Incorporated System and method for detecting interactions of people and vehicles
US6449384B2 (en) * 1998-10-23 2002-09-10 Facet Technology Corp. Method and apparatus for rapidly determining whether a digitized image frame contains an object of interest
US6625315B2 (en) * 1998-10-23 2003-09-23 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US6788817B1 (en) * 1999-11-04 2004-09-07 Honda Giken Kogyo Kabushikikaisha Object recognition system
US6658150B2 (en) * 1999-12-02 2003-12-02 Honda Giken Kogyo Kabushiki Kaisha Image recognition system
US6445832B1 (en) * 2000-10-10 2002-09-03 Lockheed Martin Corporation Balanced template tracker for tracking an object image sequence
US6737963B2 (en) * 2001-03-30 2004-05-18 Koninklijke Philips Electronics N.V. Driver tailgating and following aid

Cited By (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7141796B2 (en) * 2001-10-29 2006-11-28 Honda Giken Kogyo Kabushiki Kaisha Vehicle information providing apparatus
US20030083790A1 (en) * 2001-10-29 2003-05-01 Honda Giken Kogyo Kabushiki Kaisha Vehicle information providing apparatus
US20040183906A1 (en) * 2003-03-20 2004-09-23 Nobuharu Nagaoka Device for monitoring around vehicle
US7330568B2 (en) 2003-03-20 2008-02-12 Honda Motor Co., Ltd. Device for monitoring around vehicle
US7956889B2 (en) 2003-06-04 2011-06-07 Model Software Corporation Video surveillance system
US7859564B2 (en) 2003-06-04 2010-12-28 Model Software Corporation Video surveillance system
US20080211907A1 (en) * 2003-06-04 2008-09-04 Model Software Corporation Video surveillance system
US20080030579A1 (en) * 2003-06-04 2008-02-07 Model Software Corporation Video surveillance system
US20040246336A1 (en) * 2003-06-04 2004-12-09 Model Software Corporation Video surveillance system
US8605155B2 (en) * 2003-06-04 2013-12-10 Model Software Corporation Video surveillance system
US7605856B2 (en) * 2003-09-08 2009-10-20 Autonetworks Technologies, Ltd. Camera unit and apparatus for monitoring vehicle periphery
US20050083427A1 (en) * 2003-09-08 2005-04-21 Autonetworks Technologies, Ltd. Camera unit and apparatus for monitoring vehicle periphery
US7747348B2 (en) * 2003-12-23 2010-06-29 Samsung Electronics Co., Ltd. Method and apparatus for using rotational movement amount of mobile device and computer-readable recording medium for storing computer program
US20050137750A1 (en) * 2003-12-23 2005-06-23 Samsung Electronics Co., Ltd. Method and apparatus for using rotational movement amount of mobile device and computer-readable recording medium for storing computer program
US20080012942A1 (en) * 2003-12-25 2008-01-17 Niles Co., Ltd. Imaging System
EP1558026A2 (en) * 2004-01-23 2005-07-27 Nissan Motor Company, Limited On-vehicle night vision camera system, display device and display method
US20060043295A1 (en) * 2004-01-23 2006-03-02 Nissan Motor Co., Ltd. On-vehicle night vision camera system, display device and display method
EP1558026A3 (en) * 2004-01-23 2005-08-03 Nissan Motor Company, Limited On-vehicle night vision camera system, display device and display method
US7078692B2 (en) 2004-01-23 2006-07-18 Nissan Motor Co., Ltd. On-vehicle night vision camera system, display device and display method
CN100413324C (en) * 2004-01-23 2008-08-20 日产自动车株式会社 On-vehicle night vision camera system, display device and display method
US20050231339A1 (en) * 2004-02-17 2005-10-20 Fuji Jukogyo Kabushiki Kaisha Outside-vehicle monitoring system
US7567687B2 (en) * 2004-02-17 2009-07-28 Fuji Jukogyo Kabushiki Kaisha Outside-vehicle monitoring system
US7489805B2 (en) 2004-11-30 2009-02-10 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US20060115118A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US20060115117A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co. Ltd. Position detecting apparatus and method of correcting data therein
US20060115163A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co., Ltd. Apparatus for and method of extracting image
US20060115119A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7969466B2 (en) 2004-11-30 2011-06-28 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7388476B2 (en) 2004-11-30 2008-06-17 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US20060204037A1 (en) * 2004-11-30 2006-09-14 Honda Motor Co., Ltd. Vehicle vicinity monitoring apparatus
US20060126898A1 (en) * 2004-11-30 2006-06-15 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US20060115122A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7515737B2 (en) 2004-11-30 2009-04-07 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7526104B2 (en) 2004-11-30 2009-04-28 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7545955B2 (en) 2004-11-30 2009-06-09 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7561719B2 (en) 2004-11-30 2009-07-14 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7567688B2 (en) 2004-11-30 2009-07-28 Honda Motor Co., Ltd. Apparatus for and method of extracting image
US20060126896A1 (en) * 2004-11-30 2006-06-15 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
US7590263B2 (en) 2004-11-30 2009-09-15 Honda Motor Co., Ltd. Vehicle vicinity monitoring apparatus
US7599521B2 (en) 2004-11-30 2009-10-06 Honda Motor Co., Ltd. Vehicle vicinity monitoring apparatus
US20060114320A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co. Ltd. Position detecting apparatus and method of correcting data therein
US7616806B2 (en) 2004-11-30 2009-11-10 Honda Motor Co., Ltd. Position detecting apparatus and method of correcting data therein
US7620237B2 (en) 2004-11-30 2009-11-17 Honda Motor Co., Ltd. Position detecting apparatus and method of correcting data therein
US20060115126A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co., Ltd. Vehicle vicinity monitoring apparatus
US8872919B2 (en) * 2005-02-24 2014-10-28 Aisin Seiki Kabushiki Kaisha Vehicle surrounding monitoring device
US20110169955A1 (en) * 2005-02-24 2011-07-14 Aisin Seiki Kabushiki Kaisha Vehicle surrounding monitoring device
US8063935B2 (en) * 2005-03-31 2011-11-22 Valeo Vision Method for the early detection of the arrival of a motor vehicle in a dark sector
US20060222208A1 (en) * 2005-03-31 2006-10-05 Joel Leleve Method for the early detection of the arrival of a motor vehicle in a dark sector
US20070107966A1 (en) * 2005-11-15 2007-05-17 Leuze Lumiflex Gmbh & Co. Kg Method for identifying an object within a protective zone with a protective device for a vehicle
US7743865B2 (en) * 2005-11-15 2010-06-29 Leuze Lumiflex Gmbh & Co. Kg Method for identifying an object within a protective zone with a protective device for a vehicle
US7831098B2 (en) * 2006-11-07 2010-11-09 Recognition Robotics System and method for visual searching of objects using lines
US20080107345A1 (en) * 2006-11-07 2008-05-08 Simon Melikian System and method for visual searching of objects using lines
US20100052885A1 (en) * 2006-11-10 2010-03-04 Mattias Hanqvist Object detection system
WO2008057042A1 (en) 2006-11-10 2008-05-15 Autoliv Development Ab An object detection system
US8446269B2 (en) * 2006-11-10 2013-05-21 Autoliv Development Ab Object detection system
US20110096956A1 (en) * 2008-06-12 2011-04-28 Honda Motor Co., Ltd. Vehicle periphery monitoring device
US8189868B2 (en) * 2008-06-12 2012-05-29 Honda Motor Co., Ltd. Vehicle periphery monitoring device
US9292909B2 (en) * 2009-06-03 2016-03-22 Flir Systems, Inc. Selective image correction for infrared imaging devices
EP2579230A1 (en) * 2010-07-09 2013-04-10 Honda Motor Co., Ltd. Device for monitoring vicinity of vehicle
US9158738B2 (en) 2010-07-09 2015-10-13 Honda Motor Co., Ltd. Apparatus for monitoring vicinity of a vehicle
EP2579230A4 (en) * 2010-07-09 2013-11-06 Honda Motor Co Ltd Device for monitoring vicinity of vehicle
US20120105221A1 (en) * 2010-10-29 2012-05-03 National Taiwan University Of Science And Technology Real-time warning system for vehicle windshield and performing method thereof
US9858488B2 (en) * 2011-03-18 2018-01-02 Any Co. Ltd. Image processing device, method thereof, and moving body anti-collision device
US20120236122A1 (en) * 2011-03-18 2012-09-20 Any Co. Ltd. Image processing device, method thereof, and moving body anti-collision device
US20130070096A1 (en) * 2011-06-02 2013-03-21 Panasonic Corporation Object detection device, object detection method, and object detection program
US9152887B2 (en) * 2011-06-02 2015-10-06 Panasonic Intellectual Property Management Co., Ltd. Object detection device, object detection method, and object detection program
US10063836B2 (en) * 2011-09-06 2018-08-28 Jaguar Land Rover Limited Terrain visualization for a vehicle and vehicle driver
US20140247328A1 (en) * 2011-09-06 2014-09-04 Jaguar Land Rover Limited Terrain visualization for a vehicle and vehicle driver
WO2013066351A1 (en) * 2011-11-04 2013-05-10 Empire Technology Development Llc Ir signal capture for images
US8976249B2 (en) 2011-11-04 2015-03-10 Empire Technology Development Llc IR signal capture for images
US9398288B2 (en) * 2011-11-04 2016-07-19 Empire Technology Development Llc IR signal capture for images
US20150145960A1 (en) * 2011-11-04 2015-05-28 Empire Technology Development Llc Ir signal capture for images
US20130271562A1 (en) * 2012-04-17 2013-10-17 Electronics And Telecommunications Research Institute User recognition apparatus and method
US9432594B2 (en) * 2012-04-17 2016-08-30 Electronics And Telecommunications Research Institute User recognition apparatus and method
US9738253B2 (en) * 2012-05-15 2017-08-22 Aps Systems, Llc. Sensor system for motor vehicle
US20130311035A1 (en) * 2012-05-15 2013-11-21 Aps Systems, Llc Sensor system for motor vehicle
US20140003670A1 (en) * 2012-06-29 2014-01-02 Honda Motor Co., Ltd. Vehicle surroundings monitoring device
US9064158B2 (en) * 2012-06-29 2015-06-23 Honda Motor Co., Ltd Vehicle surroundings monitoring device
US9928737B2 (en) * 2013-05-27 2018-03-27 Ekin Teknoloji Sanayi Ve Ticaret Anonim Sirketi Mobile number plate recognition and speed detection system
CN104794936A (en) * 2013-12-31 2015-07-22 国际商业机器公司 Method and system for vehicle anti-collision
US10525882B2 (en) 2013-12-31 2020-01-07 International Business Machines Corporation Vehicle collision avoidance
US10065562B2 (en) * 2013-12-31 2018-09-04 International Business Machines Corporation Vehicle collision avoidance
US20150329044A1 (en) * 2013-12-31 2015-11-19 International Business Machines Corporation Vehicle collision avoidance
US10429492B2 (en) * 2014-09-24 2019-10-01 Denso Corporation Apparatus for calculating misalignment quantity of beam sensor
CN105761546A (en) * 2014-12-16 2016-07-13 中国移动通信集团公司 Vehicle collision prevention method, device and system
CN106370162A (en) * 2015-07-24 2017-02-01 丰田自动车株式会社 Animal type determination device
US11170232B2 (en) * 2016-08-01 2021-11-09 Connaught Electronics Ltd. Method for capturing an object in an environmental region of a motor vehicle with prediction of the movement of the object, camera system as well as motor vehicle
CN106600628A (en) * 2016-12-13 2017-04-26 广州紫川电子科技有限公司 Target object identification method and device based on infrared thermal imaging system
WO2018226437A1 (en) * 2017-06-05 2018-12-13 Adasky, Ltd. Shutterless far infrared (fir) camera for automotive safety and driving systems
US10699386B2 (en) 2017-06-05 2020-06-30 Adasky, Ltd. Techniques for scene-based nonuniformity correction in shutterless FIR cameras
US10819919B2 (en) 2017-06-05 2020-10-27 Adasky, Ltd. Shutterless far infrared (FIR) camera for automotive safety and driving systems
US10929955B2 (en) 2017-06-05 2021-02-23 Adasky, Ltd. Scene-based nonuniformity correction using a convolutional recurrent neural network
US11012594B2 (en) 2017-06-05 2021-05-18 Adasky, Ltd. Techniques for correcting oversaturated pixels in shutterless FIR cameras
US10511793B2 (en) 2017-06-05 2019-12-17 Adasky, Ltd. Techniques for correcting fixed pattern noise in shutterless FIR cameras
US11472404B2 (en) * 2017-09-01 2022-10-18 Murakami Corporation Collision prediction device, collision prediction method, and program
US11317497B2 (en) 2019-06-20 2022-04-26 Express Imaging Systems, Llc Photocontroller and/or lamp with photocontrols to control operation of lamp
US20220262017A1 (en) * 2019-07-18 2022-08-18 Toyota Motor Europe Method for calculating information relative to a relative speed between an object and a camera
US11836933B2 (en) * 2019-07-18 2023-12-05 Toyota Motor Europe Method for calculating information relative to a relative speed between an object and a camera

Also Published As

Publication number Publication date
US20090046151A1 (en) 2009-02-19
DE10228638B4 (en) 2007-12-20
JP2003016429A (en) 2003-01-17
DE10228638A1 (en) 2003-01-16
US8144195B2 (en) 2012-03-27

Similar Documents

Publication Publication Date Title
US8144195B2 (en) Vehicle zone monitoring apparatus
US7141796B2 (en) Vehicle information providing apparatus
US7474765B2 (en) Image recognition apparatus
US8175331B2 (en) Vehicle surroundings monitoring apparatus, method, and program
JP4410292B1 (en) Vehicle periphery monitoring device
US7969466B2 (en) Vehicle surroundings monitoring apparatus
JP2001006096A (en) Peripheral part monitoring device for vehicle
JP2002298298A (en) Periphery monitoring device for vehicle
US7388476B2 (en) Vehicle surroundings monitoring apparatus
JP2001169310A (en) Distance detector
US20060126897A1 (en) Vehicle surroundings monitoring apparatus
US7526104B2 (en) Vehicle surroundings monitoring apparatus
JP4425852B2 (en) Vehicle periphery monitoring device
US20060115122A1 (en) Vehicle surroundings monitoring apparatus
JP3916930B2 (en) Approach warning device
JP2001018738A (en) Apparatus for monitoring ambient condition of vehicle
JP3949628B2 (en) Vehicle periphery monitoring device
JP4943403B2 (en) Vehicle periphery monitoring device
JP4567072B2 (en) Vehicle periphery monitoring device
US7545955B2 (en) Vehicle surroundings monitoring apparatus
JP3961269B2 (en) Obstacle alarm device
JP2004362265A (en) Infrared image recognition device
JP3859429B2 (en) Moving object detection device
JP4358183B2 (en) Vehicle periphery monitoring device

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONDA GIKEN KOGYO KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAGAOKA, NOBUHARU;TSUJI, TAKAYUKI;WATANABE, MASAHITO;AND OTHERS;REEL/FRAME:013015/0367

Effective date: 20020513

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: ARRIVER SOFTWARE AB, SWEDEN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VEONEER SWEDEN AB;REEL/FRAME:059596/0826

Effective date: 20211230