Patent 2392652 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2392652
(54) English Title: SYSTEM AND METHOD FOR ESTIMATING EGO-MOTION OF A MOVING VEHICLE USING SUCCESSIVE IMAGES RECORDED ALONG THE VEHICLE'S PATH OF MOTION
(54) French Title: SYSTEME ET PROCEDE D'ESTIMATION DE L'AUTO-DEPLACEMENT D'UN VEHICULE EN MOUVEMENT AU MOYEN D'IMAGES SUCCESSIVES ENREGISTREES LE LONG DE LA TRAJECTOIRE DE DEPLACEMENT DU VEHICULE
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04N 5/14 (2006.01)
  • G01S 11/12 (2006.01)
  • G05D 1/02 (2006.01)
  • G06K 9/00 (2006.01)
  • G06T 7/20 (2006.01)
  • H04B 1/66 (2006.01)
  • H04N 7/18 (2006.01)
  • G01S 5/16 (2006.01)
(72) Inventors :
  • SHASHUA, AMNON (Israel)
  • STEIN, GIDEON (Israel)
  • MANO, OFER (Israel)
(73) Owners :
  • SHASHUA, AMNON (Not Available)
  • STEIN, GIDEON (Not Available)
  • MANO, OFER (Not Available)
(71) Applicants :
  • MOBILEYE, INC. (United States of America)
(74) Agent: SIM & MCBURNEY
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 2000-11-27
(87) Open to Public Inspection: 2001-05-31
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2000/032143
(87) International Publication Number: WO2001/039120
(85) National Entry: 2002-05-27

(30) Application Priority Data:
Application No. Country/Territory Date
60/167,587 United States of America 1999-11-26
60/230,166 United States of America 2000-09-01

Abstracts

English Abstract




An ego-motion determination system (fig. 1, item 12) is disclosed for
generating an estimate as to the ego-motion of a vehicle (fig. 1, item 10)
moving along a roadway (fig. 1, item 11). The ego-motion determination system
includes an image information receiver (fig. 1, item 13) and a processor (fig.
1, item 14). The image information receiver (fig. 1, item 13) is configured to
receive image information relating to a series of at least two images recorded
as the vehicle moves along a roadway (fig. 1, item 11). The processor is
configured to process the image information received by the image receiver to
generate an ego-motion estimate of the vehicle, including the translation of
the vehicle in the forward direction and the rotation of the vehicle around a
vertical axis as between, for example, successive images (see fig. 3).


French Abstract

L'invention concerne un système de détermination d'auto-déplacement permettant d'estimer l'auto-déplacement d'un véhicule se déplaçant sur une route. Ce système comprend un récepteur d'informations relatives aux images et un processeur. Le récepteur d'informations relatives aux images est configuré de façon à recevoir les informations relatives aux images concernant une série d'au moins deux images enregistrées pendant le déplacement du véhicule sur une route. Le processeur est configuré afin de traiter les informations relatives aux images reçues par le récepteur d'images afin de fournir une estimation de l'auto-déplacement du véhicule, y compris la translation du véhicule dans la direction avant et la rotation du véhicule autour d'un axe vertical comme entre, par exemple, des images successives.

Claims

Note: Claims are shown in the official language in which they were submitted.




CLAIMS

1. An ego-motion determination system for generating an estimate as to the ego-motion of a vehicle moving along a roadway comprising:
A. an image information receiver configured to receive image information relating to a series of at least two images recorded as the vehicle moves along a roadway; and
B. a processor configured to process the image information received by the image receiver to generate an ego-motion estimate of the vehicle.

Description

Note: Descriptions are shown in the official language in which they were submitted.



SYSTEM AND METHOD FOR ESTIMATING EGO-MOTION OF A MOVING VEHICLE USING
SUCCESSIVE IMAGES RECORDED ALONG THE VEHICLE'S PATH OF MOTION
FIELD OF THE INVENTION
The invention relates generally to the field of systems and methods for estimating ego-motion (that is, "self-motion") of a moving vehicle, and more specifically to systems and methods that estimate ego-motion using successively-recorded images recorded along the vehicle's path of motion.
BACKGROUND OF THE INVENTION
Accurate estimation of the ego- ("self ") motion of a vehicle relative to a
roadway is an
important component in autonomous driving and computer vision-based driving
assistance. Using
computer vision techniques to provide assistance while driving, instead of
mechanical sensors,
allows for the use of the information that is recorded for use in estimating
vehicle movement to also
be used in detecting obstacles, identifying lanes and the like, without the
need for calibration
between sensors as would be necessary with mechanical sensors. This reduces
cost and maintenance.
There are several problems in estimating ego-motion of a vehicle. Typically,
roads have few
feature points, if any. The most obvious features in a road, such as lane
markings, have a generally
linear structure, whereas background image structures, such as those
associated with other vehicles,
buildings, trees, and the like, will typically have many feature points. This
will make image- or
optical-flow-based estimation difficult in practice. In addition, typically
images that are recorded
for ego-motion estimation will contain a large amount of "outlier" information
that is either not
useful in estimating ego-motion, or that may result in poor estimation. For
example, in estimating ego-motion relative to a fixed structure such as a road, images of objects
that are moving relative
to the road, such as other moving vehicles and even moving wipers, will
contribute false information
for the ego-motion estimation. In addition, conditions that degrade image
quality, such as raindrops
and glare, will also make accurate ego-motion estimation difficult.
SUMMARY OF THE INVENTION
The invention provides a new and improved system and method for estimating ego-
motion
using successively-recorded images recorded along the vehicle's path of
motion.
In brief summary, the invention provides an ego-motion determination system
for generating
an estimate as to the ego-motion of a vehicle moving along a roadway. The ego-
motion
determination system includes an image information receiver and a processor.
The image
information receiver is configured to receive image information relating to a
series of at least two


images recorded as the vehicle moves along a roadway. The processor is
configured to process the
image information received by the image receiver to generate an ego-motion
estimate of the vehicle,
including the translation of the vehicle in the forward direction and the
rotation of the vehicle around
a vertical axis as between, for example, successive images.
Several methodologies are disclosed for generating the ego-motion estimate of
the vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
This invention is pointed out with particularity in the appended claims. The
above and
further advantages of this invention may be better understood by referring to
the following
description taken in conjunction with the accompanying drawings, in which:
FIG. 1 schematically depicts a vehicle moving on a roadway and including an
ego-motion
estimation system constructed in accordance with the invention;
FIG. 2 is a flow chart depicting operations performed by the ego-motion
estimation system
in determining ego-motion of the vehicle in accordance with one methodology;
and
FIG. 3 is a flow chart depicting operations performed by the ego-motion
estimation system
in determining ego-motion of the vehicle in accordance with a second
methodology.
DETAILED DESCRIPTION OF AN ILLUSTRATIVE EMBODIMENT
FIG. 1 schematically depicts a vehicle 10 moving on a roadway 11 and including
an ego-
motion estimation system 12 constructed in accordance with the invention. The
vehicle 10 may be
any kind of vehicle 10 that may move on the roadway 11, including, but not
limited to automobiles,
trucks, buses and the like. The ego-motion estimation system 12 includes a
camera 13 and an ego-
motion estimation system processor 14. The camera 13 is mounted on the vehicle
10 and is
preferably pointed in a forward direction, that is, in the direction in which
the vehicle would
normally move, to record successive images as the vehicle moves over the
roadway. Preferably, as
the camera 13 records each image, it will provide the image to the ego-motion
estimation system
processor 14. The ego-motion estimation system processor 14, in turn, will
process information that
it obtains from the successive images, possibly along with other information,
such as information
from the vehicle's speedometer (not separately shown) to determine the ego-
motion (that is, the self
motion) of the vehicle relative to the roadway 11. The ego-motion estimation
system processor 14
may also be mounted in or on the vehicle 10 and may form part thereof. The ego-
motion estimates
generated by the ego-motion estimation system processor 14 may be used for a
number of things,
including, but not limited to obstacle and lane detection, autonomous driving
by the vehicle, perhaps
also using positioning information from, for example, the global positioning
system ("GPS") and


roadway mapping information from a number of sources known to those skilled in
the art, and the
like. Operations performed by the ego-motion estimation system processor 14 in
determining ego-
motion of the vehicle 10 will be described in connection with the flow charts
depicted in FIGS. 2 and
3.
Before proceeding further, it would be helpful to provide background of the
operations
performed by the ego-motion estimation system processor 14 depicted in FIG. 1.
Generally, as between two images Ψ and Ψ', the ego-motion estimation system processor 14 attempts to determine the translation t = (t_X, t_Y, t_Z)^T and rotation w = (w_X, w_Y, w_Z)^T (where "T" refers to the transpose operation, t_i refers to translation along the respective "X," "Y" and "Z" axes, and w_i refers to rotation around the respective axis) of the camera 13 affixed to the vehicle 10. Since the camera 13 is affixed to the vehicle 10, the translation and rotation of the camera 13 will also conform to the translation and rotation of the vehicle 10. In that case, for a point p = (x, y)^T (where "x" and "y" are coordinates of a point or feature in the image) that is a projection of a point P = (X, Y, Z)^T (where "X," "Y" and "Z" are coordinates of the point in three-dimensional space), the flow vector for the point, that is, the vector indicating the motion of the same point in three-dimensional space from its position in the image Ψ to the image Ψ', has components (u, v)

u = \frac{1}{Z} S_1^T t + (p \times S_1)^T w
v = \frac{1}{Z} S_2^T t + (p \times S_2)^T w        (1)

where "×" in equations (1) refers to the cross-product and

S_1 = \begin{pmatrix} f \\ 0 \\ -x \end{pmatrix}, \quad S_2 = \begin{pmatrix} 0 \\ f \\ -y \end{pmatrix}, \quad p = \begin{pmatrix} x/f \\ y/f \\ 1 \end{pmatrix}        (2)

where "f" is the focal length of the camera 13, which is presumed to be known. The roadway on which the vehicle 10 is traveling is modeled as a plane. The equation for points on a plane is


AX + BY + CZ = 1        (3)

where the "X" and "Y" axes correspond to the horizontal "x" and vertical "y" axes of the image plane, and the "Z" axis is perpendicular to the image plane. The camera 13 may be tilted slightly downwardly to increase the amount of the image that corresponds to the road, and reduce the amount of the image that corresponds to other features, such as other traffic, buildings, trees and the like, and in that case, the images Ψ and Ψ' will be rectified so that the Z axis will be parallel to the plane of the roadway, as will be described below.
Dividing equation (3) by "Z" provides

\frac{1}{Z} = ax + by + c        (4)

where a = A/f, b = B/f and c = C. Substituting equation (4) into equations (1) results in

u = (ax + by + c)\, S_1^T t + (p \times S_1)^T w
v = (ax + by + c)\, S_2^T t + (p \times S_2)^T w        (5)

Expanding equations (5) results in

u = (a f t_x - c t_z)\,x + (b f t_x - w_z)\,y + (c f t_x + f w_y) + \frac{w_y - a f t_z}{f}\,x^2 - \frac{w_x + b f t_z}{f}\,xy        (6)

v = (a f t_y + w_z)\,x + (b f t_y - c t_z)\,y + (c f t_y - f w_x) + \frac{w_y - a f t_z}{f}\,xy - \frac{w_x + b f t_z}{f}\,y^2        (7)


Equations (6) and (7) are a special case (the "calibrated camera" case) of an eight-parameter model for a camera 13 moving relative to a plane:

u = a_1 x + a_2 y + a_3 + a_7 x^2 + a_8 xy        (8)
v = a_4 x + a_5 y + a_6 + a_7 xy + a_8 y^2        (9)

Given the flow vector (u, v), one can recover the parameters a_i, i = 1, ..., 8, from which one can recover the motion parameters t and w.
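By way of illustration, recovering the eight parameters from measured flow vectors can be posed as a linear least-squares problem. The following is a minimal sketch; it is not part of the patent, and the use of NumPy, the function name and the absence of any outlier rejection are assumptions made for brevity.

import numpy as np

def fit_eight_parameter_model(x, y, u, v):
    """Least-squares fit of the planar flow model of equations (8) and (9):
       u = a1*x + a2*y + a3 + a7*x^2 + a8*x*y
       v = a4*x + a5*y + a6 + a7*x*y + a8*y^2
    given flow vectors (u, v) observed at image points (x, y)."""
    x, y, u, v = (np.asarray(t, dtype=float).ravel() for t in (x, y, u, v))
    n = x.size
    zeros = np.zeros(n)
    ones = np.ones(n)
    # One row per u-observation and one per v-observation; unknowns a1..a8.
    rows_u = np.column_stack([x, y, ones, zeros, zeros, zeros, x * x, x * y])
    rows_v = np.column_stack([zeros, zeros, zeros, x, y, ones, x * y, y * y])
    A = np.vstack([rows_u, rows_v])
    b = np.concatenate([u, v])
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a  # a[0] = a1, ..., a[7] = a8

Rejecting outliers, which the next paragraph identifies as the practical difficulty with this formulation, would require wrapping such a fit in a robust estimator, for example RANSAC or iteratively reweighted least squares.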
One problem arises in connection with the methodology described above in
connection with
equations (1) through (9), namely, given the large number of parameters a_i
whose values need to be
determined, it is difficult to devise a satisfactory method that will reject
outliers. This follows from
the fact that a relatively large number of optical flow vectors (u,v) will
need to be used in
determining the values of the parameters, which, in turn, requires a
corresponding number of points
in both images. In addition, it is difficult to differentiate between optical
flow due to rotation around
the X and Y axes and translation along the X and Z axes, respectively.
Accordingly, it is desirable to reduce the number of motion parameters to a
minimum. The
motion of a vehicle 10 along a roadway can be modeled as being constrained to
be a translation along
the Z axis, as the vehicle 10 moves forward or in reverse, and a rotation
around the X and Y axes,
as the vehicle 10's path deviates from a straight-line course. In that case,
equation (5) reduces to
u = -(ax + by + c)\,x\,t_z - \frac{xy}{f}\,w_x + \left(f + \frac{x^2}{f}\right) w_y
v = -(ax + by + c)\,y\,t_z - \left(f + \frac{y^2}{f}\right) w_x + \frac{xy}{f}\,w_y        (10)

If the images are rectified to ensure that the plane of the roadway is parallel to the camera 13's XZ plane, that is, so that the images would be as if the camera 13's optical axis is parallel to the plane of the road, then, in equations (10), a = 0 and c = 0, in which case


u = -bxy\,t_z - \frac{xy}{f}\,w_x + \left(f + \frac{x^2}{f}\right) w_y
v = -by^2\,t_z - \left(f + \frac{y^2}{f}\right) w_x + \frac{xy}{f}\,w_y        (11)
In order to rectify the images, the camera 13 will need to be calibrated. A
methodology for
calibrating the camera 13 and rectifying the images will be described below.
In equations (11) there are three motion parameters, t_z (translation along the Z axis), w_x (rotation around the X axis) and w_y (rotation around the Y axis), to be determined from the flow vectors (u, v) associated with points in at least some portions of the images Ψ and Ψ'. Finding corresponding points in the images Ψ and Ψ', that is, points that are projections of the same point in three-dimensional space in the respective images, is based on a "photometric constraint"

I(x, y, t) - I(x + u\,\delta t,\; y + v\,\delta t,\; t + \delta t) = 0        (12)

which essentially states that the irradiance, or luminance, of the point p = (x, y)^T in the image Ψ and the point p = (x + u δt, y + v δt)^T in the image Ψ', which are projections of the same point in three-dimensional space into the respective images, are the same. In practice, equation (12) will not hold exactly because of noise. If, for every point, the noise is modeled as zero-mean Gaussian noise, equation (12) reduces to

P\bigl(I(x, y, t) - I(x + u\,\delta t,\; y + v\,\delta t,\; t + \delta t)\bigr) = N(0, \sigma^2)        (13)

and a maximum likelihood can be sought.
Equation (13) can be computationally intensive, and, instead of using that equation, the motion parameters t_z, w_x and w_y can be determined directly from the images by combining the geometric constraints embodied in equation (11) with the photometric constraints embodied in equation (12). In that operation, given two consecutive images Ψ and Ψ', the goal is to determine the probability

P(m \mid \Psi, \Psi')        (14)

that the motion of the vehicle 10 is m = (t_z, w_x, w_y) given the two images. The motion m that maximizes (14) is the estimate of the camera 13 motion between the two images Ψ and Ψ', and, thus, the estimate of vehicle 10 motion between the two images.
According to Bayes' rule,

P(m \mid \Psi, \Psi') = \frac{P(\Psi' \mid \Psi, m)\, P(m)}{P(\Psi')}        (15)

where P(Ψ' | Ψ, m) is the probability that, given image Ψ, motion m will result in image Ψ', P(m) is the a priori probability that the motion is m, and P(Ψ') is the a priori probability that the image is Ψ'. It will be assumed that P(m), the probability that the motion is m, is uniform in a small region M around the previous motion estimate, that is, the motion estimate generated as between the "i-1st" and "i-th" images; it will be appreciated that, if the time period between the times at which images Ψ and Ψ' are recorded is sufficiently small, this assumption will hold. It will further be appreciated that, in equation (15), the denominator P(Ψ') does not depend on the motion m, and so it does not affect the search for a maximum.
The probability that, given image Ψ, motion m will result in image Ψ', P(Ψ' | Ψ, m), can be determined by warping the image Ψ' according to the motion m, thereby to generate a warped image Ψ̂' and determining the sum squared difference ("SSD")

S(m) = \frac{1}{N} \sum_{(x,y) \in R} \bigl(\hat{\Psi}'(x, y) - \Psi(x, y)\bigr)^2        (16)


between corresponding patches, or regions, R in the images that are believed
to be a projection of
the roadway in the two images. In equation 16, "N" is the number of points in
the region R. It will
be appreciated that, if the images 'F and 'F' are recorded at times "t" and
"t+bt, respectively, the
warped image Y~' will represent the image that is assumed would be recorded at
time "t" if the
motion is m . Using the SSD criteria (equation 16), PC Y~' ~ ~ , m~ , the
probability that image 'I''
would result from image 'h and a motion m is given by the probability density
function
S(m)
P~ Y'' I Yl , m~ = ce °z (17),
where "c" is a normalization factor and "6" is the variance of the noise,
which is modeled as a zero
mean Gaussian function. Since it is assumed that P( m~ , the probability that
the motion is m , is
uniform in a small region M around a previous motion estimate, the problem of
finding the
maximum likelihood motion m for a patch of image 'I' reduces to finding the
maximum of the
probability density function
PC ml Y , Y' ~ = ce aZ
for motion m E M .
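To make the use of equations (16) and (17) concrete, the following is a minimal sketch of scoring one candidate motion m = (t_z, w_x, w_y): the second image is warped toward the first with the planar flow model of equation (11) and the SSD over a road region is converted into a likelihood. This is illustrative only; the NumPy implementation, the nearest-neighbour sampling, the principal point placed at the image centre, the road-plane parameter b of equation (26) being supplied by the caller, and the value of sigma are all assumptions, not details taken from the patent.

import numpy as np

def planar_flow(xs, ys, t_z, w_x, w_y, f, b):
    """Flow (u, v) of road-plane points under equation (11);
    xs, ys are pixel coordinates relative to the principal point."""
    u = -b * xs * ys * t_z - (xs * ys / f) * w_x + (f + xs**2 / f) * w_y
    v = -b * ys**2 * t_z - (f + ys**2 / f) * w_x + (xs * ys / f) * w_y
    return u, v

def ssd_score(psi, psi_next, motion, f, b, region, sigma=10.0):
    """S(m) of equation (16) over `region` (a boolean mask the size of psi),
    and the likelihood exp(-S(m)/sigma^2) of equation (17), using a
    nearest-neighbour warp of the second image toward the first."""
    t_z, w_x, w_y = motion
    h, w = psi.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0          # assumed principal point
    u, v = planar_flow(xs - cx, ys - cy, t_z, w_x, w_y, f, b)
    xw = np.clip(np.rint(xs + u), 0, w - 1).astype(int)
    yw = np.clip(np.rint(ys + v), 0, h - 1).astype(int)
    warped = psi_next[yw, xw]                      # the warped image of the text
    diff = warped.astype(float)[region] - psi.astype(float)[region]
    s = np.mean(diff ** 2)
    return s, np.exp(-s / sigma**2)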
Since the motion for which the ego-motion estimation system processor 14 is to generate an estimate is the translational and rotational motion of the vehicle 10 relative to the road, it is desirable for the ego-motion estimation system processor 14 to consider only regions of the images Ψ and Ψ' that comprise projections of the road, and to ignore other regions of the images. However, it should be noted that the set R of regions, or patches, of the images that are projections of the roadway in the two images Ψ and Ψ' is not known. To accommodate that, instead of attempting to maximize the function defined in equation (18), the image can be tessellated into a set of patches W_i, and a probability density P(m | W_i, W'_i) generated for each patch using equations (16) and (18) for the
respective patches. The probability density over the entire pair of images Ψ and Ψ' will be

P(m \mid \Psi, \Psi') = c \prod_i P(m \mid W_i, W_i')^{\lambda_i \alpha_i \beta_i}        (19)

where λ_i and α_i are weighting functions whose values generally reflect the confidence that the "i-th" patch is a projection of the road. The value of the gradient strength β_i for a patch reflects the degree to which the patch contains texture, and thus will be more likely to contain useful information for use in determining ego-motion of the vehicle. The motion m ∈ M for which equation (19) is the maximum will be deemed to correspond to the actual translational and rotational motion of the vehicle 10 as between the locations at which the images Ψ and Ψ' were recorded.
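Under the reconstruction of equation (19) given above, the search over the motion region M can be sketched as follows. This is an illustrative reading rather than the patent's implementation; the per-patch probabilities and weights are assumed to have been computed as described in the surrounding text, and the work is done in the log domain for numerical stability.

import numpy as np

def most_likely_motion(candidate_motions, patch_probs, lam, alpha, beta):
    """Pick the motion maximizing equation (19).

    candidate_motions : list of motion hypotheses m in the search region M
    patch_probs       : array of shape (num_patches, num_motions); entry [i, j]
                        is P(m_j | W_i, W'_i) from equations (16) and (18)
    lam, alpha, beta  : per-patch weights lambda_i, alpha_i, beta_i
    """
    weights = np.asarray(lam) * np.asarray(alpha) * np.asarray(beta)
    # Log of the weighted product in equation (19).
    log_scores = weights @ np.log(np.asarray(patch_probs) + 1e-12)
    best = int(np.argmax(log_scores))
    return candidate_motions[best], log_scores[best]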
The weighting function λ_i for the respective "i-th" patch is generated using patches W_i and W'_i from the respective images Ψ and Ψ'. In determining the values for the weighting functions λ_i, for patches W_i, W'_i that are not of the road, the motion model reflected in equation (11) is not a good fit; instead, a better fit can be obtained using some other motion of the patch. In addition, for planar image artifacts moving on the roadway surface, such as moving shadows, the maximum of equation (18) will occur far away from the initial guess. Accordingly, the value of the weighting function λ_i for the "i-th" patch W_i, W'_i will correspond to the ratio between the best fit using the motion model in a local region (M) near the initial guess and the best fit using any motion model over a large search region "L." Accordingly, if

P_1 = \max_{m \in M} \exp\left(-\frac{S_i(m)}{\sigma^2}\right)        (20)

is the value for the best fit in a local search region, where S_i( ) denotes the SSD over all points in the "i-th" patch, and

P_2 = \max_{m \in L} \exp\left(-\frac{S_i(m)}{\sigma^2}\right)        (21)

is the value for the best fit over all possible image motions, then

\lambda_i = \frac{P_1}{P_2}        (22)
Generally, generating the value for P_2 (equation (21)) can be computationally intensive. To avoid generating P_2 according to equation (21), the value for P_2 for each patch can be estimated by using the SSD as between a patch in the image Ψ and the correspondingly-positioned patch in the image Ψ', as well as the SSDs as between the patch in the image Ψ and patches translated horizontally and vertically around the correspondingly-positioned patch in the image Ψ', for a selected number of points. That is, if the patch in image Ψ consists of points p(x, y) centered on p(a, b) (that is, points p(a−α, b−β) through p(a+α, b+β), α and β being integers, with the patch being of dimensions 2α+1 by 2β+1), P_2 is generated by using the SSD as between the patch of the same size in image Ψ' consisting of points p(x, y) centered on p(a, b), as well as SSDs as between the patch in image Ψ and patches of the same size in image Ψ' that are centered on points p(a−δ, b−δ) through p(a+δ, b+δ), a total of (2δ+1)² patches in image Ψ'. Each patch in image Ψ' can be considered as one of the possible image motions. In one embodiment, δ is selected to be seven, in which case there will be two hundred and twenty-five patches in Ψ' for which the SSD will be generated in generating the value for P_2.
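A minimal sketch of the λ_i computation just described might look as follows, with P_2 approximated by exhaustively translating the patch by up to ±δ pixels. The helper names, the value of sigma and the NumPy implementation are assumptions made for illustration.

import numpy as np

def patch_ssd(a, b):
    d = a.astype(float) - b.astype(float)
    return np.mean(d * d)

def lambda_weight(patch, image_next, top, left, local_ssds, delta=7, sigma=10.0):
    """lambda_i = P1 / P2 of equations (20) to (22).

    P1 comes from the best (lowest-SSD) motion-model fit found in the local
    search region M (`local_ssds` holds S_i(m) for those motions), while P2 is
    approximated, as described above, by trying all (2*delta+1)^2 integer
    translations of the patch inside the second image."""
    p1 = np.exp(-min(local_ssds) / sigma**2)
    h, w = patch.shape
    best = np.inf
    for dy in range(-delta, delta + 1):
        for dx in range(-delta, delta + 1):
            y0, x0 = top + dy, left + dx
            if y0 < 0 or x0 < 0 or y0 + h > image_next.shape[0] or x0 + w > image_next.shape[1]:
                continue
            best = min(best, patch_ssd(patch, image_next[y0:y0 + h, x0:x0 + w]))
    p2 = np.exp(-best / sigma**2)
    return p1 / p2 if p2 > 0 else 0.0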
Unlike weighting function λ_i, weighting function α_i for the respective "i-th" patch is generated using only the patch W'_i from image Ψ'. Generally, it should be appreciated that, in three-dimensional space, there are three dominant directions for lines, namely, (i) vertical, for buildings, automobiles, and other objects that extend above the roadway surface, (ii) horizontal lines that are generally parallel to the direction of the roadway, and (iii) horizontal lines that are generally perpendicular to the direction of the roadway. Vertical lines (that is, lines of type (i)) and horizontal lines that are generally perpendicular to the direction of the roadway (that is, lines of type (iii)) will project into image Ψ' as vertical and horizontal lines, respectively. On the other hand, horizontal lines that are generally parallel to the direction of the roadway (that is, lines of type (ii)) will appear in image Ψ' as lines that pass through a common point, that is, a vanishing point. It will be appreciated that patches W'_i in image Ψ' that are projections of the roadway will predominantly contain lines of type (ii) and (iii). On the other hand, patches W'_i that are projections of obstacles, such as automobiles, will predominantly contain lines of type (i) and (iii), while patches W'_i that are projections of, for example, buildings, fences, and the like, will contain lines of type (i) and (ii).
Accordingly, the value for weighting function α_i for patch W_i will reflect the degree to which it is deemed to contain projections of lines of type (ii) and (iii), and not projections of lines of types (i) and (iii) or types (i) and (ii). Generally, the directions of lines, if any, passing through a patch can be determined in relation to the gradients of the luminance at the various points in the patch W'_i. Each point in the patch W'_i whose gradient (I_x, I_y) is above a selected threshold is considered to lie at or near a line, with the direction of the line being perpendicular to the direction of the gradient. Thus, for those points, the direction of the line associated therewith can be determined, as can whether the line is of type (i), (ii) or (iii). Thus, for each patch W'_i in the image Ψ', three sums S'_i, S''_i and S'''_i are generated, corresponding to lines of type (i), (ii) and (iii) respectively, each being the sum of the magnitudes of the gradients of the points in the patch that are associated with lines of the respective type, the magnitude corresponding to G = (I_x² + I_y²)^(1/2). A patch W'_i in image Ψ' is deemed to be:
(a) a projection of the roadway if the sums S''_i and S'''_i are both large and significantly larger than sum S'_i, since the sums indicate that the patch is associated with a line or lines that are horizontal and in the direction of the roadway and/or perpendicular thereto, but not a line or lines that are vertical;
(b) a projection of an obstacle, that is, an object generally in the path of the vehicle, if sums S'_i and S'''_i are both large and significantly larger than sum S''_i, since the sums indicate that the patch is associated with a line or lines that are vertical and/or horizontal and perpendicular to the direction of the roadway, but not a line or lines that are horizontal and in the direction of the roadway; and
(c) a projection of an object to the side of the path of the vehicle if sums S'_i and S''_i are both large and significantly larger than sum S'''_i, since the sums indicate that the patch is associated with a line or lines that are vertical and/or horizontal and in the direction of the roadway.


The value of the weighting function α_i is assigned to the patch based on the degree to which the patch W'_i is deemed to be a projection of the roadway (case (a) above).
It will be appreciated that, if, for a patch W'_i, the sum S''_i is relatively large, indicating that the patch is associated with a line that is horizontal and in the direction of the roadway, but sums S'_i and S'''_i are relatively small, indicating that the patch is not associated with a line that is vertical or horizontal and perpendicular to the direction of the roadway, it generally cannot be determined from the set of sums generated for the patch whether the patch is a projection of the roadway (case (a) above) or a projection of an object to the side of the path of the vehicle (case (c) above). However, since the patch is not associated with a line that is vertical, it will generally not be deemed to be a projection of an obstacle (case (b) above). In that case, an assessment as to whether the patch is a projection of the roadway (case (a) above) or a projection of an object to the side of the path of the vehicle (case (c) above) can be made by referring to patches adjacent thereto.
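The gradient-based classification described above can be sketched as follows. The gradient threshold, the angular tolerance, the placement of the vanishing point at the principal point, and the simple dominance test for case (a) are illustrative assumptions rather than values taken from the patent.

import numpy as np

def direction_sums(patch, x0, y0, vanish=(0.0, 0.0), grad_thresh=8.0, ang_tol=np.deg2rad(20)):
    """Sums S'_i, S''_i, S'''_i of gradient magnitudes for points lying on
    (i) vertical lines, (ii) lines through the vanishing point, and
    (iii) horizontal lines, for a patch whose top-left corner is at (x0, y0)
    in image coordinates relative to the principal point."""
    iy, ix = np.gradient(patch.astype(float))
    mag = np.hypot(ix, iy)
    s = np.zeros(3)
    h, w = patch.shape
    for r in range(h):
        for c in range(w):
            if mag[r, c] < grad_thresh:
                continue
            # The image line direction is perpendicular to the gradient.
            lx, ly = -iy[r, c] / mag[r, c], ix[r, c] / mag[r, c]
            if abs(lx) < np.sin(ang_tol):                  # (i) vertical line in the image
                s[0] += mag[r, c]
            else:
                vx, vy = vanish[0] - (x0 + c), vanish[1] - (y0 + r)
                n = np.hypot(vx, vy)
                if n > 0 and abs(lx * vy / n - ly * vx / n) < np.sin(ang_tol):
                    s[1] += mag[r, c]                      # (ii) line toward the vanishing point
                elif abs(ly) < np.sin(ang_tol):
                    s[2] += mag[r, c]                      # (iii) horizontal line in the image
    return s  # S'_i, S''_i, S'''_i

def alpha_weight(sums, margin=2.0):
    """alpha_i is high when S''_i and S'''_i dominate S'_i (case (a): road patch)."""
    s1, s2, s3 = sums
    return 1.0 if (s2 > margin * s1 and s3 > margin * s1) else 0.0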
As noted above, the value of the gradient strength β_i for a patch reflects the degree to which the patch contains texture, and thus will be more likely to contain useful information for use in determining ego-motion of the vehicle. The gradient strength β_i corresponds to

\beta_i = \left( \sum_{m \in L} \exp\left(-\frac{S_i(m)}{\sigma^2}\right) \right)^{-1}        (23)

For relatively uniform patches, the value of the SSD S_i(m) will be relatively low for all motions, in which case the value of β_i will be relatively low. On the other hand, for patches with texture, the value of the SSD will be relatively high for most motions, in which case the value of β_i will be relatively high.
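Equation (23) translates directly into a few lines; sigma and the set of sampled motions L are assumed inputs in this sketch.

import numpy as np

def beta_weight(ssd_values, sigma=10.0):
    """Gradient-strength weight beta_i of equation (23): the reciprocal of the
    summed likelihoods of the patch SSD over all motions in the search region L.
    Uniform (textureless) patches score low, textured patches score high."""
    ssd_values = np.asarray(ssd_values, dtype=float)
    return 1.0 / np.sum(np.exp(-ssd_values / sigma**2))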
With this background, operations performed by the ego-motion estimation system processor 14 will be described in connection with the flow chart depicted in FIG. 2. In connection with FIG. 2, it is assumed that the ego-motion estimation system processor 14 already has image Ψ, which it may have used in connection with determining the translational and rotational motion up to the location at which image Ψ was recorded. With reference to FIG. 2, after the ego-motion estimation system processor 14 has received image Ψ' (step 100), it will rectify the image according to information provided during the camera 13 calibration operation (described below) to provide that the optical axis is parallel to the plane defined by the roadway (step 101). In addition, the ego-motion estimation system processor 14 will generate an initial guess as to the translational and rotational motion, using the previous motion estimate and, perhaps, information from other sensors if available (step 102). For example, the ego-motion estimation system processor 14 may make use of information from the vehicle 10's speedometer, as well as information as to the time period between the time at which image Ψ was recorded and the time at which image Ψ' was recorded, in generating the initial guess. Generally, it will be appreciated that the time period will be fixed, and will preferably be the same for each successive pair of images Ψ and Ψ'. After the ego-motion estimation system processor 14 has generated the initial guess, it will use the initial guess to warp image Ψ' toward image Ψ, thereby to generate a warped image Ψ̂' (step 103).
After the ego-motion estimation system processor 14 has generated the warped image (step 103), it will divide the image Ψ and the warped image Ψ̂' into patches and, for each pair of corresponding patches in the two images Ψ and Ψ̂', generate the weighting value λ_i. In that operation, the ego-motion estimation system processor 14 will select a patch in the image Ψ (step 104) and generate values for P_2 (step 105), P_1 (equation (20)) (step 106) and λ_i (equation (22)) (step 107) as described above. In addition, the ego-motion estimation system processor 14 can generate the values for β_i (equation (23)) and α_i (step 108). After the ego-motion estimation system processor 14 has performed steps 105 through 108 for the selected patch, it will determine whether all of the patches in image Ψ have been processed (step 109) and, if not, return to step 104 to select another patch and perform steps 105 through 109 in connection therewith.
The ego-motion estimation system processor 14 will perform steps 104 through 109 in connection with each patch in the image Ψ. After the ego-motion estimation system processor 14 has performed steps 104 through 109 in connection with all of the patches in the image Ψ, it will sequence from step 109 to step 110 to search for the motion m that maximizes the value provided by equation (19) (step 110). That motion m will comprise values for the translation t_z and rotation w_x, w_y parameters that will constitute the estimate of the motion of the vehicle 10 as between the point in time at which image Ψ was recorded and the point in time at which image Ψ' was recorded. The ego-motion estimation system processor 14 can perform the operations described above in connection with each successive pair of images Ψ and Ψ' to estimate the motion of the vehicle 10.


In performing steps 106 (to generate the values for P_1) and 110 (to determine the motion m that maximizes the value provided by equation (19)), the ego-motion estimation system processor 14 can perform a gradient descent that is limited to a selected cube-shaped region around the initial guess.
In determining the initial guess (step 102) for each new image Ψ', the ego-motion estimation system processor 14 can use the estimate of the motion generated for the previously-received image. In addition, the size of the region M can be adjusted adaptively.
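The restricted search around the initial guess can be sketched as a coarse grid search over a small cube of motions. The patent mentions gradient descent; the grid, the cube half-widths and the step counts below are simplifying assumptions, and `score` stands for any scoring function such as the log of equation (19).

import numpy as np
from itertools import product

def search_motion(score, m0, radius=(0.5, 0.01, 0.01), steps=5):
    """Exhaustive search over a small cube of motions m = (t_z, w_x, w_y)
    centred on the initial guess m0, returning the motion that maximises
    score(m). The cube half-widths and grid resolution are illustrative."""
    axes = [np.linspace(c - r, c + r, steps) for c, r in zip(m0, radius)]
    best_m, best_s = None, -np.inf
    for m in product(*axes):
        s = score(m)
        if s > best_s:
            best_m, best_s = m, s
    return np.array(best_m), best_s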
A second methodology for determining the ego-motion of a vehicle 10 will be described in connection with FIG. 3. As with the methodology described above in connection with FIG. 2, it is assumed that the images have been rectified so that the image planes are perpendicular to the plane represented by the roadway on which the vehicle 10 is traveling, and their horizontal ("x") axes are parallel to the plane represented by the roadway. By way of background, in that case, the equation of a plane

AX + BY + CZ = 1        (24)

(reference equation (3)) reduces to

BY = 1        (25)

in which case Y = 1/B is the height of the optical axis of the camera 13 (or, more specifically, the Z axis) above the road. Since, for a point p(x, y) in an image that is a projection of a point P(X, Y, Z) in three-dimensional space, y = f\,\frac{Y}{Z}, equation (25) becomes

\frac{1}{Z} = by        (26)

with b = B/f, as in equation (4).
The brightness constraint is

u I_x + v I_y + I_t = 0        (27)

for each point, where, at each point (x, y) in the image, I_x and I_y are the horizontal and vertical components of the spatial gradient of the luminance and I_t is the time gradient of the luminance. In addition, the equations for the components (u, v) of the flow vector (reference equation (1)) can be written

u = \frac{1}{Z} S_1^T t + S_1^T [w]_{\times}\, p
v = \frac{1}{Z} S_2^T t + S_2^T [w]_{\times}\, p        (28)

where [w]_× is a skew-symmetric matrix, in which case

\frac{1}{Z} S^T t + S^T [w]_{\times}\, p + I_t = 0        (29)

where

S = \begin{pmatrix} f I_x \\ f I_y \\ -x I_x - y I_y \end{pmatrix}        (30)

For motion constrained to a plane, equation (29) reduces to

(ax + by + c)\, S^T t + S^T [w]_{\times}\, p + I_t = 0        (31)

Since the images are rectified, equation (26) will hold. In addition, since rotational motion is constrained so as to occur only around the vertical ("Y") axis, w_x = w_z = 0. Accordingly, equation (31) will correspond to

by\, S^T t + S^T \begin{pmatrix} 0 & 0 & w_y \\ 0 & 0 & 0 \\ -w_y & 0 & 0 \end{pmatrix} \begin{pmatrix} x/f \\ y/f \\ 1 \end{pmatrix} + I_t = 0        (32)

Expanding equation (32), and using equation (30),

by\, S^T t + \left( f I_x + \frac{x^2 I_x + x y I_y}{f} \right) w_y + I_t = 0        (33)


Generally, the portion u_{w_y} of the horizontal component of the flow vector that arises due to rotation (w_y) of the vehicle 10 will be larger than the portion u_{t_x} that arises due to translation (t_x) along the "X" axis, and the portions will have a very different form. The portion u_{t_x} that arises due to translation along the "X" axis will depend on the distance that the vehicle 10 moves between the times the images are recorded, and will be larger for points towards the bottom of the image than for points further up. In that case, expressions for u_{w_y} and u_{t_x} are

u_{w_y} = \left( \frac{x^2}{f} + f \right) w_y \approx f w_y        (34)

u_{t_x} = \frac{1}{Z}\, f t_x = by\, f t_x        (35)

Since, for small rotations around the vertical "Y" axis, the angle of rotation w_y is approximately

w_y \approx \frac{t_x}{t_z}        (36)

so that

\frac{u_{w_y}}{u_{t_x}} \approx \frac{f\, \frac{t_x}{t_z}}{\frac{1}{Z}\, f t_x} = \frac{Z}{t_z}        (37)
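A quick numeric check of equation (37), using assumed values for the focal length, depth and per-frame motion:

# Illustrative numbers only (not from the patent): a focal length of 800 pixels,
# a road point 10 m ahead, and 1 m of forward motion per frame.
f, Z, t_z = 800.0, 10.0, 1.0
t_x = 0.05                      # small sideways drift per frame (assumed)
w_y = t_x / t_z                 # equation (36)
u_wy = f * w_y                  # equation (34), approximated
u_tx = (1.0 / Z) * f * t_x      # equation (35)
print(u_wy / u_tx, Z / t_z)     # both print 10.0, as equation (37) predicts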
It will be appreciated that the values for t_y, the component of the translation t in the vertical direction, and w_x and w_z, the X and Z components of the rotation w, will be zero. Accordingly, after the ego-motion estimation system processor 14 receives a new image Ψ', it will determine the values for t_z and t_x, the components of the translation t in the forward (along the Z axis) and side (along the X axis) directions, and w_y, the component of rotation around the vertical (Y) axis. In that operation, the ego-motion estimation system processor 14 will generate an initial estimate as to the motion (step 150, FIG. 3) and use that initial estimate to generate a warped image Ψ̂' (step 151). The ego-motion estimation system processor 14 can use information from a number of sources in connection with generating the initial estimate (step 150), including information from, for example, the vehicle 10's speedometer. Thereafter, the ego-motion estimation system processor 14 divides the images Ψ and Ψ̂' into patches (step 152) and determines which patches are likely to comprise images of the roadway (step 153). In determining which patches are likely to comprise images of the roadway (reference step 153), the ego-motion estimation system processor 14 can generate an SSD (equation (16)) for corresponding patches in the images Ψ and Ψ̂' and determine that the patches in the warped image Ψ̂' that comprise images of the roadway will be those patches with a relatively low SSD value. The ego-motion estimation system processor 14 then uses the patches identified in step 153 to minimize a cost function of the form

\min_t\; \sum_{x,y \in R} \bigl| A t - b \bigr|^2 + \lambda_1 \bigl| W_1\, dt \bigr|^p + \lambda_2 \bigl| W_2 (t - t_0) \bigr|^p        (38)

where W_1 and W_2 are weighting matrices that essentially describe confidence in the priors or smoothness values (step 154). Equation (38) can be formalized in the form of a Kalman filter, and the value of "p" can be selected to be one or two depending on whether the L_1 or L_2 norm is to be used.
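Ignoring the prior and smoothness terms, the data term |At − b|² of equation (38) follows directly from the per-pixel constraint (33): each road pixel contributes one linear equation in (t_x, t_z, w_y). The following is a minimal sketch under that simplification; the finite-difference gradients, the principal point at the image centre, the use of the raw next image for the temporal difference, and the NumPy implementation are assumptions.

import numpy as np

def solve_motion_least_squares(psi, psi_next, road_mask, f, b):
    """Direct least-squares solve of the per-pixel constraint (33) for the
    motion parameters (t_x, t_z, w_y) over road pixels, corresponding to the
    data term of equation (38). In practice psi_next would be the warped
    image from step 151 rather than the raw second image."""
    psi = psi.astype(float)
    psi_next = psi_next.astype(float)
    h, w = psi.shape
    iy, ix = np.gradient(psi)             # spatial gradients I_y, I_x
    it = psi_next - psi                   # temporal gradient I_t
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    xs -= (w - 1) / 2.0                   # coordinates relative to the
    ys -= (h - 1) / 2.0                   # assumed principal point
    m = road_mask
    col_tx = b * ys[m] * f * ix[m]
    col_tz = -b * ys[m] * (xs[m] * ix[m] + ys[m] * iy[m])
    col_wy = f * ix[m] + (xs[m] ** 2 * ix[m] + xs[m] * ys[m] * iy[m]) / f
    A = np.column_stack([col_tx, col_tz, col_wy])
    rhs = -it[m]
    params, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    t_x, t_z, w_y = params
    return t_x, t_z, w_y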
As noted above, for methodologies described above in connection with both
FIGS. 2 and 3,
the ego-motion estimation system processor 14 will initially rectify the
images as received from the
camera 13. In the above description, the images Ψ and Ψ' are the images as
rectified by the ego-motion
estimation system processor 14. To rectify the images as received from the
camera 13, the camera
13 will need to be calibrated during a calibration operation prior to use in
connection with recording
images for use in estimating vehicle 10 motion as described above. Before
describing operations to
be performed during calibration, it would be helpful to consider the effects
of incorrect calibration.
If, for example, a vehicle is driving along a straight road, and if the
optical axis of the camera is
aligned with the direction of motion, the flow field of successively recorded
images will be an
expansion field with the focus of expansion located at the center of the
respective images, that is, at
the origin (x,y)=(0,0) of the image's rectilinear coordinate system. On the
other hand, if the camera
is mounted on the vehicle with a small rotation around the vertical ("Y") axis
in three-dimensional
space, then the focus of expansion will be displaced along the image's
horizontal ("x") axis. In that
case, the motion model defined by equation (11) will not account for the flow field; instead, the flow field will be well approximated by a forward translation combined with a rotational velocity w_y around the vertical ("Y") axis.


Accordingly, errors in the orientation of the camera around the vertical ("Y")
axis in three
dimensional space will create a bias in the rotation estimate, in which case a
curved path would be
estimated instead of a straight path. Similarly, errors in the camera's
orientation around the
horizontal ("X") axis in three dimensional space will cause a bias in the
pitch estimate. Based on
these observations, a calibration operation can be performed by having the
camera record a sequence
of images while the vehicle is being driven down a straight roadway. The
vehicle's ego-motion is
estimated as described above in connection with FIGS. 2 or 3, and calibration
parameters are
estimated that would cause the ego-motion to integrate into a straight path.
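For concreteness, once calibration angles are known, rectification can be sketched as a pure-rotation homography warp H = K R K⁻¹. This is a standard construction rather than a step spelled out in the patent, and the principal-point placement, the nearest-neighbour sampling and the grayscale-image assumption below are illustrative choices; the angles themselves would come from the straight-road calibration procedure just described.

import numpy as np

def rectify(image, f, pitch, yaw):
    """Warp a grayscale image so that the optical axis is parallel to the road
    plane, given small calibration angles (pitch about X, yaw about Y, radians).
    The warp is the pure-rotation homography H = K R K^{-1}."""
    h, w = image.shape
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1.0]])
    cp, sp = np.cos(pitch), np.sin(pitch)
    cyw, syw = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cyw, 0, syw], [0, 1, 0], [-syw, 0, cyw]])
    H = K @ (Ry @ Rx) @ np.linalg.inv(K)
    # Inverse warping: for each output pixel, find its source in the input image.
    Hinv = np.linalg.inv(H)
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    src = Hinv @ pts
    sx = np.clip(np.rint(src[0] / src[2]), 0, w - 1).astype(int)
    sy = np.clip(np.rint(src[1] / src[2]), 0, h - 1).astype(int)
    return image[sy, sx].reshape(h, w)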
The invention provides a number of advantages. In particular, the invention
provides an
arrangement for determining ego-motion of a vehicle 10 on a roadway from a
series of images
recorded by a camera 13 mounted on the vehicle 10, at least a portion of the
images comprising
projections of the roadway, and without requiring mechanical sensors which are
normally not
provided with a vehicle 10 and that would, if provided, increase the cost and
maintenance expenses
thereof.
It will be appreciated that a system in accordance with the invention can be
constructed in
whole or in part from special purpose hardware or a general purpose computer
system, or any
combination thereof, any portion of which may be controlled by a suitable
program. Any program
may in whole or in part comprise part of or be stored on the system in a
conventional manner, or it
may in whole or in part be provided into the system over a network or other
mechanism for
transferring information in a conventional manner. In addition, it will be
appreciated that the system
may be operated and/or otherwise controlled by means of information provided
by an operator using
operator input elements (not shown) which may be connected directly to the
system or which may
transfer the information to the system over a network or other mechanism for
transferring
information in a conventional manner.
The foregoing description has been limited to a specific embodiment of this
invention. It will
be apparent, however, that various variations and modifications may be made to
the invention, with
the attainment of some or all of the advantages of the invention. It is the
object of the appended
claims to cover these and such other variations and modifications as come
within the true spirit and
scope of the invention.
What is claimed as new and desired to be secured by Letters Patent of the
United States is:

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.


Title Date
Forecasted Issue Date Unavailable
(86) PCT Filing Date 2000-11-27
(87) PCT Publication Date 2001-05-31
(85) National Entry 2002-05-27
Dead Application 2004-11-29

Abandonment History

Abandonment Date Reason Reinstatement Date
2003-11-27 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $300.00 2002-05-27
Maintenance Fee - Application - New Act 2 2002-11-27 $100.00 2002-05-27
Extension of Time $200.00 2003-08-28
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
SHASHUA, AMNON
STEIN, GIDEON
MANO, OFER
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD .



Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Representative Drawing 2002-05-27 1 5
Abstract 2002-05-27 1 19
Claims 2002-05-27 1 12
Drawings 2002-05-27 4 59
Description 2002-05-27 18 814
Cover Page 2002-10-30 1 42
PCT 2002-05-27 4 157
Assignment 2002-05-27 3 124
Correspondence 2002-10-28 1 26
PCT 2002-05-28 3 144
Correspondence 2003-08-28 2 62
Correspondence 2003-09-15 1 15