Patent 2282240 Summary

(12) Patent: (11) CA 2282240
(54) English Title: SYSTEM AND COMPUTER-IMPLEMENTED METHOD FOR MODELING THE THREE-DIMENSIONAL SHAPE OF AN OBJECT BY SHADING OF A TWO-DIMENSIONAL IMAGE OF THE OBJECT
(54) French Title: SYSTEME ET PROCEDE INFORMATIQUES DE MODELISATION DE LA FORME TRIDIMENSIONNELLE D'UN OBJET PAR OMBRAGE D'UNE IMAGE BIDIMENSIONNELLE DE L'OBJET
Status: Expired and beyond the Period of Reversal
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06T 15/10 (2011.01)
(72) Inventors :
  • HERKEN, ROLF (Germany)
  • THAMM, TOM-MICHAEL (Germany)
(73) Owners :
  • MENTAL IMAGES GMBH
(71) Applicants :
  • MENTAL IMAGES GMBH (Germany)
(74) Agent: SMART & BIGGAR LP
(74) Associate agent:
(45) Issued: 2009-12-29
(86) PCT Filing Date: 1998-02-20
(87) Open to Public Inspection: 1998-08-27
Examination requested: 2002-12-16
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/IB1998/000612
(87) International Publication Number: WO 1998037515
(85) National Entry: 1999-08-20

(30) Application Priority Data:
Application No. Country/Territory Date
60/038,888 (United States of America) 1997-02-21

Abstracts

English Abstract


A computer graphics system generates a three-dimensional model of an object in an interactive manner under control of an operator. An initial model for the object to be modeled is displayed to the operator as illuminated from a particular illumination direction and projected onto an image plane. The operator can update shading of a pixel on the image plane, and, based on the updated shading, the computer graphics system generates an updated normal vector for the updated pixel, which identifies the normal of the surface of the object projected onto the updated pixel. Using the updated normal vector field and a height field, which represents the height of the respective portion of the object as projected onto the respective pixels of the image plane, the computer graphics system generates an updated height value for the updated pixel, thereby to update the height field. The updated normal vector field and the updated height field define the updated model of the object, which corresponds to the updated shape of the object as updated based on the updated shading. The computer graphics system can then display to the operator an image of the object as defined by the updated model. If the updated model is satisfactory, the computer graphics system saves the updated model as the final model. On the other hand, if the updated model is not satisfactory, the operator can further update the shading and enable the computer graphics system to generate a further updated normal vector field and updated height field, thereby to generate a further updated model for the object. The operations can be repeated until the operator determines that the object is satisfactory.


French Abstract

Un système infographique génère un modèle tridimensionnel d'un objet de manière interactive sous la commande d'un opérateur. Un modèle initial de l'objet à modéliser est affiché à l'opérateur, illuminé à partir d'une direction d'illumination particulière et projeté sur un plan d'image. L'opérateur peut actualiser l'ombrage d'un pixel sur le plan d'image et, sur la base de l'ombrage actualisé, le système infographique génère un vecteur perpendiculaire actualisé pour le pixel actualisé, lequel identifie la perpendiculaire de la surface de l'objet projeté sur le pixel actualisé. L'utilisation du champ du vecteur perpendiculaire actualisé et d'un champ de hauteur, lequel représente la hauteur de la partie respective de l'objet projetée sur les pixels respectifs du plan d'image, permet au système infographique de générer une valeur de hauteur actualisée pour le pixel actualisé, actualisant ainsi le champ de hauteur. Le champ du vecteur perpendiculaire actualisé et le champ de hauteur actualisé définissent le modèle actualisé de l'objet, lequel correspond à la forme actualisée de l'objet tel qu'elle est actualisée sur la base de l'ombrage actualisé. Le système infographique peut alors afficher à l'opérateur une image de l'objet telle qu'elle est définie par le modèle actualisé. Si le modèle actualisé est satisfaisant, le système infographique conserve le modèle actualisé comme modèle final. En revanche, si le modèle actualisé n'est pas satisfaisant, l'opérateur peut à nouveau actualiser l'ombrage et permettre au système infographique de générer un autre champ de vecteur perpendiculaire actualisé et un autre champ de hauteur actualisé, afin de générer ainsi un autre modèle actualisé de l'objet. Les opérations peuvent être répétées jusqu'à ce que l'opérateur détermine que l'objet est satisfaisant.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS:
1. A computer graphics system for generating a
geometrical model representing geometry of at least a
portion of a surface of a three-dimensional object by
shading by an operator in connection with a two-dimensional
image of the object, the image representing the object as
projected onto an image plane, the computer graphics system
comprising:
A. an operator input device configured to receive
shading information provided by the operator, the shading
information representing a change in brightness level of at
least a portion of the image;
B. a model generator configured to receive the
shading information from the operator input device and to
generate in response thereto an updated geometrical model of
the object, the model generator being configured to use the
shading information to determine at least one geometrical
feature of the updated geometrical model; and
C. an object display configured to display the
image of the object as defined by the updated geometrical
model.
2. A computer graphics system as defined in claim 1
in which the operator input device includes a pen and
digitizing tablet.
3. A computer graphics system as defined in claim 1
further comprising an updated model store configured to
store the updated geometrical model as a final geometrical
model for the object under control of the operator.
4. A computer graphics system as defined in claim 1
further comprising an initial model generator configured to

generate an initial geometrical model for the object, the
object display initially displaying an initial image of the
object as defined by the initial geometrical model to the
operator.
5. A computer graphics system as defined in claim 4
in which the initial geometrical model comprises a default
initial geometrical model provided by the computer graphics
system.
6. A computer graphics system as defined in claim 4
in which the initial geometrical model is generated in
response to shading input provided by the operator for at
least one reference pixel.
7. A computer graphics system as defined in claim 1
in which the model generator comprises:
A. an updated normal vector generator configured
to generate, from updating of the shading as provided by the
operator of the image, an updated normal vector for at least
a portion of the object; and
B. an updated height value generator configured to
generate from the updated normal vector an updated height
value for the at least a portion of the object, the updated
height value representing a height of the at least a portion
of the object from the image plane, thereby to update the
geometrical model of the object for the at least a portion
of the object.
8. A computer graphics system as defined in claim 7
in which the updated normal vector generator is configured
to select the updated normal vector n1 for the at least a
portion of the object in accordance with
n1 · L = I

where "L" represents an illumination vector indicative of an
illumination level and illumination direction for the object
and "I" represents brightness of the at least a portion of
the object as displayed on the image plane.
9. A computer graphics system as defined in claim 8
in which the updated normal vector has a predetermined
magnitude.
10. A computer graphics system as defined in claim 9
in which the predetermined magnitude is "one".
11. A computer graphics system as defined in claim 8
in which the updated normal vector generator is further
configured to select the updated normal vector n1 for the at
least the portion of the object in accordance with
n1 · (n0 × L) = 0
where "n0" represents a normal vector for the at least a
portion of the object prior to the shading.
12. A computer graphics system as defined in claim 8
in which the updated normal vector generator is further
configured to select the updated normal vector n1 for the at
least the portion of the object in accordance with
|(n1, n0 × L)| < ε_δ
where ε_δ is a predetermined value.
13. A computer graphics system as defined in claim 7
in which the updated height value generator is configured to
generate the updated height value in accordance with a
Bézier-Bernstein interpolation methodology.
14. A computer graphics system as defined in claim 13
in which the updated height value generator is configured to

generate the updated height value in relation to a plurality
of height values along a plurality of directions along said
image plane for the at least the portion of the object.
15. A computer graphics system as defined in claim 1
in which said model generator is configured to generate a
hierarchical surface representation of the geometrical model
comprising a plurality of resolution levels.
16. A computer graphics system as defined in claim 15,
the object display being configured to display said image in
a plurality of image resolution levels, said model generator
being configured to generate the hierarchical surface
representation of the geometrical model in a plurality of
hierarchical surface resolution levels each corresponding to
respective image resolution levels.
17. A computer graphics system as defined in claim 16
in which the model generator is configured to generate the
plurality of hierarchical surface resolution levels in
response to the operator providing shading information at
the respective image resolution levels.
18. A computer implemented graphics method for
generating a geometrical model representing geometry of at
least a portion of a surface of a three-dimensional object
by shading by an operator in connection with a two-
dimensional image of the object, the image representing the
object as projected onto an image plane, the method
comprising the steps of:
A. receiving shading information provided by the
operator in connection with the image of the object, the
shading information representing a change in brightness
level of at least a portion of the image;

B. generating in response to the shading
information an updated geometrical model of the object, the
shading information being used to determine at least one
geometrical feature of the updated geometrical model; and
C. displaying the image of the object as defined
by the updated geometrical model.
19. A method as defined in claim 18 further comprising
the step of storing the updated geometrical model as a final
geometrical model for the object under control of the
operator.
20. A method as defined in claim 18 further comprising
an initial model generation step in which an initial
geometrical model for the object is generated and displayed
to the operator.
21. A method as defined in claim 20 in which the
initial geometrical model comprises a default initial
geometrical model.
22. A method as defined in claim 20 in which the
initial geometrical model is generated in response to
shading input provided by the operator for at least one
reference pixel.
23. A method as defined in claim 18 in which the model
generation step comprises the steps of
A. generating, from updating of the shading of the
image as provided by the operator, an updated normal vector
for at least a portion of the object; and
B. generating from the updated normal vector, an
updated height value for the at least a portion of the
object, the updated height value representing a height of

the at least a portion of the object from the image plane,
thereby to update the geometrical model of the object for
the at least a portion of the object.
24. A method as defined in claim 23 in which the
updated normal vector generation step includes the step of
selecting the updated normal vector n1 for the at least a
portion of the object in accordance with
n1 · L = I
where "L" represents an illumination vector indicative of an
illumination level and illumination direction for the object
and "I" represents brightness of the at least a portion of
the object as displayed on the image plane.
25. A method as defined in claim 24 in which the
updated normal vector has a predetermined magnitude.
26. A method as defined in claim 25 in which the
predetermined magnitude is "one".
27. A method as defined in claim 24 in which the
updated normal vector generation step further includes the
step of selecting the updated normal vector n1 for the at
least the portion of the object in accordance with
n1 · (n0 × L) = 0
where "n0" represents a normal vector for the at least a
portion of the object prior to the shading.
28. A method as defined in claim 24 in which the
updated normal vector generation step further includes the
step of selecting the updated normal vector n1 for the at
least the portion of the object in accordance with
|(n1, n0 × L)| < ε_δ

where ε_δ is a predetermined value.
29. A method as defined in claim 23 in which the
updated height value generation step includes the step of
generating the updated height value in accordance with a
Bézier-Bernstein interpolation methodology.
30. A method as defined in claim 29 in which the
updated height value generation step includes the step of
generating the updated height value in relation to a
plurality of height values along a plurality of directions
along said image plane for the at least the portion of the
object.
31. A method as defined in claim 18 in which said
model generation step includes the step of generating a
hierarchical surface representation of the geometrical model
comprising a plurality of resolution levels.
32. A method as defined in claim 31, the object
display step including the step of displaying said image in
a plurality of image resolution levels, said model
generation step including the step of generating the
hierarchical surface representation of the geometrical model
in a plurality of hierarchical surface resolution levels
each corresponding to respective image resolution levels.
33. A method as defined in claim 32 in which the model
generation step includes the step of generating the
plurality of hierarchical surface resolution levels in
response to the operator providing shading information at
the respective image resolution levels.
34. A computer readable medium having computer
executable instructions stored thereon by one or more
computers for use in connection with a computer for

generating a geometrical model representing geometry of at
least a portion of a surface of a three-dimensional object
by shading by an operator in connection with a two-
dimensional image of the object, the image representing the
object as projected onto an image plane, the computer-
readable medium having encoded thereon:
A. an operator input module configured to enable
the computer to receive shading information provided by the
operator in connection with the image of the object, the
shading information representing a change in brightness
level of at least a portion of the image;
B. a model generator module configured to enable
the computer to receive the shading information from the
operator input device and to generate in response thereto an
updated geometrical model of the object, the model generator
module being configured to enable the computer to use the
shading information to determine at least one geometrical
feature of the updated geometrical model; and
C. an object display module configured to enable
the computer to display the image of the object as defined
by the updated geometrical model.
35. A computer readable medium as defined in claim 34
further comprising an updated model store module configured
to enable the computer to store the updated geometrical
model as a final geometrical model for the object under
control of the operator.
36. A computer readable medium as defined in claim 34
further comprising an initial model generator module
configured to enable the computer to generate an initial
geometrical model for the object, the object display module
initially enabling the computer to display an initial image

of the object as defined by the initial geometrical model to
the operator.
37. A computer readable medium as defined in claim 36
in which the initial geometrical model comprises a default
initial geometrical model provided by the computer.
38. A computer readable medium as defined in claim 36
in which the initial geometrical model is generated in
response to shading input provided by the operator for at
least one reference pixel.
39. A computer readable medium as defined in claim 34
in which said model generator module comprises:
A. an updated normal vector generator module
configured to enable the computer to generate, from updating
of the shading of the image as provided by the operator, an
updated normal vector for at least a portion of the object;
B. an updated height value generator module
configured to enable the computer to generate from the
updated normal vector an updated height value for the at
least a portion of the object, the updated height value
representing a height of the at least a portion of the
object from the image plane, thereby to update the
geometrical model of the object for the at least a portion
of the object.
40. A computer readable medium as defined in claim 39
in which the updated normal vector generator module is
configured to enable the computer to select the updated
normal vector n1 for the at least a portion of the object in
accordance with
n1 · L = I

where "L" represents an illumination vector indicative of an
illumination level and illumination direction for the object
and "I" represents brightness of the at least a portion of
the object as displayed on the image plane.
41. A computer readable medium as defined in claim 40
in which the updated normal vector has a predetermined
magnitude.
42. A computer readable medium as defined in claim 41
in which the predetermined magnitude is "one".
43. A computer readable medium as defined in claim 40
in which the updated normal vector generator module is
further configured to enable the computer to select the
updated normal vector n1 for the at least the portion of the
object in accordance with
n1 · (n0 × L) = 0
where "n0" represents a normal vector for the at least a
portion of the object prior to the shading.
44. A computer readable medium as defined in claim 40
in which the updated normal vector generator is further
configured to enable the computer to select the updated
normal vector n1 for the at least the portion of the object
in accordance with
|(n1, n0 × L)| < ε_δ
where ε_δ is a predetermined value.
45. A computer readable medium as defined in claim 39
in which the updated height value generator is configured to
enable the computer to generate the updated height value in
accordance with a Bézier-Bernstein interpolation
methodology.

46. A computer readable medium as defined in claim 45
in which the updated height value generator is configured to
enable the computer to generate the updated height value in
relation to a plurality of height values along a plurality
of directions along said image plane for the at least the
portion of the object.
47. A computer readable medium as defined in claim 34
in which said model generator module is configured to enable
the computer to generate a hierarchical surface
representation of the geometrical model comprising a
plurality of resolution levels.
48. A computer readable medium as defined in claim 47,
the object display module is configured to enable the
computer to display said image in a plurality of image
resolution levels, said model generator module being
configured to enable the computer to generate the
hierarchical surface representation of the geometrical model
in a plurality of hierarchical surface resolution levels
each corresponding to respective image resolution levels.
49. A computer readable medium as defined in claim 48
in which the model generator module is configured to enable
the computer to generate the plurality of hierarchical
surface resolution levels in response to the operator
providing shading information at the respective image
resolution levels.

Description

Note: Descriptions are shown in the official language in which they were submitted.


CA 02282240 1999-08-20
WO 98/37515 PCT/IB98/00612
SYSTEM AND COMPUTER-IMPLEMENTED METHOD FOR MODELING THE THREE-
DIMENSIONAL SHAPE OF AN OBJECT BY SHADING OF A TWO-DIMENSIONAL IMAGE OF THE
OBJECT
FIELD OF THE INVENTION
The invention relates generally to the field of computer graphics, computer-
aided geometric
design and the like, and more particularly to generating a three-dimensional
model of an object.
BACKGROUND OF THE INVENTION
In computer graphics, computer-aided geometric design and the like, an artist,
draftsman or
the like (generally referred to herein as an "operator") attempts to generate a
three-dimensional model
of an object, as maintained by a computer, from lines defining two-dimensional
views of objects.
Conventionally, computer-graphical arrangements generate a three-dimensional
model from, for
example, various two-dimensional line drawings comprising contours and/or
cross-sections of the
object and by applying a number of operations to such lines which will result
in two-dimensional
surfaces in three-dimensional space, and subsequent modification of parameters
and control points
of such surfaces to correct or otherwise modify the shape of the resulting
model of the object. After
a three-dimensional model for the object has been generated, it may be viewed
or displayed in any
of a number of orientations.
In a field of artificial intelligence commonly referred to as robot vision or
machine vision
(which will generally be referred to herein as "machine vision"), a
methodology referred to as "shape
from shading" is used to generate a three-dimensional model of an existing
object from one or more
two-dimensional images of the object as recorded by a camera. Generally, in
machine vision, the
type of the object recorded on the image(s) is initially unknown by the
machine, and the model of
the object that is generated is generally used to, for example, facilitate
identification of the type of
the object depicted on the image(s) by the machine or another device.
In the shape from shading methodology, the object to be modeled is illuminated
by a light
source, and a camera, such as a photographic or video camera, is used to
record the image(s) from
which the object will be modeled. It is assumed that the orientation of a
light source, the camera
position and the image plane relative to the object are known. In addition, it
is assumed that the
reflectance properties of the surface of the object are also known. It is
further assumed that an
orthographic projection technique is used to project the surface of the object
onto the image plane,

that is, it is assumed that an implicit camera that is recording the image on
the image plane has a
focal length of infinity. The image plane represents the x,y coordinate axes
(that is, any point on the
image plane can be identified by coordinates (x,y)), and the z axis is thus
normal to the image plane; as a result, any point on the surface of the object
that can be projected onto the image plane can be
represented by the coordinates (x,y,z). The image of the object as projected
onto the image plane
is represented by an image irradiance function I(x,y) over a two-dimensional
domain Ω ⊂ ℝ², while the shape of the object is given by a height function z(x,y) over the domain Ω. The image irradiance
function I(x,y) represents the brightness of the object at each point (x,y) in
the image. In the shape
from shading methodology, given I(x,y) for all points (x,y) in the domain, the
shape of an object,
given by z(x,y), is determined.
In determining the shape of an object using the shape from shading
methodology, several
assumptions are made, namely,
(i) the direction of the light source is known;
(ii) the shape of the object is continuous;
(iii) the reflectance properties of the surface of the object are homogenous
and known; and
(iv) the illumination over at least the portion of the surface visible in the
image plane is
uniform.
Under these assumptions, the image irradiance function I(x,y) for each point (x,y) on the image plane
can be determined as follows. First, changes in the surface orientation of the object are given by means
of the first partial derivatives of the height function z(x,y) with respect to both x and y,

p(x,y) = ∂z(x,y)/∂x and q(x,y) = ∂z(x,y)/∂y   (1),
where p-q space is referred to as the "gradient space." Every point (p,q) of
the gradient space
corresponds to a particular value for the surface gradient. If the surface is
continuous, values for p
and q are dependent on each other since the cross-partial-derivatives have to
be equal, that is:
∂p(x,y)/∂y = ∂q(x,y)/∂x   (2)

(Equation (2) holds if the surface is continuous because each partial
derivative represents the second
partial derivative of the height function z(x,y) with respect to both x and
y, and x and y are
independent.) Equation (2) is referred to as the "integrability constraint,"
which, if it holds, will
ensure that the surface is smooth and satisfies equation (1).
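The gradient-space quantities of equations (1) and (2) are easy to approximate on a discrete height grid. The sketch below is illustrative only (it is not part of the patent); it computes p and q with central finite differences, falling back to one-sided differences at the grid borders:

```python
# Illustrative sketch (not part of the patent): approximating the partial
# derivatives p = dz/dx and q = dz/dy of equation (1) on a discrete height
# grid, using central finite differences (one-sided at the borders).

def gradients(z, dx=1.0, dy=1.0):
    """Return (p, q) gradient grids for the height grid z[y][x]."""
    h, w = len(z), len(z[0])
    p = [[0.0] * w for _ in range(h)]
    q = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            x0, x1 = max(x - 1, 0), min(x + 1, w - 1)
            y0, y1 = max(y - 1, 0), min(y + 1, h - 1)
            p[y][x] = (z[y][x1] - z[y][x0]) / ((x1 - x0) * dx)
            q[y][x] = (z[y1][x] - z[y0][x]) / ((y1 - y0) * dy)
    return p, q

# For the plane z = 2x + 3y the gradients are constant (p = 2, q = 3), so the
# integrability constraint of equation (2), dp/dy = dq/dx, holds trivially.
plane = [[2.0 * x + 3.0 * y for x in range(5)] for y in range(5)]
p, q = gradients(plane)
```

For a continuous surface the discrete cross-differences of p and q agree up to truncation error, mirroring the integrability constraint.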
The relationship between the image irradiance function I(x,y) and the surface
orientation (p,q)
is given by a function R(p,q), which is referred to as a reflectance map
I(x,y) = R(p(x,y),q(x,y)) (3).
Equation (3) is referred to as the "image irradiance equation." As an example,
a relatively simple
reflectance map exists for objects which have a Lambertian surface. A
Lambertian surface appears
to be equally bright from all viewing directions, with the brightness being
proportional to the light
flux incident on the surface. The reflectance RL(p,q) is proportional to the cosine of the angle α
between the direction normal to the surface, which is represented by the vector n, and the
incident light ray direction, which is represented by the vector L, that is,

RL(p,q) = cos α = n · L   (4),

where n = (-p, -q, 1), given through p(x,y) and q(x,y) and normalized to unit length, and
L = (x_L, y_L, z_L) gives the direction of the light source.
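Equation (4) is straightforward to evaluate numerically. The following sketch is illustrative (it is not the patent's code): it builds the normal from the gradients as n = (-p, -q, 1), normalizes it, and assumes L is a unit illumination vector:

```python
import math

# Illustrative sketch (not the patent's code): the Lambertian reflectance map
# of equation (4), with the normal built from the gradients as n = (-p, -q, 1)
# and normalized to unit length. L is assumed to be a unit vector.

def lambertian(p, q, L):
    """Return R_L(p, q) = n . L, clamped at zero for back-facing surfaces."""
    n = (-p, -q, 1.0)
    norm = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    n = tuple(c / norm for c in n)
    return max(0.0, sum(nc * lc for nc, lc in zip(n, L)))

# A flat patch (p = q = 0) lit head-on (L along the z axis) is at full
# brightness; tilting the patch darkens it.
head_on = lambertian(0.0, 0.0, (0.0, 0.0, 1.0))
tilted = lambertian(1.0, 0.0, (0.0, 0.0, 1.0))
```

The clamp at zero reflects that a Lambertian surface facing away from the light receives no flux.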
Typically, shape from shading is performed in two steps. First, the partial
derivatives p and
q of the height function z(x,y) are determined to get the normal information n
and in the second step
the height z(x,y) is reconstructed from p and q. The partial derivatives p an
q can be determined by
solving the system of equations consisting of the image irradiance equation
(3) and the integrability
constraint equation (2). Since images can be noisy and the assumptions noted
above are sometimes
not perfectly satisfied, there may be no solution using this methodology, and
in any case there will
be no unique solution.
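The second step, height reconstruction, can be sketched with a naive path integration of the gradients; this is an illustration only, not the patent's method, and with noisy gradients the result is path-dependent, which is one symptom of the non-uniqueness noted above:

```python
# Illustrative sketch of the second step: recovering the height field z(x, y)
# from the gradients (p, q) by naive path integration, first along the top
# row and then down each column. Noisy gradients make the result
# path-dependent, so robust systems use more careful schemes.

def integrate_heights(p, q, dx=1.0, dy=1.0):
    """Return a height grid z with the free constant fixed as z[0][0] = 0."""
    h, w = len(p), len(p[0])
    z = [[0.0] * w for _ in range(h)]
    for x in range(1, w):                 # integrate p along the top row
        z[0][x] = z[0][x - 1] + p[0][x] * dx
    for y in range(1, h):                 # integrate q down each column
        for x in range(w):
            z[y][x] = z[y - 1][x] + q[y][x] * dy
    return z

# Constant gradients p = 2, q = 3 reproduce the plane z = 2x + 3y up to the
# free constant z(0, 0) = 0.
z = integrate_heights([[2.0] * 4 for _ in range(4)],
                      [[3.0] * 4 for _ in range(4)])
```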
SUMMARY OF THE INVENTION
The invention provides a new and improved system and method for generating a
three-
dimensional model of an object by shading as applied by an operator or the
like to a two-dimensional
image of the object in the given state of its creation at any point in time.

In brief summary, the invention provides a computer graphics system for
facilitating the
generation of a three-dimensional model of an object in an interactive manner
with an operator, such
as an artist or the like. Generally, the operator will have a mental image of
the object whose model is to be generated, and the operator will co-operate
with the computer graphics system to develop
the model. The computer graphics system will display one or more images of the
object as currently
modeled from rotational orientations, translational positions, and scaling or
zoom settings as selected
by the operator, and the operator can determine whether the object corresponds
to the mental image.
In the model generation process, an initial model for the object is
initialized and an image
thereof is displayed to the operator by the computer graphics system. The
image that is displayed
will reflect a particular position of a light source and camera relative to
the object, the position of the
light source relative to the object defining an illumination direction, and
the position of the camera
relative to the object defining an image plane onto which the image of the
object is projected. Any
initial model, defining at least an infinitesimally small fragment of the
surface for the object to be
modeled, can be used, preferably occupying at least one pixel of the image
plane. The initial model
will identify, for the point or points on the image plane onto which the image
of the surface fragment
is projected, respective height values for the surface fragment defining the
distance from the image
plane for the surface fragment at that (those) point(s). The collection of height value(s) for the
respective points on the image plane comprises a height field which defines the initial model for the
object.
The initial model used in the model generation process may be one of a
plurality of default
models as provided by the computer graphics system itself, such as a model
defining a hemi-
spherical or -ellipsoid shape. Alternatively, the initial model may be
provided by the operator by
providing an initial shading of at least one pixel of the image plane, through
an operator input device
provided by the computer graphics system. If the initial model is provided by
the operator, one of
the points, or pixels, on the image plane is preferably selected to provide a
"reference" portion of the
initial surface fragment for the object, the reference initial surface
fragment portion having a selected
spatial position, rotational orientation, and height value with respect to the
image plane, and the
computer graphics system determines the initial model for the rest of the
surface fragment (if any)
in relation to shading (if any) applied to other pixels on the image plane. In
one embodiment, the
reference initial surface fragment portion is selected to be the portion of the surface fragment
projected onto the first point or pixel to which the operator applies shading. In
addition, in that embodiment,

the reference initial surface fragment portion is determined to be parallel to
the image plane, so that
a vector normal thereto is orthogonal to the image plane and the portion has a selected
height value. In any
case, the computer graphics system will display the image of the initial model,
the image defining the
shading of the object associated with the initial model as illuminated from
the particular illumination
direction and projected onto the image plane.
After the initial model has been developed and the image for the object
associated with the
initial model as projected onto the image plane has been displayed, the
operator can update the
shading of the image on the image plane, using, for example, a conventional
pressure sensitive pen
and digitizing tablet. In updating the shading, the operator can increase or
reduce the shading at
particular points in the image, thereby to control the brightness, or
intensity values, of the image at
those points. In addition, the operator can add to the surface fragment by
providing shading at points
on the image plane proximate those points onto which the surface fragment is
currently projected.
Furthermore, in an erasing mode of the shading operation, the operator can
remove portions of the
surface fragment by, for example, marking as unshaded the particular points on
the image plane onto
which the portions of the surface fragment that are to be removed are
projected. After the shading
of a point of the image plane has been updated, if the point is not marked as
being unshaded, the
computer graphics system will use the updated shading to generate an updated
normal vector which
identifies, for that point, the normal vector of the portion of the surface of the
object as projected onto
the respective point, and, using the updated normal vector field and the
height field, will generate
an updated height field for the object. The updated normal vector field and
the updated height field
define the updated model of the object, which corresponds to the updated shape
of the object as
updated based on the shading provided by the operator.
After generating the updated model of the object, the computer graphics system
can display
an image of the object, as defined by the updated model, to the operator. If
the updated model is
satisfactory, the computer graphics system can save the updated model as the
final model. On the
other hand, if the updated model is not satisfactory, the operator can update
the shading further,
thereby to enable the computer graphics system to generate a further updated
normal vector field and
updated height field, thereby to generate a further updated model for the
object. The computer
graphics system and operator can repeat these operations until the operator
determines that the object
is satisfactory.

CA 02282240 2008-05-26
24101-274
A computer graphics system constructed in
accordance with the invention avoids the necessity of
solving partial differential equations, which is required in
prior art systems which operate in accordance with the
shape-from-shading methodology.
Embodiments of the invention also allow the
operator to perform conventional computer graphics
operations in connection with the object, including rotation
and spatial translation of the object to facilitate
projection of an image of the object onto an image plane
from any of a number of rotational orientations and spatial
positions, and scaling or zooming to facilitate enlargement
or reduction of the object and/or the image. In such
embodiments, the operator can update the shading of the
image from any particular three-dimensional rotational
and/or translational orientation and position, and from the
scaling or zoom setting, as selected by the operator. In
addition, embodiments of the invention allow the operator to
trim any surface fragment, at any moment in time, or the updated final
object, which may consist of
such surface fragments, in a conventional manner by
projecting two-dimensional trim curves onto the surface of
the object. The operator can use the input device,
operating in an appropriate drawing mode, to draw these trim
curves on the image plane.
According to one aspect of the present invention,
there is provided a computer graphics system for generating
a geometrical model representing geometry of at least a
portion of a surface of a three-dimensional object by
shading by an operator in connection with a two-dimensional
image of the object, the image representing the object as
projected onto an image plane, the computer graphics system
comprising: A. an operator input device configured to

receive shading information provided by the operator, the
shading information representing a change in brightness
level of at least a portion of the image; B. a model
generator configured to receive the shading information from
the operator input device and to generate in response
thereto an updated geometrical model of the object, the
model generator being configured to use the shading
information to determine at least one geometrical feature of
the updated geometrical model; and C. an object display
configured to display the image of the object as defined by
the updated geometrical model.
According to another aspect of the present
invention, there is provided a computer implemented graphics
method for generating a geometrical model representing
geometry of at least a portion of a surface of a three-
dimensional object by shading by an operator in connection
with a two-dimensional image of the object, the image
representing the object as projected onto an image plane,
the method comprising the steps of: A. receiving shading
information provided by the operator in connection with the
image of the object, the shading information representing a
change in brightness level of at least a portion of the
image; B. generating in response to the shading information
an updated geometrical model of the object, the shading
information being used to determine at least one geometrical
feature of the updated geometrical model; and C. displaying
the image of the object as defined by the updated
geometrical model.
According to still another aspect of the present
invention, there is provided a computer readable medium
having computer executable instructions stored thereon by
one or more computers for use in connection with a computer
for generating a geometrical model representing geometry of

at least a portion of a surface of a three-dimensional
object by shading by an operator in connection with a two-
dimensional image of the object, the image representing the
object as projected onto an image plane, the computer-
readable medium having encoded thereon: A. an operator input
module configured to enable the computer to receive shading
information provided by the operator in connection with the
image of the object, the shading information representing a
change in brightness level of at least a portion of the
image; B. a model generator module configured to enable the
computer to receive the shading information from the
operator input device and to generate in response thereto an
updated geometrical model of the object, the model generator
module being configured to enable the computer to use the
shading information to determine at least one geometrical
feature of the updated geometrical model; and C. an object
display module configured to enable the computer to display
the image of the object as defined by the updated
geometrical model.
BRIEF DESCRIPTION OF THE DRAWINGS
This invention is pointed out with particularity
in the appended claims. The above and further advantages of
this invention may be better understood by referring to the
following description taken in conjunction with the
accompanying drawings, in which:
FIG. 1 depicts a computer graphics system for
generating a three-dimensional model of an object by shading
as applied by an operator or the like to a two-dimensional
image of the object in the given state of its creation at any
point in time, constructed in accordance with the invention;
FIGS. 2 through 6 are diagrams that are useful in
understanding the operations performed by the computer

graphics system depicted in FIG. 1 in determining the
updating of the model of an object by shading as applied to
the two-dimensional image of the object in its given state
of creation at any point in time; and
FIG. 7 is a flow-chart depicting operations
performed by the computer graphics system and operator in
connection with the invention.
DETAILED DESCRIPTION OF AN ILLUSTRATIVE EMBODIMENT
FIG. 1 depicts a computer graphics system 10 for
generating a three-dimensional model of an object by shading
as applied by an operator or the like to a two-dimensional
image of the object

in the given state of its creation at any point in time, constructed in
accordance with the invention.
With reference to FIG. 1, the computer graphics system includes a processor
module 11, one or more
operator input devices 12 and one or more display devices 13. The display
device(s) 13 will
typically comprise a frame buffer, video display terminal or the like, which
will display information
in textual and/or graphical form on a display screen to the operator. The
operator input devices 12
for a computer graphics system 10 will typically include a pen 14 which is
typically used in
conjunction with a digitizing tablet 15, and a trackball or mouse device 16.
Generally, the pen 14
and digitizing tablet will be used by the operator in several modes. In one
mode, particularly useful
in connection with the invention, the pen 14 and digitizing tablet are used to
provide updated shading
information to the computer graphics system. In other modes, the pen and
digitizing tablet are used
by the operator to input conventional computer graphics information, such as
line drawing for, for
example, surface trimming and other information, to the computer graphics
system 10, thereby to
enable the system 10 to perform conventional computer graphics operations. The
trackball or mouse
device 16 can be used to move a cursor or pointer over the screen to
particular points in the image
at which the operator can provide input with the pen and digitizing tablet.
The computer graphics
system 10 may also include a keyboard (not shown) which the operator can use
to provide textual
input to the system 10.
The processor module 11 generally includes a processor, which may be in the
form of one
or more microprocessors, a main memory, and will generally include a mass
storage subsystem
including one or more disk storage devices. The memory and disk storage
devices will generally
store data and programs (collectively, "information") to be processed by the
processor, and will store
processed data which has been generated by the processor. The processor module
includes
connections to the operator input device(s) 12 and the display device(s) 13,
and will receive
information input by the operator through the operator input device(s) 12,
process the input
information, store the processed information in the memory and/or mass storage
subsystem. In
addition, the processor module can provide video display information, which
can form part of the
information obtained from the memory and disk storage device as well as
processed data generated
thereby, to the display device(s) for display to the operator. The processor
module 11 may also
include connections (not shown) to hardcopy output devices such as printers
for facilitating the
generation of hardcopy output, modems and/or network interfaces (also not
shown) for connecting

the system 10 to the public telephony system and/or in a computer network for
facilitating the
transfer of information, and the like.
The computer graphics system 10 generates from input provided by the operator,
through the
pen and digitizing tablet and the mouse, information defining the initial and
subsequent shape of a
three-dimensional object, which information may be used to generate a two-
dimensional image of
the corresponding object for display to the operator, thereby to generate a
model of the object. The
image displayed by the computer graphics system 10 represents the image of the
object as
illuminated from an illumination direction and as projected onto an image
plane, with the object
having a spatial position and rotational orientation relative to the
illumination direction and the
image plane and a scaling and/or zoom setting as selected by the operator. The
initial model used
in the model generation process may be one of a plurality of default models as
provided by the computer
graphics system itself, such as a model defining a hemi-spherical or -
ellipsoid shape. Alternatively,
the initial model may be provided by the operator by providing an initial
shading of at least one pixel
of the image plane, using the pen 14 and digitizing tablet 15. If the initial
model is provided by the
operator, one of the pixels on the image plane is selected to provide a
"reference" portion of the
initial surface fragment for the object, the reference initial surface
fragment portion having a selected
spatial position, rotational orientation and height value with respect to the
image plane, and the
computer graphics system determines the initial model for the rest of the
surface fragment (if any)
in relation to shading (if any) applied to other pixels on the image plane. In
one embodiment, the
reference initial surface fragment portion is selected to be the portion of the surface fragment
that is projected onto the first pixel on the image plane to which the operator applies
shading. In addition, in that
embodiment, the reference initial surface fragment portion is determined to be
parallel to the image
plane, so that a vector normal to the reference initial surface fragment
portion is orthogonal to the
image plane and the reference initial surface fragment portion has a height
value as selected by the
operator. In any case, the computer graphics system will display the image of the
initial model, the
image defining the shading of the object associated with the initial model as
illuminated from the
particular illumination direction and projected onto the image plane.
The operator, using the mouse and the pen and digitizing tablet, will provide
updated shading
of the image of the initial object, and/or extend the object by shading
neighboring areas on the image
plane, and the computer graphics system 10 will generate an updated model
representing the shape
of the object based on the updated shading provided by the operator. In
updating the shading, the

CA 02282240 2006-08-22
24101-274
operator can increase or decrease the amount of shading applied to particular
points on the image
plane. In addition, the operator, using the mouse or trackball and the pen and
digitizing tablet, can
perform conventional computer graphics operations in connection with the
image, such as trimming
of the surface representation of the object defined by the model. The
computer graphics system 10
can use the updated shading and other computer graphic information provided by
the operator to
generate the updated model defining the shape of the object, and further
generate from the updated
model a two-dimensional image for display to the operator, from respective
spatial position(s),
rotational orientation(s) and scaling and/or zoom settings as selected by the
operator. If the operator
determines that the shape of the object as represented by the updated model is
satisfactory, he or she
can enable the computer graphics system 10 to store the updated model as
defining the shape of the
final object. On the other hand, if the operator determines that the shape of
the object as represented
by the updated model is not satisfactory, he or she can cooperate with the
computer graphics system
to further update the shading and other computer graphic information, in the
process using three-
dimensional rotation and translation and scaling or zooming as needed. As the
shading and other
computer graphic information is updated, the computer graphics system 10
updates the model
information, which is again used to provide a two-dimensional image of the
object, from rotational
orientations, translation or spatial position settings, and scale and/or zoom
settings as selected by the
operator. These operations can continue until the operator determines that the
shape of the object
is satisfactory, at which point the computer graphics system 10 will store the
updated model
information as representing the final object.
The detailed operations performed by the computer graphics system 10 in
determining the
shape of an object will be described in connection with FIGS. 2 through 7.
With reference to FIG.
2, in the operations of the computer graphics system 10, it is assumed that
the image of the object
is projected onto a two-dimensional image plane 20 that is tessellated into
pixels having a
predetermined number of rows and columns. The image plane 20 defines an x,y
Cartesian plane,
with rows extending in the "x" direction and columns extending in the "y"
direction. The projection
of the surface of the object, which is identified in FIG. 2 by reference
numeral 22, that is to be
formed is orthographic, with the direction of the camera's "eye" being in the
"z" direction, orthogonal
to the x,y image plane. Each point on the image plane corresponds to a picture
element, or "pixel,"
represented herein by e(i,j), with i ∈ [1,N] and j ∈ [1,M], where "N" is the maximum
number of columns
(index "i" ranging over the columns in the image plane) and "M" is the maximum
number of rows

(index "j" ranging over the rows in the image plane). In the illustrative
image plane 20 depicted in
FIG. 2, the number of columns "N" is eight, and the number of rows "M" is
nine. If the display
device(s) 13 which are used to depict the image plane 20 to the operator are
raster-scan devices, the rows may correspond to scan lines used by the
device(s) to display the image. Each pixel e(i,j)
corresponds to a particular point (xi, yj) of the coordinate system, and "M" by
"N" identifies the
resolution of the image. In addition, the computer graphics system 10 assumes
that the object is
illuminated by a light source having a direction L = (xL, yL, zL), where "L" is a vector, and that
the surface of the object is Lambertian. The implicit camera, whose image
plane is represented by
the image plane 20, is assumed to view the image plane 20 from a direction
that is orthogonal to
the image plane 20, as is represented by the arrow with the label "CAMERA."
As noted above, the computer graphics system 10 initializes the object with at
least an
infinitesimally small portion of the object to be modeled as the initial
model. For each pixel e(i,j) the
height value z(x,y) defining the height of the portion of the object projected
onto the pixel is known,
and is defined as a height field H(x,y) as follows:
H(x, y) = {z(x, y) : ∀(x, y) ∈ Ω}     (5),
where "∀(x, y) ∈ Ω" refers to "for all points (x, y) in the domain Ω," with the domain Ω referring to the
image plane 20. Furthermore, for each pixel e(i,j), the normal n(x,y) of the
portion of the surface of
the basic initial object projected thereon is also known and is defined as a
normal field N(x,y) as
follows:
N(x, y) = {n(x, y) : ∀z(x, y) ∈ H(x, y)}     (6).
In FIG. 2, the normal associated with the surface 22 of the object projected
onto one of the pixels of the
image plane 20 is represented by the arrow labeled "n."
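The height field of equation (5) and the normal field of equation (6) can be illustrated in code. The following Python fragment is a sketch only, not the patent's implementation; the function name, grid size, and radius are illustrative assumptions. It builds the default hemispherical initial model mentioned above as a height field over the pixel grid, with a unit normal recorded for each pixel inside the hemisphere's footprint.

```python
import math

def hemisphere_initial_model(n_cols=8, n_rows=9, radius=3.0):
    """Build a default hemispherical initial model as a height field
    H(x, y) and a unit normal field N(x, y) over an N x M image plane.
    Pixels outside the hemisphere's footprint keep height 0 and are
    treated as unshaded background."""
    height = [[0.0] * n_cols for _ in range(n_rows)]
    normals = [[None] * n_cols for _ in range(n_rows)]
    for j in range(n_rows):
        for i in range(n_cols):
            # Pixel-center coordinates, origin at the image-plane center.
            x = i - (n_cols - 1) / 2.0
            y = j - (n_rows - 1) / 2.0
            rr2 = radius**2 - x * x - y * y
            if rr2 > 0.0:
                z = math.sqrt(rr2)          # height above the image plane
                height[j][i] = z
                # For a sphere centered on the image plane, the unit
                # normal at (x, y, z) is simply (x, y, z) / radius.
                normals[j][i] = (x / radius, y / radius, z / radius)
    return height, normals
```

Because z = sqrt(r^2 - x^2 - y^2) inside the footprint, each stored normal is exactly unit length, as required of the normal field.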
After the computer graphics system 10 displays the image representing the object defined by
the initial model, which is displayed to the operator on the display 13 as the image on image plane
20, the operator can begin to modify it (that is, the image) by updating the shading of the image using
the pen 14 and digitizing tablet 15 (FIG. 1). It will be appreciated that the
image of the initial model
as displayed by the computer graphics system will itself be shaded to
represent the shape of the
object as defined by the initial model, as illuminated from the predetermined
illumination direction

and as projected onto the image plane. Each pixel e(i,j) on the image plane will have an associated
intensity value I(x, y) (which is also referred to herein as a "pixel value") which represents the relative
brightness of the image at the pixel e(i,j), and which, inversely, represents the relative shading of the
pixel. If the initial pixel value for each pixel e(i,j) is given by I0(x, y), which represents the image
intensity value or brightness of the respective pixel e(i,j) at location (x, y) on the image plane 20, and
the pixel value after shading is represented by I1(x, y), then the operator preferably updates the
shading for the image such that, for each pixel,
|I1(x, y) - I0(x, y)| < εI   for (x, y) ∈ Ω     (7),
where "εI" (εI > 0) is a predetermined bound value selected so that, if equation
(7) is satisfied for each
pixel, the shape of the object can be updated based on the shading provided by
the operator.
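As an illustration of the bound in equation (7), the following Python sketch (all names are illustrative, and the clamping policy is an assumption, not stated in the text) checks whether an operator's updated pixel value stays within the bound and shows one plausible way to force an out-of-bound update back into range.

```python
def shading_within_bound(i0, i1, eps_i):
    """Equation (7): the updated pixel value i1 must differ from the
    prior value i0 by less than the bound eps_i so that the shape can
    be updated from the new shading."""
    return abs(i1 - i0) < eps_i

def clamp_shading(i0, i1, eps_i):
    """One plausible handling of an out-of-bound update (an assumption,
    not from the patent text): clamp the new value into the interval
    [i0 - eps_i, i0 + eps_i] before the model update is attempted."""
    return min(max(i1, i0 - eps_i), i0 + eps_i)
```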
After the operator updates the shading for a pixel, the computer graphics
system 10 will
perform two general operations in generation of the updated shape for the
object. In particular, the
computer graphics system 10 will
(i) first determine, for each pixel e(i,j) whose shading is updated, a respective new normal
vector n1(x, y); and
(ii) after generating an updated normal vector n1(x, y), determine a new height value z(x, y).
The computer graphics system 10 will perform these operations (i) and (ii) for each pixel e(i,j) whose
shading is updated, as the shading is updated, thereby to provide a new normal vector field N(x, y)
and height field H(x, y). Operations performed by the computer graphics system 10 in connection
with updating of the normal vector n1 (item (i) above) for a pixel e(i,j) will be described in connection
with FIGS. 3 and 4, and operations performed in connection with updating of the height value z(x, y)
(item (ii) above) for the pixel e(i,j) will be described in connection with FIGS. 5 and 6.
With reference initially to FIG. 3, that FIG. depicts a portion of the object,
identified by
reference numeral 30, after a pixel's shading has been updated by the
operator. In the following, it
will be assumed that the updated normal vector, identified by the arrow identified by legend "n1,"
for a point z(x,y) on the surface of the object 30, is to be determined. The normal vector identified
by legend "n0," represents the normal to the surface prior to the updating.
The illumination direction
is represented by the line extending from the vector corresponding to the
arrow identified by legend
"L." "L" specifically represents an illumination vector whose direction is
based on the direction of
illumination from the light source illuminating the object, and whose
magnitude represents the

magnitude of the illumination on the object provided by the light source. In
that case, based on the
updating, the set of possible new normal vectors lie on the surface of the
cone 31 which is defined
by:
n1 · L = I     (8),
that is, the set of vectors for which the dot product with the illumination
vector corresponds to the
pixel value "I" for the pixel after the updating of the shading as provided by
the operator. In
addition, since the normal vector n1 is, as is the case with all normal
vectors, normalized to have a
predetermined magnitude value, preferably the value "one," the updated normal
vector has a
magnitude corresponding to:
n1 · n1 = ||n1|| = 1     (9),
where "||n1||" refers to the magnitude of the updated normal vector n1.
Equations (8) and (9) define a set of vectors, and the magnitudes of the
respective vectors,
one of which is the updated normal vector for the updated object at point
z(x,y). The computer
graphics system 10 will select one of the vectors from the set as the
appropriate updated normal
vector n1 as follows. As noted above, the updated normal vector will lie on
the surface of cone 31.
It is apparent that, if the original normal vector n0 and the illumination vector L are not parallel, then
they (that is, the prior normal vector n0 and the illumination vector L) will
define a plane. This
follows since the point z(x,y) at which the illumination vector L impinges on
the object 30, and the
origin of the normal vector n0 on object 30, is the same point, and the tail
of the illumination vector
and head of the prior normal vector n0 will provide the two additional points which, with the point
z(x,y), suffice to define a plane. Thus, if a plane, which is identified by
reference numeral 32, is
constructed on which both the illumination vector L and the prior normal
vector n0 lie, that plane 32
will intersect the cone along two lines, which are represented by lines 33 in
FIG. 3. One of the lines
33 lies on the surface of the cone 31 which is on the side of the illumination
vector L towards the
prior normal vector n0, and the other line 33 lies on the surface of the cone 31 which is on the side
of the illumination vector L away from the normal vector n0, and the correct updated normal vector
n1 is defined by the line on the cone 31 which is on the side of the illumination vector L towards the
prior normal vector n0.

Based on these observations, the direction of the updated normal vector can be
determined
from equation (8) and the following. Since the prior normal vector n0 and the illumination vector
L form a plane 32, their cross product, "n0 × L," defines a vector that is normal to the plane 32. Thus,
since the updated normal vector n1 also lies in the plane 32, the dot product of the updated normal
vector n1 with the vector defined by the cross product between the prior normal vector n0 and the
illumination vector L has the value zero, that is,
n1 · (n0 × L) = 0     (10).
In addition, since the difference between the pixel values I0 and I1 provided by the prior shading and
the updated shading is bounded by εI (equation (7) above), the angle θ between the prior normal vector
n0 and the updated normal vector n1 is also bounded by some maximum positive value εθ. As a result,
equation (10) can be re-written as
|n1 · (n0 × L)| < εθ     (11).
This is illustrated diagrammatically in FIG. 4. FIG. 4 depicts a portion of the cone 31 depicted in
FIG. 3, the updated normal vector n1, and a region, identified by reference numeral 34, that
represents the maximum angle εθ from the prior normal vector in which the updated normal vector
n1 is constrained to lie.
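The selection of the updated normal described by equations (8) through (11) can be sketched numerically. The Python fragment below is an illustrative sketch, not the patent's implementation; the function and its parameter names are assumptions. It fixes the component of n1 along L from n1 · L = I, constrains n1 to the plane spanned by n0 and L on the side of L toward n0, and normalizes the result to unit length per equation (9).

```python
import math

def updated_normal(n0, L, intensity):
    """Pick the new unit normal n1 on the cone n1 . L = I, in the plane
    of the prior normal n0 and the illumination vector L, on the side
    of L toward n0.  Assumes n0 is unit length, n0 and L are not
    parallel, and |I| <= |L|."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    L_mag = math.sqrt(dot(L, L))
    L_hat = [x / L_mag for x in L]

    # Component along L fixed by the Lambertian relation n1 . L = I.
    a = intensity / L_mag

    # In-plane unit direction orthogonal to L on the side of n0
    # (a Gram-Schmidt step), keeping n1 in the plane of n0 and L.
    proj = dot(n0, L_hat)
    m = [x - proj * lh for x, lh in zip(n0, L_hat)]
    m_mag = math.sqrt(dot(m, m))
    m = [x / m_mag for x in m]

    # Unit length (equation (9)) fixes the remaining component.
    b = math.sqrt(max(0.0, 1.0 - a * a))
    return [a * lh + b * mx for lh, mx in zip(L_hat, m)]
```

Taking the positive root for the in-plane component selects the intersection line of cone 31 and plane 32 on the side of L toward n0, which is the choice the text describes.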
The computer graphics system 10 (FIG. 1) will generate an updated normal
vector n1 for each
pixel e(i,j) in the image plane 20 based on the shading provided by the operator, thereby to generate
an updated normal vector field N(x, y). After the computer graphics system 10 has
generated the updated
normal vector for a pixel, it can generate a new height value z(x,y) for that
pixel, thereby to update
the height field H(x,y) based on the updated shading. Operations performed by
the computer
graphics system 10 in connection with updating the height value z(x,y) will be
described in
connection with FIGS. 5 and 6. FIG. 5 depicts an illustrative updated shading
for the image plane
20 depicted in FIG. 2. For the image plane 20 depicted in FIG. 5, the pixels e(i,j) have been provided
with coordinates, with the rows being identified by numbers in the range from
1 through 8, inclusive,
and the columns being identified by letters in the range A through I
inclusive. As shown in FIG. 5,
in the updated shading, the pixels E,1 through E,3, D,3 through D,4, and C,5 through C,8 have all
been modified, and the computer graphics system 10 is to generate an updated
height value h(x,y)
therefor for use as the updated height value for the pixel in the updated
height field H(x,y). To

accomplish that, the computer graphics system 10 performs several operations,
which will be
described below, to generate a height value for each pixel e(i,j) whose shading
has been modified along
a vertical direction, a horizontal direction, and two diagonal directions, and
generates the final height
value for the pixel as the average of the four height values (that is, the
height values along the
vertical, horizontal, and two diagonal directions).
The operations performed by the computer graphics system 10 in generating an
updated
height value will be described in connection with one of the modified pixels
in the image plane 20,
namely, pixel D,4, along one of the directions, namely, the horizontal
direction. Operations
performed in connection with the other directions, and the other pixels whose
shading is updated,
will be apparent to those skilled in the art. In generating an updated height
value, the computer
graphics system 10 makes use of Bezier-Bernstein interpolation, which defines a curve P(t) of degree
"n" as
P(t) = Σ(i=0 to n) Bi (n choose i) t^i (1 - t)^(n-i)     (12),
where "t" is a numerical parameter on the interval between "zero" and "one,"
inclusive, and vectors
Bi (defined by components (bix, biy, biz)) define "n+1" control points for the curve P(t), with control
points B0 and Bn comprising the endpoints of the curve. The tangents of the curve P(t) at the
endpoints correspond to the vectors B0B1 and B(n-1)Bn. In one embodiment, the computer graphics
system 10 uses a cubic Bezier-Bernstein interpolation
P(n=3)(t) = B0(1 - t)^3 + 3 B1 t (1 - t)^2 + 3 B2 t^2 (1 - t) + B3 t^3     (13)
to generate the updated height value. The points B0, B1, B2, and B3 are control points for the cubic
curve P(n=3)(t).
Equation (13), as applied to the determination of the updated height value h1 for the pixel D,4,
corresponds to
h1 = ha (1 - t)^3 + 3 B1 t (1 - t)^2 + 3 B2 t^2 (1 - t) + hb t^3     (14).
It will be appreciated from equation (14) that, for "t" equal to "zero," the updated height value h1 for
pixel D,4 corresponds to ha, which is the height value for pixel C,4, and for "t" equal to "one," the
updated height value h1 for pixel D,4 corresponds to hb, which is the height value for pixel E,4. On

the other hand, for "t" having a value other than zero or one, the updated height value h1 is a function
of the height values ha and hb of the pixels C,4 and E,4 and the height values for control points B1
and B2.
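The cubic interpolation of equations (13) and (14) reduces to a one-line Bernstein evaluation. The Python sketch below (the helper name is an illustrative choice) evaluates the cubic componentwise, so it applies equally to the vector form of equation (13) and to the scalar height form of equation (14) with endpoints ha and hb.

```python
def cubic_bezier(b0, b1, b2, b3, t):
    """Evaluate the cubic Bezier-Bernstein curve of equation (13):
    P(t) = B0 (1-t)^3 + 3 B1 t (1-t)^2 + 3 B2 t^2 (1-t) + B3 t^3,
    applied here to one scalar component at a time."""
    s = 1.0 - t
    return (b0 * s**3 + 3.0 * b1 * t * s**2
            + 3.0 * b2 * t**2 * s + b3 * t**3)
```

At t = 0 the value is the first endpoint and at t = 1 the last, matching the endpoint behavior noted for equation (14).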
As noted above, for an "n" degree curve P(t), the tangents at the endpoints B0 and Bn
correspond to the vectors B0B1 and B(n-1)Bn. Thus, for the curve P(n=3)(t) shown in FIG. 6, the vector
B0B1 that is defined by endpoint B0 and adjacent control point B1 is tangent to the curve P(n=3)(t) at
endpoint B0 and the vector B2B3 defined by endpoint B3 and adjacent control point B2 is tangent to
the curve at endpoint B3. Accordingly, the vector B0B1 is orthogonal to the normal vector na at pixel
C,4 and the vector B2B3 is orthogonal to the normal vector nb at pixel E,4. Thus,
0 = (B1 - B0) · na and 0 = (B2 - B3) · nb     (15),
which leads to
0 = (B1 - ha) · na and 0 = (B2 - hb) · nb     (16).
For the determination of the updated height value h, for the horizontal
direction (see FIG. 5),
the equation (14), which is in vector form, gives rise to the following
equations for each of the
dimensions "x" and "z" (the "z" dimension being orthogonal to the image
plane):
h1x = hax (1 - t)^3 + 3 b1x t (1 - t)^2 + 3 b2x t^2 (1 - t) + hbx t^3     (17)
and
h1z = haz (1 - t)^3 + 3 b1z t (1 - t)^2 + 3 b2z t^2 (1 - t) + hbz t^3     (18),
where the "x" and "z" subscripts in equations (17) and (18) indicate the
respective "x" and "z"
components for the respective vectors in equation (14). It will be appreciated
that, for equations (17)
and (18), only the value of the "z" component, h1z, of the height value is unknown; the value of the "x"
component, h1x, will be a function of the position of the pixel whose height
value is being
determined, in this case pixel D,4. In addition, equation (16) gives rise to
the following two
equations
0 = (b1x - hax) nax + (b1y - hay) nay + (b1z - haz) naz     (19),
and

0 = (b2x - hbx)·nbx + (b2y - hby)·nby + (b2z - hbz)·nbz (20),
where subscripts "x," "y" and "z" in equations (19) and (20) indicate the
respective "x," "y" and "z"
components for the respective vectors in equation (16).
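As a concrete illustration, the cubic Bernstein form of equations (17) and (18) can be evaluated componentwise; a minimal Python sketch (function and argument names are illustrative, not from the patent):

```python
def bezier_height(h_a, b1, b2, h_b, t):
    """Evaluate the cubic Bernstein form of equations (17)/(18) componentwise:
    h1 = h_a(1-t)^3 + 3*b1*t(1-t)^2 + 3*b2*t^2(1-t) + h_b*t^3.
    Each argument is a tuple of components (e.g. (x, z))."""
    u = 1.0 - t
    return tuple(
        ha * u**3 + 3 * c1 * t * u**2 + 3 * c2 * t**2 * u + hb * t**3
        for ha, c1, c2, hb in zip(h_a, b1, b2, h_b)
    )
```

At t = 0 the result is the endpoint height ha, and at t = 1 the endpoint height hb, consistent with the endpoint interpolation property noted in the text.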
In addition, as noted above, there is the further constraint on the curve Pn=3(t), in particular the constraint that the updated normal n1 be normal to the curve at the point corresponding to pixel d14. If the vector B012B123 in FIG. 6 is tangent to the curve at the point corresponding to pixel d14, then the point h1, whose "z" component corresponds to the updated height value, also lies on the vector B012B123. Thus,
0 = (B012 - h1) · n1 (21),
and
0 = (B123 - h1) · n1 (22).
Based on the convex combination depicted in FIG. 6,
B012 = B01 + t(B12 - B01) = B01(1 - t) + B12·t (23)
and
B123 = B12 + t(B23 - B12) = B12(1 - t) + B23·t (24),
which lead to
B012 = B0 + t(B1 - B0) + t[B1 + t(B2 - B1) - B0 - t(B1 - B0)] (25)
and
B123 = B1 + t(B2 - B1) + t[B2 + t(B3 - B2) - B1 - t(B2 - B1)] (26).
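The convex combinations of equations (23) and (24) are the de Casteljau construction; a short sketch (helper names are illustrative) that also exhibits the expanded quadratic form used in equation (27):

```python
def lerp(p, q, t):
    """Convex combination p + t(q - p), componentwise."""
    return tuple(a + t * (b - a) for a, b in zip(p, q))

def de_casteljau_midpoints(B0, B1, B2, B3, t):
    """First-level points B01, B12, B23 and second-level points
    B012, B123 per equations (23) and (24)."""
    B01, B12, B23 = lerp(B0, B1, t), lerp(B1, B2, t), lerp(B2, B3, t)
    B012, B123 = lerp(B01, B12, t), lerp(B12, B23, t)
    return B012, B123
```

As equation (27) indicates, B012 also equals the quadratic Bernstein combination B0(1 - t)^2 + 2B1·t(1 - t) + B2·t^2 of the first three control points.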

Combining equations (21), (23) and (25),
0 = (B01(1 - t) + B12·t - h1) · n1 = (B0(1 - t)^2 + 2B1·t(1 - t) + B2·t^2 - h1) · n1 (27),
which leads to
0 = (b0x(1 - t)^2 + 2b1x·t(1 - t) + b2x·t^2 - h1x)·n1x and
0 = (b0z(1 - t)^2 + 2b1z·t(1 - t) + b2z·t^2 - h1z)·n1z (28)
for the "x" and "z" components of the respective vectors. Similarly, for equations (22), (24) and (26),
0 = (b1x(1 - t)^2 + 2b2x·t(1 - t) + b3x·t^2 - h1x)·n1x and
0 = (b1z(1 - t)^2 + 2b2z·t(1 - t) + b3z·t^2 - h1z)·n1z (29)
for the "x" and "z" components of the respective vectors.
It will be appreciated that the eight equations (17) through (20), (28) and (29) are all one-dimensional in the respective "x" and "z" components. For the equations (17) through (20), (28) and (29), there are six unknown values, namely, the value of parameter t, the values of the "x" and "z" components of the vector B1 (that is, values b1x and b1z), the "x" and "z" components of the vector B2 (that is, values b2x and b2z), and the "z" component of the vector h1 (that is, value h1z) to the point Pn=3(t) for the pixel d14. The eight equations (17) through (20), (28) and (29) are sufficient to define a system of equations which will suffice to allow the values for the unknowns to be determined by methodologies which will be apparent to those skilled in the art.
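The structure of this system can be checked numerically by evaluating the residuals of all eight equations at a configuration constructed to satisfy them; the sketch below is restricted to the (x, z) components (the y terms of equations (19) and (20) are omitted), and all names are hypothetical rather than from the patent:

```python
def residuals(B, h_a, h_b, n_a, n_b, n1, h1, t):
    """Residuals of the eight scalar equations (17)-(20), (28) and (29),
    with every vector given as an (x, z) pair. B = [B0, B1, B2, B3]."""
    B0, B1, B2, B3 = B
    u = 1.0 - t
    r = []
    # (17), (18): cubic Bernstein interpolation of the height point h1
    for i in (0, 1):
        cubic = h_a[i]*u**3 + 3*B1[i]*t*u**2 + 3*B2[i]*t**2*u + h_b[i]*t**3
        r.append(cubic - h1[i])
    # (19), (20): control legs orthogonal to the endpoint normals
    r.append(sum((B1[i] - h_a[i]) * n_a[i] for i in (0, 1)))
    r.append(sum((B2[i] - h_b[i]) * n_b[i] for i in (0, 1)))
    # (28), (29): updated normal n1 orthogonal to the curve at h1,
    # via the quadratic Bernstein forms of B012 and B123
    for P, Q, R in ((B0, B1, B2), (B1, B2, B3)):
        for i in (0, 1):
            quad = P[i]*u**2 + 2*Q[i]*t*u + R[i]*t**2
            r.append((quad - h1[i]) * n1[i])
    return r
```

A root of this residual vector in the six unknowns (t, b1x, b1z, b2x, b2z, h1z) can then be found by whatever numerical method one prefers, as the text leaves to the skilled reader.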
The computer graphics system 10 will, in addition to performing the operations
described
above in connection with the horizontal direction (corresponding to the "x"
coordinate axis), also
perform corresponding operations similar to those described above for each of
the vertical and two
diagonal directions to determine the updated height vector h1 for the pixel d14. After the computer
graphics system 10 determines the updated height vectors for all four
directions, it will average them
together. The "z" component of the average of the updated height vectors
corresponds to the height
value for the updated model for the object.
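The final averaging step can be sketched as follows (names and the component ordering, with z last, are illustrative assumptions):

```python
def average_height(h_vectors):
    """Average the updated height vectors from the horizontal, vertical and
    two diagonal directions; return the z component (assumed last), which
    is the height value for the updated model."""
    n = len(h_vectors)
    avg = tuple(sum(v[i] for v in h_vectors) / n
                for i in range(len(h_vectors[0])))
    return avg[-1]  # z component of the averaged vector
```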
The operations performed by the computer graphics system 10 will be described
in
connection with the flowchart in FIG. 7. Generally, it is anticipated that the
operator will have a

mental image of the object that is to be modeled by the computer graphics
system. With reference
to FIG. 7, the initial model for the object is determined (step 100), and the
computer graphics system
displays a two dimensional image thereof to the operator based on a
predetermined illumination
direction, with the display direction corresponding to an image plane
(reference image plane 20
depicted in FIG. 2) (step 101). As noted above, the initial model may define a
predetermined default
shape, such as a hemisphere or hemi-ellipsoid, provided by the computer graphics
system, or
alternatively a shape as provided by the operator. In any case, the shape will
define an initial normal
vector field N(x,y) and height field H(x,y), defining a normal vector and
height value for each pixel
in the image. After the computer graphics system 10 has displayed the initial
model, the operator can
select one of a plurality of operating modes, including a shading mode in
connection with the
invention, as well as one of a plurality of conventional computer graphics
modes, such as erasure and
trimming (step 102). If the operator selects the shading mode, the operator
will update the shading
of the two-dimensional image by means of, for example, the system's pen and
digitizing tablet (step
103). While the operator is applying shading to the image in step 103, the
computer graphics system
can display the shading to the operator. The shading that is applied by the
operator will
preferably be a representation of the shading of the finished object as it
would appear illuminated
from the predetermined illumination direction, and as projected onto the image
plane as displayed
by the computer graphics system 10.
When the operator has updated the shading for a pixel in step 103, the
computer graphics
system 10 will generate an update to the model of the object. In generating
the updated model, the
computer graphics system 10 will first determine, for each pixel in the image,
an updated normal
vector, as described above in connection with FIGS. 3 and 4, thereby to
provide an updated normal
vector field for the object (step 104). Thereafter, the computer graphics
system 10 will determine,
for each pixel in the image, an updated height value, as described above in
connection with FIGS.
5 and 6, thereby to provide an updated height field for the object (step 105).
After generating the updated normal vector field and updated height field,
thereby to provide
an updated model of the object, the computer graphics system 10 will display
an image of the
updated model to the operator from one or more directions and zooms as
selected by the operator
(step 106), in the process rotating, translating and scaling and/or zooming
the image as selected by
the operator (step 107). If the operator determines that the updated model is
satisfactory (step 108),
which may occur if, for example, the updated model corresponds to his or her
mental image of the

object to be modeled, he or she can enable the computer graphics system 10 to
save the updated
model as the final model of the object (step 109). On the other hand, if the
operator determines in
step 108 that the updated model is not satisfactory, he or she can enable the
computer graphics
system 10 to return to step 101.
Returning to step 102, if the operator in that step selects another operating
mode, such as the
erasure mode or a conventional operational mode such as the trimming mode, the
computer graphics
system will sequence to step 110 to update the model based on the erasure
information, or the
trimming and other conventional computer graphic information provided to the
computer graphics
system 10 by the operator. The computer graphics system will sequence to step
107 to display an
image of the object based on the updated model. If the operator determines
that the updated model
is satisfactory (step 108), he or she can enable the computer graphics system
10 to save the updated
model as the final model of the object (step 109). On the other hand, if the
operator determines in
step 108 that the updated model is not satisfactory, he or she can enable the
computer graphics
system 10 to return to step 101.
The operator can enable the computer graphics system 10 to perform steps 101,
103 through
107 and 110 as the operator updates the shading of the image of the object
(step 103), or provides
other computer graphic information (step 110), and the computer graphics
system 10 will generate,
in steps 104 and 105, the updated normal vector field and updated height
field, or, in step 110,
conventional computer graphic components, thereby to define the updated model
of the object.
When the operator determines in step 108 that the updated model corresponds to
his or her mental
image of the object, or is otherwise satisfactory, he or she can enable the
computer graphics system
to store the updated normal vector field and the updated height field to
define the final model for
the object (step 109).
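The flow of FIG. 7 (steps 100 through 110) can be summarized as a control loop; in this sketch operator interaction is stubbed out with callables, and every name is illustrative rather than from the patent:

```python
def modeling_session(initial_model, get_mode, apply_shading, apply_other_ops,
                     update_normals, update_heights, is_satisfactory,
                     max_rounds=100):
    """Skeleton of the FIG. 7 loop: select a mode, apply shading or other
    edits, update the normal-vector and height fields, and repeat until
    the operator accepts the model (display steps are elided)."""
    model = initial_model                          # step 100
    for _ in range(max_rounds):                    # steps 101/106/107: display each pass
        mode = get_mode()                          # step 102
        if mode == "shading":
            shading = apply_shading(model)         # step 103
            model = update_normals(model, shading) # step 104
            model = update_heights(model, shading) # step 105
        else:
            model = apply_other_ops(model)         # step 110: erasure, trimming, ...
        if is_satisfactory(model):                 # step 108
            return model                           # step 109: save as final model
    return model
```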
The invention provides a number of advantages. In particular, it provides an
interactive
computer graphics system which allows an operator, such as an artist, to
imagine the desired shape
of an object and how the shading on the object might appear with the object
being illuminated from
a particular illumination direction and as viewed from a particular viewing
direction (as defined by
the location of the image plane). After the operator has provided some shading
input corresponding
to the desired shape, the computer graphics system displays a model of the
object, as updated based
on the shading, to the operator. The operator can accept the model as the
final object, or alternatively
can update the shading further, from which the computer graphics system will
further update the

model of the object. The computer graphics system constructed in accordance
with the invention
avoids the necessity of solving partial differential equations, which is
required in prior art systems
which operate in accordance with the shape-from-shading methodology.
A further advantage of the invention is that it readily facilitates the use of
a hierarchical
representation for the model of the object that is generated. Thus, if, for
example, the operator
enables the computer graphics system 10 to increase the scale of the object or
zoom in on the object
thereby to provide a higher resolution, it will be appreciated that a
plurality of pixels of the image
will display a portion of the image which, at the lower resolution, were
associated with a single pixel.
In that case, if the operator updates the shading of the image at the higher
resolution, the computer
graphics system will generate the normal vector and height value for each
pixel at the higher
resolution for which the shading is updated as described above, thereby to
generate and/or update
the portion of the model associated with the updated shading at the increased
resolution. The
updated portion of the model at the higher resolution will be associated with
the particular portion
of the model which was previously defined at the lower resolution, thereby to
provide the
hierarchical representation, which may be stored. Thus, the object as defined
by the model inherits
a level of detail which corresponds to a higher resolution in the underlying
surface representation.
Corresponding operations can be performed if the operator enables the computer
graphics system
to decrease the scale of the object or zoom out from the object, thereby
providing a lower
resolution.
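The hierarchical idea can be sketched with a simple two-level structure in which refined heights are stored under their coarse parent pixel, so unrefined regions inherit the lower-resolution model; the quadtree-style layout and all names here are illustrative assumptions, not the patent's data structure:

```python
class HierarchicalHeightField:
    """Coarse height values with optional finer-level overrides stored
    per parent pixel (a two-level quadtree-style sketch)."""
    def __init__(self, coarse):
        self.coarse = dict(coarse)   # (x, y) -> height at the low resolution
        self.fine = {}               # parent (x, y) -> {fine (x, y): height}

    def refine(self, parent, fine_heights):
        # store higher-resolution detail under the coarse pixel
        self.fine[parent] = dict(fine_heights)

    def height_at(self, x, y):
        # a fine pixel (x, y) falls under coarse parent (x // 2, y // 2);
        # fall back to the inherited coarse height when no detail exists
        parent = (x // 2, y // 2)
        detail = self.fine.get(parent, {})
        return detail.get((x, y), self.coarse[parent])
```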
It will be appreciated that a number of variations and modifications may be
made to the
computer graphics system 10 as described above in connection with FIGS. 1
through 7. For
example, the computer graphics system 10 can retain the object model
information, that is, the
normal vector field information and height field information, for a number of
updates of the shading
as provided by the operator, which it (that is, system 10) may use in
displaying models of the object
for the respective updates. This can allow the operator to view images of the
respective models to,
for example, enable him or her to see the evolution of the object through the
respective updates. In
addition, this can allow the operator to return to a model from a prior update
as the base which is to
be updated. This will allow the operator to, for example, generate a tree of
objects based on different
shadings at particular models.
In addition, although the computer graphics system 10 has been described as making use of
making use of
Bezier-Bernstein interpolation to determine the updated height field h(x,y),
it will be appreciated that

CA 02282240 1999-08-20
WO 98/37515 PCT/IB98/00612
-21-
other forms of interpolation, such as Taylor polynomials and B-splines, may be
used. In addition,
multiple forms of surface representations may be used with the invention.
Indeed, since the model
generation methodology used by the computer graphics system 10 is of general
applicability, all free-
form surface representations as well as piecewise linear surfaces consisting
of, for example, triangles,
quadrilaterals and/or pentagons can be used.
Furthermore, although the computer graphics system 10 has been described as
making use
of an orthogonal projection and a single light source, it will be appreciated
that other forms of
projection, including perspective projection, and multiple light sources can
be used.
In addition, although the computer graphics system 10 has been described as
providing shape
of an object by shading of an image of the object, it will be appreciated that
it may also provide
computer graphics operations, such as trimming and erasure, through
appropriate operational modes
of the pen 14 and digitizing tablet.
Furthermore, although the computer graphics system has been described as
generating a
model of an object on the assumption that the object's surface is Lambertian,
it will be appreciated
that other surface treatments may be used for the object when an image of the
object is rendered.
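Under the Lambertian assumption, the displayed intensity of a pixel depends only on the cosine of the angle between its surface normal and the illumination direction (Lambert's cosine law); a minimal sketch, with illustrative names:

```python
import math

def lambertian_intensity(normal, light_dir, albedo=1.0):
    """Lambert's cosine law: I = albedo * max(0, n_hat . l_hat),
    where n_hat and l_hat are the unit normal and light vectors."""
    def unit(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    n, l = unit(normal), unit(light_dir)
    return albedo * max(0.0, sum(a * b for a, b in zip(n, l)))
```

A surface element facing the light is fully lit, one edge-on or facing away is dark, which is exactly the relation the shading-to-normal computations of FIGS. 3 and 4 invert.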
It will be appreciated that a system in accordance with the invention can be
constructed in
whole or in part from special purpose hardware or a general purpose computer
system, or any
combination thereof, any portion of which may be controlled by a suitable
program. Any program
may in whole or in part comprise part of or be stored on the system in a
conventional manner, or it
may in whole or in part be provided to the system over a network or other
mechanism for
transferring information in a conventional manner. In addition, it will be
appreciated that the system
may be operated and/or otherwise controlled by means of information provided
by an operator using
operator input elements (not shown) which may be connected directly to the
system or which may
transfer the information to the system over a network or other mechanism for
transferring
information in a conventional manner.
The foregoing description has been limited to a specific embodiment of this
invention. It will
be apparent, however, that various variations and modifications may be made to
the invention, with
the attainment of some or all of the advantages of the invention. It is the
object of the appended
claims to cover these and such other variations and modifications as come
within the true spirit and
scope of the invention.
What is claimed as new and desired to be secured by Letters Patent is:

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01:As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refer to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Inactive: IPC from PCS 2022-09-10
Time Limit for Reversal Expired 2013-02-20
Letter Sent 2012-02-20
Inactive: IPC expired 2011-01-01
Inactive: IPC expired 2011-01-01
Grant by Issuance 2009-12-29
Inactive: Cover page published 2009-12-28
Pre-grant 2009-10-14
Inactive: Final fee received 2009-10-14
Notice of Allowance is Issued 2009-09-01
Letter Sent 2009-09-01
Notice of Allowance is Issued 2009-09-01
Inactive: Approved for allowance (AFA) 2009-08-06
Letter Sent 2008-07-15
Amendment Received - Voluntary Amendment 2008-05-26
Reinstatement Requirements Deemed Compliant for All Abandonment Reasons 2008-05-26
Reinstatement Requirements Deemed Compliant for All Abandonment Reasons 2008-05-26
Reinstatement Request Received 2008-05-26
Inactive: Abandoned - No reply to s.30(2) Rules requisition 2007-07-11
Inactive: Abandoned - No reply to s.29 Rules requisition 2007-07-11
Inactive: S.29 Rules - Examiner requisition 2007-01-11
Inactive: S.30(2) Rules - Examiner requisition 2007-01-11
Inactive: Office letter 2006-12-15
Inactive: Multiple transfers 2006-11-23
Letter Sent 2006-08-31
Reinstatement Requirements Deemed Compliant for All Abandonment Reasons 2006-08-21
Amendment Received - Voluntary Amendment 2006-08-21
Reinstatement Request Received 2006-08-21
Inactive: Abandoned - No reply to s.30(2) Rules requisition 2005-09-08
Inactive: S.30(2) Rules - Examiner requisition 2005-03-08
Letter Sent 2003-02-17
Request for Examination Received 2002-12-16
Request for Examination Requirements Determined Compliant 2002-12-16
All Requirements for Examination Determined Compliant 2002-12-16
Letter Sent 2000-06-12
Inactive: Single transfer 2000-05-18
Inactive: Cover page published 1999-10-28
Inactive: IPC assigned 1999-10-25
Inactive: First IPC assigned 1999-10-25
Inactive: Courtesy letter - Evidence 1999-10-05
Inactive: Notice - National entry - No RFE 1999-10-01
Application Received - PCT 1999-09-30
Application Published (Open to Public Inspection) 1998-08-27

Abandonment History

Abandonment Date Reason Reinstatement Date
2008-05-26
2006-08-21

Maintenance Fee

The last payment was received on 2009-01-06

Note : If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
MENTAL IMAGES GMBH
Past Owners on Record
ROLF HERKEN
TOM-MICHAEL THAMM
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents



Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Representative drawing 1999-10-28 1 5
Description 1999-08-20 21 1,242
Claims 1999-08-20 8 357
Abstract 1999-08-20 1 66
Drawings 1999-08-20 8 114
Cover Page 1999-10-28 2 96
Description 2006-08-22 23 1,304
Claims 2006-08-22 10 350
Drawings 2006-08-22 8 113
Description 2008-05-26 24 1,324
Claims 2008-05-26 11 405
Representative drawing 2009-08-05 1 7
Cover Page 2009-12-02 2 63
Notice of National Entry 1999-10-01 1 208
Reminder of maintenance fee due 1999-10-21 1 111
Courtesy - Certificate of registration (related document(s)) 2000-06-12 1 115
Reminder - Request for Examination 2002-10-22 1 115
Acknowledgement of Request for Examination 2003-02-17 1 174
Courtesy - Abandonment Letter (R30(2)) 2005-11-17 1 167
Notice of Reinstatement 2006-08-31 1 171
Courtesy - Abandonment Letter (R30(2)) 2007-10-03 1 167
Courtesy - Abandonment Letter (R29) 2007-10-03 1 167
Notice of Reinstatement 2008-07-15 1 172
Commissioner's Notice - Application Found Allowable 2009-09-01 1 163
Maintenance Fee Notice 2012-04-02 1 172
Correspondence 1999-10-01 1 16
PCT 1999-08-20 19 698
Fees 2003-02-19 1 37
Fees 2006-01-18 1 35
Correspondence 2006-12-15 1 10
Correspondence 2009-10-14 1 37