Patent 2307352 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2307352
(54) English Title: SYSTEM AND METHOD FOR DISPLAYING A THREE-DIMENSIONAL OBJECT USING MOTION VECTORS TO GENERATE OBJECT BLUR
(54) French Title: SYSTEME ET METHODE D'AFFICHAGE D'UN OBJET TRIDIMENSIONNEL A L'AIDE DE VECTEURS DE MOUVEMENT POUR PRODUIRE UN FLOU
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06T 13/20 (2011.01)
  • G06T 7/20 (2017.01)
  • G09G 5/38 (2006.01)
  • G06T 15/70 (2006.01)
  • G06F 17/50 (2006.01)
(72) Inventors :
  • CHELSTOWSKI, ILIESE CLAIRE (United States of America)
  • JOHNS, CHARLES R. (United States of America)
  • MINOR, BARRY L. (United States of America)
  • WHITE, GEORGE L., JR. (United States of America)
(73) Owners :
  • INTERNATIONAL BUSINESS MACHINES CORPORATION (United States of America)
(71) Applicants :
  • INTERNATIONAL BUSINESS MACHINES CORPORATION (United States of America)
(74) Agent:
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2000-05-01
(41) Open to Public Inspection: 2000-12-30
Examination requested: 2003-08-26
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
09/343,443 United States of America 1999-06-30

Abstracts

English Abstract




A system, method, and computer-usable medium for simulating
object blur uses motion vectors. A motion vector, or array of
motion vectors, may be specified on either a per-vertex or
per-primitive (i.e. per-object) basis. A motion vector is opposite
to the direction of the motion, and thus points in the direction of
the blur. The magnitude of the motion vector represents the
distance the vertex or the primitive (i.e. each vertex in the
primitive) travels in one unit of time. When a scene is rendered,
only those objects which are in motion, or which are subject to
depth of field blurring, are rendered over a series of time slices.
All objects which are static (i.e. nonmoving) are rendered directly
into a color buffer, rather than being repeatedly rendered over a
series of time slices. Thus, static (i.e. non-blurred) objects are
rendered only once, while objects which are to be blurred are
rendered over a series of time slices.


Claims

Note: Claims are shown in the official language in which they were submitted.



CLAIMS
The embodiments of the invention in which an exclusive property or
privilege is claimed are defined as follows:
1. A method for displaying a three-dimensional graphics scene,
including a plurality of objects, comprising the steps of:
categorizing the plurality of objects into a first set of
objects and a second set of objects, wherein the first set of
objects contains one or more objects to be blurred, and wherein the
second set of objects contains one or more objects not to be
blurred;
rendering each object in the second set of objects directly
into a first buffer;
dividing a render time period into a plurality of time slices;
for each time slice, performing the following steps:
rendering each object in the first set of objects into
the first buffer; and
accumulating the three-dimensional graphics scene from
the first buffer into an accumulation buffer; and
displaying the three-dimensional graphics scene.
2. A method according to claim 1, wherein said categorizing step
comprises the steps of:
determining if a motion vector is associated with a selected
object;
if a motion vector is associated with the selected object,
assigning the selected object to the first set of objects; and
if a motion vector is not associated with the selected object,
assigning the selected object to the second set of objects.
3. A method according to claim 2, wherein said determining step
comprises the step of determining if a motion vector is associated
with a vertex of the selected object.
4. A method according to claim 2, wherein the motion vector is
opposite to a direction of motion of the selected object.
5. A method according to claim 2, wherein a length of the motion
vector is proportional to a speed of motion of the selected object.
6. A method according to claim 2, wherein the motion vector is
opposite to a direction of blur of the selected object.
7. A method according to claim 2, wherein a length of the motion
vector is proportional to a speed of blur of the selected object.
8. A method according to claim 1, wherein said displaying further
comprises the steps of:
copying the accumulation buffer to a frame buffer; and
displaying the frame buffer on a display device.
9. An information handling system, comprising:
a display means;
a plurality of objects to be displayed as a three-dimensional
graphics scene on said display means;
means for categorizing the plurality of objects into a first
set of objects and a second set of objects, wherein the first set
of objects contains one or more objects to be blurred, and wherein
the second set of objects contains one or more objects not to be
blurred;
means for rendering each object in the second set of objects
directly into a first buffer;
means for dividing a render time period into a plurality of
time slices;
means for rendering each object in the first set of objects
into the first buffer during each time slice;
means for accumulating the three-dimensional graphics scene
from the first buffer into an accumulation buffer during each time
slice; and
means for displaying the three-dimensional graphics scene on
said display device.
10. An information handling system according to claim 9, wherein
said means for categorizing comprises:
means for determining if a motion vector is associated with a
selected object;
means for assigning the selected object to the first set of
objects if a motion vector is associated with the selected object;
and
means for assigning the selected object to the second set of
objects if a motion vector is not associated with the selected
object.
11. An information handling system according to claim 10, wherein
said means for determining comprises means for determining if a
motion vector is associated with a vertex of the selected object.
12. An information handling system according to claim 10, wherein
the motion vector is opposite to a direction of motion of the
selected object.
13. An information handling system according to claim 10, wherein
a length of the motion vector is proportional to a speed of motion
of the selected object.
14. An information handling system according to claim 10, wherein
the motion vector is opposite to a direction of blur of the
selected object.
15. An information handling system according to claim 9, wherein
said means for displaying further comprises:
means for copying the accumulation buffer to a frame buffer;
and
means for displaying the frame buffer on said display means.
16. A graphics system, comprising:
a display means;
a plurality of objects to be displayed as a three-dimensional
graphics scene on said display means;
means for categorizing the plurality of objects into a first
set of objects and a second set of objects, wherein the first set
of objects contains one or more objects to be blurred, and wherein
the second set of objects contains one or more objects not to be
blurred;
means for rendering each object in the second set of objects
directly into a first buffer;
means for dividing a render time period into a plurality of
time slices;
means for rendering each object in the first set of objects
into the first buffer during each time slice;
means for accumulating the three-dimensional graphics scene
from the first buffer into an accumulation buffer during each time
slice; and
means for displaying the three-dimensional graphics scene on
said display device.
17. A graphics system according to claim 16, wherein said means
for categorizing comprises:
means for determining if a motion vector is associated with a
selected object;
means for assigning the selected object to the first set of
objects if a motion vector is associated with the selected object;
and
means for assigning the selected object to the second set of
objects if a motion vector is not associated with the selected
object.
18. A graphics system according to claim 17, wherein said means
for determining comprises means for determining if a motion vector
is associated with a vertex of the selected object.
19. A graphics system according to claim 17, wherein the motion
vector is opposite to a direction of motion of the selected object.
20. A graphics system according to claim 17, wherein a length of
the motion vector is proportional to a speed of motion of the
selected object.
21. A graphics system according to claim 17, wherein the motion
vector is opposite to a direction of blur of the selected object.
22. A graphics system according to claim 16, wherein said means
for displaying further comprises:
means for copying the accumulation buffer to a frame buffer;
and
means for displaying the frame buffer on said display means.
23. A computer program product on a computer usable medium, the
computer usable medium having computer usable program means
embodied therein for displaying a three-dimensional graphics scene
on a display device, the computer usable program means comprising:
means for categorizing a plurality of objects into a first set
of objects and a second set of objects, wherein the first set of
objects contains one or more objects to be blurred, and wherein the
second set of objects contains one or more objects not to be
blurred;
means for rendering each object in the second set of objects
directly into a first buffer;
means for dividing a render time period into a plurality of
time slices;
means for rendering each object in the first set of objects
into the first buffer during each time slice;
means for accumulating the three-dimensional graphics scene
from the first buffer into an accumulation buffer during each time
slice; and
means for displaying the three-dimensional graphics scene on
the display device.
24. A computer program product according to claim 23, wherein the
computer usable program means further comprises:
means for determining if a motion vector is associated with a
selected object;
means for assigning the selected object to the first set of
objects if a motion vector is associated with the selected object;
and
means for assigning the selected object to the second set of
objects if a motion vector is not associated with the selected
object.
25. A computer program product according to claim 24, wherein said
means for determining comprises means for determining if a motion
vector is associated with a vertex of the selected object.
26. A computer program product according to claim 24, wherein the
motion vector is opposite to a direction of motion of the selected
object.
27. A computer program product according to claim 24, wherein a
length of the motion vector is proportional to a speed of motion of
the selected object.
28. A computer program product according to claim 24, wherein the
motion vector is opposite to a direction of blur of the selected
object.
29. A computer program product according to claim 23, wherein said
means for displaying further comprises:
means for copying the accumulation buffer to a frame buffer;
and
means for displaying the frame buffer on the display device.

Description

Note: Descriptions are shown in the official language in which they were submitted.



SYSTEM AND METHOD FOR DISPLAYING A THREE-DIMENSIONAL OBJECT
USING MOTION VECTORS TO GENERATE OBJECT BLUR
FIELD OF THE INVENTION
The invention relates to the field of information handling
systems, and, more particularly, to a system and method for
displaying a three-dimensional object. Still more particularly,
the invention relates to using motion vectors to generate a
blurring effect for an object.
BACKGROUND OF THE INVENTION
Three-dimensional (3D) graphics systems are used for a variety
of applications, including computer-assisted drafting,
architectural design, simulation trainers for aircraft and other
vehicles, molecular modeling, virtual reality applications, and
video games. Three-dimensional systems are often implemented on
workstations and personal computers, which may or may not include
3D graphics hardware. In systems which include 3D graphics
hardware, a graphics accelerator card typically facilitates the
creation and display of the graphics imagery.
A software application program generates a 3D graphics scene,
and provides the scene, along with lighting attributes, to an
application programming interface (API). Current APIs include
OpenGL, PHIGS, and Direct3D. A 3D graphics scene consists of a
number of polygons which are delimited by sets of vertices. The
vertices are combined to form larger primitives, such as triangles
or other polygons. The triangles (or polygons) are combined to
form surfaces, and the surfaces are combined to form an object.
Each vertex is associated with a set of attributes, typically
including: 1) material color, which describes the color of the
object to which the vertex belongs; 2) a normal vector, which
describes the direction to which the surface is facing at the
vertex; and 3) a position, including three Cartesian coordinates x,
y, and z. Each vertex may optionally be associated with texture
coordinates and/or an alpha (i.e. transparency) value. In
addition, the scene typically has a set of attributes, including:
1) an ambient color, which typically describes the amount of
ambient light; and 2) one or more individual light sources. Each
light source has a number of properties associated with it,
including a direction, an ambient color, a diffuse color, and a
specular color.
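
For concreteness, these per-vertex and per-scene attributes might be laid out as in the following hypothetical C sketch; the struct and field names are illustrative only and do not come from the patent:

/* Hypothetical layout for the per-vertex and per-scene attributes
 * described above; names are illustrative only. */
typedef struct {
    float material_color[4];   /* color of the object the vertex belongs to */
    float normal[3];           /* direction the surface faces at the vertex */
    float position[3];         /* Cartesian coordinates x, y, z             */
    float tex_coords[2];       /* optional texture coordinates              */
    float alpha;               /* optional transparency value               */
} Vertex;

typedef struct {
    float direction[3];        /* direction of the light source             */
    float ambient[4];          /* ambient color                             */
    float diffuse[4];          /* diffuse color                             */
    float specular[4];         /* specular color                            */
} LightSource;

typedef struct {
    float ambient_color[4];    /* amount of ambient light in the scene      */
    LightSource *lights;       /* one or more individual light sources      */
    int num_lights;
} SceneAttributes;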
Rendering is employed within the graphics system to create
two-dimensional image projections of the 3D graphics scene for
display on a monitor or other display device. Typically, rendering
includes processing geometric primitives (e.g., points, lines, and
polygons) by performing one or more of the following operations as
needed: transformation, clipping, culling, lighting, fog
calculation, and texture coordinate generation. Rendering further
includes processing the primitives to determine component pixel
values for the display device, a process often referred to
specifically as rasterization.
In some 3D applications, for example, computer animation and
simulation programs, objects within the 3D graphics scene may be in
motion. In these cases, it is desirable to simulate motion blur
for the objects that are in motion. Without motion blur, objects
in motion may appear to move jerkily across the screen.
Similar techniques are also commonly used to blur objects when
simulating depth of field. Objects which are within the "field of
view" are left un-blurred, while objects which are closer or
farther away are blurred according to their distance from the
camera (i.e. viewer).
A prior art method for simulating object blur includes the use
of an accumulation buffer. The accumulation buffer is a
non-displayed buffer that is used to accumulate a series of images
as they are rendered. An entire scene (i.e. each object, or
primitive, in the scene) is repeatedly rendered into the
accumulation buffer over a series of time slices. The entire scene
is thus accumulated in the accumulation buffer, and then copied to
a frame buffer for viewing on a display device.
A prior art method for using an accumulation buffer to
simulate object blur is illustrated in Figure 1. As shown in
Figure 1, a time period is divided into "n" time slices (step 100).
The time period is the amount of time during which a scene is
visible on a display device, and is analogous to the exposure
interval, or shutter speed, of a video camera shutter. A longer
shutter speed corresponds to a greater amount of blurring, whereas
a shorter shutter speed corresponds to a lesser amount of blurring.
A time-slice count is set to one (step 102). Next, an object (i.e.
primitive) is selected for rendering (step 104). The location,
color, and all other per-vertex values are calculated for each
vertex in the object for this particular time slice (step 106).
The object is then rendered into a color buffer (step 108). A
check is made to determine if the object rendered is the last
object in the scene (step 110). If not, the process loops back to
step 104, and is repeated for each object in the scene.
If the last object in the scene has been rendered (i.e. the
answer to the question in step 110 is "yes"), the scene is
accumulated (step 112), meaning it is scaled (for example, by 1/n)
and copied into the accumulation buffer. The time-slice count is
checked to see if it is equal to n (step 114). If not, the time
slice count is incremented (step 116). The process then loops back
to step 104, and is repeated for each time slice. If the
time-slice count is equal to n (i.e. the answer to the question in
step 114 is "yes"), then the accumulation buffer is scaled and
copied to the frame buffer (step 120) and is displayed on a display
screen (step 122).
The use of an accumulation buffer as described in Figure 1 is
a computationally expensive process, as the entire scene (i.e. each
object in the scene) is rendered "n" times for each time period.
Consequently, it would be desirable to have a system and method for
more efficiently simulating object blur in a three-dimensional
graphics environment.
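
For illustration only, the Figure 1 loop might be sketched in C as follows, assuming a legacy OpenGL accumulation buffer; the Scene and Object types and the update_vertices_for_slice, render_object, and swap_buffers helpers are hypothetical, while glClear and glAccum are the standard calls:

#include <GL/gl.h>

typedef struct Object Object;                 /* details not shown            */
typedef struct { Object *objects; int num_objects; } Scene;

/* hypothetical helpers, not part of the patent or of OpenGL */
void update_vertices_for_slice(Object *obj, int slice, int n);
void render_object(Object *obj);
void swap_buffers(void);

/* Prior-art motion blur (Figure 1): every object in the scene is
 * re-rendered for each of the n time slices, then accumulated. */
void render_scene_prior_art(Scene *scene, int n)
{
    glClear(GL_ACCUM_BUFFER_BIT);
    for (int slice = 1; slice <= n; slice++) {                       /* steps 102, 114, 116 */
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        for (int i = 0; i < scene->num_objects; i++) {               /* steps 104, 110 */
            update_vertices_for_slice(&scene->objects[i], slice, n); /* step 106 */
            render_object(&scene->objects[i]);                       /* step 108 */
        }
        glAccum(GL_ACCUM, 1.0f / n);     /* step 112: scale by 1/n and accumulate */
    }
    glAccum(GL_RETURN, 1.0f);            /* step 120: copy back for display       */
    swap_buffers();                      /* step 122: display the frame           */
}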
SUMMARY OF THE INVENTION
Accordingly, the present invention is directed to a system,
method, and computer-usable medium for simulating object blur using
motion vectors. A motion vector, or array of motion vectors, may
be specified on either a per-vertex or per-primitive (i.e.
per-object) basis. A motion vector is opposite to the direction of
the motion, and thus points in the direction of the blur. The
magnitude of the motion vector represents the distance the vertex
or the primitive (i.e. each vertex in the primitive) travels in one
unit of time.
When a scene is rendered, only those objects which are in
motion, or which are subject to depth of field blurring, are
rendered over a series of time slices. All objects which are
static (i.e. non-blurred) are rendered directly into a color
buffer, rather than being repeatedly rendered over a series of time
slices. Thus, static (i.e. non-blurred) objects are rendered only
once, while objects which are to be blurred are rendered over a
series of time slices. This increases the efficiency of the
rendering process while simulating object blur of the objects which
are in motion and/or subject to depth of field blurring.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other features and advantages of the present
invention will become more apparent from the detailed description
of the best mode for carrying out the invention as rendered below.
In the description to follow, reference will be made to the
accompanying drawings, where like reference numerals are used to
identify like parts in the various views and in which:
Figure 1 is a flow chart illustrating a prior art method for
simulating object blur;
Figure 2 is a representative system in which the present
invention may be implemented;
Figure 3 depicts a moving object, including a motion vector,
within a static scene;
Figure 4 depicts a moving object, including an array of motion
vectors, within a static scene; and
Figures 5A and 5B are flow charts illustrating a method for using
motion vectors to simulate object blur in accordance with the
present invention.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT OF THE INVENTION
A representative system in which the present invention may be
implemented is illustrated in Figure 2. Information handling
system 200 includes one or more processors 202 coupled to a
processor or host bus 204. Cache memory 206 may also be coupled to
host bus 204. A bridge/memory controller 208 provides a path
between host bus 204 and system memory 210, as well as a path
between host bus 204 and peripheral bus 212. Note that system
memory 210 may include both read only memory (ROM) and random
access memory (RAM). Accumulation buffer 214 is included within
system memory 210. Alternately, accumulation buffer 214 can be
included in graphics adapter 218. In one embodiment, peripheral
bus 212 is a PCI bus or other bus suitable for supporting high
performance graphics applications and hardware. Graphics adapter
218 is coupled to peripheral bus 212, and may include local memory
portion 220, and frame buffer 221. System 200 may or may not
include graphics adapter 218, and if graphics adapter 218 is not
present in system 200, then frame buffer 221 may be included in
system memory 210 or in video controller 216. Video controller 216
is coupled to display device 222, and is configured to refresh
display device 222 with a graphics image stored in frame buffer
221. Note that graphics adapter 218 may be suitably integrated in
a single device with video controller 216.
The present invention is a system, method, and computer-usable
medium for simulating object blur using motion vectors. A motion
vector, or array of motion vectors, may be specified on either a
per-vertex or per-primitive (i.e. per-object) basis. A motion
vector is opposite to the direction of the motion, and thus points
in the direction of the blur. The magnitude of the motion vector
represents the distance the vertex or the primitive (i.e. each
vertex in the primitive) travels in one unit of time.
When a scene is rendered, only those objects which are in
motion are rendered over a series of time slices. All objects
which are static (i.e. nonmoving) are rendered directly into a
color buffer, rather than being repeatedly rendered over a series
of time slices. In the prior art, every object in a scene (whether
static or in motion) is rendered over a series of time slices, and
then accumulated, as discussed above in the Background Of The
Invention section herein. The present invention renders static
objects only once, and only performs rendering over a series of
time slices for those objects which are in motion. This increases
the efficiency of the rendering process while simulating object
blur of the objects which are in motion. The present invention may
also be used to simulate object blur associated with depth of field
blurring.
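
For illustration, the split between static and blurred objects might be sketched as follows; the C types and the list_append helper are hypothetical, as the patent defines the categorization only in terms of whether a motion vector is associated with an object (steps 502 through 506, described below with reference to Figures 5A and 5B):

typedef struct {
    int has_motion_vector;          /* motion vector(s) assigned to the object or its vertices */
    int needs_depth_of_field_blur;  /* object lies outside the in-focus field of view          */
    /* geometry, color, and other attributes omitted */
} Object;

typedef struct { Object **items; int count; } ObjectList;

void list_append(ObjectList *list, Object *obj);   /* hypothetical helper */

/* Objects with a motion vector, or needing depth-of-field blur, go to
 * the blurred set; everything else goes to the static set. */
void categorize_objects(Object *objects, int num_objects,
                        ObjectList *blurred, ObjectList *statics)
{
    for (int i = 0; i < num_objects; i++) {
        if (objects[i].has_motion_vector || objects[i].needs_depth_of_field_blur)
            list_append(blurred, &objects[i]);     /* rendered once per time slice */
        else
            list_append(statics, &objects[i]);     /* rendered only once           */
    }
}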
Referring to Figure 3, an example of a linearly moving object
within a static scene will now be described. While many objects
within a typical scene may be in motion, for illustrative purposes
Figure 3 depicts a single object 300 in motion within a static
scene 302. Note that motion vector 304 is opposite to the
direction of motion 306. The magnitude of motion vector 304
represents the distance over which object 300 moves in a predefined
period of time. In the example shown in Figure 3, a single motion
vector 304 has been specified for the object. Thus, motion vector
304 applies to each vertex of object 300. During the predefined
time period, the vertices of object 300 have moved from points a1,
b1, and c1 to points a2, b2, and c2 respectively. The magnitude of
motion vector 304 is thus equal to a2-a1. The magnitude is also
equal to b2-b1, and equal to c2-c1.
Of course, each vertex of object 300 does not have to be
moving at the same velocity. It is possible to assign a different
motion vector to each vertex of an object. Each vertex may be
moving in a different direction and/or at a different rate of
speed.
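
In terms of the Figure 3 example, the motion vector and the position of a vertex at an intermediate time slice might be computed as in the following hypothetical C sketch, which assumes simple linear motion; the exact per-slice calculation is not prescribed by the patent. Here a1 and a2 stand for the vertex position at the start and end of the time period:

/* The motion vector points opposite to the motion (from a2 back toward
 * a1), and its length is the distance traveled in one unit of time. */
void motion_vector_from_positions(const float a1[3], const float a2[3], float mv[3])
{
    for (int k = 0; k < 3; k++)
        mv[k] = a1[k] - a2[k];
}

/* Vertex position for time slice "slice" out of n, walking back along
 * the motion vector from the final position a2 (slice = n gives a2,
 * slice = 0 gives a1). Linear interpolation is only one plausible way
 * to use the vector. */
void position_for_slice(const float a2[3], const float mv[3], int slice, int n,
                        float out[3])
{
    float back = (float)(n - slice) / (float)n;
    for (int k = 0; k < 3; k++)
        out[k] = a2[k] + mv[k] * back;
}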
Referring to Figure 4, an example of a non-linearly moving
object within a static scene will now be described. As in Figure
3, for illustrative purposes only, a single moving object 400 is
depicted within a static scene 402. There are several motion
vectors 404, 406, 408, and 410 associated with object 400. Each
motion vector has a magnitude equal to a portion of the distance
traveled by object 400 during a predefined time period. In the
example shown, each motion vector 404, 406, 408, and 410 applies to
every vertex of object 400. During the predefined time period, the
vertices of object 400 move from points a1, b1, and c1 to points
a2, b2, and c2 respectively. Each vertex moves uniformly with the
other vertices; however, object 400 (and its vertices) does not move
linearly. Thus, each motion vector 404, 406, 408, and 410
has a magnitude equal to the distance traveled during a
portion of the predefined time period. As in Figure 3, each motion
vector is opposite to the direction of motion 412. Motion vectors
404, 406, 408, and 410 are referred to as an array of motion
vectors. An array of motion vectors may be assigned to an object,
as shown in Figure 4, in which case the array of motion vectors
applies to every vertex in the object. Alternately, an array of
motion vectors may be assigned to a vertex.
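
One possible, purely illustrative layout for such an array of motion vectors is sketched below; the patent does not fix any particular representation:

/* One motion vector per portion of the render time period; each vector
 * points opposite to the motion over its portion of the path, and its
 * length is the distance traveled during that portion. An array like
 * this can be attached to a whole object or to a single vertex. */
typedef struct {
    float v[3];
} MotionVector;

typedef struct {
    MotionVector *segments;   /* e.g. four entries for vectors 404, 406, 408, 410 of Figure 4 */
    int num_segments;
} MotionVectorArray;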
An Application Programming Interface (API) is preferably
provided in order to allow an application program to specify motion
vectors for objects and vertices. An exemplary OpenGL API is
depicted below. Note that this API is shown for illustrative
purposes only, and is not meant to be limiting. Those skilled in
the art will appreciate that motion vectors may be specified using
a variety of programming techniques. Further, the use of OpenGL as
an API is not meant to be limiting. The present invention may be
implemented using various APIs, including, but not limited to PHIGS
and Direct3D.
An exemplary OpenGL API is as follows:
Overview
This extension allows object blur to occur via a point, line, or
edge along a specified motion vector. The motion vector is
opposite to the direction of motion, thus it points in the
direction of blur. The magnitude of the vector is the distance
each vertex has traveled in one unit of time.
The "glMotionVector*" routines allow the application to specify
motion vectors or arrays of motion vectors on a per-vertex basis or
a per-primitive basis. The "glMotionEnv*" routines allow the
application to specify the duration of motion, the degree of fade
over time, and whether motion blur (i.e. object blur) is enabled or
disabled.
Procedures And Functions
1. void glMotionVector[bsifd]IBM(T xcomponent, ycomponent,
zcomponent)
Purpose: Specify a motion vector for an object
Variables: Three [b]ytes, [s]hortwords, [i]ntegers,
[f]loating point numbers, or [d]ouble precision floats
specifying a 3D vector
2. void glMotionVectorv[bsifd]IBM(T components)
Purpose: Specify a motion vector for a vertex
Variables: An array specifying a 3D vector
3. void glMotionVectorPointerIBM(int size, enum type, sizei
stride, void *pointer)
Purpose: Specify an array of motion vectors for an object or
vertex
Variables: The size, type, and stride of a list of motion
vectors pointed to by the pointer variable
4. void glMotionEnv[if]IBM(GLenum pname, GLfloat param)
Purpose: Specifies the duration and degree of blur
Variables: If pname is equal to GL_MOTION_ENV_FADE_IBM, then
param specifies the degree to which the object is faded over
time. If pname is equal to GL_MOTION_ENV_DELTA_TIME_IBM, then
param specifies the number of units of time to blur.
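
For illustrative purposes, an application might use the extension as sketched below. The entry points and pname names follow the listing above; the prototypes, the enum values, and the numeric arguments are assumptions, since the extension is described here only in prose:

#include <GL/gl.h>

/* prototypes for the extension entry points listed above (hypothetical) */
void glMotionVectorfIBM(GLfloat x, GLfloat y, GLfloat z);
void glMotionEnvfIBM(GLenum pname, GLfloat param);

/* enum values would come from an extension header; placeholders only */
#define GL_MOTION_ENV_FADE_IBM        0x8000   /* hypothetical value */
#define GL_MOTION_ENV_DELTA_TIME_IBM  0x8001   /* hypothetical value */

void draw_blurred_triangle(void)
{
    /* blur over one unit of time, fading the trailing image by half */
    glMotionEnvfIBM(GL_MOTION_ENV_DELTA_TIME_IBM, 1.0f);
    glMotionEnvfIBM(GL_MOTION_ENV_FADE_IBM, 0.5f);

    /* one motion vector for the whole primitive: opposite to the motion,
     * with length equal to the distance moved in one unit of time */
    glMotionVectorfIBM(-0.2f, 0.0f, 0.0f);

    glBegin(GL_TRIANGLES);
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(1.0f, 0.0f, 0.0f);
    glVertex3f(0.5f, 1.0f, 0.0f);
    glEnd();
}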
Referring to Figures 5A and 5B, a flow chart illustrating a
method for using motion vectors to simulate object blur in
accordance with the present invention will now be described. Note
that the steps described in Figures 5A and 5B can be performed
either in software or in hardware (e.g., by a graphics
accelerator), or by a combination of software and hardware. An
object within a scene is defined for rendering (step 500), meaning
that the location, color, motion vector(s), and other attributes
are defined for the object. The environment of the scene, along
with any motion vectors associated with the object, are used to
determine whether the object is static or in motion (step 502).
Note that the determination in step 502 could further include a
determination as to whether or not the object needs to be blurred
due to its depth of field. If the object is not static (i.e. the
answer to the question in step 504 is "no"), then the object is
identified as an "in-motion" object (step 506). If the object is
static (i.e. the answer to the question in step 504 is "yes"), then
the object is rendered into a color buffer. The color buffer can
be any displayed or non-displayed buffer area. For example, the
color buffer may be a portion of system memory 210 or local memory
220 on graphics adapter 218, as described above with reference to
Figure 2.
Referring back to Figures 5A and 5B, a check is made to
determine if the object is the last object in the scene (step 510).
If not (i.e. the answer to the question in step 510 is "no"), then
another object is defined for rendering in step 500. If the object
is the last object in the scene (i.e. the answer to the question in
step 510 is "yes"), then the in-motion objects (i.e. the objects
that require blur) are processed. One skilled in the art will
realize that if there are no "in-motion" objects in the scene, the
color buffer may be copied to the frame buffer at this point, and
the scene may be displayed. For illustrative purposes, the process
depicted in Figures 5A and 5B assumes a combination of static and
"in-motion" objects in the scene.
A predetermined render time period is divided into "n" time
slices (step 512). As discussed above with reference to Figure 1,
the render time period is the amount of time during which a scene
is visible on a display device, and is analogous to the exposure
interval, or shutter speed, of a video camera shutter. A longer
shutter speed corresponds to a greater amount of blurring, whereas
a shorter shutter speed corresponds to a lesser amount of blurring.
A time-slice count is set to one (step 514). Next, an "in-motion"
object (i.e. an object identified as an "in-motion" object in step
506) is selected for rendering (step 516). The motion vector or
vectors associated with the object are used, along with other
motion variables, to calculate and/or modify the location, color,
and all other attributes for each vertex in the object (step 518).
The object is then rendered into a color buffer (step 520). A
check is made to determine if the object rendered is the last
"in-motion" object in the scene (step 522). If not, the process
loops back to step 516, and is repeated for each "in-motion" object
in the scene.
If the last "in-motion" object in the scene has been rendered
(i.e. the answer to the question in step 522 is "yes"), the scene
is accumulated (step 524), meaning it is scaled (for example, by
1/n) and copied from the color buffer into the accumulation buffer.
The time-slice count is checked to see if it is equal to n (step
526). If not, the time slice count is incremented (step 528), and
the process then loops back to step 516, and is repeated for each
time slice. If the time-slice count is equal to n (i.e. the answer
to the question in step 526 is "yes"), then the accumulation buffer
is scaled and copied to the frame buffer (step 532) and is
displayed on a display screen (step 534).
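
Taken together, steps 512 through 534, preceded by the one-time rendering of the static objects, might be sketched in C as follows. The Object and ObjectList types and the apply_motion_for_slice, render_object, and swap_buffers helpers are hypothetical, and how the color buffer is managed between time slices is left open here, as it is in the text:

#include <GL/gl.h>

typedef struct Object Object;                         /* details not shown */
typedef struct { Object **items; int count; } ObjectList;

/* hypothetical helpers, not part of the patent or of OpenGL */
void apply_motion_for_slice(Object *obj, int slice, int n);   /* step 518         */
void render_object(Object *obj);                              /* steps 508, 520   */
void swap_buffers(void);                                      /* step 534         */

/* Render a scene already split into static and in-motion objects
 * (steps 500-510). Static objects are rendered once; only the
 * in-motion (blurred) objects are re-rendered for each time slice. */
void render_scene_with_motion_blur(ObjectList *statics, ObjectList *moving, int n)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_ACCUM_BUFFER_BIT);

    for (int i = 0; i < statics->count; i++)          /* static objects: rendered once */
        render_object(statics->items[i]);

    for (int slice = 1; slice <= n; slice++) {        /* steps 512, 514, 526, 528 */
        for (int i = 0; i < moving->count; i++) {
            apply_motion_for_slice(moving->items[i], slice, n);   /* step 518 */
            render_object(moving->items[i]);                      /* step 520 */
        }
        glAccum(GL_ACCUM, 1.0f / n);                  /* step 524: scale by 1/n, accumulate */
    }

    glAccum(GL_RETURN, 1.0f);                         /* step 532: copy to the frame buffer */
    swap_buffers();                                   /* step 534: display                  */
}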
Note that it is possible to define two entry points into the
method described in Figures 5A and 5B, or alternately, it is
possible to have two separate routines to execute the method
described in Figures 5A and 5B. For example, the determination as
to whether an object is in motion (i.e. needs to be blurred) or not
can be made by an application program. If the application program
determines that an object is static, the application program can
call a routine which executes only steps 500 through 510 to render
the static object. If the application program determines that an
object needs to be blurred, the application program can call a
routine which executes only steps 512 through 534 to render the
in-motion object.
Although the invention has been described with a certain
degree of particularity, it should be recognized that elements
thereof may be altered by persons skilled in the art without
departing from the spirit and scope of the invention. One of the
implementations of the invention is as sets of instructions
resident in the random access memory of one or more computer
systems configured generally as described in Figure 2. Until
required by the computer system, the set of instructions may be
stored in another computer readable memory, for example in a hard
disk drive, or in a removable memory such as an optical disk for
eventual use in a CD-ROM drive, or a floppy disk for eventual use
in a floppy disk drive. Further, the set of instructions can be
stored in the memory of another computer and transmitted over a
local area network or a wide area network, such as the Internet,
when desired by the user. One skilled in the art will appreciate
that the physical storage of the sets of instructions physically
changes the medium upon which it is stored electrically,
magnetically, or chemically so that the medium carries computer
usable information. The invention is limited only by the following
claims and their equivalents.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Administrative Status

Title Date
Forecasted Issue Date Unavailable
(22) Filed 2000-05-01
(41) Open to Public Inspection 2000-12-30
Examination Requested 2003-08-26
Dead Application 2006-05-01

Abandonment History

Abandonment Date Reason Reinstatement Date
2005-05-02 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Registration of a document - section 124 $100.00 2000-05-01
Application Fee $300.00 2000-05-01
Maintenance Fee - Application - New Act 2 2002-05-01 $100.00 2001-12-19
Maintenance Fee - Application - New Act 3 2003-05-01 $100.00 2003-01-03
Request for Examination $400.00 2003-08-26
Maintenance Fee - Application - New Act 4 2004-05-03 $100.00 2003-12-22
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
INTERNATIONAL BUSINESS MACHINES CORPORATION
Past Owners on Record
CHELSTOWSKI, ILIESE CLAIRE
JOHNS, CHARLES R.
MINOR, BARRY L.
WHITE, GEORGE L., JR.
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.



Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Representative Drawing 2000-12-21 1 5
Drawings 2000-05-01 5 109
Abstract 2000-05-01 1 32
Claims 2000-05-01 8 259
Description 2000-05-01 13 597
Cover Page 2000-12-21 1 41
Assignment 2000-05-01 7 271
Prosecution-Amendment 2003-08-26 1 35