Patent 2144914 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies between the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract is posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2144914
(54) English Title: COMPUTER GRAPHICS TEXTURE PAGING SYSTEM WITH FRAGMENTARY MIP MAP SELECTION
(54) French Title: SYSTEME DE PAGINATION A SELECTION FRAGMENTAIRE POUR L'INFOGRAPHIE
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06T 11/40 (2006.01)
  • G06T 15/04 (2011.01)
  • G06T 15/20 (2011.01)
  • G06T 15/20 (2006.01)
(72) Inventors :
  • FITZGERALD, RAYMOND L. (United States of America)
(73) Owners :
  • EVANS & SUTHERLAND COMPUTER CORPORATION (United States of America)
(71) Applicants :
(74) Agent: OYEN WIGGS GREEN & MUTALA LLP
(74) Associate agent:
(45) Issued:
(22) Filed Date: 1995-03-17
(41) Open to Public Inspection: 1995-10-02
Examination requested: 1995-05-16
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
08/221,832 United States of America 1994-04-01

Abstracts

English Abstract




Dynamic computer graphics images are enhanced by
mapping two-dimensional texture onto objects with texture
definition appropriate to the range from an eye point to the
object. With texture map levels of varying definition
stored, each carrying a different texture resolution (degree
of definition), select sets of levels are paged for mapping
operations based on the range to an object. In an operation
with two select sets, one set (the top) includes the five
least detailed or least defined levels while the other set,
the whole, constitutes all levels. The choice between the
two select sets for texture mapping an object is determined
by the range, and a specific ratio of pixel view frustum
values provides an effective test criterion.


Claims

Note: Claims are shown in the official language in which they were submitted.


WHAT IS CLAIMED IS:

1. A computer graphics system for producing dynamic
images with textured features as viewed from a moving eye
point comprising:
data storage for graphics image data including texture
map data defined in a plurality of levels, as in relation to
definition;
an image generator to provide image display signals
representative of said dynamic images processed from said
graphics image data and including texture mapping structure
with an active memory, processing said texture map data to
texture features as represented by said display signals;
a data paging structure for selectively paging select
sets of said plurality of texture map levels into said
active memory for texturing features; and
a display unit coupled to receive said display signals
from said image generator to produce dynamic images.

2. A system according to claim 1 wherein said data
paging structure includes means for indicating the distance
from said eye point to a surface for texture mapping and
means for selecting a select set of said texture map levels
in accordance with said distance.

3. A system according to claim 1 wherein said image
generator processes said graphics image data, mapping said
texture map data as texture elements.

4. A system according to claim 3 wherein said means
for indicating distance includes means for determining size
relationships between texture elements of a predetermined
map level and screen pixels to be processed by said image
generator.

5. A system according to claim 1 wherein said data
paging structure selectively pages one of two select sets of
texture map levels into said active memory determined by the
distance from said eye point to a picture element in
process.

6. A system according to claim 5 wherein said two
select sets of texture map levels consist of the entire map
and select top levels.

7. A system according to claim 6 wherein said top
levels consists of the five least detailed levels.

8. A system according to claim 1 for texture mapping
image features and wherein said data paging structure
selectively pages a single select set of texture map levels
into said active memory for processing each feature.

9. A system according to claim 1 wherein said image
generator interpolates between map levels to process said
texture map data.

10. A system according to claim 1 wherein said image
generator processes data and said data paging structure
compares a ratio of pixel size and eye point to screen
distance with a ratio of texel size to the range to the
texel.

11. A system according to claim 10 wherein values of
said ratios are squared.

12. A computer graphics process for producing dynamic
images with textured features comprising the steps of:
storing graphics image data including texture map data
defined in a plurality of levels, as in relation to
definition;
paging select sets of levels from said texture map data
for processing said graphics image data;
processing said graphics image data to provide image
display signals representative of said dynamic images
processed from said graphics image data and including
mapping said texture map data to texture features of said
images with said select set of levels; and
displaying images in accordance with said image display
signals.

13. A process according to claim 12 wherein said
select sets of levels are paged based on the distance to a
feature as depicted.

14. A process according to claim 13 wherein graphics
image data is processed by pixels and wherein said select
sets are paged in accordance with a ratio of pixel size and
eye point to screen distance with a ratio of texel size to
the range to the texel.





Description

Note: Descriptions are shown in the official language in which they were submitted.


COMPUTER GRAPHICS TEXTURE PAGING SYSTEM
WITH FRAGMENTARY MIP MAP SELECTION

FIELD OF THE INVENTION
The present invention relates to a system of memory
organization and an operation for selective data paging to
generate textured dynamic images on a display device.

BACKGROUND OF THE INVENTION
In recent years, significant advances have occurred in
the field of computer graphics. For example, in the
simulator area, real time dynamic pictures can be displayed,
for example revealing a terrain as it would appear from a
moving aircraft, complete with buildings and various other
features. Typically, such systems utilize a display device,
as in the form of a cathode ray tube (CRT) to provide
dynamic images to visually simulate actual flight
experiences.
Various forms of dynamic displays have been
accomplished utilizing graphics data definitive of objects
and surface textures. However, a common weakness of such
systems has involved the texture capability, both in terms
of the general image quality and the time it takes to
"tweak" the texture so as to make it "behave" properly.
Accordingly, a need exists for systems with greater realism
via phototexture, better behaved texture and texture that
does not require many hours to tune and adjust.
Effective improvements in computer graphics texturing
systems have involved the use of so-called "MIP" maps,
carrying different texture resolutions for the same area.
Essentially, several textures are computed as levels
reflecting the distance from which the texture is to be
viewed. As the distance increases, the texture detail
becomes fuzzy; less sharply defined. Although traditional
MIP map techniques are effective for texturing objects in a
dynamic display, a considerable difficulty arises in storing
and manipulating the volume of data required for advanced
systems.
To consider a specific example, feature textures might
be mapped on the side of a building to indicate a particular
surface structure, e.g., brick. Typically, with the
presence of the building in the scene, texture is paged from
storage for texture mapping the building. By utilizing a
MIP map pyramid (levels of filtered texture data) the
building can be variously textured with regard to definition
as the range changes. That is, fuzziness decreases as the
eye point approaches the building. Of course, the scene may
include many textured features, as buildings. Accordingly,
the volume of MIP map data is considerable, imposing rather
extreme demands on the active or working memory of the
system. Accordingly, a need exists for an improved system
to simplify and enhance operations utilizing MIP map
techniques for texture mapping dynamic displays.

SUMMARY OF THE INVENTION
The system involves selectively paging MIP map levels
into active memory to create dynamic images with respect to
a current eye point. The system is based on the recognition
that under certain circumstances, features of an image can
be textured effectively using less than all the levels in an
entire MIP texture map pyramid. That is, considerable
saving of active memory is afforded with little compromise
to the displayed image by selectively breaking the MIP
texture map pyramid into fragmentary pageable units for
selective use.
Essentially, recognizing that the higher resolution
levels of a MIP map pyramid are applicable only when the eye
point is near the texture map (since these levels will alias
at range), selectively paging levels of the MIP pyramid as
they are needed, has been discovered to be an efficient
online data reduction technique. In accordance herewith,
portions (levels of texture elements or texels) of the MIP
pyramid are selectively paged into active memory based on
the distance from an object (to be textured) to the eye
point.
In one embodiment of the present system, the entire MIP
texture map pyramid may be paged into active memory.
Alternatively, only several of the least detailed levels are
paged. These select levels, e.g., the five lowest levels of
detail, are referred to as the "top". The present
development is based on the recognition that in many
displays, a very considerable portion of the texture in a
scene can be accounted for by using only the top.
In accordance herewith, it has been determined that the
portion of the texture in a scene that can be accounted for
using only the top may exceed 90%. In a three-dimensional
(3D) system, the area over which the top is sufficient may
be even greater. In one embodiment, for a given eye point
(assuming a uniform distribution) only one-half of one
percent of all texture maps applicable required the entire
MIP pyramid structure. For the balance, over ninety-nine
percent, the tops of the maps would suffice.
As described in detail below, and in accordance with
one operating mode, selective paging of selective sets of
levels is determined based on the relative size of a
specific level texture element (texel) in a map level, and
the perspective picture element (pixel) size. In one
embodiment, the size of a level four texel (highest level of
detail in the top) is compared against the perspective size
(using the range to the feature and the field of view) of
two pixels of the display. If the level four texel size is
too large, then the entire map is paged into active memory,
otherwise, the top is sufficient. Various formats of
selection and pyramid dissection will be apparent from the
detailed description below.
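To make the memory saving concrete, the following sketch (added here for illustration and not part of the original specification) tallies texel counts for a pyramid whose base is the 512 x 512 exemplary format of Table 1 below, assuming each level has half the edge length of the level beneath it and taking the top as the five least detailed levels.

```python
# Illustrative sketch: texel counts for a MIP pyramid with a 512 x 512 base
# versus its five-level "top" (levels 0 through 4).

def level_sizes(base=512):
    """Edge lengths from the least detailed level (1 x 1) down to the base."""
    sizes, edge = [], 1
    while edge <= base:
        sizes.append(edge)
        edge *= 2
    return sizes                        # [1, 2, 4, ..., 512]

sizes = level_sizes(512)
whole = sum(s * s for s in sizes)       # every level: 349,525 texels per color
top = sum(s * s for s in sizes[:5])     # levels 0-4 only: 341 texels per color

print(f"whole pyramid: {whole} texels, top: {top} texels "
      f"({100 * top / whole:.2f}% of the whole)")
```

On these assumptions the top holds roughly a tenth of one percent of the texels in the whole pyramid, which is the economy that paging only the top exploits whenever the range permits.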


BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings, which constitute a part of the
specification, an exemplary embodiment exhibiting various
objectives and features hereof is set forth, specifically:
FIGURE 1 is a graphic representation illustrating a
view frustum radiating from an eye point with respect to
screen space and world space as treated herein;
FIGURE 2 is a plan view of a component pixel frustum
illustrating a representative relation to changing depth;
FIGURE 3 is a graphic representation illustrating
content for a pixel window with respect to a textured
polygon at varying depths;
FIGURE 4 is a graphic representation of a memory
organization for storing MIP maps;
FIGURE 5 is a graphic representation of a MIP map
pyramid including several levels of filtered texture data;
FIGURE 6 is a graphic representation of an area
illustrating the fragments over which entire MIP maps and
MIP tops are utilized;
FIGURE 7 is a diagrammatic perspective view of texels
and a pair of MIP maps illustrating interpolation operation;
FIGURE 8 is a block diagram of a computer graphics system in
accordance with the present invention;
FIGURE 9 is a detailed block diagram of a component of
the system of FIGURE 8; and
FIGURE 10 is a graphic representation illustrating the
operation of the system of FIGURE 9.

DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENT
A detailed illustrative embodiment of the present
invention is disclosed herein; however, recognizing that a
wide variety of specific embodiments of the disclosed system
are possible, it is merely representative. Nevertheless,
the illustrative embodiment is deemed to afford the best
embodiment for purposes of disclosure and to provide a basis
for the claims herein which define the scope of the present
invention.

Initially, consideration of some graphic representations
will be helpful in understanding the present development.
The first considerations concern accomplishing computer
graphics displays with textured surfaces.
The process of applying texture patterns to surfaces is
generally referred to as "texture mapping" and is treated at
length in the book Principles of Interactive Computer
Graphics, 2nd edition, Newman & Sproull, McGraw-Hill Book
Company, 1979. Non-uniform texture mapping is well-known in
the art as treated in an article entitled "Survey of Texture
Mapping" by Paul S. Heckbert, published in IEEE Computer
Graphics and Applications, November 1986, pp. 56-67. MIP
maps and their use in computer graphics for texture mapping
are treated in a paper entitled "Pyramidal Parametrics" by
Lance Williams, published July 1983 in Computer Graphics,
vol. 17, no. 3. The article has been identified by the
Association for Computing Machinery as ACM 0-89791-109-
1/83/007/0001.
Texture mapping essentially involves locking textures
to defined objects or polygons to accomplish textured
surfaces in a display. The mapping of texture or other
images onto surfaces is more effective if the texture is
rendered progressively more fuzzy as the polygon moves away
from the viewer. Such operation is in accordance with the
perspective nature of observation by the human eye. For
example, the squares of a checkerboard are vividly clear to
the normal eye when viewed at a distance of ten feet.
However, if the checkerboard is moved away from the eye,
boundaries between individual squares of the board
progressively become more fuzzy, less definition. At some
point, perhaps a few hundred feet, the individual black and
white squares of the checkerboard simply fade to a uniform
gray, totally void of definition. Effective texture mapping
reflects these changes as they would appear to an observer.
As treated below, MIP map texturing involves a MIP
texture map pyramid composed of multiple versions of the
same source motif, e.g., bricks or any other pattern, each
version having a progressively coarser resolution.
Accordingly, depending on the distance from the eye point to
the object or feature being textured, an appropriately
coarse map level is selected for use. Actually, in
practice, two map levels are selected from which values are
interpolated.
Referring now to FIGURE 1, a textured polygon 24
(representing part of an object) is illustrated in world
space. Note that the various space designations as used in
the field of computer graphics are treated in the referenced
text, Principles of Interactive Computer Graphics. In
summary, world space or object space (three-dimensional)
serves to define objects prior to any geometric
transformations. In eye space, objects are transformed so
that the eye or view point is the origin for coordinates and
view rays are along the Z-axis. Screen space involves
further transformations to account for the perspective
foreshortening of the view pyramid and with clipping
performed. As a function of computer graphics processing,
objects in screen space are mapped to an eye point,
typically on a pixel grid. A discussion of world space and
the related transforms to accomplish displays appears in
Chapter 8 of a book, Fundamentals of Interactive Computer
Graphics by Foley and Van Dam, published in 1984 by Addison-
Wesley Publishing Company.
To represent the polygon 24 in a display related to an
eye point O, areas of the polygon are defined in screen
space at a screen 28. In accordance with convention, the
screen 28 comprises the base of a pyramidal view frustum 30
with an apex 32 at the eye point O. In a sense, the viewing
screen 28 (base of the frustum 30) may be analogized to the
screen of a television set through which world-space objects
(including the polygon 24) are viewed.
In accordance with traditional practice, the space of
the screen 28 is dissected into small picture elements
(pixels). Specifically, for example, an array of one
million pixels may be organized as one thousand rows, each
of one thousand pixels. A representative pixel 34
(idealized and grossly enlarged) is illustrated at a central
location of the screen 28. Note that a ray 36 extending
from the eye point O, passes through the center of the pixel
34 to a point 38 on the polygon 24. The ray 36 exemplifies
perhaps a million of such rays that dissect the scene or
image of primitives (as the polygon 24) into pixels. For a
display, each pixel is processed to accomplish
representative signals in a storage, e.g., a frame buffer,
which is scan converted, for example, into a raster pattern
for driving a display device, e.g., cathode ray tube (CRT).
As illustrated in FIGURE 1, the polygon 24 is to bear a
texture 35 in the form of a checkerboard. In accordance
herewith, the texture 35 is mapped onto the polygon 24
utilizing select levels of a MIP map texture pyramid
depending essentially on the range from the eye point O to
the polygon 24 and the orientation of the polygon
(perspective size). Considerable economy of memory as well
as transfer operations result from the selectivity.
For purposes of explanation, consider that the pixel 34
represents a substantial area with respect to the polygon
24. Actually, the pixel will represent a single color;
however, treating an enlarged area will be helpful to the
explanation. FIGURE 2 shows the pixel 34 in a sectioned
plan view and illustrates the polygon as it might appear at
different ranges in a pixel frustum 39. That is, the
polygon is shown at various relative depths, i.e., indicated
as polygon areas 24a, 24b, and 24c, each progressively more
remote from the pixel 34 in screen space. As the polygon 24
moves away from the pixel 34 (arrow 37) more of the texture
35 (FIGURE 1) is visible and it becomes fuzzy in the
picture. The phenomenon is illustrated in FIGURE 3 and will
now be considered.
FIGURE 3 shows the texture areas of the polygon 24
(FIGURE 1) contained by the pixel 34 as the polygon moves
away from the pixel 34 (and the eye point O) in the Z
dimension as indicated by the arrow 37 (FIGURE 2). As the
polygon area 24a (FIGURES 2 and 3) is positioned near the
screen 28 (contiguous to the pixel 34) the pixel 34 is
occupied primarily by a light area La and only slightly
occupied by a dark area Da. As represented, an area ratio
of approximately four to one is illustrated.
With progressive depth displacement of the polygon 24,
as illustrated by the polygon area 24b, the pixel 34
embraces greater detail of the texture 35. Of course, the
change simply results from the fact that the polygon 24 is
deeper in the pixel frustum 39 (FIGURE 2). Accordingly, the
frustum has a larger base at the polygon 24 to encompass a
greater area of the texture 35. Further displacement of the
polygon 24 from the screen 28 is illustrated by the polygon
area 24c and results in a still greater area of the texture
35 being located within the pixel 34.
For each of the depicted situations, a different level
of definition or detail is appropriate for displaying a
quality image. That is, the level of detail of an object in
the picture should become fuzzy as the object is moved
further away from the eye point. Relating the phenomenon to
FIGURE 3, the border between the area La and area Da for the
polygon area 24a would be substantially sharper than the
borders between the areas Lc and Dc for the polygon area 24c.
Accordingly, different computed levels of a MIP map pyramid
are used for texturing a feature or object. That is, rather
than repeatedly calculating the averages for each pixel, a
MIP map is addressed by texture coordinates U and V to
provide weighted averages that have been computed for the
contribution from a textured polygon to a pixel. To further
illustrate, consider a form of memory organization for
accomplishing such operation, as shown in FIGURE 4.
Areas are defined in FIGURE 4 of progressively reduced
size, to indicate the resolution levels of a MIP map pyramid
that are selectively paged in accordance herewith.
Typically, an image is provided in its color components red,
green and blue. For a pixel in a textured polygon,
precomputed texel averages are addressed for each color
component by the coordinates u and v (FIGURE 4). Each of
the color components are provided in look-up tables of
varying degrees of specificity to be identified and
interpolated. Levels of detail are related to distance from
the eye point as indicated by a line 41 and perspective
size. At the most detailed level, the computed texel
averages of blue (B), for example, are stored in a section
40 while the values for the green component are stored in a
section 42 and the values for the red component are stored
in a section 44.
In accordance with the memory organization, the fourth
quadrant or section 46 is arranged to progressively include
sets of three smaller sections, defining the color level.
The reducing pattern continues in a similar quadrant-by-
quadrant division until ultimately sections S1 are provided.
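The quadrant-by-quadrant packing can be expressed compactly. The sketch below is illustrative only: the choice of which corners hold which color blocks, and which corner recurses, is an assumption made here for concreteness, while FIGURE 4 fixes the actual arrangement; depth 0 denotes the most detailed level (the outer sections 40, 42 and 44), so a depth of d corresponds to pyramid level n-1-d in the numbering of FIGURE 5.

```python
# Illustrative sketch (assumed corner conventions): origins of the blue,
# green and red blocks for a given recursion depth in a packed MIP map of
# overall size (2 * base) x (2 * base).  Depth 0 is the most detailed level;
# each further depth steps into the recursively subdivided fourth quadrant
# (cf. section 46 in FIGURE 4).

def color_blocks(depth, base=512):
    """Return the (x, y) origins of the B, G and R blocks and their edge
    length at the given depth."""
    origin, edge = 0, base
    for _ in range(depth):              # walk into the recursive quadrant
        origin += edge
        edge //= 2
    return {"B": (origin, origin),            # cf. section 40
            "G": (origin + edge, origin),     # cf. section 42
            "R": (origin, origin + edge)}, edge

blocks, edge = color_blocks(depth=2)
print(blocks, edge)     # three 128 x 128 blocks, two quadrants in
```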
While the representation of FIGURE 4 illustrates a
memory organization for the different levels, FIGURE 5 is a
side elevation of a MIP texture map pyramid illustrating the
diminishing MIP map levels in stacked relationship.
For simplicity, consider the pyramid for a single
color. As illustrated, the MIP pyramid 52 comprises n
levels of texture data extending from a tip (least defined
detail level 0) downward to a base at level n-1. The
highest levels (lowest level of detail) are designated as a
top 54 (five levels) while the whole is designated 55. At
the base, the level n-1 is the highest level of detail
followed by the level n-2.
To consider an exemplary format, based on a 512 x 512
texel array as the maximum size for the base, the following
table indicates sizes.

TABLE 1.
Level     Texel Array
0         1x1
1         2x2
2         4x4
3         8x8
4         16x16
5         32x32
6         64x64
7         128x128



As illustrated in FIGURE 5, select sets of map levels 0
through n-1 are paged as alternatives, either the top 54 or
the entire map 52. In various arrangements, the MIP pyramid
may be broken into any number of select sets of levels;
however, in accordance with one operating embodiment, a
break into two pieces or sets has been found to be
effective.
With the MIP map texture pyramid stored for selectively
addressing or paging, in accordance herewith, selection of a
select set of levels, either the whole 55 or top 54, is
based on range and field of view.
For one embodiment based on two sets of levels (top 54
and whole 55), FIGURE 6 illustrates the areas of
selectivity. That is, a series of concentric rings R0, R1,
R2, R3 and R4 define annular areas with respect to texturing
operations. In that regard, only the shaded area within the
ring R4 requires the entire or whole 55 of the map pyramid
to be used in texturing operations. Conversely, excluding
the shaded area within the ring R4, in the areas within each
of the larger concentric rings (R3, R2, R1 and R0), shading
was successfully accomplished using only the top 54 as
illustrated in FIGURE 5. A profound economy is thus
illustrated.
In the course of texture mapping, a point 38 (FIGURE 1)
of interest may indicate a texture map level that lies
intermediate two adjacent map levels. To illustrate,
referring to FIGURE 7, a point 60 of interest lies between
two different levels, e.g., levels L3 and L4 as represented
in FIGURE 7 by a pair of single texels 62 and 64.
Accordingly, neither of the texels 62 or 64 is appropriate
with respect to the point 60. In such an event, an
interpolation is performed involving the four surrounding
coordinate corners of the texels 62 and 64 at the two map
levels L3 and L4. Specifically, the points 66, 67, 68 and
69 of the texel 62 are interpolated in combination with the
points 76, 77, 78 and 79 of the texel 64. Interpolation
(usually but not necessarily linear), as well known in the
art, is a calculation of a texture value from the eight
values of the surrounding points and accordingly a value
(intensity and color) is determined for texturing the pixel
identified by the impact point 60.
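A generic sketch of this two-level interpolation follows; it is consistent with the eight-point scheme described above, but the function names, the use of linear weights and the example values are illustrative rather than taken from the specification.

```python
# Illustrative sketch: interpolate a texture value from the four corners of
# a texel at each of two adjacent map levels (e.g. texels 62 and 64 at
# levels L3 and L4 in FIGURE 7), eight values in all.

def lerp(a, b, t):
    return a + (b - a) * t

def bilinear(corners, fu, fv):
    """corners = (c00, c10, c01, c11) of one texel; fu, fv are fractional
    texture coordinates within that texel."""
    c00, c10, c01, c11 = corners
    return lerp(lerp(c00, c10, fu), lerp(c01, c11, fu), fv)

def sample_between_levels(corners_a, corners_b, fu_a, fv_a, fu_b, fv_b, f_lod):
    """Blend the bilinear results of two adjacent map levels by the
    fractional level of detail f_lod in [0, 1]."""
    return lerp(bilinear(corners_a, fu_a, fv_a),
                bilinear(corners_b, fu_b, fv_b), f_lod)

# Example: a point 30% of the way from one level toward the adjacent level.
value = sample_between_levels((0.2, 0.4, 0.3, 0.5), (0.25, 0.35, 0.3, 0.4),
                              0.5, 0.5, 0.5, 0.5, 0.3)
```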
Recapitulating to some extent with respect to the
graphics representations as explained above, the texturing
operation basically involves mapping a texture pattern or
image onto the surface of a primitive, polygon or object,
utilizing traditional techniques. For example, the
operation may involve applying a brick texture to the
exterior wall of a building as a part of a viewed terrain
dynamically displayed with respect to a moving eye point.
In accordance herewith, sets of levels from MIP maps are
used to texture objects with various levels of detail
depending upon the range. For an object or feature near the
eye point, the detail must be clear and sharp, that is, very
high definition. For a remote object, the level of detail
reduces and the texture becomes somewhat fuzzy.
In the disclosed embodiment, the MIP data may be
considered in the form of a pyramid comprising n levels of
filtered texture data as illustrated in FIGURE 5. As
explained above, the system of the present development is
based on the recognition that select sets can be used with
little compromise to image quality. In that regard, higher
resolutions of a MIP map pyramid are needed in a select set
only when the eye point is near the object. Furthermore,
for typical dynamic image creation, much of the display is
remote. Accordingly, selectively paging select sets of
levels of the MIP map pyramid, as they become necessary, is
an efficient online data reduction mechanism. Note that one
paged select set may be the entire map.
In accordance herewith, depending on the distance from
the eye point to an object, a select group or select set of
MIP levels are paged into active memory to accomplish the
texturing operations as reflected in pixel calculations. As
a result, individual pixel data signals are stored in a
display system, typically including a frame buffer and
display unit.
In the operation of a contemporary image generator,
graphic image data is utilized including texture map data
defined in levels of detail. Accordingly, the image
generator processes the graphic image data to provide pixel
display signals. In accordance herewith, a data paging
structure selectively pages select sets (including the full
set) of levels from the texture map pyramid into the image
generator. Accordingly, features are textured efficiently
and economically.
As indicated above, the selective data transfer is
determined by the texel size in relation to the perspective
pixel size, using the range to the feature and the field of
view. Essentially, the consideration involves the texel
size, determined by the distance from the eye point O
(FIGURE 1) to the polygon 24 in world space in relation to
the size of the pixel 34 in screen space. As indicated, in
one operating embodiment, the size of a level 4 texel is
compared against the size of two pixels in the display. If
the level 4 texel size is too large, the entire map is paged
into the image generator, otherwise, only the top is paged.
Clearly, the levels of MIP texture maps can be variously
fragmented in other embodiments and other criteria relating
to range can be employed for selective paging.
In view of the above explanations of operating steps
within the system, reference now will be made to FIGURE 8
showing an operating embodiment implementing the
development. A real-time system computer 142 (FIGURE 8,
left) functions as a system controller, as in a conventional
system. For example, the computer 142 may take the form of
a Motorola Model MVME 147S-1, available from that company,
which is located in Phoenix, Arizona.
The real-time system computer 142 is served by a
control input unit 144 which may take various forms
including a manual input terminal, another computer, or
virtually any source of control input information.
Essentially, in accordance with contemporary techniques, the
input unit 144 interfaces the real-time computer 142 for
driving an object management processor 146. An
environmental memory 152 is embodied in the object
management processor 146 along with an object pager control.
Note that the environmental memory 152 stores three-
dimensional data defining objects in world space, sometimes
referred to as "geometric data".
Functionally, the object management processor 146 is
intimately associated with a display processor 148 that is
connected to a texture memory 150 (active, for two-
dimensional data). Note that basically, the combination of
the real-time computer system 142 and the object management
processor 146 along with the display processor 148 may take
the form of a Model ESIG-3000 Image Generator available from
Evans & Sutherland Computer Corporation located in Salt Lake
City, Utah. Modifications involve texture data management.
The texture memory 150 within the processor 148 and the
environmental memory 152 within the processor 146 each
receive data from a mass storage 154 controlled by the
computer 142 as indicated by a control path 156. As
suggested by the drawing, the mass storage 154 may take the
form of a disk storage designed for the transfer of address
data to both the texture memory 150 and the environmental
memory 152 as indicated by the lines 158 and 159.
Specifically, the texture memory 150 stores two-dimensional
MIP data to be mapped selectively onto surfaces of objects.
Note that from the select set of MIP map levels paged into
the texture memory 150, typically two are designated to
provide the texels (e.g., texels 62 and 64, FIGURE 7) from
which a value is computed. The operation is executed for
each pixel affected by the object (polygon 24, FIGURE 1).
Consequently, fast access is a necessary characteristic and
space in the texture memory 150 is at a premium. In accordance
herewith, by selective paging of MIP map data, substantial
savings occur in memory space and data transfer operations.
The texture memory 150 receives select levels (all
being a possibility) of MIP maps from the mass storage
system 154 under the control of the computer 142 and the
object pager control 157 in the processor 146. Essentially
as the object management processor initiates activity on a
particular object, the object pager control determines the
select set of levels (whole 55 or top 54) in the MIP map
pyramid 52 (FIGURE 5) needed for texturing the object. In
accordance with the selection, the object pager control 157
operates through the computer 142 to address and control
(page) the desired select set of MIP levels for transfer
from the mass storage system 154 to the texture memory 150.
As indicated above, in many instances, in view of the
distance from the eye point to the object, only the top 54
(FIGURE 5) of the MIP pyramid need be paged into the active
texture memory 150. From that location, the display
processor 148 texture maps the object for storage pixel-by-
pixel in a frame buffer 164 from which the display data is
scanned for display by a display unit 166.
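The per-object paging flow just described can be summarized in a short sketch. Every name below (ObjectPager, page_for_object, and so on) is a hypothetical stand-in introduced here for illustration; the object pager control 157, mass storage system 154 and texture memory 150 of FIGURE 8 are only loosely modeled, and the decision itself is deferred to the texel/pixel comparison discussed with FIGURES 9 and 10.

```python
# Illustrative sketch (hypothetical interface): the top of each encountered
# pyramid is always requested; the whole pyramid is requested only when the
# object is close enough that the level 4 texel would exceed two pixels.

TOP_LEVELS = range(0, 5)                 # the five least detailed levels (top 54)

class ObjectPager:
    def __init__(self, mass_storage, texture_memory, n_levels):
        self.storage = mass_storage      # stand-in for mass storage 154
        self.texture = texture_memory    # stand-in for texture memory 150
        self.n_levels = n_levels         # pyramid levels 0 .. n-1

    def page_for_object(self, obj, eye_point):
        if self.needs_whole_pyramid(obj, eye_point):
            levels = range(0, self.n_levels)     # whole 55
        else:
            levels = TOP_LEVELS                  # top 54
        self.texture.load(self.storage.fetch(obj.map_id, levels))

    def needs_whole_pyramid(self, obj, eye_point):
        # Placeholder for the level 4 texel versus two-pixel comparison
        # performed by the comparator 204 (FIGURE 9); see the ratio test
        # sketch that follows the FIGURE 10 discussion below.
        raise NotImplementedError
```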
Considering the operation of the system of FIGURE 8 in
somewhat greater detail, the real-time system computer 142
along with the object management processor 146 and the
display processor 148 function as a pipeline to provide
display signals to the frame buffer 164. The computer 142
implements the subject matter of displays, controlling the
mass storage system 154 to selectively load and maintain the
texture memory 150 as explained above. Additionally, the
environmental memory 152 also is loaded and maintained to
accommodate the development of a dynamic image with a moving
eye point. The object management processor 146 receives
control data, with the consequence that object or polygon
data is supplied from the object processor 146 to the
display processor 148.
The accumulation and preliminary processing of object
data to accomplish basic image data for the display
processor 148 is well-known in the art. Accordingly, the
display processor 148 receives basic data for processing
object pixels to be stored in the frame buffer 164. As
explained in detail above, the display processor 148
utilizes selective texture map data stored in the active
texture memory 150 to process individual pixels for the
frame buffer 164. It is to be understood that the texture
maps may be stored in a variety of configurations or memory
organizations for fast access; however, in accordance
herewith, select numbers of levels (select sets of all or
less than all) are paged from the mass storage system 154
into the texture memory 150. The selectivity is based on
the result of a texel/pixel comparison as will now be
considered with respect to the block diagram of FIGURE 9.
Within the object management processor 146 (FIGURE 8),
certain operations are performed; specifically, the object
pager control 157 incorporates a level 4 texel store 202
(FIGURE 9). Generally, the store 202 receives signals from
an object data store 208 that are representative of the size
of a level 4 texel. That value is supplied from the store
202 to a comparator 204, also connected to receive an
indication of pixel size from a store 206. Specifically,
the store 206 provides signal indications representative of
the perspective size of two pixels. Accordingly, in the
disclosed embodiment, a level 4 texel (from store 202) is
compared with the size of two perspective pixels (from store
206) to determine the select set of levels that will be
paged into the texture memory 150 (FIGURE 8).
Recapitulating to some extent, the object management
processor 146 (FIGURE 8) will always request the tops 54
(FIGURE 5) of the MIP pyramids 52 that are encountered
during a pager traversal. The request for the whole 55 map
pyramid depends on the proximity of the eye point O (FIGURE
1) to the polygon 24. The object management processor 146
fetches the whole texture map when the texel size at level 4
(FIGURE 5) is larger than the size of two pixels on the
screen 28 (FIGURE 1). FIGURE 10 illustrates a case in which
two pixels are the same perspective size as a level 4 texel.
A view triangle 220 extends from the eye point O through the
screen 28 to an arrow D representing the size of a level 4
texel. The dimension of a shorter arrow A indicates the
size of two pixels located a distance B from the eye point
O. The distance R reflects a measure from the eye point O
to the level 4 texel.
FIGURE 10 illustrates a ratio test that is true if the
perspective size of a texel is equal to or greater than the
perspective size of two pixels, i.e., D/R is equal to or
greater than A/B. The equation can be modified slightly to
simplify the operation of the management processor 146
(FIGURE 8). Specifically, by squaring all terms, the object
management processor can do the perspective size comparison
without the complications of calculating square roots.
Accordingly, if R²·A²/B² < D² is true, then the object
management processor 146 will command the full texture map
pyramid.
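A small sketch of the squared test follows. The quantities A, B, D and R are those of FIGURE 10; the function name and the numeric example are illustrative, and the boundary case (texel exactly the size of two pixels) is included here to match the "equal to or greater than" wording of the ratio test, a distinction that is immaterial in practice.

```python
# Illustrative sketch: the paging test evaluated without square roots.
# A: perspective size of two pixels, B: eye-to-screen distance,
# D: size of a level 4 texel, R: range from the eye point to that texel.

def needs_whole_pyramid(A, B, D, R):
    """True when the level 4 texel appears at least as large as two pixels,
    i.e. D/R >= A/B, rewritten as R^2 * A^2 / B^2 <= D^2."""
    return (R * R) * (A * A) / (B * B) <= D * D

# Example: a texel 0.5 units across at a range of 200 units, versus two
# pixels spanning 0.002 units on a screen one unit from the eye point.
print(needs_whole_pyramid(A=0.002, B=1.0, D=0.5, R=200.0))   # True: page all
```

When the test is false, only the five-level top is paged, as described above.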
The test or comparison operations as set forth above
may be executed by a structure as represented in FIGURE 9,
either in the basic form or in the squared configuration.
Accordingly, a relatively simple comparison test is
performed for the texture processing of each object by the
comparator 204 utilizing the values as developed in the
stores 202 and 206. However, a multitude of other options
and variations departing from those disclosed above are
available without departing from the spirit of the present
development. In that regard, the top may define various
numbers of levels, the comparison may be variously
implemented and a variety of interpretation techniques might
be employed. Accordingly, although certain detailed
structures and processes have been disclosed, the
appropriate scope hereof is deemed to be in accordance with
the claims as set forth below.

Administrative Status

Title                              Date
Forecasted Issue Date              Unavailable
(22) Filed                         1995-03-17
Examination Requested              1995-05-16
(41) Open to Public Inspection     1995-10-02
Dead Application                   1998-03-17

Abandonment History

Abandonment Date    Reason                                        Reinstatement Date
1997-03-17          FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type                                    Anniversary Year    Due Date    Amount Paid    Paid Date
Application Fee                                                             $0.00          1995-03-17
Registration of a document - section 124                                    $0.00          1995-08-31
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
EVANS & SUTHERLAND COMPUTER CORPORATION
Past Owners on Record
FITZGERALD, RAYMOND L.
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents



Document Description    Date (yyyy-mm-dd)    Number of pages    Size of Image (KB)
Abstract 1995-10-02 1 22
Description 1995-10-02 17 832
Representative Drawing 1998-06-16 1 22
Cover Page 1996-01-18 1 16
Claims 1995-10-02 3 102
Drawings 1995-10-02 7 109
Office Letter 1995-11-01 1 46
Prosecution Correspondence 1995-05-16 1 40
Prosecution Correspondence 1995-04-07 1 45