Patent 2466377 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 2466377
(54) English Title: METHODS AND APPARATUS FOR SYNTHESIZING A THREE-DIMENSIONAL IMAGE SIGNAL AND PRODUCING A TWO-DIMENSIONAL VISUAL DISPLAY THEREFROM
(54) French Title: METHODES ET APPAREIL POUR SYNTHETISER DES SIGNAUX D'IMAGERIE TRIDIMENSIONNELLE ET PRODUIRE DES IMAGES BIDIMENSIONNELLES
Status: Term Expired - Post Grant Beyond Limit
Bibliographic Data
(51) International Patent Classification (IPC):
  • G09G 05/39 (2006.01)
  • G09G 05/393 (2006.01)
  • G09G 05/395 (2006.01)
(72) Inventors :
  • OKA, MASAAKI (Japan)
(73) Owners :
  • SONY COMPUTER ENTERTAINMENT INC.
  • SONY CORPORATION
(71) Applicants :
  • SONY COMPUTER ENTERTAINMENT INC. (Japan)
  • SONY CORPORATION (Japan)
(74) Agent: GOWLING WLG (CANADA) LLP
(74) Associate agent:
(45) Issued: 2008-04-01
(22) Filed Date: 1994-04-11
(41) Open to Public Inspection: 1994-10-16
Examination requested: 2004-05-31
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
P05-088485 (Japan) 1993-04-15
P05-88486 (Japan) 1993-04-15
P06-18228 (Japan) 1994-02-15

Abstracts

English Abstract

A method and apparatus for formulating a picture and a household game playing apparatus are provided. Data necessary for formulating the picture are read from picture data of a three-dimensional object stored in a main memory, and coordinates of apex points of relatively small-sized polygons comprising the three-dimensional object are produced by a geometry processor for producing luminance data and color data for each apex point. Color data and coordinate data of boundary points between the apex points are produced by a raster processor. The raster processor uses the color data and coordinate data of the boundary points to produce color data and coordinate data of intermediate points between the boundary points.


French Abstract

La présente concerne une méthode et un appareil pour former une image et un appareil de jeu de société. Les données nécessaires pour former une image sont lues à partir des données d'image d'un objet en trois dimensions mémorisées dans une mémoire principale, et les coordonnées des sommet de polygones de taille relativement petite comprenant l'objet tridimensionnel sont produites par un processeur de géométrie pour produire des données de luminance et de couleur des données pour chaque sommet. Des données de couleur et de coordonnées des points limites entre les sommets sont produites par un processeur matriciel. Le processeur matriciel utilise les données de couleur et de coordonnées des points limites afin de produire des données de couleur et de coordonnées pour les points intermédiaires entre lesdits points limites.

Claims

Note: Claims are shown in the official language in which they were submitted.


What is claimed is:
1. A method of controlling a video memory having a display
area, said video memory for outputting data stored in the
display area to a display device, said method comprising the
steps of:
storing picture data in the display area at a first
position in the video memory;
setting the display area at a second position differing
from the first position in the video memory;
writing new picture data in the display area at the second
position, said new picture data formulated based at least in
part on the picture data stored in the display area at the first
position.
2. A method of controlling a video memory having a display
area, said video memory for outputting data stored in the
display area to a display device, said method comprising the
steps of:
holding a picture data in the display area at a first
position in the video memory during a selected Nth frame;
setting the display area at a second position differing
from the first position in the video memory during following
(N+1)th frame;
writing new picture data in the display area at the second
position, said new picture data formulated based at least in
part on the picture data stored in the display area at the first
position.
3. The method as recited in any one of claims 1 or 2, wherein
at least a part of the picture data stored in the display area
at the first position is stored in an area other than the
display area at the second position in the video memory as a
texture source picture data.
4. A method of formulating picture data for a picture
formulating apparatus outputting a data stored in a display area
of a video memory to a display device, comprising steps of:
storing picture data in the display area at a first
position in the video memory;
setting the display area at a second position differing
from the first position in the video memory;
formulating new picture data based at least in part on the
picture data stored in the display area at the first position.
5. A method of formulating picture data for a picture
formulating apparatus outputting data stored in a display area
of a video memory to a display device, comprising steps of:
holding picture data in the display area at a first
position in the video memory during a selected Nth frame;
setting the display area at a second position differing
from the first position in the video memory during following
(N+1)th frame;
formulating new picture data based at least in part on the
picture data stored in the display area at the first position.
6. The method as recited in any one of claims 4 or 5, wherein
at least a part of the picture data stored in the display area
at the first position is stored in an area other than the
display area at the second position in the video memory as a
texture source picture data, and the new picture data is
formulated using the texture source picture data.
7. A picture formulating apparatus formulating picture data to
be displayed on a display device, comprising:
a video memory having a display area provided for storing
the picture data; and
a processor for setting the display area at a first
position in the video memory, storing a picture data in the
display area at the first position, setting the display area at
a second position differing from the first position in the video
memory, formulating new picture data based at least in part on
the picture data stored in the display area at the first
position and writing the new picture data in the display area
set at the second position.
8. A picture formulating apparatus formulating picture data to
be displayed on a display device, comprising:
a video memory having a display area provided for storing
the picture data; and
a processor for setting the display area at a first
position in the video memory during a selected Nth frame,
holding a picture data in the display area at a first position,
setting the display area at a second position differing
from the first position in the video memory during following
(N+1)th frame, formulating new picture data based at least in
part on the picture data stored in the display area at the first
position and writing the new picture data in the display area at
the second position.
9. The picture formulating apparatus as recited in any one of
claims 7 or 8, wherein the processor further stores at least a
part of the picture data stored in the display area at the first
position in an area other than the display area at the second
position in the video memory as a texture source picture data,

and formulates the new picture data using the texture source
picture data.
10. The picture formulating apparatus as recited in any one of
claims 7, 8 or 9, further comprising a converter converting the
picture data in the display area of the video memory into a
video output signal.

Description

Note: Descriptions are shown in the official language in which they were submitted.


Methods And Apparatus For Synthesizing A Three-Dimensional Image Signal And Producing A Two-Dimensional Visual Display Therefrom
Field of the Invention
The present invention relates to methods and apparatus
for synthesizing three-dimensional picture signals and
producing two-dimensional visual displays based on such
signals.
In general, TV receivers, receiver monitors and CRT displays used in conjunction with household game playing apparatus, personal computers and graphic computing devices provide two-dimensional displays. The signals or data are produced so that two-dimensional characters and the like are suitably arrayed and moved on a planar background with or without changes in shape. However, in such display techniques the ability to change either the background or the character, as well as the movement thereof, is limited so that the ambience of the game or other display cannot be enhanced.
Background of the Invention
Methods have been adopted for formulating pseudo-three-
dimensional images. In such techniques, plural pictures of a
character to be displayed are stored, each picture depicting
the character as viewed from a different direction. At a
given time, one such picture is selected and displayed
depending on the viewpoint adopted for the image being
displayed. In another aspect of this technique, two-
dimensional pictures are superposed along an image depth
direction in order to display a pseudo-three-dimensional
picture. In generating or formulating picture data, sometimes
a texture mapping process is carried out whereby a surface
texture or pattern, such as a ground pattern, is
affixed to a selected face of a polyhedron forming an element of the image. A further technique employed involves converting picture color data by means of a so-called color pickup table in order to change the color of the displayed image.
Fig. 1 provides a block diagram of a proposed household game playing apparatus. In the apparatus of Fig. 1, a central processing unit (CPU) 91 comprising a microprocessor fetches operating information from an input device 94, such as an input pad or a joystick, via an interface 93 and a main bus 99. As the operating information is fetched by the CPU 91, three-dimensional picture data stored in a main memory 92 are transferred by a video processor 96 to a source video memory 95 for storage therein.
The CPU 91 also transmits data to the video processor 96 indicating the sequence in which pictures represented in memory 95 are to be displayed. The video processor 96 reads out picture data from the source video memory 95 in accordance with the data supplied by the CPU 91 in order to create a superposed picture display.
Simultaneously with the display of the picture, an audio processor 97 outputs voice data from an audio memory 98 which is based on voice information in the fetched operating information and which is coordinated with the picture that is being displayed. For example, if the picture being displayed
depicts a car crashing, the audio processor 97 outputs an appropriate crashing sound.
Fig. 2 illustrates a sequence for producing and outputting a three-dimensional picture by the household game playing apparatus of Fig. 1, using two-dimensional picture data. The sequence depicted in Fig. 2 serves to produce a three-dimensional picture depicting a cylindrical object on a checkerboard background pattern.
In Fig. 2, the source video memory 95 stores a background picture 200 in the form of a checkerboard pattern and a sequence of rectangular pictures 201 through 204 each representing a cross section of the cylindrical object to be overlaid on the checkerboard background pattern at varying depths. The portions of the rectangular pictures 201 through 204 other than the cross sections of the cylinder contain data representing transparency.
The picture data stored in the source video memory 95 is read out in accordance with an address supplied from a read-out table 101 of the video processor 96. The CPU 91 provides read-out address control data to the table 101 over the main bus 99 for indicating which address is to be output from the table 101. In addition, a sync generator 100 produces read-out timing signals which it supplies to the read-out address table 101 and which are matched with synchronization signals for the picture to be displayed so that the read-out addresses are supplied by the
table 101 in the appropriate order and at the appropriate times for displaying the desired picture.
The read-out picture data from the source video memory 95 are received by a superposition processor 103 which serves to superpose the picture data in the appropriate sequence in accordance with sequence data supplied from a priority table 102 under the control of signals supplied by the CPU 91 over the main bus 99. Each of the pictures 200 through 204 is assigned a sequential ranking beginning with a lowest rank for the background picture 200 and advancing to a highest ranking for the picture 204, so that the picture data as output by the superposition processor 103 represents a superposition of the pictures 200 through 204 in the appropriate order for displaying the cylindrical object on the checkerboard background.
The data output by the superposition processor 103 is supplied to a transparent color processor 104 wherein the data of each of the pictures 201 through 204 representing transparency are processed so that underlying data may be displayed. Once the data has thus been processed by the transparent color processor 104, the data is output thereby as picture data representing a three-dimensional picture VDO as shown in Fig. 2.
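The priority-ordered read-out, superposition and transparent-color substitution described above can be summarized in code. The following C sketch is illustrative only and is not taken from the patent; the picture dimensions, the pictures[] array and the TRANSPARENT key value are assumptions made for the example, with the pictures supplied in ascending priority order.

    #include <stdint.h>

    #define W 320
    #define H 240
    #define TRANSPARENT 0u   /* assumed color key marking transparent texels */

    /* Superpose source pictures onto the frame in ascending priority order,
     * skipping texels flagged as transparent so that underlying data shows
     * through (the transparent color processing described above). */
    static void superpose(uint32_t frame[H][W],
                          const uint32_t *pictures[], int count)
    {
        for (int p = 0; p < count; ++p) {            /* lowest priority first */
            const uint32_t *src = pictures[p];
            for (int y = 0; y < H; ++y)
                for (int x = 0; x < W; ++x) {
                    uint32_t texel = src[y * W + x];
                    if (texel != TRANSPARENT)        /* keep underlying data */
                        frame[y][x] = texel;
                }
        }
    }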
Fig. 3 is a block diagram of a proposed picture formulating apparatus which carries out a texture mapping function. In Fig. 3, a central processing unit (CPU) 301 is
depicted comprising a microprocessor or the like coupled with a main bus 309. A main memory 302 for storing programs and data, a video memory 303 for storing texture source picture data and a video memory 304 for storing display output data as formulated by the apparatus of Fig. 3, are coupled to the main bus 309. The CPU 301 reads out the texture source picture data from the video memory 303 with such modifications as are necessary in order to transform the texture data for mapping onto a display area of the video memory 304. The transformed texture data are written in the display area of the video memory 304 and later read therefrom and converted by a D/A converter 305 into analog signals which are output for displaying a picture.
Figs. 4A through 4C depict a sequence of processing operations carried out by the picture formulating apparatus of Fig. 3. As shown in Fig. 4A, in the video memory 303 several texture source pictures provide original data to be transformed as described above for texture mapping, such as source texture pictures 311a, 311b and 311c. The main memory 302 stores a program for controlling the CPU 301 for specifying the particular texture source picture to be used at a given time as well as a read-out position therefor, to which the CPU 301 responds by reading out the appropriate picture data from the preset locations in the video memory 303 and carries out the above-mentioned modifications thereon as designated by the program to produce the modified picture 312a as illustrated in Fig. 4B. The
modified picture 312a is then written in a display area 321 of the video memory 304. The write addresses for the display area 321 are also designated by the program stored in the main memory 302.
A sequence of such read-out and modification operations as described above is carried out until the data of a complete picture 313 has been generated and stored in the display area 321 of the video memory 304. The picture data 313 is then read out from the video memory 304 in accordance with addresses determined as described above in synchronization with the video synchronization signals and converted to analog form by the D/A converter 305 to produce analog output picture signals.
Fig. 5 provides a block diagram of a picture formulating apparatus which carries out a picture data converting function whereby color data is output from a conversion table in the form of a color lookup table in response to image data. This provides the ability to change the color of the displayed image without re-writing image data. In the apparatus of Fig. 5, the CPU 301, main memory 302, video memories 303 and 304 coupled by means of the main bus 309, as well as a D/A converter 305, are similar to the corresponding devices of the Fig. 3 apparatus and are not, therefore, further described herein. In the arrangement of Fig. 5, a conversion table memory 306 is also provided storing a conversion table such as a lookup table for converting output display picture data read out from the video memory 304. The
converted data are output from the conversion table memory 306 to the D/A converter 305 for conversion to analog form and are supplied thereby to a video or image output.
With reference also to Figs. 6A through 6D, a sequence of operations carried out by the apparatus of Fig. 5 in processing and outputting a picture is illustrated therein. The CPU 301 reads texture source data from the video memory 303 and modifies the same to store the modified data in appropriate locations of the video memory 304 to construct the picture 313 as depicted in Fig. 6A in accordance with the data conversion operations discussed above in connection with Figs. 3 and 4. That is, the picture data 313 of Fig. 6A corresponds to the like-referenced picture as depicted in Fig. 4C. However, the picture data 313 stored in the video memory 304 is provided in an intermediate or temporary form which cannot be output as is thereby to produce a picture display.
Rather, the picture data as stored in the video memory 304 is read out from its display area 321, as represented by the illustration of Fig. 6B, and used for addressing a conversion table 314 in the conversion table memory 306 to output a color corresponding to each input address which, in turn, corresponds to a pixel of a picture to be output. The conversion table 314 serves to convert the address of virtual data supplied thereto from the video memory 304 into actual picture data, in accordance with a process represented schematically by Fig. 6C. The
converted or actual picture data is output from the conversion table memory 306 to the D/A converter 305 for conversion thereby to analog picture data, as represented by Fig. 6D.
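The indexed read-out through the conversion table amounts to a per-pixel table lookup. The C sketch below is a generic color-lookup pass written for illustration, not the circuitry of Fig. 5; the 8-bit index depth and the 256-entry table size are assumptions.

    #include <stdint.h>
    #include <stddef.h>

    /* Convert "virtual" picture data (per-pixel indices held in the display
     * area) into actual color data by addressing a conversion table, the
     * result then being handed to the D/A conversion stage. */
    static void apply_conversion_table(const uint8_t *indices,
                                       const uint32_t table[256],
                                       uint32_t *out, size_t n)
    {
        for (size_t i = 0; i < n; ++i)
            out[i] = table[indices[i]];
    }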
It will be seen from the foregoing that the output picture data is not the same as that generated in the source video memory, but rather a three-dimensional picture is produced by changing read-out positions of the two-dimensional picture data stored in the source video memory and superposing the data of a number of pictures as read out. Consequently, the ability to express a three-dimensional picture generated in this manner is limited. For example, when processing a three-dimensional object picture for display by means of a household game playing apparatus on a screen, it may not be possible to correctly express the three-dimensional object picture in this way. It may occur that, since the point of view of the operator with respect to the object picture changes after the three-dimensional object picture has been processed, either the position of the object may not be changed correctly or the method of changing the three-dimensional picture may not be carried out in an appropriate manner. Likewise, the position of the viewing direction according to which the three-dimensional object is depicted is thus limited, for example, so that a back side of the object picture cannot be depicted even though the position or the direction of view of the three-dimensional object picture has been changed so that the same should be visible. It may also
happen that the screen display to the operator occurs
discontinuously even when three-dimensionally continuous movement
is depicted.
Since multiple object pictures or picture data are stored in order to depict an object or other picture from various directions in order to represent a three-dimensional picture, the amount of data which must be stored for this purpose becomes voluminous. In addition, since the stored two-dimensional picture data are presented as three-dimensional picture data, substantial processing time is required for formulating each picture and game execution speed is consequently limited adversely.
In addition, since the two-dimensional picture data stored in the source video memory must be modified in order to produce a three-dimensional picture, the picture read-out control for the source video memory is complex and difficult to implement.
A further difficulty is posed, for example, in the case of the apparatus of Fig. 3 for which it is necessary to provide a video memory 303 which is dedicated to storage of the source texture data, which thus hinders efforts to achieve size reduction in the apparatus as well as to hold down production costs therefor. In the case of the apparatus of Fig. 5, it is necessary to provide a table memory 306 dedicated to storage of the conversion table as well as a CPU bus for accessing the
table, which likewise hinders efforts to reduce the size of the apparatus as well as to minimize its production costs.
OBJECTS AND SUMMARY OF THE INVENTION
It is an object of the present invention to provide methods and apparatus for formulating picture data, as well as household game playing apparatus, which overcome or alleviate the foregoing shortcomings and limitations.
It is another object of the present invention to provide methods and apparatus for formulating three-dimensional picture data, and household game playing apparatus, wherein positions and orientations of an object as depicted thereby may be changed easily and quickly.
It is another object of the present invention to provide such methods and apparatus, as well as household game playing apparatus, which employ conversion tables and/or texture source data and whose size and costs of production are advantageously minimized.
In accordance with an aspect of the present invention, a method for formulating image signals for producing a three-dimensional picture display comprises the steps of: reading out first coordinate data for apex points of a plurality of polygonal picture areas as units of a desired three-dimensional picture stored in a first memory; producing second coordinate data representing coordinates of the apex points on a predetermined
screen based on the first coordinate data; producing color data for each of said apex points; producing color data and coordinate data of boundary points between the apex points using the color data and the coordinate data of each of said apex points on the predetermined screen; producing color data and coordinate data of intermediate points between said boundary points using the color data and the coordinate data of said boundary points; and storing the produced color data and the coordinate data of the intermediate points in a second memory to form image signals for producing a three-dimensional picture display.
In accordance with another aspect of the present invention, an apparatus for formulating image signals representing a three-dimensional picture for displaying said three-dimensional picture with the use of said image signals, comprises: a first memory for storing first coordinate data of apex points of a plurality of polygonal areas as units of a desired three-dimensional picture; a geometry processor for converting the first coordinate data of the apex points of the polygonal areas stored in the first memory to second coordinate data of a predetermined screen and for producing color data for each of said apex points; a raster processor for producing color data and coordinate data of boundary points between said apex points, based on the color data and the second coordinate data on the predetermined screen of the apex points as converted by said
geometry processor, and for producing color data and coordinate data of intermediate points between said boundary points using the color data and coordinate data of the boundary points; and a second memory for storing the color data and the coordinate data of the intermediate points as produced by said raster processor, the color data of said intermediate points as stored representing said three-dimensional picture.
In certain embodiments, the coordinate data and color data of the apex points of the polygonal areas are stored in the first memory.
In addition, in certain embodiments two-dimensional texture data is modified and mapped onto three-dimensional picture data to be displayed. In some embodiments, the coordinate data of the apex points of polygonal areas and texture coordinate data indicating coordinate positions of the two-dimensional texture data are stored in the first memory.
In certain embodiments of the invention, the end positions of horizontally extending line segments slicing a polygonal area at a predetermined distance from one of the apex points thereof are used as the boundary points, and color data of the intermediate points are produced based on such boundary points by interpolation.
As used herein, the terms "color information" and "color data" include both color vectors for use in effecting a
color display as well as luminance and gradation data for use in effecting a monochromatic display.
In accordance with a further aspect of the present invention, a household game playing apparatus is provided in which an object image for display as a three-dimensional picture is formulated, comprising: an external storage medium in which a game program and picture information employed in executing said game program are stored; operating means for processing the picture information of the game program stored in said external storage medium for producing display picture data representing movement of the object image; display means for displaying the display picture data; a first memory for storing first coordinate data of apex points of a plurality of polygonal areas in units of the object image included in said picture data; a geometry processor for converting the first coordinate data of the apex points of the polygonal areas stored in said first memory responsive to an input from said operating means for producing second coordinate data thereof on a predetermined screen and for producing color data for each of the apex points; a raster processor for producing color data and coordinate data of boundary points between said apex points, using the color data and coordinate data on said screen of said apex points as produced by said geometry processor, and for producing color data and coordinate data of intermediate points between said boundary
points; and a second memory for storing the color data and the coordinate data of the intermediate points as produced by said raster processor for producing three-dimensional picture data of the object image.
In certain embodiments, the external storage medium for the household game playing machine is either a CD-ROM or a memory card. In some embodiments, moreover, the game playing apparatus includes non-volatile storage means for storing the state of progress of the game on termination of a game program.
In certain embodiments, data for the output picture to be displayed by the display means is formulated with the use of data written in a portion of a picture memory other than a portion thereof in which the output picture data is stored. In certain embodiments, the output picture data is formulated with the use of a conversion table and/or texture source data, and the memory locations in which the output picture data are stored in the picture memory are switched.
In accordance with yet another aspect of the present invention, a method for formulating picture data is provided, comprising the steps of: storing source data for formulating output picture data in a first region of a picture memory; processing the stored source data for producing the output picture data; and storing the output picture data in a second region of the picture memory other than the first region thereof.
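A minimal C sketch of this aspect follows. It is only an illustration under assumed sizes: the picture memory is modeled as one flat array, the output picture occupies one region of it, and the source data used to formulate that picture sits in a second region of the same memory, so no separate source memory is needed. The frame-by-frame switching of which region serves as the display area mirrors the method of claims 1 and 2.

    #include <stdint.h>
    #include <stddef.h>

    #define REGION_WORDS 1024u                          /* assumed size of one region */

    static uint32_t picture_memory[2 * REGION_WORDS];   /* one memory, two regions */
    static size_t display_region = 0;                   /* region currently displayed */

    /* Formulate new output picture data in the destination region using the
     * data already stored in the source region of the same picture memory.
     * The transformation applied here is a trivial placeholder. */
    static void formulate_picture(size_t src_region, size_t dst_region)
    {
        const uint32_t *src = &picture_memory[src_region * REGION_WORDS];
        uint32_t *dst = &picture_memory[dst_region * REGION_WORDS];
        for (size_t i = 0; i < REGION_WORDS; ++i)
            dst[i] = src[i] ^ 0x00FFFFFFu;              /* placeholder processing step */
    }

    /* For the (N+1)th frame, set the display area at the other position and
     * write new picture data there, formulated from the Nth frame's data. */
    static void next_frame(void)
    {
        size_t previous = display_region;
        display_region = 1 - display_region;
        formulate_picture(previous, display_region);
    }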
In accordance with a still further aspect of the present invention, a picture formulating apparatus for formulating data of an output picture for output to a picture display device for displaying the output picture, comprises: a picture memory for storing the data of said output picture in a first storage area thereof; and a processor for storing source data for formulating the output picture data in an area of said picture memory other than the first storage area for said output picture data and for producing the output picture data based on the source data.
In accordance with a still further aspect of the present invention, a household game playing apparatus for formulating data of an output picture to be output for display by a picture display device, comprises: a picture memory for storing data of said output picture in a first storage area thereof; and a processor for storing source data for formulating the output picture data in an area of said picture memory other than the first storage area for said output picture data and for producing the output picture data based on the source data.
According to certain features of the present invention, coordinate data of the apex points of polygonal areas stored in a first memory are converted for producing color data and further coordinate data on a predetermined screen, color data and coordinate data on the screen of boundary points between the apex points are produced using the color data and the coordinate data
of the apex points on the screen, color data and coordinate data on the screen of intermediate points between the boundary points are produced using the color data and the coordinate data of the boundary points on the screen, and the color data of the intermediate points are written in a second memory for producing output data for use in displaying a three-dimensional picture.
In accordance with certain other features of the present invention, a method is employed using two-dimensional picture data, such as texture data, by slicing a polygonal area at a predetermined distance from one of the apex points thereof by means of a line segment having boundary points at either end, and producing color data of intermediate points of the line segment by interpolation and mapping the resulting color data of the intermediate points on three-dimensional picture data.
In certain embodiments of the household game playing apparatus of the present invention, a game program stored in the external storage medium (such as a CD-ROM or a memory card) is input to the apparatus, the game program including three-dimensional picture data for use in formulating a picture to be displayed.
Since color data and coordinate data of apex points of polygonal areas are stored in memory for formulating boundary point data and intermediate point data, a three-dimensional object picture may be represented correctly. Also, since two-
dimensional texture data is modified and mapped on a three-dimensional picture for display, and since texture coordinate data indicating coordinate points of the two-dimensional picture are employed in such mapping operation, the three-dimensional object may be represented with the use of such texture data correctly.
In those embodiments where a horizontally extending line segment is used to slice a polygonal area at a predetermined distance from one of the apex points thereof to define boundary points and to derive color data for intermediate points along the segment by interpolation, a three-dimensional object picture may be formulated easily.
In the case of the household game playing apparatus of the present invention, in producing the display picture data representing movement of the object image, its position and orientation can be changed correctly and accurately by the operating means, and there is no limitation imposed on the direction and orientation of the object image as represented, so that, even if the picture from the viewpoint of the operator is changed continuously three-dimensionally, a picture may be displayed which is responsive to the continuous three-dimensional changes correctly and accurately.
Moreover, since the data volume of the picture data necessary for representing the three-dimensional picture represents polygonal areas simply by their apex points, so that the game formulating time may be diminished, it becomes possible
to improve game productivity and speed of execution of a computer formulated game. In addition, in those embodiments where three-dimensional picture data is formulated horizontal line by horizontal line as the same is written in the video memory, so that the three-dimensional picture data thus written in the memory can be read during the usual scanning of video signals, the process of reading the picture data is made easy.
In those embodiments which employ an external storage medium such as a CD-ROM or memory card, a voluminous game program can be stored therein and read therefrom quickly.
In those embodiments which employ a non-volatile memory means in a game playing apparatus for storing game status data, the progress of a computer game up to the end of program execution can be recorded, so that the game can be re-started beginning from the point at which the program was terminated.
By using portions of the picture memory other than those in which output display picture data are stored for storing picture data for use in formulating such output picture data, a three-dimensional picture may be produced correctly and accurately by means of an apparatus having an advantageously small size and low cost. In those embodiments in which a conversion table for a texture source picture data or other picture data is used for producing a display output picture, a three-dimensional object picture may be represented accurately and correctly.
In those embodiments in which the addresses at which the output picture data is stored are changed, it is possible to save and employ previously generated picture data to produce a new display output picture. In this manner, the new output picture can be represented correctly and accurately.
The above and other objects, features and advantages of the invention will be apparent in the following detailed description of certain illustrative embodiments thereof which is to be read in connection with the accompanying drawings forming a part hereof, and wherein corresponding parts and components are identified by the same reference numerals in the several views of the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a block diagram of a proposed picture formulating apparatus;
Fig. 2 illustrates a sequence of operations carried out by the apparatus of Fig. 1;
Fig. 3 is a block diagram of a further picture formulating apparatus which carries out a texture data mapping function;
Figs. 4A through 4C illustrate a sequence of operations carried out by the apparatus of Fig. 3;
Fig. 5 is a block diagram of a further picture formulating apparatus which serves to convert picture data with the use of a conversion table;
Figs. 6A to 6D illustrate a sequence of operations carried out by the apparatus of Fig. 5;
Fig. 7 is a block diagram of a picture formulating apparatus in accordance with an embodiment of the present invention;
Fig. 8 is a functional block diagram for use in explaining processing according to a first embodiment of a picture formulating method according to the present invention;
Fig. 9 is a functional block diagram for use in illustrating processing in accordance with a second embodiment of a picture formulating method of the present invention;
Fig. 10 is a flow chart for illustrating processing in accordance with the first embodiment of the picture formulating method;
Fig. 11A illustrates a process of converting a normal line vector in an object coordinate system to a normal line vector in a screen coordinate system;
Fig. 11B illustrates coordinate values and drawing colors of a triangle as produced in accordance with the first embodiment of a picture formulating method;
Fig. 12 is a flow chart for use in illustrating processing in accordance with the second embodiment of the picture formulating method;
Fig. 13 illustrates texture coordinates used in carrying out the second embodiment of the picture formulating method;
Fig. 14 illustrates coordinate values and luminance values of a triangular area as produced in accordance with the second embodiment of the picture formulating method;
Figs. 15A through 15C illustrate the utilization of memory space in a video memory used in carrying out the method in accordance with the second embodiment;
Fig. 16 is a flow chart for use in illustrating processing according to the method of the second embodiment;
Figs. 17A and 17B represent utilization of memory space and the display of an output picture, respectively, in accordance with a method of the present invention employing a conversion table;
Figs. 18A and 18B illustrate the usage of video memory area in accordance with a further embodiment of the present invention;
Fig. 19 is a block diagram of a household game playing apparatus in accordance with another embodiment of the present invention;
Fig. 20 is a flow chart for use in explaining the operation of the household game playing apparatus of Fig. 19;
Fig. 21 is an exemplary display produced with the use of the household game playing apparatus of Fig. 19.
DETAILED DESCRIPTION OF CERTAIN ADVANTAGEOUS EMBODIMENTS
With reference now to the drawings, and in particular to the embodiment of Fig. 7, a central processing unit (CPU) 1, such as a microprocessor, serves to fetch operating information from an input device 4, such as an input pad or a joystick, via an interface 3 and a main bus 9. Based on the operating information fetched from the device 4, the CPU 1 transmits a three-dimensional picture stored in a main memory 2 to a graphic processor 6 over the main bus 9. The graphic processor 6 serves to convert the data from the main memory 2 into picture data. A three-dimensional picture thus generated by the graphic processor 6 is stored in a video memory 5. The picture data stored in the memory 5 is read out in accordance with scanning by video signals for displaying a three-dimensional picture on a display device, not shown for purposes of simplicity and clarity.
At the same time that the three-dimensional picture is being displayed as described above, voice data which has been fetched by the CPU 1 from the main memory 2 and which is associated with the displayed three-dimensional picture is routed to an audio processor 7. Based on the received voice information, the audio processor 7 outputs appropriate voice data stored in an audio memory 8.
A shading method for displaying a three-dimensional picture by shading an object is explained as a first embodiment
of the present invention, while a texture mapping method, also for displaying a three-dimensional picture but by modification and mapping of further two-dimensional picture data, will be explained as a second embodiment of the present invention.
Following are three-dimensional coordinate systems which will be employed in describing various features of the disclosed embodiments. An object coordinate system serves to represent the shape and size of a three-dimensional object itself. A world coordinate system represents the position of a three-dimensional object in a three-dimensional space. Finally, a screen coordinate system represents a three-dimensional object displayed on a two-dimensional screen. In order to simplify the explanation of the disclosed embodiments, the object coordinate system and the screen coordinate system are principally used. In addition, while a three-dimensional object may be represented according to the present invention by means of a plurality of polygonal areas, processing of data representing triangular areas will be described hereinbelow for purposes of simplicity and clarity.
The shading method according to the first embodiment, as mentioned above, will now be explained in connection with Fig. 8.
In the main memory 2 of the Fig. 7 apparatus, as depicted in Fig. 8, picture information, such as color vectors and coordinates of the apex points of a triangle used for formulating a picture of a three-dimensional object, are stored.
For the purpose of formulating such a picture, the picture information stored in the memory 2 is read by the CPU 1 for supply to a geometry processor 61 of the graphic processor 6 as illustrated in Fig. 8.
In general, in the geometry processor 61 the coordinates of the apex points are converted to screen coordinates with the aid of read-out picture information, namely, a coordinate transformation matrix. That is, the coordinate transformation matrix is applied to the apex point coordinates and the X and Y coordinates thus obtained are divided by the Z coordinate thus obtained. In addition, color and luminance data for the apex points as displayed on the screen are calculated by obtaining the inner product of a normal line vector (a vector normal to the triangular area represented by the apex points) and a light source vector produced by light source data supplied by the CPU 1 which reads the same from the main memory 2. The result of the inner product multiplies color data specific to the apex point as described in greater detail below, for producing the color and luminance data thereof.
The values of the color and luminance data of the apex points on the screen as thus obtained are routed to a raster processor 62 of the graphic processor 6, as depicted in Fig. 8, which produces output picture data for intermediate points of the triangle delimited by the three apex points, by an interpolation process using the coordinate values as well as the color and
luminance data of the apex points. In this manner, a picture VD1 of the three-dimensional object is obtained having pixels whose coordinate values represent position within a screen coordinate system, which is stored in a display area 51 of the video memory 5 and output therefrom under the control of the graphic processor 6.
As depicted in Fig. 8, the main memory 2 stores picture information such as coordinates of the apex points of a triangular area necessary for formulating a picture of a three-dimensional object. In addition, the main memory 2 stores texture coordinates indicating coordinate positions of texture data for mapping to another picture. In the texture mapping method in accordance with the second embodiment of the present invention, the picture information is read by the CPU 1 of Fig. 7 from the main memory 2 thereof for routing to the geometry processor 61 of the graphic processor 6.
In the geometry processor 61 the coordinates of the apex points are converted with the aid of the read-out picture information for calculating the coordinates of the apex points on the screen. Color and luminance data of the apex points on the screen are obtained by determining the inner product of the normal line vector to the triangular area and a light source vector produced from the light source data, and the inner product thus produced multiplies color data specific to the apex point. The values of the color and luminance data of the apex points
thus obtained are routed to the raster processor 62. The raster processor 62 also determines, by an interpolation process, luminance data for intermediate points in a triangular area delimited by three apex points in a texture area such that each apex point in the texture area corresponds to an apex point of the object area. The texture area is provided as an off-screen area of the video memory 5 containing texture data to be mapped on the object. In this manner, a picture VD2 as illustrated in Fig. 9 is formulated by mapping texture picture data on the various polygons of the object by accessing the luminance data from the texture picture 52 of Fig. 9 by means of appropriately produced coordinate values for matching the luminance data to pixels in the display area 51 of the video memory 5. Once the picture data has been thus formulated and stored in the display area 51, it is then output under the control of the graphic processor 6.
A sequence of operations carried out by the apparatus of Fig. 7 for producing and displaying a three-dimensional picture in accordance with the shading method of the first embodiment is illustrated in the flow chart of Fig. 10.
In a step SP1 of Fig. 10, information necessary for formulating the picture is accessed from a variety of picture information data stored in the main memory 2 of Fig. 8, such as coordinates V of the apex points of a small-size triangle represented in an object coordinate system, a coordinate
conversion matrix R for converting the apex point coordinates from the object coordinate system into a world coordinate system, a parallel translation vector T for transforming the apex point coordinates in order to represent translational movement of the object, a line-of-sight vector which serves to translate and rotate the entire three-dimensional image to present the image according to the position of the viewer and the direction in which the viewer is looking for representing the apex point coordinates according to the screen coordinate system, as well as a normal-line vector N and a color vector C. The foregoing data, as required, are read in sequence by the CPU 1 of Fig. 7 and routed to the graphic processor 6.
In a step SP2 of Fig. 10, the coordinate values of the apex points of the triangle and drawing colors therefor are produced using the data read out as described above. First, in the geometry processor 61 (Fig. 8) the three-dimensional coordinates of each apex point are converted by means of the coordinate conversion matrix R (for rotating the triangular area) and the parallel translation vector T (for translating the same) to produce corresponding three-dimensional coordinate vectors S in accordance with the following formula (1):
(XS, YS, ZS) = (XV, YV, ZV) x R + (XT, YT, ZT) ... (1)
wherein (XS, YS, ZS) represents coordinates of the apex points S0, S1 and S2 as illustrated in Fig. 11B, (XV, YV, ZV) represent the
coordinate vectors of the apex points in the object coordinate system, R is the coordinate conversion matrix for rotating the area, and (XT, YT, ZT) is the parallel translation vector.
The coordinates XS and YS are then subjected to perspective conversion in accordance with the following formulas (2) and (3):
X = XS x (h / ZS) ... (2)
Y = YS x (h / ZS) ... (3)
wherein h represents the distance from the viewpoint of the operator to the display screen and ZS represents the depth of the apex point from the screen after conversion in accordance with formula (1). The perspective conversion serves to modify the size of the object depending on the distance from the position of the operator in order to produce a scenographic or perspective sense or feeling.
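Formulas (1) through (3) amount to a rotation, a translation and a perspective divide. The C sketch below restates them directly for one apex point; the row-vector-times-matrix convention, the struct layout and the sample values in main() are assumptions made for the example, not part of the patent.

    #include <stdio.h>

    typedef struct { double x, y, z; } Vec3;

    /* Formula (1): S = V x R + T, with V a row vector and R a 3x3 rotation. */
    static Vec3 to_screen_space(Vec3 v, const double r[3][3], Vec3 t)
    {
        Vec3 s;
        s.x = v.x * r[0][0] + v.y * r[1][0] + v.z * r[2][0] + t.x;
        s.y = v.x * r[0][1] + v.y * r[1][1] + v.z * r[2][1] + t.y;
        s.z = v.x * r[0][2] + v.y * r[1][2] + v.z * r[2][2] + t.z;
        return s;
    }

    /* Formulas (2) and (3): X = XS x (h / ZS), Y = YS x (h / ZS), where h is
     * the distance from the viewpoint to the display screen. */
    static void perspective(Vec3 s, double h, double *x, double *y)
    {
        *x = s.x * (h / s.z);
        *y = s.y * (h / s.z);
    }

    int main(void)
    {
        const double r[3][3] = { {1,0,0}, {0,1,0}, {0,0,1} };  /* identity rotation */
        Vec3 v = { 1.0, 2.0, 0.0 }, t = { 0.0, 0.0, 10.0 };
        double x, y;
        perspective(to_screen_space(v, r, t), 5.0, &x, &y);
        printf("screen point: (%.2f, %.2f)\n", x, y);          /* prints (0.50, 1.00) */
        return 0;
    }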
The normal-line vector N relative to the plane of the triangle is converted, with the use of the coordinate converting matrix R, into a normal line vector P in the screen coordinate system in accordance with the following formula (4):
(XP, YP, ZP) = (XN, YN, ZN) x R ... (4)
wherein (XN, YN, ZN) is the normal-line vector N in the object coordinate system, R is the coordinate conversion matrix mentioned above and (XP, YP, ZP) is the normal-line vector P. With reference also to Fig. 11A, the normal-line vector N as expressed in the object coordinate system having axes XO, YO and
ZO is illustrated therein relative to the triangular area it represents. As also shown in Fig. 11A, the axes X, Y and Z of a screen coordinate system are overlaid for illustrating that it is necessary to rotate the normal-line vector N by means of the coordinate conversion matrix R in accordance with formula (4) in order to represent the normal-line vector in accordance with the screen coordinate system as the vector P (XP, YP, ZP).
Then the inner product of the light source vector L and the normal-line vector P which was obtained by means of the formula (4) is produced as follows according to formula (5A):
P x L = (XP, YP, ZP) x (XL, YL, ZL)
= XP x XL + YP x YL + ZP x ZL ... (5A)
A drawing color D for each apex point is then obtained by means of the formula (5B):
(RD, GD, BD) = (P x L) x (RC, GC, BC) ... (5B)
wherein (RC, GC, BC) is a color vector of the relevant apex point and (RD, GD, BD) is the drawing color D for the apex point.
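Formulas (4), (5A) and (5B) reduce to rotating the normal into the screen system, taking its inner product with the light source vector, and scaling the apex point's color vector by the result. The C sketch below mirrors that order; clamping a negative inner product to zero is an assumption added for the example and is not stated in the patent.

    typedef struct { double x, y, z; } Vec3;
    typedef struct { double r, g, b; } Color;

    /* Formula (4): P = N x R, the normal-line vector rotated into the screen
     * coordinate system (row vector times 3x3 matrix). */
    static Vec3 rotate_normal(Vec3 n, const double r[3][3])
    {
        Vec3 p = {
            n.x * r[0][0] + n.y * r[1][0] + n.z * r[2][0],
            n.x * r[0][1] + n.y * r[1][1] + n.z * r[2][1],
            n.x * r[0][2] + n.y * r[1][2] + n.z * r[2][2]
        };
        return p;
    }

    /* Formula (5A): the inner product P x L of the rotated normal and the
     * light source vector. */
    static double lighting(Vec3 p, Vec3 l)
    {
        double q = p.x * l.x + p.y * l.y + p.z * l.z;
        return q > 0.0 ? q : 0.0;       /* assumed clamp for back-facing light */
    }

    /* Formula (5B): drawing color D = (P x L) x apex point color C. */
    static Color drawing_color(double q, Color c)
    {
        Color d = { q * c.r, q * c.g, q * c.b };
        return d;
    }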
In a step SP3 of Fig. 10, it is determined whether the drawing colors and the coordinate values of all of the three apex points S0, S1 and S2 of the triangle have been found. If not, processing continues by returning to the step SP2 for finding the coordinate values and drawing colors of the next apex point. If,
however, all such coordinate values and drawing colors have been found, processing continues in a step SP4 for finding the coordinate values of all of the intermediate points of pixels within the triangle by means of the raster processor 62 as shown in Fig. 8, and the drawing colors thus obtained and represented by corresponding coordinate values are written in the video memory 5.
With reference also to Fig. 11B, the coordinate values and the drawing colors of the three apex points S0, S1 and S2 of the triangle are represented as follows:
S0 (X0, Y0, R0, G0, B0)
S1 (X1, Y1, R1, G1, B1)
S2 (X2, Y2, R2, G2, B2)
As shown in Fig. 11B, the triangle which is delimited by these three apex points is sliced in the horizontal direction by a line segment having the end points A and B corresponding with boundary points of the triangle. The coordinate values and drawing colors of the boundary points A and B for each of n raster lines are found by interpolation as follows: if the coordinate values of the boundary points A and B are given by
A (XA, YA, RA, GA, BA)
B (XB, YB, RB, GB, BB)
the coordinate values and the drawing colors of the boundary point A are determined by the following formulas (6) through (10):
XA = X0 x (Y1 - Y0 - n) / (Y1 - Y0) + X1 x n / (Y1 - Y0) ... (6)
YA = Y0 + n ... (7)
RA = R0 x (Y1 - Y0 - n) / (Y1 - Y0) + R1 x n / (Y1 - Y0) ... (8)
GA = G0 x (Y1 - Y0 - n) / (Y1 - Y0) + G1 x n / (Y1 - Y0) ... (9)
BA = B0 x (Y1 - Y0 - n) / (Y1 - Y0) + B1 x n / (Y1 - Y0) ... (10)
The coordinate values and the drawing colors for the point B are then determined in accordance with the following formulas (11) through (15):
XB = X0 x (Y2 - Y0 - n) / (Y2 - Y0) + X2 x n / (Y2 - Y0) ... (11)
YB = Y0 + n ... (12)
RB = R0 x (Y2 - Y0 - n) / (Y2 - Y0) + R2 x n / (Y2 - Y0) ... (13)
GB = G0 x (Y2 - Y0 - n) / (Y2 - Y0) + G2 x n / (Y2 - Y0) ... (14)
BB = B0 x (Y2 - Y0 - n) / (Y2 - Y0) + B2 x n / (Y2 - Y0) ... (15)
Then coordinate values and drawing colors of intermediate points P along the segment from the point A to the point B are found by interpolation from the corresponding values of the boundary points A and B and are represented by the following expression:
(XP, YP, RP, GP, BP)
the values of which are produced by means of the following formulas (16) through (20):
XP = XA + m ... (16)
YP = YA ... (17)
RP = RA x (XB - XA - m) / (XB - XA) + RB x m / (XB - XA) ... (18)
GP = GA x (XB - XA - m) / (XB - XA) + GB x m / (XB - XA) ... (19)
BP = BA x (XB - XA - m) / (XB - XA) + BB x m / (XB - XA) ... (20)
The pixel values (RP, GP, BP) of each intermediate point P are then written in the display area 51 of the video memory 5, as shown in Fig. 8, according to the coordinate values of each intermediate point P.
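Formulas (6) through (20) are two nested linear interpolations: from the apex points down to the boundary points A and B of each raster line, then from A to B across the line. The C sketch below mirrors that structure; the lerp() helper, the Vertex struct and the use of double precision are conveniences assumed for the example rather than the raster processor's actual arithmetic.

    typedef struct { double x, y, r, g, b; } Vertex;

    /* Linear interpolation used throughout formulas (6) to (20). */
    static double lerp(double v0, double v1, double t)
    {
        return v0 + (v1 - v0) * t;
    }

    /* Boundary point n raster lines below apex point s0 on the edge toward s1
     * (formulas (6)-(10) give A from S0 and S1; the same routine gives B from
     * S0 and S2 per formulas (11)-(15)). */
    static Vertex boundary_point(Vertex s0, Vertex s1, double n)
    {
        double t = n / (s1.y - s0.y);
        Vertex v = { lerp(s0.x, s1.x, t), s0.y + n,
                     lerp(s0.r, s1.r, t), lerp(s0.g, s1.g, t), lerp(s0.b, s1.b, t) };
        return v;
    }

    /* Intermediate point m pixels from A toward B along one raster line
     * (formulas (16)-(20)); its color is what is written to the display area. */
    static Vertex intermediate_point(Vertex a, Vertex b, double m)
    {
        double t = m / (b.x - a.x);
        Vertex v = { a.x + m, a.y,
                     lerp(a.r, b.r, t), lerp(a.g, b.g, t), lerp(a.b, b.b, t) };
        return v;
    }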
In a step SP5 of Fig. 10, it is determined whether coordinate values for each of the triangles have been produced as well as the related drawing colors, and whether the drawing colors have been accordingly written in the display area 51 of the video memory 5. If such operations have been carried out for each triangle, the picture data thus stored in the display area 51 of Fig. 8 is output by the graphic processor 6 as the three-dimensional picture VD1 as illustrated in Fig. 8. If, however, such operations have not been carried out in their entirety, processing reverts to the step SP1 wherein the information necessary for producing the picture data for the next triangle is read from the main memory 2 of Fig. 8.
Referring now to Fig. 12, a sequence of operations which is carried out for producing and displaying a three-
dimensional picture by the texture mapping method of the first
embodiment of the present invention is illustrated therein.
In a step SP11 of Fig. 12, the CPU 1 of Fig. 7 reads
the necessary information for producing the picture of a selected
triangle from the main memory 2 thereof, namely, the coordinates
of its apex points in the object coordinate system, the
coordinate conversion matrix R for converting the apex point
coordinates into the world coordinate system, the parallel
translation vector T, the line-of-sight vector for representing
the apex point coordinates in the world coordinate system
according to the screen coordinate system, the normal-line vector
N and apex point coordinates of a storage area for relevant
texture data. As the data is read in sequence by the CPU 1, it
is routed thereby to the graphic processor 6.
In a step SP12 of Fig. 12, the coordinate values of the
apex points of the triangle and luminance data therefor are
produced using the information read in the step SP11. First, in
the geometry processor 61 (Fig. 8), the three-dimensionally
represented apex point coordinates are converted by means of the
coordinate conversion matrix R and the parallel translation
vector T, as described above, into the coordinates S in
accordance with formula (1). Thereafter, the coordinates S are
perspective-converted in accordance with the formulas (2) and (3)
for conversion to screen coordinates (X, Y).
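As an informal illustration, the geometry work of step SP12 might be sketched as follows; since formulas (1) through (3) appear earlier in the specification and are not reproduced here, the sketch assumes the conventional forms S = R x V + T and X = Sx x h / Sz, Y = Sy x h / Sz, where the scale factor h and the Vec3 layout are assumptions.

```c
/* Sketch of the step SP12 geometry work under the assumed forms of formulas (1)-(3):
   S = R * V + T, then X = Sx * h / Sz and Y = Sy * h / Sz (h is an assumed scale). */
typedef struct { float x, y, z; } Vec3;

static Vec3 object_to_world(const float r[3][3], Vec3 t, Vec3 v)
{
    /* Coordinate conversion by matrix R and parallel translation vector T. */
    Vec3 s;
    s.x = r[0][0] * v.x + r[0][1] * v.y + r[0][2] * v.z + t.x;
    s.y = r[1][0] * v.x + r[1][1] * v.y + r[1][2] * v.z + t.y;
    s.z = r[2][0] * v.x + r[2][1] * v.y + r[2][2] * v.z + t.z;
    return s;
}

static void perspective_convert(Vec3 s, float h, float *screen_x, float *screen_y)
{
    /* Perspective conversion of S to screen coordinates (X, Y). */
    *screen_x = s.x * h / s.z;
    *screen_y = s.y * h / s.z;
}
```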
Also, the normal-line vector N of the triangle is
converted into the normal-line vector P in accordance with
formula (4) above by means of the coordinate conversion matrix R.
Then the inner product of the normal-line vector P and the light
source vector L is determined in accordance with formula (5) to
yield a lighting quantity IQ, as represented by formula (21):
IQ = (P · L)   ... (21)
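A corresponding sketch of the lighting computation of formulas (4), (5) and (21) follows; the Vec3 layout matches the previous sketch, and the clamping of negative values is an added assumption rather than part of the described method.

```c
/* Lighting sketch for formulas (4), (5) and (21): rotate the normal-line vector N
   by the coordinate conversion matrix R to obtain P, then take the inner product
   with the light source vector L. Vec3 uses the same assumed layout as above. */
typedef struct { float x, y, z; } Vec3;

static float lighting_quantity(const float r[3][3], Vec3 n, Vec3 l)
{
    Vec3 p;
    p.x = r[0][0] * n.x + r[0][1] * n.y + r[0][2] * n.z;   /* P = R * N, no translation */
    p.y = r[1][0] * n.x + r[1][1] * n.y + r[1][2] * n.z;
    p.z = r[2][0] * n.x + r[2][1] * n.y + r[2][2] * n.z;

    float iq = p.x * l.x + p.y * l.y + p.z * l.z;           /* IQ = (P . L), formula (21) */
    return iq > 0.0f ? iq : 0.0f;                           /* assumed clamp for back-facing light */
}
```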
In a step SP13 of Fig. 12, it is then determined by the
geometry processor 61 whether the luminance data and coordinate
values of the three apex points of the triangle have been found. If
not, processing reverts to the step SP12 for finding the
luminance data and coordinate values of the next apex point of
the triangle. If, however, all such values have been found,
processing continues in a step SP14 for finding all of the
texture coordinate values of the intermediate points within the
triangle from the texture picture data 52 as illustrated in Fig.
9, with the use of the raster processor 62. Then the luminance
data represented by corresponding coordinate values of the
display area 51 of Fig. 9 and associated with corresponding
texture coordinate values are multiplied by picture data
represented by the texture coordinates. The resulting values are
written at locations of the display area 51 in accordance with
corresponding coordinate values.
With reference also to Fig. 13, the coordinate values
of the three apex points T0, T1 and T2 of the triangle within the
texture picture 52 which are to be mapped to the object picture
are represented as
T0 (U0, V0)
T1 (U1, V1)
T2 (U2, V2)
and wherein coordinate values of a boundary point C between the
points T0 and T1, a boundary point D between the points T0 and T2,
as well as an intermediate point Q on a line segment defined by
the points C and D, are represented as:
C (UC, VC)
D (UD, VD)
Q (UQ, VQ)
Since the three apex points of the triangle within the
texture picture 52 are each associated with a corresponding one
of the three apex points of the triangle within the object
picture, the coordinate values of the three apex points of the
triangle within the object picture, as well as luminance data and
texture coordinates of each point within the texture picture 52,
may be represented as follows:
S0 (X0, Y0, L0, U0, V0)
S1 (X1, Y1, L1, U1, V1)
S2 (X2, Y2, L2, U2, V2)
The triangle delimited by these three apex points S0, S1 and
S2 is sliced in the horizontal direction, as illustrated in Fig.
14, by a line segment defined by boundary points A and B as shown
therein. The coordinate values and luminance data of the
boundary points A and B for each of n raster lines are found by
interpolation and are expressed as follows:
A (XA, YA, LA)
B (XB, YB, LB)
The coordinate values and luminance data of the boundary point A
are determined in accordance with formulas (22) through (24)
below, while the coordinates of the boundary point C providing
texture coordinates for the boundary point A are determined in
accordance with formulas (25) and (26) below:
XA = X0 x (Y1 - Y0 - n) / (Y1 - Y0) + X1 x n / (Y1 - Y0)   ... (22)
YA = Y0 + n   ... (23)
LA = L0 x (Y1 - Y0 - n) / (Y1 - Y0) + L1 x n / (Y1 - Y0)   ... (24)
UC = U0 x (Y1 - Y0 - n) / (Y1 - Y0) + U1 x n / (Y1 - Y0)   ... (25)
VC = V0 x (Y1 - Y0 - n) / (Y1 - Y0) + V1 x n / (Y1 - Y0)   ... (26)
The coordinate values and luminance data of the
boundary point B are determined in accordance with formulas (27)
through (29), while the coordinates of the boundary point D
providing texture coordinates for the boundary point B are
determined in accordance with formulas (30) and (31) as follows:
XB = X0 x (Y2 - Y0 - n) / (Y2 - Y0) + X2 x n / (Y2 - Y0)   ... (27)
YB = Y0 + n   ... (28)
LB = L0 x (Y2 - Y0 - n) / (Y2 - Y0) + L2 x n / (Y2 - Y0)   ... (29)
UD = U0 x (Y2 - Y0 - n) / (Y2 - Y0) + U2 x n / (Y2 - Y0)   ... (30)
VD = V0 x (Y2 - Y0 - n) / (Y2 - Y0) + V2 x n / (Y2 - Y0)   ... (31)
With reference again to Fig. 14, an intermediate point
P along the line segment AB therein is shifted by a value m from
the boundary point A towards the boundary point B. Coordinate
values, luminance data and texture coordinates of the
intermediate point P are produced by interpolation by means of
the data of boundary points A and B. If the coordinates and
luminance values of the intermediate point P are expressed as P
(XP, YP, LP), the coordinates of an intermediate point Q as shown
in Fig. 13 within the texture data and corresponding to the
intermediate point P and providing its texture coordinates are
determined in accordance with the formulas (32) through (36)
below:
XP = XA + m   ... (32)
YP = YA   ... (33)
LP = LA x (XB - XA - m) / (XB - XA) + LB x m / (XB - XA)   ... (34)
UQ = UC x (XB - XA - m) / (XB - XA) + UD x m / (XB - XA)   ... (35)
VQ = VC x (XB - XA - m) / (XB - XA) + VD x m / (XB - XA)   ... (36)
The formulas (35) and (36) serve to relate the texture
coordinates UQ and VQ to the point P in order to access
corresponding pixel data from the texture picture 52.
Pixel data read from the texture picture 52 in
accordance with the coordinates corresponding to each
intermediate point P are then multiplied by the corresponding
luminance data of that point. The resulting data is written at a
corresponding location indicated by the coordinate values of the
intermediate point P in the video memory 5.
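For illustration, the following C sketch combines formulas (22) through (36) with the texel fetch and luminance multiplication of step SP14 for one raster line; the TexApex structure and the tex_fetch and write_pixel placeholders are assumptions standing in for accesses to the texture picture 52 and the display area 51.

```c
/* Texture-mapped scanline sketch for formulas (22)-(36). Each apex point carries
   screen coordinates (X, Y), luminance L and texture coordinates (U, V).
   The buffer layouts and helper functions are assumptions for illustration. */
typedef struct { float x, y, l, u, v; } TexApex;

static float edge_interp(float v0, float v1, float y0, float y1, float n)
{
    /* Generic form shared by formulas (22)-(36): v0*(y1-y0-n)/(y1-y0) + v1*n/(y1-y0). */
    return v0 * (y1 - y0 - n) / (y1 - y0) + v1 * n / (y1 - y0);
}

/* Placeholders: fetch one texel from texture picture 52, write one pixel to area 51. */
extern float tex_fetch(float u, float v);
extern void  write_pixel(int x, int y, float value);

static void textured_raster_line(const TexApex *s0, const TexApex *s1,
                                 const TexApex *s2, float n)
{
    /* Boundary point A on edge S0-S1 with texture point C, formulas (22)-(26). */
    float xa = edge_interp(s0->x, s1->x, s0->y, s1->y, n);
    float ya = s0->y + n;
    float la = edge_interp(s0->l, s1->l, s0->y, s1->y, n);
    float uc = edge_interp(s0->u, s1->u, s0->y, s1->y, n);
    float vc = edge_interp(s0->v, s1->v, s0->y, s1->y, n);

    /* Boundary point B on edge S0-S2 with texture point D, formulas (27)-(31). */
    float xb = edge_interp(s0->x, s2->x, s0->y, s2->y, n);
    float lb = edge_interp(s0->l, s2->l, s0->y, s2->y, n);
    float ud = edge_interp(s0->u, s2->u, s0->y, s2->y, n);
    float vd = edge_interp(s0->v, s2->v, s0->y, s2->y, n);

    /* Intermediate points P (texture points Q), formulas (32)-(36): the texel is
       multiplied by the interpolated luminance and written into the display area. */
    for (float m = 0.0f; m <= xb - xa; m += 1.0f) {
        float xp = xa + m;
        float lp = edge_interp(la, lb, xa, xb, m);
        float uq = edge_interp(uc, ud, xa, xb, m);
        float vq = edge_interp(vc, vd, xa, xb, m);
        write_pixel((int)xp, (int)ya, tex_fetch(uq, vq) * lp);
    }
}
```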
Thereafter, in a step SP15 of Fig. 12, it is determined
whether the operations of finding the coordinate values and
luminance data for each of the triangles have been completed, as
well as whether the drawing colors therefor have been written in
the display area 51 of the video memory 5 of Fig. 9. If so, the
picture data stored in the display area 51 of Fig. 9 is output by
the graphic processor 6 as the three-dimensional picture VD2 of
Fig. 9. If not, however, processing reverts to the step SP11 for
finding the necessary data for the next triangle.
The methods described above for formulating a three-
dimensional picture may be applied selectively depending on the
characteristics of the picture to be displayed. For example, if
the shape of an object to be displayed as a three-dimensional
picture is critical at a given point during a game, a picture
preferably is drawn by the shading method according to the first
embodiment. However, if the pattern of the object's surface is
deemed to be more important, the picture is preferably produced
in accordance with the texture mapping method of the second
embodiment.
A technique for formulating a picture using the texture
mapping function described above and a picture data converting
function by means of a conversion table, utilizing the picture
formulating apparatus of Fig. 7, will now be explained.
The CPU 1 of Fig. 7 reads out texture source picture
data from the video memory 5 in accordance with a program stored
in the main memory 2 and modifies the source picture data, if
necessary, for writing in the video memory 5. Within the video
memory 5, information pertinent to a picture to be output for
display, such as texture source data and conversion data (such as
a color lookup table), is written in a portion of the memory other
than the display area in which data of the picture to be output,
that is, the display output picture data (the picture to be displayed
by a device such as a TV receiver, monitor receiver or CRT
display), is stored.
Fig. 15A schematically illustrates the memory space of
the video memory 5 in an initial state wherein the memory space
is divided into the above-mentioned display area 51 and a vacant
area 53 outside the display area 51. For example, a picture to
be displayed on a CRT display device may include 400 vertical
lines by 640 horizontal pixels or possibly 480 vertical lines by
720 horizontal pixels, whereas a typical video memory is provided
with vertical and horizontal memory dimensions set equal to
powers of 2. In either exemplary memory application, therefore,
a memory having 512 vertical lines by 1024 horizontal words is
required, so that redundant space is formed in both the vertical
and horizontal directions of the memory space. With reference to
Fig. 15B, in the vacant area 53 several pictures 54a, 54b and 54c
are stored as picture sources. Referring also to Fig. 15C, a
conversion table 55, such as a color lookup table, is also written
in the area 53.
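Purely as an illustration of this arrangement, the following sketch lays out a 1024 x 512 video memory holding a 640 x 400 display area together with texture sources and a lookup table in the vacant area; the specific offsets and the Rect type are assumptions, not values taken from the embodiment.

```c
/* Sketch of the Fig. 15A-15C arrangement: one 1024 x 512 video memory holding
   a display area 51 plus texture sources 54a-54b and conversion table 55 in the
   vacant area 53. The concrete offsets chosen here are assumptions. */
#include <stdint.h>

#define VRAM_WIDTH   1024   /* horizontal words, a power of 2         */
#define VRAM_HEIGHT   512   /* vertical lines, a power of 2           */
#define DISP_WIDTH    640   /* display output picture, e.g. 640 x 400 */
#define DISP_HEIGHT   400

typedef struct { int x, y, w, h; } Rect;

static uint16_t vram[VRAM_HEIGHT][VRAM_WIDTH];                          /* video memory 5 */
static const Rect display_area = { 0, 0, DISP_WIDTH, DISP_HEIGHT };     /* area 51        */
static const Rect texture_54a  = { DISP_WIDTH, 0,   256, 256 };         /* in vacant 53   */
static const Rect texture_54b  = { DISP_WIDTH, 256, 256, 256 };         /* in vacant 53   */
static const Rect clut_55      = { 0, DISP_HEIGHT, 256, 1 };            /* lookup table   */

/* Address a word inside a rectangle of the single shared memory. */
static uint16_t *vram_at(Rect r, int x, int y)
{
    return &vram[r.y + y][r.x + x];
}
```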
Fig. 16 provides a flow chart illustrating a typical
processing sequence in accordance with this technique. In a step
SP 31 of Fig. 16, the CPU 1 causes the texture source pictures 54a
through 54c to be written in the vacant area 53 of the video
memory 5 before generating or formulating the display output
picture data. At a step SP 32, the conversion table 55, such as a
color lookup table, which serves to convert virtual picture data
into actual picture data, is also stored in the vacant area 53.
The texture source picture data may be obtained, for example, by
reading from an external storage device such as a disk or tape.
In a step SP 33, the CPU 1 executes a program stored in
the main memory 2 for generation or formulation of a pre-set
picture and, depending on the results of such processing, reads
data from the texture source pictures 54a through 54c in the area
53. The texture source picture data thus read out are modified
in a step SP 34, if necessary, by reference to the conversion
table 55, and in a step SP 35 the picture data for writing in the
display area 51 are drawn or produced. That is, the data obtained
directly from the texture source pictures are used as addresses
to the conversion table 55 for converting the texture source
picture data, as virtual picture data, into real picture data.
The real picture data read from the conversion table 55,
consequently, are written in the display area 51 of the video
memory 5.
In a step SP 36, it is determined whether all picture
drawing operations have been completed for all of the polygonal
areas comprising the display output picture stored in the display
area 51. If not, the operations of steps SP 33 through SP 35 are
repeated. If so, however, the picture drawing operations are
terminated.
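A minimal sketch of steps SP 33 through SP 35 is given below, assuming 8-bit virtual picture data and 16-bit real picture data; the function signature and buffer layout are illustrative assumptions.

```c
/* Sketch of steps SP 33-SP 35: each value read from a texture source picture is
   virtual picture data used as an address into conversion table 55 (a color
   lookup table); the real picture data read from the table is written into the
   display area 51. 8-bit indices and 16-bit real pixels are assumptions. */
#include <stdint.h>

void draw_with_clut(const uint8_t *texture_source,       /* e.g. picture 54a in area 53 */
                    int tex_w, int tex_h,
                    const uint16_t clut[256],             /* conversion table 55         */
                    uint16_t *display, int display_pitch, /* display area 51             */
                    int dst_x, int dst_y)
{
    for (int y = 0; y < tex_h; y++) {
        for (int x = 0; x < tex_w; x++) {
            uint8_t  index = texture_source[y * tex_w + x];  /* virtual picture data   */
            uint16_t real  = clut[index];                    /* converted to real data */
            display[(dst_y + y) * display_pitch + (dst_x + x)] = real;
        }
    }
}
```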
With reference also to Fig. 17A, the state of the video
memory 5 once the foregoing picture formulating operations have
been completed is represented thereby, and with reference also to
Fig. 17B, the display state of a CRT display device 57 employed
to display a picture 56 thus formulated in the display area 51 of
the memory 5 is provided. With reference to Fig. 17A, the
picture data in the display area 51 is read out in accordance
with video synchronization signals and output after conversion
into analog form by a D/A converter (not shown for purposes of
simplicity and clarity) for displaying the picture 56 on a screen
surface 58 of the CRT display device 57 in Fig. 17B.
In the present embodiment, since the texture source
pictures 54a through 54c and/or the conversion table 55 (such as
a color lookup table) are stored in the vacant area 53 of the
memory 5 outside the display area 51 thereof, neither a separate
video memory 343 (as in the apparatus of Fig. 3) for storing the
texture source pictures nor a separate conversion table memory
306 (as in the apparatus of Fig. 5) for storage of conversion
tables is required, thus to permit a reduction in size of the
apparatus and reduced production costs therefor. Moreover, since
virtual data obtained directly from the texture source pictures
54a through 54c in the vacant area 53 of the memory 5 are
converted by reference to the conversion table 55 into real
picture data which is then written in the display area 51, there
is no necessity to provide a separate bus for accessing a
conversion table in a separate memory (such as the memory 306 of
Fig. 5), which assists further in reducing apparatus size and
production costs.
Moreover, the display area in the video memory 5 may be
moved to permit the previously formulated display output picture
itself to be used as a texture source picture, that is, as a
reference picture for practicing a picture formulation technique
in accordance with a further embodiment of the present invention.
This embodiment is now explained in connection with Figs. 18A and
18B.
With reference first to Fig. 18A, the contents of the
video memory 5 during an arbitrary Nth frame are illustrated
thereby. During the Nth frame, a display area 71 of the video
memory 5 is provided at a pre-set location therein and a number
of texture source pictures 74a, 74b and 74c are stored in a
vacant area 73 of the video memory 5 apart from the display area
71. As explained below, during a following (N+1)th frame, at
least a portion of the picture formulated during the Nth frame
and written in the display area 71 is used directly as texture
source data for producing a further display output picture.
With reference now to Fig. 18B, the contents of the
video memory 5 during the next (N+1)th frame are illustrated
therein. At that time, a display area 81 is provided in the
video memory 5 at a location differing from the position of the
display area 71 of Fig. 18A. In a vacant area 83 apart from the
display area 81, those portions of the picture previously written
in the display area 71 which are required to formulate data for
storage in the display area 81 to formulate a new picture are
stored as a new texture source picture 84a as shown in Fig. 18B.
That is, the display area 81 is selected so that it does not
overlap those portions of the display area of the Nth frame
containing the new texture source picture 84a. In addition, a
number of texture source pictures 84b and 84c as required are
also stored in the area 83. Then the texture source picture data
84a is read and modified in the same manner as other texture
source data for writing in the display area 81 to formulate
picture data 86a for the (N+1)th frame.
Accordingly, with the use of this method a picture
previously generated by the computer may be employed directly as
texture source data for generating or formulating a picture.
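The following sketch illustrates one way the moving display area of Figs. 18A and 18B might be driven, alternating between two non-overlapping areas so that the previous frame remains readable as texture source data; the two concrete areas and the helper functions are assumptions.

```c
/* Sketch of the Figs. 18A/18B technique: the display area alternates between two
   non-overlapping locations in the video memory, so the picture drawn during the
   Nth frame remains intact and can be read as texture source picture 84a while
   the (N+1)th frame is drawn. The sizes and helpers below are assumptions. */
typedef struct { int x, y, w, h; } Area;

static const Area slot_a = { 0,   0, 640, 240 };   /* candidate display area      */
static const Area slot_b = { 0, 240, 640, 240 };   /* second, non-overlapping area */

/* Placeholder: formulate the picture for 'frame' into 'dst', reading 'src'
   (the previous frame's output) as texture source data. */
extern void formulate_picture(Area dst, Area src, unsigned frame);
/* Placeholder: point the video output at the current display area. */
extern void set_display_area(Area a);

void render_frame(unsigned frame)
{
    Area dst = (frame & 1) ? slot_b : slot_a;   /* display area for this frame        */
    Area src = (frame & 1) ? slot_a : slot_b;   /* last frame's area, now a texture   */
    formulate_picture(dst, src, frame);
    set_display_area(dst);
}
```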
A game playing apparatus in accordance with an aspect
of the present invention, employing features of the above-
described picture formulating methods and apparatus, will now be
explained.
With reference to Fig. 19, a block diagram of an
embodiment of a household game playing apparatus in accordance
with the present invention is illustrated therein. In the
embodiment of Fig. 19, a game program is stored in a CD-ROM 11, a
read-only memory employing a compact disc as an external storage
medium. In the alternative, a memory card 16 employing a non-
volatile memory, such as a flash memory, may be employed for the
purpose of storing such a game program. A further alternative is
the use of a ROM cartridge, not shown for purposes of simplicity
and clarity. As yet another alternative, the game program may be
received over an external network via a communication interface
12.
The game program and accompanying three-dimensional
picture information are entered from the CD-ROM 11, which is
connected to an external storage interface 10, or in the
alternative from the memory card 16 through the external storage
interface 10 or from an external network connected to the
communication interface 12, under the control of the CPU 1, such
as a microprocessor. The game program and the three-dimensional
picture information are stored over the main bus 9 in the main
memory 2. Operating information is input from a controller 14,
such as an input pad or a joystick, and is fetched by
the CPU 1 from the input interface 13 via the main bus 9. Based
on the operating information thus obtained by the CPU 1, the
three-dimensional picture information stored in the main memory 2
is converted by the graphic processor 6 for generating picture
data for display. A three-dimensional picture is drawn in the
video memory 5 by the graphic processor 6 with the aid of the
picture data. The three-dimensional picture data thus drawn on
the video memory 5 are read synchronously with the scanning of
video signals for displaying the three-dimensional picture on a
display device 15, such as a monitor.
Simultaneously, as the three-dimensional picture is thus
displayed, voice information associated with the displayed three-
dimensional picture and included in the operating information
fetched by the CPU 1 is routed to an audio processor 7. Based on
such voice information, the audio processor 7 outputs appropriate
voice data which is stored permanently in the audio memory 8.
With reference also to Fig. 20, a flow chart is
provided therein for illustrating the operating sequence for
initiating and carrying out a computer game by means of the
household game playing apparatus in accordance with the present
embodiment. For example, and with reference also to Fig. 21, a
computerized driving game in which a car is driven on a road 22
with the object of avoiding a building 23 and given terrain 21 is
implemented by the present embodiment utilizing the above-
described texture mapping method.
With reference to Fig. 20, in a step SP 21, the external
memory medium (such as the CD-ROM 11 or the memory card 16) is
mounted via the external storage interface 10 on a main portion
of the household game playing apparatus. In the alternative, the
apparatus is connected via the communication interface 12 to the
external network, as mentioned above. Necessary information,
such as the game program, functions, libraries and picture
information, is fetched to the main memory 2 from the external
memory medium for storage or loading. Terrain data, building
data, road data and texture data are loaded at this time as such
picture information. In addition, the picture data are loaded
directly from the external memory medium to the vacant area in
the video memory 5 without the intermediate step of storing the
same in the main memory 2.
Processing then continues in a step SP 22 in which one
of a plurality of games stored in the external storage medium, or
one provided by the external network, is selected for play and a
mode of execution for the selected game is also selected. This
operation is carried out in dependence on variables stored in the
main memory 2 or based on the value of a register of the CPU 1.
The game program is then executed at a step SP 23 under
the control of the CPU 1 in order to start the computer game.
An initial picture indicating that the game has started
is formulated in a step SP 24 and displayed. A concrete example
of such a picture is that illustrated in Fig. 21. This picture
is formulated by carrying out the following sequence of
operations.
In formulating the picture of Fig. 21, a remote scene
is first prepared and displayed. For example, a portion 20
representing the sky, which is the farthest portion of the picture
from the plane of the screen or the position of the viewer, is
initially formulated and displayed. Color data for the sky
portion 20 is read from the main memory 2 by the CPU 1; that is,
coordinate data of the four corners of the screen and color data
for the sky portion 20 are output to the raster processor 62
within the graphic processor 6. In the raster processor 62,
relatively small polygonal areas (also referred to as "polygons")
are formulated in the color of the sky 20 for depiction by the
display device 15.
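As a rough illustration of this step, the screen might be filled with two flat-colored triangles covering its four corners; the triangle split and the helper below are assumptions.

```c
/* Sketch of the sky step: the four screen corners and the sky color are handed to
   the rasterizer, which fills the area 20 with flat-colored polygons. Splitting the
   screen rectangle into two triangles is an assumption about the polygon shape. */
typedef struct { int x, y; } Pt2;
typedef struct { unsigned char r, g, b; } Color;

/* Placeholder for the raster processor 62 drawing one flat-colored triangle. */
extern void raster_flat_triangle(Pt2 a, Pt2 b, Pt2 c, Color col);

void draw_sky(int screen_w, int screen_h, Color sky)
{
    Pt2 tl = { 0, 0 },             tr = { screen_w - 1, 0 };
    Pt2 bl = { 0, screen_h - 1 },  br = { screen_w - 1, screen_h - 1 };
    raster_flat_triangle(tl, tr, br, sky);
    raster_flat_triangle(tl, br, bl, sky);
}
```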
The terrain portion 21 is then displayed. The CPU 1
causes coordinates of the three-dimensional apex points of the
polygons comprising the terrain 21 to be read from the main
memory 2 for supply to the geometry processor 61. The processor
61 executes coordinate conversion and perspective conversion of
the three-dimensional apex point coordinates to yield two-
dimensional screen coordinate values, and also produces texture
coordinate values associated with such two-dimensional coordinate
values. The texture coordinate values thus produced are output
to the raster processor 62, which reads out texture data
corresponding to the input texture coordinate values from texture
data stored in the video memory 5 in order to modify such texture
data to conform to the polygons defined by the two-dimensional
coordinate values and to write the modified data in the video memory
5. The above operations are carried out for all of the polygons
making up the terrain 21, and then the terrain 21 is displayed in
its entirety by the display device 15.
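Illustratively, the per-polygon pass for the terrain 21 might be driven by a loop of the following form, in which the geometry and raster steps are represented by placeholder functions that are assumptions rather than part of the disclosed apparatus.

```c
/* Sketch of the per-polygon pass for terrain 21: each polygon's apex points go
   through the geometry step (coordinate and perspective conversion) and are then
   handed, together with their texture coordinates, to the raster step. The types
   and helper names are assumptions echoing the earlier sketches. */
typedef struct { float x, y, z, u, v; } TerrainVertex;
typedef struct { TerrainVertex apex[3]; } TerrainPolygon;

extern void geometry_transform(const TerrainVertex *in, float out_xy[2]);  /* placeholder */
extern void raster_textured_triangle(const float xy[3][2],
                                     const float uv[3][2]);                /* placeholder */

void draw_terrain(const TerrainPolygon *polys, int count)
{
    for (int i = 0; i < count; i++) {
        float xy[3][2], uv[3][2];
        for (int k = 0; k < 3; k++) {
            geometry_transform(&polys[i].apex[k], xy[k]);   /* screen coordinates  */
            uv[k][0] = polys[i].apex[k].u;                   /* texture coordinates */
            uv[k][1] = polys[i].apex[k].v;
        }
        raster_textured_triangle(xy, uv);                    /* texture-mapped draw */
    }
}
```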
Accordingly, as explained with reference to Figs. 9 and
12 through 18, a three-dimensional picture may be produced and
displayed employing the texture mapping method described
hereinabove, involving the production of texture coordinate
values, as well as coordinate values and luminance data of the
apex points of the polygons, and then writing texture data
corresponding to the texture coordinate values as pixel data.
In addition, if texture data are used for formulating a
picture of the road 22 and of the building 23, similar processing
operations are carried out therefor as in the case of the terrain
21. If texture data are not used for formulating pictures of the
road 22 and the building 23, color data for the polygons used to
represent the road 22 and the building 23 are read from the main
memory 2 and appropriate processing operations similar to those
used for formulating the picture of the terrain 21 are carried
out for formulating and displaying the road 22 and the building
23 by means of the display device 15.
Once the picture has been displayed in the foregoing
manner, processing continues in a step SP 25 where a standstill
state for the game is set until a button or lever provided on the
controller 14, such as a game controller or a game pad, is
actuated by a game operator. If the controller 14 is actuated,
data corresponding to the actuation is then fetched by the CPU 1
via the interface 13 and the game proceeds in accordance with the
data thus obtained. Speed and direction information for the car
to be driven as displayed are determined by data input with the
use of the controller 14, and the position and orientation of the
terrain 21, road 22 and building 23 as displayed are changed as
required in accordance with the input data. Such changes are
carried out by changing variables stored in the main memory 2 or
values stored in a register of the CPU 1.
In a step SP 26, a picture modified in accordance with
data entered by means of the controller 14 as the computer game
progresses is sequentially formulated in accordance with the
above-described method and then displayed. The audio processor 7
fetches voice data matching the progress of the computer game
from the audio memory 8 and outputs the fetched data.
At a step SP 27, it is determined, depending on the
progress of the game and based on the intent of the operator as
expressed by operating the controller 14 or other appropriate
input device, whether or not the game is to be terminated. If
the game is not to be terminated, control reverts to the step SP
25 to await data representing the next operation to be entered
from the controller 14. If the game is to be terminated, data
representing the state of the game progress is stored in storage
means, such as a non-volatile memory (not shown for purposes of
simplicity and clarity), before terminating the game.
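For orientation, the overall sequence of steps SP 21 through SP 27 of Fig. 20 can be summarized by the following sketch, in which every function name is a placeholder assumption for the operation described in the text.

```c
/* Sketch of the Fig. 20 operating sequence, steps SP 21 through SP 27.
   Every function here is a placeholder for the operations described in the text. */
#include <stdbool.h>

extern void load_program_and_pictures(void);   /* SP 21: fetch program, terrain, textures  */
extern void select_game_and_mode(void);        /* SP 22: choose game and execution mode    */
extern void start_game(void);                  /* SP 23: begin executing the game program  */
extern void formulate_initial_picture(void);   /* SP 24: draw the opening scene (Fig. 21)  */
extern bool controller_actuated(void);         /* SP 25: button or lever input received?   */
extern void update_and_display_picture(void);  /* SP 26: redraw picture, output voice data */
extern bool game_should_terminate(void);       /* SP 27: operator or game-state decision   */
extern void save_game_state(void);             /*        store progress before exiting     */

void run_game(void)
{
    load_program_and_pictures();               /* SP 21 */
    select_game_and_mode();                    /* SP 22 */
    start_game();                              /* SP 23 */
    formulate_initial_picture();               /* SP 24 */
    do {
        while (!controller_actuated())
            ;                                  /* SP 25: standstill until the controller is used */
        update_and_display_picture();          /* SP 26 */
    } while (!game_should_terminate());        /* SP 27 */
    save_game_state();
}
```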
From the foregoing, it will be appreciated that the
methods and apparatus of the present invention may be adapted for
use not only in household game playing apparatus but in other
types of game playing apparatus, such as apparatus used in
arcades. Moreover, the methods and apparatus of the present
invention may also be employed for implementing flight simulators
and other training devices, as well as virtual reality systems.
Various other applications for the present invention will be
apparent to those of ordinary skill in the art based on the
foregoing disclosure.
Although specific embodiments of the invention have
been described in detail herein with reference to the
accompanying drawings, it is to be understood that the invention
is not limited to those precise embodiments, but that various
changes and modifications may be effected therein by one skilled
in the art without departing from the scope or spirit of the
invention as defined in the appended claims.
Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01:As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refers to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Inactive: Expired (new Act pat) 2014-04-11
Inactive: Late MF processed 2008-07-14
Letter Sent 2008-04-11
Grant by Issuance 2008-04-01
Inactive: Cover page published 2008-03-31
Inactive: Final fee received 2008-01-08
Pre-grant 2008-01-08
Notice of Allowance is Issued 2007-10-29
Letter Sent 2007-10-29
Notice of Allowance is Issued 2007-10-29
Inactive: IPC assigned 2007-10-26
Inactive: IPC removed 2007-10-26
Inactive: First IPC assigned 2007-10-26
Inactive: IPC assigned 2007-10-26
Inactive: IPC assigned 2007-10-26
Inactive: Approved for allowance (AFA) 2007-10-17
Amendment Received - Voluntary Amendment 2006-12-08
Inactive: S.30(2) Rules - Examiner requisition 2006-08-02
Amendment Received - Voluntary Amendment 2006-05-15
Inactive: Office letter 2005-11-21
Inactive: S.30(2) Rules - Examiner requisition 2005-11-14
Inactive: Adhoc Request Documented 2004-12-17
Amendment Received - Voluntary Amendment 2004-07-26
Inactive: Cover page published 2004-07-15
Inactive: First IPC assigned 2004-07-12
Divisional Requirements Determined Compliant 2004-06-09
Letter sent 2004-06-09
Letter Sent 2004-06-09
Application Received - Regular National 2004-06-09
Application Received - Divisional 2004-05-31
Request for Examination Requirements Determined Compliant 2004-05-31
All Requirements for Examination Determined Compliant 2004-05-31
Application Published (Open to Public Inspection) 1994-10-16

Abandonment History

There is no abandonment history.

Maintenance Fee

The last payment was received on 2007-03-29

Note : If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
SONY COMPUTER ENTERTAINMENT INC.
SONY CORPORATION
Past Owners on Record
MASAAKI OKA
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD .




Document Description    Date (yyyy-mm-dd)    Number of pages    Size of Image (KB)
Description 2004-05-30 52 2,325
Drawings 2004-05-30 20 456
Abstract 2004-05-30 1 27
Claims 2004-05-30 4 154
Representative drawing 2004-07-08 1 17
Claims 2006-05-14 4 122
Claims 2006-12-07 4 123
Acknowledgement of Request for Examination 2004-06-08 1 176
Commissioner's Notice - Application Found Allowable 2007-10-28 1 164
Maintenance Fee Notice 2008-05-25 1 171
Late Payment Acknowledgement 2008-10-05 1 164
Late Payment Acknowledgement 2008-10-05 1 164
Correspondence 2005-11-20 1 16
Correspondence 2008-01-07 2 50
Fees 2008-07-13 1 29