Patent 1332192 Summary

(12) Patent: (11) CA 1332192
(21) Application Number: 453798
(54) English Title: METHOD FOR COLORIZING FOOTAGE
(54) French Title: METHODE DE COLORISATION D'IMAGES
Status: Deemed expired
Bibliographic Data
(52) Canadian Patent Classification (CPC):
  • 350/35
(51) International Patent Classification (IPC):
  • H04N 9/43 (2006.01)
  • H04L 69/16 (2022.01)
  • H04N 5/84 (2006.01)
  • H04L 29/06 (2006.01)
(72) Inventors:
  • GESHWIND, DAVID M. (United States of America)
(73) Owners:
  • GESHWIND, DAVID M. (United States of America)
(71) Applicants:
(74) Agent: KIRBY EADES GALE BAKER
(74) Associate agent:
(45) Issued: 1994-09-27
(22) Filed Date: 1984-05-08
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
492,816 United States of America 1983-05-09
601,091 United States of America 1984-04-20

Abstracts

English Abstract




A process for colorizing a high information density black and white image comprises generating color information for the image at an information density lower in some aspect than the information density of the original black and white image. This color information is then computer processed at the lower information density to produce processed color information. The processed low information density color information is then combined with the high information density black and white information to produce a high information density full color image.
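As an illustrative sketch only (not part of the patent text), the combination described in the abstract might look as follows, assuming the black & white frame is held as a full-resolution luminance array and the color information as a pair of low-resolution color-difference planes; the function name, the 4x subsampling factor, the pixel-replication upsampling and the YIQ-style conversion are assumptions made for the example.

    import numpy as np

    def colorize(luma_hi, chroma_lo, factor=4):
        """Combine high-resolution luminance with low-resolution color.

        luma_hi   -- (H, W) array, full-resolution black & white frame
        chroma_lo -- (H//factor, W//factor, 2) array, low-resolution
                     color-difference planes (e.g. I and Q)
        Returns an (H, W, 3) RGB frame.
        """
        # Upsample the coarse color planes by pixel replication; the eye's
        # lower sensitivity to color detail hides the blockiness, and a
        # blur could be applied here as well.
        i = np.repeat(np.repeat(chroma_lo[..., 0], factor, 0), factor, 1)
        q = np.repeat(np.repeat(chroma_lo[..., 1], factor, 0), factor, 1)
        y = luma_hi

        # Standard YIQ -> RGB conversion (NTSC-style coefficients).
        r = y + 0.956 * i + 0.621 * q
        g = y - 0.272 * i - 0.647 * q
        b = y - 1.106 * i + 1.703 * q
        return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)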


Claims

Note: Claims are shown in the official language in which they were submitted.



Claims:

1. A process for colorizing a high information
density black and white image comprising the steps of:
(a) generating color information for said image at an information density lower in some aspect than the information density of the original black and white image;
(b) computer processing said color information at said
lower information density to produce processed color
information; and
(c) combining said processed low information density
color information with said high information density black
and white information to produce a high information density
full color image.
2. A process as in claim 1, wherein steps (a) through
(c) are performed on successive frames of a black and white
motion sequence.
3. A process as in claim 2, wherein said motion
sequence is derived from a motion picture film.
4. A process as in claim 2, wherein said motion
sequence is derived from a video media.
5. A process as in claim 1, wherein said information
density is lower in spatial density.
6. A process as in claim 1, wherein said information
density is lower in the accuracy with which a particular
color is defined.
7. A process as in claim 1, wherein said processing
consists of blurring or filtering said color information.
8. A process as in claim 1, wherein said processing
is varied in response to instructions from an operator.
9. A process as in claim 1, wherein said processing
is varied in accordance with a computer algorithm.
10. A process as in claim 2, wherein said color
information is generated with a lower temporal information
density than the temporal information density of said
black and white motion sequence.



11. A process as in claim 10, further comprising the
step of synthesizing missing color information by
interpolating between existing color information.
12. A process as in claim 2, wherein said processing of
the color information is in the temporal domain.
13. A process as in claim 12, wherein said temporal
domain processing consists of temporal filtering.
14. A process as in claim 13, wherein said temporal
domain filtering consists of cross-dissolving.
15. A process as in claim 2, wherein said processing is
varied in accordance with specifications pertaining to the
motion of individual objects portrayed in said black and white
motion sequence.
16. A process as in claim 1, wherein said combination is
achieved optically.
17. A process as in claim 1, wherein said combination is
achieved electronically.
18. A process for colorizing a series of black and white
frames containing black and white information in a motion
sequence, comprising the steps of:
(a) generating a color information frame at an
information density lower in some aspect than the information
density of the original black and white information for a
first black and white frame in said motion sequence;
(b) generating a second color information frame at an
information density lower in some aspect than the information
density of the original black and white information for a
second black and white frame which is a first number of frames
from said first black and white frame in said motion sequence;
and
(c) combining said low density color information frames
with the high information density black and white frames to
which they are associated to produce high information density
full color frames.


19. A process as in claim 18, comprising the additional
step of:
(d) generating at least one intermediate color
information frame to be associated with a black and white
frame intermediate between said first and second black and
white frames by interpolating color information between said
first and second generated color information frames.
20. A process as in claim 19, wherein said interpolation
is shape interpolation.
21. A process as in claim 19, wherein said interpolation
is cross-dissolving.
22. A process as in claim 18, comprising the additional
step of:
(e) processing said color information frames.
23. A process as in claim 18, comprising the additional
step of:
(f) generating at least one mixed color information
frame between at least one adjacent pair of said color
information frames, to be associated with a black and white
frame intermediate between the black and white frames
associated with said pair of color information frames, by
interpolation between said pair of color information frames.
24. A process as in claim 23, wherein said interpolation
is shape interpolation.
25. A process as in claim 23, wherein said interpolation
is cross-dissolving.
26. A process as in claim 18, comprising the additional
step of:
(g) associating those black and white frames not already
associated with color information frames to the nearest color
information frame.
27. A process for modifying high density visual
information in an image comprising the steps of:
(a) separating said visual information into a plurality
of individual components;
(b) processing at least one of said individual
components separately at an information density lower than the



information density of said original visual information to
generate at least one processed low information density
component; and
(c) combining said components including said at least
one processed low information density component into a
modified high information density image.
28. A process as in claim 27, wherein the image is a
full color image, said full color image is separated into
color and black and white components, and said combination
results in a full color image.
29. A process as in claim 27, wherein steps (a) through
(c) are performed on successive frames of a motion sequence.
30. A process as in claim 29, wherein said motion
sequence is derived from a motion picture film.
31. A process as in claim 29, wherein said motion
sequence is derived from video media.
32. A process as in claim 27, wherein at least one
component is separated with a lower information density than
the information density of the same component in said image.
33. A process as in claim 27, wherein at least one
component is processed with a lower information density than
the information density of the same component in said image.
34. A process as in claim 27, wherein one component is
processed with a lower information density than that of
another component.
35. A process as in claim 27, wherein said processing
consists of blurring or filtering one of said plurality of
image components.
36. A process as in claim 27, wherein said processing is
varied in response to instructions from an operator.
37. A process as in claim 27, wherein said processing is
varied in accordance with a computer algorithm.
38. A process as in claim 29, wherein at least one of
said information components is extracted with a lower temporal
information density than the temporal information density of
the same component in said motion sequence.


39. A process as in claim 29, comprising the additional
step of processing at least one of said information components
of said motion sequence in the temporal domain.
40. A process as in claim 39, wherein at least one of
said information components is processed with a lower temporal
information density than the temporal information density of
the same component in said motion sequence.
41. A process as in claim 39, wherein one of said
information components is processed at lower temporal density
than another component.
42. A process as in claim 38, comprising the further
step of synthesizing missing component information by
interpolating between existing component information.
43. A process as in claim 39, wherein said temporal
processing consists of temporal filtering.
44. A process as in claim 43, wherein said temporal
filtering consists of cross-dissolving.
45. A process as in claim 27, wherein said combination
is achieved optically.
46. A process as in claim 27, wherein said combination
is achieved electronically.
47. A process for modifying a high information density
full color film, comprising the steps of:
(a) making a high information density black and white
film from said high information density full color film;
(b) inputting said full color film into a computer at an
information density lower in some aspect than the information
density of the original full color film;
(c) separating a color-only low information density
component from the full color information in the computer;
(d) processing said color-only low information density
component, in the computer, into a processed color-only low
information density component;
(e) outputting said processed color-only low information
density component onto film to produce a processed color-only
low information density film; and



(f) combining said processed color-only low information
density film with said high information density black and
white film into a modified full color high information density
film.
48. A process for modifying a full color high
information density film, comprising the steps of:
(a) inputting said full color film into a computer;
(b) separating black and white and color-only components
from the full color information in the computer;
(c) processing at least one of said components, in the
computer at an information density lower in some aspect than
the information density of said full color high information
density film, into a processed component;
(d) combining said color-only and black and white
components into modified full color high density information;
and
(e) outputting said modified full color high density
information component onto film.
49. A process of colorizing a series of black and white
frames having a relatively high resolution and containing only
black and white information in a motion picture motion
sequence, comprising the steps of:
(a) generating color information for a single frame in a
sequence;
(b) blurring all portions of all said color information;
(c) combining said blurred color information with said
relatively high resolution black and white information to
generate a colored frame corresponding to said black and white
frame and having high density luminance information and low
density color information.
50. A process as in claim 49, further comprising the
step of:
(d) adding said same blurred color information to a
first number of frames in said motion sequence and adjacent to
said colorized frame to colorize said adjacent frames.



51. A process as in claim 50, further comprising the
steps of:
(e) colorizing a second frame which is a second number
of frames subsequent to said single frame; and
(f) interpolating color information between said single
frame and said second frame to develop a number of
intermediate sets of frame color information.
52. A process as in claim 51, wherein intermediate sets
of frame color information correspond only to a limited
number of frames in said sequence and wherein said adding of
color is performed on all remaining frames which are not
associated with generated or interpolated color information by
colorizing the remaining frames with the color information
associated with the closest manually colorized or
interpolatedly colorized frame to generate individually
colorized frames for all frames in the sequence.
53. A process as in claim 52, wherein said color
information is cross-dissolved information from an earlier and
later adjacent colorized frame to colorize said remaining
frames.
54. A process as in claim 53, wherein said color
information and said black and white information are optically
combined.
55. A process as in claim 53, wherein said color
information and said black and white information are
electronically combined.
56. A process as in claim 51, wherein said intermediate
sets of frame color information are individually corrected by
an operator.
57. A process as in claim 49, wherein blurring is varied
in response to motion of a colored image contained on said
black and white frame.
58. A process as in claim 57, wherein blurring is
increased on the side of a moving image component closest to
the next position of the image.
59. A process as in claim 49, further comprising the
steps of:


(g) colorizing a second frame which is a second number
of frames subsequent to said single frame; and
(h) interpolating color information between said single
frame and said second frame to develop a number of
intermediate sets of frame color information.
60. A process of colorizing a series of black and white
frames containing black and white information having a first
resolution in a motion picture sequence, comprising the steps
of:
(a) generating color information for a single frame in
the sequence;
(b) processing said color information to generate
processed color information having a second resolution, said
second resolution being lower than said first resolution; and
(c) combining said lower resolution color information
with the high resolution black and white frame to colorize
said black and white frame.
61. A process as in claim 60, wherein said second
resolution is lower only in spatial resolution.
62. A process as in claim 60, wherein said second
resolution is lower in the accuracy with which a particular
color is defined.


Description

Note: Descriptions are shown in the official language in which they were submitted.



In an attempt to solve some of these problems, numerous techniques have evolved. For example, the movie may be reprinted on color film with a sepia color, or other attractive color. The color for various scenes may even be varied depending upon the contents of the scenes, the lighting level and the like. Thus blue might be used for a night scene, sepia for an indoor scene and green in a park-like setting.
In an attempt to get a mixture of coloration on black & white television shows, products have even been marketed which comprise a thin transparent plastic film which is adhered to a television screen and which contains several stripes of color, for example, a blue region at the top, presumably coloring the sky, a green region on the bottom, corresponding to foliage, and a brown region in the center, corresponding to the various characters in the scene. However, none of the above systems is capable of individually providing the various elements in the picture with realistic colors.

The alternative to this type of colorization is individual coloring of each frame of the movie. Naturally, this is a manual operation and involves the colorization of a great number of frames and accordingly a relatively great expense.

DISCLOSURE OF INVENTION

In accordance with the present invention the colorization of existing black & white footage is achieved by individually, for a first frame, outlining the regions to be colorized in various colors, storing this information in a random access memory and using a computer to "fill in" the various colors inside the various regions in the scene in accordance with the stored information. The color information contained in the first colored frame is then used on all successive frames until halfway to the next frame to be individually colored is reached, after which the color information from that second colored frame is used. Alternatively, a mixture of information may be used. Because the color information for each Nth frame is inaccurate for the frames which lie between every Nth frame, the reduction of physiologically perceptible inaccuracies in colorization is achieved by decreasing the resolution of the color information (i.e. the number of color information pixels), effectively blurring its outlines. Likewise, only color information is stored during this process, thus having the advantage of decreasing the number of bits per second processed by the computer.

Because of the response of the eye, the combination of low resolution color plus high resolution black and white images gives the impression of a high resolution color image. Moreover, not only is the eye insensitive to the fact that the color information used is not strictly accurate, but the resulting color picture looks more realistic than individually colored frames, which tend to get a "cartoon-like" appearance. In accordance with a further embodiment of the inventive method, the value of N may be increased greatly by interpolating the outlines for color information to obtain color information for a number of individual frames which lie between the first and the Nth frame and define equal time periods between the first and the Nth frame.

Thus, in one aspect the present invention is directed to a method for colorizing a high information density black and white image, comprising the steps of: (a) generating color information for said image at an information density lower in some aspect than the information density of the original black and white image; (b) computer processing said color information at said lower information density to produce processed color information; and (c) combining said processed low information density color information with said high information density black and white information to produce a high information density full color image.


BRIEF DESCRIPTION OF DRAWINGS

One way of carrying out the invention is described below with reference to the drawings, which illustrate only several embodiments, in which:

Figure 1 is a schematic drawing of a system for carrying out the method of the present invention;

Figure 2 is a diagram illustrating the method of the invention; and

Figures 3 and 4 illustrate alternative systems for carrying out the invention.

BEST MODE FOR CARRYING OUT THE INVENTION


An apparatus for practising the inventive method is
illustrated in Figure l. In accordance with the invention,
a fr~ame of a conventional black & white film strip (or black
&~w~hite ~ideo~) is processed by a film to video converter 12
n~to~a ~standar~d~video image which may be displayed on a
frame by frame basis on~black & white monitor 14. Monitor
25~ 4;~is eq;u~i~ped with an X-Y position transducer 16 whose
ou~tput is coupled to a computer 18. The outline of an
object of g~iven color, together with an operator instruction `~
d~regarding~the color desired allows computer l8 to generate a
color signal timed in synchronism with the video frame.
30~Computer 18, in turn, outputs the color information which is
mlxed with t~he~ black & white~vldeo output of film to video
converter 12 by a composite color generator 20. The output
of color generator 20 is then passed to a color monitor 22.
The output of color monitor 22 may be photographed by color
motion picture camera 24 to produce a hard copy on
,. :~ ~ :
.. ~
~ '``' ~ `

1 3321 ~2
_ 5

1 photographic film in color oL the original frame of black h
white film 10.
:.
During practice of the inventive process, the black & white film strip 10 is converted on a frame by frame basis by converter 12 to a video image. The first image of the film strip would be displayed in black & white on monitor 14, as described above, prior to conversion of subsequent images to video signals. The image on the first frame would be held in conventional fashion for continuous display on monitor 14. During the period of continuous display, X-Y position transducer 16 is used to translate the outlines of objects of various colors to generate color control information. For example, position transducer 16 could be made by a manual operator to follow the outside line 26 defining flying saucer 28 in frame 30. At the same time, computer 18 is then instructed that the area within this figure is to be colored red. Outside line 32 defining tree foliage 34 is then followed by the transducer and the computer instructed to generate the color green. In similar fashion trunk 36 may be colored dark brown. The computer is then instructed after the tracing of horizon line 38 to color all remaining material above line 38 blue and all material below line 38 tan. This would result in colorization of the entire picture. The computer could also be instructed respecting which objects are moving (e.g., flying saucer 28) and which objects are stationary (e.g., horizon 38). With respect to moving objects, these objects could be surrounded with an aura 40 of their own color which extends beyond their borders in an indistinct and fuzzy fashion. Such extension may also be greater in the direction of movement. In the drawing the bottom of the flying saucer would have such a greater aura. On the other hand, where both objects are stationary the aura of color may be replaced with a mixing zone 42, such as that existing around horizon 38, during which one color would gradually shift to another, thus avoiding a comic-book-like appearance. As an alternative to a blurry aura or mixing zone, the color information may simply be blurred by using a low resolution digital or analog encoder for the color information.

Once the color information has been added to the first frame, computer 18 can add this information via composite color generator 20 to successive and preceding black & white frames to generate composite color frames in the same sequence for display on color monitor 22 for photographing by color motion picture camera 24. As an alternative to the monitor 22 and camera 24, a conventional digital film printer may be used.
If we consider the case where flying saucer 28 is moving in the direction indicated by arrow 44, the color information defined by aura 40 and the surface area within aura 40 may be used in successive and preceding frames. In particular, for example, if every Nth frame is colored, the information contained within the Nth individual color frame is then used for frame N/2 through N+(N/2)-1, without change. Likewise the color information for the first frame would be used in frames 1 through frame (N/2)-1. This is done by having computer 18 repeat the color information defined by aura 40 (for the nearest individually colored frame) each time on a frame by frame basis, adding it to the particular frame received by generator 20, thus passing a series of full color frames to color monitor 22 for photographing by camera 24. Alternatively, on a frame by frame basis computer generated color information could be superimposed over each black & white frame on a color monitor, with any necessary touch-up done manually operating an X-Y position transducer associated with the monitor's screen.

The above is illustrated more clearly with reference to Figure 2. Halfway to the Nth frame, corresponding to an image of the flying saucer positioned as designated by the numeral 46 in Figure 2, the color begins to lose its accuracy, due to the displacement between an image positioned between image 26 and image 46. It then becomes necessary to repeat the manual coloring operation on the Nth frame (image 46) that was performed earlier on the first frame 26. The coloring would then be completed for the 2Nth frame 48 through 6Nth frame 56, resulting in the generation of color information approximations for the flying saucer moving from the position shown in solid lines (image 26) to the position shown in the dot-dashed lines (image 56) in Figure 2. Intermediate frames 26 and 46 through 56 would use the color information of their closest manually colored frame, as described above. Naturally, to the extent that other objects remain stationary, color information for frame 26 may be reused by the computer 18 for frames 46 through 56.
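A minimal sketch of the "nearest individually colored frame" rule just described, assuming frames are numbered from zero and keyframes fall exactly on multiples of N; the helper name is hypothetical and not from the patent.

    def nearest_keyframe(frame, n):
        """Return the hand-colored keyframe whose color information is
        reused for `frame`, when every nth frame (0, n, 2n, ...) is
        colored individually.  A keyframe's color is used from n/2
        frames before it up to n/2 - 1 frames after it, matching the
        "frame N/2 through N + (N/2) - 1" rule in the text.
        """
        return ((frame + n // 2) // n) * n

    # With n = 10: frames 0-4 borrow from keyframe 0, frames 5-14 from
    # keyframe 10, frames 15-24 from keyframe 20, and so on.
    assert [nearest_keyframe(f, 10) for f in (0, 4, 5, 14, 15)] == [0, 0, 10, 10, 20]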

In the event that it is desired to further reduce the amount of work involved in manually entering color information, after entering color information for the frame shown in solid lines in Figure 2 (frame 26), one can immediately go to a much later frame, such as the frame containing image 56, and enter the color information associated with image 56, passing over the operations of adding color individually to images 46 through 54. Using standard interpolation software the computer then interpolates between the color information associated with the solid lines in Figure 2 (image 26) and the color information associated with image 56 to generate the color information associated with intermediate images 46 through 54. Thus, by generating color information for a limited number of boundary points (i.e., images 26 and 56), the computer generates intermediate information for a number of what might be called sub-boundary points or frames (corresponding to images 46-54). By blurring the color information to extend an aura around the actual color information, we can generate a plurality of color information images which roughly coincide with all frames between individual subpart boundaries defined by images 46-54. If we consider the case where N = 10, we could color the first frame individually, go to the 60th frame and color those objects which have moved, and allow the computer to generate auras around the colored images associated with the 10th, 20th, 30th, 40th and 50th frames. These interpolated colorized frames may then be individually corrected by an operator. One then plays back the entire sequence, adding the operator furnished color information of the first frame to frames 1 through 4, the interpolated color information of the 10th frame to frames 5 through 14, the interpolated color information of the 20th frame to frames 15 to 24, the interpolated color information of the 30th frame to frames 25 through 34, and so forth. Alternatively, cross-dissolving may be employed as is discussed below. Thus, if one wished to color 6000 frames in a given motion sequence, it would merely be necessary to manually enter complete color information for one frame and update information for 100 additional frames. Such update information would be processed in such a manner as to override non-varying color information, such as the coloring of the sky and the ground above and below horizon 42 in Figure 2.
As the value of N in the above example increases, jumping of a blurred color image will occur in a physiologically perceptible fashion. This movement in color information can be smoothed by cross-dissolving intermediate blurred color frames by colorizing them with different percentages of adjacent individually colorized frames. The chart below illustrates such a use of mixed color information. In it, each of the frames (frame number) is colorized with a percentage of the color information from its respective earlier adjacent colorized (E. Adj. Frame) and later adjacent colorized (L. Adj. Frame) frames.

Frame Number    E. Adj. Frame    %      L. Adj. Frame    %

 1              1                100    10               0
 2              1                89     10               11
 3              1                78     10               22
 4              1                67     10               33
 5              1                56     10               44
 6              1                45     10               55
 7              1                34     10               66
 8              1                23     10               77
 9              1                12     10               88
10              10               100    20               0
11              10               90     20               10
12              10               80     20               20
13              10               70     20               30
14              10               60     20               40
15              10               50     20               50
16              10               40     20               60
17              10               30     20               70
18              10               20     20               80
19              10               10     20               90
20              20               100    30               0
By cross-dissolving between the color portion of frames colored by either the computer or by hand, not every frame need have a unique color component, although for each frame the unique black & white information may be used.
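The arithmetic of the chart can be sketched as a small interpolation routine, assuming keyframes spaced exactly n frames apart (as in rows 10 through 20 of the chart; the first interval of the chart runs from frame 1 to frame 10 and therefore steps by roughly 11%); the function names are hypothetical.

    def cross_dissolve_weights(frame, n=10):
        """Return (earlier_keyframe, earlier_pct, later_keyframe, later_pct)
        for a frame lying between two individually colorized keyframes
        spaced n frames apart."""
        earlier = (frame // n) * n        # e.g. frame 13 -> keyframe 10
        later = earlier + n               # next individually colorized frame
        later_pct = round(100 * (frame - earlier) / n)
        return earlier, 100 - later_pct, later, later_pct

    def cross_dissolve(color_a, color_b, pct_a):
        """Mix two color-information arrays by percentage of the earlier frame."""
        return (pct_a * color_a + (100 - pct_a) * color_b) / 100.0

    # Frame 13 takes 70% of keyframe 10's color and 30% of keyframe 20's,
    # matching the chart row "13  10  70  20  30".
    assert cross_dissolve_weights(13) == (10, 70, 20, 30)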

Different ways of implementing the above process are illustrated by the three systems shown in Figures 3 and 4. Considering first Figure 3, a black & white film original 100 is displayed on a television monitor 102 via a film projector and television camera 104. An X-Y data entry tablet 106 incorporating an electronic stylus 108 is used by a human operator to trace the color outlines of an image to be colorized. This image information is processed into color information through computer 110. The computer, in turn, feeds back color information to the operator which is displayed on monitor 102. Alternatively black & white film original 100 may be projected onto a transparent X-Y data entry tablet 112 by a projector 114 to be traced with a stylus 116. Such a system is illustrated in dashed lines in Figure 3.

As discussed above, area outlining, color choice and motion specification information is fed to computer 110. The computer coordinates the system components to accomplish the desired computer assisted colorization. In addition, computer 110 will perform the steps of interpolation between hand-traced frames, edge traced blurring of objects based upon motion information, blurring of the color image (by filtering or other suitable method), and cross-dissolving between color components which generate color images for intervening frames.

The computer then generates a "color-only" information signal which is sent to a digital film printer 118 which generates a color overlay film 120. Color overlay film 120 is combined with original black & white film 100 by an optical printer 122 to generate high resolution colorized film 124.

An alternative system is illustrated in Figure 4. In accordance with this system, black & white film 200 is converted by a video camera 202 into a video signal. Alternatively, film 200 and camera 202 may be replaced by a black & white video source such as a video tape recorder 204. This black & white video signal is displayed on a monitor 206 and sent to a color encoder 208 to be combined with a color signal to be generated as described below.

An X-Y position transducer 210 incorporating an electronic stylus 212 is used to generate color outline information which, together with color choice information and motion specification, is input into computer 214. Computer 214 generates a color signal which is combined with a black & white signal by color encoder 208 to generate a composite color image which, in turn, is sent to a video tape recorder 216. Alternatively, the output of the color encoder may be sent to a digital film printer 218 for generation of a hard copy of the colorized motion picture. It is also noted that the output of the computer 214 is used to generate color information on monitor 206, allowing the operator to adjust the manually entered colors in any desired manner.
It should be noted that in the digital domain information requirements to store and process information are reduced in three ways. By using lower resolution for the color information, the number of pixels or dots is reduced (spatial resolution). Further, by generating the color information only, the number of bits required to specify each dot is reduced. This number of bits can be further reduced by limiting the number of hues of color that may be entered into the system. Thirdly, by generating color for only every Nth frame, information requirements are further reduced.

While the instant invention has been described in the context of adding color to a motion picture which was initially filmed in black and white, the inventive system is also useful to replace, modify or enhance the color aspects of an existing color film. This may be desired if the film has deteriorated due to age, exposure to light or other elements, or has been damaged by water, fingerprints or other causes. It may also be useful to correct film scenes which have been shot under adverse conditions such as poor lighting or exposure, or which have been improperly processed or handled. Color may also be adjusted to correct for inconsistencies between scenes. Finally, the color film may be modified for purposes of "special effects" or any other desired technical or aesthetic reasons.

Such color alteration or enhancement can be carried out in three steps. The first is the extraction of the black & white and color portions of the information from the original full color film or video. The second is the modification or regeneration of the color portion (and in some cases the black & white portion as well) by techniques already described or those described below. The third is the recombination of the black & white and color portions into a full color product as already described.

As an example, Figure 3A shows a system for applying this method to color film 340, where the extraction of the black and white portion is accomplished by printing a high definition black and white film 300 from the original color film 340. The original color film is also scanned by an image digitizer 336 which measures the film at each pixel or dot and converts this to digital numeric information which is fed to computer 310. The computer can then extract the "color-only" information from the digitized full color information.

It should be noted that, since it is the full color information that is input to the computer, the black and white signal could also be extracted by the computer. This would be particularly useful for cases where the black & white portion of the image was also damaged or deficient in some way and would allow for automated repair of both the color and black & white aspects of the image. However it should be understood that, since the eye is less sensitive to color information, the color information can be digitized, stored and/or processed at lower density than the black & white information.
The color information can be handled at lower density using any combination of three methods. The first method is dealing with the color information at lower geometric resolution, i.e. fewer pixels. The second is by limiting the number of possible colors or gradations of colors, thus requiring fewer bits to specify the color at each pixel, i.e. lower color choice resolution. The third is by dealing with unique color information for only some frames, i.e. lower temporal resolution. While the above techniques can provide considerable savings of storage and processing resources, the invention is not limited to cases where they are applied. It should also be noted that these same techniques can be applied to the black & white portion of the image as well as the color; but, since the eye is more sensitive to black & white information, their use may be more severely limited by anomalies of perception.
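A compact, purely illustrative way to picture the three density reductions is to apply them directly to a color-difference array; the subsampling factors, the number of color levels and the decimation method below are arbitrary choices, not details from the patent.

    import numpy as np

    def reduce_color_density(chroma, spatial=4, levels=32, temporal=10):
        """Apply the three density reductions to a chroma sequence.

        chroma -- (frames, H, W, 2) float array of color-difference
                  values in the range [-1, 1].
        Returns a much smaller array: every `temporal`-th frame, every
        `spatial`-th pixel, quantized to `levels` values per channel.
        """
        # 3. temporal: unique color information for only some frames
        kept = chroma[::temporal]
        # 1. geometric: fewer pixels (simple decimation; averaging works too)
        kept = kept[:, ::spatial, ::spatial, :]
        # 2. color choice: fewer gradations, hence fewer bits per value
        step = 2.0 / levels
        return np.round(kept / step) * step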

In the original film 340, the black & white information is at high density and, if it is to be extracted for processing and regeneration by the computer without losing detail, the original film must be scanned at high density. If the black & white information does not require regeneration it can be preserved at high density on film strip 300. This allows film 340 to be scanned in at lower density, since the black & white portion of this information will eventually be discarded. (However, this low density black & white information may be used to determine how to process the color signal. For example, pattern recognition and image analysis of shadows, highlights and texture may be used to classify objects in or areas of the overall image. Once classified, these distinct areas or objects may have their respective color components processed in different ways. A more straightforward algorithm would be to search for particularly "bright" pixels in the black & white information; these would correspond to "highlighted" areas. If the color aspect of these pixels were made more blue, for example, it would appear that the light source that caused the highlights was more blue.) In this way, color information can be scanned, processed and stored at a lower density without causing degradation of the black & white information. When the high density black and white information is combined with low density color information the result is an apparently high density color image. The same principles apply to the practice of the technique where the original material is on video. In fact, with composite color television systems the practice may be simplified since the color signals are usually generated at lower resolution than the black & white signal.
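The "bright pixel" highlight-tinting idea mentioned in the parenthetical above might be sketched as follows; the threshold, the size of the blue shift and the I/Q representation are assumptions made for the example.

    import numpy as np

    def tint_highlights_blue(luma, chroma, threshold=0.85, shift=0.1):
        """Shift the color of highlighted areas toward blue.

        luma   -- (H, W) black & white values in [0, 1]
        chroma -- (H, W, 2) I/Q color-difference planes (they may be
                  lower resolution in practice; same size here for
                  simplicity)
        Returns a modified copy of `chroma`.
        """
        out = chroma.copy()
        bright = luma > threshold          # "highlighted" pixels
        # In I/Q terms, moving toward blue roughly means lowering I and
        # raising Q; the exact direction and amount are assumptions here.
        out[bright, 0] -= shift
        out[bright, 1] += shift
        return out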

The second step is to process the separated color signal. A black & white signal can be similarly processed by the computer if it has been scanned and extracted, but the discussion here will focus on the color signal. Returning to Figure 3A, once the color information (and optionally the black & white information also) has been input to computer 310, it may be processed in any combination of several ways.

Firstly, image processing algorithms may be applied uniformly over a frame for every pixel. For example, it may be known how particular film dyes deteriorate over time, or measurement of the faded color of known objects in the image could yield this information. Rules for how to restore various colors to their original values would then be developed and applied to the color of each pixel. This technique would also be useful for scenes which were shot, handled or processed adversely, or to match the look between scenes. Since the digitized images are in the computer, the rules for color modification may be developed by the computer from programmed analysis of this information. In the case where it is desired to match the look of several scenes, scanned information from one scene may be analyzed to determine the modifications required for another scene.
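A uniform per-pixel restoration rule of the kind described might be sketched as a per-channel gain derived from a single object whose original color is known; the gain form and the reference-object idea as coded here are illustrative assumptions rather than the patent's own algorithm.

    import numpy as np

    def derive_rule(faded_ref, true_ref):
        """Per-channel gains from one object whose original color is known."""
        return np.asarray(true_ref) / np.asarray(faded_ref)

    def restore_frame(frame, gains):
        """Apply the same rule uniformly to every pixel of an (H, W, 3) frame."""
        return np.clip(frame * gains, 0.0, 1.0)

    # Example: a dye fade that halved the red channel is undone everywhere.
    gains = derive_rule(faded_ref=[0.4, 0.8, 0.8], true_ref=[0.8, 0.8, 0.8])
    # restored = restore_frame(scanned_frame, gains)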

Secondly, the computer may be programmed to process the image non-uniformly over the frame. Here the rules will not only take into account the value scanned in for a pixel but its position in the frame and other information derived from analyzing the image and from knowledge of how the image was created or damaged. Non-uniform processing rules may also be developed to describe "special effects" or other desired results. For example, a light leak in a camera may have caused one side of the image to be "washed out". An algorithm could be developed indicating that the pixels on the "washed out" side be made more saturated in color, with a gradual lessening of the effect toward the other side. As another example, we may wish to create the "special effect" of a light source off to the right side. A first step would be to use an algorithm to detect object edges based on analysis of differences in the color and black & white values of groups of pixels. As a second step the right side of all objects would then be colored more like the color of the light source and the left side less so. If a separate black & white signal were also in the computer, the right side of the black & white objects could be lightened and the left side darkened to further enhance the effect.
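A minimal sketch of such position-dependent processing, assuming the "washed out" side is the right edge of an RGB frame and that a linear ramp of extra saturation is an acceptable correction; the ramp shape and strength are arbitrary choices made for illustration.

    import numpy as np

    def fix_washed_out_side(frame, strength=0.6):
        """Boost color saturation more strongly on the right side of the
        frame, tapering linearly to no change at the left edge.

        frame -- (H, W, 3) RGB array in [0, 1]
        """
        h, w, _ = frame.shape
        # Position-dependent weight: 0 at the left edge, `strength` at the right.
        ramp = np.linspace(0.0, strength, w)[None, :, None]
        # Saturation boost: push each pixel away from its own gray value.
        gray = frame.mean(axis=2, keepdims=True)
        return np.clip(gray + (frame - gray) * (1.0 + ramp), 0.0, 1.0)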
Non-uniform processing rules may be derived from analysis of information components of images other than the one being modified. For example, the color values of the pixels of the color component of one image may be used to modify the color pixel values of another image to be processed. Assume the first component is from an image of a person's face against a "blue screen" background. A second color component might be modified as follows. Wherever the first color component had pixels of "blue screen" blue, the second component would remain unmodified; elsewhere the pixels of the second component would be adjusted towards that of the first component. This technique could be used to produce the effects of adding the reflection of someone's face to a shop window, of a ghost or spirit, or to add flame or explosion to an image.
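The blue-screen rule above can be sketched directly: where the first color component is close to the key blue, the second is left untouched, and elsewhere it is pulled toward the first. The key color, tolerance and mixing amount below are assumed values, not parameters from the patent.

    import numpy as np

    def keyed_adjust(first, second, key=(0.0, 0.0, 1.0), tol=0.25, amount=0.5):
        """Adjust `second` toward `first` except where `first` is key blue.

        first, second -- (H, W, 3) RGB color components
        """
        # Pixels of `first` that are (approximately) blue-screen blue.
        is_key = np.linalg.norm(first - np.asarray(key), axis=2) < tol
        mix = np.where(is_key[..., None], 0.0, amount)
        return second * (1.0 - mix) + first * mix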

Thirdly, color modifications may be effected by operator intervention using the methods described earlier for adding color to black & white film. For example, the tablet 306 and pen 308 could be used to outline the face of a character in every frame, or only in some frames with the face outline being interpolated between frames. All of the pixels within this outlined area could then be made redder, causing the appearance of a "flush", denoting embarrassment of the character.

The technique of interpolating processes between frames is not limited to cases of operator input. It may be applied to any stage of an image generation or processing technique where an input is not specified for each and every frame or where unique output is not produced for each and every frame. For example, cross-dissolving may be classified as an interpolative technique.
The third step in the process is to recombine the color and black & white signals. This can be done in computer 310 if both are present and then put on film 324 by digital film printer 318. If only the color signal is in the computer, it is put on a color overlay film 320 by digital film printer 318 and combined with the high density black & white film 300 in optical printer 322.

Figure 4A shows the color modification technique as applied to a full color video original 428 or to a color film 426 to color video 424 transfer of material. Composite color video most often consists of electronically extractable black & white and one or more color signals. A signal decoder 422 can take in composite color and put out a black & white ("luminance" or "Y") signal and color signals ("R-Y" and "G-Y", "I" and "Q", or "Hue" and "Chroma" are some of the forms these signals take). The color signals are generally at lower bandwidth or resolution than the black & white signal. (An alternative system sometimes used from color cameras is for high bandwidth red, green and blue signals [RGB] to be available, from which the high resolution black & white "Y" and low resolution color "I" and "Q" signals are developed.) In all cases the color and optionally the black & white signals are digitized by the analog-to-digital converters 420 and fed to computer 414. As an alternative, the entire color signal may be digitized and the extraction of black & white and color signals done computationally by the computer 414.
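A sketch of the RGB-to-Y/I/Q separation mentioned in the parenthetical, with the color-difference planes produced at lower geometric resolution than the luminance; the subsampling factor and the box averaging are choices made for the example rather than details from the patent.

    import numpy as np

    def rgb_to_y_iq(rgb, chroma_factor=4):
        """Split an RGB frame into full-resolution Y and low-resolution I, Q.

        rgb -- (H, W, 3) array with H and W divisible by chroma_factor
        Returns (y, iq_lo) where y is (H, W) and iq_lo is
        (H/chroma_factor, W/chroma_factor, 2).
        """
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        # NTSC-style luminance and color-difference signals.
        y = 0.299 * r + 0.587 * g + 0.114 * b
        i = 0.596 * r - 0.275 * g - 0.321 * b
        q = 0.212 * r - 0.523 * g + 0.311 * b
        iq = np.stack([i, q], axis=-1)

        # Reduce the chroma to lower geometric resolution by box averaging.
        h, w, _ = iq.shape
        f = chroma_factor
        iq_lo = iq.reshape(h // f, f, w // f, f, 2).mean(axis=(1, 3))
        return y, iq_lo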
The second step of processing the signals is largely the same as described above for material originating on film. One point of difference is that, since the black & white and color separation may be done before the signals are input to the computer, the black & white component may bypass the computer completely. However it may be of use to digitize and input this signal, even at lower resolution, for use by the computer in determining how to modify the color signal. The black & white information may then be discarded and replaced by the original high resolution black & white video signal.

The third step of combining the black & white information with the modified color information may be done computationally by the computer if both are present. The composite signal is then sent to a digital film printer 418 or recorded on video tape recorder 416. Alternately, the new color signal may be combined with the original black & white video signal in a color encoder 408 and then recorded on video tape recorder 416 or film printer 418.
As has been noted throughout, the color information may exist at lower density than the black & white information in any combination of several ways, namely geometric, temporal or color choice resolution. When combined with a higher density black & white information component, the composite image will be perceived as high density full color information, because the eye is less sensitive to the color detail than to black & white detail. However, low density information is sometimes perceived as "jagged", "chunky", "contoured", or to "jump" or "flicker" in some manner. Image processing or signal filtering techniques can be applied to lessen these effects and have been referred to generally as blurring the signal. This filtering or anti-aliasing can take several forms and can happen at several different stages in the system. For the "jaggedness" caused by low geometric resolution, processing of the color signal may be done while it is still in the computer. This may consist of a simple spatial low pass digital filter or more sophisticated computer algorithms. It should also be noted that while the color information may be scanned, stored and processed at low density, after it is optionally filtered by the computer it may be output as a higher resolution (but not really higher information content) signal before being recorded. The color component may also be filtered after it leaves the computer, for example, in the color encoder 208 or 408 in Figures 4 and 4A, or by defocusing or placing a diffusion filter in the optical train of optical printer 122 or 322 in Figures 3 and 3A. Lastly, the inherent properties of film and video media or of the recording systems themselves may cause filtering. The anomalies caused by low color choice resolution can be lessened by similar methods.
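The "simple spatial low pass digital filter" could be as little as a separable box blur applied to each color plane while it is still in the computer; the kernel size below is arbitrary and the routine is only a sketch, not the patent's own filter.

    import numpy as np

    def box_blur(plane, radius=2):
        """Separable box blur of a single 2-D color plane (spatial low-pass)."""
        size = 2 * radius + 1
        kernel = np.ones(size) / size
        # Filter rows, then columns; edges are padded by reflection.
        padded = np.pad(plane, radius, mode="reflect")
        rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"),
                                   1, padded)
        cols = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"),
                                   0, rows)
        return cols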

For anomalies caused by low temporal density, the filtering methods happen over time rather than over space. The "cross-dissolving" between unique frames of color information that was described earlier is an example of this. It can be carried out by the computer or done as a post-computational process in the film or video domain.

While an illustrative embodiment has been described, it is, of course, understood that various modifications will be obvious to those with ordinary skill in the art. For example, the colorization and other processes can be applied to images and image components stored in the video and digital computer domains as well as to those on film. Also, a variety of processes have been described, not all of which need to be applied in all versions of the invention. Such modifications are within the spirit and scope of the invention, which is limited and defined only by the appended claims.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

Title Date
Forecasted Issue Date 1994-09-27
(22) Filed 1984-05-08
(45) Issued 1994-09-27
Deemed Expired 2002-09-27

Abandonment History

There is no abandonment history.

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $0.00 1984-05-08
Maintenance Fee - Patent - Old Act 2 1996-09-27 $50.00 1996-09-25
Maintenance Fee - Patent - Old Act 3 1997-09-29 $50.00 1997-09-26
Maintenance Fee - Patent - Old Act 4 1998-09-28 $50.00 1998-09-24
Maintenance Fee - Patent - Old Act 5 1999-09-27 $75.00 1999-09-21
Maintenance Fee - Patent - Old Act 6 2000-09-27 $75.00 2000-09-27
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
GESHWIND, DAVID M.
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

List of published and non-published patent-specific documents on the CPD.


Document Description    Date (yyyy-mm-dd)    Number of pages    Size of Image (KB)
Prosecution Correspondence 1984-07-04 1 20
Examiner Requisition 1985-12-05 1 36
Prosecution Correspondence 1986-02-24 2 53
Examiner Requisition 1988-03-25 3 104
Prosecution Correspondence 1988-06-27 3 91
Examiner Requisition 1989-03-17 2 58
Prosecution Correspondence 1989-06-19 2 38
Examiner Requisition 1993-06-02 2 73
Prosecution Correspondence 1993-12-01 1 18
PCT Correspondence 1994-06-28 1 26
Representative Drawing 2001-12-06 1 11
Drawings 1995-09-02 5 353
Claims 1995-09-02 8 708
Abstract 1995-09-02 1 65
Cover Page 1995-09-02 1 93
Description 1995-09-02 18 2,137
Fees 1996-09-25 1 70