Patent 2192439 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2192439
(54) English Title: SUBTITLING VIDEO DATA
(54) French Title: DONNEES DE SOUS-TITRAGE VIDEO
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04N 5/278 (2006.01)
  • G06T 9/00 (2006.01)
  • G09G 5/22 (2006.01)
  • H03M 7/46 (2006.01)
  • H04N 1/419 (2006.01)
  • H04N 5/445 (2006.01)
  • H04N 7/087 (2006.01)
  • H04N 7/088 (2006.01)
(72) Inventors :
  • ATKINS, DAVID JOHN (United Kingdom)
(73) Owners :
  • SCREEN SUBTITLING SYSTEMS LTD. (United Kingdom)
(71) Applicants :
  • SCREEN SUBTITLING SYSTEMS LTD. (United Kingdom)
(74) Agent: CALDWELL, ROSEANN B.
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 1995-06-09
(87) Open to Public Inspection: 1995-12-14
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/GB1995/001353
(87) International Publication Number: WO1995/034165
(85) National Entry: 1996-12-09

(30) Application Priority Data:
Application No. Country/Territory Date
9411615.9 United Kingdom 1994-06-09

Abstracts

English Abstract




Subtitling data is generated for association with video signals. The
subtitling data may be combined with the video signals, by placing it in
vertical blanking periods for example, or it may be supplied over an
associated data channel, particularly when the video signal has undergone
digital compression. A representation of displayable characters is generated
in the form of a pixel array and output codes are generated representing
compressed lines of said array. Run-length encoding is performed by
recursively addressing a look-up table and shifting new input data to provide
addresses to said look-up table as required. The output codes are synchronised
to the video frames for transmission, via terrestrial broadcast, cable or
satellite, or for storage.


French Abstract

Des données de sous-titrage sont générées en association avec des signaux vidéo. On peut combiner les données de sous-titrage avec des signaux vidéo en les plaçant dans des périodes d'occultation verticales par exemple, ou les émettre via une voie de données associée, notamment lorsque le signal vidéo a subi une compression numérique. Une représentation des caractères affichables est générée sous la forme d'un ensemble de pixels, ainsi que des codes de sortie représentant les lignes comprimées de cet ensemble. On effectue un codage RLL en adressant de façon récurrente une table de consultation et en décalant les nouvelles données d'entrée pour fournir les adresses à ladite table de consultation. Les codes de sortie sont synchronisés avec les trames vidéo utilisées pour le stockage ou la transmission (par voie terrestre, par câble ou satellite).

Claims

Note: Claims are shown in the official language in which they were submitted.



CLAIMS

1. Apparatus for generating subtitling data for association with video
signals, comprising
character pixel generating means arranged to generate a representation
of displayable characters as a pixel array;
compression means arranged to produce output codes representing lines
of said pixel array in compressed form; and
synchronising means for associating said output codes with video frames.

2. Apparatus according to claim 1, wherein said subtitling data is
associated with the video signals by being added to vertical blanking periods of a
conventional television broadcast signal.

3. Apparatus according to claim 1, wherein said subtitling data is
associated with video signals by being added to an associated data transmission
channel.

4. Apparatus according to claim 3, wherein said associated data
transmission channel is associated with digitally compressed video frames.

5. Apparatus according to claim 1, wherein said compression means
is arranged to perform a process of run-length encoding upon lines of said pixel array.

6. Apparatus according to claim 5, wherein said run-length encoding
is performed by addressing a look-up table.



7. Apparatus according to claim 6, including means for generating an
output code and means for addressing a
plurality of look-up table addresses before generating said output code.

8. Apparatus according to any of claims 1 to 7, wherein said subtitling
data is stored as character codes, said character codes are read at video rate in
response to video time code, and said read character codes are supplied to an
encoder at video rate.

9. Apparatus according to claim 8, wherein said encoder converts
character codes to run-length codes at video rate, to produce run-length codes
synchronised to said time code.

10. Apparatus for encoding image data representing picture elements,
comprising
a look-up table configured to produce table data in response to input
address data;
means for supplying contiguous picture element data as input address
data to said look-up table; and
analysing means for analysing said table data read from said look-up
table, wherein
said analysing means analyses a first table data and generates run length
output data or,
in response to said analysis, on detecting table data representing input
data having runs of similar data extending beyond the input address, said
analysing means requests new address data from said input data, and said
look-up table produces new table data in response to said new address data,
said new table data comprising a code representing a string of input data larger than a said input address.



11. Apparatus according to claim 10 wherein said analysing means
determines the number of input pixels
encoded by an output code, such that elements previously used as addressing
elements are not encoded by said code and additional contiguous picture
elements are read to provide a new address to said look-up table.

12. Apparatus for adding subtitling data to a video image, wherein said
subtitling data is received with associated video signals, said subtitling data
comprising output codes representing lines of said pixel array in compressed
form, said apparatus comprising:
means for decompressing associated subtitling data at video rate;
means for assembling said subtitling data as pixel values; and
means for combining said pixel values with associated video frames at
video rate.

13. Apparatus according to claim 12, wherein said subtitling data is
received with a broadcast television signal in the vertical blanking periods of said
signal.

14. Apparatus according to claim 12, wherein said subtitling data is
received with a digitally encrypted signal in an associated data channel.

15. Apparatus according to claim 14, wherein said digitally encrypted
video signal is compressed to reduce video image bandwidth.

16. Apparatus according to claim 15, wherein said video data is read
from a local storage medium with said associated subtitling.


17. Apparatus according to claim 15, wherein said video data is received from
a cable television system or a satellite system with said associated subtitling.

18. A method of generating subtitling data for association with video
signals, comprising
generating a representation of displayable characters as a pixel array;
producing output codes representing lines of said pixel array in
compressed form; and
synchronising said output codes with video frames.

19. A method of encoding character strings represented as picture
elements, comprising steps of
supplying contiguous picture element data as an input address to a look-up
table; and
analysing said table data read from said look-up table, wherein first table
data may be supplied to an output or, in response to said analysis, on detecting
table data representing input data having runs of similar data extending beyond
the input address, new address data is selected to produce a new table data
comprising an output code representing a plurality of input picture elements
larger than a said input address.

20. A method of adding subtitling data to a video image, wherein said
subtitling data is received with associated video signals, said subtitling data
comprising output codes representing lines of said pixel array in compressed
form, the method comprising:
decompressing associated subtitling data at video rate;
assembling said subtitling data as pixel values; and
combining said pixel values with associated video frames at video rate.

Description

Note: Descriptions are shown in the official language in which they were submitted.





SUBTITLING VIDEO DATA

The present invention relates to adding subtitling data to video signals
and to processing video signals having subtitling data added thereto.

Processes for the addition of subtitles to video and cinematographic
film are known. Traditionally, the video material was processed by a
translating and subtitling department, resulting in new video material being
produced with subtitles forming a permanent part of the video image. In the
art, the subtitling data is referred to as being "burnt" into the video pictures.

It is also known to include subtitling data as a teletext transmitted
page. Under the teletext system, character data is supplied during vertical
blanking intervals of the video transmission. Each video frame includes
blanking intervals at the top and bottom of each image, that may be
considered as transmission lines which do not contain video information. This
in turn may be considered as additionally available bandwidth, allowing data
to be transmitted therein, such as the aforementioned teletext data or Nicam
stereo audio data etc.

In conventional teletext systems, character codes are transmitted during
the blanking periods and these codes are translated into displayable characters
at the receiving station. Known teletext systems therefore have restricted
character sets and the definition of these character sets is severely restricted.
Thus, it is possible using these character sets to generate characters in most
roman based languages, including English and most European languages, but
this character set does not facilitate the transmission of characters in many
other languages. Thus, for example, it is extremely difficult to convey
Chinese, Korean or Japanese characters using teletext techniques; therefore
subtitles for these languages, using conventional methods, are burnt into the
video images.

A procedure for adding subtitles to video images is described in
European Patent Publication No. 0,400,990. The system provides an
environment in which a user may generate titles (not necessarily translated
subtitles) whereafter this data may be added to video frames processed by a
video tape recorder. A more sophisticated system for adding subtitles is
disclosed in United Kingdom Patent Publication No. 2,170,371. In this
system, non-Roman characters may be combined with video frames by storing
the character image data on an independent data carrying medium, such as an
optical disc. In this way, only time codes are stored on the actual video data
and in operation these time codes are used to access full bandwidth character
information stored independently on the optical disc medium. Clearly, a
severe disadvantage of this system is that sophisticated additional hardware
is required at a receiving station.

The above techniques represent subtitles as an array of pixels and
within this array substantially any configuration of lettering is possible.
However, the severe disadvantage of such an approach is that the subtitling
pixels require a substantial amount of bandwidth for storage or transmission.
In particular, insufficient bandwidth is provided in video blanking periods for
subtitling of this type to be transmitted in a form which would allow it to be
selectively combined with the video images at the receiver.

According to a first aspect of the present invention, there is provided
apparatus for generating subtitling data for association with video signals,
comprising character pixel generating means arranged to generate a
representation of displayable characters as a pixel array; compression means
arranged to produce output codes representing lines of said pixel array in
compressed form; and synchronising means arranged to synchronise said
output codes with said output video frames.

Thus, an advantage of the present invention is that characters are
generated as an array of pixels, thereby allowing ideographic characters to be
used as subtitles without reference to a small set of character codes. The
compression means is then arranged to produce output codes representing
lines of the pixel array in compressed form, thereby reducing the bandwidth
requirement for the subtitling data. In this way, the subtitling data in
compressed form may be associated with the video signals (for example by
placing said compressed data within the video blanking periods) without
requiring unrealistic levels of bandwidth.

In a preferred embodiment the subtitling data is associated with the
video signals by being added to vertical blanking periods of a conventional
television broadcast signal. Alternatively, the subtitling data is associated
with the video signals by being added to an associated data transmission
channel and this data transmission channel may be associated with the video
frames conveyed in compressed form, such as in accordance with MPEG 2
recommendations.

According to a second aspect of the present invention, there is provided
apparatus for encoding image data representing picture elements, comprising a
look-up table configured to produce table data in response to input address
data; means for supplying contiguous picture element data as input address
data to said look-up table; and analysing means for analysing said table data
read from said look-up table, wherein said analysing means analyses a first
table data and generates run-length output data or, in response to said
analysis, said analysing means requests new address data, from said input
data, so as to produce new table data, on detecting table data representing
input data having runs of similar data extending beyond the input address.

Preferably, the analysing means determines the number of input pixels
encoded by an output code, such that elements previously used as addressing
elements are not encoded by said code and additional contiguous picture
elements are read to provide a new address to said look-up table.

According to a third aspect of the present invention, there is provided
apparatus for adding subtitling data to a video image, wherein said subtitling
data is received with associated video signals, comprising means for
decompressing associated subtitling data at video rate; means for assembling
said subtitling data as pixel values; and means for combining said pixel values
with associated video frames at video rate.

According to a fourth aspect of the present invention, there is provided
a method of generating subtitling data for association with video signals,
comprising generating a representation of displayable characters as a pixel
array; producing output codes representing lines of said pixel array in
compressed form; and synchronising said output codes with video frames.

Preferably, compression is effected by a process of run-length encoding
upon lines of the pixel array. In a preferred embodiment the run-length
encoding is performed by addressing a look-up table and, preferably, a
plurality of look-up table addresses are addressed before an output code
is generated.



The invention will now be described by way of example only, with
reference to the accompanying drawings, in which:

Figure 1 shows a video monitor, displaying a video image with
subtitles added thereto;

Figure 2 shows a system for generating subtitles, including an off-line
assembly station, a subtitle synchroniser and a subtitle encoder;

Figure 3 details the assembly station shown in Figure 2;

Figure 4 details the subtitle synchroniser shown in Figure 2;

Figure 5 details the subtitle encoder shown in Figure 2;

Figure 6 and Figure 7 illustrate the operation of the subtitle encoder
shown in Figure 5;

Figure 8 shows a receiving station for receiving subtitles generated by
the system shown in Figure 2;

Figure 9 illustrates the operation of the receiving system shown in
Figure 8.

A video monitor 15 is shown in Figure 1, that may form part of a
conventional television receiver. The video monitor displays a conventional
television picture 16, over which subtitles 17 have been overlaid, by a process
known as keying.



The subtitles are placed within a notional horizontal band towards the
bottom half of the picture, in response to signals received by the receiver
itself or by a separate decoding apparatus. Thus, at the receiving station, it
is possible to selectively decide whether the subtitles are to be combined with
the displayed video image. Furthermore, in a more sophisticated embodiment,
it is possible to select from a plurality of available languages of subtitles, each
of which may be keyed onto the video image in synchronism with appropriate
frames of displayed video.

Video data keyed within region 17 is derived from pixel data
representing a region 50 lines high and 720 pixel locations across. Subject
to bandwidth availability, it is possible to transmit any image within this
region, such that the generation of keyable subtitling characters is not
restricted to characters available from a character generator provided at the
reception end. Thus, with a region of this size, it is possible to display
characters from any desired character set, such as Chinese, Japanese or
Korean etc.

Subtitling characters tend to be displayed in solid colours, therefore
with a definition of 720 pixel positions over 50 lines, it is possible to supply
single bit data for each pixel location. Thus, subtitling data of this type is
usually referred to as a bit map. In alternative embodiments, subject to
available bandwidth, it is possible to transmit keyable image data having
multi-bit values, for example to indicate image data having a plurality of
colours or brightnesses. Filtering techniques are also employed to "soften"
the edges of the key.
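The single-bit bit map described above can be pictured with a short sketch. This is an illustration only: the patent does not specify a byte layout, so the packing order and function name here are assumptions.

```python
# Illustrative sketch (not from the patent): packing a 720 x 50 subtitle
# region at one bit per pixel, as in the bit map representation described.
WIDTH, HEIGHT = 720, 50          # region dimensions given in the description

def pack_bitmap(pixels):
    """Pack rows of 0/1 pixel values into bytes, 8 pixels per byte."""
    packed = bytearray()
    for row in pixels:
        for i in range(0, WIDTH, 8):
            byte = 0
            for bit in row[i:i + 8]:
                byte = (byte << 1) | bit
            packed.append(byte)
    return bytes(packed)

blank = [[0] * WIDTH for _ in range(HEIGHT)]
print(len(pack_bitmap(blank)))   # 720/8 = 90 bytes per line, 4500 in total
```

At one bit per pixel the whole region costs 4500 bytes per subtitle, which motivates the compression discussed later in the description.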

An overview of a system for generating subtitles is shown in Figure
2. At an assembly station 201 an operator reviews video sequences and
manually enters subtitling character codes. The assembly station 201 is of a
conventional configuration and is arranged to produce files of data consisting
of video time codes with associated strings of subtitling text. For each video
sequence a plurality of such files may be produced when subtitling is required
in a plurality of languages. Thus, each file would consist of time code
listings with an associated text string of the appropriate language. These files
are written to a removable data carrying medium, such as a 3½" floppy disc
202. Thus, the assembling operation is effectively performed "off-line" to
produce files which relate subtitling strings to specific frames within video
sequences, identified by time codes.

The system shown in Figure 2 also includes a subtitle synchroniser
203, a subtitle encoder 204, a video modulation device 205 for conventional
broadcasts, a video encryption device 206 for digital distribution and a video
recorder 207 for recording subtitles as video pictures. The synchroniser 203,
encoder 204 and modulation devices 205 and 206 are arranged to operate in
real time, allowing subtitling characters to be associated with video
information while said video information is being transmitted. In this way,
decisions relating to the subtitling process do not need to be made until the
actual time for transmission occurs, thereby enhancing system flexibility and
eliminating the need for full bandwidth storage of video information with its
associated subtitling data.

The assembly station 201 is detailed in Figure 3 and, as previously
stated, represents a station of substantially conventional form. Input video
source material from a video tape recorder 301 or similar device supplies
video signals to a processor 302. In addition, the video tape recorder 301
also supplies time code to the processor 302, such that said processor may
identify specific frames of the video sequence with reference to said time
code.

A manually operable keyboard 303 allows an operator to manually
select characters for inclusion in video sequences as subtitles. As character
subtitles are being generated, they may be displayed via a visual display unit
304 and video sequences, with or without subtitles, may be displayed on a
video monitor 305.

The association of specific subtitling strings with video frames is
recorded by appending these strings to identifications of time code. Thus,
data of this type, mapping time code to character strings, is written to files
on a removable floppy disc 306, using disc drive 307.

The subtitle synchroniser 203 and the subtitling encoder 204 operate
in real time allowing the subtitling data to be associated with the video data
as the video data is being transmitted or recorded. The synchroniser 203 is
detailed in Figure 4, shown connected to an automated video feed device 401.
The video feed device 401 is loaded with a plurality of source video tapes,
allowing video signals to be produced (video out) for broadcast purposes from
a collection of pre-recorded tapes. Thus, it is possible for several hours,
possibly several days, of video material to be generated by the video feed
device 401 with substantially little manual intervention.

The automated feed, such as a Sony LMS, supplies an identification
of the current video sequence selection to the synchroniser 203. The video
source material to be played is selected and, while playing, video time code
is supplied to the synchroniser 203.



The synchroniser 203 receives a disc 202 of subtitling files and these
files are read by floppy disc drive 402 under the control of a microprocessor
based system 403. The microprocessor based system uses a conventional
central processing unit, such as an Intel 80486 DX2 device, with associated
program memory. Data read via the floppy disc drive 402 is assembled by
the microprocessor system 403 and written to a local hard storage device 404.
As this process occurs, individual files, related to the same video sequence,
are combined such that, for each time code entry, each of the available
subtitles, of differing languages, are combined into the same file. Thus, in
this way, the number of files present on the hard disc 404 is reduced and
selection of a required subtitling language is made by an appropriate index
into the data read from the disc 404.

The synchroniser 203 also includes a Graphics Signal Processor (GSP)
based processing environment 405, configured around a Texas 34010 device.
On start up, instruction codes for the GSP are downloaded, from the hard
storage device 404, via the microprocessor subsystem 403 and its associated
system bus 406. The GSP 405 is configured to synchronise the subtitling
files to the incoming time code. The incoming time code, read by a time
code reader 407, is supplied to the GSP subsystem 405 via the microprocessor
subsystem 403 and its system bus 406. Time code is supplied from the reader
407 to the microprocessor subsystem 403 under interrupt control, whereafter
said processor 403 directs the time code information to the GSP subsystem
405. The time code received by the time code reader 407 is synchronised to
the video output signal and the GSP subsystem 405 is arranged to maintain
this synchronism such that, in response to time code coming in, its associated
subtitling text is supplied as a synchronised output.
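The synchroniser's role can be pictured, in greatly simplified form, as a lookup keyed by incoming time code. The file layout, time-code strings and function names below are invented for illustration; the patent describes a GSP-based hardware implementation, not this sketch.

```python
# Hypothetical sketch of the synchroniser's behaviour: given incoming time
# code, return the subtitle text authored off-line for that frame, in the
# selected language. Keys and structure here are assumptions.
subtitle_file = {
    "01:00:10:00": {"en": "Hello", "fr": "Bonjour"},
    "01:00:14:12": {"en": "Goodbye", "fr": "Au revoir"},
}

def on_time_code(time_code, language):
    """Return the subtitle string associated with this time code, if any."""
    entry = subtitle_file.get(time_code)
    return entry[language] if entry else None

print(on_time_code("01:00:10:00", "fr"))  # Bonjour
```

Merging all languages into one record per time code mirrors the description above, where per-language files are combined so that language selection becomes an index into a single entry.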



The synchronised subtitling text, assembled by the GSP subsystem 405,
is supplied to a parallel to serial conversion circuit 408 which, given the
processing power of the GSP subsystem 405, is capable of generating eight
serial outputs, possibly conveying subtitles in different languages at video
rate. In the present embodiment, one of these serial outputs of synchronised
subtitling characters is supplied to the subtitling encoder 204 via an output
port 409.

The subtitle encoder 204 is detailed in Figure 5. The encoder is based
upon a Texas 34010 GSP subsystem, in which the GSP 501 communicates
with a hard disc storage device 502, a two megabyte frame buffer 503, an
eight megabyte random access memory device 504 and a serial
communications device 505. The eight megabyte random access memory
device 504 is used for storing working copies of fonts and program-specific
information. Images are constructed within the two megabyte frame buffer
503 which is configured from RAM devices providing a serial access port,
allowing the image to be scanned out in real time in synchronism with a
video source.

The encoder 204 receives character information at video rate in
response to time codes received by the synchroniser 203. These character
strings are supplied to an input serial port 506, which in turn directs them to
the working memory 504 over an internal system bus 507.

Once buffered in the working memory 504, the GSP 501 processes the
information to determine the character data structure. In addition to particular
characters, this will contain other information such as font type and font size.
In response to this information, the processor 501 produces a list of blitting
operations required to generate pixel information, derived from the
information available on hard disc 502, from the input character strings.

Having produced a blit list of this type, the blit list is executed to
produce image pixels that are written to the two megabyte frame buffer 503.
Having written pixels to the frame buffer 503, an assessment is made as to
the area of coverage within the buffer, resulting in information being
generated identifying a bounding rectangle (603) and the position of said
rectangle within the displayable area of the final image screen.

Having supplied a complete subtitle to the frame buffer 503, the
processor 501 scans the image retained within the frame buffer 503 line by
line, to produce compressed run-length codes that are supplied to an output
port 508.

As shown in Figure 6, a subtitle image has been written to the frame
buffer 503 in which the first word, positioned at the top left corner, starts
with the letters "HE". The processor 501 has determined that the whole of
the subtitle starts from the top left corner of the letter "H", therefore it is
necessary to initiate scanning from this position. Arrow 601 represents the
starting position for the first scan line. Scan line 601 will result in
information being produced to the effect that four white pixels are required,
followed by eight black, followed by four white, followed by six black,
followed by eleven white, and so on.

Similarly, at scan line position eight, the starting point of which is
identified by arrow 602, the relevant information consists of sixteen white
pixels, followed by six black, followed by seven white and so on. Thus, it
can be appreciated that for the majority of subtitling characters the run of
pixel data will consist of a predetermined number of white or coloured pixels,
identifying the location of characters, followed by runs of black pixels
representing the spaces.
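The runs described for scan line 601 correspond to a conventional run-length encoding, which might be sketched as follows. This is an illustration only; as the description goes on to explain, the encoder in the patent deliberately avoids this per-pixel approach.

```python
# A minimal per-pixel run-length encoder for one scan line of single-bit
# pixels (illustrative only; the patent's encoder works word-at-a-time).
def run_lengths(line):
    """Return (value, count) pairs for consecutive identical pixels."""
    runs = []
    for pixel in line:
        if runs and runs[-1][0] == pixel:
            runs[-1] = (pixel, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((pixel, 1))              # start a new run
    return runs

# 1 = white, 0 = black; the first runs of scan line 601 in Figure 6
line = [1]*4 + [0]*8 + [1]*4 + [0]*6 + [1]*11
print(run_lengths(line))  # [(1, 4), (0, 8), (1, 4), (0, 6), (1, 11)]
```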

In the majority of situations in which subtitling data is to be associated
with video data and possibly combined with said video data for transmission
over a common carrier, bandwidth is limited; therefore it would not be
possible to provide pixel information for each pixel within the subtitling
region. The data is transmitted under such circumstances by effecting a level
of data compression and, given the nature of the information involved, run-length
encoding has been identified as an attractive option. However, under
normal schemes for performing run-length encoding, it would be necessary
to examine each pixel individually in series, to make a decision as to whether
that pixel value is the same as the previous value, or different from the
previous value, thereby initiating the start of a new run for the scan line being
examined.

In the present environment, all of the processing performed by the
subtitle synchroniser 203 and the subtitle encoder 204 is effected in real time
such that time code received from video source material, being transmitted in
real time, may be used to generate and associate subtitles with said video for
immediate real time transmission. Under these circumstances, the processing
and encoding of the subtitling pixels must also be performed in real time and
this is extremely difficult, given realisable platforms, if it is necessary to
serially consider the nature of each pixel in order to detect the starts and ends
of individual runs.

It is known to encode data strings by examining a plurality of bits,
making up a word, in parallel. Output codes may be generated for each input
code by using the input code as an address or index to a look-up table. Thus,
for example, an eight bit word could be supplied to a look-up table having a
total of two hundred and fifty six addressable locations. Thereafter, assuming
that the input data words have a predictable level of redundancy, some words
occurring more often than others, it is possible to produce smaller output
codes for the regularly occurring words with the longer codes being used for
the less frequently occurring words.

In accordance with known systems, it is possible for the look-up table
to be updated over time, so as to perform an adaptive process in terms of
the mapping of input words to output words. Thus, at the transmitter, a
process may continually examine the input data words to determine which are
occurring more frequently. On the basis of this assessment, it is then possible
for the transmitter to issue an instruction to the receiver to the effect that, at
an appropriate moment, a modification is to be made to the look-up table
coding. Thus, in this way, it is possible for the transmitter to identify a new
optimisation table, issue codes to the receiver to the effect that a modification
is required and thereafter make use of the new optimisation table so as to
enhance overall performance and to take full advantage of the inherent
redundancy in the input data stream.

A first problem with using look-up tables in a subtitling environment
is that transmission paths are often susceptible to relatively large levels of
noise. When noise is present on the line, data transmissions are corrupted and
it is known to introduce levels of redundancy in order to facilitate error
identification and error correction. However, the whole point of using look-up
tables is to reduce data rates and minimise redundancy, therefore it would
be highly undesirable to start introducing new redundancy in order to provide
a level of error correction.

Noise immunity is substantially improved if the code conversion tables
remain fixed and do not attempt to perform adaptive optimisation during
transmission. However, a problem with this approach is that the coding may
not make the best use of the inherent redundancy, resulting in a notionally
higher bandwidth requirement.

A further problem with the look-up table approach is that the coding
process is limited to input strings of a length equal to the look-up table index
address bus. Thus, for an eight bit look-up table, the input string would be
considered in units of eight bits. Clearly, when compressing data derived
from images of the type shown in Figure 6, runs of substantially more than
eight bits may be present, and these large runs could be compressed very
efficiently using run-length encoding techniques. However, as previously
stated, conventional run-length encoding techniques serially examine each
incoming pixel so as to identify transition points. This requires a substantial
processing overhead and cannot be implemented practically for the present
embodiment.


Thus, it can be seen that a first constraint is placed on the transmission
of pixelated subtitles in that, in many environments, the available bandwidth
for transmitting subtitles is severely restricted. As used herein, the subtitles
are referred to as being associated with the video, meaning that the subtitles
are synchronised to video frames, allowing the receiver to overlay or key the
subtitling data at the appropriate points in the transmission. In some
situations, such as traditional broadcast environments, the association involves
combining the subtitling data with the transmitted signal. This may be
achieved by placing the subtitling data in vertical frame blanking periods, in
a similar fashion to that employed by teletext systems, in which character
codes are conveyed in vertical blanking periods. The use of character codes
in teletext systems is very bandwidth-efficient but the amount of bandwidth
for transmitting pixel data is severely restricted and it is not possible to
transmit an image of the type shown in Figure 6 without performing a level
of data compression. Experience has shown that it is necessary to provide a
compression ratio of at least three to one in order to transmit pixelated
subtitles within vertical blanking periods.

Pixelated subtitles may also be associated with the video frames
(i.e. synchronised) for other forms of transmission. For example, under
MPEG 2, separate channels are provided for the transmission of additional
data and these channels may be employed for the transmission of compressed
pixelated subtitle data. Other digital transmission systems are becoming
increasingly available and again data channels will be provided for the
transmission of associated data, such as subtitles, along with other types of
non-synchronised data. In substantially all of these cases, the amount of
bandwidth for the transmission of associated data is restricted and data
compression techniques are required.

If the pixelated subtitling data were being associated to video frames
in non real-time, that is, as an off-line process, the only constraint would be
that of bandwidth and processing power would no longer be a problem.
Situations in which encoding techniques require substantially more processing
power at the transmission end compared to the amount of processing power
at the receiving end are well known. For example, during MPEG encoding,
substantial search algorithms are required in order to encode intermediate
frames by calculating motion vectors. It is the calculation of the vectors
that requires a substantial processing overhead, while at the receiver it is a
simple process of merely making use of the vectors calculated at the
transmitter.

In the present embodiment, the subtitling data is associated with the
video stream in real-time. Therefore, in addition to bandwidth constraints, a
further constraint is made upon the system in terms of processing power.
Thus, known pixel-by-pixel run-length encoding techniques would provide a
good solution to optimising bandwidth, whereas word-by-word look-up table
techniques, examining a plurality of pixels in parallel, would provide a good
solution to reducing the processor overhead.

As previously stated, in the present embodiment both bandwidth and
processing speed are constrained, therefore neither of the above known
techniques provides a suitable solution for compressing the pixel data for
transmission in an associated form in real time.

Operation of processor 501, in order to encode images of the type
shown in Figure 6, may be considered with reference to the functional
illustration shown in Figure 7. A shift register 701, a look-up table 702 and
an output buffer 703 are physically provided as addressable locations within
random access memory device 504. Data is read from frame buffer 503 and
processor 501 effects the functions of an input modifier 704, an output
modifier 705, and analysing logic 706.

The look-up table 702 is established within random access memory 504
on system initialisation. The look-up table values remain constant during
operation and the inverse table, to perform the decompression process, is
retained permanently at decoding stations. The look-up table 702 receives a
nine bit address and is therefore capable of holding a total of five hundred
and twelve table entries. Character strings, represented as picture elements,
will have been assembled in the frame buffer 503. Contiguous data elements
are read from the frame buffer 503 line-by-line and supplied to the shift
register 701. The shift register 701 provides means for supplying contiguous
picture element data, as an input address, to the look-up table 702.

The output from the look-up table, in the form of table data, is
supplied to analysing means in the form of the analysing logic 706,
implemented by processor 501. The analysing logic 706 is arranged to
analyse table data (which may be considered as first table data) that may then
be supplied to the output buffer 703 via the output modifier 705.
Alternatively, the analysing logic 706 may request new address data,
although at this stage no output data will have been produced. New address
data is supplied to the shift register 701 but the input address is modified, by
means of the input modifying circuit 704, in response to address modify data
received from the output modifier 705. Thus, although the addressing of the
look-up table 702, for this new data, is performed in a substantially similar
way to that performed for the initial look-up, the input address has been
modified by input modifying circuit 704, therefore the addressing of the look-
up table 702 represents an identification of output codes representing an input
string larger than the input address. This situation occurs when a long run of
similar characters has been identified, thereby facilitating highly optimised
run-length encoding. On the second iteration it is possible that the run has
continued, therefore again it will be possible for the output modifying circuit
to modify the addressing input and for the analysing logic 706 to request new
data from the frame buffer 503. Thus, new table data will be produced,
possibly suitable for supplying to the output buffer 703, on detecting initial
table data representing input data having runs of similar data extending
beyond the input address.
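
The recursive loop described above can be modelled in software. The sketch below is an illustrative reconstruction, not the patent's implementation: the fixed 512-entry table is stood in for by a dict built once at initialisation, and the analysing logic's feedback path becomes a `pending` run carried into the next word.

```python
def runs_of(bits):
    """Decompose a list of 0/1 pixels into (value, length) runs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            runs.append((b, 1))
    return runs

def build_lut(width=9):
    """Fixed table, built at initialisation: every possible `width`-bit
    word is mapped to its run decomposition."""
    return {w: runs_of([(w >> (width - 1 - i)) & 1 for i in range(width)])
            for w in range(1 << width)}

def encode_line(pixels, width=9):
    """Run-length encode one scan line, `width` pixels at a time.
    A run touching the end of a word is withheld (the recursive code)
    and merged with the leading run of the next word."""
    lut = build_lut(width)
    out, pending, i = [], None, 0
    while i < len(pixels):
        chunk = pixels[i:i + width]
        if len(chunk) == width:
            word = 0
            for b in chunk:
                word = (word << 1) | b
            runs = list(lut[word])        # table look-up, pixels in parallel
        else:
            runs = runs_of(chunk)         # short final word of the line
        if pending is not None:
            if runs[0][0] == pending[0]:  # run straddles the word boundary
                runs[0] = (pending[0], pending[1] + runs[0][1])
            else:
                out.append(pending)
            pending = None
        i += width
        if i < len(pixels):               # more data: last run may continue
            pending = runs.pop()
        out.extend(runs)
    if pending is not None:
        out.append(pending)
    return out
```

Encoding the Figure 6 example of four white pixels, eight black pixels and four white pixels yields three composite runs directly, even though the black run straddles a nine-pixel word boundary.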

The operation of the system in Figure 7 may be considered in further
detail with reference to the illustration shown in Figure 6. Coding is initiated
from the top left corner of the first character, therefore no coding is
performed until a transition occurs from the notionally black background to
the notionally white character edge. As shown in Figure 6, this represents the
coding of four white characters followed by eight black characters. Nine
characters will be written to the shift register 701, consisting of the first four
white characters of line 601 followed by five black characters. The look-up
table is therefore presented with an address consisting of four white characters
followed by five black characters, although optimised run-length encoding
would not produce an output code for the white characters until the full
length of the run had been identified; the run consisting of a total of eight
characters in this example.

In a non-recursive, open loop situation, using conventional look-up
tables, an output code could be produced identifying the situation in which
four white characters have been received followed by five black characters.
Groupings of this type are common in character strings, therefore this
configuration could be allocated a relatively short output code.
However, from a run-length encoding point of view, full optimisation would
not have been achieved. Thus, the look-up table 702, in accordance with the
present embodiment, does not produce an output code representing four white
characters followed by five black characters. Under these circumstances, it
produces a recursive code to the analysing logic 706, informing said analysing
means that the input string consists of four white characters followed by a run
of black characters. With this information known to the analysing logic 706,
the frame buffer 503 is addressed, shown functionally as line 707, resulting
in the frame buffer 503 supplying new contiguous data to the shift register
701.

The previous data, consisting of four white pixels followed by five
black pixels, is replaced and the new input data consists of three black pixels,
followed by four white pixels, followed by two black pixels. However, the
input address to the look-up table 702 will be modified by input modifier
704, effectively providing information to the effect that the three black pixels
are related to the previously considered input and that the previously
considered input has not, as yet, been supplied to the output buffer 703.

The output code from the look-up table 702, supplied to the analysing
circuit 706, informs said analysing circuit that the previously identified group
of five black pixels is completed by a further three black pixels before a
transition from black to white occurs. Furthermore, after this transition
occurs, four white pixels are received before a transition back to black
pixels occurs. Thus, the analysing logic 706 is aware of the transition from
white to black but, at this stage, it is unable to determine how many black
pixels are present. Consequently, an output code is produced identifying a
run of four white pixels, followed by a run of eight black pixels, followed by
a run of four white pixels. Thus, sixteen pixels have been coded, nine
considered as a first table address with the remaining seven being considered
as part of a second table address. The table has been addressed recursively,
so as to produce a composite code for a total of sixteen input pixels. The
composite code for the sixteen pixels is assembled within the output modifier
705, which, on this occasion, allows the output code to be directed to the
output buffer 703. The input modifier 704 is effectively re-set, such that the
next contiguous input pixels will be considered as the start of a new run.
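
The composite-code assembly just described can be followed numerically. The following minimal sketch performs only the boundary merge, using the run values and pixel counts from the Figure 6 walkthrough (the function itself is an illustration, not the patent's circuit):

```python
def merge(first, second):
    """Join two consecutive words' run decompositions, fusing the run
    that straddles the word boundary (the recursion of Figure 7)."""
    if first and second and first[-1][0] == second[0][0]:
        fused = (first[-1][0], first[-1][1] + second[0][1])
        return first[:-1] + [fused] + second[1:]
    return first + second

word1 = [(1, 4), (0, 5)]           # first address: 4 white, 5 black
word2 = [(0, 3), (1, 4), (0, 2)]   # second address: 3 black, 4 white, 2 black
runs = merge(word1, word2)

# The trailing run of two black pixels may itself continue, so it is
# withheld; the first three runs (16 of the 18 pixels read) are emitted.
emitted, deferred = runs[:-1], runs[-1]
```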

Sixteen input pixels have been considered, whereas a total of eighteen
pixels have been read from the frame buffer 503. The shift register 701 is
therefore incremented by seven positions, so that the two remaining pixels,
representing the start of the run of black pixels at transition 611, may be
considered. As the seven previously processed input pixels are clocked out
of the shift register 701, seven new input pixels are read from the frame
buffer 503, thereby making up a new group of nine input pixels.

The nine input pixels provide an input address to the look-up table
702, representing six black pixels followed by three white pixels. Again,
three white pixels extend beyond the input address, therefore coding is
performed recursively in order to determine the run-length of the white pixels.
Nine new pixels are read from the frame buffer 503, representing a run of
eight white pixels followed by one black pixel. Thus, having received two
input words from the frame buffer 503, it is possible for the analysing logic
706 to determine that, from transition 611, six black pixels have been
received followed by a total of eleven white pixels. A code to this effect is
generated and, given that a total of nine plus eight input pixels have been
coded, a further eight pixels are read from the frame buffer 503 and coding
continues from position 613 shown in Figure 6.

This process continues until the end of the scan line. It is not
necessary to consider each pixel individually in order to determine the
position of pixel transitions. The look-up table 702 receives nine pixel values
in parallel and produces codes identifying pixel transitions within the word.
Runs of pixels are encoded, resulting in codes being supplied to the output
buffer 703. The analysing logic 706 is aware of how many pixels have been
coded from the input stream, therefore this logic allows an appropriate
number of contiguous input pixels to be supplied to the shift register 701. In
this way, run-length encoding is optimised so as to reduce the requirement on
transmission bandwidth. Thus, by using the look-up table 702 in a recursive
loop, as illustrated in Figure 7, it is possible to optimise run-length encoding
and thereby optimise transmission and storage bandwidth, while at the same
time substantially increasing processing speed by examining a plurality of input
pixel values in parallel. Furthermore, given this level of optimisation, it is
not necessary to adapt values stored within the look-up table 702, therefore
it is not necessary to transmit details of new look-up table values to receivers,
thereby substantially improving noise immunity.

In alternative embodiments, a plurality of look-up tables, similar to
look-up table 702, may be provided. An input to a first look-up table
produces an output which may be used as an output code. Alternatively, this
output may itself be used as an index to a second level look-up table. Further
intermediate look-up tables may be included and a chain would then be
terminated by a final look-up table which, under all input conditions, is
arranged to produce an output. The provision of a plurality of look-up tables
should further optimise compression. However, this will also increase the
hardware overhead, therefore a compromise must be made in terms of
transmission efficiency and hardware requirements. Similarly, modifications
could be made to the word length, with fewer than nine bits being used or
more than nine bits being used. Experience has shown that, with a plurality
of sample fonts, a nine bit word provides a good compromise between
hardware demands and compression efficiency. The size of subtitles is
controlled by many considerations, given that it must be possible to display an
intelligible number of words on the final screen, while at the same time
making each word large enough so as to be legible to viewers at the relevant
display definition.
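
The chained-table alternative might be sketched as follows. The table contents, depth of two levels and code names here are invented purely for illustration, since the patent leaves them open:

```python
FINAL, CHAIN = 0, 1

# First-level table: each entry is either a final output code or an
# index selecting a second-level table (entries are hypothetical).
level1 = {0b0000: (FINAL, "code-A"), 0b1111: (CHAIN, 0)}

# Second-level tables always produce an output, terminating the chain.
level2 = [{0b0000: "code-B", 0b1111: "code-C"}]

def chained_lookup(words):
    """Resolve a sequence of input words through the table chain."""
    kind, value = level1[words[0]]
    if kind == FINAL:
        return value          # resolved at the first level
    return level2[value][words[1]]   # second word indexes the next table
```

A chain of this kind trades extra table storage for the ability to code patterns longer than a single input word.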

Output broadcast signals produced by device 205 of Figure 2 are
processed by receiving stations, of the type shown in Figure 8, allowing the
transmitted subtitle data to be displayed in combination with television
pictures, as shown in Figure 1.

Transmitted signals are received by an antenna 801 and supplied to a
tuner/demodulator 802. The tuner/demodulator 802 separates the video
blanking data from the transmitted television signal and supplies the video-
related information to a video processing device 803. The video processing
device 803 cleans up the video information and reintroduces new blanking
intervals. Additional video processing may be performed so as to supply
appropriately encoded video signals to a television monitor 804.

A combining device 805 is positioned between the video processing
device 803 and the television monitor 804. The combining device 805 is
arranged to key subtitling data over the video picture, as shown in Figure 1.

The run-length encoded data derived from the video blanking intervals
is supplied to a subtitle decoder 806, arranged to decode the compressed
subtitling data and to provide a subtitling video signal to the combiner 805
via a switch 807. Switch 807 may be operated manually, so as to selectively
add received subtitling information to the picture. In addition, switch 807 is
activated to its off position if no new subtitling data is received for a
predetermined time-out period. At the combiner 805, the decoded subtitling
data is keyed over the television picture and a composite image is supplied
to the TV monitor 804, whereupon conventional processing is performed so
as to display the television pictures.

The subtitle decoder 806 is detailed in Figure 9. In the present
embodiment, codes transmitted to represent the subtitles have a maximum bit
length of nine bits. This bit length is determined by the particular coding
adopted in order to effect the transmission and is not related to the nine bit
address at the transmitter.

The transmitted words are conveyed as a serial stream and are
therefore coded such that the decoder can identify the start of new words.
The decoder 806 includes a shift register 901, a look-up table 902, an address
writing device 903, a frame buffer 904 and a serial reading device 905.
Incoming bits are supplied to the shift register 901, which in turn supplies a
parallel address to the look-up table 902 when a complete word has been
received. After a transmitted word has been supplied to the look-up table 902
as an index, the shift register 901 is cleared and the next word is shifted
through.
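
Identifying word boundaries in a serial stream in this way requires the transmitted codes to be self-delimiting (prefix-free). A software stand-in for shift register 901 and its word detection might look like this; the code table is hypothetical, since the patent does not specify the actual code assignment:

```python
def decode_stream(bits, code_table):
    """Accumulate serial bits until the buffer matches a complete code
    word, then emit the corresponding symbol and clear the buffer."""
    inverse = {code: symbol for symbol, code in code_table.items()}
    symbols, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in inverse:          # a complete word has arrived
            symbols.append(inverse[buffer])
            buffer = ""                # shift register cleared
    return symbols

# Hypothetical prefix-free table: no code is a prefix of another.
table = {"run-1": "1", "run-2": "010", "run-3": "011"}
```

Because no code is a prefix of another, the decoder never needs explicit separators between words.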

Codes supplied to the look-up table 902 represent runs of white pixels
and black spaces. These runs are written to appropriate locations in the frame
buffer 904 under the control of the address writing circuit 903. In this way,
the original array of pixels, representing the subtitle, is built up in the frame
buffer, until a complete frame has been assembled. Once a complete frame
of subtitling data has been assembled in the frame buffer 904, this
information is read serially under the control of serial reading device 905, so
as to provide a serial data stream of video information, synchronised to the
data supplied to combiner 805 from the video processing circuit 803.
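
The reconstruction performed by look-up table 902 and the address writing circuit 903 amounts to expanding run codes back into rows of pixels at the correct frame-buffer positions. A simplified line-at-a-time sketch, with the frame buffer as a list of rows (an assumption for illustration, not the patent's addressing scheme):

```python
def decode_frame(coded_lines, width):
    """Expand (value, length) run codes into rows of pixels, building
    up the subtitle image as in frame buffer 904."""
    frame = []
    for runs in coded_lines:
        row = []
        for value, length in runs:
            row.extend([value] * length)   # write the run into the row
        assert len(row) == width, "runs must cover the full line"
        frame.append(row)
    return frame
```

For example, decoding the single coded line [(1, 4), (0, 8), (1, 4)] with a width of 16 reproduces the sixteen-pixel sequence from the earlier encoding example.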

The output from the frame buffer 904 is in digital form and the system
may include digital-to-analogue converting devices so as to effect the
combining of video signals in the analogue domain. Alternatively, the video
output from the video processing circuit 803 may be in digital form, allowing
the combiner 805 to operate in the digital domain. In either event, assuming
switch 807 is in the closed position, the two video signals are combined,
resulting in the composite signal being supplied to the TV monitor 804.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Administrative Status

Title Date
Forecasted Issue Date Unavailable
(86) PCT Filing Date 1995-06-09
(87) PCT Publication Date 1995-12-14
(85) National Entry 1996-12-09
Dead Application 1999-06-09

Abandonment History

Abandonment Date Reason Reinstatement Date
1998-06-09 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $0.00 1996-12-09
Maintenance Fee - Application - New Act 2 1997-06-09 $100.00 1996-12-09
Registration of a document - section 124 $100.00 1997-12-09
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
SCREEN SUBTITLING SYSTEMS LTD.
Past Owners on Record
ATKINS, DAVID JOHN
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.

If you have any difficulty accessing content, you can call the Client Service Centre at 1-866-997-1936 or send them an e-mail at CIPO Client Service Centre.


Document Description    Date (yyyy-mm-dd)    Number of pages    Size of Image (KB)
Abstract 1995-12-14 1 36
Cover Page 1997-04-18 1 11
Representative Drawing 1998-01-05 1 5
Description 1995-12-14 23 744
Claims 1995-12-14 4 99
Drawings 1995-12-14 8 82
Cover Page 1998-06-25 1 11
PCT Correspondence 1996-12-09 1 26
International Preliminary Examination Report 1996-12-09 11 371
Office Letter 1997-01-21 1 39
Fees 1996-12-09 1 37