Patent Summary 2612122

(12) Patent: (11) CA 2612122
(54) French Title: SYSTEME ET METHODE D'OBTENTION D'UNE FREQUENCE D'IMAGES COMPLETES VARIABLE ET DE SAUT DE TRAME ADAPTATIF SUR APPAREIL MOBILE
(54) English Title: SYSTEM AND METHOD FOR PROVIDING A VARIABLE FRAME RATE AND ADAPTIVE FRAME SKIPPING ON A MOBILE DEVICE
Status: Granted and Issued
Bibliographic Data
Abstracts

French Abstract

Un processeur et une méthode sont fournis pour décoder un fichier multimédia avec des flux de données audio et vidéo qui sont configurés pour être joués en synchronisation. Des trames de flux vidéo sont décodées et jouées avec l'audio et, pour compenser la saturation d'utilisation du processeur, deux procédures sont réalisées. La première procédure fonctionne à un premier intervalle périodique et ralentit le taux de trames pour réduire l'utilisation du processeur s'il y a lieu. La seconde procédure essaie premièrement d'accélérer la vidéo pour rattraper l'audio si elles ne sont pas synchronisées et, si cela ne peut être fait lors du prochain intervalle de temps, le décodage d'un nombre sélectionné de trames est sauté de sorte que les flux de données audio et vidéo sont synchronisés de nouveau.


English Abstract

A processor and method are provided for decoding a multimedia file having video and audio data streams that are configured to be played in synchronization. Frames of the video stream are decoded and played with the audio and, to compensate for saturation of the processor usage, two procedures are performed. The first procedure operates at a first periodic interval and slows down the frame rate to reduce processor usage if needed. The second procedure first attempts to speed up the video to catch up to the audio if they are out of sync and if this cannot be done in the next time interval, the decoding of a select number of frames is skipped such that the video and audio data streams are resynchronized.
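The two compensation procedures summarized above can be sketched in code. This is an illustrative sketch only, not code from the patent: the class and method names, the proportional scaling formulas, and the rounding of the skip count are all assumptions made for clarity.

```python
# Illustrative sketch of the two-procedure scheme (hypothetical names/formulas).

class FrameRateController:
    def __init__(self, target_fps, usage_threshold, capped_fps):
        self.target_fps = target_fps            # nominal rate from the file
        self.current_fps = target_fps           # rate actually used for decoding
        self.usage_threshold = usage_threshold  # e.g. 0.85 of CPU capacity
        self.capped_fps = capped_fps            # never decode faster than this

    def on_first_interval(self, cpu_usage):
        """Short interval: slow the frame rate if the processor is saturated."""
        if cpu_usage > self.usage_threshold:
            scale = self.usage_threshold / cpu_usage  # first scaling factor
            self.current_fps = self.target_fps * scale
        else:
            self.current_fps = self.target_fps

    def on_second_interval(self, av_lag_seconds, interval_seconds):
        """Longer interval: speed video up to catch the audio; else skip frames.

        Returns the number of frames skipped (0 when speeding up suffices).
        """
        if av_lag_seconds <= 0:
            return 0  # already in sync
        # second scaling factor: extra throughput needed to absorb the lag
        needed_fps = self.current_fps * (1 + av_lag_seconds / interval_seconds)
        if needed_fps <= self.capped_fps:
            self.current_fps = needed_fps
            return 0
        # cannot catch up by speeding up alone: decode at the cap and skip
        self.current_fps = self.capped_fps
        return int(round((needed_fps - self.capped_fps) * interval_seconds))
```

Under these assumptions, a controller targeting 30 fps with an 85% usage threshold would throttle to 25.5 fps under full load, and would skip frames only when the catch-up rate needed for resynchronization exceeds the cap.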

Claims

Note: The claims are presented in the official language in which they were submitted.


Claims:
1. A method of decoding a multimedia file using a processor according to usage of said processor, said multimedia file comprising a video data stream including a series of encoded frames and an audio data stream to be played in synchronization with said video data stream as said frames are decoded, said method comprising:
    at a first periodic interval:
        determining if a target frame rate for said video data stream can be achieved while meeting a predetermined usage threshold for said processor;
        if said usage threshold is not met:
            determining a first scaling factor required to meet said usage threshold;
            modifying said target frame rate according to said first scaling factor to determine a current frame rate; and
            decoding subsequent frames at a first modified frame rate while playing said audio data stream therewith;
        if said usage threshold is met:
            decoding said frames of said video data stream and playing said video data stream at said target frame rate while playing said audio data stream therewith; and
    at a second periodic interval which is longer than said first periodic interval:
        determining if said video data stream and said audio data stream are out of synchronization;
        if said video data stream and said audio data stream are out of synchronization, determining a second scaling factor to enable said video data stream to resynchronize with said audio data stream;
        applying said second scaling factor to a current frame rate to determine a second modified frame rate; and
        if said second modified frame rate exceeds a capped frame rate, decoding subsequent frames at said capped frame rate and not decoding one or more frames scheduled to be decoded within a next time interval.
2. The method according to claim 1 wherein if said second modified frame rate does not exceed said capped frame rate, decoding subsequent frames at said second modified frame rate until said video data stream is resynchronized with said audio data stream.

3. The method according to claim 1 or claim 2 wherein decoding subsequent frames at said capped frame rate and not decoding one or more frames comprises determining a difference between said second modified frame rate and said capped frame rate, determining a number of frames to not be decoded to compensate for said difference, and not decoding said number of frames.

4. The method according to any one of claims 1 to 3, said frames being decoded in groups of I, B and P frames, wherein a counter is incremented after each frame is decoded until a next I frame is encountered and decoding subsequent frames at said capped frame rate and not decoding one or more frames comprises not decoding all frames until said next I frame and continuing decoding from said next I frame.
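The skip strategy in claim 4 reflects the dependency structure of predictive codecs: P and B frames cannot be decoded without their reference frames, so once skipping begins it must continue to the next I frame. A minimal illustrative sketch (hypothetical function, not from the patent text):

```python
# Sketch: skipping resumes only at the next I frame, because intervening
# P/B frames depend on frames that were not decoded.
def frames_to_skip(frame_types, start):
    """Return indices skipped from `start` up to (not including) the next I frame.

    frame_types: sequence of frame-type codes, e.g. ['I', 'B', 'P', 'B', 'I'].
    """
    skipped = []
    for i in range(start, len(frame_types)):
        if frame_types[i] == 'I':
            break          # an I frame is self-contained: resume decoding here
        skipped.append(i)  # P/B frames before it cannot be decoded alone
    return skipped
```

For example, starting a skip at index 1 of the sequence I P B P I P would skip indices 1 through 3 and resume at the I frame at index 4.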
5. The method according to any one of claims 1 to 4 comprising tracking the number of frames not decoded for said determining if said target frame rate can be achieved.

6. The method according to claim 4 or claim 5 comprising recording a timestamp for each I frame.

7. The method according to any one of claims 1 to 6, wherein said determining if said target frame rate can be achieved comprises consideration of an average scaling count of a predetermined number of previous frames.

8. The method according to claim 7 comprising updating said scaling count according to each application of said first scaling factor.
9. The method according to claim 7 or claim 8 wherein said determining if said target frame rate can be achieved comprises determining how said processor usage dedicated to said decoding should be scaled to maintain a predetermined amount of processor usage for other applications and idle processes according to said average scaling count.
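Claims 7 to 9 gauge whether the target frame rate is achievable from an average scaling count over recent frames. One possible reading, sketched with assumed names, a fixed window size and an assumed headroom rule (none of which are specified by the claims):

```python
# Sketch (assumed interpretation): a running window of recent per-frame
# scaling factors; their average indicates whether full-rate decoding would
# eat into the processor headroom reserved for other applications.
from collections import deque

class ScalingHistory:
    def __init__(self, window=30, reserved_usage=0.15):
        self.counts = deque(maxlen=window)  # last `window` scaling factors
        self.reserved = reserved_usage      # CPU share kept for other tasks

    def record(self, scaling_factor):
        self.counts.append(scaling_factor)

    def target_rate_achievable(self):
        """True if, on average, decoding has not recently been throttled."""
        if not self.counts:
            return True
        avg = sum(self.counts) / len(self.counts)
        # an average factor well below 1.0 means decoding was repeatedly
        # scaled down, so the full target rate would break the reserved headroom
        return avg >= 1.0 - self.reserved
```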
10. A computer readable medium containing computer executable instructions, which when executed by a processor, enable a device to:
    at a first periodic interval:
        determine if a target frame rate for said video data stream can be achieved while meeting a predetermined usage threshold for said processor;
        if said usage threshold is not met:
            determine a first scaling factor required to meet said usage threshold;
            modify said target frame rate according to said first scaling factor to determine a current frame rate; and
            decode subsequent frames at a first modified frame rate while playing said audio data stream therewith;
        if said usage threshold is met:
            decode said frames of said video data stream and play said video data stream at said target frame rate while playing said audio data stream therewith; and
    at a second periodic interval which is longer than said first periodic interval:
        determine if said video data stream and said audio data stream are out of synchronization;
        if said video data stream and said audio data stream are out of synchronization, determine a second scaling factor to enable said video data stream to resynchronize with said audio data stream;
        apply said second scaling factor to a current frame rate to determine a second modified frame rate; and
        if said second modified frame rate exceeds a capped frame rate, decode subsequent frames at said capped frame rate and not decode one or more frames scheduled to be decoded within a next time interval.
11. The computer readable medium according to claim 10 wherein if said second modified frame rate does not exceed said capped frame rate, decoding subsequent frames at said second modified frame rate until said video data stream is resynchronized with said audio data stream.

12. The computer readable medium according to claim 10 or claim 11, wherein decoding subsequent frames at said capped frame rate and not decoding one or more frames comprises determining a difference between said second modified frame rate and said capped frame rate, determining a number of frames to not be decoded to compensate for said difference, and not decoding said number of frames.

13. The computer readable medium according to any one of claims 10 to 12, said frames being decoded in groups of I, B and P frames, wherein a counter is incremented after each frame is decoded until a next I frame is encountered and decoding subsequent frames at said capped frame rate and not decoding one or more frames comprises not decoding all frames until said next I frame and continuing decoding from said next I frame.

14. The computer readable medium according to any one of claims 10 to 13, comprising tracking the number of frames not decoded for said determining if said target frame rate can be achieved.

15. The computer readable medium according to claim 13 or claim 14, comprising recording a timestamp for each I frame.
16. The computer readable medium according to any one of claims 10 to 15, wherein said determining if said target frame rate can be achieved comprises consideration of an average scaling count of a predetermined number of previous frames.

17. The computer readable medium according to claim 16 comprising updating said scaling count according to each application of said first scaling factor.

18. The computer readable medium according to claim 16 or claim 17, wherein said determining if said target frame rate can be achieved comprises determining how said processor usage dedicated to said decoding should be scaled to maintain a predetermined amount of processor usage for other applications and idle processes according to said average scaling count.
19. A processor for decoding a multimedia file according to usage of said processor, said multimedia file comprising a video data stream including a series of encoded frames and an audio data stream to be played in synchronization with said video data as said frames are decoded, said processor being configured for:
    at a first periodic interval:
        determining if a target frame rate for said video data stream can be achieved while meeting a predetermined usage threshold for said processor;
        if said usage threshold is not met:
            determining a first scaling factor required to meet said usage threshold;
            modifying said target frame rate according to said first scaling factor to determine a current frame rate; and
            decoding subsequent frames at a first modified frame rate while playing said audio data stream therewith;
        if said usage threshold is met:
            decoding said frames of said video data stream and playing said video data stream at said target frame rate while playing said audio data stream therewith; and
    at a second periodic interval which is longer than said first periodic interval:
        determining if said video data stream and said audio data stream are out of synchronization;
        if said video data stream and said audio data stream are out of synchronization, determining a second scaling factor to enable said video data stream to resynchronize with said audio data stream;
        applying said second scaling factor to a current frame rate to determine a second modified frame rate; and
        if said second modified frame rate exceeds a capped frame rate, decoding subsequent frames at said capped frame rate and not decoding one or more frames scheduled to be decoded within a next time interval.
20. The processor according to claim 19 wherein if said second modified frame rate does not exceed said capped frame rate, decoding subsequent frames at said second modified frame rate until said video data stream is resynchronized with said audio data stream.

21. The processor according to claim 19 or claim 20 wherein decoding subsequent frames at said capped frame rate and not decoding one or more frames comprises determining a difference between said second modified frame rate and said capped frame rate, determining a number of frames to not be decoded to compensate for said difference, and not decoding said number of frames.

22. The processor according to any one of claims 19 to 21, said frames being decoded in groups of I, B and P frames, wherein a counter is incremented after each frame is decoded until a next I frame is encountered and decoding subsequent frames at said capped frame rate and not decoding one or more frames comprises not decoding all frames until said next I frame and continuing decoding from said next I frame.
23. The processor according to any one of claims 19 to 22 further configured for tracking the number of frames not decoded for said determining if said target frame rate can be achieved.

24. The processor according to claim 22 or claim 23 further configured for recording a timestamp for each I frame.

25. The processor according to any one of claims 19 to 24 wherein said determining if said target frame rate can be achieved comprises consideration of an average scaling count of a predetermined number of previous frames.

26. The processor according to claim 25 further configured for updating said scaling count according to each application of said first scaling factor.

27. The processor according to claim 25 or claim 26 wherein said determining if said target frame rate can be achieved comprises determining how said processor usage dedicated to said decoding should be scaled to maintain a predetermined amount of processor usage for other applications and idle processes according to said average scaling count.
28. A mobile device comprising the processor according to claim 19.
Description

Note: The descriptions are presented in the official language in which they were submitted.


CA 02612122 2007-11-23
SYSTEM AND METHOD FOR PROVIDING A VARIABLE FRAME RATE AND ADAPTIVE FRAME SKIPPING ON A MOBILE DEVICE

TECHNICAL FIELD:

[0001] The following relates to systems and methods for decoding multimedia files according to processor usage.

DESCRIPTION OF THE PRIOR ART

[0002] A computing device, such as a mobile device, uses a processor to perform tasks. Each task inherently consumes a certain percentage of the processor's overall capability. However, it is well known that mobile devices generally have weaker processors than, e.g., personal computers (PCs). Many tasks, often referred to as non-interactive tasks, are fixed tasks that are scheduled by a scheduling algorithm. Other tasks, often referred to as interactive tasks, in some way relate to recent input/output (I/O) traffic or user related tasks, such as user input or user directed output. The scheduling algorithm typically aims to schedule interactive tasks for optimal low latency and non-interactive tasks for optimal throughput. An example of a non-interactive task is video decoding, which is done in the background (i.e. the user will not notice as it occurs), and an example of an interactive task is a keystroke or status bar update that the user can presumably view on the display of the mobile device.

[0003] The video content currently expected to be played on a mobile device often pushes the capabilities of mobile processors such that, in some circumstances, the mobile device cannot decode a video in real time. Also, scheduling video decoding can be difficult as the system load due to video decoding is heavily dependent on the content of the video. Attempting to decode such video content can saturate the processor and, on a multi-thread system where the user interface (UI) runs at a lower priority thread, the user's input and control of the device may feel unresponsive.

[0004] For example, in a mobile device, when a task saturates the central processor, a keystroke or user directed output such as a status bar update may not respond in a timely manner. Also, a mobile device that is decoding a video may be sluggish when responding to a user moving a positioning device (e.g. to move a cursor on the screen). When encountering the above, the result is often a poor viewing experience, which can be made worse if the video is synchronized with audio content.

[0005] Previous methods of simply dropping frames are not always possible because of temporal coding tools used in modern video codecs, e.g., MPEG-4, where a video frame relies on data from previous or future frames. Also, the system load may vary (spike) due to asynchronous events such as when receiving email or other radio traffic.
BRIEF DESCRIPTION OF THE DRAWINGS

[0006] Embodiments will now be described by way of example only with reference to the appended drawings wherein:

[0007] Figure 1 is a schematic diagram of a mobile device and a display screen therefor.

[0008] Figure 2 is a schematic diagram of another mobile device and a display screen therefor.

[0009] Figure 3 is a schematic block diagram of components of the mobile device of any or both of Figures 1 and 2.

[0010] Figure 4 is a schematic block diagram of the memory shown in Figure 3.

[0011] Figure 5 is a screen shot of a home screen for the mobile device of any or both of Figures 1 and 2.

[0012] Figure 6 is a schematic block diagram of a processor used in decoding a multimedia file.

[0013] Figure 7 is a schematic block diagram of the multimedia file shown in Figure 6.

[0014] Figure 8 is a schematic block diagram of the video decode task shown in Figure 6.

[0015] Figure 9 is a series of timing diagrams illustrating operation of the compensation module shown in Figure 8.

[0016] Figure 10 is a flow diagram illustrating a procedure executed according to a frame rate timer.

[0017] Figure 11 is a flow diagram illustrating a procedure executed according to a synchronization timer.

[0018] Figure 12 is a flow diagram illustrating a frame skipping procedure.
DETAILED DESCRIPTION OF THE DRAWINGS

[0019] A processor, mobile device and method performed thereby are now described for providing a variable frame rate and adaptive frame skipping on a mobile device to, among other things, absorb spikes in processor load and thereby improve the overall viewing experience on such mobile devices when decoding multimedia files.

[0020] Referring now to Figures 1 and 2, one embodiment of a mobile device 10a is shown in Figure 1, and another embodiment of a mobile device 10b is shown in Figure 2. It will be appreciated that the numeral "10" will hereinafter refer to any mobile device 10, including the embodiments 10a and 10b. It will also be appreciated that a similar numbering convention may be used for other general features common between Figures 1 and 2 such as a display 12, a positioning device 14, and a cancel or escape button 16.

[0021] The mobile device 10a shown in Figure 1 comprises a display 12a, and the cursor or view positioning device 14 shown in this embodiment is a positioning wheel 14a. Positioning device 14 may serve as another input member and is both rotatable to provide selection inputs to the processor 64 (see Figure 3) and can also be pressed in a direction generally toward the housing to provide another selection input to the processor 64. The display 12 may include a selection cursor 18 (see Figure 5) that depicts generally where the next input or selection will be received. The selection cursor 18 may comprise a box, alteration of an icon or any combination of features that enable the user to identify the currently chosen icon or item. The mobile device 10a in Figure 1 also comprises an escape or cancel button 16a and a keyboard 20. In this example, the keyboard 20 is disposed on the front face of the mobile device housing, and positioning device 14 and cancel button 16a are disposed at the side of the housing to enable a user to manoeuvre the positioning wheel 14a while holding the mobile device 10 in one hand. The keyboard 20 is in this embodiment a standard QWERTY keyboard.
[0022] The mobile device 10b shown in Figure 2 comprises a display 12b, and the positioning device 14 in this embodiment is a trackball 14b. Trackball 14b permits multi-directional positioning of the selection cursor 18 such that the selection cursor 18 can be moved in an upward direction, in a downward direction and, if desired and/or permitted, in any diagonal direction. The trackball 14b is preferably situated on the front face of a housing for mobile device 10b as shown in Figure 2 to enable a user to manoeuvre the trackball 14b while holding the mobile device 10b in one hand. The trackball 14b may serve as another input member (in addition to a directional or positioning member) to provide selection inputs to the processor 64 and can preferably be pressed in a direction towards the housing of the mobile device 10b to provide such a selection input.

[0023] The mobile device 10b also comprises a menu or option button 24 that loads a menu or list of options on display 12b when pressed, and a cancel or escape button 16b to exit, "go back" or otherwise escape from a feature, option, selection or display. The mobile device 10b, as illustrated in Figure 2, comprises a reduced QWERTY keyboard 22. In this embodiment, the keyboard 22, positioning device 14, escape button 16b and menu button 24 are disposed on a front face of a mobile device housing.

[0024] The reduced QWERTY keyboard 22 comprises a plurality of multi-functional keys and corresponding indicia, including keys associated with alphabetic characters corresponding to a QWERTY array of letters A to Z and an overlaid numeric phone key arrangement. The plurality of keys that comprise alphabetic and/or numeric characters total fewer than twenty-six (26). In the embodiment shown, the number of keys that comprise alphabetic and numeric characters is fourteen (14). In this embodiment, the total number of keys, including other functional keys, is twenty (20). The plurality of keys may comprise four rows and five columns of keys, with the four rows comprising in order a first, second, third and fourth row, and the five columns comprising in order a first, second, third, fourth, and fifth column. The QWERTY array of letters is associated with three of the four rows and the numeric phone key arrangement is associated with each of the four rows.
[0025] The numeric phone key arrangement is associated with three of the five columns. Specifically, the numeric phone key arrangement may be associated with the second, third and fourth columns. The numeric phone key arrangement may alternatively be associated with keys in the first, second, third, and fourth rows, with keys in the first row including a number "1" in the second column, a number "2" in the third column, and a number "3" in the fourth column. The numeric phone keys associated with keys in the second row include a number "4" in the second column, a number "5" in the third column, and a number "6" in the fourth column. The numeric phone keys associated with keys in the third row include a number "7" in the second column, a number "8" in the third column, and a number "9" in the fourth column. The numeric phone keys associated with keys in the fourth row may include a "*" in the second column, a number "0" in the third column, and a "#" in the fourth column.
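The grid described above maps onto a standard telephone keypad overlaid on the QWERTY rows. As a compact restatement (illustrative only; the (row, column) indexing follows the paragraph's first-through-fourth rows and second-through-fourth columns):

```python
# (row, column) -> overlaid numeric phone key, per the arrangement above.
# Rows and columns are 1-based, matching the "first row", "second column"
# wording of the description.
PHONE_KEYS = {
    (1, 2): "1", (1, 3): "2", (1, 4): "3",
    (2, 2): "4", (2, 3): "5", (2, 4): "6",
    (3, 2): "7", (3, 3): "8", (3, 4): "9",
    (4, 2): "*", (4, 3): "0", (4, 4): "#",
}
```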
[0026] The physical keyboard may also include a function associated with at least one of the plurality of keys. The fourth row of keys may include an "alt" function in the first column, a "next" function in the second column, a "space" function in the third column, a "shift" function in the fourth column, and a "return/enter" function in the fifth column.

[0027] The first row of five keys may comprise keys corresponding in order to letters "QW", "ER", "TY", "UI", and "OP". The second row of five keys may comprise keys corresponding in order to letters "AS", "DF", "GH", "JK", and "L". The third row of five keys may comprise keys corresponding in order to letters "ZX", "CV", "BN", and "M".
[0028] It will be appreciated that for the mobile device 10, a wide range of one or more positioning or cursor/view positioning mechanisms, such as a touch pad, a joystick button, a mouse, a touchscreen, a set of arrow keys, a tablet, an accelerometer (for sensing orientation and/or movements of the mobile device 10, etc.), or others, whether presently known or unknown, may be employed. Similarly, any variation of keyboard 20, 22 may be used. It will also be appreciated that the mobile devices 10 shown in Figures 1 and 2 are for illustrative purposes only, and various other mobile devices 10, presently known or unknown, are equally applicable to the following examples.
[0029] Movement, navigation, and/or scrolling with use of a cursor/view positioning device 14 (e.g. trackball 14b or positioning wheel 14a) is beneficial given the relatively large size of visually displayed information and the compact size of display 12, and since information and messages are typically only partially presented in the limited view of display 12 at any given moment. As previously described, positioning devices 14, such as the positioning wheel 14a and trackball 14b, are helpful cursor/view positioning mechanisms to achieve such movement. Positioning device 14, which may be referred to as a positioning wheel or scroll device 14a in one embodiment (Figure 1), specifically includes a circular disc which is rotatable about a fixed axis of the housing and may be rotated by the end user's index finger or thumb. As noted above, in another embodiment (Figure 2) the trackball 14b comprises a multi-directional member that enables upward, downward and, if desired, diagonal movements. The multi-directional movements afforded, in particular, by the trackball 14b, and the presentation of icons and folders on display 12, provide the user with the flexibility and familiarity of the layout of a traditional desktop computer interface. Also, the positioning device 14 enables movement and selection operations to be executed on the mobile device 10 using one hand. The trackball 14b in particular also enables both one-handed use and the ability to cause a cursor 18 to traverse the display 12 in more than one direction.
[0030] Figure 3 is a detailed block diagram of an embodiment of a mobile station 32. The term "mobile station" will herein refer to the operable components of, e.g., mobile device 10. Mobile station 32 is preferably a two-way communication device having at least voice and advanced data communication capabilities, including the capability to communicate with other computer systems. Depending on the functionality provided by mobile station 32, it may be referred to as a data messaging device, a two-way pager, a cellular telephone with data messaging capabilities, a wireless Internet appliance, or a data communication device (with or without telephony capabilities), e.g. mobile device 10 shown in Figures 1 and 2. Mobile station 32 may communicate with any one of a plurality of fixed transceiver stations 30 within its geographic coverage area.
[0031] Mobile station 32 will normally incorporate a communication subsystem 34, which includes a receiver 36, a transmitter 40, and associated components such as one or more (preferably embedded or internal) antenna elements 42 and 44, local oscillators (LOs) 38, and a processing module such as a digital signal processor (DSP) 46. As will be apparent to those skilled in the field of communications, the particular design of communication subsystem 34 depends on the communication network in which mobile station 32 is intended to operate.
[0032] Mobile station 32 may send and receive communication signals over a network after required network registration or activation procedures have been completed. Signals received by antenna 42 through the network are input to receiver 36, which may perform such common receiver functions as signal amplification, frequency down conversion, filtering, channel selection, and the like, and, in the example shown in Figure 3, analog-to-digital (A/D) conversion. A/D conversion of a received signal allows more complex communication functions such as demodulation and decoding to be performed in DSP 46. In a similar manner, signals to be transmitted are processed, including modulation and encoding, for example, by DSP 46. These DSP-processed signals are input to transmitter 40 for digital-to-analog (D/A) conversion, frequency up conversion, filtering, amplification and transmission over the communication network via antenna 44. DSP 46 not only processes communication signals, but also provides for receiver and transmitter control. For example, the gains applied to communication signals in receiver 36 and transmitter 40 may be adaptively controlled through automatic gain control algorithms implemented in DSP 46.
[0033] Network access is associated with a subscriber or user of mobile station 32. In one embodiment, mobile station 32 uses a Subscriber Identity Module or "SIM" card 74 to be inserted in a SIM interface 76 in order to operate in the network. SIM 74 is one type of conventional "smart card" used to identify an end user (or subscriber) of the mobile station 32 and to personalize the device, among other things. Without SIM 74, the mobile station terminal in such an embodiment is not fully operational for communication through a wireless network. By inserting SIM 74 into mobile station 32, an end user can have access to any and all of his/her subscribed services. SIM 74 generally includes a processor and memory for storing information. Since SIM 74 is coupled to a SIM interface 76, it is coupled to microprocessor 64 through communication lines. In order to identify the subscriber, SIM 74 contains some user parameters such as an International Mobile Subscriber Identity (IMSI). An advantage of using SIM 74 is that end users are not necessarily bound by any single physical mobile station. SIM 74 may store additional user information for the mobile station as well, including datebook (or calendar) information and recent call information. It will be appreciated that mobile station 32 may also be used with any other type of network compatible mobile device 10, such as those that are code division multiple access (CDMA) enabled, and should not be limited to those using and/or having a SIM card 74.
[0034] Mobile station 32 is a battery-powered device, so it also includes a battery interface 70 for receiving one or more rechargeable batteries 72. Such a battery 72 provides electrical power to most, if not all, electrical circuitry in mobile station 32, and battery interface 70 provides a mechanical and electrical connection for it. The battery interface 70 is coupled to a regulator (not shown) which provides a regulated voltage to all of the circuitry.
[0035] Mobile station 32 in this embodiment includes a microprocessor 64 which controls the overall operation of mobile station 32. It will be appreciated that the microprocessor 64 may be implemented by any processing device. Communication functions, including at least data and voice communications, are performed through communication subsystem 34. Microprocessor 64 also interacts with additional device subsystems which may interface with physical components of the mobile device 10. Such additional device subsystems comprise a display 48, a flash memory 50, a random access memory (RAM) 52, auxiliary input/output subsystems 54, a serial port 56, a keyboard 58, a speaker 60, a microphone 62, a short-range communications subsystem 66, and any other device subsystems generally designated at 68. Some of the subsystems shown in Figure 3 perform communication-related functions, whereas other subsystems may provide "resident" or on-device functions. Notably, some subsystems, such as keyboard 58 and display 48, for example, may be used for both communication-related functions, such as entering a text message for transmission over a communication network, and device-resident functions such as a calculator or task list. Operating system software used by microprocessor 64 is preferably stored in a persistent store such as flash memory 50, which may alternatively be a read-only memory (ROM) or similar storage element (not shown). Those skilled in the art will appreciate that the
operating system, specific device applications, or parts thereof, may be temporarily loaded into a volatile store such as RAM 52.
[0036] Microprocessor 64, in addition to its operating system functions, preferably enables execution of software applications on mobile station 32. A predetermined set of applications which control basic device operations, including at least data and voice communication applications, as well as the inventive functionality of the present disclosure, will normally be installed on mobile station 32 during its manufacture. A preferred application that may be loaded onto mobile station 32 may be a personal information manager (PIM) application having the ability to organize and manage data items relating to the user such as, but not limited to, e-mail, calendar events, voice mails, appointments, and task items. Naturally, one or more memory stores are available on mobile station 32 and SIM 74 to facilitate storage of PIM data items and other information.
[0037] The PIM application preferably has the ability to send and receive data items via the wireless network. In the present disclosure, PIM data items are seamlessly integrated, synchronized, and updated via the wireless network with the mobile station user's corresponding data items stored on and/or associated with a host computer system, thereby creating a mirrored host computer on mobile station 32 with respect to such items. This is especially advantageous where the host computer system is the mobile station user's office computer system. Additional applications may also be loaded onto mobile station 32 through the network, an auxiliary subsystem 54, serial port 56, short-range communications subsystem 66, or any other suitable subsystem 68, and installed by a user in RAM 52 or preferably a non-volatile store (not shown) for execution by microprocessor 64. Such flexibility in application installation increases the functionality of mobile station 32 and may provide enhanced on-device functions, communication-related functions, or both. For example, secure communication applications may enable electronic commerce functions and other such financial transactions to be performed using mobile station 32.
[0038] In a data communication mode, a received signal such as a text message, an e-mail message, or a web page download will be processed by communication subsystem 34 and input to
microprocessor 64. Microprocessor 64 will preferably further process the signal for output to display 48 or alternatively to auxiliary I/O device 54. A user of mobile station 32 may also compose data items, such as e-mail messages, for example, using keyboard 58 in conjunction with display 48 and possibly auxiliary I/O device 54. These composed items may be transmitted over a communication network through communication subsystem 34.
[0039] For voice communications, the overall operation of mobile station 32 is substantially similar, except that the received signals would be output to speaker 60 and signals for transmission would be generated by microphone 62. Alternative voice or audio I/O subsystems, such as a voice message recording subsystem, may also be implemented on mobile station 32. Although voice or audio signal output is preferably accomplished primarily through speaker 60, display 48 may also be used to provide an indication of the identity of a calling party, duration of a voice call, or other voice call related information, as some examples.
[0040] Serial port 56 in Figure 3 is normally implemented in a personal digital assistant (PDA)-type communication device for which synchronization with a user's desktop computer is a desirable, albeit optional, component. Serial port 56 enables a user to set preferences through an external device or software application and extends the capabilities of mobile station 32 by providing for information or software downloads to mobile station 32 other than through a wireless communication network. The alternate download path may, for example, be used to load an encryption key onto mobile station 32 through a direct and thus reliable and trusted connection, to thereby provide secure device communication.
[0041] Short-range communications subsystem 66 of Figure 3 is an additional optional component which provides for communication between mobile station 32 and different systems or devices, which need not necessarily be similar devices. For example, subsystem 66 may include an infrared device and associated circuits and components, or a Bluetooth™ communication module to provide for communication with similarly enabled systems and devices. Bluetooth™ is a registered trademark of Bluetooth SIG, Inc.
[0042] As shown in Figure 4, memory 50 includes a plurality of applications 80 associated with a series of icons 102 (see Figure 5) for the processing of data. Applications 80 may be any
variety of forms such as, without limitation, software, firmware, and the like. Applications 80 may include, for example, electronic mail (e-mail) 82, calendar program 84, storage and/or program for contacts 86, a multimedia/video player application 88, memo program 90, storage for messages 92, a search function and/or application 94, etc. An operating system (OS) 96 and, in this embodiment, a multimedia storage area 89 also reside in memory 50. The multimedia storage area 89 is generally a designated portion of memory 50 for storing multimedia files 120 that are used by the multimedia/video player 88. The multimedia/video player 88 will hereinafter, for brevity, be referred to as a 'video player 88' (as shown in Figure 4).
[0043] The mobile devices 10 of the present disclosure are also configured to enable communication between different ones of the applications 80, e.g. between the contacts application 86 and the e-mail application 82. Also, the icons 102 for the applications on the mobile devices 10 can be modified, named, moved, sorted, and otherwise interacted with for the purposes of organizing and/or manipulating the visibility of the icons 102 for those applications.
[0044] Turning now to Figure 5, the mobile device 10 displays a home screen 100, which is preferably the active screen when the mobile device 10 is powered up and constitutes the main ribbon application. The home screen 100 generally comprises a status region 104 and a theme background 106, which provides a graphical background for the display 12. The theme background 106 displays a series of icons 102 in a predefined arrangement on a graphical background.
[0045] In some themes, the home screen 100 may limit the number of icons 102 shown on the home screen 100 so as to not detract from the theme background 106, particularly where the background 106 is chosen for aesthetic reasons. The theme background 106 shown in Figure 5 provides a grid of icons. In other themes (not shown), a limited list of icons may be displayed in a column (or row) on the home screen along one portion of the display 12. In yet another theme, the entire list of icons may be listed in a continuous row along one side of the home screen on the display 12, enabling the user to scroll through the list while maintaining a limited number of currently visible icons on the display 12. In yet another theme (not shown), metadata may be displayed with each of a limited number of icons shown on the home screen. For example, the
next two appointments in the user's calendar may be accessed by the processor 64 and displayed next to the calendar icon. It will be appreciated that preferably several themes are available for the user to select and that any applicable arrangement may be used.
[0046] One or more of the series of icons 102 is typically a folder 112 that is itself capable of organizing any number of applications therewithin.
[0047] The status region 104 in this embodiment comprises a date/time display 107. The theme background 106, in addition to a graphical background and the series of icons 102, also comprises a status bar 110. The status bar 110 provides information to the user based on the location of the selection cursor 18, e.g. by displaying a name for the icon 102 that is currently highlighted.
[0048] Accordingly, an application such as the video player application 88 may be initiated (opened or viewed) from display 12 by highlighting a multimedia/video icon 114 using the positioning device 14 and providing a suitable user input to the mobile device 10. For example, the video player application 88 may be initiated by moving the positioning device 14 such that the multimedia/video icon 114 is highlighted as shown in Figure 5, and providing a selection input, e.g. by pressing the trackball 14b.
[0049] As noted above, one or more multimedia files 120 are stored in the multimedia storage portion 89 of memory 50, which are configured to be used with the video player 88. Multimedia files 120 are typically stored in a compressed (encoded) form that must be decompressed (decoded) by the processor 64 in order to be played on the video player 88. It will be appreciated that the multimedia files 120 may be loaded from an external source through a web browser or downloaded from a web site accessed through the communication system 34 and need not be stored directly on the mobile device 10. As such, both locally stored and streaming content are applicable to the principles discussed herein.
[0050] In one embodiment, video decoding is one of a number of tasks that the processor 64 is responsible for performing. Referring now to Figure 6, the processor 64 is shown with a number of defined tasks 124 that execute a particular set of instructions for providing a function
or service on the mobile device 10. In the example of video decoding, a video decoding task 122 obtains a target video frame rate 132 and the encoded video data 126 or video data stream from the multimedia file 120 stored in memory 50. The video decoding task 122 decodes the encoded video data 126 and provides decoded data 136 to the video player 88 at a particular frame rate that is preferably close to or exactly at the target frame rate 132. The video player 88 is responsible for playing the video on the display 12 using a suitable user interface such as a video portal, viewer, etc. Although not shown in Figure 6, if corresponding audio data 130 (see Figure 7) exists for the video data stream 126, the video player 88 also processes and plays the audio data stream 130 with the video data stream 126.
[0051] As can be seen in Figure 6, the processor 64 also processes user input 138 (e.g. keystrokes or positioning device movements) and user directed output 140 (e.g. status bar updates such as displaying an "unread mail" icon) to perform user related tasks 139. It may be noted that when a user is watching a video, they typically do not interact with the device (e.g. with user related tasks 139) the majority of the time. As such, when scheduling video decoding tasks 122, it has been recognized that resources for the user related tasks 139 (e.g. UI related resources) are more efficiently allocated on-demand or dynamically, e.g. according to the procedures discussed below.
[0052] Turning now to Figure 7, a multimedia file 120 is shown in greater detail. The multimedia file 120 contains a set of encoded video data 126 made up of a series of frames 128, a set or stream of audio data 130 that corresponds to (i.e. is synchronized to be played with) the video data stream 126 stored therein, the target frame rate 132 for that particular set of video data 126 (e.g. 30 frames per second), and other data 134 such as file name, codec information, author, etc.
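The contents of a multimedia file 120 enumerated above can be sketched as a simple container; the field names below are illustrative assumptions, not labels taken from Figure 7:

```python
from dataclasses import dataclass, field

@dataclass
class MultimediaFile:
    """Sketch of a multimedia file 120: encoded video frames 128, the
    corresponding audio data 130, the target frame rate 132, and other
    data 134 (all field names are illustrative)."""
    video_frames: list            # series of encoded frames 128 (video data 126)
    audio_stream: bytes           # audio data 130, played in sync with the video
    target_fps: int               # target frame rate 132, e.g. 30 fps
    metadata: dict = field(default_factory=dict)  # other data 134

f = MultimediaFile(video_frames=[b"I", b"B", b"P"], audio_stream=b"",
                   target_fps=30, metadata={"codec": "MPEG-4"})
```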
[0053] In the following embodiment, the video data stream 126 is encoded using MPEG video encoding, e.g. MPEG-4; however, it will be appreciated that the principles discussed below are equally applicable to other encoding/decoding schemes. In MPEG video encoding, a group of pictures is used to specify the order in which intra-frames and inter-frames are arranged, wherein the group of pictures is a stream of encoded frames in the video data stream 126. The
frames 128 in MPEG encoding are of the following types: an I-frame (intra coded) corresponds to a fixed image and is independent of other picture types; each group of pictures begins with this type of frame. A P-frame (predictive coded) contains difference information from the preceding I- or P-frame. A B-frame (bidirectionally predictive coded) contains difference information from the preceding and/or following I- or P-frame. D-frames may also be used, which are DC direct coded pictures used for fast advance. In the following examples, a video data stream 126 having I, B and P frames is used.
[0054] As shown in Figure 6, the video decode task 122 receives the encoded video data stream 126 and the target frame rate 132 and outputs decoded video 136 for the video player 88. The general components of the video decode task 122 are shown in greater detail in Figure 8. The video decode task 122 includes a decoder module 150 that decodes the video data stream 126 on a frame-by-frame basis. A compensation module 152 schedules and monitors the decoding process and updates an internal frame count 154 and scaling factor 156 as required during a compensation procedure performed thereby, as explained below. The compensation module 152 also communicates with the decoder module 150 to modify the decode schedule as needed. For video data 126 utilizing I, P and B frames, the frame count 154 tracks the number of frames decoded since the last I-frame. The scaling factor 156 keeps track of the average scaling experienced in previous decodes as a result of executions of the compensation procedure described below. The compensation module 152 also keeps track of a frame rate cap 158 that limits the upward scaling amount, such that re-scaling the video decode frame rate to catch up to the audio data stream 130 will not compete with the purpose of the compensation procedure and saturate the processor 64.
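The bookkeeping just described (frame count 154, scaling factor 156, frame rate cap 158) can be sketched as a small state object; all names and method signatures below are assumptions for illustration, not part of Figure 8:

```python
class CompensationModuleState:
    """Illustrative bookkeeping for a compensation module such as 152."""

    def __init__(self, target_fps, frame_rate_cap):
        self.frame_count = 0          # frames decoded since the last I-frame (cf. 154)
        self.scaling_count = 0.0      # accumulated scaling deviations (cf. 156)
        self.target_fps = target_fps  # target frame rate (cf. 132)
        self.frame_rate_cap = frame_rate_cap  # cap on upward rescaling (cf. 158)

    def record_decode(self, current_scaling):
        """Count a decoded frame and record the scaling applied to it."""
        self.frame_count += 1
        self.scaling_count += current_scaling

    def start_new_group(self):
        """Reset the per-group count when a new I-frame is encountered."""
        self.frame_count = 0

s = CompensationModuleState(target_fps=30, frame_rate_cap=30)
s.record_decode(1.0)   # frame decoded at the target rate (no scaling)
s.record_decode(0.5)   # frame decoded at half the target rate
```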
[0055] The compensation module 152 adjusts a frame rate timer 160 in response to the scaling that is deemed to be necessary, and the frame rate timer 160 instructs the decoder 150 at which rate to decode frames 128. The compensation module 152 also reacts to a synchronization (sync) timer 162 at predetermined intervals (e.g. 1 second) and monitors the decoding process to determine if synchronization between the video data stream 126 and the audio data stream 130 is required. As will be explained below, the compensation module 152 is also responsible for
skipping frames when the processor load is saturated and such saturation is not short-lived (transient) enough to be fixed by scaling the frame rate alone.
[0056] Video decoding is a periodic process, namely it uses processor power for specific intervals of time in a repeated fashion. Although it is important that the period for performing a periodic task be as accurate as possible, it is generally desirable to ensure that the mobile device 10 responds to user related tasks in a timely manner. As discussed above, some processor tasks use up significant processor power such that interactive tasks like cursor 18 movements are adversely affected and clearly noticeable to the user. Typically, user related tasks 139 are lower priority threads that are neglected when the processor 64 becomes saturated. It has been recognized that the lower priority threads, especially on a mobile device 10, may need to operate in most situations for the mobile device 10 to be considered 'usable'. As such, even though, e.g., a multimedia file 120 could be decoded in real time, a limit on the processor usage is set to ensure that the user related tasks 139 can occur without more than a transient saturation. The compensation procedure performed by the compensation module 152 scales the frame rate and, if necessary, skips frames in an adaptive manner, to lessen such adverse effects.
[0057] Turning now to Figure 9, a series of generalized timing diagrams are shown to illustrate the effect of the compensation module's operations when scaling the frame rate and/or skipping frames. In Figure 9, for ease of explanation, it will be appreciated that each period represents either a single frame 128 or a block of frames, e.g. those frames that are grouped with a particular leading I-frame. For each period, the portion at "1" indicates the amount dedicated to video decoding and thus consuming processor resources, and "0" indicates that video decoding is not occurring and the processor resources are available to and/or being used by other tasks 124. Figure 9 also shows a generic waveform along with each video timing diagram representing the corresponding stream of audio data 130. It has been recognized that where compensation of a multimedia output is required, degradation of the video data stream 126 is generally more agreeable than degradation of the audio data stream 130, since the human eye is generally less sensitive to variations in frame rate than the human ear is to variations in the audio. As such, it can be seen in the timing diagrams in Figure 9 that the audio data stream 130 maintains a consistent rate whilst
the frame rate for the video data stream 126 is compensated as needed in order to re-align with the audio data stream 130 at some point.
[0058] The compensation module 152 continuously monitors and schedules the decoding process, e.g. as shown in Figure 9, and may decode any number of frames in a group and skip the other frames, as will be explained below. In this way, anywhere from all of the frames down to zero frames in a particular group may be decoded. It is understood that decoding zero frames in every group causes no video to be played. If a group is skipped entirely, the additional processor time that is made available can be used to pre-decode for the next group, thus avoiding two successively skipped groups.
[0059] Timing diagram 1 in Figure 9 shows a normal decode sequence where the processor 64 is capable of decoding the multimedia file 120, i.e. both the video data stream 126 and the audio data stream 130, in 'real time' at the target frame rate 132, which produces the period shown. As shown in Figure 9, each time block Ti includes five frames 128, numbered 1-5. For ease of explanation, i = 0 to 4 in this example. Each time period T is equal to 1 second for simplicity. Therefore, in timing diagram 1, the target frame rate 132 is 5 frames per second (fps). When examining the subsequent timing diagrams 2 to 4, a cross-reference may be made to timing diagram 1 to ascertain how the video data stream 126 is being compensated in the scenarios depicted.
[0060] Turning now to timing diagram 2, it can be seen that between T1 and T2, the frame decode rate has been slowed down and thus only frames 1-4 are decoded in the same time that five frames would normally decode (i.e. 4 fps). It can be appreciated that by slowing down the frame rate, e.g. from 24 fps to 18 fps in a realistic situation, the processor time dedicated to video decoding can be decreased, thus freeing up processor time for other tasks such as the user tasks 139 or other transient tasks such as radio traffic. As will be explained in connection with the method described below, the compensation module 152 may reduce the frame rate to compensate for both transient saturation issues and continuous saturation issues.
[0061] Also in timing diagram 2, at T2, it is determined that whatever was saturating the processor 64 between T1 and T2 has gone away, and thus the compensation module 152 re-scales
the frame rate to 'catch up' or 'resynchronize' with the audio stream. As can be seen in timing diagram 2, the new frame rate enables six frames to be decoded between T2 and T3. This enables frame 5, which was not able to be decoded between T1 and T2, to be decoded, in addition to frames 1-5 of the next sequence (i.e. 6 fps in this time period). As such, at T3, the video stream is resynchronized with the audio stream (when compared to timing diagram 1). In this example, since the saturation was transient, e.g. occurring sometime around T1, the frame rate can return to the target frame rate 132 as that which occurred between T0 and T1. As shown below, rescaling (speeding up) the frame rate to 'catch up' competes against the goal of managing processor load, since a faster frame rate requires more processor time (i.e. less is available for other tasks 139). However, when saturation is transient, the compensation module 152 may be able to catch up in the next 1 s time interval as shown in timing diagram 2.
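The catch-up arithmetic in timing diagrams 1 and 2 can be verified with a short sketch (illustrative only; the function name and the 1 s block length are assumptions, not part of the disclosure):

```python
def catch_up_rate(target_fps, decoded_last_block, block_seconds=1.0):
    """Frame rate needed in the next block to clear the deficit left by a
    slowed-down block, as in timing diagram 2."""
    deficit = target_fps * block_seconds - decoded_last_block
    return target_fps + deficit / block_seconds

# 5 fps target, but only frames 1-4 were decoded between T1 and T2 (4 fps):
# the next block must run at 6 fps to resynchronize with the audio by T3.
rate = catch_up_rate(5, 4)
```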
[0062] Turning next to timing diagram 3, another scenario is shown wherein the compensation procedure utilizes both a variable frame rate and frame skipping. As before, the video decode sequence between T0 and T1 is at the target frame rate 132. At or around T1, it is determined that the processor 64 has become saturated. Although the saturation may be transient, it may also be more or less 'constant'. The compensation module 152 first scales the frame rate between T1 and T2 as in timing diagram 2. However, at T2, it is determined that, in order to catch up from the degradation (slow down) imposed between T1 and T2 in the next time block, i.e. T2 to T3, rescaling the frame rate alone will not resynchronize the video stream and the audio stream. This may be due to a particularly bad (prolonged) transient effect or constant saturation (e.g. another intensive program is running at the same time as the video player 88).
[0063] In this example, it is determined that the frame rate can only be rescaled back up to the target frame rate 132 without further saturating the processor 64 between T2 and T3. In order to resynchronize during this period, a frame is skipped, e.g. frame 5 from the previous sequence. As can be seen, at T3 the audio and video streams are resynchronized, and from T3 to T4 normal decoding occurs. By skipping frames, other frames may also need to be discarded if they depend on each other, such as in the case of an I-frame and the associated B and P frames. In this case, each frame shown in Figure 9 can be considered a frame block, and each skipped frame represents an I-frame and the associated B and P frames in that 'block' or 'group'. As will be
explained below, the video decode function 122 discards the frames (i.e. does not decode them) but does keep track of how many total frames were skipped for future calculations. If more than one frame 128 is skipped, they may be skipped in succession or spread out over a predetermined interval. Since P and B frames depend on sets of other frames (which can be determined by the module 152), and I-frames do not depend on other frames, the compensation module 152 may choose the group of frames 128 that has the closest duration to the time that is desired to be made up, such that no frame 128 which is not in the group depends on a frame 128 that is in that group.
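The group selection rule just described (pick the self-contained group whose duration is closest to the time to be made up) can be sketched as follows; representing a group as a (duration, externally-depended-on) pair is an assumption made for illustration:

```python
def choose_skip_group(groups, time_to_make_up):
    """Pick the group whose duration is closest to the time to be made up,
    restricted to groups that no outside frame depends on (a sketch; each
    group is a (duration_seconds, externally_depended_on) pair)."""
    candidates = [g for g in groups if not g[1]]  # self-contained groups only
    if not candidates:
        return None
    return min(candidates, key=lambda g: abs(g[0] - time_to_make_up))

# Three candidate groups; 0.4 s must be made up. The 0.5 s group is depended
# on from outside, so the 0.45 s self-contained group is chosen instead.
chosen = choose_skip_group([(0.2, False), (0.5, True), (0.45, False)], 0.4)
```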
[0064] Turning next to timing diagram 4, yet another scenario is shown. It can be seen that the frame sequence in timing diagram 4 is the same as in timing diagram 3 up to T2. At this point, it is determined that not only can the frame rate not be rescaled to catch up alone (i.e. frame skipping is needed), but the current scaled-down frame rate will also be required for the next time period. In this case, since only three additional frames can be decoded between T2 and T3, two frames will need to be skipped in order to catch up in the next 1 s interval. Although the video output on the display 12 may appear somewhat 'choppy' or 'jerky' for a brief period of time, since the audio stream will not be disrupted, in this example, at T3, the video will catch up and the audio should appear smooth. For long transient saturation or constant saturation, both a slower frame rate and frame skipping may be needed, either continuously throughout the video or for certain extended (and/or periodic) blocks of time, in order to leave a buffer of processor time available to enable the mobile device 10 to be usable. In timing diagram 4, frame 5 from the previous sequence is skipped, as is frame 3 from the next sequence. It will be appreciated that where the frames in Figure 9 represent groups of frames, the 'skipped' frames may be an I-frame with those B and P frames associated with that I-frame. In a general sense, where the 'skipped' frames are individual frames, the choice of which frames to skip can be made according to the nature of the video itself, based on the specific encoding/decoding process being used and/or to maintain continuity (e.g. do not skip two frames in a row).
[0065] Figures 10 and 11 illustrate an example algorithm for performing a variable frame rate and frame skipping compensation procedure when scheduling a video decode stream, to alter the video data stream 126, as shown generally in the timing diagrams in Figure 9 and described
above. In this example algorithm, frames are evaluated as frame blocks or groups of I, P and B frames.
[0066] Turning first to Figure 10, a procedure that is executed by the compensation module 152 according to the 'firing' or periodic interval of the frame rate timer 160 is shown. As noted above, the multimedia file 120 provides a target frame rate 132 that dictates the desired period at which to play the video on the display 12. The video player 88 will attempt to play the video at this desired rate by setting the frame rate timer 160 operated by the video decode task 122 accordingly. In this way, the video player 88 normally attempts to schedule the video decodes such that they occur once per period. For example, if the desired frame rate is 25 fps, the decoder module 150 attempts to decode one frame every 40 ms. In this case, the frame rate timer 160 would 'fire' every 40 ms.
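The relationship between the target frame rate 132 and the firing period of the frame rate timer 160 is a simple reciprocal, sketched here (the function name is assumed):

```python
def frame_period_ms(target_fps):
    """Firing period of the frame rate timer 160 for a given target rate 132."""
    return 1000.0 / target_fps

# A 25 fps target means decoding one frame every 40 ms.
period = frame_period_ms(25)
```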
[0067] The procedure shown in Figure 10 repeats for each group of frames, and consequently maintains the frame count 154 as a reference to how many frames have been decoded since the last I-frame was encountered. After each frame 128 is decoded at step 200, the frame count 154 is incremented by one at step 202. Also in step 202, the current scaling factor being applied to the frame (i.e. the deviation from the target frame rate 132) is added to a scaling count 156. The scaling count 156 tracks the scaling applied over time to determine the average scaling for predicting whether future rescaling can be performed to catch the video up with the audio.
[0068] For each frame 128 that is decoded, the compensation module 152 determines at step 204 whether the current processor usage is sufficient, by determining if the amount of processor time given to an idle task (not shown) and other applications 80 since the last I-frame is a sufficient percentage of the total processor time consumed since the last I-frame. The idle task represents the amount of time that the processor 64 is not performing any task. Reference is made to the I-frame, since any group of frames 128 that can be displayed starts with an I-frame, as frames 128 in such a decoding scheme cannot be decoded unless the frame 128 is either an I-frame or the frame 128 before it was also decoded. As such, when an I-frame is decoded, it is possible to drop out of the group (i.e. skip the remaining frames 128 in the group) if the particular group of
frames is using too much processor power to decode. The process may then begin again at the next I-frame.
[0069] If the amount of time is sufficient, this means that the amount of processor time dedicated to the idle task and other applications 80 is greater than a predetermined threshold. Setting this threshold low makes the video decoding smoother, whereas setting it high makes applications more responsive. The threshold can vary based on the nature of the mobile device 10 and what applications 80 and features are generally available. The threshold should be chosen so that the mobile device 10 is responsive and the processor time is balanced. The decoder module 150 then determines at step 206 if the next frame is an I-frame, i.e. whether the end of a group of frames has been reached. If not, steps 200-204 are repeated for the remaining frames. If so, for each I-frame that is encountered, the frame counter 154 is reset and a timestamp recorded for the leading I-frame of the next block at step 208, and the next group of frames 128 can be decoded. When the target frame rate 132 is achievable without any compensation, these steps will repeat until saturation occurs.
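Steps 200-208 of Figure 10 can be summarized in the following sketch; the 25% idle threshold, the dictionary layout, and the helper names are all assumptions for illustration, not values taken from the disclosure:

```python
def decode(frame):
    """Placeholder for the per-frame decode (cf. decoder module 150)."""
    pass

IDLE_THRESHOLD = 0.25  # assumed: minimum fraction of processor time since the
                       # last I-frame that must go to the idle task and others

def process_frame(frame, state, idle_and_other_time, total_time):
    """One pass of steps 200-204: decode, update counters, check usage.
    Returns True when processor usage is sufficient (no saturation)."""
    decode(frame)                               # step 200: decode the frame
    state["frame_count"] += 1                   # step 202: count the frame
    state["scaling_count"] += state["scaling"]  # step 202: record the scaling
    # step 204: enough time left for the idle task and other applications?
    return (idle_and_other_time / total_time) >= IDLE_THRESHOLD

def on_next_i_frame(state, timestamp):
    """Step 208: reset the count and stamp the leading I-frame of the next group."""
    state["frame_count"] = 0
    state["i_frame_timestamp"] = timestamp
```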
[0070] If the amount of time is insufficient (saturation detected), then the compensation module 152 will scale down the frame rate. At step 210, the compensation module 152 first looks at the previous scaling performed, per what is stored in the scaling count 156. This is done to determine, of the amount of processor usage dedicated to the applications 80 currently running, what percentage is consumed by the video decode task 122 and what percentage is dedicated to the other tasks 124. If the amount of processor usage consumed by the other tasks 124 stays the same, and considering the previous average scaling, it may then be determined at step 212 how the video decoding task 122 should be scaled in order to have the total processor usage for the applications 80 and the idle task meet a particular threshold or target usage. This target usage is based on a predetermined maximum processor usage that leaves enough processor power to accommodate the user related tasks 139.
[0071] Based on the above determination at step 212, a scaling factor can be applied at step 214, and this scaling factor is added to the scaling count 156 so that if the frame rate is decreased (slowed), it can later be rescaled by increasing (speeding up) the frame rate to catch the video up to the audio. The procedure then determines at step 206 if the next I-frame is encountered and resets the counter 154 if this is true. It can therefore be seen that at each frame decode, the current scaling (and the average of previous scalings) is examined. For example, if at a first frame 128 the frame rate is decreased, the decode for the next frame 128 is slower and, if the saturation is transient, the compensation module 152 may determine, once that next frame 128 is decoded, that the video can be rescaled. However, if at the next frame the processor usage does not correct itself, further scaling can be performed; thus the frame rate can be decreased at each frame 128 and then readjusted if necessary when the sync timer 162 fires, as will be explained below.
21700696.1
-20-

CA 02612122 2007-11-23
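The scaling determination of steps 210-214 can be illustrated with a small calculation: shrink the decode task just enough that total processor usage meets the target. This is a hedged sketch; the function name, the clamping behaviour, and the example figures are assumptions, not the patented implementation.

```python
# Illustrative sketch of steps 210-214 (names and edge cases assumed):
# given the processor share of the decode task and of the other tasks,
# find the factor by which the frame rate should be scaled so that
# total usage meets a target that leaves room for user-related tasks.

def scaling_factor(decode_usage, other_usage, target_usage):
    """Return a multiplier in [0, 1] for the frame rate timer.

    decode_usage: fraction of processor used by video decode task 122
    other_usage:  fraction used by the other running tasks 124
    target_usage: maximum total usage (threshold of step 212)
    """
    budget = target_usage - other_usage   # share decoding may consume
    if budget <= 0:
        return 0.0                        # no headroom: decode must all but stop
    if decode_usage <= budget:
        return 1.0                        # already within target: no scaling
    return budget / decode_usage          # shrink decoding to fit the budget
```

For example, with other tasks at 40% usage, decoding at 50%, and a target of 80%, the decoder would be scaled to roughly 0.8 of its current frame rate.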
[0072] Turning now to Figure 11, a procedure performed whenever the sync timer 162 fires is shown. The sync timer 162 can be set to fire at a predetermined interval, e.g. 1 s. This interval should be chosen to balance the competing objectives of allowing enough time to catch up the video while not leaving too long a gap with `bad video'. It will be appreciated that the interval can also be linked to other parameters, such as each group of frames (i.e. monitoring the environment at every I-frame). However, it will be appreciated that since I-frames do not appear at consistent intervals, this alternative may result in poorer performance (or fire too often) than an arbitrary but consistent interval. In this example, the sync timer 162 fires every 1 s.
[0073] As noted above, if the processor usage does not correct itself (i.e. the saturation is not sufficiently transient), simply slowing down and then attempting to speed up the frame rate timer 160 may cause the playback to be too slow, and frames 128 may need to be skipped to catch up or to periodically resynchronize the video and the audio. When the sync timer 162 fires, the decoder module 150 first determines at step 220 if the video and audio are out of sync. If not, then frames 128 may continue to be decoded as the frame timer 160 fires (e.g. per Figure 10). If the video and audio are out of sync, the decoder module 150 then determines at step 222 the scaling factor that would be required to catch up the video to the audio.
[0074] The decoder module 150 then looks at the required scaling factor at step 224 and, if the new frame rate that would be required to catch up is greater than the frame rate cap 158, frames 128 need to be skipped. The frame rate cap 158 ensures that the compensation module 152 does not compete with itself, since an increase in frame rate ultimately increases the overall processor usage. The frame rate cap 158 is a target value that is intended to leave enough processor power to the user related tasks 139 such that the response to these tasks is not detrimental to the usability of the mobile device 10. The frame rate cap 158 is typically dependent on the number of user related tasks 139 and the overall processor power. As such, the frame rate cap 158 will vary from one mobile device 10 to another. Therefore, the frame rate cap 158 avoids `overcompensating' in terms of scaling. If the frame rate cap 158 does come into effect at step 224, this signifies that whatever task(s) was/were consuming a large amount of processing time was not sufficiently transient to be compensated for in the next sync timer cycle (e.g. 1 s). If the frame rate cap 158 does not come into effect at step 224, the scaling can be applied to the frame rate at step 226 to compensate and thus catch up the video to the audio according to the determination made at step 222.
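Steps 220-226 amount to a three-way decision each time the sync timer 162 fires: do nothing, rescale, or fall back to frame skipping. A hedged sketch, assuming the lag is measured in frames and the sync cycle is one second (all names are illustrative):

```python
# Illustrative sketch of the sync-timer decision (steps 220-226).
# The lag measure, rate model, and names are assumptions.

def on_sync_timer(lag_frames, base_rate, cap_rate, cycle_s=1.0):
    """Return ("in_sync" | "rescale" | "skip", frame_rate_to_use).

    lag_frames: frames the video is behind the audio (step 220)
    base_rate:  target frame rate 132, in frames/s
    cap_rate:   frame rate cap 158, in frames/s
    """
    if lag_frames <= 0:
        return "in_sync", base_rate          # step 220: nothing to do
    # step 222: rate needed to clear the backlog within one sync cycle
    required = base_rate + lag_frames / cycle_s
    if required > cap_rate:                  # step 224: cap comes into effect
        return "skip", cap_rate              # frames must be skipped (Figure 12)
    return "rescale", required               # step 226: speed up to catch up
```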
[0075] If frames 128 need to be skipped, the procedure shown in Figure 12 may be executed. To skip frames 128, the compensation module 152 first determines at step 230 how fast the frame rate timer 160 can be set without saturating the processor 64, and determines at step 232 the frame rate that would be required to catch up the video to the audio. At step 234, the compensation module 152 then determines the difference between what is required to compensate for the lag in the video and what can be done according to the restrictions on processor usage. In other words, it is first determined how fast the frame rate can be set without saturating the processor 64, and then determined how much this differs from how fast the frame rate timer 160 would have to be set in order to resynchronize the video and audio in the next sync timer cycle (e.g. 1 s).
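The comparison in steps 230-234 can be reduced to a shortfall calculation: how many frames will still be outstanding after one sync cycle if the frame rate timer 160 runs as fast as the processor 64 allows. The function name and the rounding-up choice are assumptions for illustration:

```python
import math

# Sketch of steps 230-234 (names assumed): the gap between the rate
# required to resynchronize (step 232) and the fastest sustainable
# rate (step 230), expressed in whole frames, is the number of frames
# that must be skipped rather than decoded.

def frames_short(max_sustainable_rate, required_rate, cycle_s=1.0):
    """Frames still outstanding after one sync cycle at the best
    sustainable frame rate (the difference computed at step 234)."""
    deficit_rate = required_rate - max_sustainable_rate  # frames/s shortfall
    if deficit_rate <= 0:
        return 0                 # scaling alone can resynchronize
    return math.ceil(deficit_rate * cycle_s)  # whole frames to skip
```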
[0076] In this example, this can be done by calculating how far behind the video data stream 126 will be 1 s from now when compared to where it should be to match up with the audio. This is represented pictorially in Figure 9, where at T2 in timing diagram 2 the video data stream 126 is two frames behind. When dealing with groups of frames, the compensation module 152 counts back from the next I-frame to determine how many frames would have to be discarded, instead of being decoded, in order to bring the video back up to the same point as the audio. In general terms, this is illustrated in timing diagram 3 of Figure 9, where frame 5 from the previous sequence and frame 3 from the next sequence are discarded or their decoding is skipped. This example can represent a case where two groups of frames are discarded in order to catch up, namely group 5 from the previous block of groups and group 3 from the next block of groups. When dealing with I-frames, the compensation module 152 can use the frame count 154 to determine how far away from an I-frame the decoder is, so that it can schedule at step 236 how many frames to discard. It can therefore be seen that the general principles shown in Figure 9 can be adapted for use with the various possible encoding and decoding schemes and may apply to both frame-by-frame analysis and group-of-frames analysis.
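One way to realize step 236 is sketched below, assuming a simple queue of frames flagged as I-frames or not: the queue is split into groups at each I-frame, and frames are discarded counting back from the end of each group, never dropping a leading I-frame. The group-splitting and drop-order heuristic are illustrative stand-ins, not the patented implementation.

```python
# Hypothetical sketch of step 236: pick frames to discard by counting
# back from each upcoming I-frame. Frame layout and names are assumed.

def frames_to_discard(queue, n_skip):
    """Pick up to `n_skip` frame ids to drop, taking the frames closest
    to the next I-frame first so groups stay decodable from their I-frame.

    queue: list of dicts like {"id": int, "iframe": bool}
    """
    # split the queue into groups, each starting at an I-frame
    groups, current = [], []
    for frame in queue:
        if frame["iframe"] and current:
            groups.append(current)
            current = []
        current.append(frame)
    if current:
        groups.append(current)
    discard = []
    for group in groups:
        # never drop the leading I-frame; drop trailing frames first,
        # since they have the fewest dependent frames
        for frame in reversed(group[1:]):
            if len(discard) == n_skip:
                return discard
            discard.append(frame["id"])
    return discard
```

With two five-frame groups (I-frames at positions 0 and 5), skipping two frames drops the last two frames of the first group, in the spirit of the Figure 9 example where the trailing frames of a sequence are skipped.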
[0077] These frames 128 are then discarded from the queue at step 238 and are thus not decoded. The compensation module 152 tracks at step 240 the number of frames 128 that are discarded so that, when the decoder module 150 reaches such frames in the decode schedule, it can count them towards the number of frames having been played in order to calculate how to scale the frame rate timer 160 to maintain a certain percentage of processor usage for the applications 80 and the idle task in Figure 10.
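The bookkeeping of steps 238-240 can be sketched as a small counter object (names assumed) in which discarded frames still count toward the frames "played", so that later frame-rate scaling calculations are not thrown off:

```python
# Illustrative sketch of steps 238-240 (names assumed): discarded frames
# are removed from the decode queue but still counted as played.

class FrameAccounting:
    def __init__(self):
        self.decoded = 0
        self.discarded = 0

    def on_decoded(self):
        self.decoded += 1        # a frame actually decoded and displayed

    def on_discarded(self):
        self.discarded += 1      # step 240: track frames skipped at step 238

    @property
    def played(self):
        # skipped frames count toward frames "played" when scaling the
        # frame rate timer 160 to maintain the target processor usage
        return self.decoded + self.discarded
```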
[0078] Accordingly, the procedure shown in Figure 10 operates according to the frame rate timer 160 in order to adapt to the current processor usage and compensate by degrading the video data stream 126 in order to keep the mobile device 10 usable. The procedure in Figure 11 operates less frequently than the procedure in Figure 10 and periodically evaluates the environment to see if non-transient saturation is preventing the scaling from bringing the video back into alignment with the audio, as represented in timing diagram 1 of Figure 9. If necessary, the procedure in Figure 11 may call the frame skipping procedure shown in Figure 12 to combine both scaling and frame skipping in an adaptive manner to handle both transient and constant saturation of the processor 64.
[0079] It can therefore be seen that the above provides a method of decoding a multimedia file that can handle both transient and constant saturation of the processor 64 by performing a method that may incorporate both a variable frame rate procedure and an adaptive frame skipping procedure, as required. The method operates on multimedia files 120 having a video data stream 126 including a series of frames 128 and an audio data stream 130 to be played in synchronization with the video data stream as frames are decoded.
[0080] The method comprises decoding the frames of the video data stream and playing the video data stream at a target frame rate while playing the audio data stream therewith. At a first periodic interval, it is determined if the target frame rate can be achieved while meeting a predetermined usage threshold for the processor, wherein if the usage threshold is met, subsequent frames are decoded at the target frame rate, and if the usage threshold is not met, a scaling factor is determined which is required to meet the usage threshold, the target frame rate is modified according to the scaling factor, and subsequent frames are decoded at the modified frame rate. At a second periodic interval, it is determined if the video data stream and the audio data stream are out of synchronization, wherein if they are not out of synchronization, subsequent frames are decoded at the target frame rate, and if they are out of synchronization, a rescaling factor is determined which is required to catch the video data stream up to the audio data stream, the rescaling factor is applied to the modified frame rate and, if the re-modified frame rate does not exceed a capped frame rate, subsequent frames are decoded at the re-modified frame rate until the video data stream is resynchronized with the audio data stream.
[0081] If the re-modified frame rate does exceed the capped frame rate, subsequent frames are decoded at the capped frame rate while skipping the decoding of one or more frames to be decoded within a next time interval.
[0082] It will be appreciated that the examples described above are for illustrative purposes only and that many other variations can be used according to the principles described. This applies, e.g., to general computing devices, both mobile and stationary, that are used to decode and display video.
[0083] Although the above has been described with reference to certain specific embodiments, various modifications thereof will be apparent to those skilled in the art as outlined in the appended claims.
Representative Drawing
A single figure which represents a drawing illustrating the invention.
Administrative Status

Event History

Description | Date
Common Representative Appointed 2019-10-30
Common Representative Appointed 2019-10-30
Change of Address or Method of Correspondence Request Received 2018-12-04
Grant by Issuance 2015-03-24
Inactive: Cover page published 2015-03-23
Appointment of Agent Requirements Determined Compliant 2015-02-12
Inactive: Office letter 2015-02-12
Revocation of Agent Requirements Determined Compliant 2015-02-12
Inactive: Office letter 2015-02-11
Revocation of Agent Request 2015-01-27
Change of Address or Method of Correspondence Request Received 2015-01-27
Appointment of Agent Request 2015-01-27
Pre-grant 2014-12-16
Inactive: Final fee received 2014-12-16
Inactive: Office letter 2014-10-20
Letter Sent 2014-10-20
Notice of Allowance is Issued 2014-07-08
Letter Sent 2014-07-08
Notice of Allowance is Issued 2014-07-08
Inactive: Approved for allowance (AFA) 2014-07-03
Inactive: Q2 passed 2014-07-03
Inactive: IPC deactivated 2014-05-17
Inactive: IPC from SCB 2014-02-01
Amendment Received - Voluntary Amendment 2014-01-28
Inactive: IPC expired 2014-01-01
Inactive: IPC removed 2013-11-04
Inactive: First IPC assigned 2013-11-04
Inactive: IPC assigned 2013-11-04
Inactive: S.30(2) Rules - Examiner requisition 2013-07-29
Amendment Received - Voluntary Amendment 2013-01-17
Inactive: S.30(2) Rules - Examiner requisition 2012-08-13
Amendment Received - Voluntary Amendment 2012-03-28
Inactive: S.30(2) Rules - Examiner requisition 2011-09-30
Application Published (Open to Public Inspection) 2009-05-23
Inactive: Cover page published 2009-05-22
Inactive: Office letter 2009-05-05
Letter Sent 2009-05-05
Inactive: Single transfer 2009-03-20
Inactive: First IPC assigned 2008-04-02
Inactive: IPC assigned 2008-04-01
Inactive: IPC assigned 2008-04-01
Inactive: IPC removed 2008-04-01
Inactive: IPC assigned 2008-04-01
Inactive: Filing certificate - RFE (English) 2008-01-11
Letter Sent 2008-01-11
Application Received - Regular National 2008-01-11
Request for Examination Requirements Determined Compliant 2007-11-23
All Requirements for Examination Determined Compliant 2007-11-23

Abandonment History

There is no abandonment history.

Maintenance Fees

The last payment was received on 2014-11-03.

Owners on Record

The current and past owners on record are shown in alphabetical order.

Current owners on record
BLACKBERRY LIMITED
Past owners on record
AARON B. SMALL
DAVID MAK-FAN
THOMAS C. NAGY
Past owners that do not appear in the "Owners on Record" list will appear in other documents on file.
Documents



Document Description | Date (yyyy-mm-dd) | Number of pages | Image size (KB)
Description | 2007-11-22 | 24 | 1,333
Abstract | 2007-11-22 | 1 | 18
Claims | 2007-11-22 | 5 | 195
Drawings | 2007-11-22 | 10 | 156
Representative drawing | 2009-04-26 | 1 | 7
Cover Page | 2009-05-14 | 2 | 42
Claims | 2012-03-27 | 5 | 194
Claims | 2013-01-16 | 9 | 356
Claims | 2014-01-27 | 7 | 285
Cover Page | 2015-02-17 | 1 | 37
Representative drawing | 2015-02-17 | 1 | 6
Acknowledgement of Request for Examination | 2008-01-10 | 1 | 176
Filing Certificate (English) | 2008-01-10 | 1 | 159
Courtesy - Certificate of registration (related document(s)) | 2009-05-04 | 1 | 103
Maintenance Fee Reminder | 2009-07-26 | 1 | 110
Commissioner's Notice - Application Found Allowable | 2014-07-07 | 1 | 161
Correspondence | 2009-05-04 | 1 | 10
Correspondence | 2014-10-19 | 1 | 22
Correspondence | 2014-12-15 | 3 | 79
Correspondence | 2015-01-26 | 10 | 572
Correspondence | 2015-02-10 | 4 | 402
Correspondence | 2015-02-11 | 4 | 713