Patent 2823388 Summary

(12) Patent Application: (11) CA 2823388
(54) English Title: METHOD AND APPARATUS FOR GESTURE BASED CONTROLS
(54) French Title: PROCEDE ET APPAREIL POUR DES COMMANDES A BASE DE GESTES
Status: Deemed Abandoned and Beyond the Period of Reinstatement - Pending Response to Notice of Disregarded Communication
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04N 5/775 (2006.01)
  • H04N 21/432 (2011.01)
  • H04N 21/472 (2011.01)
(72) Inventors :
  • HAYES, ROBIN (United States of America)
(73) Owners :
  • TIVO INC.
(71) Applicants :
  • TIVO INC. (United States of America)
(74) Agent: SMITHS IP
(74) Associate agent: OYEN WIGGS GREEN & MUTALA LLP
(45) Issued:
(86) PCT Filing Date: 2012-01-05
(87) Open to Public Inspection: 2012-07-12
Examination requested: 2013-06-27
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2012/020306
(87) International Publication Number: WO 2012094479
(85) National Entry: 2013-06-27

(30) Application Priority Data:
Application No. Country/Territory Date
12/986,054 (United States of America) 2011-01-06
12/986,060 (United States of America) 2011-01-06

Abstracts

English Abstract

In an embodiment, a number of parallel gestures are detected, in a particular area on a touch screen interface of a device. A command is identified based at least on the parallel gestures and an action associated with the command is performed.


French Abstract

Dans un mode de réalisation, un certain nombre de gestes parallèles sont détectés, dans une zone particulière sur une interface d'écran tactile d'un dispositif. Une commande est identifiée au moins sur la base des gestes parallèles et une action associée à la commande est effectuée.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
1. A method, comprising:
detecting a slide gesture, in a particular area on a touch screen interface of a device, from a first location in the particular area to a second location in the particular area;
identifying a video playback command for a video based at least on the slide gesture;
performing an action associated with the video playback command.
2. The method as recited in Claim 1, wherein the sliding gesture is detected without detecting selection of any video progress indicator displayed within the particular area.
3. The method as recited in Claim 1, wherein the slide gesture is detected in the particular area concurrently with displaying at least a portion of the video in the particular area.
4. The method as recited in Claim 1, wherein the slide gesture is detected in the particular area while displaying information on how to perform one or more gestures in the particular area.
5. The method as recited in Claim 1, wherein identifying the video playback command comprises:
identifying the particular area, in which the slide gesture was detected, from a plurality of areas on the touch screen interface;
wherein identifying the video playback command comprises selecting the video playback command from a plurality of video playback commands associated with the particular area.
6. The method as recited in Claim 1, wherein performing the action comprises a first device sending information to a second device, the information based on the video playback command.
7. The method as recited in Claim 1, wherein performing the action associated with the video comprises performing the action on a same device as the device detecting the slide gesture.
8. The method as recited in Claim 1, wherein the video playback command selects a playing speed and direction.
9. The method as recited in Claim 1, wherein the slide gesture comprises a swipe gesture from the first location to a second location.
10. The method as recited in Claim 1, wherein the slide gesture comprises a flick gesture starting at the first location.
11. The method as recited in Claim 1, wherein the video playback command is for one or more of:
pausing the playing of the video;
resuming the playing of the video;
replaying a played portion of the video;
stopping playing of the video;
stopping playing of the video and resuming playing of the video at a particular playing position;
playing the video in slow motion;
frame-stepping through a video;
playing the video from the beginning;
playing one or more videos from a next playlist;
playing the video from a particular scene forward;
bookmarking a playing position in the video;
stopping playing and resuming playing at a bookmarked position; or
rating the video.
12. A method, comprising:
concurrently detecting a plurality of parallel gestures on a touch screen interface of a device;
determining a number of the plurality of parallel gestures;
selecting a command from a plurality of commands based on the number of the plurality of parallel gestures;
performing an action associated with the command.
13. The method as recited in Claim 12, wherein selecting the command comprises selecting a menu option based on the number of the plurality of parallel gestures.
14. The method as recited in Claim 12, wherein the plurality of parallel gestures comprise a plurality of parallel sliding gestures performed in a same direction.
15. The method as recited in Claim 12, wherein determining the number of the plurality of parallel gestures comprises determining a number of tap gestures concurrently performed on the touch screen interface.
16. The method as recited in Claim 12, further comprising determining a playback speed for playing of multi-media content based on the number of the plurality of parallel gestures.
17. A method, comprising:
concurrently detecting a plurality of parallel gestures on a touch screen interface of a remote control device in a left to right direction;
determining a number of the plurality of parallel gestures;
selecting a playback speed from a plurality of playback speeds based on the number of the plurality of parallel gestures;
playing multimedia content on a multimedia device at the selected playback speed.
18. The method as recited in Claim 17, wherein the plurality of playback speeds comprises two or more fast-forward speeds.
19. The method as recited in Claim 17, wherein the multimedia device comprises an audio device.
20. The method as recited in Claim 17, wherein the multimedia device comprises a video device.
21. A method, comprising:
concurrently detecting a plurality of parallel gestures on a touch screen interface of a remote control device in a right to left direction;
determining a number of the plurality of parallel gestures;
selecting a rewind speed from a plurality of rewind speeds based on the number of the plurality of parallel gestures;
playing multimedia content in a rewind mode on a multimedia device at the selected rewind speed.
22. The method as recited in Claim 21, wherein the plurality of playback speeds comprises two or more rewind speeds.
23. A computer readable storage medium comprising a sequence of instructions, which when executed by one or more processors, cause performing steps as recited in any one of Claims 1-22.
24. A device comprising:
one or more processors;
the device configured to perform the steps as recited in any one of Claims 1-22.
Description

Note: Descriptions are shown in the official language in which they were submitted.


METHOD AND APPARATUS FOR GESTURE BASED CONTROLS
FIELD OF THE INVENTION
[0001] The present invention relates to the use of gestures. Specifically, the invention relates to gesture-based controls for multimedia content.

BACKGROUND

[0002] The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.

[0003] Multimedia content such as web pages, images, video, slides, text, graphics, sound files, audio/video files, etc. may be displayed or played on devices. Commands related to playing or displaying of content on devices may be submitted by a user on the device itself or on a separate device functioning as a remote control.

[0004] For example, a user may select a button on a remote control to play, pause, stop, rewind, or fast-forward a video being displayed on a television.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements and in which:

[0006] Figure 1 is a block diagram illustrating an example system in accordance with one or more embodiments;

[0007] Figure 2 illustrates a flow diagram for detecting a gesture in accordance with one or more embodiments;

[0008] Figure 3 illustrates an example interface in accordance with one or more embodiments;

[0009] Figure 4 shows a block diagram that illustrates a system upon which an embodiment of the invention may be implemented.

DETAILED DESCRIPTION

[0010] In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
[0011] Several features are described hereafter that can each be used independently of one another or with any combination of the other features. However, any individual feature might not address any of the problems discussed above or might only address one of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein. Although headings are provided, information related to a particular heading, but not found in the section having that heading, may also be found elsewhere in the specification.

[0012] Example features are described according to the following outline:
1.0 OVERVIEW
2.0 SYSTEM ARCHITECTURE
3.0 GESTURES
4.0 GESTURE AREA(S)
5.0 COMMANDS
6.0 DETECTING A GESTURE WITHIN A GESTURE AREA
7.0 EXAMPLE GESTURES AND COMMANDS
8.0 REMOTE CONTROL USE EXAMPLES
9.0 EXAMPLE EMBODIMENTS
10.0 HARDWARE OVERVIEW
11.0 EXTENSIONS AND ALTERNATIVES
1.0 OVERVIEW
[0013] In an embodiment, a gesture is detected in a particular area of a touch screen interface on a device. The gesture may not necessarily select or move any visual objects within the particular area. For example, the gesture may be detected in a blank box, on top of a video, on top of instructional information for performing gestures, etc. A video playback command associated with the gesture may be identified, and an action corresponding to the video playback command may be determined. The action may then be performed on the same device that detects the gesture. Alternatively, the action may be performed on a different device that is communicatively coupled with the device that detects the gesture.

[0014] In an embodiment, multiple input instruments (e.g., multiple fingers) may be used concurrently to perform parallel or identical gestures on a touch screen interface. Based on the number of gestures that are detected, an action may be selected. For example, the number of gestures may be used to select a particular item from a menu or to identify a command.

[0015] Although specific components are recited herein as performing the method steps, in other embodiments agents or mechanisms acting on behalf of the specified components may perform the method steps. Further, although some aspects of the invention are discussed with respect to components on a system, the invention may be implemented with components distributed over multiple systems. Embodiments of the invention also include any system that includes the means for performing the method steps described herein. Embodiments of the invention also include a computer readable medium with instructions, which when executed, cause the method steps described herein to be performed.
2.0 SYSTEM ARCHITECTURE
[0016] Although a specific computer architecture is described herein, other embodiments of the invention are applicable to any architecture that can be used to perform the functions described herein.

[0017] Figure 1 is a block diagram illustrating an example system (100) in accordance with one or more embodiments. The example system (100) includes one or more components that function as content sources, touch screen interface devices, multimedia devices (e.g., devices that play audio and/or video content), and/or content management devices. Each of these components is presented to clarify the functionalities described herein and may not be necessary to implement one or more embodiments.

[0018] Components not shown in Figure 1 may also be used to perform the functionalities described herein. Functionalities described as performed by one component may instead be performed by another component.
[0019] An example system (100) may include one or more of: an input device (110), a multimedia device (140), and a data repository (150). One or more devices shown herein may be combined into a single device or further divided into multiple devices. For example, the input device (110) and the multimedia device (140) may be implemented in a single device. The multimedia device (140) may be configured to play audio and/or video content. The multimedia device (140) may be configured to display one or more still images. In another example, an input device (110) may be used as a remote control detecting gesture-based commands related to content being displayed on a separate multimedia device (140). The input device (110) may communicate directly with the multimedia device (140) or may communicate with an intermediate device (not shown). The intermediate device may, for example, function as a content source for the multimedia device (140) or a media management device. A network bus (102) connecting all components within the system (100) is shown for clarity. The network bus (102) may represent any local network, intranet, Internet, etc. The network bus (102) may include wired and/or wireless segments. Although shown as communicatively coupled, all components may not necessarily be communicatively coupled to all other components within the system (100).

[0020] In an embodiment, input device (110) may include a touch screen interface (115) configured to detect one or more gestures, as described herein. Input device (110) may be configured to detect a gesture, a path of a gesture, a speed of a gesture, an acceleration of the gesture, a direction of a gesture, etc.
[0021] In one example, input device (110) may include a resistive system where an electrical current runs through two layers which make contact at spots/areas on the touch screen interface (115) that are touched. The coordinates of the contact points or contact spots may be compared to gesture information stored in a data repository (150) to identify a gesture performed by a user on the touch screen interface (115). In another example, input device (110) may include a capacitive system with a layer that stores electrical charge, a part of which is transferred to a user where the user touches the touch screen interface (115). In another example, input device (110) may include a surface acoustic wave system with two transducers with an electrical signal being sent from one transducer to another transducer. Any interruption of the electrical signal (e.g., due to a user touch) may be used to detect a contact point on the touch screen interface (115). For example, input device (110) may be configured to first detect an initial user touch on a visual representation of data displayed on the touch screen interface.
[0022] In an embodiment, input device (110) may include hardware configured for receiving data, transmitting data, or otherwise communicating with other devices in the system (100). For example, input device (110) may be configured to detect a gesture performed by a user and perform a video playback action associated with the gesture. In another example, input device (110) may include functionality to transmit information (may be referred to herein as and used interchangeably with "metadata") associated with the gesture. For example, input device (110) may be configured to transmit information comprising a chronological sequence of detected contact points on the touch screen interface (115).
[0023] In an embodiment, input device (110) may include one or more of: Read Only Memory (ROM) (206), a Central Processing Unit (CPU), Random Access Memory (RAM), Infrared Control Unit (ICU), a key pad scan, a key pad, Non-Volatile Memory (NVM), one or more microphones, a general purpose input/output (GPIO) interface, a speaker/tweeter, a key transmitter/indicator, a microphone, a radio, an Infrared (IR) blaster, a network card, a display screen, a Radio Frequency (RF) Antenna, a QWERTY keyboard, a network adapter, network interface controller (NIC), network interface card, Local Area Network adapter, Ethernet network card, and/or any other component that can receive information over a network. In an embodiment, input device (110) may be configured to communicate with one or more devices through wired and/or wireless segments. For example, the input device (110) may communicate wirelessly over one or more of: radio waves (e.g., Wi-Fi signal, Bluetooth signal), infrared waves, over any other suitable frequency in the electro-magnetic spectrum, over a network connection (e.g., intranet, internet, world wide web, etc.), or through any other suitable method.
[0024] In an embodiment, input device (110) generally represents any device which may be configured for detecting a gesture as user input. A user (including any operator of input device (110)) may perform a gesture by touching the touch screen interface (115) on the input device (110). For example, a user may perform a gesture by tapping the touch screen interface (115) with a finger or sliding a finger on the touch screen interface (115).

[0025] For clarity, examples described herein may refer to a particular input instrument (e.g., a user's finger) to perform gestures. However, any input instrument including, but not limited to, a stylus, a user's finger, a pen, a thimble, etc. may be used to perform gestures in accordance with one or more embodiments.

[0026] Gestures relating to touching or making contact with the touch screen interface (115), as referred to herein, may include hovering over a touch screen interface (115) with a finger (or other input instrument) without necessarily touching the touch screen interface (115), such that the touch screen interface (115) detects the finger (e.g., due to transfer of electrical charge at a location on the touch screen interface (115)).
3.0 GESTURES
[0027] In an embodiment, a tap gesture may be performed by touching a particular location on the touch screen interface (115) and then releasing contact with the touch screen interface (115). A tap gesture may be detected by detecting a contact to a touch screen interface (115) at a particular location followed by detecting that the contact is released.

[0028] A tap gesture may refer to a gesture performed using one or more fingers. For example, a two-fingered tap may be performed by using two fingers to concurrently touch two locations on a touch screen interface (115) and thereafter release contact with the touch screen interface (115). A two-fingered tap may be detected by concurrently detecting contact at two locations on the touch screen interface (115) followed by a release of the contact.
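
As a rough illustration of the tap detection just described, the following Python sketch counts the fingers in a multi-fingered tap from a trace of touch-down and touch-up events. The event format and the timing thresholds are assumptions made for illustration; they are not specified in the patent.

    # Minimal sketch of multi-finger tap detection. Assumed input: a list of
    # (time_s, finger_id, kind) tuples, where kind is "down" or "up".
    def count_tap_fingers(events, max_concurrency_gap=0.10, max_hold=0.25):
        """Return the number of fingers in a tap, or 0 if the trace is not a tap."""
        downs = {fid: t for (t, fid, kind) in events if kind == "down"}
        ups = {fid: t for (t, fid, kind) in events if kind == "up"}
        if not downs or set(downs) != set(ups):
            return 0  # every contact must be released for a tap
        if max(downs.values()) - min(downs.values()) > max_concurrency_gap:
            return 0  # touch-downs must be (approximately) concurrent
        if any(ups[fid] - downs[fid] > max_hold for fid in downs):
            return 0  # a long-held contact is not a tap
        return len(downs)

    # Example: a two-fingered tap (both fingers down, then both released).
    trace = [(0.00, 1, "down"), (0.03, 2, "down"), (0.12, 1, "up"), (0.14, 2, "up")]
    assert count_tap_fingers(trace) == 2
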
[0029] In an embodiment, a slide gesture may include any motion in which a user slides one or more fingers on the surface of the touch screen interface (115). Examples of a slide gesture include flick gestures, swipe gestures, or gestures involving moving a finger along any path on the touch screen interface (115). The path may be a closed shape such as a circle or square, where the start and end points are the same, or an open shape such as a right angle, where the start and end points are different. Examples of paths include, but are not limited to, a straight line, a curved line, a circle, a square, a triangle, an angle, etc.
[0030] In an embodiment, a flick gesture may be performed by touching a particular location on the touch screen interface (115) of the input device (110) with a finger (or any other item, e.g., a stylus), and sliding the finger away from the particular location while maintaining contact with the touch screen interface (115) for a portion of the sliding action performed by the user and continuing the sliding action even after contact with the touch screen interface (115) has ended. In an embodiment, the touch screen interface (115) may be configured to detect the proximity of the finger after physical contact with the touch screen interface (115) has ended.

[0031] For example, the user may release contact with the touch screen interface (115) while still moving the finger in the direction of the sliding action even though additional surface area of the touch screen interface (115), in the direction of the sliding action, may be available to continue the sliding action while maintaining contact.

[0032] In another example, a flick gesture may involve a user touching a particular location on the touch screen interface (115) of input device (110) and then sliding the finger, while maintaining contact with the touch screen interface (115), beyond the edge of the touch screen interface (115). Accordingly, the user may maintain contact with the touch screen interface (115) (e.g., with a finger) until the finger reaches the edge of the touch screen interface (115) and continue a motion in the same direction past the edge of the touch screen interface (115).

[0033] A user performing a flick gesture may continue the sliding action after releasing contact with the touch screen interface (115). Input device (110) may detect that contact between a finger and the touch screen interface (115) was released as the finger was still moving based on a duration of contact with the touch screen interface at the last contact point. The detected release while the finger is moving may be determined to be a flick gesture.
[0034] In an embodiment, a swipe gesture may be performed by touching a particular location on the touch screen interface (115) of the input device (110) with a finger and sliding the finger away from the particular location while maintaining contact with the touch screen interface (115) during the sliding action.

[0035] In another example, a user may slide a finger along the touch screen interface (115) from a first location to a second location and thereafter stop by maintaining contact with the second location for a threshold period of time (e.g., one second). The detected continued contact with the second location may be used to determine that the user has completed a swipe gesture.
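
Following the distinction drawn in the preceding paragraphs (a flick ends with a release while the finger is still moving, while a swipe ends with the finger at rest), a minimal Python sketch of the classification might look as follows. The sample format and the speed threshold are illustrative assumptions only.

    import math

    def classify_slide(samples, min_release_speed=100.0):
        """samples: chronological (time_s, x, y) contact points; the last
        sample is the final contact before release."""
        (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
        dt = max(t1 - t0, 1e-6)
        release_speed = math.hypot(x1 - x0, y1 - y0) / dt  # px/s before release
        # A release while the finger is still moving is a flick; a finger
        # that has come to rest before the release is a swipe.
        return "flick" if release_speed >= min_release_speed else "swipe"

    flick = [(0.00, 10, 50), (0.05, 60, 50), (0.10, 120, 50)]
    swipe = [(0.00, 10, 50), (0.05, 60, 50), (1.20, 61, 50)]
    assert classify_slide(flick) == "flick" and classify_slide(swipe) == "swipe"
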
[0036] In an embodiment, a sliding action (e.g., a swipe or a flick) may be detected before the sliding action is completed. For example, a right-direction sliding gesture may be detected by detecting contact at a first location followed by contact at a second location that is to the right of the first location (or within a particular degree in the right direction). The user may continue the sliding gesture to a third location that is right of the second location; however, the direction of the sliding gesture may already be detected using the first location and the second location.

[0037] In an embodiment, a flick gesture and a slide gesture (e.g., in the same direction) may be mapped to the same video playback command. Accordingly, a device may be configured to detect either of the slide gesture or the flick gesture and identify the same video playback command in response to the detected flick gesture or slide gesture.

[0038] In an embodiment, a flick gesture and a slide gesture (possibly in the same direction) may be mapped to different commands. For example, a flick gesture to the left may correspond to a twenty second rewind command and a swipe gesture to the left may correspond to a command for selecting the previous bookmarked scene in a video. A scene may be bookmarked, for example, by a user or hard coded into a media recording, such as selectable scenes from a movie recorded on a Digital Video Disc (DVD).
[0039] In an embodiment, a slide gesture may be performed with multiple input instruments being used concurrently. For example, a user may slide two fingers across a touch screen interface at the same time. Further, the user may concurrently slide the two fingers in parallel (e.g., sliding two fingers in the same direction from left to right).

[0040] The term concurrently, as referred to herein, includes approximately concurrent. For example, two fingers concurrently performing a parallel gesture may refer to two fingers of different lengths performing the same gesture at slightly different times. For example, one finger may lag in time behind another finger for starting and/or finishing the gesture. Accordingly, the two fingers may start and finish the gesture at different start and/or finish times.

[0041] The term parallel, as referred to herein, includes paths that are in approximately the same direction. Two fingers performing a parallel motion, as referred to herein, include a user dragging two fingers across a touch screen interface in the same direction. Due to a difference in the length of the fingers or due to an angle of the hand, two or more fingers performing a parallel motion in the same general direction may differ in direction by a few degrees. In an embodiment, the paths along which two parallel gestures are performed may overlap. The term parallel, as referred to herein, may refer to any set of two or more gestures that are performed in the same general direction.
4.0 GESTURE AREA(S)
[0042] In an embodiment, the touch screen interface (115) includes a gesture area. A gesture area is at least a portion of the touch screen interface (115) that is configured to detect a gesture performed by a user. The gesture area may include the entire touch screen interface (115) or a portion of the touch screen interface (115). The gesture area may display a blank box or one or more items. For example, the gesture area may display a video. In another example, the gesture area may display information on how to perform gestures.

[0043] In an embodiment, a gesture may be detected within a gesture area without a user's interaction with any visual objects that may be displayed in the gesture area. For example, a swipe gesture across a cellular phone's touch screen interface (115) may be detected in a gesture area that is an empty box on the touch screen interface. In another example, a progress indicator displayed in the gesture area is not touched by a detected swipe gesture associated with a rewind command.

[0044] In an embodiment, any visual objects displayed within the gesture area are not necessary for detecting a gesture or determining a command related to the gesture. In an embodiment, any visual objects displayed within the gesture area are not selected or dragged by a finger performing the gesture.
[0045] In an embodiment, the touch screen interface (115) may include multiple gesture areas. A gesture detected within one gesture area may be mapped to a different command than the same gesture performed in a different gesture area. A device may be configured to identify an area in which a gesture is performed and determine an action based on the gesture and the gesture area in which the action was performed.

[0046] In an embodiment, one gesture area of multiple gesture areas may be selected by a device when a gesture is detected across multiple gesture areas. The gesture area in which the gesture was initiated may be identified as the selected gesture area. For example, a user may begin a swipe gesture in a first gesture area and end the swipe gesture in a second gesture area. In response to detecting that the swipe gesture was initiated in the first gesture area, the command mapped to the gesture and the first gesture area may be selected. In another example, a gesture area in which the end of a sliding action is detected may be identified as the intended gesture area. The selected or intended gesture area may then be used to identify a command.
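
As a concrete illustration of resolving a gesture that crosses gesture areas, the following Python sketch selects the area in which the gesture was initiated. The rectangular area layout, the area names, and the point format are assumptions made for illustration.

    # Assumed layout: two rectangular gesture areas, keyed by name.
    AREAS = {
        "navigation": (0, 0, 200, 200),    # (x_min, y_min, x_max, y_max)
        "playback": (0, 200, 200, 400),
    }

    def area_at(point):
        """Return the name of the gesture area containing point, if any."""
        x, y = point
        for name, (x0, y0, x1, y1) in AREAS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return name
        return None

    def resolve_gesture_area(path):
        """path: chronological (x, y) contact points of one gesture. The
        area where the gesture began is treated as the selected area."""
        return area_at(path[0])

    # A swipe starting in "playback" and ending in "navigation" resolves to
    # "playback", so the playback-area command mapping would be used.
    assert resolve_gesture_area([(50, 250), (50, 150)]) == "playback"
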
5.0 COMMANDS
[0047] In an embodiment, a gesture may be mapped to (or associated with) a command. For example, a command mapped to a gesture may be a video playback command related to the playback of a video. The command may be related to playback of a video on the device on which the command was received or on a different device.

[0048] In an embodiment, a command may specify a video playing speed and direction. For example, the command may select rewinding at a particular rewinding speed or fast-forwarding at a particular fast-forwarding speed. Examples of other video playback commands include, but are not limited to, pausing the playing of the video, resuming the playing of the video, replaying a played portion of the video, stopping playing of the video, stopping playing of the video and resuming playing of the video at a particular playing position, playing the video in slow motion, frame-stepping through a video, playing the video from the beginning, playing one or more videos from a next playlist, playing the video from a particular scene forward, bookmarking a playing position in the video, stopping playing and resuming playing at a bookmarked position, or rating the video.

[0049] In an embodiment, a command may select a particular option out of a list of options. For example, a list of available media content may be displayed on a screen and the command may select particular media content of the available media content. In another example, a list of configuration settings may be displayed and the command may select a particular setting for modification.
6.0 DETECTING A GESTURE WITHIN A GESTURE AREA
[0050] Figure 2 illustrates a flow diagram for detecting a gesture within a gesture area. One or more of the steps described below may be omitted, repeated, and/or performed in a different order. Accordingly, the specific arrangement of steps shown in Figure 2 should not be construed as limiting the scope of the invention.

[0051] In one or more embodiments, detecting a gesture may include detecting interface contact at an initial location that is a part of the detected gesture (Step 202). The initial contact on the touch screen interface may be made with a user finger, a stylus, or any other item which may be used to perform a gesture on a touch screen interface. The initial contact with the touch screen interface may involve a quick touch at the initial location (e.g., a tap gesture) or a touch that is maintained at the initial location for any period of time (e.g., a millisecond, a second, two seconds, etc.). The initial contact with the touch screen interface may be brief, as when made by a finger already moving in a direction. For example, a finger may move in the air without making contact and thereafter, while still moving, make the initial contact with a portion of the touch screen interface.
[0052] In an embodiment, the initial contact as referred to herein may include a finger (or other item) being close enough to a touch screen interface that the touch screen interface detects the finger. For example, when using a device including a capacitive system with a layer that stores electrical charge, a part of the electrical charge may be transferred to a user where the user touches the touch screen interface or where a user simply hovers close to the touch screen interface without touching. Accordingly, initial contact or maintained contact as referred to herein may include a user hovering a finger or other item over a touch screen interface.

[0053] In an embodiment, the initial contact on the touch screen interface does not select any visual object displayed on the touch screen interface. The initial contact may be made when no visual object is displayed. The initial contact may be made on top of a display of a visual object without selecting the visual object. For example, the initial contact may be made on a touch screen interface that is displaying a user-selected background image for the cellular phone. In another example, the initial contact may be made on a blank screen. The initial contact may be detected on top of a television show being played on a tablet.
[0054] In one or more embodiments, detecting a gesture may further include detecting interface contact at additional locations on the touch screen interface (Step 204). For example, detecting a flick gesture or a swipe gesture may include detecting interface contact at additional locations in a chronological sequence along a path from the initial contact location. For example, interface contact may be detected continuously in a left-direction path away from an initial contact location on the touch screen interface.

[0055] The contact along a path away from the location of the initial contact point may be referred to herein as a sliding gesture. In one or more embodiments, a speed of the sliding gesture or a direction of the sliding gesture may be determined. For example, contact at two or more locations on the interface, such as the initial contact point and a second point along the path of the sliding gesture, may be used to determine a direction and/or a speed of the sliding gesture. Contact at multiple points may be used to calculate an acceleration of a sliding gesture.
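
The direction, speed, and acceleration estimates described in the preceding paragraph can be derived from timestamped contact samples. Below is a minimal Python sketch under the assumption that samples arrive as (time_s, x, y) tuples; the sample format and the finite-difference approach are illustrative, not prescribed by the patent.

    import math

    def velocity(p0, p1):
        """Finite-difference velocity between two (time_s, x, y) samples."""
        (t0, x0, y0), (t1, x1, y1) = p0, p1
        dt = max(t1 - t0, 1e-6)
        return (x1 - x0) / dt, (y1 - y0) / dt

    def slide_kinematics(samples):
        """Estimate direction, speed, and acceleration of a sliding gesture."""
        vx0, vy0 = velocity(samples[0], samples[1])    # speed near the start
        vx1, vy1 = velocity(samples[-2], samples[-1])  # speed near the end
        duration = max(samples[-1][0] - samples[0][0], 1e-6)
        return {
            "direction_deg": math.degrees(math.atan2(vy1, vx1)),
            "speed": math.hypot(vx1, vy1),
            "acceleration": (math.hypot(vx1, vy1) - math.hypot(vx0, vy0)) / duration,
        }

    # A rightward slide that speeds up from 400 px/s to 600 px/s.
    print(slide_kinematics([(0.0, 0, 0), (0.1, 40, 0), (0.2, 100, 0)]))
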
[0056] In one or more embodiments, a gesture may be identified based on contact detected at one or more locations on the touch screen interface (Step 206). For example, detecting concurrent contact at three locations on a remote control interface followed by a release of contact at all three locations may be identified as a three finger tap gesture. In an embodiment, detecting a gesture may include identifying a path along which contact was detected on the touch screen interface. For example, a circle gesture may be identified in response to detecting contact along a circular path on a touch screen interface. A flick gesture or a swipe gesture may be identified based on contact points in a chronological sequence on a touch screen interface.

[0057] In one or more embodiments, identifying a gesture may include determining a number of concurrent parallel gestures (Step 208). For example, initial contact may be detected concurrently at multiple locations on a touch screen interface. Subsequent to the initial contact at each initial location, contact along paths beginning from the initial locations may be detected. If the paths are determined to be parallel, the number of paths may be identified to determine the number of concurrent parallel gestures.

[0058] In an embodiment, a number of concurrent parallel gestures may be determined based on the number of paths that match a known configuration. For example, if a path has at least a first contact point and a subsequent second contact point to the right within ten degrees from a horizontal line from the first contact point, the path may be determined to correspond to a sliding gesture to the right. The number of detected gestures that correspond to paths that match the same criteria within a particular time period may be counted to determine the number of concurrent parallel gestures. In an embodiment, other methods not described herein may be used for determining the number of concurrent parallel gestures.
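
A minimal Python sketch of this counting step, using the ten-degree example above, might look as follows. The path representation and the angular tolerance are illustrative assumptions.

    import math

    def is_right_slide(path, tolerance_deg=10.0):
        """path: chronological (x, y) points. True if the path moves right
        within the given angular tolerance of horizontal."""
        (x0, y0), (x1, y1) = path[0], path[-1]
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
        return abs(angle) <= tolerance_deg

    def count_parallel_right_slides(paths):
        """paths: one point list per finger, detected in the same time
        window. Counts the paths matching the same known configuration."""
        return sum(1 for path in paths if is_right_slide(path))

    # Two fingers sliding right in approximate parallel, plus one stray
    # vertical path that does not match the configuration.
    paths = [[(0, 100), (80, 105)], [(0, 140), (80, 138)], [(0, 0), (2, 90)]]
    assert count_parallel_right_slides(paths) == 2
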
[0059] In an embodiment, a command is determined based on an identified gesture (Step 210). The command may be determined while the gesture is still being performed or after the gesture is completed.

[0060] In an embodiment, determining a command may include determining that a particular detected gesture is mapped to a command in a database. For example, a two fingered swipe to the right may be queried in a command database to identify a command associated with the two fingered swipe. In another example, a two fingered flick toward the bottom of the gesture area may be associated with a command for selecting the second menu item out of items currently displayed in a menu.

[0061] In an embodiment, the number of parallel fingers in a command may be used to determine a playback speed for the playing of multi-media content. For example, detection of two parallel gestures may be mapped to a command for playback speed which is two times a normal playback speed.

[0062] In an embodiment, the direction of a gesture command may be combined with the number of parallel fingers in the gesture command to determine the playback command. For example, two fingers swiped concurrently from the right side of the screen to the left side of the screen may be mapped to rewind at two times a normal speed. In another example, two fingers swiped concurrently from the left side of the screen to the right side of the screen may be mapped to fast-forward at a speed that is twice the normal playback speed (i.e., the playback speed without fast-forwarding).
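
The combination just described (direction plus finger count) lends itself to a simple lookup table. Below is an illustrative Python sketch; the table contents and command names are assumptions drawn from the examples above, not a definitive mapping.

    # Illustrative mapping from (direction, number of parallel fingers) to a
    # (command, speed multiplier) pair, following the examples above.
    COMMANDS = {
        ("left", 1): ("rewind", 1),
        ("left", 2): ("rewind", 2),         # two fingers left: rewind at 2x
        ("left", 3): ("rewind", 3),
        ("right", 1): ("fast_forward", 1),
        ("right", 2): ("fast_forward", 2),  # two fingers right: fast-forward at 2x
        ("right", 3): ("fast_forward", 3),
    }

    def playback_command(direction, finger_count):
        command = COMMANDS.get((direction, finger_count))
        if command is None:
            raise KeyError(f"no command for {finger_count} finger(s) {direction}")
        return command

    assert playback_command("left", 2) == ("rewind", 2)
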
[0063] In an embodiment, a command may include resuming playing of a video at particular bookmarks (e.g., user defined bookmarks or manufacturer defined bookmarks). A number of fingers used to perform a concurrent parallel gesture may be used to select the bookmark. For example, in response to detecting a two-fingered flick downward, the playing of a video may be resumed at the second bookmark from a current playing position.

[0064] In an embodiment, determining a command may include identifying the device corresponding to the command. For example, a device related to the command may be identified based on the gesture and/or the gesture area in which the gesture was detected.

[0065] In an embodiment, an action corresponding to the command is performed (Step 212). The action may be performed by a device that detects the command. For example, if a gesture for a fast-forward command is detected on a hand-held touch screen phone that is playing a video, the hand-held touch screen phone plays the video in fast-forward mode.

[0066] In an embodiment, an action corresponding to the command may include transmitting information related to the command to another device. For example, a gesture may be detected on a touch screen remote control. Information related to the gesture (e.g., information identifying the gesture or information identifying a command associated with the gesture) may then be transmitted to a digital video disc player. The digital video disc player may then perform a corresponding action. If the command was for pausing the playing of a video, the digital video disc player may pause the playing of the video on a display screen.
7.0 EXAMPLE GESTURES AND COMMANDS
[0067] Figure 3 illustrates an example screen shot for an input device configured to detect gestures. The gestures, commands, mapping between gestures and commands, gesture areas, visual objects, and any other items discussed in relation to Figure 3 are examples and should not be construed as limiting in scope. One or more of the items described in relation to Figure 3 may not necessarily be implemented, and other items not described may be implemented in accordance with one or more embodiments.

[0068] Figure 3 illustrates an example interface (300) with a circular gesture area (305) and a square gesture area (310). Any gestures detected in circular gesture area (305) are mapped to navigation commands. For example, a two fingered tap detected in circular gesture area (305) may be associated with a command selecting a second item on any currently displayed menu. If the second item is a folder, the items within the folder may be displayed in response to detecting the two fingered tap.

[0069] In an embodiment, square gesture area (310) may identify commands that are associated with one or more gestures detected within the square gesture area (310). For example, the square gesture area (310) may include graphics illustrating that a single finger swipe gesture to the left corresponds to a rewind command, a single finger tap gesture corresponds to a pause command, a single finger swipe gesture to the right corresponds to a fast-forward command, a two fingered swipe gesture to the left corresponds to a ten second rewind, a two fingered tap gesture corresponds to a slow motion playback command, and a two fingered swipe to the right corresponds to a skip-to-next-bookmark command.
[0070] In an embodiment, the example interface (300) may include a progress indicator (315) which is separate from the circular gesture area (305) and the square gesture area (310). The progress indicator (315) may include a current playing position of the video, bookmarks, a current playback speed, etc. For example, the progress indicator (315) may include a symbol representing a current playback speed (e.g., play, fast forward at 1x, pause, rewind at 2x, etc.).

[0071] In an embodiment, the symbol may be displayed in response to a command. For example, in response to a rewind at 3x command, a symbol indicating 3x rewind may be displayed while rewinding multimedia content at 3x is performed by displaying frames in reverse at three times the normal playback speed. However, the progress indicator (315) may not necessarily be selected by any gesture associated with a video playback command. In an embodiment, no visual objects within example interface (300) are necessarily selected when a user is performing a gesture within the example interface (300).

[0072] In an embodiment, the example interface (300) may also include a tool (e.g., a drop down box) to select a particular media device to be controlled by detected gestures. In an embodiment, the example interface (300) may include an option to switch between input mechanisms (e.g., gesture based input, buttons, text box, radio boxes, etc.).
8.0 REMOTE CONTROL USE EXAMPLES
[0073] In an embodiment, a remote control device communicates with a media device (e.g., a digital video recorder, a digital video disc player, a media management device, a video recorder, a Blu-ray player, etc.). The remote control device may communicate with the media device over wired and/or wireless communication segments. For example, the remote control device may communicate over a network (e.g., internet, intranet, etc.), via radio communication, over Bluetooth, via infrared, etc.
[0074] In an embodiment, a remote control displays a progress indicator (315) as shown in the screen shot (300) of Figure 3. The progress indicator (315) may indicate a playing position of multimedia content being displayed on a separate multimedia device. The progress indicator (315) may display an exact playing position or an approximate playing position. For example, the progress indicator (315) may include a slider (320) displayed along a trickplay bar (330) to indicate the playing position. In an embodiment, a particular playing position may be indicated by a time (e.g., 8:09). The time may indicate, for example, the actual streaming time of the currently played content or may indicate an offset from the starting point of the content.

[0075] In an embodiment, information related to the playing position of the multimedia content may be obtained from a media device (e.g., a digital video recorder, a cable box, a computer, a media management device, a digital video disc player, multimedia player, audio player, etc.). For example, a remote control device communicatively coupled with a media device may be configured to receive frame information related to the particular frame being displayed (played) by the media device. In an embodiment, the media device may periodically send the remote control device the frame information. Alternatively, the remote control device may periodically request the frame information from the media device. The remote device uses the information to position the slider (320) along the trickplay bar (330). The remote control device can also receive information from the media device indicating the extent of the cache bar (325) which indicates the amount of multimedia content stored or recorded by the media device. If the media device is in the process of recording or caching a multimedia content, the cache bar (325) will increase in size as the media device records or caches more content. If the media device is playing a recorded multimedia content, then the cache bar (325) extends the length of the trickplay bar (330).

[0076] Another example may involve the remote control device being configured to receive a time stamp closest to the frame being displayed. The remote control device may also be configured to use a step function, e.g., next frame or previous frame from the time stamp if no frame is an exact match to the time stamp. Another example may include the remote control device continuously receiving images (e.g., bitmap, display instructions, etc.) from the media device of the progress indicator to display on the remote control device. In an embodiment, the remote control device may include a particular starting position and a display rate for use by the remote control device to determine the playing position of the multimedia content. For example, a digital video recorder may transmit an initial playing position in the playing of the multimedia content to the remote control device with a rate of progress (e.g., change of the slider (320) per unit of time, frame rate, etc.). The remote control device may use the information to first display a progress indicator based on the initial playing position and may then compute the subsequent positions as a function of time.
[0077] In an embodiment, the slider (320) becomes out of sync with a displayed video when a trickplay function is performed (e.g., when a ten second rewind is performed). In response to a trickplay function, updated information regarding a new playing position may be provided to the remote control device.

[0078] In an embodiment, the remote control device may further receive updates selecting specific playing positions or indicating changes in the rate of progress. For example, a user may submit one or more commands to pause the playing of multimedia content at a current playing position, then skip back 10 seconds before the current playing position and then resume playing. In this case, a media device may provide information to the remote control device to pause the slider (320), display a new playing position corresponding to 10 seconds before the current playing position by moving the slider (320), and then resume periodically updating the slider (320).
[0079] In an embodiment, the slider (320) may be updated when the remote control device is activated. For example, when a user picks up the remote control device or touches the remote control device, the remote control device may request playing position information from a media device. For example, the remote control device may include an accelerometer configured to detect motion and/or a touch screen interface configured to detect touch. In response, the media device may provide playing position information to the remote control device. The remote control device may then display the slider (320) indicating a current playing position of multimedia content based on the playing position information received from the media device.

[0080] In an embodiment, information related to the playing position of the multimedia content may be continuously received by the remote control device for the remote control device to constantly update the slider (320). In another embodiment, the information related to the playing position of the multimedia content may be periodically received and the remote control device may update the slider each time the information is received.

[0081] In an embodiment, the remote control device may transmit the multimedia content to the multimedia device for display by the multimedia device. For example, the remote control device may obtain a video stream over the internet and send the video stream to a multimedia device for display on the multimedia device. In this example, the remote control device may determine the display position of the slider (320) based on playing position information determined by the remote control device itself. For example, the remote control device may compute the playing position information based on a frame being sent to the multimedia device from the remote control device.
9.0 EXAMPLE EMBODIMENTS
[0082] In an embodiment, a method comprises detecting a slide gesture, in a particular area on a touch screen interface of a device, from a first location in the particular area to a second location in the particular area; identifying a video playback command based at least on the slide gesture; performing an action associated with the video playback command; wherein the method is performed by at least one device.

[0083] In an embodiment, the sliding gesture is detected without detecting selection of any video progress indicator displayed within the particular area. The slide gesture may be detected in the particular area while displaying at least a portion of the video in the particular area. The slide gesture may be detected in the particular area while displaying information on how to perform one or more gestures in the particular area.

[0084] In an embodiment, identifying the video playback command is further based on the particular area, in which the slide gesture was detected, from a plurality of areas on the touch screen interface.

[0085] In an embodiment, performing the action comprises a first device sending information to a second device, the information based on the video playback command. Performing the action associated with the video may comprise performing the action on a same device as the device detecting the slide gesture. The video playback command may select a playing speed and direction.

[0086] In an embodiment, the slide gesture comprises a swipe gesture from the first location to a second location. The slide gesture may comprise a flick gesture starting at the first location.

[0087] In an embodiment, the video playback command is for one or more of: pausing the playing of the video; resuming the playing of the video; replaying a played portion of the video; stopping playing of the video; stopping playing of the video and resuming playing of the video at a particular playing position; playing the video in slow motion; playing the video from the beginning; playing one or more videos from a next playlist; playing the video from a particular scene forward; bookmarking a playing position in the video; stopping playing and resuming playing at a bookmarked position; or rating the video.
[0088] In an embodiment, a method comprises concurrently detecting a plurality of parallel gestures on a touch screen interface of a device; determining a number of the plurality of parallel gestures; selecting a command from a plurality of commands based on the number of the plurality of parallel gestures; performing an action associated with the command.

[0089] In an embodiment, selecting the command comprises selecting a menu option based on the number of the plurality of parallel gestures. The plurality of parallel gestures may comprise a plurality of parallel sliding gestures performed in a same direction.

[0090] In an embodiment, determining the number of the plurality of parallel gestures comprises determining a number of tap gestures concurrently performed on the touch screen interface.

[0091] Although specific components are recited herein as performing the method steps, in other embodiments agents or mechanisms acting on behalf of the specified components may perform the method steps. Further, although some aspects of the invention are discussed with respect to components on a system, the invention may be implemented with components distributed over multiple systems. Embodiments of the invention also include any system that includes the means for performing the method steps described herein. Embodiments of the invention also include a computer readable medium with instructions, which when executed, cause the method steps described herein to be performed.
10.0 HARDWARE OVERVIEW
[0092] According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.
[0093] For example, FIG. 4 is a block diagram that illustrates a computer
system 400
upon which an embodiment of the invention may be implemented. Computer system
400
includes a bus 402 or other communication mechanism for communicating
information, and a
hardware processor 404 coupled with bus 402 for processing information.
Hardware
processor 404 may be, for example, a general purpose microprocessor.
[0094] Computer system 400 also includes a main memory 406, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 402 for storing information and instructions to be executed by processor 404. Main memory 406 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 404. Such instructions, when stored in non-transitory storage media accessible to processor 404, render computer system 400 into a special-purpose machine that is customized to perform the operations specified in the instructions.
[0095] Computer system 400 further includes a read only memory (ROM) 408 or other static storage device coupled to bus 402 for storing static information and instructions for processor 404. A storage device 410, such as a magnetic disk or optical disk, is provided and coupled to bus 402 for storing information and instructions.
[0096] Computer system 400 may be coupled via bus 402 to a display 412, such as a cathode ray tube (CRT), for displaying information to a computer user. An input device 414, including alphanumeric and other keys, is coupled to bus 402 for communicating information and command selections to processor 404. Another type of user input device is cursor control 416, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 404 and for controlling cursor movement on display 412. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
[0097] Computer system 400 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 400 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 400 in response to processor 404 executing one or more sequences of one or more instructions contained in main memory 406. Such instructions may be read into main memory 406 from another storage medium, such as storage device 410. Execution of the sequences of instructions contained in main memory 406 causes processor 404 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
[0098] The term "storage media" as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 410. Volatile media includes dynamic memory, such as main memory 406. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
[0099] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 402. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[00100] Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 404 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 400 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 402. Bus 402 carries the data to main memory 406, from which processor 404 retrieves and executes the instructions. The instructions received by main memory 406 may optionally be stored on storage device 410 either before or after execution by processor 404.
[00101] Computer system 400 also includes a communication interface 418 coupled to bus 402. Communication interface 418 provides a two-way data communication coupling to a network link 420 that is connected to a local network 422. For example, communication interface 418 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 418 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 418 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
[00102] Network link 420 typically provides data communication through one or more networks to other data devices. For example, network link 420 may provide a connection through local network 422 to a host computer 424 or to data equipment operated by an Internet Service Provider (ISP) 426. ISP 426 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the "Internet" 428. Local network 422 and Internet 428 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 420 and through communication interface 418, which carry the digital data to and from computer system 400, are example forms of transmission media.
[00103] Computer system 400 can send messages and receive data, including program code, through the network(s), network link 420 and communication interface 418. In the Internet example, a server 430 might transmit a requested code for an application program through Internet 428, ISP 426, local network 422 and communication interface 418.
[00104] The received code may be executed by processor 404 as it is received, and/or stored in storage device 410, or other non-volatile storage for later execution.
[00105] In an embodiment, an apparatus is a combination of one or more hardware and/or software components described herein. In an embodiment, a subsystem for performing a step is a combination of one or more hardware and/or software components that may be configured to perform the step.
11.0 EXTENSIONS AND ALTERNATIVES
[00106] In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is the invention, and is intended by the applicants to be the invention, is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in such claims shall govern the meaning of such terms as used in the claims. Hence, no limitation, element, property, feature, advantage or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01:As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refer to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Inactive: Associate patent agent added 2022-02-22
Inactive: IPC expired 2022-01-01
Revocation of Agent Requirements Determined Compliant 2021-12-31
Appointment of Agent Requirements Determined Compliant 2021-12-31
Appointment of Agent Requirements Determined Compliant 2021-12-30
Revocation of Agent Requirements Determined Compliant 2021-12-30
Time Limit for Reversal Expired 2016-01-05
Application Not Reinstated by Deadline 2016-01-05
Deemed Abandoned - Failure to Respond to Maintenance Fee Notice 2015-01-05
Inactive: Cover page published 2013-09-26
Amendment Received - Voluntary Amendment 2013-09-20
Letter Sent 2013-09-18
Inactive: Correspondence - Transfer 2013-08-29
Inactive: IPC assigned 2013-08-20
Inactive: IPC assigned 2013-08-20
Inactive: IPC assigned 2013-08-20
Application Received - PCT 2013-08-19
Letter Sent 2013-08-19
Letter Sent 2013-08-19
Inactive: Office letter 2013-08-19
Inactive: Acknowledgment of national entry - RFE 2013-08-19
Inactive: IPC removed 2013-08-19
Inactive: First IPC assigned 2013-08-19
Inactive: IPC assigned 2013-08-19
Inactive: IPC assigned 2013-08-19
Inactive: First IPC assigned 2013-08-19
All Requirements for Examination Determined Compliant 2013-06-27
National Entry Requirements Determined Compliant 2013-06-27
Request for Examination Requirements Determined Compliant 2013-06-27
Amendment Received - Voluntary Amendment 2013-06-27
Application Published (Open to Public Inspection) 2012-07-12

Abandonment History

Abandonment Date Reason Reinstatement Date
2015-01-05

Maintenance Fee

The last payment was received on 2013-12-19

Note: If the full payment has not been received on or before the date indicated, a further fee may be required, which may be one of the following:

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Fee History

Fee Type Anniversary Year Due Date Paid Date
Request for examination - standard 2013-06-27
Registration of a document 2013-06-27
Basic national fee - standard 2013-06-27
MF (application, 2nd anniv.) - standard 02 2014-01-06 2013-12-19
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
TIVO INC.
Past Owners on Record
ROBIN HAYES
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.

If you have any difficulty accessing content, you can call the Client Service Centre at 1-866-997-1936 or send them an e-mail at CIPO Client Service Centre.


Document Description Date (yyyy-mm-dd) Number of pages Size of Image (KB)
Description 2013-06-27 21 1,178
Drawings 2013-06-27 4 328
Representative drawing 2013-06-27 1 5
Claims 2013-06-27 3 116
Abstract 2013-06-27 1 53
Claims 2013-06-28 5 153
Cover Page 2013-09-26 1 31
Acknowledgement of Request for Examination 2013-08-19 1 176
Notice of National Entry 2013-08-19 1 202
Courtesy - Certificate of registration (related document(s)) 2013-08-19 1 103
Reminder of maintenance fee due 2013-09-09 1 112
Courtesy - Certificate of registration (related document(s)) 2013-09-18 1 102
Courtesy - Abandonment Letter (Maintenance Fee) 2015-03-02 1 173
PCT 2013-06-27 23 1,155
Correspondence 2013-08-19 1 19