Patent 3079444 Summary

(12) Patent: (11) CA 3079444
(54) English Title: SYSTEMS AND METHODS FOR PROCESSING IMAGE DATA TO COINCIDE IN A POINT OF TIME WITH AUDIO DATA
(54) French Title: SYSTEMES ET METHODES DE TRAITEMENT DE DONNEES D'IMAGE POUR COINCIDER AVEC LES DONNEES SONORES A UN POINT DANS LE TEMPS
Status: Granted
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04N 21/8547 (2011.01)
  • H04N 21/4725 (2011.01)
  • H04N 21/81 (2011.01)
  • H04N 21/84 (2011.01)
  • H04N 21/8545 (2011.01)
  • G06F 16/40 (2019.01)
  • G11B 27/10 (2006.01)
  • H04L 12/16 (2006.01)
(72) Inventors :
  • MONTEIRO SIQUEIRA FRANCESCHI, WILTER (Canada)
  • DA SILVA FIGUEIREDO, VANESSA (Canada)
(73) Owners :
  • MONTEIRO SIQUEIRA FRANCESCHI, WILTER (Canada)
  • DA SILVA FIGUEIREDO, VANESSA (Canada)
(71) Applicants :
  • MONTEIRO SIQUEIRA FRANCESCHI, WILTER (Canada)
  • DA SILVA FIGUEIREDO, VANESSA (Canada)
(74) Agent:
(74) Associate agent:
(45) Issued: 2023-06-27
(22) Filed Date: 2020-04-24
(41) Open to Public Inspection: 2021-10-24
Examination requested: 2020-04-24
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract


Systems and methods for processing image data to coincide in a point of time with audio data to create a master timeline coordinating audio timeline playback and image data display to generate a synchronized multimedia presentation using content creator and content player modules. The system comprises a network architecture, a synchronization system and a file system database. The method comprises: accessing a content creator module of a synchronization system; manipulating one or more audio data to assemble an audio timeline; manipulating one or more image data to display transformations made as a result of editing; creating one or more timestamps on the audio timeline; assigning each image data to a timestamp corresponding to a time value of a play duration of the audio timeline; generating a master timeline; generating a digital control file; and reproducing the synchronized multimedia presentation in a content player module.


French Abstract

Il est décrit des systèmes et des procédés pour traiter des données d'image afin qu'elles coïncident à un instant donné avec des données audio pour créer une ligne de temps maître coordonnant la lecture de ligne de temps audio et l'affichage de données d'image pour générer une présentation multimédia synchronisée en utilisant des modules de créateur de contenu et de lecteur de contenu. Le système comprend une architecture de réseau, un système de synchronisation et une base de données de système de fichier. Le procédé consiste à : accéder à un module de créateur de contenu d'un système de synchronisation; manipuler au moins une donnée audio pour assembler une ligne de temps audio; manipuler au moins une donnée d'image pour afficher les transformations effectuées suite au montage; créer au moins une estampille temporelle sur la ligne de temps audio; affecter chaque donnée d'image à une estampille temporelle correspondant à une valeur de temps d'une durée de lecture de la ligne de temps audio; générer une ligne de temps maître; générer un fichier de commande numérique; et reproduire la présentation multimédia synchronisée dans un module de lecteur de contenu.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
The embodiments of the invention in which an exclusive property or privilege
is claimed are
defined as follows:
1. A computer implemented method for processing image data to coincide in a
point of time with
audio data generating a synchronized multimedia presentation, the method
comprising:
accessing a synchronization system content creator module;
receiving one or more audio data;
receiving one or more image data;
generating one or more timestamps to coincide with one or more specific time
values of
audio data;
enabling a time relationship consisting of a plurality of commands requesting
the display
of each image data at specific one or more play durations corresponding to the
one or
more timestamps coinciding with one or more specific time values of the audio
data;
generating a master timeline comprising the plurality of commands guiding the
display of
each image data at specific one or more play durations corresponding to
timestamps
coinciding with one or more specific time values of audio data by the means of
the time
relationship;
generating a digital control file comprising the plurality of commands guiding
the master
timeline wherein the said plurality of commands activate an uncoupled
synchronization
between audio data and image data by the means of maintaining the
characteristic file
integrity of the said audio data and image data; and
generating a synchronized multimedia presentation by the means of executing
the
plurality of commands in the digital control file guiding the master timeline.
2. The method of Claim 1 further enabling one or more interactive graphic
element layers
overlaying the synchronized multimedia presentation wherein the said one or
more interactive
graphic element layers consists of one or more synchronous manipulations in
the format of
annotations, content selection, clickable items, zooming of image data,
drawings, or polls.
3. The method of Claim 2 further enabling the synchronous manipulations of the
one or more
interactive graphic elements overlaying the synchronized multimedia
presentation while the said
synchronized multimedia presentation is being reproduced.
4. The method of Claim 1 further generating the digital control file in a text
file standard in at
least one of the formats: HTML, XML, CSV, or JSON.
5. The method of Claim 1 further processing the digital control file
comprising the synchronized
multimedia presentation in one or more of the following formats: a computer
program, a
webpage, or a mobile application.

6. The method of Claim 1 further enabling the synchronization system content
creator module to
receive one or more image data and one or more audio data interchangeably.
7. A computer implemented synchronization system configured to process image
data to
coincide in a point of time with audio data generating a synchronized
multimedia presentation,
the system comprising:
a content creator module comprising:
an audio data section configured to receive one or more audio data;
an image data section configured to receive one or more image data;
a master timeline section configured to:
receive one or more timestamps coinciding with one or more specific time
values
of audio data;
generate a time relationship between audio data and image data;
encode the time relationship between audio data and image data to a plurality
of
commands stored in a digital control file;
a synchronization section configured to:
process the digital control file wherein the plurality of commands activate an
uncoupled synchronization between audio data and image data by means of
maintaining the characteristic file integrity of said audio data and image
data;
transcode the plurality of commands in the digital control file to a master
timeline
guiding display of each image data at specific one or more play durations
corresponding to the one or more timestamps coinciding with one or more
specific time values of the audio data by the means of the time relationship;
generate the synchronized multimedia presentation.
8. The system of Claim 7 further configured to generate at least one
interactive graphic element
overlaying the synchronized multimedia presentation wherein said at least one
interactive
graphic element consists of one or more synchronous manipulations in the
format of
annotations, content selection, clickable items, zooming of image data,
drawings, or polls.
9. The system of Claim 8 further configured to generate synchronous
manipulations of at least
one interactive graphic element overlaying the synchronized multimedia
presentation while the
said synchronized multimedia presentation is being reproduced.

10. The system of Claim 7 further configured to generate the digital control
file in a standard text
format in at least one of the formats HTML, XML, CSV, or JSON.
11. A computer implemented method for reproducing a synchronized multimedia
presentation,
the method comprising:
accessing a synchronization system content player module;
loading a digital control file comprising a plurality of commands guiding a
master timeline,
wherein the said plurality of commands activate an uncoupled synchronization
between
audio data and image data by means of maintaining the characteristic file
integrity of said
audio data and image data;
requesting the plurality of commands in the digital control file to execute
the master
timeline;
reproducing the master timeline guiding the display of each image data at
specific one or
more play durations corresponding to the one or more timestamps coinciding
with one
or more specific time value of audio data by the means of a time relationship,
maintaining
the characteristic file integrity of each of said audio data and image data;
generating the synchronized multimedia presentation while the said
synchronized
multimedia presentation is being reproduced on a playback viewer.
12. The method of Claim 11 further comprising the synchronization system
content player
module reproducing the master timeline generating the synchronized multimedia
presentation
on the playback viewer, the said playback viewer being at least one of the
formats available for
multimedia players.
13. The method of Claim 11 further enabling at least one interactive graphic
element overlaying
the synchronized multimedia presentation wherein at least one interactive
graphic element
consists of one or more synchronous manipulations in the format of
annotations, content
selection, clickable items, zooming of image data, drawings, or polls.
14. The method of Claim 13 further enabling one or more synchronous
manipulations of at least
one interactive graphic element overlaying the synchronized multimedia
presentation while the
said synchronized multimedia presentation is being reproduced on the playback
viewer.
15. A computer implemented system configured to reproduce a synchronized
multimedia
presentation, the system comprising:
a synchronization section configured to load a digital control file comprising
a plurality of
commands guiding a master timeline, wherein the said plurality of commands
activate an
uncoupled synchronization between audio data and image data by means of
maintaining
the characteristic file integrity of said audio data and image data;

a synchronization processing section configured to execute the digital control
file guiding
the master timeline on a playback viewer; and
a synchronization system content player module configured to reproduce the
master
timeline comprising the synchronized multimedia presentation.
16. The system of Claim 15 further configured to execute at least one
interactive graphic element
overlaying the synchronized multimedia presentation wherein at least one
interactive graphic
element consists of one or more synchronous manipulations in the format of
annotations,
content selection, clickable items, zooming of image data, drawings, or polls.
17. The system of Claim 16 further configured to enable synchronous
manipulations of at least
one interactive graphic element overlaying the synchronized multimedia
presentation while the
said synchronized multimedia presentation is being reproduced on the playback
viewer.
18. The system of Claim 15 further configured to load the digital control file
on at least one of the
formats: a computer application, a software application, or a mobile
application.
19. The system of Claim 15 further configured to execute the digital control
file in at least one of
the formats HTML, XML, CSV, or JSON.

Description

Note: Descriptions are shown in the official language in which they were submitted.


SYSTEMS AND METHODS FOR PROCESSING IMAGE DATA TO COINCIDE IN A POINT OF TIME
WITH AUDIO DATA
TECHNICAL FIELD
This application relates to the field of software engineering. More
particularly, the present
disclosure relates to implementing methods and systems to synchronize
unconnected image data
and audio data.
BACKGROUND
Since the advent of the Internet, a number of multimedia platforms and
software have been
developed to support the creation, consumption and distribution of digital
content. Many
individuals have created presentations, podcasts and digital tutorial sessions
using audio and
video formats. Nevertheless, individuals producing, transmitting and distributing digital audio content often encounter limitations when including multimedia elements (e.g. images, annotations and hyperlinks) to complement the content existing in digital audio data.
Existing technology supports the inclusion of presentation slides or other
kinds of graphic data to
coincide in a point of time with audio data and video data to generate
synchronized
presentations. Nevertheless, the methods and systems employed to process the
inclusion of such
data produce video data by merging audio and graphic data. As a result, the
audio and graphic
data become a single file in a video format (e.g. .mp4). If an individual needs to edit parts of that single file, the data that generated the file must be uploaded and merged again. Furthermore,
the existing technology does not support the synchronization of audio data and
image data
during live transmissions occurring on the Internet. For example, an
individual hosting a podcast
wants to include an image to illustrate what is being presented during the
podcast. Such an image
cannot be added to coincide in a point of time with the audio playback using
the existing
technology unless the individual uses a video camera to record a video.
The present disclosure concerns implementing systems and methods for
processing image data
to coincide in a point of time with audio data, maintaining the integrity of
both image and audio
data. For example, a tutor teaching online lessons wants to use an image to
support the learning
of their pupils while narrating the characteristics of such an image. The
tutor uploads the
supporting image and manipulates the image to be displayed during the online
lessons. The tutor
chooses to zoom in on the image and display the details, thus supporting the
narration describing
the characteristics of the image. The tutor generates a timestamp and assigns
this timestamp to
the image. Consequently, the image will be displayed at a point of time
coinciding with the audio
timeline playback in a synchronized format. Rather than merging the image with
the audio, the
present disclosure results in a synchronized format establishing a timed and
uncoupled
connection between the image and audio data.
In a different scenario, consider a social media influencer who is an expert on reviews and tutorials concerning makeup. The social media influencer takes pictures of her face wearing makeup and adds interactive links and descriptions overlaying those pictures by using a number of manipulations
Date Recue/Date Received 2023-03-15

existing in the present disclosure. The social media influencer's audience
will be able to save in
the form of bookmarks the content existing on the images.
SUMMARY
The present disclosure concerns implementing systems and methods for
processing image data
to coincide in a point of time with audio data to create a master timeline
coordinating audio
timeline playback and image data display to generate a synchronized multimedia
presentation
using content creator and content player modules. The system comprises a
network architecture,
a synchronization system and a file system database. The network architecture
comprises one or
more computing devices and computer networks connected to the synchronization
system. The
synchronization system comprises the content creator module, the content
player module and
the file system database. The file system database is configured to store data
pertaining to a
master timeline, a digital control file and one or more content player user
events.
In accordance with an embodiment of the invention, the method comprising:
accessing a content
creator module of a synchronization system; manipulating one or more audio
data to assemble
an audio timeline containing a sequential order of one or more audio data;
manipulating one or
more image data to display transformations made as a result of editing;
creating one or more
timestamps on the audio timeline; assigning each image data to a timestamp
corresponding to a
time value of a play duration of the audio timeline; generating a master
timeline containing
manipulated image data and the audio timeline; generating a digital control
file containing
records and information pertaining to manipulations of image data and the
audio timeline in the
master timeline; including meta-elements describing characteristics of a
synchronized
multimedia presentation; storing the master timeline in the form of a digital
control file in a file
system database; making a request to the file system database; processing the
digital control file
containing records and information of a synchronization of image data with an
audio timeline;
loading a master timeline containing a synchronized multimedia presentation;
activating the
master timeline playback; and reproducing the synchronized multimedia
presentation in the
content player module.
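By way of a non-limiting sketch, the method steps above can be modelled in a few lines of Python: a master timeline collects image-to-timestamp assignments on an audio timeline and serializes them into a JSON digital control file. The field names (`audio_timeline`, `commands`, `display`, `at`) and file names are hypothetical; the disclosure names HTML, XML, CSV, and JSON only as candidate formats and fixes no schema.

```python
import json
from dataclasses import dataclass, field

@dataclass
class ImageEntry:
    image_file: str   # reference to the stored image data (file kept intact)
    timestamp: float  # seconds into the audio timeline

@dataclass
class MasterTimeline:
    audio_files: list                       # sequential order of audio data
    entries: list = field(default_factory=list)

    def assign(self, image_file: str, timestamp: float) -> None:
        """Assign an image data to a timestamp on the audio timeline."""
        self.entries.append(ImageEntry(image_file, timestamp))

    def to_control_file(self) -> str:
        """Serialize the master timeline to a JSON digital control file.
        Only references and time values are stored; the audio and image
        files themselves are left untouched (uncoupled synchronization)."""
        return json.dumps({
            "audio_timeline": self.audio_files,
            "commands": [
                {"display": e.image_file, "at": e.timestamp}
                for e in sorted(self.entries, key=lambda e: e.timestamp)
            ],
        }, indent=2)

timeline = MasterTimeline(audio_files=["intro.mp3", "lesson.mp3"])
timeline.assign("slide1.png", 0.0)
timeline.assign("slide2.png", 42.5)
control_file = timeline.to_control_file()
```

A content player module would then read this control file back and execute its commands against the audio playback, rather than decoding a merged video file.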
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an illustration of a computer network, one or more computing devices
and a
synchronization system in accordance with an embodiment of the invention.
FIG. 2 is an illustration of a synchronization system in accordance with an
embodiment of the
invention.
FIG. 3 is an illustration of a content creator module system in accordance
with an embodiment
of the invention.
FIG. 4 is an illustration of a content player module system in accordance with
an embodiment of
the invention.
FIG. 5 is an illustration of a user interface for a content creator module
system in accordance with
an embodiment of the invention.
FIG. 6 is an illustration of a user interface for a content player module
system in accordance with
an embodiment of the invention.
FIGS. 7A-7B (collectively referred to as "FIG.7") provide a flow diagram of an
illustrative method
for processing image data to coincide in a point of time with audio data in
accordance with an
embodiment of the invention.
FIG. 8 provides a flow diagram of an illustrative method for reproducing a
master timeline
synchronizing image data with audio data in accordance with an embodiment of
the invention.
FIG. 9 is an illustration of a master timeline playback illustrating image
data processed to coincide
in a point of time with an audio timeline in accordance with an embodiment of
the invention.
DETAILED DESCRIPTION
It will be readily understood that the components of the embodiments as
generally described
herein and illustrated in the appended figures could be arranged and designed
in a wide variety
of different configurations. Thus, the following detailed description of
various embodiments, as
represented in the figures, is not intended to limit the scope of the present
disclosure but is
merely representative of various embodiments. While the various aspects of the
embodiments
are presented in drawings, the drawings are not necessarily drawn to scale
unless specifically
indicated.
The described embodiments are to be considered in all respects only as
illustrative and not
restrictive. The scope of the invention is, therefore, indicated by the
appended claims rather than
by this detailed description. All changes which come within the meaning and
range of equivalency
of the claims are to be embraced within their scope.
Reference throughout this specification to features, advantages, or similar
language does not
imply that all of the features and advantages that may be realized with the
present invention
should be or are in any single embodiment. Rather, language referring to the
features and
advantages is understood to mean that a specific feature, advantage, or
characteristic described
in connection with an embodiment is included in at least one embodiment. Thus,
discussions of
the features and advantages, and similar language, throughout this disclosure
may, but do not
necessarily, refer to the same embodiment.
Furthermore, the described features, advantages, and characteristics of the
present disclosure
may be combined in any suitable manner in one or more embodiments. One skilled
in the
relevant art will recognize, in light of the description herein, that the
invention can be practiced
without one or more of the specific features or advantages of a particular
embodiment. In other
instances, additional features and advantages may be recognized in certain
embodiments that
may not be present in all embodiments of the invention.
Reference throughout this specification to "one embodiment," "an embodiment,"
or similar
language means that a particular feature, structure, or characteristic
described in connection
with the indicated embodiment is included in at least one embodiment. Thus,
the phrases "in
one embodiment," "in an embodiment," and similar language throughout this
specification may,
but do not necessarily, all refer to the same embodiment.
Systems for processing image data to coincide in a point of time with audio
data to create a
master timeline coordinating audio timeline playback and image data display to
generate a
synchronized multimedia presentation using content creator and content player
modules
comprise a network architecture, a synchronization system and a file system
database.
A network architecture 100 for producing, transmitting, receiving and
reproducing synchronized
multimedia presentations is illustrated in FIG. 1. As used herein, synchronized
multimedia
presentations refer to one or more image data arranged to coincide in a point
of time with parts
of audio data containing play duration times. The one or more image data may
be image files and
computer graphic images in accordance with an embodiment of the invention. The
one or more
audio data may be relating to files reproducing recorded sounds in accordance
with an
embodiment of the invention.
In an embodiment of the invention, the network architecture 100 comprises one
or more
computing devices 101 and computer networks 102 connected to a synchronization
system 103
as illustrated in FIG. 1. The computing devices 101 may be any electronic
device for storing and
processing data comprising at least a screen and enabled to connect to
computer networks 102.
In an embodiment of the invention, the computer network 102 may be the
Internet and other
networks that connect to the Internet.
As illustrated in FIG. 2, the synchronization system 103 comprises a content
creator module 201,
a content player module 202 and a file system database 203 in accordance with
an embodiment
of the invention. The synchronization system 103 is configured to permit the
creation,
transmission, reception, storage, preview and playback of synchronized
multimedia
presentations. Each of the content modules 201 and 202 comprises a processor
204 to execute
commands and operations generated in each module 201 and 202 in accordance
with an
embodiment of the invention.
Users accessing the synchronization system 103 may be content creator users
and content player
users in accordance with an embodiment of the invention.
The content creator module 201 operates to permit content creator users using
a computing
device 101 to create, transmit, receive, save, preview and reproduce
synchronized multimedia
presentations as illustrated in FIG. 3 in accordance with an embodiment of the
invention. A
content creator user may create, list and manipulate synchronized multimedia
presentations
using one or more image data and audio data in accordance with an embodiment
of the
invention.
As shown in FIG. 3, the content creator module 201 comprises an audio data
section 301, an
image data section 306, a master timeline section 310, a synchronization
section 313 and a file
system database 203 in accordance with an embodiment of the invention. In
another
embodiment of the invention, the content creator module may comprise a live transmission
section 318 and an audio to text transcription section 323.
The audio data section 301 comprises an audio data creation 302, an audio data
list 303, an audio
data manipulation 304 and an audio data timeline creation 305. The audio data
section 301 is
configured to provide an interface to create, list and manipulate one or more
audio data.
The image data section 306 comprises an image data creation 307, an image data
list 308 and an
image data manipulation 309. The image data section 306 is configured to
provide an interface
to create, list and manipulate one or more image data.
The master timeline section 310 comprises a timestamp creator 311 and an audio
timeline and
image data time relationship generator 312. The master timeline section 310 is
configured to
provide an interface to create timestamps in an audio timeline and process an
uncoupled
synchronization of one or more image data with the audio timeline. As used
herein, timestamps
refer to digital records of time to indicate a play duration time of audio
data wherein image data
is displayed to coincide in time with a time assigned in an audio timeline.
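The display rule defined above — an image is shown once audio playback reaches the time assigned to it on the audio timeline — can be sketched as a lookup over sorted timestamps. The timestamp values and file names below are hypothetical examples, not data from the disclosure.

```python
import bisect

# Hypothetical (timestamp_seconds, image_file) pairs, sorted by timestamp.
timestamps = [(0.0, "cover.png"), (12.5, "diagram.png"), (47.0, "summary.png")]

def image_at(position: float) -> str:
    """Return the image whose timestamp most recently passed at `position`
    seconds of audio playback, i.e. the image currently on display."""
    times = [t for t, _ in timestamps]
    # Index of the last timestamp that is <= position.
    i = bisect.bisect_right(times, position) - 1
    return timestamps[max(i, 0)][1]
```

For example, any playback position between 12.5 and 47.0 seconds resolves to the image assigned at 12.5 seconds.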
The synchronization section 313 comprises a master timeline information
compilation 314, an
audio timeline and image data time relationship execution 315, and a digital
control file
generator 316. The synchronization section 313 is configured to process and
execute a digital
control file containing commands to execute an uncoupled synchronization of one
or more image
data with the audio timeline.
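One consequence of this uncoupled synchronization can be sketched as follows: because the digital control file only references the audio and image data, a display time can be edited in the control file alone, without re-encoding or re-uploading any media. The JSON layout used here (`commands`, `display`, `at`) is an assumed, hypothetical schema.

```python
import json

# A hypothetical digital control file for a two-image presentation.
sample_control_file = json.dumps({
    "audio_timeline": ["talk.mp3"],
    "commands": [
        {"display": "intro.png", "at": 5.0},
        {"display": "chart.png", "at": 20.0},
    ],
})

def retime_image(control_json: str, image_file: str, new_timestamp: float) -> str:
    """Move one image to a new timestamp by editing only the digital control
    file; the audio and image files are never touched, preserving their
    characteristic file integrity."""
    control = json.loads(control_json)
    for command in control["commands"]:
        if command["display"] == image_file:
            command["at"] = new_timestamp
    # Keep commands in play-duration order after the edit.
    control["commands"].sort(key=lambda c: c["at"])
    return json.dumps(control, indent=2)

edited = retime_image(sample_control_file, "chart.png", 2.0)
```

Contrast this with the merged-video approach criticized in the background, where the same change would require regenerating the whole video file.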
In another embodiment of the invention, a content creator user may activate a
live transmission
317 of one or more synchronized multimedia presentations as shown in FIG. 3. The content
The content
creator may choose whether to activate the live transmission or not. If the
content creator user
chooses to activate a live transmission 317, a live transmission section 318
starts to operate. The
live transmission section 318 comprises a live transmission activation 319, a
transmission delay
time 320, and a live transmission operation 321. The live transmission activation
319 is configured to
start live transmissions at the same time that the content creator user
creates one or more
synchronized multimedia presentations. The transmission delay time 320 is
configured to delay
the time wherein one or more synchronized multimedia presentations are
transmitted to one or
more content player users. The live transmission operation 321 is configured
to permit live
transmissions and reception of one or more synchronized multimedia
presentations using a
computing device 101 over a computer network 102, allowing playback to start
while the rest of
the data is still being received.
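A minimal sketch of the transmission delay time 320, under the assumption that it simply holds events in a queue until a fixed delay has elapsed; clock times are passed in explicitly so the sketch stays deterministic.

```python
from collections import deque

class DelayBuffer:
    """Hold live-transmission events for a fixed delay (in seconds) before
    releasing them to content player users."""
    def __init__(self, delay: float):
        self.delay = delay
        self.pending = deque()  # (release_time, event) in arrival order

    def push(self, event, now: float) -> None:
        """Record an event occurring at time `now` in the live transmission."""
        self.pending.append((now + self.delay, event))

    def release(self, now: float) -> list:
        """Return every event whose delay has elapsed by time `now`."""
        out = []
        while self.pending and self.pending[0][0] <= now:
            out.append(self.pending.popleft()[1])
        return out
```

In a real live transmission the buffer would be drained by the streaming loop; here the same logic is exercised with explicit timestamps.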
In another embodiment of the invention, a content creator user may activate an
audio to text
transcription 322 based on the content of an audio timeline as shown in FIG.
3. The content
creator may choose whether to activate the audio to text transcription or not.
If the content
creator user chooses to activate an audio to text transcription 322, an audio
to text transcription
section 323 starts to operate. The audio to text transcription section 323
comprises a manual
audio to text transcription upload 324, an automated audio to text
transcription activation 325,
and an audio to text transcription display 326. The manual audio to text
transcription upload 324
is configured to permit transferring audio to text transcription files from a
computing device 101
to the content creator module 201. The automated audio to text transcription
activation 325 is
configured to generate automatic audio to text transcriptions of an audio
timeline using machine
learning techniques generated by an application programming interface. The
audio to text
transcription display 326 is configured to enable the display of audio to text
transcriptions at the
same time that a synchronized multimedia presentation is reproduced.
As shown in FIG. 4, the content player module 202 comprises a synchronization
section 323, a
synchronization processing section 401, a playback section 404 and a user
user interface
section 406 in accordance with one embodiment of the invention.
The synchronization processing section 401 comprises a digital control file
processor 402 and a
master timeline loader 403. The synchronization processing section 401 is
configured to retrieve
the digital control file from the synchronization section 323 and provide an
interface to process
the digital control file 402 and load the master timeline 403.
The playback section 404 is configured to reproduce a master timeline playback
405 guiding a
synchronized multimedia presentation wherein one or more image data are
configured to
coincide in a point of time with parts of an audio timeline.
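A sketch of such a master timeline playback, using a simulated audio position in place of a real clock; the command dictionaries reuse the hypothetical `display`/`at` fields assumed earlier rather than any schema fixed by the disclosure.

```python
def playback_events(commands, duration, step=1.0):
    """Advance a simulated audio position through the play duration and emit
    each display command once its timestamp has been reached."""
    pending = sorted(commands, key=lambda c: c["at"])
    emitted, position = [], 0.0
    while position <= duration:
        # Emit every command whose timestamp the playback has now passed.
        while pending and pending[0]["at"] <= position:
            emitted.append((position, pending.pop(0)["display"]))
        position += step
    return emitted

events = playback_events(
    [{"display": "b.png", "at": 2.5}, {"display": "a.png", "at": 0.0}],
    duration=4.0,
)
```

With a 1-second step, a command stamped at 2.5 seconds fires on the first tick at or after its timestamp, here 3.0 seconds.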
The user interface section 406 comprises a synchronized master timeline
playback viewer 407, a
content player user events 408 and a log content player user events 409. The
synchronized
master timeline playback viewer 407 is configured to play, forward, backward,
pause, stop, adjust
sound volume, vary playback speed of the master timeline and enable content
player user events
in accordance with an embodiment of the invention.
The content player user events 408 is configured to create, store and
distribute one or more
content player user events. Content player user events may be to store and
distribute preferred
parts of a master timeline, create marks on the master timeline, add one or
more image data to
the master timeline and enable audio to text transcriptions in accordance with
an embodiment
of the invention. The log content player user events 409 is configured to load
and store one or
more content player user events in the file system database 203.
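A minimal sketch of logging and querying content player user events before storage in the file system database; the record fields and event names (`bookmark`, `mark`) are illustrative assumptions.

```python
class EventLog:
    """Collect content player user events (e.g. bookmarks and marks on the
    master timeline) as plain records."""
    def __init__(self):
        self.records = []

    def log(self, user: str, event: str, timeline_position: float) -> None:
        """Record one user event at a position (seconds) on the master timeline."""
        self.records.append({"user": user, "event": event,
                             "position": timeline_position})

    def bookmarks(self, user: str) -> list:
        """Return the bookmark events saved by one content player user."""
        return [r for r in self.records
                if r["user"] == user and r["event"] == "bookmark"]
```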
An exemplary content creator interface 501 for creating a synchronized
multimedia presentation
according to an embodiment of the invention is illustrated in FIG. 5. In one
embodiment, the
content creator interface 501 is provided as a computer program to computing
devices.
Nevertheless, in other embodiments, the content creator interface 501 may be
provided as a
webpage by a webpage provider to computing devices accessing a webpage
processing the
synchronization system 103.
The content creator interface 501 comprises a live transmission module 502, a
list of images
section 503, an add image data section 505, a manipulation tools section 506,
a current image
section 507, an add audio data section 508, an audio manipulation tools
section 509, an audio
timeline assembling section 510, a timestamp section 514, an image timestamp
assignment
section 517, a master timeline section 518, and an audio to text
transcription module 521 in
accordance with an embodiment of the invention.
The list of images section 503 displays the one or more images 504 created, listed and manipulated once the content creator user captures, transfers from a computing device, or manipulates the one or more image data using the add image data section 505.
The current image
section 507 displays an image data being manipulated by the content creator
user. The
manipulation tools section 506 comprises one or more manipulations a content creator user generates for image data. The one or more manipulations may be to resize, crop, manipulate colour, rotate, include layers with interactive graphic elements, apply tridimensional manipulations, animations, zoom, sharpen, enhance, remove blemishes, add tone effects, reverse the image, reverse the exposure, and add, delete, and modify one or more graphic image elements in accordance with an embodiment of the invention. The assign timestamp to image data section
Date Recue/Date Received 2023-03-15

508 enables the assignment of timestamps, within the play duration time of the audio data, at which image data is displayed to coincide in time with a time assigned in an audio timeline.
In the embodiment shown in FIG. 5, the add audio data section 508 enables creating and/or
listing one or more audio data, wherein a content creator user captures,
transfers from a
computing device and/or manipulates the one or more audio data in accordance
with an
embodiment of the invention.
The audio manipulation tools section 509 comprises tools to equalize sounds of
audio data, add
and/or remove parts of audio data, add and/or remove sound effects of audio
data in accordance
with an embodiment of the invention. The audio timeline assembling section 510
is configured
to list one or more audio data 511 in a sequential order and assemble 512 the
one or more audio data
511 into an audio timeline 513.
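The assembly performed by section 510 can be sketched as computing a cumulative start offset per clip, so the audio timeline's total duration is the sum of the clip durations; the function and field names below are hypothetical:

```python
# Each audio data item is (name, duration_seconds); the list order is
# the sequential order chosen by the content creator user.
def assemble_audio_timeline(audio_data):
    """Return (timeline, total_duration), where timeline maps each
    audio clip to its start offset in the assembled audio timeline."""
    timeline, offset = [], 0.0
    for name, duration in audio_data:
        timeline.append({"clip": name, "start": offset, "duration": duration})
        offset += duration
    return timeline, offset

timeline, total = assemble_audio_timeline(
    [("intro", 12.0), ("lesson", 95.5), ("outro", 8.5)])
print(total)  # 116.0
```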
The timestamp section 514 is configured to add timestamps 516 corresponding to
time durations in
a playback of the audio timeline 513. The image timestamp assignment section
517 is configured
to assign one or more timestamps 515 to each of the image data 507
corresponding to a time
duration in a playback of the audio timeline 513. The master timeline section
518 is configured
to generate 519 and reproduce a master timeline 520 comprising the time
synchronization of
image data with audio data in accordance with an embodiment of the invention.
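A minimal sketch of the timestamp assignment and master timeline generation described for sections 514, 517 and 518 follows, assuming timestamps are seconds into the audio timeline; the function and field names are hypothetical:

```python
def generate_master_timeline(image_data, image_timestamps, audio_duration):
    """Pair each image with the timestamp at which it is displayed,
    ordered by time, validating that every timestamp falls inside the
    playback duration of the audio timeline."""
    assert len(image_data) == len(image_timestamps)
    entries = sorted(zip(image_timestamps, image_data))
    for ts, _ in entries:
        if not 0.0 <= ts <= audio_duration:
            raise ValueError(f"timestamp {ts} outside audio timeline")
    return [{"time": ts, "image": img} for ts, img in entries]

# Images may be listed in any order; the master timeline sorts them
# by their assigned timestamps.
master = generate_master_timeline(["a.png", "b.png"], [30.0, 0.0], 116.0)
print(master[0])  # the entry displayed first, at t=0.0
```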
As shown in FIG. 5, the live transmission area 502 comprises a transmission
and reception of the
synchronized multimedia presentation using a computing device over a computer
network,
allowing playback to start while the rest of the data is still being received
during a live transmission in accordance with one embodiment of the invention. The audio to
text
transcription area 521 comprises an area to transfer an audio to text
transcription generated
manually and an area to enable an audio to text transcription generated
automatically. The
automatic audio to text transcription is generated by an application
programming interface.
The content creator interface 501 illustrated in FIG. 5 represents one of many
possibilities to
arrange the content creator interface. Nevertheless, in other embodiments, the
sections
represented in the content creator interface 501 may be arranged in different
configurations.
In one embodiment of the invention, a synchronized multimedia presentation may
be used by a
content creator user generating audio data to be available over computer
networks to one or
more individuals (e.g. podcasts). In another embodiment of the invention, the
synchronized
multimedia presentations may be used by a content creator teaching electronic
lessons to one
or more individuals in person and/or over computer networks. In another
embodiment of the
invention, the synchronized multimedia presentation may be used by a content
creator
generating content to be distributed over one or more web applications.
A content player module 202 may operate to permit content player users
operating a computing
device 101 to search synchronized multimedia presentations, navigate
synchronized multimedia
presentations, add content player user events and reproduce synchronized
multimedia
presentations in accordance with an embodiment of the invention. In another
embodiment of
the invention, the content player module 202 is configured to permit content
player users to
receive live transmissions of synchronized multimedia presentations created by
content creator
users.
An exemplary content player interface 601 for reproducing a synchronized
multimedia
presentation according to one embodiment of the invention is illustrated in
FIG. 6. In one
embodiment, the content player interface 601 is provided as a computer program
to computing
devices. Nevertheless, in other embodiments, the content player interface may
be provided as
a webpage by a webpage provider to computing devices accessing the webpage.
As shown in FIG. 6, the content player interface comprises a content player
user events area 602,
a synchronized multimedia presentation viewer 608, and a synchronized
presentation sections
area 611.
The content player user events area 602 comprises one or more digital records
603 generated by
content player users to mark a master timeline, web applications to distribute
content over
computer networks 604 to one or more content player users, an audio to text
transcription
module 605, an add content player user events section 606 and a log content
player user events
section 607 in accordance with one embodiment of the invention.
The one or more digital records 603 comprise sections in the master timeline
wherein the content
player user creates marks on one or more preferred sections of a master
timeline. The web
applications distributing content over computer networks 604 comprise one or
more applications
to distribute preferred master timelines and sections in master timelines to
other users over a
computer network. The audio to text transcription module 605 comprises a
section to enable the
audio to text transcript to be displayed in a master timeline. The add content
player user events
section 606 is configured to include and record information created by a
content player user for
a synchronized multimedia presentation. The information may be annotations,
links, meta-
elements, image data and audio data in accordance with one embodiment of the
invention. The
log content player user events section 607 is configured to store records and
information
pertaining to content player user events and generated by a content player
user for a
synchronized multimedia presentation in accordance with one embodiment of the
invention.
The synchronized presentation viewer 608 comprises an image data viewer area
609 and a
master timeline playback area 610 in accordance with an embodiment of the
invention. The
image data viewer area 609 displays one or more synchronized multimedia
presentations. In
another embodiment of the invention, the one or more synchronized multimedia
presentations
are reproduced in a live transmission generated by a content creator user.
The master timeline playback area 610 reproduces the synchronized multimedia
presentation
containing image data and the audio timeline in accordance with an embodiment
of the
invention. The master timeline playback area 610 comprises play, forward,
backward, pause,
stop, adjust sound volume, and vary playback speed of the synchronized
multimedia
presentation.
The synchronized multimedia presentation sections area 611 comprises one or
more sections
612 wherein a content player user selects a preferred point in the master
timeline to view the
synchronized multimedia presentation. The one or more sections 612 may be
generated
automatically or manually by a content creator user in accordance with one
embodiment of the
invention. The one or more sections 612 generated automatically are retrieved
from the
timestamps 515 generated by the content creator user.
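Automatic generation of the sections 612 from the image timestamps 515 could look like the following sketch, where each image timestamp yields one jump point on the master timeline (field names hypothetical):

```python
def sections_from_timestamps(master_timeline):
    """Derive presentation sections automatically: one section per
    image timestamp, usable as jump points on the master timeline."""
    return [{"section": i + 1, "start": entry["time"]}
            for i, entry in enumerate(master_timeline)]

master = [{"time": 0.0, "image": "b.png"}, {"time": 30.0, "image": "a.png"}]
print(sections_from_timestamps(master))
```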
The content player interface 601 illustrated in FIG. 6 represents one of many
possibilities to
arrange the content player interface. Nevertheless, in other embodiments, the sections represented in the content player interface 601 may be arranged in different
configurations.
Methods for processing image data to coincide in a point of time with audio
data to create a
master timeline coordinating audio timeline playback and image data display to
generate a
synchronized multimedia presentation using content creator and content player
modules are
now described.
As illustrated in FIGS. 7A-7B, the method 700 begins 701 with a content
creator user accessing a
synchronization system using a computing device 101 as described in the
process 702. In another
embodiment of the invention, accessing the synchronization system 103 may
occur over a
computer network 102.
As illustrated in FIG. 7A, a content creator may choose whether to transmit a
synchronized
multimedia presentation over a computer network 102 using a live transmission.
If a content
creator user prefers to transmit a synchronized multimedia presentation over a
computer
network 102 using a live transmission 703:YES, then the method 700 continues
with enabling the
live transmission 704 in accordance with another embodiment of the invention.
The live
transmission 703 is configured to transmit and receive one or more
synchronized multimedia
presentations in a computing device 101 over a computer network 102 allowing
playback to start
while the rest of the data is still being received during a live
transmission. In process 704, the
live transmission may be configured to delay the transmission and reception of
synchronized
multimedia presentations to one or more content player users in accordance
with an
embodiment of the invention.
In process 705, the one or more audio data is/are manipulated to assemble an
audio timeline
containing a sequential order of one or more audio data. The one or more audio
data are created
by using an apparatus to capture sounds and transfer them to the content
creator module 201.
In another embodiment of the invention, the one or more audio data are listed
by transferring
the one or more audio data from a computing device 101 to the content creator
module 201. In
another embodiment of the invention, the one or more audio data is/are
manipulated by
equalizing sounds of audio files, adding and/or removing parts of audio files,
adding and/or
removing sound effects. In another embodiment of the invention, the one or
more audio data
is/are reproduced by playing, forwarding, moving backwards, pausing, stopping,
adjusting the
sound volume, varying playback speed of audio data.
In process 705, the one or more audio data are assembled to generate an audio
timeline 513
comprising the sum of time durations of the one or more audio data in
accordance with an
embodiment of the invention. The audio timeline 513 comprises a sequential
order of one or
more audio data wherein the sequential order may be determined by a content
creator user in
accordance with an embodiment of the invention.
In process 706, one or more image data are manipulated to display
transformations made as a
result of editing. The one or more image data is/are created using an
apparatus to capture images
and transfer to the content creator module 201 in accordance with one
embodiment of the
invention. The one or more image data is/are listed by transferring the one or
more image data
from a computing device 101 to the content creator module 201. The one or more
image data
is/are manipulated using one or more manipulation tools 506. The one or more
manipulation
tools comprises/comprise resizing, cropping, colour manipulation, rotating, including layers with interactive graphic elements, tridimensional manipulations, animations, zooming, sharpening, enhancing, removing blemishes, adding tone effects, reversing the image, reversing the exposure, adding,
deleting, and modifying one or more graphic image elements in accordance with
one
embodiment of the invention. In another embodiment of the invention, the one
or more image
data transferred to the content creator module 201 may be previewed by a
content creator user.
In process 707, one or more timestamps are generated for an audio timeline to
be assigned to
each of the image data listed in the content creator module. The audio
timestamps 515 represent
the current play time for each of the image data listed in a list of image
data section 503.
In process 708, each of the image data listed in the content creator module is assigned a timestamp 515 recording a play time wherein each image is displayed to
coincide in time with
the audio timeline 513. Then, in process 709, each image timestamp is
connected to the master
timeline, thus generating an audio data and image data time relationship.
In process 710, a digital control file is generated from the master timeline containing image data and the audio timeline. The digital control file contains records or information concerning one or more manipulations processed in audio data and image data, wherein the file integrity of each audio data and image data is maintained. The digital control file may use any standard text file; some embodiments may use HTML, XML, CSV or JSON. The digital control file comprises a standard text file storing the time relationship between the audio timeline and image data.
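The passage above allows any standard text format, HTML, XML, CSV or JSON among them; a minimal JSON-flavoured sketch of such a control file, with hypothetical field names, could be:

```python
import json

# Hypothetical field names; the patent only requires a standard text
# format recording the time relationship between audio and images.
control = {
    "audio_timeline": [
        {"clip": "intro.mp3", "start": 0.0, "duration": 12.0},
        {"clip": "lesson.mp3", "start": 12.0, "duration": 95.5},
    ],
    "image_timestamps": [
        {"time": 0.0, "image": "slide1.png"},
        {"time": 30.0, "image": "slide2.png"},
    ],
}
# The source audio and image files are referenced, not embedded, so
# the file integrity of each audio data and image data is maintained.
control_file = json.dumps(control, indent=2)
print(control_file.splitlines()[0])
```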
In process 711, one or more meta-elements describing the characteristics of a synchronized multimedia presentation may be added, permitting synchronized multimedia presentations to be discoverable by search engines in accordance with one embodiment of the invention. The one or more meta-elements comprise records and information stored in the digital control file and the audio to text transcription, and may include text files, meta-tags, titles, authors and dates.
As illustrated in FIG. 7B, a content creator may choose whether to generate
audio to text
transcriptions. If a content creator user prefers to generate audio to text
transcriptions 712:YES,
then the method 700 continues with enabling the audio to text transcriptions
713 in accordance
with another embodiment of the invention. Automatic and manual audio to text
transcripts
describing the content in an audio timeline may be enabled in accordance with
one embodiment
of the invention. The manual audio to text transcription comprises a
transcription generated by
a human operator (e.g. content creator user) transferred from a computing
device 101 to the
content creator module 201. The automated audio to text transcription
comprises a transcription
generated using machine learning techniques generated by an application
programming
interface in accordance with an embodiment of the invention. The manual and
automated audio
to text transcriptions comprising a text-type file may be recorded in a
different file than the digital
control file storing the master timeline synchronizing one or more image data
with an audio
timeline 513 in accordance with another embodiment of the invention. The text-
type file
generated for audio to text transcriptions is stored in the file system
database 203.
In process 714, the digital control file, containing the commands to execute the time relationship between image data and the audio timeline and thus comprising a master timeline, is stored in a file
system database 203. The method 700 may continue to generate another
multimedia
presentation or finish 715 in accordance with an embodiment of the invention.
As shown in FIG.8, a method 800 for reproducing a master timeline 520 to
reproduce a
synchronized multimedia presentation containing image data coinciding in time
with an audio
timeline is described in accordance with an embodiment of the invention. In
process 801, the
content player module is accessed by a content player user. Then, a request to
the file system
database to retrieve the digital control file is generated in 802.
The content player module 202 processes the digital control file in 803 to
execute the time
relationship between image data and the audio timeline. Then, the master
timeline is loaded to
proceed with reproducing a synchronized multimedia presentation in 804. In
process 805, the
master timeline playback is activated.
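During playback, executing the stored time relationship amounts to looking up, for the current playback time, the image whose timestamp most recently passed; a sketch of that lookup using binary search (all names hypothetical):

```python
from bisect import bisect_right

def image_at(master_timeline, t):
    """Return the image that should be on screen at playback time t:
    the entry with the latest timestamp not exceeding t."""
    times = [entry["time"] for entry in master_timeline]
    i = bisect_right(times, t) - 1
    return master_timeline[i]["image"] if i >= 0 else None

master = [{"time": 0.0, "image": "slide1.png"},
          {"time": 30.0, "image": "slide2.png"}]
print(image_at(master, 45.0))  # slide2.png
```

Seeking, pausing, or varying playback speed only changes the time t fed into the lookup, which is one reason a plain timestamp table suffices as the control structure.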
As illustrated in FIG. 8, a content player may choose whether to display audio
to text
transcriptions during the time the master timeline playback is being
reproduced. If the content
player user prefers to display audio to text transcriptions to be displayed
during the time the
master timeline playback is being reproduced 806:YES, then the method 800
continues with
displaying the audio to text transcriptions 807 in accordance with an
embodiment of the
invention.
The content player module 202 may enable the exhibition of meta-elements, record content player user events, and distribute content player user events to one or more
content player users
in accordance with an embodiment of the invention.
In process 808, the master timeline is reproduced and the synchronized
multimedia presentation
is viewed by the content player user. The method 800 may continue to generate
another
multimedia presentation or finish 809 in accordance with an embodiment of the
invention.
As illustrated in FIG. 9, an embodiment of a master timeline with one or more
image data
processed to coincide in a point of time with the audio timeline through an
uncoupled
synchronization is disclosed herein 901. The audio timeline reproduces a
sequential order of
audio data during the time that listed, created and/or manipulated image data
are displayed in
908. In some embodiments, an image not having the use of manipulations 902 may
be displayed
as the first image in the point of time synchronized with the timestamps 517
generated for the
audio timeline 513. In the time following, a manipulated version of the first
image may be
displayed in the point of time synchronized with the timestamps generated for
the audio
timeline, wherein a section of the first image displaying a close-up with one
or more
characteristics of the first image is viewed in detail 903. A second image not
having the use of
manipulations 904 may be displayed in the point of time synchronized with
timestamps 517
generated for the audio timeline 513. One or more annotations may overlay the second image, wherein some sections are made visually prominent using hand drawings 905.
A third image
not having the use of manipulations 906 may be displayed in the point of time
synchronized with
the timestamps generated for the audio timeline. In the time following, a
manipulated version of
the third image may be displayed in the point of time synchronized with the
timestamps
generated for the audio timeline, wherein one or more hyperlinks 907 overlay the third image, allowing a content player user, by pressing a button or touching a computing device screen, to access the content of the one or more hyperlinks 907.
The manipulated version of one or more image data 903, 905 and 907 described
herein may be
generated by a content creator user when creating a synchronized multimedia
presentation in
one embodiment of the invention. In another embodiment of the invention, the
manipulated
version of one or more image data 903, 905 and 907 described herein may be
generated
synchronously by a content player user while viewing the synchronized
multimedia presentation.
As illustrated in FIG. 9, the synchronous manipulation of one or more image
data occurs on the
same frame 901 wherein the synchronized multimedia presentation is being
reproduced.
Although the processes of the methods herein are shown and described in a
particular order, the
order of the processes of the methods may be altered so that certain processes
may be
performed in an inverse order or so that certain processes may be performed,
at least in part,
concurrently with other processes. In another embodiment, instructions or sub-
processes of
distinct processes may be implemented in an intermittent and/or alternating
manner.
Although the present disclosure has been illustrated and described with
respect to one or more
implementations, equivalent alterations and modifications will occur to others
skilled in the art
upon the reading and understanding of the present disclosure and the annexed
drawings.
Furthermore, while a particular feature of the present disclosure may have
been disclosed with
respect to one of several implementations, such feature may be combined with
one or more
other features of the other implementations as may be desired and advantageous
for any given
or particular application. Thus, the breadth and scope of the present
disclosure should not be
limited by any of the above-described embodiments. Rather, the scope of the
present
disclosure should be defined in accordance with the following claims and their
equivalents.
Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status


Title Date
Forecasted Issue Date 2023-06-27
(22) Filed 2020-04-24
Examination Requested 2020-04-24
(41) Open to Public Inspection 2021-10-24
(45) Issued 2023-06-27

Abandonment History

There is no abandonment history.

Maintenance Fee

Last Payment of $50.00 was received on 2021-04-22


 Upcoming maintenance fee amounts

Description Date Amount
Next Payment if small entity fee 2024-04-24 $250.00
Next Payment if standard fee 2024-04-24 $400.00


Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee 2020-04-24 $200.00 2020-04-24
Request for Examination 2024-04-24 $400.00 2020-04-24
Maintenance Fee - Application - New Act 2 2022-04-25 $50.00 2021-04-22
Final Fee 2020-04-24 $153.00 2023-04-24
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
MONTEIRO SIQUEIRA FRANCESCHI, WILTER
DA SILVA FIGUEIREDO, VANESSA
Past Owners on Record
None
Documents

Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
New Application 2020-04-24 5 151
Drawings 2020-04-24 10 412
Description 2020-04-24 17 810
Abstract 2020-04-24 1 21
Claims 2020-04-24 7 288
Modification to the Applicant/Inventor 2020-05-28 5 132
Office Letter 2020-10-30 1 199
Name Change/Correction Applied 2020-10-30 1 191
Examiner Requisition 2021-03-02 5 234
Maintenance Fee Payment 2021-04-22 3 60
Amendment 2021-06-07 25 929
Amendment 2021-06-07 29 1,001
Drawings 2021-06-07 10 410
Claims 2021-06-07 8 270
Examiner Requisition 2021-08-04 5 245
Representative Drawing 2021-10-15 1 88
Cover Page 2021-10-15 1 118
Change of Address 2021-12-04 4 82
Amendment 2021-12-04 45 1,823
Abstract 2021-12-04 1 20
Description 2021-12-04 15 764
Claims 2021-12-04 4 157
Examiner Requisition 2022-03-07 6 284
Amendment 2022-07-06 31 1,290
Abstract 2022-07-06 1 29
Claims 2022-07-06 4 244
Description 2022-07-06 14 1,139
Interview Record Registered (Action) 2023-01-17 1 28
Amendment 2023-01-17 14 551
Claims 2023-01-17 4 271
Interview Record Registered (Action) 2023-03-15 1 24
Amendment 2023-03-15 25 1,320
Abstract 2023-03-15 1 32
Description 2023-03-15 14 1,253
Claims 2023-03-15 4 270
Final Fee 2023-04-24 4 82
Representative Drawing 2023-05-31 1 49
Cover Page 2023-05-31 1 88
Office Letter 2024-03-28 2 189
Maintenance Fee + Late Fee 2024-06-13 3 99
Correspondence Related to Formalities 2024-06-13 3 741
Maintenance Fee + Late Fee 2023-06-06 4 90
Electronic Grant Certificate 2023-06-27 1 2,527
Office Letter 2023-06-27 2 211