Patent 3103562 Summary

(12) Patent: (11) CA 3103562
(54) English Title: METHOD AND SYSTEM FOR GENERATING AN AUGMENTED REALITY IMAGE
(54) French Title: METHODE ET SYSTEME DE PRODUCTION D'UNE IMAGE DE REALITE AUGMENTEE
Status: Granted and Issued
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06T 11/00 (2006.01)
  • G06F 3/14 (2006.01)
  • G06T 7/00 (2017.01)
  • G06T 11/60 (2006.01)
(72) Inventors :
  • BERUBE, SAMUEL (Canada)
  • MILLETTE, ALEXANDRE (Canada)
(73) Owners :
  • CAE INC
(71) Applicants :
  • CAE INC (Canada)
(74) Agent: FASKEN MARTINEAU DUMOULIN LLP
(74) Associate agent:
(45) Issued: 2022-04-05
(22) Filed Date: 2020-12-22
(41) Open to Public Inspection: 2021-03-11
Examination requested: 2020-12-22
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract

A system for generating an augmented reality image using an initial image comprising a reference element, the system comprising: an image analyzer for identifying the reference element within the initial image; an image generating unit for: generating a simulation image of a scene of a virtual environment based on the identified reference element, an outline of the simulation image corresponding to the reference outline; overlaying the initial image with the simulation image to obtain the augmented reality image; and providing the augmented reality image for display.


French Abstract

Un système pour générer une image de réalité augmentée au moyen d'une image initiale comprenant un élément de référence comprend : un analyseur d'image pour déterminer un élément de référence dans l'image initiale; une unité de génération d'image pour générer une image de simulation d'une scène de l'environnement virtuel fondée sur l'élément de référence déterminé, un contour de l'image de simulation correspondant au contour de référence; la superposition de l'image initiale sur l'image de simulation pour obtenir l'image de réalité augmentée; et la production de l'image de réalité augmentée aux fins d'affichage.

Claims

Note: Claims are shown in the official language in which they were submitted.


I/WE CLAIM:

1. A computer-implemented method for generating an augmented reality image using an initial image comprising a reference element, the computer-implemented method comprising: identifying the reference element within the initial image, a reference outline being associated with the reference element; generating a simulation image of a scene of a virtual environment based on the identified reference element, an outline of the simulation image corresponding to the reference outline; overlaying the initial image with the simulation image to obtain the augmented reality image; and providing the augmented reality image for display.

2. The computer-implemented method of claim 1, wherein said generating the simulation image and said overlaying the initial image with the simulation image are performed concurrently by drawing the scene of the virtual environment over the reference element within the initial image.

3. The computer-implemented method of claim 1, wherein said generating the simulation image comprises: generating a first image of the scene of the virtual environment, an outline of the first image being larger than the reference outline; and downsizing the first image to obtain the simulation image.

4. The computer-implemented method of claim 1, wherein said generating the simulation image comprises: generating a first image of the scene of the virtual environment, an outline of the first image being smaller than the reference outline; and expanding the first image to obtain the simulation image.

5. The computer-implemented method of claim 1, wherein the reference outline is different from a physical outline of the reference element, the computer-implemented method further comprising determining the reference outline.

6. The computer-implemented method of claim 5, wherein a position of the simulation image within the augmented reality image is different from a position of the reference element within the initial image, the computer-implemented method further comprising determining the position of the simulation image within the augmented reality image.

7. The computer-implemented method of claim 5 or 6, wherein the reference element is a marker.

8. The computer-implemented method of claim 7, wherein the marker comprises one of a barcode and a QR code.

9. The computer-implemented method of claim 1, wherein the reference outline is a physical outline of the reference element, and a position of the simulation image within the augmented reality image is identical to a position of the reference element within the initial image.

10. The computer-implemented method of claim 9, wherein the reference element comprises a representation of one of a screen, a window and a porthole.

11. The computer-implemented method of claim 1, wherein said generating the simulation image comprises: generating a first image of the scene having a rectangular shape, an outline of the first image being larger than the outline of the simulation image; and rendering transparent some pixels of the first image, thereby obtaining the simulation image.

12. The computer-implemented method of any one of claims 1 to 11, wherein said identifying the reference element is performed using an object recognition method.

13. The computer-implemented method of any one of claims 1 to 11, wherein the reference element is provided with a predefined color, said identifying the reference element comprising identifying image pixels having the predefined color within the initial image.
Date recue / Date received 2021-12-06

14. The computer-implemented method of claim 13, wherein the identified image pixels correspond to outline pixels of the reference element.

15. The computer-implemented method of any one of claims 1 to 14, wherein the initial image comprises a video frame from a video.

16. The computer-implemented method of any one of claims 1 to 15, wherein the initial image is associated with a line of sight of a user and the simulation image is generated further based on the line of sight of the user.

17. The computer-implemented method of claim 16, further comprising determining the line of sight of the user.

18. The computer-implemented method of claim 17, wherein said determining the line of sight of the user comprises measuring a position and an orientation of a head of the user.

19. A system for generating an augmented reality image using an initial image comprising a reference element, the system comprising: an image analyzer for identifying the reference element within the initial image; an image generating unit for: generating a simulation image of a scene of a virtual environment based on the identified reference element, an outline of the simulation image corresponding to the reference outline; overlaying the initial image with the simulation image to obtain the augmented reality image; and providing the augmented reality image for display.

20. The system of claim 19, wherein the image generating unit is configured for drawing the scene of the virtual environment over the reference element within the initial image, thereby concurrently performing said generating the simulation image and said overlaying the initial image with the simulation image.

21. The system of claim 19, wherein the image generating unit comprises: a simulation image generator for generating the simulation image; and an image combiner for overlaying the initial image with the simulation image to obtain the augmented reality image and providing the augmented reality image for display.
22. The system of claim 21, wherein the simulation image generator is configured for: generating a first image of the scene of the virtual environment, an outline of the first image being larger than the reference outline; and downsizing the first image to obtain the simulation image.

23. The system of claim 21, wherein the simulation image generator is configured for: generating a first image of the scene of the virtual environment, an outline of the first image being smaller than the reference outline; and expanding the first image to obtain the simulation image.

24. The system of claim 19, wherein the reference outline is different from a physical outline of the reference element, the image analyzer being further configured for determining the reference outline.

25. The system of claim 24, wherein a position of the simulation image within the augmented reality image is different from a position of the reference element within the initial image, the image analyzer being further configured for determining the position of the simulation image within the augmented reality image.

26. The system of claim 24 or 25, wherein the reference element is a marker.

27. The system of claim 26, wherein the marker comprises one of a barcode and a QR code.

28. The system of claim 19, wherein the reference outline is a physical outline of the reference element, and a position of the simulation image within the augmented reality image is identical to a position of the reference element within the initial image.

29. The system of claim 19, wherein the reference element comprises a representation of one of a screen, a window and a porthole.

30. The system of claim 21, wherein the simulation image generator is configured for: generating a first image of the scene having a rectangular shape, an outline of the first image being larger than the outline of the simulation image; and rendering transparent some pixels of the first image, thereby obtaining the simulation image.
31. The system of any one of claims 19 to 30, wherein the image analyzer is configured for identifying the reference element using an object recognition method.

32. The system of any one of claims 19 to 31, wherein the reference element is provided with a predefined color, said identifying the reference element comprising identifying image pixels having the predefined color within the initial image.

33. The system of claim 32, wherein the identified image pixels correspond to outline pixels of the reference element.

34. The system of any one of claims 19 to 33, wherein the initial image comprises a video frame from a video.

35. The system of any one of claims 19 to 34, wherein the initial image is associated with a line of sight of a user, the image generating unit being configured for generating the simulation image further based on the line of sight of the user.

36. The system of claim 35, further comprising a tracking device for determining the line of sight of the user.

37. The system of claim 36, wherein the tracking device is a camera.

38. A computer program product comprising a non-volatile memory storing thereon computer executable instructions for generating an augmented reality image using an initial image comprising a reference element, the computer executable instructions being configured for, when executed by a computer, performing the method steps of: identifying the reference element within the initial image, a reference outline being associated with the reference element; generating a simulation image of a scene of a virtual environment based on the identified reference element, an outline of the simulation image corresponding to the reference outline; overlaying the initial image with the simulation image to obtain the augmented reality image; and providing the augmented reality image for display.

Description

Note: Descriptions are shown in the official language in which they were submitted.


METHOD AND SYSTEM FOR GENERATING AN AUGMENTED REALITY IMAGE
TECHNICAL FIELD
[001] The present invention relates to the field of image generation, and more
particularly to
the generation of augmented reality images.
BACKGROUND
[002] Augmented reality is an interactive experience of a real-world
environment where the
objects that reside in the real-world are "augmented" by computer-generated
perceptual
information. In the field of simulators such as aircraft simulators, augmented
reality may be used
to insert a virtual environment into a video captured by cameras. For example,
the scene that would
be seen by a user through a window of an aircraft may be simulated and
inserted into a video of
the cockpit.
[003] In order to generate an augmented reality image, the simulated image
usually acts as a
background image and the image captured by the camera acts as the foreground
image overlaying
the background image. This implies that some parts of the foreground image are
rendered
transparent, as explained in greater detail below with respect to FIG. 1.
SUMMARY
[004] There is described a method and a system for generating an augmented
reality image. As
described in detail below, the augmented reality image is generated by
overlaying an initial image
which may be a video frame with a simulation image. While for at least some of
the methods of
the prior art for generating an augmented reality image the required resources
such as the required
processing time and/or computational power are significant and/or restrictive,
the present method
and system allow for less powerful computer equipment to be used for example.
The saved
resources may then be used for generating a higher definition simulation image
in comparison to
the prior art for example.
Date Recue/Date Received 2021-08-04

[005] According to a first broad aspect, there is provided a computer-implemented
method for generating an augmented reality image using an initial image comprising a
reference element, the computer-implemented method comprising: identifying the
reference element within the initial image, a reference outline being associated with
the reference
element; generating a simulation image of a scene of a virtual environment
based on the
identified reference element, an outline of the simulation image corresponding
to the
reference outline; overlaying the initial image with the simulation image to
obtain the
augmented reality image; and providing the augmented reality image for
display.
[006] In one embodiment, the steps of generating the simulation image and
overlaying
the initial image with the simulation image are performed concurrently by
drawing the
scene of the virtual environment over the reference element within the initial
image.
[007] In one embodiment, the step of generating the simulation image
comprises:
generating a first image of the scene of the virtual environment, an outline
of the first image
being larger than the reference outline; and downsizing the first image to
obtain the
simulation image. In another embodiment, the step of generating the simulation
image
comprises: generating a first image of the scene of the virtual environment,
an outline of the
first image being smaller than the reference outline; and expanding the first
image to obtain
the simulation image.
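The downsizing and expanding embodiments above amount to resampling a first image so that its outline matches the reference outline. The following is an illustrative sketch only; the `resize` helper and the nearest-neighbour scheme are assumptions for exposition, not something the patent prescribes:

```python
# Illustrative only: nearest-neighbour resampling of a "first image" so that
# its size matches a reference outline. Images are nested lists of pixels.

def resize(image, out_w, out_h):
    """Return image resampled to out_w x out_h (downsizing or expanding)."""
    in_h, in_w = len(image), len(image[0])
    return [[image[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

first = [[1, 1, 2, 2],
         [1, 1, 2, 2],
         [3, 3, 4, 4],
         [3, 3, 4, 4]]

simulation = resize(first, 2, 2)      # downsizing to a 2x2 reference outline
expanded = resize(simulation, 4, 4)   # the converse embodiment: expanding
print(simulation)  # [[1, 2], [3, 4]]
```

Either direction uses the same resampling step; only the relation between the first image's outline and the reference outline differs.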
[008] In one embodiment, the reference outline is different from a physical
outline of the
reference element, the computer-implemented method further comprising
determining the
reference outline.
[009] In one embodiment, a position of the simulation image within the
augmented
reality image is different from a position of the reference element within the
initial image,
the computer-implemented method further comprising determining the position of
the
simulation image within the augmented reality image.
[010] In one embodiment, the reference element is a marker.
[011] In one embodiment, the marker comprises one of a barcode and a QR code.
- 2 -
Date Recue/Date Received 2020-12-22

[012] In one embodiment, the reference outline is a physical outline of the
reference
element, and a position of the simulation image within the augmented reality
image is
identical to a position of the reference element within the initial image.
[013] In one embodiment, the reference element comprises a representation of
one of a
screen, a window and a porthole.
[014] In one embodiment, the step of generating the simulation image
comprises:
generating a first image of the scene having a rectangular shape, an outline
of the first
image being larger than the outline of the simulation image; and rendering
transparent some
pixels of the first image, thereby obtaining the simulation image.
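The embodiment above can be pictured as a masking step. In this hedged sketch the boolean mask and the `None` transparency sentinel are illustrative assumptions standing in for an outline test and a zero-alpha pixel value:

```python
# Hypothetical sketch: a rectangular first image is generated, then pixels
# falling outside the desired outline are rendered transparent (modelled here
# with a None sentinel in place of a zero-alpha value).

TRANSPARENT = None

def apply_outline(first_image, keep_mask):
    """keep_mask[y][x] is True where the simulation image keeps its pixel."""
    return [[px if keep else TRANSPARENT
             for px, keep in zip(row, mask_row)]
            for row, mask_row in zip(first_image, keep_mask)]

first = [["a", "b", "c"],
         ["d", "e", "f"]]
mask = [[False, True, False],
        [True, True, True]]
simulation = apply_outline(first, mask)
print(simulation)  # a non-rectangular simulation image in a rectangular buffer
```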
[015] In one embodiment, the step of identifying the reference element is
performed
using an object recognition method.
[016] In one embodiment, the reference element is provided with a predefined
color, said
identifying the reference element comprising identifying image pixels having
the
predefined color within the initial image.
[017] In one embodiment, the identified image pixels correspond to outline
pixels of the
reference element.
[018] In one embodiment, the initial image comprises a video frame from a
video.
[019] In one embodiment, the initial image is associated with a line of sight
of a user and
the simulation image is generated further based on the line of sight of the
user.
[020] In one embodiment, the method further comprises the step of determining
the line
of sight of the user.
[021] In one embodiment, the step of determining the line of sight of the user
comprises
measuring a position and an orientation of a head of the user.
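One way to picture the measurement in the last embodiment: a measured head position and orientation determine a line of sight as an origin plus a direction. Reducing orientation to yaw and pitch angles is an assumed parametrization for illustration only; the patent does not mandate any particular representation:

```python
import math

# Hedged sketch: derive a line of sight from a measured head position and
# orientation, with orientation given as yaw/pitch angles in radians.

def line_of_sight(head_position, yaw, pitch):
    """Return (origin, unit direction vector) of the user's gaze."""
    direction = (math.cos(pitch) * math.cos(yaw),
                 math.cos(pitch) * math.sin(yaw),
                 math.sin(pitch))
    return head_position, direction

origin, direction = line_of_sight((0.0, 0.0, 1.7), yaw=0.0, pitch=0.0)
print(origin, direction)  # gaze straight ahead along the x axis
```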
[022] According to another broad aspect, there is provided a system for
generating an
augmented reality image using an initial image comprising a reference element,
the system

comprising: an image analyzer for identifying the reference element within the
initial
image; an image generating unit for: generating a simulation image of a scene
of a virtual
environment based on the identified reference element, an outline of the
simulation image
corresponding to the reference outline; overlaying the initial image with the
simulation
image to obtain the augmented reality image; and providing the augmented
reality image
for display.
[023] In one embodiment, the image generating unit is configured for drawing
the scene
of the virtual environment over the reference element within the initial
image, thereby
concurrently performing said generating the simulation image and said
combining the
initial image and the simulation image.
[024] In one embodiment, the image generating unit comprises: a simulation
image
generator for generating the simulation image; and an image combiner for
overlaying the
initial image with the simulation image to obtain the augmented reality image
and
providing the augmented reality image for display.
[025] In one embodiment, the simulation image generator is configured for:
generating a
first image of the scene of the virtual environment, an outline of the first
image being larger
than the reference outline; and downsizing the first image to obtain the
simulation image. In
another embodiment, the simulation image generator is configured for:
generating a first
image of the scene of the virtual environment, an outline of the first image
being smaller
than the reference outline; and expanding the first image to obtain the
simulation image.
[026] In one embodiment, the reference outline is different from a physical
outline of the
reference element, the image analyzer being further configured for determining
the
reference outline.
[027] In one embodiment, a position of the simulation image within the
augmented
reality image is different from a position of the reference element within the
initial image,
the image analyzer being further configured for determining the position of
the simulation
image within the augmented reality image.

[028] In one embodiment, the reference element is a marker.
[029] In one embodiment, the marker comprises one of a barcode and a QR code.
[030] In one embodiment, the reference outline is a physical outline of the
reference
element, and a position of the simulation image within the augmented reality
image is
identical to a position of the reference element within the initial image.
[031] In one embodiment, the reference element comprises a representation of
one of a
screen, a window and a porthole.
[032] In one embodiment, the simulation image generator is configured for:
generating a
first image of the scene having a rectangular shape, an outline of the first
image being
larger than the outline of the simulation image; and rendering transparent
some pixels of the
first image, thereby obtaining the simulation image.
[033] In one embodiment, the image analyzer is configured for identifying the
reference
element using an object recognition method.
[034] In one embodiment, the reference element is provided with a predefined
color, said
identifying the reference element comprising identifying image pixels having
the
predefined color within the initial image.
[035] In one embodiment, the identified image pixels correspond to outline
pixels of the
reference element.
[036] In one embodiment, the initial image comprises a video frame from a
video.
[037] In one embodiment, the initial image is associated with a line of sight
of a user, the
image generating unit being configured for generating the simulation image
further based
on the line of sight of the user.
[038] In one embodiment, the system further comprises a tracking device for
determining the line of sight of the user.

[039] In one embodiment, the tracking device is a camera.
[040] According to a further broad aspect, there is provided a computer
program product
for generating an augmented reality image using an initial image comprising a
reference
element, the computer program product comprising a non-volatile memory storing
computer executable instructions thereon that when executed by a computer
perform the
method steps of: identifying the reference element within the initial image, a
reference
outline being associated with the reference element; generating a simulation
image of a
scene of a virtual environment based on the identified reference element, an
outline of the
simulation image corresponding to the reference outline; overlaying the
initial image with
the simulation image to obtain the augmented reality image; and providing the
augmented
reality image for display.
BRIEF DESCRIPTION OF THE DRAWINGS
[041] Further features and advantages of the present technology will become
apparent
from the following detailed description, taken in combination with the
appended drawings,
in which:
[042] FIG. 1 is a conceptual diagram illustrating a prior art method for
generating an
augmented reality image;
[043] FIG. 2 is a flowchart illustrating a method embodying features of the
present
technology for generating an augmented reality image, in accordance with an
embodiment;
[044] FIG. 3 is a conceptual diagram illustrating an embodiment of the method
of FIG. 2
in which a scene is drawn over a reference element of an initial image, in
accordance with
an embodiment;
[045] FIG. 4 is a conceptual diagram illustrating another embodiment of the
method of
FIG. 2 in which a simulation image of a scene having a same outline as an
outline of a
reference element of an initial image is generated and the simulation image is
superimposed
over the reference element on the initial image, in accordance with an
embodiment;

[046] FIG. 5 is a conceptual diagram illustrating another embodiment of the
method of
FIG. 2 in which a first image of a scene is generated, some pixels of the
first image are
rendered to obtain a simulation image having an outline different from an
outline of a
reference element of an initial image and the simulation image is superimposed
over the
reference element on the initial image, in accordance with an embodiment;
[047] FIG. 6 is a block diagram illustrating an embodiment of a system adapted
to
execute at least some of the steps of the method of FIG. 2;
[048] FIG. 7 is a block diagram illustrating another embodiment of a system
adapted to
execute at least some of the steps of the method of FIG. 2, the system
comprising an image
analyzer, a simulation engine and an image combiner; and
[049] FIG. 8 is a block diagram illustrating an exemplary processing module
adapted to
execute at least some of the steps of the method of FIG. 2.
[050] It will be noted that throughout the appended drawings, like features
are identified
by like reference numerals.
DETAILED DESCRIPTION
[051] FIG. 1 illustrates a prior art method 10 for generating an augmented
reality image.
An initial image 12 is received. The initial image 12 contains the
representation of a portion
of a colored screen 13 such as a green screen or a blue screen. For example,
the initial
image 12 may be a video frame of a live video. In the prior art method, the
portion of the
colored screen 13 is identified and rendered transparent to obtain the
foreground image 14
comprising the transparent portion 15. Concurrently a background image 16 is
generated.
The background image 16 may be generated according to some characteristics of
the initial
image 12 such as an orientation associated with the initial image 12. However,
the
simulated image 16 is independent of the colored screen, i.e. the same
simulated image 16
is generated independently of the presence of a colored screen portion 13 in
the initial
image 12 and/or independently of the size and/or position of the colored
screen portion 13
within the initial image 12. The size of the simulated image 16 is the same as
that of the
foreground image 14. The simulated image 16 is used as a background image and
is

combined with the foreground image 14, i.e. the foreground image 14 is
superimposed on
the background image 16 to obtain the augmented reality image 18. As a result,
within the
augmented reality image 18, the simulated image 16 may be seen as a background
image
through the transparent window 15 of the foreground image 14.
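The prior-art pipeline of FIG. 1 is essentially chroma-key compositing. A minimal sketch of the superposition step, in which the pixel values and the `KEY` sentinel are illustrative assumptions:

```python
# Illustrative chroma-key compositing in the style of the prior-art method:
# foreground pixels matching the key color are treated as transparent, so the
# background (simulated) image shows through the "window".

KEY = "G"  # stands in for the predefined green/blue screen color

def composite(foreground, background):
    """Superimpose the foreground image on the background image."""
    return [[bg if fg == KEY else fg for fg, bg in zip(f_row, b_row)]
            for f_row, b_row in zip(foreground, background)]

foreground = [["A", "G", "G"],
              ["A", "G", "G"]]
background = [["1", "2", "3"],
              ["4", "5", "6"]]
augmented = composite(foreground, background)
print(augmented)  # the "G" region is filled with background pixels
```

Note that the full-size background image is generated even though only the keyed region of it survives in the augmented reality image, which is the inefficiency discussed below.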
[052] The prior art method 10 is usually used in the context of simulation
such as for the
simulation of a vehicle such as an aircraft. In this case, a simulator
comprises a frame
reproducing the interior of a vehicle and the user is installed within the
interior of the frame
to follow a training. The interior wall of the frame may be provided with at
least one
colored screen to represent a window for example. The simulator then generates
images to
be displayed on the colored screen and the generated images may correspond to
what would
be seen by the user if he was in a real vehicle. For example, the generated
images may
correspond to images of an outdoor scene.
[053] The user may be provided with a camera used to capture his field of
view. For
example, the camera may be mounted on the user's head or on a helmet to be
worn by the
user. In this case, the frames of the video outputted by the camera are
exemplified by the
initial image 12 described above. The simulator then generates the images to
be displayed
on the colored screen according to the field of view of the user, i.e.
according to the position
and/or orientation of the camera.
[054] In the prior art system, instances of the simulated image 16 generated
by the
simulator are identical for a same field of view of the user. In other words,
the simulated
image 16 generated by the simulator does not depend on the characteristics of
the colored
screen 13 such as its shape, size and/or location within the initial image 12
taken by the
camera and the simulated image 16 has the same size as the initial image 12.
As a result,
the resources required by the prior art system such as the required processing
time and/or
computational power are significant and/or restrictive.
[055] FIG. 2 illustrates one embodiment of a computer-implemented method 50
for
generating an augmented reality image. The method 50 may be executed by a
computer
machine provided with a processing unit or processor, a memory or storing unit
and a

communication unit or communication interface. However, it will be understood
that the
method 50 may be executed by more than one processing unit or more than one
computer
machine.
[056] Referring to FIG. 2 and FIG. 3, an initial image 82 is received at step
52. The
initial image 82 comprises at least a representation of a reference element
84. It will be
understood that the initial image 82 may comprise the representation of only a
portion of a
reference element or only portions of the reference element. It should also be
understood
that the initial image 82 may comprise the representation of more than one
reference
element or portions of more than one reference element.
[057] In one embodiment, the reference element 84 is provided with a
predefined
shape/outline and the identification of the reference element 84 within the
initial image 82
may be performed by locating the predefined shape/outline within the initial
image 82, as
described below.
[058] For example, the reference element 84 may correspond to a window or a
porthole
present in the frame of a simulator. In this case, shape or object recognition
algorithms may
be used for identifying the reference element 84 within the initial image 82.
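As a toy illustration of such recognition (purely hypothetical; the `find_template` helper and exact-match test are assumptions, and production recognizers use far more robust detectors):

```python
# Toy sketch: locate a predefined shape within an image by exhaustive template
# matching, a stand-in for the shape/object recognition algorithms mentioned
# above. Images and templates are nested lists of pixel values.

def find_template(image, template):
    """Return the top-left (x, y) of the first exact match, or None."""
    th, tw = len(template), len(template[0])
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            if all(image[y + j][x + i] == template[j][i]
                   for j in range(th) for i in range(tw)):
                return (x, y)
    return None

image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 0, 0]]
window = [[1, 1],
          [1, 1]]
print(find_template(image, window))  # position of the reference element
```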
[059] In another embodiment, the reference element 84 corresponds to a marker
contained in the initial image 82. For example, the marker may be a predefined
geometrical
shape such as a cross, a triangle, a disc, etc., associated with a portion of
the initial image
82 to be replaced by a simulation image. In another example, the marker may be
a barcode
such as a Quick Response (QR) code associated with a portion of the initial
image 82 to be
replaced by a simulation image.
[060] In a further embodiment, the reference element 84 is of a predefined
color such as
a predefined green color or a predefined blue color, i.e. the pixels of the
initial image
82 forming the reference element 84 are of the predefined color. In this case,
the
identification of the reference element 84 within the initial image 82 may be
performed by
identifying the pixels of the initial image 82 having the predefined color.
For example, the
color associated with each pixel of the initial image may be compared to the
predefined

color to determine whether the pixel belongs to the reference element 84. In
one
embodiment, all of the pixels having the predefined color are identified so
that the whole
reference element 84 is identified. In another embodiment, only the pixels
having the
predefined color and corresponding to the outline of the reference element 84
are identified
so that only the outline of the reference element 84 is identified.
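The two pixel-identification variants just described can be sketched as follows. The RGB triple and the exact-match test are simplifying assumptions; a real implementation would tolerate color variation:

```python
# Hedged sketch: identify the reference element as the set of pixels having a
# predefined color, or only its outline pixels (matching pixels with at least
# one non-matching 4-neighbour). Exact color matching is a simplification.

PREDEFINED = (0, 255, 0)  # assumed predefined green color

def element_pixels(image):
    return {(x, y)
            for y, row in enumerate(image)
            for x, px in enumerate(row)
            if px == PREDEFINED}

def outline_pixels(image):
    inside = element_pixels(image)
    return {(x, y) for (x, y) in inside
            if any((x + dx, y + dy) not in inside
                   for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)))}

G = (0, 255, 0)
image = [[G, G, G],
         [G, G, G],
         [G, G, G]]
print(len(element_pixels(image)), len(outline_pixels(image)))
```

Identifying only the outline pixels touches fewer pixels, consistent with the resource savings the method aims for.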
[061] In one embodiment, the reference element 84 is a representation of at
least one
screen and/or a portion of a screen. The screen may be provided with a
predefined shape
and/or a predefined color to be identified within the initial image 82. The
screen may be
used to simulate a window or a porthole for example.
[062] It should be understood that any adequate method may be used for
creating the
reference element 84. For example, infrared light which may have a predefined
shape
and/or size may be projected on the frame of the simulator to generate a
reference element.
In a further example, depth mapping can be used for creating the reference
element 84.
[063] In one embodiment, the method 50 is used for training a user in a
simulator. In this
case, the simulator may be one conceived to simulate a vehicle such as an
aircraft
simulator, a helicopter simulator, a tank simulator, an infantry fighting
vehicle simulator, or
the like. The simulator comprises a frame, walls, a control panel and/or
control instruments
for allowing the user to control the simulated vehicle as known in the art.
The simulator
further comprises at least one reference element 84 such as at least one
screen. For
example, the reference element 84 may be used for simulating a window present
in the
simulated vehicle.
[064] The simulator further comprises a simulation engine for generating
simulation
images and a database having stored thereon at least topography information
about the
simulated terrain and simulated structures such as buildings, walls, trees,
bridges, and
moving entities such as other vehicles, landable ships, and/or the like. For
example, the
database may contain information such as the position information, dimension
information,
information about the material from which a structure is made, and/or the
like.
[065] In one embodiment, the initial image 82 corresponds to a frame of a video.
In this
case, the simulator is further provided with a camera directed towards the
interior space of
the simulator and the video frames captured by the camera correspond to the
initial images
82.
[066] In one embodiment, the camera may have a fixed or static position within
the
simulator. For example, the camera may be fixed to the ceiling of the
simulator. In another
embodiment, the camera may be positioned on a tripod located within the
simulator.
[067] In another embodiment, the camera may have a dynamic position. For
example,
the camera may be attached to the user of the simulator in order to capture
the field of view
of the user. In this case, the initial image 82 corresponds to an image taken by
the camera,
which shows what the user sees at a specific moment in time during a
simulation. In this
case, the camera may be fixed onto the head of the user such as on a helmet
worn by the
user and positioned and oriented so as to image the field of view of the user.
[068] Referring back to FIG. 2, the second step 54 of the method 50 consists
in
identifying the reference element 84 within the initial image 82. It will be
understood that
any adequate method for recognizing/identifying a reference object such as the
object 84
within an image may be used.
[069] In one embodiment, the identification of the reference element 84 within
the initial
image 82 consists in identifying the pixels of the initial image 82 that
correspond to the
reference element 84 (or the pixels that form the outline of the reference
element 84) and
determining the position of the identified pixels within the initial image 82.
[070] In an embodiment in which the reference element 84 is provided with a
predefined
shape or outline such as when the reference element 84 corresponds to a
marker, a window
or porthole, or a screen, any adequate shape/object recognition method may be
used for
identifying the reference element 84. For example, edge detection or depth
sensing using
stereoscopy or laser range-finding may be used.
[071] In an embodiment in which the reference element 84 is a barcode such as
a QR
code, the shape/object recognition method is adapted to identify barcodes. The
barcode is
then associated with an outline for the simulation image to be combined with
the initial
image. Furthermore, the position of the barcode within the initial image 82
may be
indicative of the position at which the simulation image is to be inserted
into the augmented
reality image.
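The case in which the outline and insertion position are encoded in the barcode itself may be sketched as follows. The `"WIDTHxHEIGHT@X,Y"` payload convention is entirely hypothetical; the disclosure only states that such values may be encoded in the barcode and extracted with any adequate algorithm.

```python
# Sketch: extracting an outline and an insertion position from an
# already-decoded barcode payload. The "WIDTHxHEIGHT@X,Y" payload format
# is a hypothetical convention for illustration.

def parse_marker_payload(payload):
    """Parse 'WIDTHxHEIGHT@X,Y' into an outline size and a position."""
    size_part, pos_part = payload.split("@")
    width, height = (int(v) for v in size_part.split("x"))
    x, y = (int(v) for v in pos_part.split(","))
    return {"outline": (width, height), "position": (x, y)}
```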
[072] In an embodiment in which the reference element 84 is provided with a
predefined
color, the identification of the reference element 84 within the initial image
82 consists in
analyzing the color of the different elements present in the initial image 82
and identifying
the element having the predefined color as being the reference element 84. The
characteristics of the reference element 84 such as its outline and the
position of the
reference element 84 within the initial image 82 are then determined from the
position of
the pixels identified as having the predefined color.
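Once the pixels having the predefined color are known, the outline and position of the reference element 84 can be derived from their coordinates. The sketch below represents the outline as an axis-aligned bounding box and the position as its center; both representations are assumptions, as the disclosure leaves the exact outline representation open.

```python
# Sketch: deriving an outline (bounding box) and a position (center) from
# the pixel coordinates identified as having the predefined color.

def element_bounds(pixels):
    """Return ((min_x, min_y), (max_x, max_y)) of the identified pixels."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return (min(xs), min(ys)), (max(xs), max(ys))

def element_center(pixels):
    """Return the center of the bounding box, usable as a given position."""
    (x0, y0), (x1, y1) = element_bounds(pixels)
    return ((x0 + x1) // 2, (y0 + y1) // 2)
```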
[073] Referring to FIG. 2 and FIG. 4, the next step 56 of the method comprises
generating a simulation image 86 to be combined with the initial image 82. The
simulation
image 86 is an image of a scene of a virtual environment such as an image of
the outdoor scene
that the user of a simulator would see through a window of the simulated
vehicle.
The size of the simulation image 86 is less than or equal to that of the
initial image. The
simulation image 86 is generated based on an outline hereinafter referred to
as the
simulation image outline. In one embodiment, the simulation image outline
corresponds to
the physical or real outline of the reference element 84 so that the shape and
size of the
simulation image 86 are identical to the shape and size of the reference
element 84,
respectively. This is the case when the reference element 84 corresponds to a
window or
porthole, a screen or when the reference element 84 is provided with a
predefined color. In
another embodiment, a predefined outline is associated with the reference
element 84 and
the predefined outline associated with the reference element 84 is independent
from the
physical or real outline of the reference element 84. This is the case when
the reference
element 84 is a marker such as a cross or a barcode. In this case, the method
50 further
comprises the steps of determining the predefined outline associated with the
reference
element 84 before generating the simulation image 86, and assigning the
determined
predefined outline associated with the reference element 84 to the simulation
image 86 so
that the simulation image outline corresponds to the determined predefined
outline
associated with the reference element 84. The predefined outline associated
with the
reference element 84 may be determined using a database storing predefined
outlines each
for a respective marker, using a mapping function or the like. In an
embodiment in which
the reference element 84 is a barcode such as a QR code, the predefined
outline may be
encoded in the barcode. In this case, the method 50 further comprises a step
of extracting
the predefined outline from the barcode such as the QR code using any adequate
algorithm.
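One of the options mentioned above, determining the predefined outline via a database storing one outline per marker, can be sketched as follows. The marker identifiers and outline sizes are illustrative assumptions.

```python
# Sketch: resolving the predefined outline associated with a recognized
# marker through a lookup table (a stand-in for the database the text
# mentions). Marker names and sizes are illustrative assumptions.

OUTLINE_DB = {
    "cross":    (320, 240),   # hypothetical marker -> (width, height)
    "triangle": (640, 480),
}

def outline_for_marker(marker_id, db=OUTLINE_DB):
    """Look up the predefined outline for a recognized marker."""
    if marker_id not in db:
        raise KeyError(f"no predefined outline for marker {marker_id!r}")
    return db[marker_id]
```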
[074] While the simulation image 86 is generated so that the simulation image
outline
corresponds to the outline associated with the reference element 84, the
person skilled in
the art will understand that several steps may be executed in order to obtain
the simulation
image 86 having the same outline as that associated with the reference element
84. A first
image having an outline different from the outline associated with the
reference element 84
may be first generated and then resized/rescaled to obtain the simulation
image 86. For
example, a first image having an outline larger than the outline associated
with the
reference element 84 may be first generated and then downsized to obtain the
simulation
image 86 whose outline corresponds to the outline
associated with
the reference element 84, in order to obtain a high resolution for the
simulation image 86
when requested or required for example. In another example, a first image
having an
outline smaller than the outline associated with the reference element 84 may
be first
generated and then expanded to obtain the simulated image 86 of which the
simulation
image outline corresponds to the outline associated with the reference element
84. For
example, such a method for generating the simulation image 86 may be used when
a low
resolution for the simulation image 86 is acceptable or requested.
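The two-step generation described above, rendering a first image larger than the required outline and then downsizing it, can be sketched as follows. The nearest-neighbor sampling and the 2x oversize factor are illustrative assumptions; the disclosure does not specify a resizing method.

```python
# Sketch: render a first image larger than the simulation image outline,
# then downsize it to the outline size. Nearest-neighbor sampling and the
# default factor of 2 are illustrative assumptions.

def downscale(image, target_w, target_h):
    """Nearest-neighbor downscale of a row-major image to the target size."""
    src_h, src_w = len(image), len(image[0])
    return [[image[y * src_h // target_h][x * src_w // target_w]
             for x in range(target_w)]
            for y in range(target_h)]

def render_simulation_image(render_fn, outline_w, outline_h, factor=2):
    """Render at `factor` times the outline size, then downsize."""
    first = render_fn(outline_w * factor, outline_h * factor)
    return downscale(first, outline_w, outline_h)
```

The mirror-image case, rendering smaller and expanding, would trade resolution for rendering cost in the same framework.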
[075] In one embodiment, the simulation image 86 is generated further based on
a given
position. In one embodiment, the given position corresponds to the position of
the reference
element 84 within the initial image 82. This is the case when the reference
element 84
corresponds to a window or porthole, a screen or when the reference element 84
is provided
with a predefined color. This may also be the case when the reference element
84 is a
marker such as a cross or a barcode. For example, the given position may be
the position of
the center of the reference element 84. In another embodiment such as when the
reference
element 84 is a marker, the given position may be independent of the position
of the
reference element 84. In this case, the method 50 further comprises a step of
determining
the given position associated with the simulation image 86 before generating
the simulation
image 86. The given position may be determined using a database storing
predefined given
positions each for a respective marker, using a mapping function or the like.
In an
embodiment in which the reference element 84 is a barcode such as a QR code,
the given
position associated with the simulation image 86 may be encoded in the
barcode. In this
case, the method 50 further comprises a step of extracting the given position
from the
barcode such as the QR code using any adequate algorithm.
[076] Then at step 58, the generated simulation image 86 and the initial image
82 are
combined to create an augmented reality image 88. The initial image 82 then
corresponds
to a background image 82 relative to the simulation image 86 which corresponds
to a
foreground image.
[077] In one embodiment, the steps 56 and 58 are performed concurrently, whereby
the
scene of the simulation image 86 is drawn over the initial image 82 based on
the
determined outline so that the drawn scene has the same outline as that
determined for the
simulation image 86, i.e. the outline associated with the reference element
84. In an
embodiment in which the reference element 84 corresponds to a section of the
initial image
82, the scene is drawn over the reference element 84. In an embodiment in
which the
reference element 84 corresponds to a marker, the scene is drawn according to the
given
position associated with the simulation image 86 which may be the position of
the marker
or a determined position as described above.
[078] In another embodiment depicted in FIG. 4, the step 58 of combining the
simulation
image 86 and the initial image 82 consists in overlaying the initial image 82
with the
simulation image 86, i.e. inserting the simulation image 86 over the reference
element 84
within the initial image 82 to obtain the augmented reality image 88.
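The overlay of step 58 can be sketched as a direct copy of the simulation image over the initial image at the determined position. Anchoring `position` at the top-left corner of the inserted image is an assumption for the example.

```python
# Sketch: step 58 as a direct overlay of the simulation image over the
# initial (background) image. Top-left anchoring of `position` is an
# illustrative assumption.

def overlay(initial, simulation, position):
    """Return a new image with `simulation` drawn over `initial` at (x, y)."""
    x0, y0 = position
    result = [row[:] for row in initial]          # copy the background image
    for dy, sim_row in enumerate(simulation):
        for dx, pixel in enumerate(sim_row):
            result[y0 + dy][x0 + dx] = pixel      # foreground pixel wins
    return result
```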
[079] In a further embodiment depicted in FIG. 5, the step 56 of generating
the
simulation image 86 consists in generating a first image 92 of a scene having
an outline
larger than the outline associated with the simulation image 86, and
then cropping the
first image to obtain the simulation image 86, i.e. rendering transparent some
pixels 94 of
the first image 92 to obtain the simulation image 86. The first generated
image 92 has a
rectangular or square shape and a size chosen so that the outline associated
with the
simulation image 86 would fit thereinto. The content of the first generated
image 92 may be
created according to the given position associated with the simulation image
86 such as the
position of the reference element 84 within the initial image 82. Once the
first generated
image 92 has been generated, some pixels 94 are rendered transparent to obtain
the
simulation image 86. The selection of the transparent pixels 94 is performed
according to
the desired outline for the simulation image 86 so that the remaining pixels
which are not
transparent form a foreground image portion 98 which has the desired outline.
The thus-obtained
simulation image 86 is positioned over the background image 82 so that the
foreground image portion 98 covers the reference element 84 to obtain the
augmented
reality image 88.
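The FIG. 5 variant, rendering a rectangular first image and making the pixels outside the desired outline transparent before compositing, can be sketched as follows. Using `None` to stand in for a fully transparent pixel, and expressing the desired outline as a `keep(x, y)` predicate, are illustrative assumptions.

```python
# Sketch: render-then-mask compositing. Pixels outside the desired
# outline become transparent (None), then only opaque pixels are drawn
# over the background. None-as-transparent is an illustrative assumption.

def apply_outline_mask(first_image, keep):
    """Keep only pixels where keep(x, y) is True; others become transparent."""
    return [[pixel if keep(x, y) else None
             for x, pixel in enumerate(row)]
            for y, row in enumerate(first_image)]

def composite(background, foreground, position):
    """Draw the non-transparent foreground pixels over the background."""
    x0, y0 = position
    result = [row[:] for row in background]
    for dy, row in enumerate(foreground):
        for dx, pixel in enumerate(row):
            if pixel is not None:                 # skip transparent pixels
                result[y0 + dy][x0 + dx] = pixel
    return result
```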
[080] In one embodiment, the first image 92 is generated using the two-step
approach
described above. An image having an outline larger than the outline of the
first image 92
may be first generated, and then downsized to obtain the first image 92.
Alternatively, an
image having an outline smaller than the outline of the first image 92 may be
generated and
then expanded to obtain the first image 92.
[081] In one embodiment, the size/outline of the first image 92 may be
minimized as
long as it contains the outline for the simulation image 86.
[082] Referring back to FIG. 2, once it has been created at step 58, the
augmented reality
image 88 is provided for display at step 60. In one embodiment, the augmented
reality
image 88 is stored in memory. In the same or another embodiment, the augmented
reality
image 88 is transmitted to a display unit to be displayed thereon.
[083] In one embodiment, the display may be a portable display to be worn by
the user
of the simulator. For example, the display may be secured to a helmet to be
worn by the
user.
[084] In one embodiment, the identification of the reference element 84 within
the initial
image 82 is performed by an image analyzer while the augmented reality image
88 is
generated by an image generator subsystem which may comprise an image
generator and
an image combiner as described below. In this case, the method 50 comprises a
step of
transmitting the outline and position for the simulation image 86 from the image
analyzer to the
image generator. In one embodiment, the transmission step consists in
transmitting an
identification of the pixels of the initial image 82 that form the reference
element 84.
[085] In one embodiment, the position of the identified pixels is transmitted
from the
image analyzer to the image generator. For example, a table comprising a line
per identified
pixel and x and y positions per line may be transmitted to the image
generator.
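The pixel table described above, one line per identified pixel carrying its x and y positions, might be serialized as follows for transmission between the image analyzer and the image generator. The comma-separated text format is an illustrative assumption.

```python
# Sketch: serializing and recovering the per-pixel position table
# transmitted from the image analyzer to the image generator. The
# comma-separated text format is an illustrative assumption.

def encode_pixel_table(pixels):
    """Serialize (x, y) pairs, one 'x,y' line per identified pixel."""
    return "\n".join(f"{x},{y}" for x, y in pixels)

def decode_pixel_table(text):
    """Recover the (x, y) pairs from the transmitted table."""
    return [tuple(int(v) for v in line.split(","))
            for line in text.splitlines() if line]
```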
[086] In another embodiment, a channel of the initial image 82 other than a
color
channel is used for identifying the pixels that form the reference element 84.
In this case,
the channel value of the pixels identified as forming the reference element 84
is changed to
a predefined value, thereby obtaining a modified initial image. The step of
transmitting the
information about the reference element 84 to the image generator subsystem
then consists
in transmitting the modified initial image to the image generator.
[087] In one embodiment, the image channel used for transmitting the
information about
the reference element 84 is an alpha channel, a stencil channel, a depth
channel or the like.
[088] In a further embodiment, a color channel such as an RGB channel or a
CMYK
channel may be used for transmitting the information about the reference
element 84. For
example, if a particular color channel of the image is not used, this
particular color channel
may be used for transmitting the information about the reference element 84.
For example,
if an image does not contain any red, the red color channel may be used for
transmitting the
information about the reference element. For example, a predefined value may
be assigned
in the red color channel for each pixel representing the reference element.
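The channel-based transmission described above, writing a predefined value into an otherwise-unused channel for each reference-element pixel, can be sketched as follows. The sentinel value of 255, the choice of the red channel, and the RGB-tuple image layout are illustrative assumptions.

```python
# Sketch: tagging reference-element pixels in an unused red channel and
# recovering them on the image generator side. SENTINEL and the RGB-tuple
# layout are illustrative assumptions.

SENTINEL = 255  # assumed predefined value written into the unused channel

def tag_reference_pixels(image, reference_pixels):
    """Set the red channel to SENTINEL for each reference-element pixel."""
    tagged = [[pixel for pixel in row] for row in image]
    for x, y in reference_pixels:
        _, g, b = tagged[y][x]
        tagged[y][x] = (SENTINEL, g, b)
    return tagged

def recover_reference_pixels(image):
    """Find the pixels whose red channel equals SENTINEL."""
    return [(x, y)
            for y, row in enumerate(image)
            for x, (r, _, _) in enumerate(row)
            if r == SENTINEL]
```

The same pattern would apply to an alpha, stencil or depth channel; only the channel index and sentinel value change.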
[089] In an embodiment in which the camera captures the field of view of the
user of the
simulator, the method 50 further comprises a step of receiving the line of
sight of the user.
In one embodiment, the line of sight of the user is represented by the
position and
orientation of the head of the user.
[090] In one embodiment, the method 50 may further comprise a step of tracking
the line
of sight of the user. It will be understood that any adequate tracking device
for determining
the line of sight of the user or the position and orientation of the user's
head may be used.
[091] When the camera captures the field of view of the user, the step 56 of
generating
the simulation image 86 is further performed according to the line of sight of
the user.
[092] In one embodiment, the method 50 is performed in real-time. For example,
the
method 50 may generate augmented reality images by combining real-time
captured video
images and real-time simulation rendered images.
[093] In one embodiment, the above-described method 50 may be embodied as a
computer program product for generating an augmented reality image, the
computer
program product comprising a computer readable memory storing computer
executable
instructions thereon that when executed by a computer perform the steps of the
above
described method 50.
[094] In another embodiment, the above-described method 50 may be performed by
a
system for generating an augmented reality image, the system comprising a
communication
unit for at least one of receiving and transmitting data, a memory and at
least one
processing unit configured for executing the steps of the above described
method 50.
[095] FIG. 6 illustrates one embodiment of a system 100 for generating an
augmented
reality image. The system 100 comprises an image analyzer 102 and an augmented
reality
image generator 104. In one embodiment, the image analyzer 102 and the
augmented
reality image generator 104 are each provided with a respective processor
or processing
unit, a respective memory and respective communication means. In another
embodiment,
the image analyzer 102 and the augmented reality image generator 104 share a
same
processor or processing unit, a same memory and/or same communication means.
[096] The image analyzer 102 is configured for receiving an initial image such
as the
initial image 82. As described above, the initial image 82 comprises at least
a representation
of a reference element such as the reference element 84. The image analyzer
102 is
configured for identifying the reference element within the initial image 82
and determining
the outline and position for the simulation image 86 to be generated using the
method
described above.
[097] As described above, the outline and/or the position for the simulation
image 86
may correspond to the outline and/or position of the reference element 84. In
this case, the
image analyzer 102 is configured for determining the outline and/or the
position of the
reference element 84. In another embodiment, the outline and/or the position
for the
simulation image 86 is independent of the outline and/or position of the
reference element
84. In this case the image analyzer 102 is configured for determining the
outline and/or
position for the simulation image 86 based on the reference element 84 such as
based on a
QR code, as described above.
[098] In one embodiment, the initial image 82 is a video frame of a video. In
one
embodiment, the camera used for taking the initial image 82 may have a fixed
or static
position. In another embodiment, the camera may have a dynamic position. For
example,
the camera may be attached to a user in order to capture the field of view of
the user.
[099] In an embodiment in which the reference element 84 is provided with a
predefined
shape such as when the reference element 84 is a marker, a window or a screen,
the image
analyzer 102 is configured for performing shape recognition in order to
identify the
reference element 84 within the initial image 82, i.e. identifying the outline
and/or position
within the initial image 82 of the reference element 84. It will be understood
that any
adequate method for identifying a predefined shape within an image may be
used. For
example, spectral shape analysis may be used. In another example, pattern
recognition
using artificial neural networks and deep learning may be performed by the
image
analyzer 102. In this embodiment, the predefined shape of the reference
element 84 is
stored in memory. The image analyzer 102 detects and identifies the shape of
the elements
contained in the initial image 82 and compares the identified shapes to the
predefined shape
stored in memory to identify the reference element 84.
[100] In an embodiment in which the reference element 84 is provided with a
predefined
color such as a predefined green color or a predefined blue color, the image
analyzer 102 is
configured for comparing the color of each pixel of the initial image 82 to
the predefined
color in order to identify at least the outline of the reference element 84
within the initial
image 82. When the image analyzer 102 determines that the color of a given
pixel
substantially corresponds to the predefined color, the given pixel is
considered as belonging
to the reference element 84 to be identified. The position of the reference
element 84 within
the initial image 82 and the outline of the reference element 84 are then
obtained from the
position within the initial image 82 of the pixels that are identified as
having the predefined
color.
[101] In an embodiment in which the outline and/or position within the
augmented reality image for the simulation image 86 correspond to
the outline
and/or position within the initial image 82 of the reference element 84, the
image analyzer
102 is configured for outputting the outline and/or position within the
initial image 82 of
the reference element 84.
[102] In an embodiment in which the outline for the simulation image 86 is
independent
of the outline of the reference element 84 such as when the reference element
84 is a
marker, the image analyzer 102 is configured for determining the outline for
the simulation
image 86 using the above described method. For example, when the reference
element 84 is
a QR code and the outline for the simulation image 86 is encoded into the QR
code, the
image analyzer 102 may be configured for extracting the outline for the
simulation image
86 using any adequate algorithm. In another example, the outline for the
simulation image
86 may be determined using a database as described above.
[103] In an embodiment in which the position for the simulation image 86
within the
augmented reality image is independent of the position of the reference
element 84 within
the initial image 82, the image analyzer 102 is configured for determining the
position for
the simulation image 86 within the augmented reality image based on the
reference element
84. For example, when the reference element 84 is a QR code and the position
for the
simulation image 86 is encoded into the QR code, the image analyzer 102 may be
configured for extracting the position for the simulation image 86 from the QR
code using
any adequate algorithm. In another example, the position for the simulation
image 86 may
be retrieved from a database as described above.
[104] In one embodiment, the transmission step consists in transmitting an
identification
of the pixels of the initial image 82 that correspond to the reference element
84 from which
the outline of the simulation image 86 and the desired position of the
simulation image 86
within the augmented reality image may be determined. For example, a table
comprising a
line per identified pixel and x and y positions per line may be transmitted to
the augmented
reality image generator 104.
[105] In another embodiment, a given channel of the initial image 82 other
than a color
channel is used for identifying the pixels that form the reference element. In
this case, the
image analyzer 102 is further configured for identifying the pixels belonging
to the
reference element 84 or the outline of the reference element 84 within the
given channel.
For example, the image analyzer 102 may assign a predefined value to each
pixel
associated with the reference element 84 within the given channel of the
initial image 82.
The image analyzer 102 then transmits the initial image 82 having the modified
given
channel to the augmented reality image generator 104. For example, the channel
to be
modified to identify the pixels belonging to the reference element 84 may be
an alpha
channel, a stencil channel, a depth channel or the like.
[106] In a further embodiment, a color channel such as an RGB channel or a
CMYK
channel may be used for identifying the pixels representing the reference
element 84 within
the initial image 82. For example, if a particular color channel of the
initial image 82 is not
used, the image analyzer 102 may assign a predefined value to each pixel
associated with
the reference element 84 within the unused color channel.
[107] The augmented reality image generator 104 is configured for receiving
the initial
image, generating a simulation image 86 and combining the simulation image 86
and the
initial image 82 to obtain the augmented reality image. The simulation image
86 may be an
image of a scene. The initial image 82 then corresponds to a background image
relative to
the simulation image 86, i.e. the simulation image 86 is to be
superimposed/overlaid/drawn
over the initial image 82.
[108] The augmented reality image generator 104 receives the information about
the
simulation image 86, i.e. the outline for the simulation image 86 and the
position of the
simulation image 86 within the augmented reality image, and generates the
simulation
image 86 using the received information.
[109] In an embodiment in which the information about the reference element 84
is
provided by the identification of the pixels that represent the reference
element 84 within
the initial image 82 or the outline of the reference element 84, the augmented
reality image
generator 104 determines the position of the identified pixels within the
initial image 82 to
determine the outline and position within the initial image 82 of the
reference element 84,
which correspond to the outline and position within the augmented reality
image for the
simulation image 86. The augmented reality image generator 104 then generates
the
simulation image 86 as a function of the outline and combines the generated
simulation
image 86 within the initial image 82 at the received position.
[110] In an embodiment in which a given channel of the initial image is
modified to
identify the pixels that represent the reference element 84, the augmented
reality image
generator 104 compares the values of each pixel for the given channel to a
predefined value
and identifies the pixels that represent the reference element 84 based on the
comparison.
Pixels having a given channel value substantially equal to the predefined
value are
considered as representing the reference element 84. The outline and position
for the
simulation image 86 are then given by the position of the identified pixels
within the given
channel.
[111] It will be understood that when a given channel of the initial image is
used for
transmitting the information about the reference element 84, the augmented
reality image
generator 104 receives the initial image 82 having the given channel. In an
embodiment in
which the transmission of the information about the reference element 84 is
independent of
the initial image 82, e.g. when a table comprising the position of the
identified pixels is
transmitted to the augmented reality image generator 104, it will be
understood that the
initial image 82 is further sent to the augmented reality image generator 104.
[112] In one embodiment, the step of combining the simulation image 86 of the
scene
with the initial image 82 consists in drawing over the initial image, as
illustrated in FIG. 3.
In this case, the augmented reality image generator 104 may be configured for
drawing a
scene or a portion of a scene over the reference element 84 within the initial
image 82 to
obtain the augmented reality image. The scene to be drawn, i.e. the content,
the outline and
position within the initial image of the scene to be drawn, is determined
according to the
outline of the reference element 84 and the position of the reference element
84 within the
initial image 82.
[113] In another embodiment, the augmented reality image generator 104 is
configured
for creating a simulation image 86 of a scene according to the outline and
position
determined for the simulation image 86. As described above, the outline and/or
position for
the simulation image 86 may correspond to the outline and/or position of the
reference
element 84. In another embodiment, the outline and/or position for the
simulation image 86
may not correspond to the outline and/or position of the reference element 84,
as described
above. The augmented reality image generator 104 is further configured for
inserting or
positioning the generated simulation image 86 over the reference element in
the initial
image 82 to obtain the augmented reality image.
[114] It will be understood that the size of the simulation image 86 is less
than or equal
to that of the initial image 82.
[115] In one embodiment, the augmented reality image generator 104 is
configured for
generating a first image 92 of a scene, then rendering transparent some pixels
94 of the first
image to obtain the simulation image 86 and combining the thus-obtained
simulation image
86 with the initial image 82, i.e. superimposing/overlaying the simulation
image 86 over the
reference element 84 within the initial image 82, as illustrated in FIG. 5.
The first generated
image 92 may have a predefined shape such as a rectangular shape and the
augmented
reality image generator 104 is configured for adequately determining the
size/outline of the
first image 92 so that the outline for the simulation image 86 fits into the
first image 92.
The augmented reality image generator 104 then creates a first scene for the
first image 92
according to the position for the simulation image 86 within the augmented
reality image.
Once the first image 92 has been created, the augmented reality image
generator 104
renders some pixels 94 of the first image 92 transparent so that the outline
of the resulting
image corresponds to the outline for the simulation image 86. The resulting
image then
corresponds to the simulation image 86. The augmented reality image generator
104
identifies the pixels to be rendered transparent based on the outline for the
simulation image
86 and optionally the position associated with the simulation image 86 so that
the
remaining pixels which are not rendered transparent form the desired
simulation image 86
which has the desired outline.
[116] In one embodiment, the augmented reality image generator 104 may be
configured
to minimize the size of the first image 92 as long as it can contain the outline
for the
simulation image 86 therein.
[117] Once generated, the augmented reality image is provided by the augmented
reality
image generator 104 for display. For example, the augmented reality image
generator 104
may store the augmented reality image in memory. In the same or another
example, the
augmented reality image generator 104 may transmit the augmented reality image
to a
display unit to be displayed thereon.
[118] As described above, the display may be a portable display to be worn in
front of
the eyes of the user of the simulator. For example, the display may be secured
to a helmet
to be worn by the user.
[119] As described above, the initial image may be an image captured by a
camera. In
one embodiment, the camera may have a fixed position and a fixed field of
view. In another
embodiment, the camera may be a camera configured for capturing the field of
view of the
user of a simulator. For example, the camera may be fixed to the head of the
user such as
on a helmet worn by the user, to capture the field of view of the user. In
this case, the
augmented reality image generator 104 is further configured for receiving the
line of sight
of the user and generating the simulation image based on the line of sight of
the user.
[120] In one embodiment, the line of sight of the user may be determined from
the
position and orientation of the head of the user. In this case, the augmented
reality image
generator 104 may be configured for receiving the position and orientation of
the head of
the user and determining the line of sight of the user.
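As a sketch of how the line of sight might be derived from the tracked head pose, the following assumes a simple yaw/pitch convention; a real tracking device would typically report a quaternion or rotation matrix, and all names here are illustrative.

```python
import math

def line_of_sight(head_position, yaw_deg, pitch_deg):
    """Return the line of sight as (origin, unit direction) computed from
    the head position and orientation. Yaw rotates about the vertical
    axis; pitch tilts the gaze above or below the horizon."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    direction = (
        math.cos(pitch) * math.cos(yaw),  # forward (x)
        math.cos(pitch) * math.sin(yaw),  # lateral (y)
        math.sin(pitch),                  # vertical (z)
    )
    return head_position, direction

# Head 1.7 m above the floor, turned 90 degrees, looking at the horizon.
origin, direction = line_of_sight((0.0, 0.0, 1.7), yaw_deg=90.0, pitch_deg=0.0)
```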
[121] It will be understood that any adequate tracking device for determining
the line of
sight of the user or the position and orientation of the user's head may be
used, and that the
tracking device and the camera may be part of the system 100.
[122] In one embodiment, the system 100 operates in substantially real-time.
[123] While in the system 100 the generation of the simulation image 86 and the
combination of the simulation image 86 with the initial image 82 are performed by the same
component/unit, i.e. the augmented reality image generator 104, it will be understood that
the simulation image generation and the combination step may be performed by two
separate components/units, as illustrated in FIG. 7.
[124] FIG. 7 illustrates one embodiment of a system 150 to be used for
training a user in
a simulator. The simulator may be a vehicle simulator such as an aircraft
simulator, a
helicopter simulator, a tank simulator, an infantry fighting vehicle
simulator, or the like.
The simulator comprises at least one reference element 84 such as at least one
green or blue
screen or a barcode. For example, the reference element 84 may be used for
simulating a
window present in the simulated vehicle.
[125] The system 150 comprises the image analyzer 102, a simulation engine 154
for
generating simulation images and an image combiner 156 for generating
augmented reality
images. The simulation engine 154 comprises or is in communication with a
database
having stored thereon at least topography information about the simulated
terrain and
simulated structures such as buildings, walls, trees, bridges, and moving
entities such as
other vehicles, landable ships, and/or the like. For example, the database may
contain
information such as the position information, dimension information,
information about the
material from which a structure is made, and/or the like.
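One record of the kind such a database could hold might look like the sketch below; the class, field names, and sample entries are purely illustrative assumptions, not the patent's data model.

```python
from dataclasses import dataclass

@dataclass
class SimulatedStructure:
    """An illustrative database record for a simulated structure:
    position, dimensions and material, as enumerated above."""
    name: str
    position: tuple    # (x, y, z) world coordinates, in metres
    dimensions: tuple  # (width, depth, height), in metres
    material: str

# A toy database with two simulated structures.
database = [
    SimulatedStructure("bridge-01", (120.0, 45.0, 0.0), (80.0, 12.0, 15.0), "steel"),
    SimulatedStructure("wall-03", (10.0, 5.0, 0.0), (30.0, 0.5, 3.0), "concrete"),
]
```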
[126] In an embodiment in which the outline and/or position for the simulation
image 86
is different from the outline and/or the position of the reference element 84
within the
initial image 82, the database comprises a respective outline and/or a
respective position
within the augmented reality image for each possible reference element 84.
[127] The simulation engine 154 is configured for receiving from the image
analyzer
102 the information about the simulation image 86, i.e. the outline for the
simulation image
86 and optionally the position of the simulation image 86, and generating the
scene to be
displayed in replacement of the reference element 84 of the initial image 82,
i.e. it is
configured for generating the simulation image 86 following the process
described above
with respect to the augmented reality image generator 104. The simulation
engine 154 is
further configured for transmitting the generated simulation image 86 to the
image
combiner 156.
[128] The image combiner 156 is configured for receiving the initial image 82,
the
position for the simulation image and the simulation image generated by the
simulation
engine 154 and combining the initial image 82 and the simulation image 86. For
example,
the image combiner 156 may superimpose the simulation image 86 over the
reference
element 84 in the initial image 82 to obtain the augmented reality image. The
image
combiner 156 is further configured for outputting the augmented reality image,
i.e.
providing the augmented reality image for display. In one embodiment, the
image combiner
156 stores the generated augmented reality image in memory from which it may
be
accessed by a display for display purposes. In the same or another embodiment,
the image
combiner 156 is configured for transmitting the generated augmented reality
image to a
display to be displayed thereon. For example, the display may be a wearable
display worn
by the user of the simulator.
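The superimposition performed by the image combiner 156 can be sketched as plain alpha compositing. The array conventions here (an RGB initial image, an RGBA simulation image, a (row, column) position) are assumptions for illustration only.

```python
import numpy as np

def combine(initial_image, simulation_image, position):
    """Superimpose an RGBA simulation image over an RGB initial image at
    the given (row, col) position. Transparent simulation pixels let the
    initial image show through (illustrative sketch)."""
    result = initial_image.copy()
    row, col = position
    h, w = simulation_image.shape[:2]
    # Per-pixel opacity in [0, 1], kept as a (h, w, 1) array for broadcasting.
    alpha = simulation_image[..., 3:4].astype(np.float32) / 255.0
    background = result[row:row + h, col:col + w].astype(np.float32)
    blended = alpha * simulation_image[..., :3] + (1.0 - alpha) * background
    result[row:row + h, col:col + w] = blended.astype(np.uint8)
    return result

# Black 4x4 initial image; 2x2 fully opaque red simulation image at top-left.
initial = np.zeros((4, 4, 3), dtype=np.uint8)
sim = np.zeros((2, 2, 4), dtype=np.uint8)
sim[..., 0] = 255  # red
sim[..., 3] = 255  # fully opaque
augmented = combine(initial, sim, (0, 0))
```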
[129] In an embodiment in which a first image 92 is generated and some pixels 94 of the
generated first image 92 are rendered transparent to obtain the simulation image 86, the
simulation engine 154 may be configured to generate the first image 92, render
some of the pixels 94 of the first image 92 transparent to obtain the simulation image 86,
and transmit the simulation image 86 to the image combiner 156. In another
embodiment,
the simulation engine 154 is configured for generating the first image 92 and
transmitting
the first image 92 to the image combiner 156. The image combiner 156 is then
configured
for receiving the first image 92 and rendering transparent some of the pixels
94 of the first
image 92 to obtain the simulation image 86, as described above.
[130] FIG. 8 is a block diagram illustrating an exemplary processing module
200 for
executing the steps 52 to 58 of the method 50, in accordance with some
embodiments. The
processing module 200 typically includes one or more Central Processing Units (CPUs)
and/or Graphics Processing Units (GPUs) 202 for executing modules or programs
and/or
instructions stored in memory 204 and thereby performing processing
operations, memory
204, and one or more communication buses 206 for interconnecting these
components. The
communication buses 206 optionally include circuitry (sometimes called a
chipset) that
interconnects and controls communications between system components. The
memory 204
includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other
random access solid state memory devices, and may include non-volatile memory,
such as
one or more magnetic disk storage devices, optical disk storage devices, flash
memory
devices, or other non-volatile solid state storage devices. The memory 204
optionally
includes one or more storage devices remotely located from the CPU(s) 202. The
memory
204, or alternately the non-volatile memory device(s) within the memory 204,
comprises a
non-transitory computer readable storage medium. In some embodiments, the
memory 204,
or the computer readable storage medium of the memory 204, stores the following
programs, modules, and data structures, or a subset thereof:
an image analyzer module 210 for receiving an initial image, identifying a
reference element in the initial image and determining an outline and a position for a
simulation image of a scene to be generated, as described above;
an image generator module 212 for generating the simulation image of the
scene using at least the outline associated with the simulation image; and
a combiner module 214 for combining the simulation image and the initial
image using the position associated with the simulation image to obtain an augmented
reality image, and providing the augmented reality image for display.
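The three modules above can be wired into a single per-frame pipeline; in this sketch each module is stood in for by a plain callable, which is an assumption made only to show the data flow.

```python
def generate_ar_frame(initial_image, analyze, generate, combine):
    """Run one frame through the three modules in order:
    analyzer (210) -> generator (212) -> combiner (214)."""
    outline, position = analyze(initial_image)       # image analyzer module 210
    simulation_image = generate(outline)             # image generator module 212
    return combine(initial_image, simulation_image, position)  # combiner module 214

# Stub modules, just to exercise the data flow.
frame = generate_ar_frame(
    "initial",
    analyze=lambda img: ("outline", (0, 0)),
    generate=lambda outline: "simulation",
    combine=lambda img, sim, pos: (img, sim, pos),
)
```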
[131] Each of the above identified elements may be stored in one or more of
the
previously mentioned memory devices and corresponds to a set of instructions
for
performing a function described above. The above identified modules or
programs (i.e., sets
of instructions) need not be implemented as separate software programs,
procedures or
modules, and thus various subsets of these modules may be combined or
otherwise re-arranged in various embodiments. In some embodiments, the memory 204 may store a
subset of the modules and data structures identified above. Furthermore, the memory 204
may store additional modules and data structures not described above.
[132] Although it shows a processing module 200, FIG. 8 is intended more as a functional
description of the various features which may be present in a processing module than as a
structural schematic of the embodiments described herein. In practice, and as
recognized by
the person skilled in the art, items shown separately could be combined and
some items
could be separated. For example, the augmented reality image generator module
may be
split into two modules, i.e. a simulation engine module and an image combiner
module.
[133] The embodiments of the invention described above are intended to be
exemplary
only. The scope of the invention is therefore intended to be limited solely by
the scope of
the appended claims.
Administrative Status


Event History

Description Date
Letter Sent 2022-04-05
Grant by Issuance 2022-04-05
Inactive: Cover page published 2022-04-04
Inactive: Final fee received 2022-02-02
Pre-grant 2022-02-02
Notice of Allowance is Issued 2022-01-10
Letter Sent 2022-01-10
Inactive: Q2 passed 2022-01-07
Inactive: Approved for allowance (AFA) 2022-01-07
Letter Sent 2021-12-08
Amendment Received - Response to Examiner's Requisition 2021-12-06
Amendment Received - Voluntary Amendment 2021-12-06
Inactive: Single transfer 2021-11-23
Common Representative Appointed 2021-11-13
Examiner's Report 2021-08-13
Inactive: Report - No QC 2021-08-13
Amendment Received - Response to Examiner's Requisition 2021-08-04
Amendment Received - Voluntary Amendment 2021-08-04
Examiner's Report 2021-04-07
Inactive: Report - No QC 2021-04-06
Application Published (Open to Public Inspection) 2021-03-11
Inactive: Cover page published 2021-03-10
Advanced Examination Determined Compliant - paragraph 84(1)(a) of the Patent Rules 2021-01-14
Inactive: Office letter 2021-01-14
Letter sent 2021-01-14
Inactive: IPC assigned 2021-01-12
Inactive: First IPC assigned 2021-01-12
Inactive: IPC assigned 2021-01-12
Inactive: IPC assigned 2021-01-12
Inactive: IPC assigned 2021-01-12
Filing Requirements Determined Compliant 2021-01-11
Letter sent 2021-01-11
Letter Sent 2021-01-08
Common Representative Appointed 2020-12-22
Request for Examination Requirements Determined Compliant 2020-12-22
Inactive: Advanced examination (SO) fee processed 2020-12-22
All Requirements for Examination Determined Compliant 2020-12-22
Application Received - Regular National 2020-12-22
Inactive: QC images - Scanning 2020-12-22

Abandonment History

There is no abandonment history.

Fee History

Fee Type Anniversary Year Due Date Paid Date
Request for examination - standard 2024-12-23 2020-12-22
Advanced Examination 2020-12-22 2020-12-22
Application fee - standard 2020-12-22 2020-12-22
Registration of a document 2021-11-23
Final fee - standard 2022-05-10 2022-02-02
MF (patent, 2nd anniv.) - standard 2022-12-22 2022-11-02
MF (patent, 3rd anniv.) - standard 2023-12-22 2023-10-10
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
CAE INC
Past Owners on Record
ALEXANDRE MILLETTE
SAMUEL BERUBE
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Cover Page 2022-03-06 1 35
Description 2020-12-21 27 1,380
Claims 2020-12-21 6 217
Drawings 2020-12-21 8 303
Abstract 2020-12-21 1 15
Cover Page 2021-02-01 2 36
Representative drawing 2021-02-01 1 6
Description 2021-08-03 27 1,371
Claims 2021-08-03 6 224
Claims 2021-12-05 6 224
Representative drawing 2022-03-06 1 6
Courtesy - Acknowledgement of Request for Examination 2021-01-07 1 433
Courtesy - Filing certificate 2021-01-10 1 578
Courtesy - Certificate of registration (related document(s)) 2021-12-07 1 365
Commissioner's Notice - Application Found Allowable 2022-01-09 1 570
Electronic Grant Certificate 2022-04-04 1 2,527
New application 2020-12-21 8 278
Courtesy - Office Letter 2021-01-13 2 233
Courtesy - Advanced Examination Request - Compliant (SO) 2021-01-13 1 174
Examiner requisition 2021-04-06 6 314
Amendment / response to report 2021-08-03 21 773
Examiner requisition 2021-08-12 3 155
Amendment / response to report 2021-12-05 18 620
Final fee 2022-02-01 5 140