Patent 3056831 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract is posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 3056831
(54) English Title: SYSTEM AND METHOD FOR VALIDATING GEOSPATIAL DATA COLLECTION WITH MEDIATED REALITY
(54) French Title: SYSTEME ET METHODE POUR VALIDER LA COLLECTE DE DONNEES GEOSPATIALES AU MOYEN DE REALITE ELECTRONIQUE
Status: Granted and Issued
Bibliographic Data
Abstracts

English Abstract

There is provided a system and method of validating geospatial object data with mediated reality. The method includes: receiving an object definition associated with a geospatial object, the object definition comprising a type of object and a position; displaying a visual representation of the geospatial object to a user relative to a corresponding geospatial object located in a physical scene; and receiving input validating a placement of the visual representation relative to the corresponding geospatial object located in the physical scene.


French Abstract

Il est décrit un système et une méthode servant à valider des données sur des objets géospatiaux au moyen de la réalité électronique. La méthode comprend : recevoir une définition d'objet associée à un objet géospatial, la définition d'objet comprenant un type d'objet et une position; afficher une représentation visuelle de l'objet géospatial pour un utilisateur par rapport à un objet géospatial correspondant situé dans une scène physique; et recevoir des entrées validant l'emplacement d'une représentation visuelle par rapport à l'objet géospatial correspondant situé dans la scène physique.

Claims

Note: Claims are shown in the official language in which they were submitted.


Application # 3,056,831
Amendment dated Sept. 27, 2021

CLAIMS

1. A computer-implemented method of positioning a geospatial object with a mediated reality device, the method comprising:
   collecting a geospatial object comprising:
      receiving, from a global navigation satellite systems (GNSS) receiver, GNSS position data based on wireless signals received from a GNSS satellite;
      determining, based on the GNSS position data, a position of a geospatial object;
      generating an object definition associated with the geospatial object, the object definition comprising the position of the geospatial object and a type of the geospatial object; and
      storing the object definition in a data storage; and
   immediately after collecting the geospatial object, displaying and verifying the position of the geospatial object, by:
      determining an orientation of the mediated reality device in a physical scene;
      determining relative positioning of the geospatial object in the physical scene based on the position of the geospatial object relative to the orientation of the mediated reality device; and
      displaying a visual representation of the geospatial object to a user on the mediated reality device using the relative positioning of the geospatial object located in the physical scene for the user to verify positioning of the geospatial object; and
   receiving input confirming a placement of the visual representation relative to the corresponding geospatial object located in the physical scene.

2. The method of claim 1, wherein the position of the geospatial object in the object definition comprises latitude, longitude, and elevation.

3. The method of claim 2, wherein the orientation of the mediated reality device in the physical scene comprises latitude, longitude, elevation, and direction.

4. The method of claim 1, wherein the visual representation comprises a virtual model of the geospatial object based on the type of object in the object definition.

5. The method of claim 2, wherein the position of the geospatial object in the object definition comprises a point associated with the geospatial object.

6. The method of claim 5, wherein the position of the geospatial object is determined using both global navigation satellite systems (GNSS) and real-time kinematic (RTK) positioning.

7. The method of claim 1, wherein receiving input confirming the placement of the visual representation of the geospatial object comprises receiving a confirmatory input from a user.

8. The method of claim 1, wherein receiving input confirming the placement of the visual representation of the geospatial object comprises receiving a confirmatory output from machine vision and artificial intelligence techniques.

9. The method of claim 5, further comprising recording, in the object definition, the confirmed position associated with the geospatial object.

10. A system of positioning a geospatial object with a mediated reality device, the system comprising one or more processors and data storage memory in communication with the one or more processors, the one or more processors configured to execute:
   a recordation module to collect a geospatial object by receiving, from a global navigation satellite systems (GNSS) receiver, GNSS position data based on wireless signals received from a GNSS satellite and determine, based on the GNSS position data, a position of a geospatial object;
   an object module to generate and store an object definition associated with the collected geospatial object in a data storage, the object definition comprising the position of the geospatial object and a type of the geospatial object;
   a position module to determine an orientation of the mediated reality device in a physical scene and to determine relative positioning of the geospatial object in the physical scene based on the position of the geospatial object relative to the orientation of the mediated reality device;
   a display module to, immediately after collecting the geospatial object, display a visual representation of the geospatial object to a user using the relative positioning of the geospatial object located in the physical scene for the user to verify positioning of the geospatial object; and
   a validation module to receive input confirming a placement of the visual representation relative to the corresponding geospatial object located in the physical scene.

11. The system of claim 10, wherein the position of the geospatial object in the object definition comprises latitude, longitude, and elevation.

12. The system of claim 11, wherein the orientation of the mediated reality device in the physical scene comprises latitude, longitude, elevation, and direction.

13. The system of claim 12, wherein the visual representation comprises a virtual model of the geospatial object based on the type of object in the object definition.

14. The system of claim 11, wherein the position of the geospatial object in the object definition comprises a point associated with the geospatial object.

15. The system of claim 12, wherein the position of the geospatial object is determined using both global navigation satellite systems (GNSS) and real-time kinematic (RTK) positioning.

16. The system of claim 10, wherein receiving the input confirming the placement of the visual representation of the geospatial object comprises receiving a confirmatory input from a user via an input device.

17. The system of claim 10, wherein receiving input confirming the placement of the visual representation of the geospatial object comprises receiving a confirmatory output from machine vision and artificial intelligence techniques.

18. The system of claim 14, wherein the recordation module further records, in the object definition, the confirmed position associated with the geospatial object.
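For orientation only, here is a minimal Python sketch of the collect-then-verify sequence recited in claim 1. Every name, stub, and value below is hypothetical; the patent does not prescribe this code.

```python
# Hypothetical sketch of claim 1's collect-then-verify loop. The GNSS read,
# pose estimation, and rendering are stubbed out; this only illustrates the
# claimed sequence of steps, not the patented implementation.
from dataclasses import dataclass

@dataclass
class ObjectDefinition:
    object_type: str
    lat: float
    lon: float
    elev: float

data_storage: list[ObjectDefinition] = []  # 'data storage' per claim 1

def collect_object(object_type: str, gnss_fix) -> ObjectDefinition:
    """Generate and store an object definition from a GNSS fix."""
    lat, lon, elev = gnss_fix
    definition = ObjectDefinition(object_type, lat, lon, elev)
    data_storage.append(definition)
    return definition

def render_overlay(definition, device_pose):
    """Stub for the mediated reality view of the collected object."""
    print(f"overlaying {definition.object_type} at "
          f"({definition.lat}, {definition.lon}, {definition.elev})")

def verify_object(definition: ObjectDefinition, device_pose, confirm) -> bool:
    """Immediately after collection, display the object relative to the
    device orientation and ask for confirmation of its placement."""
    render_overlay(definition, device_pose)
    return confirm()  # user input or machine-vision output

verified = verify_object(
    collect_object("manhole", (43.65320, -79.38320, 112.4)),
    device_pose=None,
    confirm=lambda: True,  # stand-in for the confirming input
)
print("placement confirmed" if verified else "discrepancy flagged")
```

The point of the sequence is that verification happens immediately after collection, while the surveyor is still standing at the object.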

Description

Note: Descriptions are shown in the official language in which they were submitted.


SYSTEM AND METHOD FOR VALIDATING GEOSPATIAL DATA COLLECTION WITH MEDIATED REALITY

TECHNICAL FIELD

[0001] The following relates generally to geospatial data management; and more particularly, to systems and methods for validating geospatial data collection with mediated reality.
BACKGROUND

[0002] Surveying firms, mapping firms, municipalities, public utilities, and many other entities collect, store, use, and disseminate vast amounts of geospatial data. This geospatial data can be used to manage daily operations and conduct mission-critical tasks; for example, asset maintenance, construction plan design, and zoning proposals, among many others. Traditionally, geospatial data is collected using manual measurements (offsets) from detectable local landscape features; for example, a curb line. The collected measurements would then be plotted on a map to indicate object/asset locations, and the maps could then be reprinted for use in the field. While much of this geospatial data can be digitized, the accuracy and quality of such digital representations may affect the tasks and applications that rely on such data. In other approaches, location tools, such as global navigation satellite systems (GNSS) and/or real-time kinematic (RTK) positioning, can be used to collect digital geospatial data. These approaches generally require cumbersome, unsophisticated, and time-consuming validation techniques.
SUMMARY

[0003] In an aspect, there is provided a computer-implemented method of validating geospatial object data collection with mediated reality, the method comprising: receiving an object definition associated with a geospatial object, the object definition comprising a type of object and a position; displaying a visual representation of the geospatial object to a user relative to a corresponding geospatial object located in a physical scene; and receiving input validating a placement of the visual representation relative to the corresponding geospatial object located in the physical scene.

[0004] In a particular case of the method, the position of the geospatial object in the object definition comprises latitude and longitude.

[0005] In another case of the method, the position of the geospatial object in the object definition comprises a point associated with the geospatial object.

[0006] In yet another case of the method, the position of the geospatial object is determined using at least one of global navigation satellite systems (GNSS) and real-time kinematic (RTK) positioning.

[0007] In yet another case of the method, receiving input validating the placement of the visual representation comprises receiving a confirmatory input from a user.

[0008] In yet another case of the method, receiving input validating the placement of the visual representation comprises receiving a confirmatory output from machine vision and artificial intelligence techniques.

[0009] In yet another case of the method, the method further comprises recording, in the object definition, the validated position of the geospatial object.

[0010] In yet another case of the method, the object definition further comprises one or more attributes associated with the geospatial object, and the method further comprises receiving input validating one or more of the attributes associated with the geospatial object.

[0011] In yet another case of the method, the method further comprises recording, in the object definition, the one or more validated attributes associated with the geospatial object.
[0012] In yet another case of the method, the method further comprises associating an image of the physical scene with the validated position.

[0013] In another aspect, there is provided a system of validating geospatial object data collection with mediated reality, the system comprising one or more processors and data storage memory in communication with the one or more processors, the one or more processors configured to execute: an object module to receive an object definition associated with a geospatial object, the object definition comprising a type of object and a position; a display module to display a visual representation of the geospatial object to a user relative to a corresponding geospatial object located in a physical scene; and a validation module to receive input validating a placement of the visual representation relative to the corresponding geospatial object located in the physical scene.

[0014] In a particular case of the system, the position of the geospatial object in the object definition comprises latitude and longitude.

[0015] In another case of the system, the position of the geospatial object in the object definition comprises a point associated with the geospatial object.

[0016] In yet another case of the system, the position of the geospatial object is determined using at least one of global navigation satellite systems (GNSS) and real-time kinematic (RTK) positioning.

[0017] In yet another case of the system, receiving the input validating the placement of the visual representation comprises receiving a confirmatory input from a user via an input device.

[0018] In yet another case of the system, receiving input validating the placement of the visual representation comprises receiving a confirmatory output from machine vision and artificial intelligence techniques.

[0019] In yet another case of the system, the system further comprises a recordation module to record, in the object definition, the validated position of the geospatial object.

[0020] In yet another case of the system, the object definition further comprises one or more attributes associated with the geospatial object, and wherein the validation module further receives input validating one or more of the attributes associated with the geospatial object.

[0021] In yet another case of the system, the system further comprises a recordation module to record, in the object definition, the one or more validated attributes associated with the geospatial object.

[0022] In yet another case of the system, the system further comprises a recordation module to record an image of the physical scene associated with the validated position.

[0023] These and other aspects are contemplated and described herein. It will be appreciated that the foregoing summary sets out representative aspects of the system and method to assist skilled readers in understanding the following detailed description.

BRIEF DESCRIPTION OF THE DRAWINGS

[0024] A greater understanding of the embodiments will be had with reference to the figures, in which:

[0025] FIG. 1 illustrates a block diagram of a system of collecting geospatial object data with mediated reality, according to an embodiment;

[0026] FIG. 2 illustrates a flow diagram of a method of collecting geospatial object data with mediated reality, according to an embodiment;

[0027] FIG. 3A illustrates an exemplary image of collecting geospatial data by placing an antenna;

[0028] FIG. 3B illustrates an exemplary diagram of collecting geospatial data by placing an antenna;

[0029] FIG. 4 illustrates exemplary screenshots of two-dimensional maps showing geospatial objects;

[0030] FIG. 5 illustrates an exemplary image of a user validating geospatial objects using the system of FIG. 1;

[0031] FIG. 6 illustrates an example screenshot of a visual representation of a received geospatial object over a captured scene, in accordance with the system of FIG. 1;

[0032] FIG. 7 illustrates an example screenshot of validating placement of the visual representation of the object of FIG. 6, in accordance with the system of FIG. 1;

[0033] FIG. 8 illustrates an example screenshot of recordation of the verified placement of the object of FIG. 6, in accordance with the system of FIG. 1.
DETAILED DESCRIPTION

[0034] Embodiments will now be described with reference to the figures. For simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the Figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein.

[0035] Various terms used throughout the present description may be read and understood as follows, unless the context indicates otherwise: "or" as used throughout is inclusive, as though written "and/or"; singular articles and pronouns as used throughout include their plural forms, and vice versa; similarly, gendered pronouns include their counterpart pronouns so that pronouns should not be understood as limiting anything described herein to use, implementation, performance, etc. by a single gender; "exemplary" should be understood as "illustrative" or "exemplifying" and not necessarily as "preferred" over other embodiments. Further definitions for terms may be set out herein; these may apply to prior and subsequent instances of those terms, as will be understood from a reading of the present description.
[0036] Any module, unit, component, server, computer, terminal, engine, or device exemplified herein that executes instructions may include or otherwise have access to computer-readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information, and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto. Further, unless the context clearly indicates otherwise, any processor or controller set out herein may be implemented as a singular processor or as a plurality of processors. The plurality of processors may be arrayed or distributed, and any processing function referred to herein may be carried out by one or by a plurality of processors, even though a single processor may be exemplified. Any method, application, or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer-readable media and executed by the one or more processors.
[0037] The following relates generally to geospatial data management; and more particularly, to systems and methods for validating geospatial data collection with mediated reality.

[0038] While the following disclosure refers to mediated reality, it is contemplated that this includes any suitable mixture of virtual aspects and real aspects; for example, augmented reality (AR), mixed reality, modulated reality, holograms, and the like. The mediated reality techniques described herein can utilize any suitable hardware; for example, smartphones, tablets, mixed reality devices (for example, Microsoft™ HoloLens™), true holographic systems, purpose-built hardware, and the like.
[0039] Advantageously, the present embodiments employ advanced visualization technologies, such as mediated reality techniques, to work in conjunction with other data collection techniques to provide immediate visual validation of data collection accuracy.
[0040] In some cases, survey-grade data collection can be accomplished with geographic information systems (GIS) using high-precision global navigation satellite systems (GNSS) and/or real-time kinematic (RTK) positioning to capture the location of assets or points for geospatial data collection. As illustrated in the example of FIGS. 3A and 3B, the collection of geospatial data can be accomplished by placing a GNSS antenna near or on top of a placemark and then recording, for example, the latitude, longitude, and elevation of the antenna; thus, the 'x, y, z' geospatial coordinates of the placemark in two-dimensional (2D) or three-dimensional (3D) space. In some cases, complementary tools, for example laser mapping, can further enhance collection by allowing collection of data of hard-to-reach objects.
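As context for this capture step, GNSS receivers commonly stream fixes as NMEA 0183 sentences. Below is a minimal sketch of pulling latitude, longitude, and elevation out of a GGA sentence; the sentence format is standard NMEA, but this parser is an illustration, not the patent's interface.

```python
def parse_gga(sentence: str):
    """Extract latitude, longitude (decimal degrees) and antenna elevation
    (metres above mean sea level) from an NMEA GGA sentence."""
    f = sentence.split(",")
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0  # ddmm.mmm -> degrees
    if f[3] == "S":
        lat = -lat
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0  # dddmm.mmm -> degrees
    if f[5] == "W":
        lon = -lon
    elev = float(f[9])  # antenna altitude field
    return lat, lon, elev

# A textbook GGA sentence; prints roughly (48.1173, 11.5167, 545.4).
print(parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
```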
[0041] As illustrated in the example screenshots of FIG. 4, in some approaches, captured geospatial data can be displayed on a 2D map to help a surveyor, or other user, validate the accuracy of the data. In some cases, the captured information can be transferred to a server or stored locally for review. In some cases, the elevation data and other attributes of the object can be stored as part of the captured geospatial data metadata in text format.
[0042] In many cases, in order to ensure the accuracy of the data, the surveyor is required to check the collected data points and objects multiple times. This requirement can result in substantial inefficiencies for the user, and can potentially add substantial time and cost to projects, especially where many objects are to be collected.

[0043] Advantageously, embodiments of the present disclosure can substantially improve efficiencies in accuracy verification of collected geospatial objects. Immediately after a data point on a geospatial object is collected, embodiments of the present disclosure can use mediated reality techniques to display a virtual representation of the collected object relative to its physical space, thus enabling the user to easily and immediately validate the accuracy of the collected object.

[0044] Advantageously, embodiments of the present disclosure can provide real-time visual validation of geospatial object placement in, for example, three-dimensional (3D) space (e.g., latitude, longitude, and elevation). Such embodiments can improve the quality and accuracy of collected geospatial objects, and increase the speed of geospatial object data collection. In some cases, the present embodiments can be used to supplement other tools and approaches for geospatial data collection. In doing so, the present embodiments can reduce the cost of data collection by reducing the time needed for data capture and quality control.
[0045] Embodiments of the present disclosure can provide verification by generating a geospatial image overlaid on an image or video of a scene captured by a camera (for example, as in augmented reality) or displayed as a hologram (for example, as in mixed reality or holographic systems). This can be performed in a manner that anchors to reality through geographical positioning, thereby generating a geographically relevant composite image or a hologram that can be presented to a user.
[0046] Embodiments of the present disclosure can advantageously provide a three-dimensional model or a raster symbol of real-time data viewable, for example, on mobile devices, wearable devices, or other viewing platforms. The geospatial images can be used to provide real-time visual representations to perform geospatial data verification of data collected via other approaches.
[0047] Turning to FIG. 1, a system of validating geospatial data with mediated reality 150 is shown, according to an embodiment. In this embodiment, the system 150 is run on a local computing device (for example, a mobile device). In further embodiments, the system 150 can be run on any other computing device; for example, a server, a dedicated piece of hardware, a laptop computer, a smartphone, a tablet, mixed reality devices such as Microsoft™ HoloLens™, true holographic systems, purpose-built hardware, or the like. In some embodiments, the components of the system 150 are stored by and executed on a single computing device. In other embodiments, the components of the system 150 are distributed among two or more computer systems that may be locally or remotely distributed; for example, using cloud-computing resources.
[0048] FIG. 1 shows various physical and logical components of an embodiment of the system 150. As shown, the system 150 has a number of physical and logical components, including a central processing unit ("CPU") 152 (comprising one or more processors), random access memory ("RAM") 154, a user interface 156, a device interface 158, a network interface 160, non-volatile storage 162, and a local bus 164 enabling CPU 152 to communicate with the other components. CPU 152 executes an operating system, and various modules, as described below in greater detail. RAM 154 provides relatively responsive volatile storage to CPU 152. The user interface 156 enables an administrator or user to provide input via an input device, for example a mouse or a touchscreen. The user interface 156 also outputs information to output devices; for example, a mediated reality device 192, a display or multiple displays, a holographic visualization unit, and the like. The mediated reality device 192 can include any device suitable for displaying augmented or mixed reality visuals; for example, smartphones, tablets, holographic goggles, purpose-built hardware, or other devices. The mediated reality device 192 may include other output sources, such as speakers. In some cases, the system 150 can be collocated with, or part of, the mediated reality device 192. In some cases, the user interface 156 can have the input device and the output device be the same device (for example, via a touchscreen).
[0049] The network interface 160 and/or the device interface 158 permits communication with other systems or devices, such as other computing devices and servers remotely located from the system 150. The device interface 158 can communicate with one or more other computing devices 190 that are either internal or external to the system 150; for example, a GNSS device to capture a position and/or elevation, a camera or camera array to capture image(s) of a scene, or sensors for determining position and/or orientation (for example, time-of-flight sensors, compass, depth sensors, spatial sensors, inertial measurement unit (IMU), laser mapping, and the like). In some cases, at least some of the computing devices 190 can be collocated with, or part of, the mediated reality device 192. In some embodiments, the device interface 158 can receive and send data to other devices, such as previously captured positions, elevations, and images, from the local database 166 or a remote database via the network interface 160.
[0050] Non-volatile storage 162 stores the operating system and programs, including computer-executable instructions for implementing the operating system and modules, as well as any data used by these services. Additional stored data can be stored in a database 166. During operation of the system 150, the operating system, the modules, and the related data may be retrieved from the non-volatile storage 162 and placed in RAM 154 to facilitate execution.

[0051] In an embodiment, the system 150 further includes a number of modules to be executed on the one or more processors 152, including an object module 170, a position module 172, a display module 174, a validation module 176, and a recordation module 178.

[0052] Turning to FIG. 2, a flowchart for a method of validating geospatial data with mediated reality 200 is shown, according to an embodiment.

[0053] At block 202, the object module 170 receives an object definition associated with a geospatial object that has been collected by a computing device 190 comprising a geographic information system (GIS). The geospatial object data corresponds to a geospatial object physically located in space, and the collection of the geospatial object data can be accomplished using any suitable approach. In an example, a user collecting geospatial object data can place a GNSS and/or RTK receiver on top of, or near, a point associated with the object they want to collect. In other cases, the user can use a laser mapping device to remotely determine a distance and elevation to such a point; in some cases, using GNSS and/or RTK positioning information as an anchor. After identification of the position and/or elevation of the object, this information can be stored as part of an object definition for the geospatial object. The object definition can be stored locally on the respective computing device 190, stored on the database 166, or stored remotely (for example, in a cloud-based or server-based repository) and communicated via the network interface 160.
[0054] The object definition includes the type of object (for example, a pipe or a point) and the geographical coordinates of its physical position. In further cases, the object definition can also include attributes or characteristics of the object. In most cases, the geographical coordinates are relative to the surface of the earth, for example latitude and longitude. In other cases, the geographical coordinates can be relative to another object; for example, relative to a building or landmark. In some cases, the object definition includes other properties; for example, an elevation, an object type, an object size, an object orientation, a material type, and the like.

[0055] In some cases, the object definition can include representation information for the geospatial object. In some cases, the representation information can include a point, line, or area associated with the physical position of the geospatial object. In other cases, the representation information can include information required for creating more sophisticated 3D visuals; for example, geometry type, a 3D model, an object type (such as hydrant or manhole), object condition, colour, shape, and other parameters. In some cases, the representation information can be determined by the computing device 190 comprising the spatial sensor and included in the object definition sent to the system 150. In other cases, the representation information can be determined by the object module 170 upon receiving the object definition; for example, by referencing the properties of the object in the object definition. As an example, the object definition can include: a manhole 1.2 m wide and 3.2 m deep with a grey cover, installed in 1987, oriented 14 degrees North.
[0056] In some cases, the object module 170 can receive the object definition using "push" or "pull" approaches, such as over an application programming interface (API). In some cases, the format of the object definition can include GeoJSON or other protocols.
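For illustration, here is a minimal sketch of what a GeoJSON-style object definition for the manhole example above might look like. GeoJSON itself is named in the text, but the property names (objectType, widthM, and so on) and coordinate values are hypothetical, not taken from the patent.

```python
# Hypothetical GeoJSON-style object definition; property names are
# illustrative only, not the patent's schema.
object_definition = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [-79.38320, 43.65320, 112.4],  # lon, lat, elevation (m)
    },
    "properties": {
        "objectType": "manhole",
        "widthM": 1.2,
        "depthM": 3.2,
        "coverColour": "grey",
        "installed": 1987,
        "orientationDeg": 14.0,
    },
}

def parse_object_definition(feature: dict):
    """Extract the object type and (lat, lon, elev) position from a feature."""
    lon, lat, elev = feature["geometry"]["coordinates"]
    return feature["properties"]["objectType"], (lat, lon, elev)

obj_type, position = parse_object_definition(object_definition)
print(obj_type, position)  # manhole (43.6532, -79.3832, 112.4)
```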
[0057] In some cases, once the user captures a position of a geospatial object, the object definition is automatically sent to the object module 170 and the method 200 proceeds. In other cases, once the user captures a position of a geospatial object, the user is given the option to proceed with the method 200 and thus send the object definition to the object module 170.
[0058] At block 204, the position module 172 receives or determines a physical position of the system 150 from a computing device 190 comprising a spatial sensor, where the physical position includes geographical coordinates. In most cases, the geographical coordinates are relative to the surface of the earth, for example latitude and longitude. In other cases, the geographical coordinates can be relative to another object; for example, relative to a building or landmark. In some cases, the physical position includes an elevation. In some cases, the position module 172 also receives or determines an orientation or bearing of the system 150; for example, comprising the physical orientation of the direction of the camera. In an example, the position module 172 can determine the position and orientation in 2D or 3D space (latitude, longitude, and, in some cases, elevation) using internal or external spatial sensors and positioning frameworks; for example, global positioning system (GPS), GNSS and/or RTK, Wi-Fi positioning system (WPS), manual calibration, vGIS calibration, markers, and/or other approaches. The position module 172 can then track the position and/or the orientation during operation of the system 150. The physical position is used by the system 150 to, as described herein, accurately place the visual representation of the geospatial object displayed to the user relative to the physical space, using the position of the geospatial object in the physical space in the object definition.
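To make the geometry concrete, the sketch below converts a geospatial object's latitude/longitude/elevation into east-north-up (ENU) offsets relative to a device pose, and then into the device's own frame. The equirectangular approximation and all names are assumptions for illustration, not the patent's method; a production system would more likely use a proper geodetic library or an ECEF/ENU transform.

```python
import math
from dataclasses import dataclass

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; adequate over AR distances

@dataclass
class DevicePose:
    lat: float      # degrees
    lon: float      # degrees
    elev: float     # metres
    bearing: float  # degrees clockwise from true north (camera direction)

def enu_offset(pose: DevicePose, lat: float, lon: float, elev: float):
    """Approximate east/north/up offsets (metres) of a target from the device,
    using a local equirectangular approximation."""
    d_lat = math.radians(lat - pose.lat)
    d_lon = math.radians(lon - pose.lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(pose.lat))
    north = EARTH_RADIUS_M * d_lat
    up = elev - pose.elev
    return east, north, up

def relative_position(pose: DevicePose, lat, lon, elev):
    """Rotate ENU offsets into the device frame: x right, y up, z forward."""
    east, north, up = enu_offset(pose, lat, lon, elev)
    b = math.radians(pose.bearing)
    forward = north * math.cos(b) + east * math.sin(b)
    right = east * math.cos(b) - north * math.sin(b)
    return right, up, forward

pose = DevicePose(lat=43.65320, lon=-79.38320, elev=112.0, bearing=30.0)
print(relative_position(pose, 43.65335, -79.38310, 111.6))
```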
[0059] At block 206, in some cases, the display module 174 displays a mediated reality 'live' view (such as a video stream or a sequential stream of captured images) received from a camera. This live view is oriented in the direction of the system 150 as received by the position module 172 in block 204. In embodiments using holographic devices, in some cases, receiving the 'live view' can be omitted because the visual representation itself is displayed in the physical space.

[0060] At block 208, the display module 174 presents a visual representation to the user via the user interface 156, where the visual representation is a representation of the received object definition. The object definition includes spatial attributes (for example, latitude, longitude, and elevation), which the display module 174 can use, in conjunction with the physical position information from the position module 172, to place the visual representation relative to the captured scene; for example, placing the visual representation of the object overlaid onto the live view. In most cases, the object definition can be used to scale the visual representation according to the 3D perspective of the live view.
[0061] The visual representation can be, for example, a three-dimensional (3D) digital-twin model resembling the collected object. In further cases, the visual representation can be, for example, a symbol representing the object, such as a point, a flag, a tag, or the like. In further cases, the visual representation can be, for example, a schematic representation, a raster image, or the like. In some cases, the type of visual representation can be associated with the object in the library; and in other cases, the type of visual representation can be selected by the user.
[0062] In some cases, for example where the visual representation is anything other than a 3D model of the object (for example, a manhole symbolized by a point or a flag), a key location (for example, the point or the base of the symbol) can be placed at a respective key point of the physical object captured by the camera (for example, at the center of the manhole). In some cases, the symbology for each object, as well as the key locations and points, can be defined by each user.
[0063] In some cases, along with the visual representation, other information can be displayed; for example, distance, elevation, size, shape, colours, and the like, can be displayed to assist with visualization and/or precise placement. In some cases, such as with GIS, the visual representation location can be represented by a single point, line, or outline, and to help the user understand where the object is placed, a point, a cross, a line, or other means can be used within the visual representation.

[0064] In other cases, the display module 174 can stream the visual representation (for example, a 3D model or model rendering) from a server, cloud-based infrastructure, or other external processing device. Instructions for such streaming can be provided in any suitable format (for example, KML or GeoJSON) or any other proprietary format.

[0065] FIG. 5 illustrates an example of a user (surveyor) validating the position of the piping of a fire hydrant. In this example, the system 150 is located on a tablet computer. The object module 170 has received geospatial object data for the piping of the fire hydrant, and the display module 174 displays a 3D model as a visual representation of the object on the display of the tablet. The visual representation is displayed by the display module 174 overtop of a live view that is being captured by the rear-facing camera of the tablet.
[0066] In some cases, the position of the object represented by the visual representation displayed by the display module 174 can be accurately determined; for example, by the position module 172, using the position of the system 150 (for example, its latitude, longitude, and elevation), an azimuth of the system 150, and the distance to one or more objects captured by the camera. In some cases, the position module 172 can capture metadata from the GNSS and/or RTK device, and then correct it for the elevation and distance difference between the GNSS antenna and the presented visual representation to achieve survey-grade data accuracy. In some cases, the user can update the position and/or the properties of the visual representation manually.
[0067] In an example, when the position module 172 determines the system's 150 location and orientation in x,y,z space, the position module 172 also determines the position and orientation of the physical camera (x,y,z plus bearing). The display module 174 can use the positioning information to access spatial data (the data with x,y,z coordinates) and create (or use an existing) visual representation (for example, a 3D model). The display module 174 can place the virtual camera in the location of the physical camera (x,y,z plus bearing) relative to the visual representation. The visual representation can be overlaid on top of the physical representation such that it can appear in the correct location, matching the physical objects around it. In this way, by understanding the x,y,z and orientation of the physical camera, the display module 174 can display visual representations of objects that are in the scene (or field of view), and size and orient the visual representation to allow for visualization that matches the physical world accurately.
[0068] In some cases, the distance to the physical geospatial object can be measured using distance finders. The distance finder may also detect the object's elevation (either relative or absolute). Examples of other distance determination approaches can include optical tools, such as depth cameras or time-of-flight sensors, or image processing that compares images taken from multiple locations or angles to determine distances. These other approaches can be used separately, or they can be connected or associated with the system 150 to provide information automatically. For example, the display module 174 may display cross-hairs and, upon aligning the cross-hairs with the physical object, the system 150 can send a request to a distance finder to determine the distance to that point. In another example, the user interface 156 can receive an indication from the user of a point they want to measure the distance to. In another example, an external distance finder can be used to determine the distance to the physical object separately, and that distance can be used to ensure accurate object placement by displaying the distance to the collected object.
[0069] At block 210, the validation module 176 receives input with respect to validating the position and attributes of the visual representation relative to the corresponding geospatial object captured in the scene. If visual validation shows a discrepancy in alignment and/or any of the associated attributes of the geospatial object and the corresponding visual representation, the user, via the user interface 156, can indicate that there is a discrepancy, and/or indicate that there is a discrepancy with any of the associated attributes.
[0070] In some cases, the input received by the validation module 176 could include outputs from machine vision (MV) and/or artificial intelligence (AI) techniques. Such techniques can be used by the display module 174 to automate display of the visual representation. In an example, MV and AI techniques can automatically detect a geospatial object in the captured scene, then validate whether the placement of the visual representation is aligned with the geospatial object in the captured scene. In an example, the outputs from the MV and/or AI techniques can determine the correct position of the visual representation by snapping the visual representation to the corresponding geospatial object captured by the camera (for example, in the example of FIG. 7, snapping the diameter of the pipe to the diameter of the manhole). If this position is different from the position indicated in the object definition, the validation module 176 can indicate a discrepancy; otherwise, it can validate the position. In some cases, the machine vision technique can be used to auto-size the visual representation to the geospatial object captured by the camera; then, the user can validate the position.
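A minimal sketch of the discrepancy check this paragraph describes, assuming a hypothetical detector has already returned the detected object's position; the 5 cm tolerance is illustrative, not a figure from the patent.

```python
import math

def validate_placement(expected_enu: tuple,
                       detected_enu: tuple,
                       tolerance_m: float = 0.05) -> bool:
    """Compare the object definition's position against a machine-vision
    detection (both as east/north/up offsets in metres) and flag a
    discrepancy when they differ by more than the tolerance."""
    error = math.dist(expected_enu, detected_enu)
    return error <= tolerance_m

# Hypothetical values: a detection 3 cm off validates; 30 cm would not.
expected = (4.20, 15.80, -1.10)
detected = (4.22, 15.78, -1.09)
print("validated" if validate_placement(expected, detected) else "discrepancy")
```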
[0071] In some cases, the input received by the validation module 176 could include corrections to attributes or properties (as described herein) associated with the geospatial object, either inputted by the user or automatically detected by the MV and/or AI techniques; for example, corrections to the object type, object size, colours, shape, elevation, rotation, object conditions (e.g., rust or damage), and the like.

[0072] In some cases, the visual representation can be temporarily fixed in place so that the user can look at it from different angles to confirm the location.
[0073] At block 212, the recordation module 178 records the validated position of the respective geospatial object as the position of the corresponding visual representation in the captured scene. In some cases, the recordation module 178 also records validated attributes and/or properties of the geospatial object in the object definition (for example, as metadata); for example, size, height, orientation, GNSS and/or RTK information, and the like. The recordation can include editing the object definition in a geospatial data storage; for example, on the database 166, or sent to an external storage via the network interface 160. In other cases, a new revised entry can be stored in the geospatial data storage. In some cases, the recordation module 178 can record an image of the physical scene in association with the validated position, where this image can be used later for evidence purposes.

[0074] The storage methods may include on-device storage, synchronization with a centralized data repository such as a GIS system, or other means that would allow for the storage and retrieval of spatial data.
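Tying back to the earlier object definition sketch, recording might amount to writing the validated position back into the stored feature and stamping some audit metadata. The field names are again hypothetical, not the patent's schema.

```python
import datetime
import json

def record_validated_position(feature: dict, lat: float, lon: float,
                              elev: float, validated_by: str) -> dict:
    """Write the validated position back into a GeoJSON-style object
    definition and stamp complementary audit attributes."""
    feature["geometry"]["coordinates"] = [lon, lat, elev]
    feature["properties"].update({
        "validated": True,
        "validatedBy": validated_by,
        "validatedOn": datetime.date.today().isoformat(),
    })
    return feature

feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-79.38320, 43.65320, 112.4]},
    "properties": {"objectType": "manhole"},
}
print(json.dumps(record_validated_position(feature, 43.65335, -79.38310,
                                           112.1, "surveyor-01"), indent=2))
```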
[0075] FIGS. 6 to 8 illustrate example screenshots of an example implementation of the system 150; in this example, validating the position of a pipe located beneath a manhole. FIG. 6 illustrates an example screenshot of the display module 174 displaying a 3D model visual representation of the pipe 502 with the positioning as dictated by the object definition received by the object module 170. As illustrated, the background of the screen can be the mediated reality 'live' view received from the camera that is oriented in the direction of the system 150. In this case, the user would indicate a discrepancy with the placement of the geospatial object in the physical space because the visual representation is not collocated with the corresponding manhole cover. In this example, the visual representation 502 represents a manhole pipe that should be aligned with, and located subsurface to, the manhole cover 504 captured by the camera. FIG. 7 illustrates an example screenshot of the visual representation 502 in which the user would validate the object definition as comprising the correct position of the visual representation 502 relative to the corresponding geospatial object (manhole cover) 504 captured by the camera. FIG. 8 illustrates an example screenshot after the position of the pipe 502 has been validated.
[0076] In some cases, the validation module 176 can also receive input from a user validating attributes associated with the geospatial object via the user interface 156. These attributes can be part of the object definition received by the object module 170. The recordation module 178 can store these validated attributes as metadata associated with the collected object. These attributes can include, for example, elements such as colour, material type, shape, installation date, and the like. In some cases, the system can also automatically capture complementary attributes, for example, the date of the validation, the person who performed the validation, the equipment used, and the like. In some cases, these complementary attributes can be validated using MV and/or AI.
[0077] In some cases, the system 150 can be initiated from an external system that provides instructions to begin. Examples of such external systems can include ticket management or spatial tracking systems. In an example, a technician may be reviewing a work ticket using a third-party ticket management system, and as part of the ticket workflow, the external system may launch the system 150 for the technician to complete the assignment. In another example, a technician may be passing through an area for which they will need to collect spatial information. Upon detecting the technician's location, a process may notify the technician about the assignment and automatically launch the system 150.
[0078] Advantageously, the present embodiments can reduce costs and speed up geospatial data collection by providing a quicker and more efficient way to validate the data collection.

[0079] While the foregoing refers to a camera to capture a physical scene and a screen to display the mixture of physical and visual representations, it is contemplated that any apparatus for blending virtual and real objects can be used; for example, a holographic system that displays holographic augmentation or projects holograms.

[0080] Although the foregoing has been described with reference to certain specific embodiments, various modifications thereto will be apparent to those skilled in the art without departing from the spirit and scope of the invention as outlined in the appended claims.

Administrative Status

2024-08-01: As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refer to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Inactive: Grant downloaded 2022-04-27
Letter Sent 2022-04-26
Grant by Issuance 2022-04-26
Inactive: Cover page published 2022-04-25
Inactive: Final fee received 2022-03-02
Pre-grant 2022-03-02
Inactive: Office letter 2022-01-04
Inactive: Correspondence - Prosecution 2022-01-04
Notice of Allowance is Issued 2021-12-20
Letter Sent 2021-12-20
Notice of Allowance is Issued 2021-12-20
Inactive: Approved for allowance (AFA) 2021-12-16
Inactive: Q2 passed 2021-12-16
Amendment Received - Response to Examiner's Requisition 2021-11-25
Amendment Received - Voluntary Amendment 2021-11-25
Examiner's Report 2021-10-18
Inactive: Report - No QC 2021-10-15
Letter Sent 2021-10-12
Request for Examination Received 2021-10-04
Request for Examination Requirements Determined Compliant 2021-10-04
All Requirements for Examination Determined Compliant 2021-10-04
Change of Address or Method of Correspondence Request Received 2021-10-04
Advanced Examination Determined Compliant - PPH 2021-10-04
Advanced Examination Requested - PPH 2021-10-04
Amendment Received - Voluntary Amendment 2021-10-04
Application Published (Open to Public Inspection) 2021-03-26
Inactive: Cover page published 2021-03-25
Common Representative Appointed 2020-11-07
Common Representative Appointed 2019-10-30
Common Representative Appointed 2019-10-30
Inactive: Filing certificate - No RFE (bilingual) 2019-10-17
Inactive: IPC assigned 2019-10-03
Inactive: First IPC assigned 2019-10-03
Inactive: IPC assigned 2019-10-03
Application Received - Regular National 2019-09-30

Abandonment History

There is no abandonment history.

Maintenance Fee

The last payment was received on 2021-09-24

Note: If the full payment has not been received on or before the date indicated, a further fee may be required, which may be one of the following:

  • the reinstatement fee;
  • the late payment fee; or
  • the additional fee to reverse deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Fee History

Fee Type | Anniversary Year | Due Date | Paid Date
Application fee - standard | | | 2019-09-26
MF (application, 2nd anniv.) - standard | 02 | 2021-09-27 | 2021-09-24
Request for examination - standard | | 2024-09-26 | 2021-10-04
Final fee - standard | | 2022-04-20 | 2022-03-02
MF (patent, 3rd anniv.) - standard | | 2022-09-26 | 2022-07-25
MF (patent, 4th anniv.) - standard | | 2023-09-26 | 2023-06-30
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
VGIS INC.
Past Owners on Record
ALEXANDRE PESTOV
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.



Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Description | 2019-09-25 | 15 | 826
Abstract | 2019-09-25 | 1 | 13
Drawings | 2019-09-25 | 8 | 958
Claims | 2019-09-25 | 3 | 94
Representative drawing | 2021-02-18 | 1 | 3
Claims | 2021-10-03 | 3 | 129
Claims | 2021-11-24 | 3 | 128
Drawings | 2021-11-24 | 8 | 1,002
Representative drawing | 2022-03-27 | 1 | 3
Filing Certificate | 2019-10-16 | 1 | 213
Courtesy - Acknowledgement of Request for Examination | 2021-10-11 | 1 | 424
Commissioner's Notice - Application Found Allowable | 2021-12-19 | 1 | 579
Electronic Grant Certificate | 2022-04-25 | 1 | 2,527
Request for examination / Amendment / PPH request | 2021-10-03 | 15 | 561
Change to the Method of Correspondence | 2021-10-03 | 3 | 71
Examiner requisition | 2021-10-17 | 3 | 163
Amendment | 2021-11-24 | 16 | 1,292
Prosecution correspondence | 2022-01-03 | 5 | 128
Courtesy - Office Letter | 2022-02-01 | 1 | 161
Final fee | 2022-03-01 | 5 | 148