Patent 2943247 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract is posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2943247
(54) English Title: METHOD AND SYSTEM FOR DYNAMICALLY POSITIONING, VIEWING AND SHARING LOCATION BASED MIXED REALITY CONTENT
(54) French Title: METHODE ET SYSTEME DE POSITIONNEMENT DYNAMIQUE, AFFICHAGE ET PARTAGE D'EMPLACEMENT FONDES SUR UN CONTENU DE REALITE MIXTE
Status: Deemed Abandoned and Beyond the Period of Reinstatement - Pending Response to Notice of Disregarded Communication
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04W 4/021 (2018.01)
  • H04N 21/262 (2011.01)
  • H04N 21/2743 (2011.01)
(72) Inventors :
  • MILLS, DANIEL CHANTAL (Canada)
  • KRISHNA, SRINIVAS (Canada)
  • SIDHDHARTHKUMAR, PATEL (Canada)
  • THOMAS, LAURA BETH (Canada)
  • JAKHU, PAVAN (Canada)
  • ROSALES, EDWARD ALBERT (Canada)
  • YUE, DAVID ALEXANDER (Canada)
  • KHAN, NAIMUL MAFRAZ (Canada)
(73) Owners :
  • AWE COMPANY LTD.
(71) Applicants :
  • AWE COMPANY LTD. (Canada)
(74) Agent: OPEN IP CORPORATION
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2016-09-27
(41) Open to Public Inspection: 2018-03-27
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract

A location-based mobile computing method and system is disclosed that can self-register local position and orientation using on-board sensors and use that positional information to locate virtual objects (e.g. 2D or 3D virtual objects, or streaming videos) in the real-world video view.


Claims

Note: Claims are shown in the official language in which they were submitted.


Claims
What is claimed is:
1. A system as substantially described above.
2. A method as substantially described above.

Description

Note: Descriptions are shown in the official language in which they were submitted.


Title
Method and system for dynamically positioning, viewing and
sharing location based mixed reality content
Field
The field of the invention is augmented or mixed reality methods
and systems.
Figures
Fig. 1 is a plan-view map with markers showing GPS search results for geographically tagged ("geotagged") videos in the proximity of the mobile computing device.
Fig. 2 shows thumbnails of geotagged videos displayed in the mobile computing device's viewer and overlaid on the real-world environment as viewed through the device's video sensor.
Fig. 3 shows a list view of one or more available videos to watch.
Fig. 4 shows a video projected onto a virtual screen that stays
in a stationary position relative to the real environment as the
mobile computing device is moved or rotated.
Fig. 5 shows that an individual can record a video of themselves or their surroundings and post it using the application. When posted, it will carry the geotag of the location as well as the relative position and orientation information that maps the video for future users of the application.
Detailed Description
In an aspect, a location-based mobile computing system is disclosed that can self-register local position and orientation using on-board sensors and use that positional information to locate mixed reality content (e.g. 2D or 3D virtual objects, or streaming videos) in the real-world video sensor view.
The geographic position may be determined using the Global Positioning System (GPS) or other global locating features such as position determination from cellular tower trilateration. Once the user and their mobile computing system are located geographically, other sensing is completed to model portions of the local 3D environment.
This local 3D environment model is used to self-register the
mobile computing system's local position and orientation
relative to the surrounding environment. This allows for a mixed
reality content viewer to be virtually positioned in the real
world. This mixed reality content viewer can be shown within the
video screen of the mobile computing device, projected into the
real world to interact with the real world or projected back
onto the user's eye in certain head-up displays.
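
The application does not specify a data layout for this self-registered pose; the following is a minimal sketch in Python, assuming a local tangent-plane frame anchored at a geographic origin and a quaternion orientation (all names are illustrative, not from the application):

    from dataclasses import dataclass

    @dataclass
    class GeoAnchor:
        """Global position tying the local 3D frame to the real world."""
        latitude: float
        longitude: float
        altitude: float = 0.0

    @dataclass
    class LocalPose:
        """Device pose relative to the self-registered local frame."""
        x: float          # metres east of the registration origin
        y: float          # metres north of the registration origin
        z: float          # metres up from the registration origin
        qw: float = 1.0   # orientation as a unit quaternion (w, x, y, z)
        qx: float = 0.0
        qy: float = 0.0
        qz: float = 0.0
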
If the user creates mixed reality content while at a geographic location, they can share it back to YouTube or other social media sites and have geotag as well as local 3D environment information associated with the object, so that future viewers will experience it in similar ways. The user may have input to modify the local 3D environment information to enhance the future viewer experience. Other users who are travelling near the location of this geotagged mixed reality content will be notified by the application that one or more geotagged objects are in the vicinity. The user will be able to view the thumbnails or markers for these objects through their mobile computing device viewer and see the markers geographically positioned. Once they select a marker, they will be able to view, experience, or interact with the geotagged object, which will also have the correct local 3D environment data associated with it for orientation.
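
A sketch of the proximity check that could drive such a vicinity notification, assuming each content record carries plain latitude/longitude fields (the haversine formula is standard; the record layout is an assumption):

    import math

    EARTH_RADIUS_M = 6371000.0

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two lat/lon points."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def nearby_records(user_lat, user_lon, records, radius_m=200.0):
        """Return the geotagged records within radius_m of the user."""
        return [r for r in records
                if haversine_m(user_lat, user_lon,
                               r["lat"], r["lon"]) <= radius_m]
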
In another aspect, a method is disclosed comprising the following steps:
- Detection of the geographic location of a mobile computing system;
- Automatic on-line search for geotagged mixed reality content within a search radius of the mobile computing system;
- Recording of video that is geotagged as well as having local position and orientation information;
- Posting of geotagged mixed reality content to a third-party on-line hosting site and/or social media;
- Association of location position and orientation information with this mixed reality content, at time of creation or upload; and
- The ability for an individual viewing the mixed reality content to also leave geotagged comments that are visible to future users.
In an embodiment, the detection system and method uses the
Global Positioning System (GPS) sensors of the mobile computing
system to locate the user's terrestrial position with latitude
and longitude coordinates. A further embodiment could utilize
cellular tower trilateration to locate the user's terrestrial
position with latitude and longitude coordinates.
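
A sketch of that detection step with the trilateration fallback; the device interface (gps_fix, cell_tower_fix) is invented here for illustration, since the application does not name a platform API:

    def detect_location(device):
        """Return (latitude, longitude), preferring GPS over cell towers.

        device.gps_fix() and device.cell_tower_fix() are hypothetical
        sensor interfaces standing in for the platform's location
        services; each returns an object with .latitude/.longitude
        attributes, or None when no fix is available.
        """
        fix = device.gps_fix()
        if fix is None:                    # no satellite lock, e.g. indoors
            fix = device.cell_tower_fix()  # coarser trilateration estimate
        if fix is None:
            raise RuntimeError("no location source available")
        return fix.latitude, fix.longitude
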
The detection system may also be used to pass latitude and longitude coordinates to a software application that will search the world wide web for geotagged mixed reality content that is posted on popular video hosting and social network websites. An embodiment of this would be a search of YouTube for geotagged video as illustrated in Fig. 1.
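
At the time of writing, the YouTube Data API v3 search endpoint accepts location and locationRadius filters, which fits this embodiment; a sketch assuming the google-api-python-client package and a valid API key:

    from googleapiclient.discovery import build

    def search_geotagged_videos(lat, lon, radius="1km", api_key="..."):
        """Search YouTube for videos geotagged near the given coordinates."""
        youtube = build("youtube", "v3", developerKey=api_key)
        response = youtube.search().list(
            part="snippet",
            type="video",            # location filters require type=video
            location=f"{lat},{lon}",
            locationRadius=radius,   # e.g. "500m", "1km"
            maxResults=25,
        ).execute()
        return response.get("items", [])
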
The search can be further refined by other search terms that
could be manually input or taken from predefined user
preferences. The result of the search will be geotagged mixed
reality content with a refined topic.
The results of the world wide web search will be shown graphically within the software application to visually depict virtual objects/videos that are located within a specified vicinity around the individual. If multiple geotagged objects are found within the search region, a visual list or representation of all of them will be presented to the individual. Referring now to Fig. 2, the mixed reality content thumbnails or representations may be visually placed using a position and orientation calculated between the individual's latitude and longitude and the mixed reality content's latitude and longitude.
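
That placement calculation reduces to a distance and an initial bearing between the two coordinate pairs; a sketch using the standard great-circle bearing formula (haversine_m is the helper from the earlier sketch and is assumed to be in scope; the returned tuple is only one plausible hand-off to the renderer):

    import math

    def initial_bearing_deg(lat1, lon1, lat2, lon2):
        """Compass bearing in degrees from point 1 toward point 2."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dl = math.radians(lon2 - lon1)
        x = math.sin(dl) * math.cos(p2)
        y = (math.cos(p1) * math.sin(p2)
             - math.sin(p1) * math.cos(p2) * math.cos(dl))
        return math.degrees(math.atan2(x, y)) % 360.0

    def thumbnail_placement(user_lat, user_lon, content_lat, content_lon):
        """Distance/bearing pair used to place a thumbnail in the viewer."""
        return (haversine_m(user_lat, user_lon, content_lat, content_lon),
                initial_bearing_deg(user_lat, user_lon,
                                    content_lat, content_lon))
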
Referring now to Fig. 3, the individual will be able to select a
specific mixed reality content to experience from the ones
presented in the search results. Only mixed reality content that
is within the geographic buffer or region of interest
surrounding the location of the individual will be shown.

While at the geographic location, the individual may view and
interact with the mixed reality content. The mixed reality
content will have a position and orientation within the local
environment around the individual. The position and orientation
will be determined and continually updated in real time using
available sensors on the mobile computing device. One embodiment
of this system and method will use the GPS, accelerometer,
gyroscope, compass and video sensor of the mobile computing
device to calculate this real time position and orientation. Key
frames taken from the video sensor will be used to calculate, in
real time, the 3D position of the surrounding environment as it
relates to the mobile computing device. This form of monocular position and depth sensing will be completed using common features automatically determined in the video key frames. This position information will determine the location and rotation of the mobile computing device as well as how to present the mixed reality content within the local environment in a perspective-correct and stationary virtual position. This position information may also allow the mixed reality content to interact with one or more users at the same location.
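
The application does not name a vision library for this key-frame step; one conventional way to realize it, sketched here with OpenCV purely as an illustration, matches ORB features between key frames and recovers the relative camera pose from the essential matrix (monocular translation is recovered only up to scale):

    import cv2
    import numpy as np

    def relative_pose(prev_frame, curr_frame, K):
        """Rotation R and unit-scale translation t between two key frames.

        prev_frame/curr_frame: grayscale images; K: 3x3 intrinsic matrix.
        """
        orb = cv2.ORB_create(nfeatures=1000)
        kp1, des1 = orb.detectAndCompute(prev_frame, None)
        kp2, des2 = orb.detectAndCompute(curr_frame, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)
        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
        return R, t
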
The position/orientation results will be visually presented to
the individual to create a virtual or mixed reality experience
in the surrounding environment. One embodiment will have the
mixed reality content shown on the mobile computing device's
screen as a camera/viewer within a viewer. Another embodiment
will have the positioned and oriented mixed reality content
projected into the real world using a projection device. Another embodiment will have the positioned and oriented mixed reality content projected onto a pair of glasses or heads-up display in front of the individual. Another embodiment of
this will have the positioned and oriented mixed reality content
projected back onto the individual's eye.
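
Whatever the display, keeping the content stationary amounts to re-projecting its fixed world position through the device's current pose on every frame; a minimal pinhole-camera sketch (R, t and K as in the previous sketch):

    import numpy as np

    def project_point(point_world, R, t, K):
        """Project a 3D point in the local frame to pixel coordinates.

        Returns (u, v), or None when the point is behind the camera.
        """
        p_cam = R @ np.asarray(point_world, dtype=float) + t.ravel()
        if p_cam[2] <= 0.0:            # behind the image plane
            return None
        uv = K @ (p_cam / p_cam[2])    # perspective divide, then intrinsics
        return float(uv[0]), float(uv[1])
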
In an embodiment, there may be functionality to allow the
individual to leave geotagged mixed reality content or
information at the geographic location that will be viewable by
other individuals. One embodiment of this will allow the
individual to record a video or leave a written comment that is
geotagged to the location. This will be posted to popular video
hosting or social networking web sites.
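
A sketch of posting such a geotagged comment; the endpoint URL and payload schema are hypothetical stand-ins for whatever API the hosting site actually exposes:

    import requests

    def post_geotagged_comment(text, lat, lon, api_url, token):
        """Send a location-tagged comment to a (hypothetical) hosting API."""
        payload = {"text": text, "latitude": lat, "longitude": lon}
        resp = requests.post(api_url, json=payload,
                             headers={"Authorization": f"Bearer {token}"},
                             timeout=10)
        resp.raise_for_status()
        return resp.json()
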

The posted mixed reality content may be further enhanced with
local position and orientation information that may be recorded
with the geotagged information. The position and orientation may
be determined and continually updated in real time using
available sensors on the mobile computing device. For example,
as shown in Fig. 5, an individual can record a video of themselves or their surroundings and post it using the application. When posted, it will carry the geotag of the location as well as the relative position and orientation information that maps the video for future users of the application.
The position and orientation results will be included, when
available, with geographic location information that may be
associated with the mixed reality content. An embodiment may have the orientation and rotation information embedded in the subtitles or description portion of the posting and hosted directly on the video hosting website, such as YouTube.
Another embodiment may have the position and orientation
information hosted on a different server than the video but they
will be dynamically linked and accessible for real time
consumption.
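
For the subtitle/description embodiment, the pose could travel as a single machine-readable line appended to the posting's description; the MR-POSE marker below is invented for illustration and is not an existing site convention:

    import json

    POSE_TAG = "MR-POSE:"  # illustrative marker, not a real convention

    def embed_pose(description, pose):
        """Append pose metadata to a video description as one JSON line."""
        return f"{description}\n{POSE_TAG}{json.dumps(pose, separators=(',', ':'))}"

    def extract_pose(description):
        """Recover embedded pose metadata from a description, if present."""
        for line in description.splitlines():
            if line.startswith(POSE_TAG):
                return json.loads(line[len(POSE_TAG):])
        return None

    # e.g. embed_pose("My clip", {"lat": 43.65, "lon": -79.38, "yaw_deg": 90.0})
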
Other curation tools will be available to the individual to
enhance what they are posting. An embodiment may be a mobile-application-based dashboard that allows the individual to edit the length or order of a video sequence, change sound, add further positional information or rules for future users, or allow successful viewing of the virtual object/video to trigger a secondary event or object as a reward for completion. Another embodiment may be a web-based version of this dashboard.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01: As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refer to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Common Representative Appointed 2019-10-30
Common Representative Appointed 2019-10-30
Application Not Reinstated by Deadline 2019-09-27
Time Limit for Reversal Expired 2019-09-27
Inactive: IPC deactivated 2019-01-19
Deemed Abandoned - Failure to Respond to Maintenance Fee Notice 2018-09-27
Inactive: Cover page published 2018-04-09
Inactive: IPC assigned 2018-04-04
Inactive: First IPC assigned 2018-04-04
Application Published (Open to Public Inspection) 2018-03-27
Inactive: IPC expired 2018-01-01
Inactive: IPC assigned 2016-11-08
Inactive: First IPC assigned 2016-11-08
Inactive: IPC assigned 2016-11-08
Inactive: IPC assigned 2016-11-08
Inactive: Filing certificate - No RFE (bilingual) 2016-10-04
Filing Requirements Determined Compliant 2016-10-04
Application Received - Regular National 2016-09-28

Abandonment History

Abandonment Date Reason Reinstatement Date
2018-09-27

Fee History

Fee Type Anniversary Year Due Date Paid Date
Application fee - standard 2016-09-27
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
AWE COMPANY LTD.
Past Owners on Record
DANIEL CHANTAL MILLS
DAVID ALEXANDER YUE
EDWARD ALBERT ROSALES
LAURA BETH THOMAS
NAIMUL MAFRAZ KHAN
PATEL SIDHDHARTHKUMAR
PAVAN JAKHU
SRINIVAS KRISHNA
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.

If you have any difficulty accessing content, you can call the Client Service Centre at 1-866-997-1936 or send them an e-mail at CIPO Client Service Centre.


Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Abstract 2016-09-27 1 8
Description 2016-09-27 5 225
Drawings 2016-09-27 3 58
Claims 2016-09-27 1 4
Representative drawing 2018-04-09 1 5
Cover Page 2018-04-09 1 32
Filing Certificate 2016-10-04 1 202
Courtesy - Abandonment Letter (Maintenance Fee) 2018-11-08 1 174
Reminder of maintenance fee due 2018-05-29 1 110
New application 2016-09-27 6 148