Patent 2881744 Summary

(12) Patent: (11) CA 2881744
(54) English Title: UNMANNED VEHICLE (UV) CONTROL SYSTEM AND UV MOVEMENT AND DATA CONTROL SYSTEM
(54) French Title: SYSTEME DE COMMANDE DE VEHICULE SANS PILOTE ET SYSTEME DE COMMANDE DE MOUVEMENT ET DE DONNEES DE VEHICULE SANS PILOTE
Status: Granted
Bibliographic Data
(51) International Patent Classification (IPC):
  • B60W 60/00 (2020.01)
  • B60W 50/00 (2006.01)
  • B64D 47/00 (2006.01)
  • G08C 19/12 (2006.01)
  • H04N 7/18 (2006.01)
  • G05D 1/02 (2020.01)
  • B64C 39/02 (2006.01)
(72) Inventors:
  • MULLAN, PRAMILA (United States of America)
  • NEGM, WALID (United States of America)
  • LIONGOSARI, EDY S. (United States of America)
  • BARSAMIAN, PAUL (United States of America)
  • RICHARDS, BRIAN (United States of America)
  • KIM, SANG-IK (United States of America)
  • MUI, MICHAEL (United States of America)
  • FENNEY, ROBERT (United States of America)
(73) Owners:
  • ACCENTURE GLOBAL SERVICES LIMITED (Ireland)
(71) Applicants:
  • ACCENTURE GLOBAL SERVICES LIMITED (Ireland)
(74) Agent: SMART & BIGGAR LP
(74) Associate agent:
(45) Issued: 2017-03-07
(22) Filed Date: 2015-02-13
(41) Open to Public Inspection: 2015-08-14
Examination requested: 2015-02-13
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
61/940,251 United States of America 2014-02-14
61/943,224 United States of America 2014-02-21
14/619,749 United States of America 2015-02-11
14/619,854 United States of America 2015-02-11

Abstracts

English Abstract

Unmanned vehicle (UV) control may include receiving a UV work order and generating a mission request based on the UV work order. The mission request may identify an objective of a mission, assign a UV and a sensor to the mission from a fleet of UVs and sensors, and assign a first movement plan to the mission based on the identified objective of the mission. The assigned UV may be controlled according to the assigned first movement plan, and communication data may be received from the assigned sensor. The communication data may be analyzed to identify an event related to the mission. The identified event and the first movement plan may be analyzed to assign a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission.


French Abstract

Commande de véhicule sans pilote pouvant comprendre la réception d'une demande de travail pour le véhicule sans pilote et la génération d'une demande de mission fondée sur ladite demande de travail. La demande de mission peut indiquer un objectif de mission, attribuer un véhicule sans pilote et un capteur à la mission, à partir d'une flotte de véhicules sans pilote et de capteurs, et attribuer un premier plan de mouvement fondé sur l'objectif établi pour la mission. Le véhicule sans pilote attribué peut être commandé conformément au premier plan de mouvement attribué, et des données de communication peuvent être reçues du capteur attribué. Les données de communication peuvent être analysées pour déterminer un événement en lien avec la mission. L'événement déterminé et le premier plan de mouvement peuvent être analysés pour attribuer un deuxième plan de mouvement à la mission en fonction de l'analyse de l'événement déterminé et du premier plan de mouvement, pour atteindre l'objectif de la mission.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS:
1. An unmanned vehicle (UV) control system comprising:
    a fleet and mission operations controller, executed by at least one hardware processor, to receive a UV work order and to generate a mission request based on the UV work order, the mission request identifying an objective of a mission, assigning a UV and a sensor to the mission from a fleet of UVs and sensors, and assigning a first movement plan to the mission based on the identified objective of the mission;
    a mission controller, executed by the at least one hardware processor, to
        control the assigned UV according to the assigned first movement plan, and
        receive communication data from the assigned sensor, wherein the communication data includes telemetry data and video stream data; and
    an event detector, executed by the at least one hardware processor, to
        receive the communication data that includes the telemetry data and the video stream data,
        combine the telemetry data and the video stream data,
        analyze the combined telemetry data and the video stream data to identify an event related to the mission, and
        forward the identified event to the fleet and mission operations controller to
            analyze the identified event and the first movement plan, and
            assign a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan,
    wherein the mission controller is to control the assigned UV according to the assigned second movement plan, and
    wherein the fleet and mission operations controller and the event detector are hosted in an off-site facility relative to the assigned UV.
2. The UV control system according to claim 1, further comprising:
    a compliance evaluator, executed by the at least one hardware processor, to
        determine whether the mission request is compliant with at least one of regulations and safety parameters, and
        in response to a determination that the mission request is compliant with the at least one of regulations and safety parameters, forward the mission request to the mission controller.

3. The UV control system according to claim 1, further comprising:
    a compliance evaluator, executed by the at least one hardware processor, to
        determine whether the assigned UV and a UV operation crew associated with the mission request are compliant with at least one of regulations and safety parameters, and
        in response to a determination that the assigned UV and the UV operation crew associated with the mission request are compliant with the at least one of regulations and safety parameters, forward the mission request to the mission controller.

4. The UV control system according to claim 1, wherein the fleet of UVs includes unmanned aerial vehicles (UAVs).

5. The UV control system according to claim 1, wherein the sensors include video cameras, gas detectors, infrared (IR) cameras, and pressure sensors.

6. The UV control system according to claim 1, wherein the event detector is to combine the telemetry data and the video stream data by
    determining a number of frames per second for the video stream data,
    determining, from the telemetry data, a time and a location associated with the video stream data, and
    generating a meta tag for each of the frames of the video stream data, wherein the meta tag includes the time and the location associated with the video stream data.
7. The UV control system according to claim 1, wherein
    the first movement plan includes predefined way points and alternate points for the UV based on the identified objective of the mission, and
    the second movement plan includes at least one different predefined way point from the predefined way points and at least one different alternate point from the alternate points for the UV based on the analysis of the identified event to meet the identified objective of the mission.

8. The UV control system according to claim 1, wherein
    the sensor includes a video camera, and
    the mission controller is to
        generate a real-time display from the video camera,
        receive instructions to modify movement of the UV based on an analysis of the real-time display from the video camera, and
        modify movement of the UV based on the received instructions.

9. The UV control system according to claim 1, wherein the UV is an autonomous UV.

10. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event that includes a potential leak by
    converting the video stream data from red, green, and blue (RGB) into corresponding hue-saturation-values (HSVs) to adjust for variations in lighting conditions and shadows,
    defining lower and upper bounds of the HSVs based on a type of material associated with the potential leak,
    analyzing, based on the defined lower and upper bounds, each video frame associated with the video stream data to overlay shapes on each of the video frames associated with the video stream data, and
    applying parameter constraints to determine whether an area of a video frame associated with the overlaid shapes represents the potential leak.

11. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event that includes a potential leak or an intruder related to a pipeline.

12. The UV control system according to claim 1, wherein the event detector is to
    analyze the communication data to identify the event related to a pipeline, and
    generate instructions for preventative actions with respect to the pipeline based on the identification of the event.

13. The UV control system according to claim 1, wherein the mission controller is to
    analyze the event to determine a severity level of the event, and
    generate a real-time display related to the event, wherein the real-time display includes a characterization of a type and the severity level of the event.
14. A method for unmanned vehicle (UV) control, the method comprising:
    generating, by a fleet and mission operations controller that is executed by at least one hardware processor, a mission request to
        identify an objective of a mission,
        assign a UV and a sensor to the mission from a fleet of UVs and sensors, and
        assign a first movement plan to the mission based on the identified objective of the mission;
    controlling, by a mission controller that is executed by the at least one hardware processor, the assigned UV according to the assigned first movement plan;
    receiving, by an event detector that is executed by the at least one hardware processor, communication data that includes telemetry data and video stream data;
    analyzing, by the event detector, the communication data from the assigned sensor by combining the telemetry data and the video stream data to identify an event related to the mission;
    analyzing, by the fleet and mission operations controller, the identified event and the first movement plan;
    assigning, by the fleet and mission operations controller, a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan; and
    controlling, by the mission controller, the assigned UV according to the assigned second movement plan.

15. The method for UV control according to claim 14, wherein analyzing, by the event detector that is executed by the at least one hardware processor, the communication data from the assigned sensor by combining the telemetry data and the video stream data to identify the event related to the mission further comprises:
    analyzing, by the event detector, the communication data to identify the event that includes a potential leak or an intruder related to a pipeline.

16. The method for UV control according to claim 14, further comprising:
    determining, by a compliance evaluator that is executed by the at least one hardware processor, whether the mission request is compliant with regulations, and
    in response to a determination that the mission request is compliant with regulations, forwarding, from the compliance evaluator, the mission request to the mission controller.

17. The method for UV control according to claim 14, further comprising:
    generating, by the mission controller, a real-time display related to the event, wherein the real-time display includes a characterization of a type and a severity level of the event.

18. The method for UV control according to claim 14, further comprising:
    analyzing, by the event detector, the communication data to identify the event related to a pipeline, and
    generating, by the event detector, instructions for preventative actions with respect to the pipeline based on the identification of the event.

19. The method for UV control according to claim 14, wherein the sensor includes a video camera, the method further comprising:
    generating, by the mission controller, a real-time display from the video camera,
    receiving, by the mission controller, instructions to modify movement of the UV based on an analysis of the real-time display from the video camera, and
    modifying, by the mission controller, movement of the UV based on the received instructions.

20. A non-transitory computer readable medium having stored thereon machine readable instructions for UV control, the machine readable instructions when executed cause at least one hardware processor to:
    receive, at a mission controller that is executed by the at least one hardware processor, a mission request that
        identifies an objective of a mission,
        assigns a UV and a sensor to the mission from a fleet of UVs and sensors, and
        assigns a first movement plan to the mission based on the identified objective of the mission;
    control, by the mission controller, the assigned UV according to the assigned first movement plan;
    receive, by an event detector that is executed by the at least one hardware processor, communication data that includes telemetry data and video stream data;
    analyze, by the event detector that is executed by the at least one hardware processor, the communication data from the assigned sensor by combining the telemetry data and the video stream data to identify an event related to the mission;
    analyze, by a fleet and mission operations controller, the identified event and the first movement plan; and
    receive, at the mission controller, a second movement plan for the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan.

Description

Note: Descriptions are shown in the official language in which they were submitted.


UNMANNED VEHICLE (UV) CONTROL SYSTEM AND UV MOVEMENT AND DATA CONTROL SYSTEM
BACKGROUND
[0001] Unmanned vehicles (UVs) such as aerial vehicles (e.g., unmanned aerial vehicles (UAVs), or drones), land vehicles, or even collaborative robots are typically operated without a human aboard. UVs may include three types of platforms based on their ability to control their operation. For example, UVs may be categorized as remote controlled (RC), task following, and semi to fully autonomous.

[0002] RC platform based UVs typically do not include the capability to control the UV behavior, and rely on an external operator to perform tasks. For example, an RC platform based UV may be instructed by an operator who has a line-of-sight to the UV to implement every behavior change, and to guide the UV through each task that is to be performed.
[0003] A task following platform based UV may include the ability to receive instructions on how to perform a task, and then repeat the task until receiving instructions to stop performing the task, or based on the occurrence of an exception that the UV has been preprogrammed to respond to. An operator for a task following platform based UV may monitor the status of the UV, and then report the results of the task following platform based UV's execution. Task following platform based UVs may be operated without a line-of-sight to the UV, even when the UV is being manually controlled by an operator. For example, a video camera mounted on the UV and a wireless video link (e.g., a "first-person-view", or FPV) may allow an operator to control the UV without line of sight.

[0004] A semi or fully autonomous platform (e.g., "smart platform") based UV may receive instructions related to a task. Based on access to real-time sensor data on the UV and a set of objectives that are specified by the instructions, the semi or fully autonomous platform based UV may be deployed to follow the instructions.

SUMMARY

[0004a] In an aspect, there is provided an unmanned vehicle (UV) control system comprising: a fleet and mission operations controller, executed by at least one hardware processor, to receive a UV work order and to generate a mission request based on the UV work order, the mission request identifying an objective of a mission, assigning a UV and a sensor to the mission from a fleet of UVs and sensors, and assigning a first movement plan to the mission based on the identified objective of the mission; a mission controller, executed by the at least one hardware processor, to control the assigned UV according to the assigned first movement plan, and receive communication data from the assigned sensor, wherein the communication data includes telemetry data and video stream data; and an event detector, executed by the at least one hardware processor, to receive the communication data that includes the telemetry data and the video stream data, combine the telemetry data and the video stream data, analyze the combined telemetry data and the video stream data to identify an event related to the mission, and forward the identified event to the fleet and mission operations controller to analyze the identified event and the first movement plan, and assign a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan, wherein the mission controller is to control the assigned UV according to the assigned second movement plan, and wherein the fleet and mission operations controller and the event detector are hosted in an off-site facility relative to the assigned UV.

[0004b] In another aspect, there is provided a method for unmanned vehicle (UV) control, the method comprising: generating, by a fleet and mission operations controller that is executed by at least one hardware processor, a mission request to identify an objective of a mission, assign a UV and a sensor to the mission from a fleet of UVs and sensors, and assign a first movement plan to the mission based on the identified objective of the mission; controlling, by a mission controller that is executed by the at least one hardware processor, the assigned UV according to the assigned first movement plan; receiving, by an event detector that is executed by the at least one hardware processor, communication data that includes telemetry data and video stream data; analyzing, by the event detector, the communication data from the assigned sensor by combining the telemetry data and the video stream data to identify an event related to the mission; analyzing, by the fleet and mission operations controller, the identified event and the first movement plan; assigning, by the fleet and mission operations controller, a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan; and controlling, by the mission controller, the assigned UV according to the assigned second movement plan.
[0004c] In another aspect, there is provided a non-transitory computer readable medium having stored thereon machine readable instructions for UV control, the machine readable instructions when executed cause at least one hardware processor to: receive, at a mission controller that is executed by the at least one hardware processor, a mission request that identifies an objective of a mission, assigns a UV and a sensor to the mission from a fleet of UVs and sensors, and assigns a first movement plan to the mission based on the identified objective of the mission; control, by the mission controller, the assigned UV according to the assigned first movement plan; receive, by an event detector that is executed by the at least one hardware processor, communication data that includes telemetry data and video stream data; analyze, by the event detector that is executed by the at least one hardware processor, the communication data from the assigned sensor by combining the telemetry data and the video stream data to identify an event related to the mission; analyze, by a fleet and mission operations controller, the identified event and the first movement plan; and receive, at the mission controller, a second movement plan for the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan.

BRIEF DESCRIPTION OF DRAWINGS

[0005] Features of the present disclosure are illustrated by way of examples shown in the following figures. In the following figures, like numerals indicate like elements, in which:

[0006] Figure 1 illustrates a detailed architecture of a UV control system, according to an example of the present disclosure;

[0007] Figure 2 illustrates a mission console of the UV control system of Figure 1, according to an example of the present disclosure;

[0008] Figures 3A-3C illustrate an environment for operation of the UV control system of Figure 1, according to an example of the present disclosure;

[0009] Figure 4 illustrates an output of an event detector of the UV control system of Figure 1, according to an example of the present disclosure;

[0010] Figure 5 illustrates an architecture of a mission controller of the UV control system of Figure 1 for processing data from sensors, according to an example of the present disclosure;

[0011] Figure 6 illustrates an architecture of an event detector of the UV control system of Figure 1, according to an example of the present disclosure;

[0012] Figure 7 illustrates a screenshot of an event detection analytics processing video of the UV control system of Figure 1, according to an example of the present disclosure;

[0013] Figure 8 illustrates a screenshot of an event detection analytics processing video of the UV control system of Figure 1, according to an example of the present disclosure;

[0014] Figure 9 illustrates a method for UV control, according to an example of the present disclosure;

[0015] Figure 10 illustrates further details of the method for UV control, according to an example of the present disclosure;

[0016] Figure 11 illustrates a computer system, according to an example of the present disclosure;

[0017] Figure 12 illustrates a detailed architecture of a UV movement and data control system, according to an example of the present disclosure;

[0018] Figure 13 illustrates a logic diagram of components of the UV movement and data control system of Figure 12 for event detection, according to an example of the present disclosure;

[0019] Figure 14 illustrates a method for UV movement and data control, according to an example of the present disclosure;

[0020] Figure 15 illustrates further details of the method for UV movement and data control, according to an example of the present disclosure; and

[0021] Figure 16 illustrates a computer system, according to an example of the present disclosure.

DETAILED DESCRIPTION

[0022] For simplicity and illustrative purposes, the present disclosure is described by referring mainly to examples thereof. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be readily apparent, however, that the present disclosure may be practiced without limitation to these specific details. In other instances, some methods and structures have not been described in detail so as not to unnecessarily obscure the present disclosure.

[0023] Throughout the present disclosure, the terms "a" and "an" are intended to denote at least one of a particular element. As used herein, the term "includes" means includes but not limited to, and the term "including" means including but not limited to. The term "based on" means based at least in part on.

[0024] Unmanned vehicles (UVs) such as aerial vehicles (e.g., unmanned aerial vehicles (UAVs), or drones), land vehicles, or even collaborative robots are typically operated without a human aboard. With improvements in sensors, data analytics capabilities, and programmatic mechanization components, UVs may be used in a variety of ambiguous environments, and for performance of a variety of ambiguous tasks. For example, UVs may be used for package delivery, agriculture, emergency services, pipeline inspection, etc. However, integration of UVs in a workflow involving such areas is limited.
[0025] According to examples, a UV control system and a method for UV control are disclosed herein. The system and method disclosed herein may generally utilize a hardware implemented mission manager and a hardware implemented event detector to assign and manage a mission upon receipt of a work order. The mission manager may maintain knowledge of a fleet of UVs, sensors, and crew, as well as information regarding work order status and mission status. A hardware implemented fleet and mission operations controller may operate in conjunction with the mission manager to translate the work order into a mission request by assigning UVs, sensors, and crew to the mission request, and by identifying a movement plan and an objective for the mission. Once the mission is launched, the event detector may analyze communication data received during the mission, and generate alarms to the mission manager when events that may result in potential problems are detected. The fleet and mission operations controller may operate in conjunction with the mission manager to modify the UV movement plan, and adjust, in real-time, movement plans of the UV based on the events.
[0026] With respect to the mission manager that tracks information regarding UVs, sensors, and UV operation crew, UVs may be tracked, for example, by type, availability, and ability to mount particular sensors. The mission manager may also track sensors by type, availability, and ability to be mounted on particular UVs. UV operation crews may also be tracked by availability and ability to operate particular UVs.
[0027] The fleet and mission operations controller may receive a work order related to a UV mission. According to an example, work orders may be received from various enterprises and cover a variety of applications of UVs. The fleet and mission operations controller may operate in conjunction with the mission manager to translate a work order into a mission request. A mission request may identify, for example, an operation for a UV, a type of UV to complete the operation, at least one type of sensor to be mounted on the UV, a UV operation crew, a movement plan, and/or an objective for the mission. For example, a mission request may indicate that a fixed wing UAV or quadcopter (i.e., types of UAVs) may be equipped with a video camera, a gas detector, an infrared (IR) camera, and/or a pressure sensor to detect leaks in an oil pipeline. A minimal data-structure sketch of such a mission request follows.
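For illustration only, the fields enumerated above map naturally onto a simple record type. The following Python sketch is hypothetical; the class and field names (MissionRequest, MovementPlan, Waypoint, and their members) are assumptions for this example, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Waypoint:
    # Latitude/longitude in degrees, altitude in meters (assumed units).
    lat: float
    lon: float
    alt_m: float

@dataclass
class MovementPlan:
    launch_point: Waypoint
    way_points: List[Waypoint]        # predefined way points
    alternate_points: List[Waypoint]  # alternate rally points

@dataclass
class MissionRequest:
    objective: str                       # e.g., "detect leaks in pipeline section A"
    uv_id: str                           # assigned UV from the fleet
    uv_type: str                         # e.g., "fixed wing UAV" or "quadcopter"
    sensors: List[str]                   # e.g., ["video camera", "gas detector"]
    operation_crew: Optional[List[str]]  # pilot and/or assistant, if any
    movement_plan: MovementPlan
```

A fleet and mission operations controller could populate such a record from a work order and hand it to the compliance evaluator and mission controller.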
[0028] After launching the mission, the UV may follow the movement plan autonomously, or with varying degrees of remote operator guidance from a hardware implemented movement planning controller operated by an operations crew. Sensors mounted onto the UV may transmit data in real-time to a ground station on the field, such as a portable device with a hardware implemented UV data receiver, and the ground station may transmit the data to the event detector, which may be disposed off-site. The event detector may process the data to identify an event. When an event is identified, the event detector may transmit an alarm to the fleet and mission operations controller for further review by a mission operator. The alarm may include information such as an identification of the event, data associated with the event, a location of the event, etc. After reviewing the event, the mission manager may operate in conjunction with the fleet and mission operations controller to generate instructions in real-time with an updated movement plan for a UV operator.

[0029] The system and method disclosed herein may be used in a variety of environments and for a variety of purposes. For example, the system and method disclosed herein may be used to monitor a pipeline in the oil and gas industry. In the oil and gas industry, the system and method disclosed herein may be used in other scenarios, including other types of exploration (e.g., site survey, site drilling, etc.), development (e.g., pad placement, facility rendering, capital project, surveillance, etc.), production (e.g., flare/vent inspection, oil sheen detection, disaster prevention, etc.), manufacturing (e.g., flute/chimney inspection, tank/gas inspection, gas detection, etc.), and transportation (e.g., right of way monitoring, theft monitoring, etc.).
[0030] The system and method disclosed herein may be used in package delivery (e.g., food, medicine, equipment, etc.), aerial surveillance (e.g., police/fire department, cartography, photography, film, journalism, real estate, etc.), exploration (e.g., mine detection, site survey, etc.), research (e.g., wildlife, atmosphere, ocean, etc.), remote sensing (e.g., telecommunications, weather, maritime, construction, etc.), disaster relief (e.g., survivors, explore contaminated areas, etc.), environment (e.g., forest fires, threats, etc.), and agriculture (e.g., spray pesticides, crop growth, disease, irrigation level, wild animals, etc.).
[0031] The system and method disclosed herein may be used for scheduling of predictive maintenance to provide asset inspection, diagnostics, repair, and maintenance work. Further, the system and method disclosed herein may be used, for example, to identify and schedule environmental (e.g., terrain, vegetation, etc.) management. The system and method disclosed herein may also provide for enhancements in safety and environmental protection related to the various activities described herein. For example, with respect to the oil and gas industry, the system and method disclosed herein may be used to protect assets from sabotage, illegal tapping, and terrorist actions in an efficient and economical manner.

[0032] The system and method disclosed herein may be used to analyze data from a UV to determine tasks that may be both electronically and mechanically automated in a workflow, and to identify insights that may be obtained from the data. These insights may be used to drive operational decisions, such as shortening lead time to problem detection, or predictive maintenance with pipelines, for example, in the oil and gas industry. The system and method disclosed herein may provide for the reduction of exposure to hazardous environments, increase efficiency and effectiveness with respect to UV control, and optimize operations.

[0033] Generally, the system and method disclosed herein may be envisioned in a broad range of applications where drones or UVs may be used to reduce cost, increase safety, and increase productivity.
[0034] The system and method disclosed herein may account for aspects related to the state of UV technology, regulation and compliance, readiness, and safety and privacy. With respect to UV technology, the system and method disclosed herein may provide the hardware and software platform and setup for UV control. The system and method disclosed herein may also provide for implementation of aspects such as optimal movement planning operations and life cycle management, selection of specialized sensors, direct data transmission from a UV, UV infrastructure and availability management, task distribution among multiple UVs, and reprioritization of UV objectives. With respect to security, safety, and regulations, the system and method disclosed herein may provide for constraints based on local regulations and certification, UV certification and operator training, requirements regarding reporting of incidents to authorities, obstacle avoidance, authentication and authorization of missions, ensuring that a mission has not been compromised or sabotaged, and protection against misuse. The system and method disclosed herein may also provide for secure transmission of data from the event detector that may be implemented in a cloud environment, end-to-end process integration, analytics requirements based on vertical industry, data storage and security, defining business rules, and redefining workflows to incorporate use of the UVs and availability of new insights into related processes.
[0035] For the system and method disclosed herein, a hardware implemented order generator may generate and/or submit work orders to the fleet and mission operations controller. The hardware implemented order generator may execute machine readable instructions to generate and/or submit the work orders, and/or be implemented to include and/or utilize a cloud based service to generate and/or submit the work orders.

[0036] For the system and method disclosed herein, the fleet and mission operations controller may perform various tasks, such as specification of mission objectives and routes, scheduling of missions, assignment of a mission operator and assistant, assignment of UV equipment, monitoring of missions in progress, and making adjustments to mission requirements. Further, the fleet and mission operations controller may operate in conjunction with the mission manager to generate a mission request from a work order.

[0037] For the system and method disclosed herein, the movement planning controller may plan and execute a mission. Further, the movement planning controller may monitor the FPV to ensure that mission objectives are being met, and adjust mission routes as needed.

[0038] For the system and method disclosed herein, a hardware implemented mission planning controller may manage, for example, a camera gimbal and a video camera, and monitor video capture to ensure quality.
[0039] According to examples disclosed herein, the UV control system may include the hardware implemented fleet and mission operations controller that is executed by at least one hardware processor to receive a UV work order and to generate a mission request based on the UV work order. According to an example, the mission request may include an identification of an objective of a mission, an assignment of a UV and a sensor to the mission from a fleet of UVs and sensors, and an assignment of a first movement plan to the mission based on the identified objective of the mission.

[0040] According to an example, the hardware implemented mission controller that is executed by the at least one hardware processor may control the assigned UV according to the assigned first movement plan, and receive communication data from the assigned sensor.
[0041] According to an example, the hardware implemented event detector that is executed by the at least one hardware processor may analyze the communication data to identify an event related to the mission, and forward the identified event to the hardware implemented fleet and mission operations controller. The mission operations controller may analyze the identified event and the first movement plan, and assign a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission. The second movement plan may be different than the first movement plan. The hardware implemented mission controller may control the assigned UV according to the assigned second movement plan. According to an example, the first movement plan may include predefined way points and alternate points for the UV based on the identified objective of the mission, and the second movement plan may include one or more different predefined way points from the predefined way points and one or more different alternate points from the alternate points for the UV based on the analysis of the identified event to meet the identified objective of the mission. A sketch of one such waypoint substitution follows.
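As one concrete reading of the waypoint substitution described above, the sketch below swaps the waypoint nearest a detected event for a small inspection loop around the event. It reuses the hypothetical Waypoint and MovementPlan types from the earlier sketch; the replanning heuristic, loop shape, and radius are illustrative assumptions, not the patented method.

```python
import math
from typing import List

def distance_m(a: Waypoint, b: Waypoint) -> float:
    # Equirectangular approximation; adequate over short pipeline segments.
    dx = math.radians(b.lon - a.lon) * math.cos(math.radians((a.lat + b.lat) / 2))
    dy = math.radians(b.lat - a.lat)
    return 6371000 * math.hypot(dx, dy)

def replan(first: MovementPlan, event_location: Waypoint,
           loiter_radius_m: float = 50.0) -> MovementPlan:
    """Build a second movement plan that replaces the waypoint closest to the
    detected event with a small loop of inspection points around the event."""
    if not first.way_points:
        return first
    closest = min(range(len(first.way_points)),
                  key=lambda i: distance_m(first.way_points[i], event_location))
    # Four inspection points boxed around the event (assumed pattern).
    d_lat = loiter_radius_m / 111320.0
    d_lon = d_lat / max(math.cos(math.radians(event_location.lat)), 1e-6)
    loop = [Waypoint(event_location.lat + s_lat * d_lat,
                     event_location.lon + s_lon * d_lon,
                     event_location.alt_m)
            for s_lat, s_lon in ((1, 0), (0, 1), (-1, 0), (0, -1))]
    second = first.way_points[:closest] + loop + first.way_points[closest + 1:]
    return MovementPlan(first.launch_point, second, first.alternate_points)
```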
[0042] According to an example, a hardware implemented compliance evaluator that is executed by the at least one hardware processor may determine whether the mission request is compliant with regulations. In response to a determination that the mission request is compliant with regulations, the hardware implemented compliance evaluator may forward the mission request to the hardware implemented mission controller. For example, the hardware implemented compliance evaluator may determine whether the assigned UV and a UV operation crew associated with the mission request are compliant with regulations. The hardware implemented compliance evaluator may also determine, on a continuous or semi-continuous basis, whether additional in-flight changes to a mission, based on sensor requests, deviate from the mission request. A sketch of this gating flow follows.
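The compliance evaluator acts as a gate between the mission request and the mission controller. The sketch below shows one hypothetical shape for that gate, reusing the MissionRequest type sketched earlier; the two example rules stand in for real regulations and safety parameters and are assumptions for illustration.

```python
from typing import Callable, List, Tuple

# A rule inspects a MissionRequest and returns (passed, reason) (assumed shape).
ComplianceRule = Callable[[MissionRequest], Tuple[bool, str]]

def max_altitude_rule(req: MissionRequest) -> Tuple[bool, str]:
    # Illustrative stand-in for a local aviation regulation.
    ceiling_m = 120.0
    too_high = any(w.alt_m > ceiling_m for w in req.movement_plan.way_points)
    return (not too_high, f"waypoint altitude must not exceed {ceiling_m} m")

def crew_assigned_rule(req: MissionRequest) -> Tuple[bool, str]:
    return (bool(req.operation_crew), "a certified operation crew must be assigned")

def evaluate_compliance(req: MissionRequest,
                        rules: List[ComplianceRule]) -> Tuple[bool, List[str]]:
    """Return overall compliance plus reasons for any failed rules. Only a
    compliant request would be forwarded on to the mission controller."""
    failures = [reason for rule in rules
                for passed, reason in [rule(req)] if not passed]
    return (not failures, failures)
```

The same check could be re-run whenever an in-flight change to the movement plan is proposed, matching the continuous re-verification described above.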
[0043] According to an example, the sensor may include a video camera, and the hardware implemented mission controller may generate a real-time display from the video camera, receive instructions to modify movement of the UV based on an analysis of the real-time display from the video camera, and modify movement of the UV based on the received instructions.

[0044] According to an example, the hardware implemented event detector may analyze the communication data to identify the event that includes a potential leak or an intruder related to a pipeline. One color-space approach to such leak detection is sketched below.
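Claim 10 outlines the approach: convert frames from RGB to HSV, threshold on bounds chosen for the leaked material, overlay shapes on matching regions, and apply parameter constraints. The sketch below follows that outline using OpenCV (4.x API assumed); the specific HSV bounds and minimum-area constraint are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

# Illustrative HSV bounds for a dark, oil-like material (assumed values).
LOWER_HSV = np.array([0, 0, 0], dtype=np.uint8)
UPPER_HSV = np.array([180, 255, 60], dtype=np.uint8)
MIN_AREA_PX = 500  # parameter constraint: ignore regions smaller than this

def detect_potential_leak(frame_bgr: np.ndarray) -> list:
    """Return bounding boxes of regions whose HSV values fall inside the
    material-specific bounds and satisfy the area constraint."""
    # HSV is less sensitive than RGB to lighting variation and shadows.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for contour in contours:
        if cv2.contourArea(contour) >= MIN_AREA_PX:
            x, y, w, h = cv2.boundingRect(contour)
            boxes.append((x, y, w, h))
            # Overlay a shape on the frame, as in the claim language.
            cv2.rectangle(frame_bgr, (x, y), (x + w, y + h), (0, 0, 255), 2)
    return boxes
```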
[0045] According to an example, the hardware implemented event detector may analyze the communication data to identify the event related to a pipeline, and generate instructions for preventative actions with respect to the pipeline based on the identification of the event.

[0046] According to an example, the hardware implemented mission controller may generate a real-time display related to the event, where the real-time display includes a characterization of a type and a severity level of the event.
[0047] The UV control system and the method for UV control disclosed herein provide a technical solution to technical problems related, for example, to UV control. The system and method disclosed herein provide the technical solution of the hardware implemented fleet and mission operations controller that is executed by at least one hardware processor to receive a UV work order and to generate a mission request based on the UV work order. The mission request may include an identification of an objective of a mission, an assignment of a UV and a sensor to the mission from a fleet of UVs and sensors, and an assignment of a first movement plan to the mission based on the identified objective of the mission. According to an example, the hardware implemented mission controller that is executed by the at least one hardware processor may control the assigned UV according to the assigned first movement plan, and receive communication data from the assigned sensor. According to an example, the hardware implemented event detector that is executed by the at least one hardware processor may analyze the communication data to identify an event related to the mission, and forward the identified event to the hardware implemented fleet and mission operations controller. The mission operations controller may analyze the identified event and the first movement plan, and assign a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission. The second movement plan may be different than the first movement plan. The hardware implemented mission controller may control the assigned UV according to the assigned second movement plan. According to an example, a hardware implemented compliance evaluator that is executed by the at least one hardware processor may determine whether the mission request is compliant with regulations. In response to a determination that the mission request is compliant with regulations, the hardware implemented compliance evaluator may forward the mission request to the hardware implemented mission controller. For example, the hardware implemented compliance evaluator may determine whether the assigned UV and a UV operation crew associated with the mission request are compliant with regulations.
[0048] The hardware implemented components described herein with respect to the system and method disclosed herein may execute machine readable instructions, and/or be implemented to provide and/or utilize a cloud based service.

[0049] Figure 1 illustrates a detailed architecture of a UV control system 100, according to an example of the present disclosure. The UV control system 100 may include a hardware implemented mission controller 102 that is to perform various operations related, for example, to mission planning, movement planning, and receiving of data from a UV 104. The UV 104 may include a plurality of UVs. The UV 104 may include a sensor 106. The sensor 106 may include a plurality of sensors. The UV 104 may encompass all types of UVs, including a variety of aerial, land, space, and marine UVs. The UV 104 may take off (e.g., for an aerial UV), navigate, capture data, transmit collected data, return, and land without human interaction.
[0050] The sensor 106 may gather data associated with a mission. The sensor 106 may include a variety of types of sensors that may be categorized as sight sensors, sound sensors, touch sensors, smell sensors, position sensors, external communication sensors, proximity sensors, and other (i.e., miscellaneous) sensors. The sight sensors may include sensors for ascertaining light intensity, color, distance (e.g., by infrared (IR), measuring angle of light bounce), video capture, rotation (e.g., optical encoders), and/or light signal read (e.g., infrared codes). The sound sensors may include sensors (e.g., a microphone) for ascertaining volume (e.g., decibel meter), frequency measurement, and/or distance (e.g., sonar, measuring time to echo). The touch sensors may include sensors for ascertaining position awareness (e.g., collision alert, contact confirmation, etc.), bend/strain, temperature, and/or pressure (e.g., barometric, grip strength, etc.). The smell sensors may include sensors such as gas sensors, alcohol sensors, etc. The position sensors may include sensors (e.g., accelerometer, digital compass, gyroscope) for ascertaining location (e.g., based on global positioning system (GPS), proximity to a beacon, etc.), and/or tilt. The external communication sensors may include sensors for ascertaining radio communication, and/or IR codes. The proximity sensors may include sensors to ascertain nearness in space, time, and/or relationship. The miscellaneous sensors may include sensors for ascertaining date and time (e.g., ultra-low frequency (ULF) updates), network communication status, and/or voltage (e.g., low fuel, low battery). A compact encoding of this taxonomy is sketched below.
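For illustration, the taxonomy above can be encoded as a simple lookup that a mission manager might consult when matching sensors to mission objectives. The category names and examples come from the paragraph above; the dictionary structure and helper function are assumptions.

```python
# Sensor categories mapped to representative capabilities (from paragraph [0050]).
SENSOR_TAXONOMY = {
    "sight": ["light intensity", "color", "IR distance", "video capture",
              "rotation", "light signal read"],
    "sound": ["volume", "frequency measurement", "sonar distance"],
    "touch": ["position awareness", "bend/strain", "temperature", "pressure"],
    "smell": ["gas", "alcohol"],
    "position": ["GPS location", "beacon proximity", "tilt"],
    "external_communication": ["radio communication", "IR codes"],
    "proximity": ["nearness in space", "nearness in time", "relationship"],
    "miscellaneous": ["date and time", "network status", "voltage"],
}

def categories_for(capability: str) -> list:
    """Find which sensor categories provide a required capability."""
    return [cat for cat, caps in SENSOR_TAXONOMY.items()
            if any(capability in c for c in caps)]
```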
[0051] The UV 104 may also include various components for processing and for generating outputs. For example, with respect to processing, the UV 104 may provide for sensor data processing for analog and digital input/output (I/O), kinematics (e.g., position and orientation of objects), proportional-integral-derivative (PID) feedback control, rules application (e.g., if this, do that), navigation (e.g., move to a waypoint), mission execution (e.g., manage multiple waypoints), telemetry management (e.g., summarizing telemetry data), counters, audio/voice processing (e.g., speech to text, text to speech), date/time management, and data management (e.g., memory, disk, etc.). With respect to outputs, the UV 104 may provide for movement, motors (e.g., servos, stepper, brushless), hydraulics, pneumatics, gravity release, visual indicators/feedback (e.g., LEDs, LCDs, displays), audio indicators/feedback (e.g., speaker, buzzer, etc.), voltage change (e.g., not in use, go to low power mode), and external communication subsystems (e.g., radio, IR codes).
[0052] In the example of Figure 1, the UV 104 and the mission controller 102 may be disposed in a field (e.g., above dashed line 108), whereas the fleet and mission operations controller 110, the mission manager 112, the compliance evaluator 114, and the event detector 116 may be hosted in an off-site facility (e.g., below the dashed line 108), such as a cloud environment 118. In some examples, the cloud environment 118 may be a data center or another distributed network capable of processing relatively large amounts of data in real time. In other examples, the determination of which components of the UV control system 100 are located in an off-site facility may be based, for example, on the hardware capabilities of chips installed on the UV 104, a size and power associated with the UV 104, and processing requirements of a mission executed by the UV 104.
[0053] The mission planning controller 120 may enable the UV 104 to be programmed to run autonomously. The UV 104 may be equipped with the sensor 106 and intelligence to maintain altitude and a stabilized flight (e.g., for an aerial UV). The sensor 106 may be used to determine the position and altitude of the UV 104 at any given point in time. This enables the UV 104 to navigate between two points according to pre-defined waypoints, without any human interaction during the flight (e.g., for an aerial UV). The mission planning controller 120 may generate a display of the mission details that may be viewed by a UV operation crew (e.g., a pilot and/or assistant).

[0054] The movement planning controller 122 may be used to launch the UV 104, and control the UV flight path (e.g., for an aerial UV) and associated sensors. Once the UV 104 begins its movement plan from the launch point, the mission planning controller 120 may communicate with the mission manager 112 to indicate the beginning of the mission. According to an example, the mission controller 102 may be stored on a tablet or another portable device.
[0055] A hardware implemented UV data receiver 124 may be used to receive various types of communication data from the UV 104. The communication data may be used, for example, by the event detector 116 to determine events related to an objective of the mission.
[0056] The fleet and mission operations controller 110 may receive a work order from the order generator 130. The work order may identify a problem detected, for example, at a particular location or region of a pipeline that requires further exploration. The mission manager 112 may maintain information regarding UVs and sensors in inventory. For example, the mission manager 112 may track UVs by type, availability, and an ability to mount particular sensors. The mission manager 112 may also track sensors by type, availability, and ability to be mounted on a particular UV. A minimal sketch of this inventory matching follows.
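Inventory tracking of this kind reduces to filtering fleet records on availability and mutual mountability. The sketch below is a minimal, hypothetical illustration; the record shapes and first-fit matching rule are assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class UvRecord:
    uv_id: str
    uv_type: str                  # e.g., "fixed wing" or "quadcopter"
    available: bool
    mountable_sensors: List[str]  # sensor types this UV can carry

@dataclass
class SensorRecord:
    sensor_id: str
    sensor_type: str              # e.g., "video", "gas", "IR", "pressure"
    available: bool
    mountable_on: List[str]       # UV types this sensor can be mounted on

def match_uv_and_sensor(uvs: List[UvRecord], sensors: List[SensorRecord],
                        needed_sensor_type: str
                        ) -> Optional[Tuple[UvRecord, SensorRecord]]:
    """Pick the first available UV/sensor pair that are mutually mountable."""
    for sensor in sensors:
        if sensor.available and sensor.sensor_type == needed_sensor_type:
            for uv in uvs:
                if (uv.available
                        and sensor.sensor_type in uv.mountable_sensors
                        and uv.uv_type in sensor.mountable_on):
                    return uv, sensor
    return None
```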
[0057] The fleet and mission operations controller 110 may operate in conjunction with the mission manager 112 to convert the UV work order 126 to a mission request 128 (see Figure 3A). For an aerial UV, the mission request 128 may specify, for example, a flight time, a flight plan, and equipment (e.g., the specific UV, sensors, and any UV operation crew). The flight plan may include a launch point, predefined way points, alternate rally points, payload requirements, video or other data gathering requirements, payload operation instructions, and/or mission objectives.

[0058] The compliance evaluator 114 may confirm whether the mission request complies with regulations (e.g., government regulations) governing the use of UVs, as well as with other policies related to UVs.
[0059] The mission manager 112 may schedule and assign the mission. Specifically, the mission manager 112 may assign the UV 104 (or a plurality of UVs), the sensor 106 (or a plurality of sensors), and any UV operation crew to a location for completing the mission request.
[0060] The mission controller 102 may receive the mission request from the mission manager 112, and operate the assigned UV 104 according to the movement plan. The UV 104 may follow the movement plan autonomously or with varying degrees of remote operator guidance from the movement planning controller 122 that may be operated by a UV operation crew.

[0061] Data from the sensor 106 may be received at the UV data receiver 124, and forwarded (e.g., pushed) in real-time to the event detector 116. Alternatively or additionally, data from the sensor 106 may be communicated directly to the event detector 116 based on the placement of hardware associated with the event detector 116 near the edge of the field (e.g., dashed line 108) or within the field. A sketch of this push-style forwarding follows.
[0062] The event detector 116 may interpret the data from the sensor 106 in real-time to detect any events or potential problems that warrant further exploration. The event detector 116 may include, for example, event processing, video stream playback, facial recognition, blob detection, and general inspection for the pipeline example described herein. However, those skilled in the art will appreciate in view of this disclosure that the processing capabilities of the event detector 116 may vary depending on the purpose of the mission and the types of sensors that are used for the UV 104.

[0063] If an event is detected, the event detector 116 may generate an alert and forward the alert to the fleet and mission operations controller 110. Further, data associated with the event may be displayed in real-time at the fleet and mission operations controller 110. The data associated with the event may be analyzed by the fleet and mission operations controller 110 and the mission manager 112. Based on the analysis of the data, the fleet and mission operations controller 110 may operate in conjunction with the mission manager 112 to communicate to the mission controller 102 a change in movement plan or other further instructions, such as a notification that the mission is complete and an instruction to dismount the sensors from the UV 104 and leave the field.
[0064] Figure 2 illustrates a mission console 200 of the UV control system 100, according to an example of the present disclosure. The mission console 200 may be used to display various types of status related to a mission, such as work order status and mission status. The mission manager 112 may manage work orders. For example, the mission manager 112 may track a total number of the work orders, as well as a status of each work order, such as under review, scheduled for mission, mission in progress, and mission completed. The mission manager 112 may also track the UVs and sensors in inventory. For example, for each UV 104 and sensor 106, the mission manager 112 may track a status (e.g., available, in maintenance, or assigned), a type (e.g., fixed wing or quadcopter for a vehicle, and pressure, gas, IR, or video for a sensor), and a location. The mission manager 112 may also keep track of which types of UVs each sensor may be used on. For example, IR cameras and their associated gimbals may be mounted on a specific UV. The mission manager 112 may provide for the selection of a particular available UV and available sensors based on mission objectives.
[0065] Figures 3A-3C illustrate an environment for operation of the UV control system 100, according to an example of the present disclosure.

[0066] Referring to Figure 3A, at 300, a UV work order 126 may be received at the fleet and mission operations controller 110 from the order generator 130. In the example of Figures 3A-3C, the UV work order 126 may be related to monitoring of pipeline conditions and other characteristics of interest. The UV work order 126 may include, for example, a mission date, a mission time, and/or a mission objective (e.g., check pipeline sections A, B, and C for leaks). For example, the UV work order 126 may include a request for inspection which states that a potential leak was detected in a certain region of the pipeline and warrants further exploration.
[0067] At 302, the fleet and mission operations controller 110 may convert the UV work order 126 to the mission request 128. The mission request 128 may be determined based on a time for a mission, and include aspects related to compliance requirements for the mission, selection of the UV 104 with specified equipment, selection of a mission operator and assistant, identification of an objective for the mission (e.g., to identify a potential leak), and specification of a required movement path. The mission request 128 may be sent to the mission manager 112 for further analysis.

[0068] At 304, the mission manager 112 may forward the mission request 128 to the compliance evaluator 114 to determine whether the mission request 128 is in compliance with regulations (e.g., government or other regulations). For example, the compliance evaluator 114 may determine whether equipment associated with the UV 104 as well as any UV operation crew are in compliance with regulations.
[0069] At 306, in response to a determination that the mission request 128 is not in compliance with regulations, the mission request 128 may be returned to the order generator 130, where a modified UV work order 126 may be re-submitted to the fleet and mission operations controller 110.

[0070] At 308, in response to a determination that the mission request 128 is in compliance with regulations, at 310, the mission request 128 may be forwarded to the mission controller 102. The mission request 128 may include information such as the UV identification (ID), the UV type, the specific sensor, the time of the mission request, and the movement plan information (e.g., for an aerial UV), which includes the launch point, predefined way points, and alternate points, and defines payload requirements (e.g., video camera), payload operation instructions, and mission objectives. At the mission controller 102, the mission planning controller 120 may generate a display of the mission requirements. The mission planning controller 120 may further generate the display of the objectives for the mission, and a movement path for the mission.
[0071] Referring to Figures 3A and 3B, at 312, the mission controller 102
may
launch the mission. With respect to launch of the mission, the mission
controller
102 may define the mission movement plan, and operate the UV 104 using
movement planning machine readable instructions and ground control. The UV
104 may be operated by a UV operation crew, and/or by the mission controller
102.
Further, the mission controller 102 may monitor the real-time movement and
display an FPV to ensure the UV 104 completes the movement path and captures
data specified in the mission objective. For example, the mission controller
102
may control a camera mounted on the UV 104 for accurate video capture using
movement planning machine readable instructions. The FPV may be monitored by
a mission operator.
[0072] At 314, communication data including telemetry data 132 and
video
stream data 134 may be received at the mission controller 102 from the UV 104.
The telemetry data 132 and the video stream data 134 may include a video feed
or
other real-time data depending on the mission and data collected.
[0073] With respect to the aerial UV 104, as the flight is in progress,
at 316, the
telemetry data 132 and the video stream data 134 may be forwarded to the event
detector 116. The event detector 116 may analyze the telemetry data 132 and
the
video stream data 134, detect any events 140 related to the UV 104, and notify
the
fleet and mission operations controller 110 of any problems related to the
detected
events (e.g., based on alerts). The mission and alerts may be displayed in a
real
time display at the fleet and mission operations controller 110. For example,
with
respect to pipeline monitoring, the alerts may be related to an intruder,
pipeline
maintenance, vegetation, etc. For example, with respect to pipeline
monitoring, at
318, the event detector 116 may identify a potential leak at a location-X of a
section-Y of a pipeline. Further, at 320, the event detector 116 may generate
a
notification to the fleet and mission operations controller 110 of the
potential leak,
and forward an associated video of the leak. The event detector 116 may also
generate a notification to the fleet and mission operations controller 110
with
respect to any differences between where the leak was detected versus where
the
leak was actually seen during the mission.
[0074] At 322, if deemed appropriate, the fleet and mission
operations
controller 110 may send instructions to the mission controller 102 to change a movement path while the mission is in progress. Any change to a movement path
(or other mission aspects generally) may be re-verified for compliance by the
compliance evaluator 114. The mission planning controller 120 may display any
adjustment to the movement plan. For example, the movement plan may be
modified by directing the UV 104 to a new way point and awaiting instructions
to
complete the mission. The movement planning controller 122 may be used to
complete the mission. For example, processing hardware such as an SD card or another type of video card may be loaded into the mission planning controller
120,
and upon completion of the mission, the processing hardware associated with
the
UV 104 may be shipped to an entity associated with the order generator 130
(e.g.,
the entity for which the UV work order 126 is generated). At the completion of
the
mission, the UV work order 126 may be processed as being completed, and the
mission controller 102 may provide feedback on potential predictive
maintenance
to the entity associated with the order generator 130.

[0075] Upon completion of the mission, the mission manager 112 may
document the results of the mission for further processing and analytics. The
event
detector 116 may also provide recommendations on predictive maintenance for
use in future missions. The recommendations may be specified in a report
format,
which may be used for further preventive actions, such as shutting off the
upstream pipelines to avoid further leakage.
[0076] Figure 4 illustrates an output of an event detector of the
UV control
system 100, according to an example of the present disclosure. The fleet and
mission operations controller 110 may generate a display of different types of
events 140 as a function of time. According to an example, with respect to
pipeline
monitoring, a unique shape may be used to indicate a particular type of event.
For
example, a potential leak may be represented as a star, a potential intruder
may be
represented as a square, and land subsidence may be represented as a circle.
Similarly, color coding may be used to indicate a severity level of an event.
For
example, a red color may indicate a critical severity, an orange color may
indicate a
major severity, and a blue color may indicate a minor severity. Any number of
techniques may be used to graphically depict events and information relating
to the
events.
[0077] Referring to Figure 4, the y-axis may be separated into
different graphs,
each representing subsections of a pipeline, with the subsections being
denoted as
regions. Within each graph, a location of an event may reflect an overall risk
score
associated with the event. For example, a relatively high risk score may be
represented as a higher location, and a relatively low risk score may be
represented as a lower location. The risk score may be based upon the type of
risk
the problem presents as well as results of the analytics processing.
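As an illustrative sketch of such a display (assuming Python with the matplotlib library; the event values, shapes, and colors below are hypothetical examples, not part of the original disclosure):

import matplotlib.pyplot as plt

# Hypothetical detected events: (time in minutes, risk score, type, severity)
events = [
    (5, 0.9, "leak", "critical"),
    (12, 0.4, "intruder", "major"),
    (20, 0.2, "subsidence", "minor"),
]
markers = {"leak": "*", "intruder": "s", "subsidence": "o"}  # star, square, circle
colors = {"critical": "red", "major": "orange", "minor": "blue"}

for t, risk, etype, severity in events:
    plt.scatter(t, risk, marker=markers[etype], c=colors[severity], s=120)
plt.xlabel("Mission time (minutes)")
plt.ylabel("Risk score")
plt.title("Detected events for a pipeline region")
plt.show()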
[0078] An event may be selected to generate a further view of a source
video
(e.g., the source video 400) that caused the event detector 116 to detect and
classify the event. According to an example, analytics may be performed on the source video by the event detector 116 and displayed as an analytics processing
processing
video. For example, an analytics processing video may include voice over
describing the analytics performed on source video. The event detector 116 may
generate a notification that is sent to the fleet and mission operations
controller
110, where the fleet and mission operations controller 110 may perform actions such as changing the movement plan in real-time to make further observations
related to the potential threat. Any change to a movement plan may be re-
verified
for compliance by the compliance evaluator 114.
[0079] Figure 5 illustrates architecture of the mission controller 102 for
processing data from sensors (e.g., including the sensor 106), according to an example of the present disclosure. As described herein with reference to
Figure 6,
the processed data may be transmitted to the event detector 116 for further
analysis. Referring to Figure 5, data from the sensor 106 of the UV 104 may be
received at the mission controller 102. For example, telemetry data 132 and
video
stream data 134 may be received at the mission controller 102. The telemetry
data
132 may include information such as a UV location (e.g., to infer latitude,
longitude,
and height), attitude relative to earth, and an associated time stamp for the
location
measurement. The telemetry data 132 may be received from the sensor 106 on
the UV 104 at the mission planning controller 120 as the UV 104 is operated by
the
movement planning controller 122. The telemetry data 132 may be processed by a
telemetry data analyzer 136 of the UV data receiver 124. Similarly, the video
stream data 134 may be received, for example, from a camera installed on the
UV
104, by the movement planning controller 122, and processed by a streaming
data
analyzer 138. Various other types of data may be received and pre-processed at the mission controller 102 for the event detector 116.
[0080] Referring to Figure 5, micro air vehicle link (MAVLINK) may
represent
an open source communications protocol used for telemetry communications
between open source ground stations and UV flight controllers (e.g., the
mission
controller 102). MAVLINK may use a packet based communication that
standardizes packets and types for communicating a large number of UV flight
control, position, attitude, status, and other relevant data. The MAVLINK shim
may
be used to interrupt or inject MAVLINK packets into a current communications
stream between a personal computer (PC) based ground control station and a UV
flight controller. The MAVLINK shim may provide for additional analytics and
control machine readable instructions components described herein to send
commands over long distance telemetry radios to the UV and vice versa.
Further,
the MAVLINK shim may operate without interrupting the MAVLINK stream used for
communication between a ground station and the UV, and bifurcate off the same
information so that the information may be sent via a Transmission Control
Protocol/Internet Protocol (TCP/IP) stream to the telemetry data analyzer 136. Open Source Computer Vision (OPENCV) may represent an open source video
analytics library used to read images as a video stream from the UV in real-
time,
and feed the images into the additional analytics components described herein.
Libdc1394 & 1394 may represent serial communications libraries used in
programming. National Television System Committee (NTSC), Phase Alternating
Line (PAL), and Sequential Color with Memory (SECAM) may refer to
international
standards for video transmission and decoding.
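A minimal sketch of such a shim, assuming Python with the open source pymavlink library; the port numbers, host names, and the choice of the GLOBAL_POSITION_INT message are illustrative assumptions:

import json
import socket
from pymavlink import mavutil

# Listen for MAVLINK telemetry from the UV (e.g., over a UDP telemetry radio link).
mav = mavutil.mavlink_connection("udpin:0.0.0.0:14550")

# Bifurcate the same information over a TCP/IP stream to the telemetry data analyzer.
analyzer = socket.create_connection(("localhost", 9000))

while True:
    msg = mav.recv_match(type="GLOBAL_POSITION_INT", blocking=True)
    if msg is None:
        continue
    record = {
        "time_ms": msg.time_boot_ms,
        "latitude": msg.lat / 1e7,     # MAVLINK reports degrees * 1e7
        "longitude": msg.lon / 1e7,
        "altitude": msg.alt / 1000.0,  # millimeters to meters
    }
    analyzer.sendall((json.dumps(record) + "\n").encode())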
[0081] Referring to Figure 6, the event detector 116 may include a slice or
container 602 and an event orchestrator 604. The slice 602 may include a
hardware implemented data integrator 606 and a hardware implemented data
analyzer 608. The data integrator 606 may fuse the various data types received from the sensor 106 through the mission controller 102. For example, the data
integrator 606 may combine the video stream data 134 at 30 frames per second
with the telemetry data 132 (e.g., time and location) one frame at a time.
After
exiting the data integrator 606, each frame of video stream data 134 may
include
time and location information in a meta tag, and may be passed to the data
analyzer 608.
The data analyzer 608 may include various applications (i.e., machine readable
instructions) for processing the various types of data for events. For
example, with
respect to an oil and gas detection application 610, the data analyzer 608 may include a blob detector for detecting oil leaks, vegetation or intruders, an oil
oil
detector, a face detector, an event detector, and/or other sub-components for
image recognition. The various components such as USB and RS232 represent
communications protocols used over serial interfaces, TCP/IP represents the
global standard Internet/networking protocol used for computer communications,
and the OpenCV and OpenNI components represent open source libraries and
machine readable instructions used for development of the data integrator 606.
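A minimal sketch of this frame-by-frame fusion, assuming Python with OpenCV; the telemetry callable and the meta-tag field names are hypothetical:

import cv2

def integrate_frames(video_source, telemetry):
    # Fuse ~30 fps video with telemetry (time and location) one frame at a time.
    # `telemetry` is an assumed callable returning (time, latitude, longitude).
    capture = cv2.VideoCapture(video_source)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        timestamp, lat, lon = telemetry()
        # Each frame leaves the integrator with a time-and-location meta tag
        # before being passed to the data analyzer.
        yield {"meta": {"time": timestamp, "lat": lat, "lon": lon},
               "frame": frame}
    capture.release()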
[0082] The detected events may be forwarded to the event orchestrator
604.
The event orchestrator 604 may publish the events to the fleet and mission
operations controller 110.
[0083] Upon the launch of the UV 104 by the movement planning controller
122,
a mission planner may initiate a request for a mission session with the event
detector 116. An example of a mission planner may include MAVLINK Shim. The
mission planning controller 120 may receive the request from the mission
planner,
and generate a request for the initiation of a session at the event
orchestrator 604.
Session initiation may include allocation of a data integrator and multiple
data
analyzers, and connecting the data integrator and the data analyzers to a
receiver
for data capture. The event orchestrator 604 may receive the events and
publish
the events for use at a dashboard for the mission controller 102.
[0084] Figures 7 and 8 illustrate screenshots of event detection
analytics
processing videos of the UV control system 100, according to an example of the
present disclosure. Referring to Figure 7, the event detection analytics
processing
video may represent a spill or blob detection analytics processing video.
Similarly,
referring to Figure 8, the event detection analytics processing video may
represent
an intruder detection analytics processing video.
[0085] Referring to Figures 5-7, with respect to the oil and gas
detection
application 610, spill or blob detection analytics may include converting
source
video frame (and its pixels), for example, from red, green, and blue (RGB)
into its
corresponding hue-saturation-values (HSVs) to adjust for variations in
lighting
conditions and shadows. The data analyzer 608 may define lower and upper
bounds of the HSVs depending on the type of material and the time of the day
so
that pixels with the correct HSV may be extracted. The data analyzer 608 may
analyze each video frame, apply a binary mask (e.g., accepted pixels are
black,
others are white) to extract portions of the frame that fit within the defined
HSV
requirements. By using, for example, a built-in blob detection library, the
data
analyzer 608 may extract the positions of all of the blobs in the binary mask,
and
use these positions to overlay as circles on top of the original video (see
Figure 7).
Simultaneously, parameter constraints may be placed on the extracted blobs and
include, for example, area, perimeter, circularity, max_x, and max_y. The
max_x
and max_y may represent the maximum distance from the pipeline used for
consideration based on the assumption that the spill should not be too far
away
from the pipeline. Once a blob fits the above requirements with a high degree
of
confidence, the data analyzer 608 may classify the blob as a potential spill,
and
forward the indication to the event orchestrator 604. For example, in Figure
7, a
grate may be classified as a potential spill.
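A minimal sketch of this HSV masking and blob extraction, assuming Python with OpenCV; the HSV bounds, area, circularity, max_x, and max_y values are illustrative placeholders:

import cv2
import numpy as np

def detect_spills(frame, lower_hsv, upper_hsv, max_x, max_y):
    # Work in HSV rather than RGB to adjust for lighting and shadows.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Binary mask per the description above: accepted pixels black, others white.
    mask = cv2.bitwise_not(
        cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv)))

    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 500           # illustrative area constraint
    params.filterByCircularity = True
    params.minCircularity = 0.2    # illustrative circularity constraint
    detector = cv2.SimpleBlobDetector_create(params)  # finds the dark (accepted) blobs

    spills = []
    for keypoint in detector.detect(mask):
        x, y = keypoint.pt
        if x <= max_x and y <= max_y:  # spill should not be far from the pipeline
            spills.append(keypoint)
            # Overlay a circle on top of the original video, as in Figure 7.
            cv2.circle(frame, (int(x), int(y)), int(keypoint.size), (0, 0, 255), 2)
    return spills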
[0086] Referring to Figures 5, 6, and 8, the oil and gas detection
application 610
may include intruder analytics that include detection and recognition. With
respect
to detection, depending on the type of objects being detected and/or tracked,
different cascading classifiers may be used. Cascading classifiers may
represent a
concatenation of a plurality of classifiers to test for different features
that factor into
a single detection problem. An object may be considered as detected if it
passes
all the classifiers within the cascade (i.e., by using a cascade-of-rejectors
approach). The classifiers may be trained using a large set of training data
at
various angles, or at an expected angle appropriate to the UV 104 (e.g., an
angle
that is equivalent to above eye level). Examples of cascading classifiers
that
are used for face and person detection may respectively include the HAAR
cascade and the histograms of oriented gradients (HOG) cascade, and other such techniques. The HOG cascade may use a dense grid of HOGs that are
determined over blocks of pixels to represent a detection window. The HOG
cascade technique may include a speed limitation depending on the sparse
scanning technique and how many windows may be analyzed per frame per
second. These detection windows may represent the functional units used to
learn
features. The actual feature selection process may be performed, for example,
by
the Adaptive Boosting technique, or other such techniques.
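A minimal sketch of person and face detection with the HOG and HAAR cascades, assuming Python with OpenCV; the scan parameters are illustrative:

import cv2

# HOG cascade with the default people detector (a dense grid of HOGs over
# blocks of pixels, scanned as detection windows).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

# HAAR cascade for face detection, using a classifier bundled with OpenCV.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_people_and_faces(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # A detection must pass every stage of its cascade
    # (the cascade-of-rejectors approach).
    people, _ = hog.detectMultiScale(gray, winStride=(8, 8))
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return people, faces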
[0087] With respect to facial recognition, examples of techniques may
include
the FISHERFACES technique, Linear Discriminant Analysis (LDA), or other such
techniques. The FISHERFACES technique may be used for dimensionality
reduction and classification. A set of training data may be fed to perform
cross
principal component analysis (PCA) with the detected features (e.g., face,
person,
eyes, etc.).
[0088] With respect to intruder detection, intruder detection may
include using
the HOG cascading classifier for person detection, setting the max and min
acceptable area of the desired object, gray scaling each video frame for
faster
motion detection, applying, for example, a KALMAN filter to consistently take
in
series of measurements of detected persons (e.g., width, height, positions
etc.
stored within a standard vector), and filtering out those entries with
measurements
that are not consistent with past entries. For example, the probability of a person's position transitioning to the next position decreases exponentially as the difference between the two positions grows; a difference above a threshold would be unreasonable for the person to travel in a specified period of time. Intruder detection may also
account
for a dimension change over measurements. For example, as a person is walking
towards a camera of the UV 104, the dimensions of the person should increase
proportionally, and outlying measurements may be eliminated.
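A minimal sketch of the measurement filtering, assuming Python with OpenCV's KalmanFilter; the state layout and the jump threshold are illustrative assumptions:

import cv2
import numpy as np

# State: (x, y, w, h) plus their velocities; measurements are (x, y, w, h).
kalman = cv2.KalmanFilter(8, 4)
kalman.measurementMatrix = np.eye(4, 8, dtype=np.float32)
kalman.transitionMatrix = np.eye(8, dtype=np.float32)
for i in range(4):
    kalman.transitionMatrix[i, i + 4] = 1.0   # position += velocity each step
kalman.processNoiseCov = np.eye(8, dtype=np.float32) * 1e-3

def update(measurement, max_jump):
    # Fold one detected-person measurement in; filter out entries that are
    # not consistent with past entries (unreasonable travel or dimension jumps).
    predicted = kalman.predict()
    jump = np.linalg.norm(np.asarray(measurement[:2]) - predicted[:2].ravel())
    if jump > max_jump:
        return predicted   # outlying measurement: keep the prediction instead
    return kalman.correct(np.array(measurement, dtype=np.float32).reshape(4, 1))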
[0089] The elements of the UV control system 100 described herein may be
machine readable instructions stored on a non-transitory computer readable
medium. In addition, or alternatively, the elements of the UV control system
100
may be hardware or a combination of machine readable instructions and
hardware.
[0090] Figures 9 and 10 illustrate flowcharts of methods 900 and 1000
for UV
control, according to examples. The methods 900 and 1000 may be implemented
on the UV control system 100 described above with reference to Figures 1-8 by
way of example and not limitation. The methods 900 and 1000 may be practiced
in
other systems.
[0091] Referring to Figures 1 and 9, at block 902, the method 900
may include
generating, by a hardware implemented fleet and mission operations controller
that
is executed by at least one hardware processor, a mission request to identify
an
objective of a mission, assign a UV and a sensor to the mission from a fleet
of UVs
and sensors, and assign a first movement plan to the mission based on the
identified objective of the mission. For example, referring to Figures 1, 3A-
3C, and
11, the hardware implemented fleet and mission operations controller 110 that
is
executed by at least one hardware processor (e.g., the hardware processor
1102),
may generate the mission request 128 to identify an objective of a mission,
assign
a UV 104 and a sensor 106 to the mission from a fleet of UVs and sensors, and
assign a first movement plan to the mission based on the identified objective
of the
mission.
[0092] At block 904, the method 900 may include controlling, by a
hardware
implemented mission controller that is executed by the at least one hardware
processor, the assigned UV according to the assigned first movement plan. For
example, referring to Figures 1, 3A-3C, and 11, the hardware implemented
mission
controller 102 that is executed by the at least one hardware processor may
control
the assigned UV according to the assigned first movement plan.
[0093] At block 906, the method 900 may include analyzing, by a
hardware
implemented event detector that is executed by the at least one hardware
processor, communication data from the assigned sensor to identify an event
related to the mission. For example, referring to Figures 1, 3A-3C, and 11,
the
hardware implemented event detector 116 that is executed by the at least one
hardware processor may analyze communication data (e.g., the telemetry data
132
and the video stream data 134) from the assigned sensor to identify an event
(e.g.,
one of the events 140) related to the mission.
[0094] At block 908, the method 900 may include analyzing, by the
hardware
implemented fleet and mission operations controller, the identified event and
the
first movement plan. For example, referring to Figures 1, 3A-3C, and 11, the
hardware implemented fleet and mission operations controller 110 may analyze
the
identified event and the first movement plan.
[0095] At block 910, the method 900 may include assigning, by the
hardware
implemented fleet and mission operations controller, a second movement plan to
the mission based on the analysis of the identified event and the first
movement
plan to meet the identified objective of the mission, where the second
movement
plan is different than the first movement plan. For example, referring to
Figures 1,
3A-3C, and 11, the hardware implemented fleet and mission operations
controller
110 may assign a second movement plan to the mission based on the analysis of
the identified event and the first movement plan to meet the identified
objective of
the mission, where the second movement plan is different than the first
movement
plan.

[0096] At block 912, the method 900 may include controlling, by the
hardware
implemented mission controller, the assigned UV according to the assigned
second
movement plan. For example, referring to Figures 1, 3A-3C, and 11, the
hardware
implemented mission controller 102 may control the assigned UV according to
the
assigned second movement plan.
[0097] Referring to Figures 1 and 10, at block 1002, the method 1000
may
include receiving, at a hardware implemented mission controller that is
executed by
the at least one hardware processor, a mission request that identifies an
objective
of a mission, assigns a UV and a sensor to the mission from a fleet of UVs and
sensors, and assigns a first movement plan to the mission based on the
identified
objective of the mission. For example, referring to Figures 1, 3A-3C, and 11,
the
hardware implemented mission controller 102 may receive the mission request
128
that identifies an objective of a mission, assigns the UV 104 and the sensor
106 to
the mission from a fleet of UVs and sensors, and assigns a first movement plan
to
the mission based on the identified objective of the mission.
[0098] At block 1004, the method 1000 may include controlling, by the
hardware implemented mission controller, the assigned UV according to the
assigned first movement plan. For example, referring to Figures 1, 3A-3C, and
11,
the hardware implemented mission controller 102 may control the assigned UV
according to the assigned first movement plan.
[0099] At block 1006, the method 1000 may include analyzing, by a
hardware
implemented event detector that is executed by the at least one hardware
processor, communication data from the assigned sensor to identify an event
related to the mission. For example, referring to Figures 1, 3A-3C, and 11,
the
hardware implemented event detector 116 may analyze communication data from
the assigned sensor to identify an event related to the mission.
[0100] At block 1008, the method 1000 may include analyzing, by a hardware
implemented fleet and mission operations controller, the identified event and
the
first movement plan. For example, referring to Figures 1, 3A-3C, and 11, the
hardware implemented fleet and mission operations controller 110 may analyze
the
identified event and the first movement plan.
[0101] At block 1010, the method 1000 may include receiving, at the
hardware
implemented mission controller, a second movement plan for the mission based
on
the analysis of the identified event and the first movement plan to meet the
identified objective of the mission, where the second movement plan is
different
than the first movement plan. For example, referring to Figures 1, 3A-3C, and
11,
the hardware implemented mission controller 102 may receive a second movement
plan for the mission based on the analysis of the identified event and the
first
movement plan to meet the identified objective of the mission, where the
second
movement plan is different than the first movement plan.
[0102] Figure 11 shows a computer system 1100 that may be used with the
examples described herein. The computer system may represent a generic
platform that includes components that may be in a server or another computer
system. The computer system 1100 may be used as a platform for the system
100. The computer system 1100 may execute, by a processor (e.g., a single or
multiple processors) or other hardware processing circuit, the methods,
functions
and other processes described herein. These methods, functions and other
processes may be embodied as machine readable instructions stored on a
computer readable medium, which may be non-transitory, such as hardware
storage devices (e.g., RAM (random access memory), ROM (read only memory),
EPROM (erasable, programmable ROM), EEPROM (electrically erasable,
programmable ROM), hard drives, and flash memory).
[0103] The computer system 1100 may include a processor 1102
that may
implement or execute machine readable instructions performing some or all of
the
methods, functions and other processes described herein. Commands and data
from the processor 1102 may be communicated over a communication bus 1104.
The computer system may also include a main memory 1106, such as a random
access memory (RAM), where the machine readable instructions and data for the
processor 1102 may reside during runtime, and a secondary data storage 1108,
which may be non-volatile and stores machine readable instructions and data.
The
memory and data storage are examples of computer readable mediums. The
memory 1106 may include a UV controller 1120 including machine readable
instructions residing in the memory 1106 during runtime and executed by the
processor 1102. The UV controller 1120 may include the elements of the system
100 shown in Figure 1.
[0104] The computer system 1100 may include an I/O device 1110,
such as a
keyboard, a mouse, a display, etc. The computer system may include a network
interface 1112 for connecting to a network. Other known electronic components
may be added or substituted in the computer system.
[0105] In another embodiment, according to examples, a UV movement and
data control system and a method for UV movement and data control are
disclosed
herein. The system and method disclosed herein may generally utilize a
hardware
implemented mission manager and a hardware implemented event detector to
assign and manage a mission. The mission manager may maintain knowledge of
a fleet of UVs, sensors, and crew, as well as information regarding mission
status.
A hardware implemented fleet and mission operations controller may operate in
conjunction with the mission manager to assign UVs, sensors, and crew to a
mission request, identify a movement plan (e.g., a flight plan for a UAV), and
an
objective for the mission. Once the mission is launched, the event detector
may
analyze communication data received during the mission, and generate alarms to
the mission manager when events that may result in potential problems are
detected. The fleet and mission operations controller may operate in
conjunction
with the mission manager to modify the UV movement plan, and adjust, in real-
time, movement plans of the UV based on the events.
[0106] A mission request may identify, for example, an operation for a UV,
a
type of a UV to complete the operation, at least one type of sensor to be
mounted
on the UV, UV operation crew, a movement plan, and/or an objective for the
mission. For example, a mission request may indicate that a fixed wing UAV or
quadcopter (i.e., types of UAVs) may be equipped with a video camera, a gas
detector, an infrared (IR) camera, and/or a pressure sensor to detect leaks in
an oil
pipeline.
[0107] After launching the mission, the UV may follow the movement plan
autonomously, or with varying degrees of remote operator guidance from a
hardware implemented movement planning controller operated by an operations
crew. Sensors mounted onto the UV may transmit data in real-time to a ground
station on the field, such as a portable device with a hardware implemented UV
data receiver, and the ground station may transmit the data to the event
detector,
which may be disposed off-site.
[0108] The UV data receiver may include a hardware implemented
telemetry
data analyzer to buffer UV movement and status messages in a predetermined message format, such as a JavaScript Object Notation (JSON) format, that are sent
to a hardware implemented data integrator of the event detector. The UV data
receiver may include a hardware implemented stream data analyzer to receive
unformatted data streams from multiple data sources (e.g., video, thermal,
near
infrared (NIR), multispectral, etc.), and forward the data streams to the data integrator for pre-processing (i.e., synchronization by frames tagged with
time and
location) and further analytics processing.
[0109] The data integrator, which may be implemented as a component of
the
event detector or separately from the event detector, may include a hardware

CA 02881744 2015-02-13
,
i .
D14-092-02678-00-US and US2
PATENT
implemented time and location service (TLS) analyzer and a hardware
implemented stream integrator. The TLS analyzer may receive status messages
from the telemetry data analyzer, and maintain a model of a current state of
the UV
while it is performing a mission (e.g., in flight for a UAV). The stream
integrator
may receive a stream of data from the stream data analyzer, extract frames of
data
from the stream, retrieve time and location information from the TLS analyzer
to
insert into each frame's metadata, and create time and location correlated frames.
[0110] A data analyzer of the event detector may process the data
to identify an
event. When an event is identified, the data analyzer may transmit an alarm to
the
fleet and mission operations controller for further review by a mission
operator.
The alarm may include information such as an identification of the event, data associated with the event, a location of the event, etc. After reviewing the
event,
the mission manager may operate in conjunction with the fleet and mission
operations controller to generate instructions in real-time with an updated
movement plan for a UV operator.
[0111] The system and method disclosed herein may be used in a
variety of
environments and for a variety of purposes. For example, the system and method disclosed herein may be used to monitor a pipeline in the oil and gas
industry. In
the oil and gas industry, the system and method disclosed herein may be used
in
other scenarios, including other types of exploration (e.g., site survey, site
drilling,
etc.), development (e.g., pad placement, facility rendering, capital project,
surveillance, etc.), production (e.g., flare/vent inspection, oil sheen
detection,
disaster prevention, etc.), manufacturing (e.g., flue/chimney inspection, tank/gas
inspection, gas detection, etc.), and transportation (e.g., right of way
monitoring,
theft monitoring, etc.).
[0112] The system and method disclosed herein may be used in package
delivery (e.g., food, medicine, equipment, etc.), aerial surveillance (e.g.,
police/fire
department, cartography, photography, film, journalism, real estate, etc.),
exploration (e.g., mine detection, site survey, etc.), research (e.g.,
wildlife,
atmosphere, ocean, etc.), remote sensing (e.g., telecommunications, weather,
maritime, construction, etc.), disaster relief (e.g., survivors, explore
contaminated
areas, etc.), environment (e.g., forest fires, threats, etc.), and agriculture
(e.g.,
spray pesticides, crop growth, disease, irrigation level, wild animals, etc.).
[0113] The system and method disclosed herein may be used for
scheduling of
predictive maintenance to provide asset inspection, diagnostics, repair, and
maintenance work. Further, the system and method disclosed herein may be
used, for example, to identify and schedule environmental (e.g., terrain,
vegetation,
etc.) management. The system and method disclosed herein may also provide for
enhancements in safety and environmental protection related to the various
activities described herein. For example, with respect to the oil and gas
industry,
the system and method disclosed herein may be used to protect assets from
sabotage, illegal tapping, and terrorist actions in an efficient and
economical
manner.
[0114] The system and method disclosed herein may be used to analyze data
from a UV to determine tasks that may be both electronically and mechanically
automated in a workflow, and to identify insights that may be obtained from
the
data. These insights may be used to drive operational decisions, such as
shortening lead time to problem detection, or predictive maintenance with
pipelines, for example, in the oil and gas industry. The system and method
disclosed herein may provide for the reduction of exposure to hazardous
environments, increase efficiency and effectiveness with respect to UV
movement
and data control, and optimize operations.
[0115] Generally, the system and method disclosed herein may be envisioned
in a broad range of applications where drones or UVs may be used to reduce
cost,
increase safety, and increase productivity.
[0116] The system and method disclosed herein may account for aspects
related to the state of UV technology, regulation and compliance, readiness,
and
safety and privacy. With respect to UV technology, the system and method
disclosed herein may provide the hardware and software platform and setup for
UV
movement and data control. The system and method disclosed herein may also
provide for implementation of aspects such as optimal movement planning
operations and life cycle management, selection of specialized sensors, direct
data
transmission from a UV, UV infrastructure and availability management, task
distribution among multiple UVs, and reprioritization of UV objectives. With
respect
to security, safety, and regulations, the system and method disclosed herein
may
provide for constraints based on local regulations and certification, UV
certification
and operator training, requirements regarding reporting of incidents to
authorities,
obstacle avoidance, authentication and authorization of missions, ensuring
that a
mission has not been compromised or sabotaged, and protection against misuse.
The system and method disclosed herein may also provide for secure
transmission
of data from the event detector that may be implemented in a cloud
environment,
end-to-end process integration, analytics requirements based on vertical
industry,
data storage and security, defining business rules, and redefining workflows
to
incorporate use of the UVs and availability of new insights into related
processes.
[0117] For the system and method disclosed herein, the fleet and mission
operations controller may perform various tasks, such as, specification of
mission
objectives and routes, scheduling of missions, assignment of a mission
operator
and assistant, assignment of UV equipment, monitoring of missions in progress, and making adjustments to mission requirements.
[0118] For the system and method disclosed herein, the movement planning
controller may plan and execute a mission. Further, the movement planning
controller may monitor the FPV to ensure that mission objectives are being
met,
and adjust mission routes as needed.
[0119] For the system and method disclosed herein, a hardware
implemented
mission planning controller may manage, for example, a camera gimbal and a
video camera, and monitor video capture to ensure quality.
[0120] According to examples disclosed herein, the UV movement and
data
control system may include a hardware implemented mission controller that is
executed by at least one hardware processor to control a UV according to a
movement plan. A hardware implemented telemetry data analyzer that is executed
by the at least one hardware processor may receive, from the mission
controller,
formatted movement and status metadata from at least one sensor of the UV
during movement of the UV according to the movement plan. The hardware
implemented telemetry data analyzer may buffer the movement and status
metadata for forwarding to a hardware implemented data integrator. According
to
an example, the movement and status metadata may include time and location
information for the UV during the movement of the UV according to the movement plan. A hardware implemented stream data analyzer that is executed by the at
least one hardware processor may receive an unformatted data stream from the
at
least one sensor of the UV. The data integrator that is executed by the at
least one
hardware processor may inject the time and location information into metadata
of
the unformatted data stream to generate a time and location correlated (TLC)
stream. A hardware implemented TLC stream data analyzer that is executed by
the at least one hardware processor may analyze the TLC stream to identify an
event related to the UV. A hardware implemented event orchestrator that is
executed by the at least one hardware processor may generate a notification
related to the event.

[0121] According to an example, the mission controller may convert the
movement and status metadata to a predetermined message format, such as a
JSON format, for processing by the data integrator.
[0122] According to an example, the data integrator may include a
hardware
implemented time and location service analyzer that is executed by the at
least one
hardware processor to generate, based on the movement and status metadata, a
model of a state of the UV during the movement of the UV according to the
movement plan.
[0123] According to an example, the data integrator may include a
hardware
implemented stream integrator that is executed by the at least one hardware
processor to extract frames of data from the unformatted data stream, retrieve
the
time and location information from the model of the state of the UV, and
inject the
time and location information into the metadata of each of the frames of the
data
from the unformatted data stream to generate TLC frames. A collection of the
TLC
frames may represent the TLC stream.
[0124] According to an example, the stream integrator may extract a
frame of
data from the unformatted data stream, and pre-process the data from the
unformatted data stream for detecting a leak in a pipeline by passing the
frame of
the data through hue-saturation-value (HSV) based clustering to segment
the environment into distinct color patches. The leak in the pipeline may be
detected
by using a transform to extract the pipeline spanning the frame.
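The transform named above is not specified in the disclosure; as a non-limiting sketch (assuming Python with OpenCV and a Hough line transform as the extraction step; the quantization and thresholds are illustrative):

import cv2
import numpy as np

def extract_pipeline(frame):
    # Coarse HSV-based clustering: quantize hue to segment the environment
    # into distinct color patches.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hsv[..., 0] = (hsv[..., 0] // 30) * 30
    # Assumed transform: a probabilistic Hough line transform extracting the
    # pipeline as the long straight lines spanning the frame.
    edges = cv2.Canny(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=100,
                            minLineLength=frame.shape[1] // 2, maxLineGap=20)
    return hsv, lines  # color patches and candidate pipeline lines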
[0125] According to an example, the TLC stream data analyzer may
utilize a
histogram-based sliding window to identify pixels of interest in the frame of
the
data.
[0126] According to an example, the TLC stream data analyzer may
utilize Teh-
Chin chain approximation to extract a blob that represents the leak in the
pipeline.
[0127] According to an example, the TLC stream data analyzer may
utilize a
naïve-Bayes classifier to classify the blob as a leak or not a leak.
[0128] According to an example, the TLC stream data analyzer may
retain the
classification of the blob based on corroboration of the classification with a
plurality
of frames of the data including the frame of the data.
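A combined sketch of the blob extraction and classification steps described in the preceding examples, assuming Python with OpenCV (whose CHAIN_APPROX_TC89_L1 contour mode implements Teh-Chin chain approximation) and a Gaussian naive-Bayes classifier from scikit-learn; the features and training data are hypothetical:

import cv2
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Naive-Bayes classifier over simple blob features; the training data and
# the feature choice here are illustrative assumptions.
classifier = GaussianNB()
train_features = np.array([[900.0, 0.8], [40.0, 0.1]])  # (area, circularity)
train_labels = np.array([1, 0])                          # 1 = leak, 0 = not a leak
classifier.fit(train_features, train_labels)

def classify_blobs(mask):
    # Extract blobs with Teh-Chin chain approximation and classify each one.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_TC89_L1)  # Teh-Chin
    results = []
    for contour in contours:
        area = cv2.contourArea(contour)
        perimeter = cv2.arcLength(contour, closed=True)
        if perimeter == 0:
            continue
        circularity = 4 * np.pi * area / (perimeter ** 2)
        label = classifier.predict([[area, circularity]])[0]
        results.append((contour, bool(label)))
    return results

Retaining a classification only when it is corroborated across several frames, as described above, could then be layered on top of per-frame results.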
[0129] According to an example, the TLC stream data analyzer may
include a
hardware implemented stream processing unit (SPU) including a plurality of
hardware implemented event analyzers that are executed by the at least one
hardware processor to analyze the TLC stream to identify the event related to
the
UV.
[0130] The UV movement and data control system and the method for UV
movement and data control disclosed herein provide a technical solution to
technical problems related, for example, to UV movement and data control. The
system and method disclosed herein provide the technical solution of a
hardware
implemented mission controller that is executed by at least one hardware
processor to control a UV according to a movement plan. A hardware
implemented telemetry data analyzer that is executed by the at least one
hardware
processor may receive, from the mission controller, formatted movement and
status metadata from at least one sensor of the UV during movement of the UV
according to the movement plan. The hardware implemented telemetry data
analyzer may buffer the movement and status metadata for forwarding to a
hardware implemented data integrator. The movement and status metadata may
include time and location information for the UV during the movement of the UV according to the movement plan. A hardware implemented stream data analyzer
that is executed by the at least one hardware processor may receive an
unformatted data stream from the at least one sensor of the UV. The data
integrator that is executed by the at least one hardware processor may inject
the
time and location information into metadata of the unformatted data stream to
generate a time and location correlated (TLC) stream. A hardware implemented
TLC stream data analyzer that is executed by the at least one hardware
processor
may analyze the TLC stream to identify an event related to the UV. A hardware
implemented event orchestrator that is executed by the at least one hardware
processor may generate a notification related to the event.
[0131] Figure 12 illustrates a detailed architecture of a UV movement
and data
control system 2100, according to an example of the present disclosure. The UV
movement and data control system 2100 may include a hardware implemented
mission controller 2102 that is to perform various operations related, for
example,
to mission planning, movement planning, and receiving of data from a UV 2104.
The UV 2104 may include a plurality of UVs. The UV 2104 may include a sensor
2106. The sensor 2106 may include a plurality of sensors. The UV 2104 may
encompass all types of UVs, including a variety of aerial, land, space, and
marine
UVs. The UV 2104 may take off (e.g., for an aerial UV), navigate, capture
data,
transmit collected data, return, and land without human interaction.
[0132] The sensor 2106 may gather data associated with a mission.
The
sensor 2106 may include a variety of types of sensors that may be categorized
as
sight sensors, sound sensors, touch sensors, smell sensors, position sensors,
external communication sensors, and other (e.g., miscellaneous sensors). The
sight sensors may include sensors for ascertaining light intensity, color,
distance
(e.g., by infrared (IR), measuring angle of light bounce), video capture,
rotation
(e.g., optical encoders), and/or light signal read (e.g., infrared codes). The
sound
sensors may include sensors (e.g., a microphone) for ascertaining volume
(e.g.,
decibel meter), frequency measurement, and/or distance (e.g., sonar, measuring
time to echo). The touch sensors may include sensors for ascertaining position
awareness (e.g., collision alert, contact confirmation, etc.), bend/strain,
temperature, and/or pressure (e.g., barometric, grip strength, etc.). The
smell
sensors may include sensors such as gas sensors, alcohol sensors, etc. The
position sensors may include sensors (e.g., accelerometer, digital compass,
gyroscope) for ascertaining location (e.g., based on global positioning system
(GPS), proximity to a beacon, etc.), and/or tilt. The external communication
sensors may include sensors for ascertaining radio communication, and/or IR
codes. The miscellaneous sensors may include sensors for ascertaining date and

time (e.g., ultra-low frequency (ULF) updates), network communication status,
and/or voltage (e.g., low fuel, low battery).
[0133]
The UV 2104 may also include various components for processing and
generating outputs. For example, with respect to processing, the UV 2104 may
provide for sensor data processing for analog and digital input/output (I/O),
kinematics (e.g., position and orientation of objects), proportional-integral-
derivative
(PID) feedback control, rules application (e.g., if this, do that), navigation
(e.g.,
move to a waypoint), mission execution (e.g., manage multiple waypoints),
telemetry management (e.g., summarizing telemetry data), counter, audio/voice
processing (e.g., speech to text, text to speech), manage date/time, and data
management (e.g., memory, disk, etc.). With respect to outputs, the UV 2104
may provide for outputs such as movement, motors (e.g., servos, stepper,
brushless), hydraulics, pneumatics, gravity release, visual
indicators/feedback,
LEDs, LCDs, displays, audio indicators/feedback, speaker, buzzer, etc.,
voltage
change (e.g., not in use, go to low power mode), and external communication
subsystems (e.g., radio, IR codes).
[0134] In the example of Figure 12, the UV 2104 and the mission
controller
2102 may be disposed in a field (e.g., above dashed line 2108), whereas the
fleet
and mission operations controller 2110, the mission manager 2112, and the
event
detector 2116 may be hosted in an off-site facility (e.g., below the dashed
line
2108), such as a cloud environment 2118. In some examples, the cloud
environment 2118 may be a data center or another distributed network capable
of
processing relatively large amounts of data in real time. In other examples,
the
components of the UV movement and data control system 2100 that are located in an off-site facility may be based, for example, on the hardware capabilities
of chips
installed on the UV 2104, a size and power associated with the UV 2104, and
processing requirements of a mission executed by the UV 2104.
[0135] With respect to the mission controller 2102, the mission
planning
controller 2120 may enable the UV 2104 to be programmed to run autonomously.
The UV 2104 may be equipped with the sensor 2106 and intelligence to maintain
altitude and a stabilized flight (e.g., for an aerial UV). The sensor 2106 may
be
used to determine the position and altitude of the UV 2104 at any given point
in
time. This enables the UV 2104 to navigate between two points according to predefined waypoints, without any human interaction during the flight (e.g., for
an
aerial UV). The mission planning controller 2120 may generate a display of the
mission details that may be viewed by a UV operation crew (e.g., a pilot
and/or
assistant).
[0136] With respect to the mission controller 2102, the movement
planning
controller 2122 may be used to launch the UV 2104, and control the UV flight
path
(e.g., for an aerial UV) and associated sensors. Once the UV 2104 begins its
movement plan from the launch point, the mission planning controller 2120 may
communicate with the mission manager 2112 to indicate the beginning of the
mission. According to an example, the mission controller 2102 may be stored on
a
tablet or another portable device.
[0137] The fleet and mission operations controller 2110 may
analyze a mission
request that specifies, for example, a movement time, a movement plan, and
equipment (e.g., the specific UV, sensors, and any UV operation crew). The
movement plan may include a launch point, predefined way points, alternate
rally
points, payload requirements, video or other data gathering requirements,
payload
operation instructions, and/or mission objectives.
[0138] The mission manager 2112 may maintain information
regarding UVs and
sensors in inventory. For example, the mission manager 2112 may track UVs by
type, availability, and an ability to mount particular sensors. The mission
manager
2112 may also track sensors by type, availability, and ability to be mounted
on a
particular UV. The mission manager 2112 may schedule and assign the mission.
Specifically the mission manager 2112 may assign the UV 2104 (or a plurality
of
UVs), the sensor 2106 (or a plurality of sensors), and any UV operation crew
to a
location for completing the mission request.
[0139] The mission controller 2102 may receive the mission
request from the
mission manager 2112, and operate the assigned UV 2104 according to the
movement plan. The UV 2104 may follow the movement plan autonomously or
with varying degrees of remote operator guidance from the movement planning
controller 2122 that may be operated by a UV operation crew.
[0140] With respect to the mission controller 2102 and the event
detector 2116,
data from the sensor 2106 may be received at a UV data receiver 2124, and
forwarded (e.g., pushed) in real-time to the event detector 2116.
Alternatively or
additionally, data from the sensor 2106 may be communicated directly to the
event
detector 2116 based on the placement of hardware associated with the event
detector 2116 near the edge of the field (e.g., dashed line 2108) or within
the field.
[0141] Generally, the event detector 2116 may interpret the data from the
sensor 2106 in real-time to detect any events or potential problems that
warrant
further exploration. The event detector 2116 may include, for example, event
processing, video stream playback, facial recognition, blob detection, and
general
inspection for the pipeline example described herein. However, those skilled
in the
art will appreciate in view of this disclosure that the processing
capabilities of the
event detector 2116 may vary depending on the purpose of the mission and the
types of sensors that are used for the UV 2104.
[0142] If an event is detected, the event detector 2116 may
generate an alert
and forward the alert to the fleet and mission operations controller 2110.
Further,
data associated with the event may be displayed in real-time at the fleet and
mission operations controller 2110. The data associated with the event may be
analyzed by the fleet and mission operations controller 2110 and the mission
manager 2112. Based on the analysis of the data, the fleet and mission
operations
controller 2110 may operate in conjunction with the mission manager 2112 to
communicate to the mission controller 2102 a change in movement plan or other
further instructions, such as a notification that the mission is complete and
an
instruction to dismount the sensors from the UV 2104 and leave the field.
[0143] Figure 13 illustrates a logic diagram of components of UV
movement
and data control system 2100 for event detection, according to an example of
the
present disclosure.
[0144] Referring to Figures 12 and 13, the movement planning
controller 2122
may include any type of movement planner that is used to control and monitor
the
UV movement in real-time. For a UAV, a flight planner may be used to control
and
monitor the UV movement in real-time. A fleet management set of machine
readable instructions may be used to generate the movement control information during the mission definition process, and forward the information to the
movement
planning controller 2122 that transmits the information to the UV 2104 to
commence the mission. A custom message shim may be built on top of the
movement planner to intercept the movement and status metadata from the UV
2104 via a telemetry communications link, and forward the movement and status
metadata in a predetermined message format (e.g., a JSON format) to a hardware
implemented telemetry data analyzer 2136.
[0145] The UV data receiver 2124 may include the telemetry data
analyzer
2136 and a hardware implemented stream data analyzer 2138, and is used to
receive various types of communication data from the UV 2104. The
communication data may be used, for example, by the event detector 2116 to
determine events related to an objective of the mission.
[0146] Referring to Figure 13, the telemetry data analyzer 2136
may buffer UV
movement and status messages that are sent to a hardware implemented data
integrator 2142 of the event detector 2116. With respect to the telemetry data analyzer 2136, messages may be sent from the movement planning controller
2122 (i.e., of the mission controller 2102) to the telemetry data analyzer
2136 over
a Transmission Control Protocol/Internet Protocol (TCP/IP) port on a localhost
interface. Further, the telemetry data analyzer 2136 may forward the formatted
messages to the data integrator 2142, for example, over TCP/IP. The telemetry
data analyzer 2136 may be used to buffer the movement and status messages so
the communications latency from sending messages to the data integrator 2142
does not slow down or affect operation of the movement planning controller
2122.
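A minimal sketch of this buffering, assuming Python; the localhost ports are illustrative, and a thread-safe queue decouples the movement planning controller from the forwarding latency:

import queue
import socket
import threading

buffer = queue.Queue()

def receive_from_planner(port=9000):
    # Accept movement and status messages on a localhost TCP/IP port and
    # enqueue them, so forwarding latency never blocks the planner side.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("localhost", port))
    server.listen(1)
    connection, _ = server.accept()
    with connection, connection.makefile() as stream:
        for line in stream:
            buffer.put(line)

def forward_to_integrator(host="localhost", port=9001):
    # Drain the buffer toward the data integrator over TCP/IP.
    sink = socket.create_connection((host, port))
    while True:
        sink.sendall(buffer.get().encode())

threading.Thread(target=receive_from_planner, daemon=True).start()
threading.Thread(target=forward_to_integrator, daemon=True).start()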
[0147] Referring to Figure 13, the stream data analyzer 2138 may receive
data
streams from multiple data sources (e.g., video, thermal, near infrared (NIR), multispectral, etc.), and forward the data streams to the data integrator 2142
for
pre-processing (i.e., synchronization by frames tagged with time and location)
and
further analytics processing. The stream data analyzer 2138 may communicate
with the data integrator 2142 over TCP/IP. With respect to the real-time
streaming
of the data streams from multiple data sources, the data streams may be
processed in real-time, any alerts related to the movement of the UV 2104 may
be
generated in real-time, and the movement plan of the UV 2104 may also be
modified as needed in real-time.
[0148] Referring to Figure 13, the data integrator 2142, which may be
implemented as a component of the event detector 2116 or separately from the
event detector 2116, may include a hardware implemented time and location
service (TLS) analyzer 2144 and a hardware implemented stream integrator 2146. Generally, the data integrator 2142 may receive streams of data, and inject
time
and location information into the metadata of the streams (i.e., to generate a
time
and location correlated stream) before forwarding the time and location
correlated
stream to the time and location correlated (TLC) stream data analyzer 2148 for
processing. Context may be established using various different streams of data (e.g., video, audio, information tags that may be extracted from other
external
sources, etc.), and a port may be maintained to monitor receiving streams.
Time
and location metadata may be extracted from the flight log for a UAV, and each
stream may be separated into units (frame for video, segments by time for
audio,
etc.), and synchronized by time and location upon collection and receipt.
[0149] The TLS analyzer 2144 of the data integrator 2142 may receive
status
messages (e.g., JSON status messages) from the telemetry data analyzer 2136,
and maintain a model of a current state of the UV 2104 while the UV 2104 is
performing a mission (e.g., in flight for a UAV). According to an example, the
model for the TLS analyzer 2144 may include time and location information. For example, with respect to the model for the TLS analyzer 2144,
the
time and location data may be captured and injected into each of the data
streams.
For example, in the case of video, the time and location data may be captured
and
injected into each frame of the video data stream. The time and location may
be
retrieved from GPS information received on the UV 2104. The time and location
information may be available via a TCP/IP interface. The TCP/IP interface may
operate as a request-based interface where a user may open a port to read a
line
of text, and then close the port after receiving the message. According to an
example, the JSON status message may include a JSON format as follows:
{ "tls" : [ "time" : "HH:MM:SS", "longitude" : "32.99999", "latitude" : "-
123.9999999", "altitude" : "10.31 }
[0150] The TLS analyzer 2144 may report the current known time and
location
for the UV 2104 for each request.
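A minimal sketch of this request-based interface, assuming Python's socketserver module; the port and the status values are illustrative:

import json
import socketserver

class TLSRequestHandler(socketserver.StreamRequestHandler):
    # Serve the current known time and location, one line per connection.
    def handle(self):
        status = {"tls": {"time": "12:30:05",
                          "longitude": "32.99999",
                          "latitude": "-123.9999999",
                          "altitude": "10.31"}}
        self.wfile.write((json.dumps(status) + "\n").encode())
        # The connection closes after the single response, matching the
        # open/read/close request-based interface described above.

if __name__ == "__main__":
    with socketserver.TCPServer(("localhost", 9100), TLSRequestHandler) as server:
        server.serve_forever()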
[0151] The stream integrator 2146 of the data integrator 2142 may receive a stream of data from the stream data analyzer 2138. The stream integrator 2146 may publish a port for the stream data analyzer 2138 to use in order to push the processed stream data to the stream integrator 2146. The stream integrator 2146 may extract frames of data from the stream, and retrieve time and location information from the TLS analyzer 2144 to insert into each frame's metadata, creating time and location correlated frames. A time and location correlated frame may represent a header packet associated with each packet of data which has the metadata of time and location attached to it, with the associated information being inserted into each packet. The stream integrator 2146 may publish the collection of time and location correlated frames as a time and location correlated (TLC) stream to the TLC stream data analyzer 2148. For example, if the source data is an MP4 video stream, then the associated MP4 video frame may include metadata which has time, longitude, latitude, and altitude.
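The metadata injection performed by the stream integrator 2146 may be sketched as follows, assuming frames arrive as raw byte payloads and a TLS reader function (such as the hypothetical read_tls_status above) supplies the current time and location; the dictionary layout is illustrative only:

    def make_tlc_frame(frame_bytes, tls_reader):
        # Retrieve the current time and location from the TLS analyzer and
        # attach it as header metadata to the frame, producing a TLC frame.
        t, lon, lat, alt = tls_reader()
        return {
            "metadata": {"time": t, "longitude": lon,
                         "latitude": lat, "altitude": alt},
            "payload": frame_bytes,
        }

    def make_tlc_stream(frames, tls_reader):
        # A TLC stream is the collection of TLC frames.
        return [make_tlc_frame(f, tls_reader) for f in frames]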
[0152] Referring to Figure 13, the TLC stream data analyzer 2148 of the event detector 2116 may process the data to identify an event (e.g., a leak, a blob, etc.). When an event is identified, the TLC stream data analyzer 2148 may transmit an alarm to the fleet and mission operations controller 2110 for further review by a mission operator. The alarm may include information such as an identification of the event, data associated with the event, a location of the event, etc. After reviewing the event, the mission manager 2112 may operate in conjunction with the fleet and mission operations controller 2110 to generate instructions in real-time with an updated movement plan for a UV operator.
[0153] The TLC stream data analyzer 2148 may be organized as a collection of stream processing units (SPUs) 2150, each assigned to one or more event analyzers and including access to an associative memory that is shared by all SPUs, and an event orchestrator 2152. Event analyzers may generally include any type of analyzer to search for an event in a source data stream based on application needs. For example, in the area of oil and gas, an event analyzer may include a leak detection event analyzer, an intruder detection event analyzer, etc. The TLC stream data analyzer 2148 may receive a TLC stream, place the TLC stream into its associative memory, and start the SPUs 2150. Each SPU may execute all of its event analyzers on the TLC stream, and forward an event to the event orchestrator 2152 for all features of interest found in the TLC stream. In this regard, an event may be described as an item of interest, where an alert may be generated based on the presence of an event. Further, a feature may be described as an attribute, where a plurality of attributes, when correlated together, may function as indicators of the occurrence of an event.
[0154] Each SPU may be assigned to one or more event analyzers.
Each SPU
may have access to an associative memory including one or more TLC streams.
When a SPU receives a command from the TLC stream data analyzer 2148 to
process a TLC stream, the SPU may execute instances of each event analyzer on
the assigned TLC stream, and send any results obtained to the event
orchestrator
2152.
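A minimal sketch of this SPU behavior, assuming event analyzers are plain callables and the shared associative memory is modeled as a dictionary (all names are illustrative):

    def run_spu(spu_analyzers, associative_memory, stream_key, send_to_orchestrator):
        # Execute an instance of each assigned event analyzer on the TLC
        # stream held in the shared associative memory, and forward any
        # detected events to the event orchestrator.
        tlc_stream = associative_memory[stream_key]
        for analyzer in spu_analyzers:
            for event in analyzer(tlc_stream):
                send_to_orchestrator(event)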
[0155] With respect to memory organization, for the TLC stream data analyzer 2148, the operating memory that manages the TLC streams may be implemented, for example, in a two-tier structure. The two-tier structure of the operating memory may facilitate scaling with respect to data analysis. For example, based on the two-tier structure of the operating memory, a plurality of event analyzers may be used to analyze the same data source simultaneously. According to an example, Tier 1 may include a file system to store all TLC streams. Tier 2 may include an associative memory that is accessible by all SPUs. The TLC stream data analyzer 2148 may load the TLC streams into Tier 2 when the SPUs are to process the streams. When the TLC stream data analyzer 2148 determines that all SPUs have processed a stream, the TLC stream data analyzer 2148 may remove the stream from the Tier 2 memory. Event analyzers may also be stored in Tier 2.
[0156] With respect to event analyzer management, event analyzers may be stored in both tiers of memory. For example, event analyzers may be stored in Tier 1 and organized as needed. According to an example, active event analyzers may be stored in Tier 2 (accessible by SPUs). Examples of active event analyzers may include event analyzers related to intruder detection, leak detection, etc. Generally, active event analyzers may include event analyzers related to events that a UV operator would like to detect. Event analyzers may be written in any language that may execute on a SPU, with the requirement that the event analyzer can access the associative memory and send the correct events to the event orchestrator 2152.
[0157] The event orchestrator 2152 may receive event messages from the SPUs of the TLC stream data analyzer 2148. The event orchestrator 2152 may format the event messages as needed before forwarding the event messages to preconfigured endpoints. For example, the event orchestrator 2152 may receive the event messages and cross-correlate the event messages with other data in order to generate higher level event messages that need further investigation and response. Examples of higher level event messages may include an alarm to a mission operator, a message forwarded to a mission operator for further follow-up or action, and/or the creation of a workflow to deploy a maintenance crew, etc. The event messages may be JSON formatted, and may be sent to HTTP-compliant endpoints.
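A hedged sketch of forwarding a JSON-formatted event message to an HTTP-compliant endpoint, using only the Python standard library; the endpoint URL and message fields are assumptions:

    import json
    import urllib.request

    def forward_event(event, endpoint="http://localhost:8080/events"):
        # Format the event message as JSON and POST it to a preconfigured
        # HTTP-compliant endpoint.
        body = json.dumps(event).encode("utf-8")
        req = urllib.request.Request(
            endpoint, data=body, headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status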
[0158] An example implementation of the system 2100 for leak
detection (e.g.,
for a pipeline leak) is described with reference to Figures 12 and 13.
[0159] At the outset, based on the launch of the UV 2104, and control and monitoring by the movement planning controller 2122 of the mission controller 2102, with respect to pre-processing, using openCV, a streaming video (e.g., 30 fps) may be buffered by the telemetry data analyzer 2136 and a visual frame may be extracted every second and stored using a matrix container. Each frame that is forwarded to the telemetry data analyzer 2136 may be briefly buffered (e.g., to generate a continuous stream), and each frame may be stored within the matrix container. In this case, each pixel may be stored within the matrix container by its color (hue-saturation-value (HSV) value) and position. For example, a matrix container may be formatted as follows:

[(1,212,0) (5,212,0) ... (8,121,33)] n pixels
[(33,225,0) (12,222,3) ... (2,151,43)] n pixels
[(21,12,0) (69,52,14) ... (52,11,33)] n pixels
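A minimal openCV sketch of this pre-processing step, extracting one visual frame per second from a 30 fps stream and converting it to the HSV color space (the video path and sampling logic are assumptions):

    import cv2

    def extract_hsv_frames(path="stream.mp4", fps=30):
        # Buffer the streaming video and extract one visual frame per
        # second, stored as a matrix of HSV pixels (the matrix container).
        cap = cv2.VideoCapture(path)
        frames = []
        index = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % fps == 0:
                # Each pixel is held by its HSV color and position.
                frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2HSV))
            index += 1
        cap.release()
        return frames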
[0160] The initial frame may be passed, for example, through an HSV based k-means clustering to segment the environment into distinct color patches. For each similarly colored patch, the mean HSV value may be determined, and the entire patch may be converted to that color, which may facilitate reduction of the color variation. Other clustering techniques, such as hierarchical clustering, agglomerative clustering, and other such techniques, may be similarly applied. The HSV based k-means clustering may represent an iterative refinement approach to partition n pixels into k clusters based on HSV values. For leak detection applications traceable to a defined source of leakage, further filtering may be performed on the k-means clustered frame. For example, in order to target leak detection from a pipeline, a Hough Line Transform (HLT) may be performed to extract the pipeline spanning the entire frame, as well as the surrounding clustered patches that define the environment encompassing both the source and the drain of the leakage. Other techniques, such as classifiers, may be applied to extract the pipeline spanning the entire frame. Using the extracted color patches, the position of the color patches relative to the entire frame may be set as the bounded search space for the subsequent steps. Considering the computational complexities of the above clustering and filtering, the clustering and filtering may be performed on an initial set of frames to establish a baseline search area within the future frames. Establishing a baseline search area within the future frames may reduce the computational complexity of the above clustering and filtering by limiting the search space to areas close to the pipeline which are more likely to include leaks.
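The clustering and pipeline extraction may be sketched with openCV as follows; the cluster count, Canny thresholds, and Hough parameters are illustrative assumptions, not values from the patent:

    import cv2
    import numpy as np

    def cluster_and_find_pipeline(hsv_frame, k=8):
        # Segment the environment into distinct color patches with k-means
        # over HSV values, converting each patch to its mean HSV color.
        pixels = hsv_frame.reshape(-1, 3).astype(np.float32)
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
        _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5,
                                        cv2.KMEANS_RANDOM_CENTERS)
        clustered = centers[labels.flatten()].astype(np.uint8).reshape(hsv_frame.shape)

        # Apply a Hough Line Transform on the clustered frame's edges to
        # extract line segments that may correspond to the pipeline.
        bgr = cv2.cvtColor(clustered, cv2.COLOR_HSV2BGR)
        edges = cv2.Canny(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), 50, 150)
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=100,
                                minLineLength=100, maxLineGap=10)
        return clustered, lines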
[0161] Each extracted visual frame may be analyzed by using a histogram-based sliding window that focuses on the occurrence frequency of the quantized distribution of component HSV indices. To reduce the computational complexity of evaluating the objective function on the slider over the entire frame (i.e., O(n^2 (r^2 + H)) given an n x n frame, an r x r window, and a histogram of dimension H, without optimization), the slider may be set to traverse through the bounded search space (i.e., truncated n) defined in the pre-processing stage. Within each window that slides through the entire frame, a histogram of all of the detected pixels and their frequencies may be determined to provide a breakdown by pixel HSV colors. Depending on the nature of the operation, in order to reduce computational intensity and analysis of the entire frame, the space needed to apply the sliding window may be truncated beforehand. For example, based on the assumption that the regions of interest that are to be detected (e.g., for a leak) will be beside a pipeline, other areas that are further away from the pipeline may be truncated and not searched. For the leak material that is being detected, the histograms for a set of training data may be created and normalized. Each window may pick up pixels with a mixture of HSV values (colors). A set of training data (of features that are to be picked up, and their own HSV distribution given the same size window), and a mean of their distribution, may be used for comparison against each sliding window's HSV distribution. In this regard, an indicator such as 'positive' may be used to indicate training data that represents what is to be labeled as yes, or classified as true. For example, a leak may include a high frequency of black, dark gray, and brown (with more granularity), and each sliding window's color frequency distribution may be correlated with the training data; that is, if the sliding window's HSV distribution is substituted for the training data, a determination may be made as to whether the distribution is skewed beyond a certain threshold. If each sliding window's color frequency distribution is indeed correlated with the training data (i.e., each sliding window's color frequency distribution is similar to the training data), the entire sliding window may be highlighted as a potential patch for further analysis. The entire window may be further assigned a particular color / HSV value that is not in the frame, such that further analysis may include determining patches with that particular color / HSV value to narrow the search space. The bounded search space may be divided into slots (e.g., windows), converted to normalized histograms, and compared to those of the training data using a correlation comparator. An example of the correlation comparator may include CV_COMP_CORREL implemented in openCV's compareHist(h1, h2, CV_COMP_CORREL) function. Slots passing the similarity threshold may be color-coded with a distinct HSV value outside the HSV domain of the frame.
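A sketch of the histogram-based sliding window comparison, using openCV's compareHist with the correlation method (cv2.HISTCMP_CORREL is the current Python name for CV_COMP_CORREL); the window size, stride, quantization, and threshold are assumptions:

    import cv2
    import numpy as np

    def hsv_histogram(window):
        # Quantized HSV histogram of a window, normalized for comparison.
        hist = cv2.calcHist([window], [0, 1, 2], None, [8, 8, 8],
                            [0, 180, 0, 256, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    def find_candidate_windows(hsv_frame, train_hist, win=32, step=16, thresh=0.8):
        # Slide a window over the (bounded) search space and correlate each
        # window's HSV histogram with the normalized training histogram.
        candidates = []
        h, w = hsv_frame.shape[:2]
        for y in range(0, h - win, step):
            for x in range(0, w - win, step):
                window = hsv_frame[y:y + win, x:x + win]
                score = cv2.compareHist(hsv_histogram(window),
                                        train_hist.astype(np.float32),
                                        cv2.HISTCMP_CORREL)
                if score > thresh:
                    candidates.append((x, y, win, win))
        return candidates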
[0162] The color-coded frame may be passed through a binary threshold to extract a binary image including only the pixels of interest identified by the histogram-based sliding window. That is, the frame (RGB, color) may be passed through a binary threshold to keep a range of HSV values and filter out the remaining values. For example, an oil leak may be a gradient of black and gray, so that pixels within these HSV values may be retained and other pixels may be filtered out. The sliding window may be used to analyze the entire image, and if there are sufficient pixels in the sliding window that are within an HSV range of interest, then the entire window may be retained and set to a predetermined color (e.g., either black or white). The binary image may be passed through the openCV findContour technique that includes, for example, the Teh-Chin chain approximation event analyzer to extract the blobs that may represent potential leaks. The binary image (e.g., assuming all of the pixels that are black and gray, potentially representing the oil leak, remain) may be put through the openCV findContour technique to locate potential contours (i.e., continuations of pixels that are the same HSV value). The openCV findContour technique may identify potential outlines within the binary image, and those with close enough circularity, size (depending on the size of a leak), etc., to distinguish contours from merely random lines. The Teh-Chin chain approximation technique may include an approximation for contour lines. Other techniques may include CV_CHAIN_APPROX_SIMPLE, which compresses horizontal, vertical, and diagonal segments, and leaves only their end points. The positions of the centroids of each blob as well as the parameters of the boundaries may be identified. After applying the openCV findContour technique, certain shapes may be identified and filtered out. In this case, shapes with contours like an oil patch may be highlighted, their centroid (i.e., approximate center) position may be identified, and the size of the box that bounds the whole patch may be obtained and stored, representing a potential leak patch.
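A sketch of the binary threshold and contour extraction, using openCV's findContours with the Teh-Chin chain approximation flag (cv2.CHAIN_APPROX_TC89_L1) under the OpenCV 4 return convention; the HSV range standing in for black and gray is an assumption:

    import cv2
    import numpy as np

    def extract_blobs(hsv_frame, lower=(0, 0, 0), upper=(180, 255, 80)):
        # Keep only pixels within the HSV range of interest (e.g., the
        # blacks and grays of a potential oil leak) as a binary image.
        binary = cv2.inRange(hsv_frame, np.array(lower, np.uint8),
                             np.array(upper, np.uint8))

        # Locate contours using the Teh-Chin chain approximation.
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_TC89_L1)
        blobs = []
        for contour in contours:
            m = cv2.moments(contour)
            if m["m00"] == 0:
                continue
            # Centroid of the blob and the box bounding the whole patch.
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            blobs.append(((cx, cy), cv2.boundingRect(contour)))
        return blobs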
[0163] Each of the extracted blobs (i.e., each datum) may be passed through a naïve-Bayes classifier that models a joint distribution over a label Y = {0, 1} (e.g., leak or not a leak) and a set of conditionally independent features F = {F1, F2, ..., Fn}, such as the geometry (e.g., area, circularity, etc.), the presence of secondary bounding contours, and contextual features (e.g., distance to the pipeline), given the assumption of conditional independence as follows:

P(Y, F_1, ..., F_n) = P(Y) \prod_{i=1}^{n} P(F_i \mid Y)      Equation (1)

Each datum may be classified by finding the most probable label given the feature values for each pixel using, for example, Bayes' Theorem as follows:

\operatorname{argmax}_y P(y \mid f_1, ..., f_n) = \operatorname{argmax}_y \frac{P(y) \prod_{i=1}^{n} P(f_i \mid y)}{P(f_1, ..., f_n)} = \operatorname{argmax}_y P(y) \prod_{i=1}^{n} P(f_i \mid y)      Equation (2)
Each datum (e.g., a potential patch / blob) that needs to be classified may be put through the classifier. Each pixel within the patch may represent a feature, and the presence of each feature may provide confidence towards each of the labels (e.g., a leak or not a leak). To overcome the issue of underflow, the log probabilities may be used as follows:

\operatorname{argmax}_y \log P(y \mid f_1, ..., f_n) = \operatorname{argmax}_y \log \left( P(y) \prod_{i=1}^{n} P(f_i \mid y) \right) = \operatorname{argmax}_y \left[ \log P(y) + \sum_{i=1}^{n} \log P(f_i \mid y) \right]      Equation (3)

Underflow may occur when the value being stored becomes so small that precision is lost during representation and storage of the value in memory (in this case, multiplying small probabilities together may result in a relatively small value). Laplace smoothing may be used to reduce under-fitting when using empirical observations. The optimal k value for the smoothing may be determined during a validation step that classifies all of the extracted blobs into either a leak or not a leak.
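A minimal sketch of the log-space naïve-Bayes classification of Equation (3), assuming priors and Laplace-smoothed likelihood tables have already been estimated from the training data (the data structures are illustrative):

    import math

    def classify(features, priors, likelihoods):
        # Naive-Bayes classification in log space to avoid underflow:
        # argmax_y [ log P(y) + sum_i log P(f_i | y) ].
        best_label, best_score = None, -math.inf
        for label, prior in priors.items():
            score = math.log(prior)
            for f in features:
                # Laplace-smoothed likelihoods are assumed precomputed, so
                # every P(f | y) is strictly positive; the floor value for
                # unseen features is an illustrative assumption.
                score += math.log(likelihoods[label].get(f, 1e-9))
            if score > best_score:
                best_label, best_score = label, score
        return best_label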
[0164] In order to further reduce the probability of picking up noise, potential leak patches that consistently appear in the subsequent frames may be retained. In this regard, when a potential patch has been identified, the same patch may be searched for in the subsequent frames (until the patch is out of the field of view, taking into account the speed of the UV). Further, the more times the patch appears, the higher the confidence that it is either a leak or not a leak. In order to account for the constant velocity movement of the camera, a Kalman filter (KF) may be used. For the Kalman filter, with respect to elimination of the static components, foreground extraction (or background filtering) may be omitted. The KF prediction step at each frame may provide an expected region where the detected patch in the current frame should reside in the next frame. The KF may take into account prior information (in this case, the locations of the detected patches), and predict the likely future locations (each time, the KF may predict the locations of the patches in the subsequent frames).
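A constant velocity Kalman filter of the kind described here may be sketched with openCV as follows; the noise covariances are illustrative assumptions:

    import cv2
    import numpy as np

    def make_constant_velocity_kf():
        # 4-state (x, y, vx, vy) / 2-measurement (x, y) Kalman filter
        # modeling the constant velocity movement of the camera platform.
        kf = cv2.KalmanFilter(4, 2)
        kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                        [0, 1, 0, 1],
                                        [0, 0, 1, 0],
                                        [0, 0, 0, 1]], np.float32)
        kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                         [0, 1, 0, 0]], np.float32)
        kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
        kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
        return kf

    # Per frame: predict the expected patch position, then correct with
    # the detected centroid when the patch is found, e.g.:
    #   predicted = kf.predict()
    #   kf.correct(np.array([[cx], [cy]], np.float32))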
[0165] As a first step, with respect to a hash table implementation for the multi-frame tracking and matching, when the first potential leak is detected, a key-value pair < blobID, {pixelPos0}, bool inFrame, bool isTracking, confidence > for each detected leak may be inserted into the hash table (O(1)), where inFrame is 1 if the leak is found in the frame, else 0, and isTracking is 1 if the leak is still being tracked and matched until its Kalman filter expected position is out of the dimension of the frame (no longer in the FOV), else 0. Each detected blob may be stored in a hash table to track whether it is in frame and being tracked (i.e., tracking stops when the blob is out of frame). The KF may predict the detected blob's subsequent position, so that when the KF predicts that the blob will be out of frame, tracking may be terminated.
[0166] As a second step, for each of the potential leaks in the hash table, the Kalman filter may predict their expected pixel positions in the next frame.
[0167] As a third step, in the next frame, for each of the potential leaks, if the potential leak is a new potential leak that is not in the hash table, the potential leak may be inserted into the hash table. Further, for each of the potential leaks, if the potential leak is in one of the expected positions predicted by the Kalman filter in the previous frame, the matching leak may be located in the hash table, and a new pixel position with increased confidence may be inserted into the value for that entry: < blobID, {pixelPos1, pixelPos2}, 1, 1, confidence++ >. For leaks in the hash table that are not updated in this iteration, a Kalman filter predicted position may be inserted into the entry, but with inFrame set to 0, as follows: < blobID, {pixelPos1, tmpKFPixelPos2}, 0, 1, confidence >.
[0168] The second and third steps may be repeated, and if an entry was not updated in the previous iteration (inFrame = 0), and is not updated again, the entry may be deleted (O(1)). In this case, for example, if in frame-1 a blob is detected at (x1, y1), and in frame-2 the KF predicts that the blob will be at (x2, y2), then the further search for the blob may continue around (x2, y2) so that the search space may be truncated, thereby reducing the computational complexity and power related to blob detection. If the blob is within the region, the location in the hash table for that blob may be updated with increased confidence. If the blob is not located (since sometimes the classifier does not pass it based on imaging conditions such as lighting, angle, etc.), the new predicted location may be updated for that blob in the hash table, but with inFrame = 0 to indicate that the blob is not located. If an entry was not updated in the previous iteration (inFrame = 0), and is not updated again, the entry may be removed (indicating this is not a blob).
[0169] After the potential leak is out of the FOV, the corresponding
confidence
value in the hash table may represent its consistency of being tracked
(passing all
aforementioned steps) across frames, translating to the confidence of the
potential
leak being an actual leak.
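The multi-frame tracking and matching of the first through third steps may be sketched as follows, with the hash table modeled as a Python dictionary and a simple radius test standing in for the Kalman-predicted region match (the radius and entry layout are assumptions):

    def update_tracks(tracks, detections, kf_predictions, radius=20.0):
        # tracks: blobID -> {"positions": [...], "inFrame": 0/1,
        #                    "isTracking": 0/1, "confidence": int}
        # kf_predictions: blobID -> Kalman-predicted (x, y) for this frame.
        matched = set()
        next_id = max(tracks, default=-1) + 1
        for (x, y) in detections:
            hit = None
            for blob_id, (px, py) in kf_predictions.items():
                if blob_id not in matched and \
                        (x - px) ** 2 + (y - py) ** 2 <= radius ** 2:
                    hit = blob_id
                    break
            if hit is None:
                # New potential leak: O(1) insertion into the hash table.
                tracks[next_id] = {"positions": [(x, y)], "inFrame": 1,
                                   "isTracking": 1, "confidence": 1}
                next_id += 1
            else:
                # Matched a predicted position: update with increased confidence.
                entry = tracks[hit]
                entry["positions"].append((x, y))
                entry["inFrame"] = 1
                entry["confidence"] += 1
                matched.add(hit)
        for blob_id in list(kf_predictions):
            if blob_id in matched:
                continue
            entry = tracks[blob_id]
            if entry["inFrame"] == 0:
                # Missed twice in a row: remove the entry (not a blob).
                del tracks[blob_id]
            else:
                # Keep the predicted position but mark the blob as not found.
                entry["positions"].append(kf_predictions[blob_id])
                entry["inFrame"] = 0
        return tracks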
[0170] Figures 14 and 15 illustrate flowcharts of methods 2300 and 2400 for UV movement and data control, according to examples. The methods 2300 and 2400 may be implemented on the UV movement and data control system 2100 described above with reference to Figures 12 and 13 by way of example and not limitation. The methods 2300 and 2400 may be practiced in other systems.
[0171] Referring to Figures 12 and 14, at block 2302, the method 2300 may include receiving, at a hardware implemented telemetry data analyzer that is executed by at least one hardware processor, formatted movement and status metadata from at least one sensor of a UV during movement of the UV. The movement and status metadata may include time and location information for the UV during the movement of the UV. For example, referring to Figures 12, 13, and 16, the hardware implemented telemetry data analyzer 2136 that is executed by at least one hardware processor (e.g., the hardware processor 2502) may receive formatted movement and status metadata from at least one sensor 2106 of the UV 2104 during movement of the UV.
[0172] At block 2304, the method 2300 may include receiving, at a hardware implemented stream data analyzer that is executed by the at least one hardware processor, an unformatted data stream from the at least one sensor of the UV. For example, referring to Figures 12, 13, and 16, the hardware implemented stream data analyzer 2138 may receive an unformatted data stream from the at least one sensor 2106 of the UV 2104.
[0173] At block 2306, the method 2300 may include injecting, by a
hardware
implemented data integrator that is executed by the at least one hardware
processor, the time and location information into metadata of the unformatted
data
stream to generate a TLC stream. For example, referring to Figures 12, 13, and
16, the hardware implemented data integrator 2142 may inject the time and
location information into metadata of the unformatted data stream to generate
a
TLC stream.
[0174] At block 2308, the method 2300 may include analyzing, by a hardware implemented TLC stream data analyzer that is executed by the at least one hardware processor, the TLC stream to identify an event related to the UV. For example, referring to Figures 12, 13, and 16, the hardware implemented TLC stream data analyzer 2148 may analyze the TLC stream to identify an event related to the UV 2104.
[0175] Referring to Figures 12 and 15, at block 2402, the method 2400 may include receiving, at a hardware implemented telemetry data analyzer that is executed by at least one hardware processor, movement and status metadata from at least one sensor of a UV during movement of the UV. The movement and status metadata may include time and location information for the UV during the movement of the UV. For example, referring to Figures 12, 13, and 16, the hardware implemented telemetry data analyzer 2136 may receive movement and status metadata (e.g., JSON format movement and status metadata) from at least one sensor 2106 of a UV 2104 during movement of the UV 2104.
[0176] At block 2404, the method 2400 may include receiving, at a hardware implemented stream data analyzer that is executed by the at least one hardware processor, an unformatted data stream from the at least one sensor of the UV. For example, referring to Figures 12, 13, and 16, the hardware implemented stream data analyzer 2138 may receive an unformatted data stream from the at least one sensor 2106 of the UV 2104.
[0177] At block 2406, the method 2400 may include injecting, by a hardware implemented data integrator that is executed by the at least one hardware processor, the time and location information into metadata of the unformatted data stream to generate a TLC stream. For example, referring to Figures 12, 13, and 16, the hardware implemented data integrator 2142 may inject the time and location information into metadata of the unformatted data stream to generate a TLC stream.
[0178] At block 2408, the method 2400 may include analyzing, by a hardware
implemented TLC stream data analyzer that is executed by the at least one
hardware processor, the TLC stream to identify an event related to the UV. For
example, referring to Figures 12, 13, and 16, the hardware implemented TLC
stream data analyzer 2148 may analyze the TLC stream to identify an event
related to the UV.
[0179] At block 2410, the method 2400 may include generating, at a
hardware
implemented event orchestrator that is executed by the at least one hardware
processor, a notification related to the event. For example, referring to
Figures 12,
13, and 16, the hardware implemented event orchestrator 2152 may generate a
notification related to the event.
[0180] Figure 16 shows a computer system 2500 that may be used with
the
examples described herein. The computer system may represent a generic
platform that includes components that may be in a server or another computer
system. The computer system 2500 may be used as a platform for the system
2100. The computer system 2500 may execute, by a processor (e.g., a single or
multiple processors) or other hardware processing circuit, the methods,
functions
and other processes described herein. These methods, functions and other
processes may be embodied as machine readable instructions stored on a
computer readable medium, which may be non-transitory, such as hardware
storage devices (e.g., RAM (random access memory), ROM (read only memory),
EPROM (erasable, programmable ROM), EEPROM (electrically erasable,
programmable ROM), hard drives, and flash memory).
[0181] The computer system 2500 may include a processor 2502 that
may
implement or execute machine readable instructions performing some or all of
the
methods, functions and other processes described herein. Commands and data
from the processor 2502 may be communicated over a communication bus 2504.
The computer system may also include a main memory 2506, such as a random
access memory (RAM), where the machine readable instructions and data for the
processor 2502 may reside during runtime, and a secondary data storage 2508,
which may be non-volatile and stores machine readable instructions and data.
The
memory and data storage are examples of computer readable mediums. The
memory 2506 may include a UV movement and data controller 2520 including
machine readable instructions residing in the memory 2506 during runtime and
executed by the processor 2502. The UV movement and data controller 2520 may
include the elements of the system 2100 shown in Figure 12.
[0182] The computer system 2500 may include an I/O device 2510, such as a
keyboard, a mouse, a display, etc. The computer system may include a network
interface 2512 for connecting to a network. Other known electronic components
may be added or substituted in the computer system.
Clause 1. An unmanned vehicle (UV) movement and data control system
comprising:
a hardware implemented mission controller, executed by at least one
hardware processor, to control a UV according to a movement plan;
a hardware implemented telemetry data analyzer, executed by the at least
one hardware processor, to
receive, from the mission controller, formatted movement and status
metadata from at least one sensor of the UV during movement of the UV
according
to the movement plan, and
buffer the movement and status metadata for forwarding to a
hardware implemented data integrator, wherein the movement and status metadata
includes time and location information for the UV during the movement of the
UV
according to the movement plan;
a hardware implemented stream data analyzer, executed by the at least one
hardware processor, to receive an unformatted data stream from the at least
one
sensor of the UV, wherein the data integrator is executed by the at least one
hardware processor to inject the time and location information into metadata
of the
unformatted data stream to generate a time and location correlated (TLC)
stream;
a hardware implemented TLC stream data analyzer, executed by the at
least one hardware processor, to analyze the TLC stream to identify an event
related to the UV; and
a hardware implemented event orchestrator, executed by the at least one
hardware processor, to generate a notification related to the event.
Clause 2. The UV movement and data control system according to
clause 1,
wherein the mission controller is to convert the movement and status metadata
to a
JavaScript Object Notation (JSON) format for processing by the data
integrator.
Clause 3. The UV movement and data control system according to clause 1,
wherein the at least one sensor includes a video camera, a gas detector, an
infrared (IR) camera, and a pressure sensor.
Clause 4. The UV movement and data control system according to clause 1,
wherein the data integrator comprises:
a hardware implemented time and location service analyzer, executed by
the at least one hardware processor, to generate, based on the movement and
status metadata, a model of a state of the UV during the movement of the UV
according to the movement plan.
Clause 5. The UV movement and data control system according to clause 4,
wherein the data integrator comprises:
a hardware implemented stream integrator, executed by the at least one
hardware processor, to:
extract frames of data from the unformatted data stream;
retrieve the time and location information from the model of the state
of the UV; and
inject the time and location information into the metadata of each of
the frames of the data from the unformatted data stream to generate TLC
frames,
wherein a collection of the TLC frames represents the TLC stream.

Clause 6. The UV movement and data control system according to clause 4,
wherein the data integrator comprises:
a hardware implemented stream integrator, executed by the at least one
hardware processor, to:
extract a frame of data from the unformatted data stream; and
pre-process the data from the unformatted data stream for detecting
a leak in a pipeline by passing the frame of the data through hue-saturation-
value
(HSV) based clustering to segment environment into distinct color patches,
wherein
the leak in the pipeline is detected by using a transform to extract the
pipeline
spanning the frame.
Clause 7. The UV movement and data control system according to clause 6,
wherein the TLC stream data analyzer is to utilize a histogram-based sliding
window to identify pixels of interest in the frame of the data.
Clause 8. The UV movement and data control system according to clause 6,
wherein the TLC stream data analyzer is to utilize Teh-Chin chain
approximation to
extract a blob that represents the leak in the pipeline.
Clause 9. The UV movement and data control system according to clause 8, wherein the TLC stream data analyzer is to utilize a naïve-Bayes classifier to classify the blob as a leak or not a leak.
Clause 10. The UV movement and data control system according to clause 9,
wherein the TLC stream data analyzer is to retain the classification of the
blob
based on corroboration of the classification with a plurality of frames of the
data
including the frame of the data.
Clause 11. The UV movement and data control system according to clause 1,
wherein the TLC stream data analyzer further comprises:
a hardware implemented stream processing unit (SPU) including a plurality
of hardware implemented event analyzers, executed by the at least one hardware
processor, to analyze the TLC stream to identify the event related to the UV.
Clause 12. The UV movement and data control system according to clause 1,
wherein the UV is an autonomous UV.
Clause 13. The UV movement and data control system according to clause 1,
wherein the UV is an unmanned aerial vehicle (UAV).
Clause 14. The UV movement and data control system according to clause 1,
wherein the TLC stream data analyzer is to analyze the TLC stream to identify
the
event that includes a potential leak or an intruder related to a pipeline.
Clause 15. A method for unmanned vehicle (UV) movement and data control, the method comprising:
receiving, at a hardware implemented telemetry data analyzer that is executed by at least one hardware processor, formatted movement and status metadata from at least one sensor of a UV during movement of the UV, wherein the movement and status metadata includes time and location information for the UV during the movement of the UV;
receiving, at a hardware implemented stream data analyzer that is executed by the at least one hardware processor, an unformatted data stream from the at least one sensor of the UV;
injecting, by a hardware implemented data integrator that is executed by the at least one hardware processor, the time and location information into metadata of the unformatted data stream to generate a time and location correlated (TLC) stream; and
analyzing, by a hardware implemented TLC stream data analyzer that is executed by the at least one hardware processor, the TLC stream to identify an event related to the UV.
Clause 16. The method for UV movement and data control according to clause 15, further comprising:
generating, by a hardware implemented time and location service analyzer that is executed by the at least one hardware processor, based on the movement and status metadata, a model of a state of the UV during the movement of the UV.
Clause 17. The method for UV movement and data control according to clause 16, further comprising:
extracting, by a hardware implemented stream integrator that is executed by the at least one hardware processor, a frame of data from the unformatted data stream; and
pre-processing, by the stream integrator, the data from the unformatted data stream to detect a leak in a pipeline by passing the frame of the data through hue-saturation-value (HSV) based clustering to segment the environment into distinct color patches, wherein the leak in the pipeline is detected by using a transform to extract the pipeline spanning the frame.
Clause 18. The method for UV movement and data control according to clause
16, further comprising:
utilizing, by the TLC stream data analyzer, Teh-Chin chain approximation to
extract a blob that represents the leak in the pipeline.
Clause 19. The method for UV movement and data control according to clause 16, further comprising:
utilizing, by the TLC stream data analyzer, a naïve-Bayes classifier to classify the blob as a leak or not a leak.
Clause 20. A non-transitory computer readable medium having stored thereon machine readable instructions for UV movement and data control, the machine readable instructions when executed cause at least one hardware processor to:
receive, at a hardware implemented telemetry data analyzer that is executed by at least one hardware processor, movement and status metadata from at least one sensor of a UV during movement of the UV, wherein the movement and status metadata includes time and location information for the UV during the movement of the UV;
receive, at a hardware implemented stream data analyzer that is executed by the at least one hardware processor, an unformatted data stream from the at least one sensor of the UV;
inject, by a hardware implemented data integrator that is executed by the at least one hardware processor, the time and location information into metadata of the unformatted data stream to generate a time and location correlated (TLC) stream;
analyze, by a hardware implemented TLC stream data analyzer that is executed by the at least one hardware processor, the TLC stream to identify an event related to the UV; and
generate, at a hardware implemented event orchestrator that is executed by the at least one hardware processor, a notification related to the event.
[0183] What has been described and illustrated herein is an example along with
some of its variations. The terms, descriptions and figures used herein are
set
forth by way of illustration only and are not meant as limitations. Many
variations
are possible within the spirit and scope of the subject matter, which is
intended to
be defined by the following claims -- and their equivalents -- in which all
terms are
meant in their broadest reasonable sense unless otherwise indicated.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

Title Date
Forecasted Issue Date 2017-03-07
(22) Filed 2015-02-13
Examination Requested 2015-02-13
(41) Open to Public Inspection 2015-08-14
(45) Issued 2017-03-07

Abandonment History

There is no abandonment history.

Maintenance Fee

Last Payment of $210.51 was received on 2023-12-06


 Upcoming maintenance fee amounts

Description Date Amount
Next Payment if small entity fee 2025-02-13 $125.00
Next Payment if standard fee 2025-02-13 $347.00

Note: If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following:

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Request for Examination $800.00 2015-02-13
Application Fee $400.00 2015-02-13
Maintenance Fee - Application - New Act 2 2017-02-13 $100.00 2016-12-08
Final Fee $366.00 2017-01-18
Maintenance Fee - Patent - New Act 3 2018-02-13 $100.00 2018-01-24
Maintenance Fee - Patent - New Act 4 2019-02-13 $100.00 2019-01-23
Maintenance Fee - Patent - New Act 5 2020-02-13 $200.00 2020-01-22
Maintenance Fee - Patent - New Act 6 2021-02-15 $200.00 2020-12-22
Maintenance Fee - Patent - New Act 7 2022-02-14 $204.00 2021-12-22
Maintenance Fee - Patent - New Act 8 2023-02-13 $203.59 2022-12-14
Maintenance Fee - Patent - New Act 9 2024-02-13 $210.51 2023-12-06
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
ACCENTURE GLOBAL SERVICES LIMITED
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents




Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Abstract 2015-02-13 1 22
Description 2015-02-13 81 3,123
Claims 2015-02-13 9 243
Drawings 2015-02-13 18 289
Representative Drawing 2015-07-17 1 8
Cover Page 2015-08-24 2 50
Description 2016-07-14 84 3,235
Claims 2016-07-14 9 258
Representative Drawing 2017-02-08 1 8
Cover Page 2017-02-08 2 51
Amendment 2016-07-14 17 596
Assignment 2015-02-13 3 112
Examiner Requisition 2016-01-15 4 244
Final Fee 2017-01-18 1 51