Patent Summary 3233393


(12) Patent Application: (11) CA 3233393
(54) French Title: EVITEMENT D'OBSTRUCTION
(54) English Title: OBSTRUCTION AVOIDANCE
Status: National Entry
Bibliographic Data
(51) International Patent Classification (IPC):
  • A01B 63/02 (2006.01)
  • B60W 30/08 (2012.01)
  • G06V 10/77 (2022.01)
  • G08G 1/16 (2006.01)
(72) Inventors:
  • KARISHETTI, DEEPAK RAJASEKHAR (United States of America)
  • GOYAL, SANKET (United States of America)
(73) Owners:
  • ZIMENO, INC. DBA MONARCH TRACTOR
(71) Applicants:
  • ZIMENO, INC. DBA MONARCH TRACTOR (United States of America)
(74) Agent: MBM INTELLECTUAL PROPERTY AGENCY
(74) Associate Agent:
(45) Issued:
(86) PCT Filing Date: 2021-09-30
(87) Open to Public Inspection: 2023-04-06
Licence Available: N/A
Dedicated to the Public: N/A
(25) Language of Filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Application Number: PCT/US2021/052780
(87) PCT International Publication Number: WO 2023055366
(85) National Entry: 2024-03-27

(30) Application Priority Data: N/A

Abstract

An obstruction avoidance system may include an agronomy vehicle, at least one sensor configured to output signals serving as a basis for a three-dimensional (3D) point cloud and to output signals corresponding to a two-dimensional (2D) image, and instructions to direct a processor to capture a particular 2D image and to obtain signals serving as a basis for a particular 3D point cloud; identify an obstruction candidate in the particular 2D image; correlate the obstruction candidate to a portion of the particular 3D point cloud to determine a value for a parameter of the obstruction candidate; classify the obstruction candidate as an obstruction based upon the parameter; and output control signals to alter a state of the agronomy vehicle in response to the obstruction candidate being classified as an obstruction.

Claims

Note: The claims are shown in the official language in which they were submitted.


WO 2023/055366
PCT/US2021/052780
WHAT IS CLAIMED IS:

1. An obstruction avoidance system comprising:
an agronomy vehicle comprising at least one sensor configured to output signals serving as a basis for a three-dimensional (3D) point cloud and to output signals corresponding to a two-dimensional (2D) image; and
an obstruction avoidance unit comprising:
a processor;
a non-transitory computer-readable medium containing instructions to direct the processor to:
output control signals to the at least one sensor to capture a particular 2D image proximate the agronomy vehicle;
output control signals to the at least one sensor to output the signals serving as a basis for a particular 3D point cloud;
identify an obstruction candidate in the particular 2D image;
correlate the obstruction candidate in the particular 2D image to a portion of the particular 3D point cloud to determine a value of a parameter of the obstruction candidate from the 3D point cloud;
classify the obstruction candidate as an obstruction based on the value; and
output control signals to alter a state of the agronomy vehicle in response to the obstruction candidate being classified as an obstruction.
CA 03233393 2024-03-27

2. The obstruction avoidance system of claim 1, wherein the parameter is selected from a group of parameters consisting of: a location of the obstruction candidate; a dimension of the obstruction candidate; and a height of the obstruction candidate.

3. The obstruction avoidance system of claim 1, wherein the agronomy vehicle comprises a tractor and an implement to be moved by the tractor.

4. The obstruction avoidance system of claim 1, wherein the at least one sensor comprises a stereo camera configured to output the signals serving as a basis for a three-dimensional (3D) point cloud and to output the signals corresponding to a two-dimensional (2D) image.

5. The obstruction avoidance system of claim 1 further comprising a stored library of objects, wherein the instructions are to direct the processor to identify the obstruction candidate in the 2D image by comparing portions of the 2D image with the objects in the stored library.

6. The obstruction avoidance system of claim 1, wherein the processor and the non-transitory computer-readable medium form a neural network that learns how to identify objects in the 2D image from a training set of images of objects.

7. The obstruction avoidance system of claim 1, wherein the instructions are to direct the processor to remove a ground plane from the 3D point cloud prior to the correlating of the obstruction candidate identified in the 2D image to the portion of the 3D point cloud.

8. The obstruction avoidance system of claim 1, wherein the agronomy vehicle comprises a tractor and an implement, wherein the at least one sensor is carried by the tractor and wherein the state of the agronomy vehicle altered in response to the obstruction candidate being classified as an obstruction is a state of the implement.

9. The obstruction avoidance system of claim 8, wherein the implement comprises a floating implement, wherein the dimension comprises a height of the obstruction and wherein the control signals raise or lower the implement based upon the determined location and height of the obstruction.

10. The obstruction avoidance system of claim 8, wherein the control signals are to raise or lower the implement based upon a comparison of a minimum clearance height for the implement and the determined height of the obstruction.

11. The obstruction avoidance system of claim 10 further comprising a stored library of minimum clearance heights for different implements.

12. The obstruction avoidance system of claim 10, wherein the instructions are to direct the processor to capture an image of the implement coupled to the tractor and to identify the implement based upon the image and wherein the instructions are to direct the processor to retrieve the minimum height for the implement from a stored library based upon the identification of the implement from the image.

13. The obstruction avoidance system of claim 8, wherein the instructions are to direct the processor to record a geographic location of the obstruction or a geographic location at which the state of the agronomy vehicle was altered.

14. The obstruction avoidance system of claim 8, wherein the instructions are to direct the processor to output control signals to alter a state of the implement coupled to the tractor by adjusting an operational state of the implement based upon the determined location and dimension of the obstruction.
15. A non-transitory computer-readable medium containing instructions to direct a processing unit, the instructions comprising:
2D image acquisition instructions for acquiring a two-dimensional (2D) image of a region proximate a tractor;
3D point cloud acquisition instructions for acquiring a three-dimensional (3D) point cloud of the region proximate the tractor;
obstruction identification instructions for identifying an obstruction candidate in the 2D image;
correlation instructions for correlating the obstruction candidate identified in the 2D image to a portion of the 3D point cloud;
obstruction candidate parameter acquisition instructions for determining a location of the obstruction candidate relative to the tractor and a dimension of the obstruction candidate based upon the portion of the 3D point cloud correlated to the obstruction candidate identified in the 2D image;
classification instructions for classifying the obstruction candidate as an obstruction based upon the location of the obstruction candidate relative to the tractor and the dimension of the obstruction candidate; and
obstruction response instructions for outputting control signals to alter a state of an implement coupled to the tractor based upon the determined location and the dimension of the obstruction.

Description

Note: The descriptions are shown in the official language in which they were submitted.


OBSTRUCTION AVOIDANCE
BACKGROUND
[0001] Agronomy vehicles, such as tractors and associated implements, are frequently used to carry out various tasks. As the agronomy vehicles traverse a terrain, obstacles or obstructions may be encountered. Such obstructions may cause damage to the agronomy vehicle, such as an implement being moved by a tractor.

BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 is a diagram schematically illustrating portions of an example obstruction avoidance system.
[0003] FIG. 2 is a diagram illustrating portions of an example non-transitory computer-readable medium of the example system of FIG. 1.
[0004] FIG. 3A is a flow diagram of an example obstruction avoidance method.
[0005] FIG. 3B is a flow diagram of an example obstruction avoidance method.
[0006] FIG. 4A is a diagram schematically illustrating an example 2D image including an example obstruction.
[0007] FIG. 4B is a diagram illustrating an example 3D point cloud corresponding to the 2D image of FIG. 4A.
[0008] FIG. 4C is a diagram illustrating the example 2D image of FIG. 4A with the identification of the example obstruction.

[0009] FIG. 4D is a diagram illustrating the example 3D point cloud of FIG. 4B after removal of the ground plane by processing.
[00010] FIG. 4E is a diagram illustrating the example 3D point cloud of FIG. 4D correlated to the example identified obstacle in the 2D image of FIG. 4C.
[00011] FIG. 5 is a diagram illustrating an example altering of the state of an implement by the example obstruction avoidance system of FIG. 1.
[00012] FIG. 6 is a diagram schematically illustrating portions of an example obstruction avoidance system.
[00013] FIG. 7 is a left front perspective view of an example tractor of an example obstruction avoidance system.
[00014] FIG. 8 is a left rear perspective view of the example tractor of FIG. 7 coupled to an example implement (schematically illustrated).
[00015] FIG. 9 is a flow diagram of an example obstruction avoidance method.
[00016] FIG. 10 is a diagram illustrating an example 2D image including example obstructions.
[00017] FIG. 11 is a diagram illustrating an example 3D point cloud corresponding to the 2D image of FIG. 10 and taken along a z-axis.
[00018] FIG. 12 is a diagram illustrating the example 3D point cloud of FIG. 11 taken along an x-axis.
[00019] FIG. 13 is a diagram illustrating the example 3D point cloud of FIG. 11 following removal of a ground plane by processing.

[00020] FIG. 14 is a diagram illustrating the example 3D point cloud of FIG. 12 following removal of the ground plane by processing.
[00021] FIG. 15 is a diagram illustrating the example point cloud of FIG. 14 correlated to the example obstacles identified in the example 2D image of FIG. 10.
[00022] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements. The FIGS. are not necessarily to scale, and the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples and/or implementations consistent with the description; however, the description is not limited to the examples and/or implementations provided in the drawings.
DETAILED DESCRIPTION OF EXAMPLES
[00023] Disclosed are example obstruction avoidance systems, obstruction avoidance methods and obstruction avoidance mediums that identify obstructions (also referred to as obstacles) in the path of an agronomy vehicle and that automatically respond to the upcoming obstruction to avoid or reduce damage to the agronomy vehicle. For purposes of this disclosure, an "agronomy vehicle" may comprise any vehicle that is to traverse a field, vineyard or orchard while interacting with the soil or plants. An agronomy vehicle may comprise a tractor and an associated implement that is pushed, pulled or carried by the tractor. An agronomy vehicle may comprise a self-propelled vehicle that incorporates components for interacting with the soil or plants in a field, vineyard or orchard. Interaction by an agronomy vehicle may involve tillage of the soil, the application of herbicide, insecticide or fertilizer to the soil or to plants, the pruning of plants, the cutting or turning of plants, and/or the harvesting of plants or their produce.

[00024] For example, upon encountering an obstruction, the state of an implement being towed by a tractor may be altered to reduce or avoid damage to the implement. The example obstruction avoidance systems, methods and mediums utilize a two-dimensional (2D) image to identify obstruction candidates. An obstruction candidate may be classified as an obstruction depending upon whether the obstruction candidate is expected to be contacted or engaged by the agronomy vehicle. To determine whether an obstruction candidate will be contacted or engaged by the agronomy vehicle, the example obstruction avoidance systems, obstruction avoidance methods and obstruction avoidance mediums utilize a 3D point cloud, corresponding to the 2D image, to determine a value for a parameter of the obstruction candidate. A 3D point cloud refers to a set of data points in space. Example parameters of an obstruction candidate that may be determined from points in a 3D point cloud include, but are not limited to, a location of the obstruction candidate and dimensions (length, width and/or height) of the obstruction candidate. If the obstruction candidate is classified as an obstruction due to one or more values of one or more parameters of the obstruction candidate, the state of the agronomy vehicle is altered.
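The location and dimension parameters described above can be estimated from the cluster of 3D points correlated to a candidate. The following is a minimal sketch, not the implementation described in this application: it assumes the cluster has already been isolated and uses an axis-aligned bounding box in a vehicle-fixed frame (x forward, y lateral, z up).

```python
import numpy as np

def candidate_parameters(points):
    """Estimate location and dimensions of an obstruction candidate from
    the cluster of 3D points correlated to it.

    points: (N, 3) array of x, y, z coordinates in a vehicle-fixed frame.
    Returns (location, length, width, height).
    """
    pts = np.asarray(points, dtype=float)
    mins = pts.min(axis=0)
    maxs = pts.max(axis=0)
    location = pts.mean(axis=0)          # centroid of the cluster
    length, width, height = maxs - mins  # axis-aligned extents
    return location, length, width, height
```

An oriented bounding box or a percentile-based height would be more robust to stray points, but the min/max form shows the idea.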
[00025] In some implementations, the altering of the state of the agronomy vehicle is automatic in response to the presence of an obstruction candidate classified as an obstruction. In some implementations, the altering of the state of the agronomy vehicle is carried out independent of the particular location and/or particular dimensions of the particular obstruction candidate. For example, an agronomy vehicle may be raised to a predefined height in response to the presence of an obstruction, regardless of the actual particular height of the obstruction. Said another way, the agronomy vehicle may be adjusted in the same manner in response to two different obstructions despite the two different obstructions having different locations and/or different heights. In other implementations, the altering of the state of the agronomy vehicle may vary depending upon the particular location and/or particular dimensions of the particular obstruction. For example, an agronomy vehicle may be raised to a first agronomy vehicle height in response to the presence of an obstruction having a first obstruction height and may be raised to a second different agronomy vehicle height in response to the presence of an obstruction having a second different obstruction height.
[00026] The identification of obstruction candidates in the 2D image may be achieved in several ways. In some implementations, the example obstruction avoidance systems, methods and mediums distinguish between objects that are predetermined or known not to be obstructions and those objects that may be obstructions. For example, crops or weeds which are intended to be engaged by an implement may not be of concern and may not constitute obstructions. Certain plants may be known to be sufficiently flexible so as to not damage an implement even when engaged by the implement. In some implementations, any object in the 2D image that is not identified as a known non-obstruction is identified as an obstruction candidate. The dimensions and/or locations of such identified known non-obstruction objects are not determined or, if determined, ignored.
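The known-non-obstruction filtering described above reduces to a label filter over the 2D detections. A minimal sketch; the label set and the `(label, box)` detection format are illustrative assumptions, not part of this application:

```python
# Labels assumed to be known non-obstructions (illustrative only).
KNOWN_NON_OBSTRUCTIONS = {"crop", "weed", "grass"}

def obstruction_candidates(detections):
    """Keep only 2D detections that are not known non-obstruction objects.

    detections: iterable of (label, bounding_box) pairs from the 2D image.
    Everything not recognized as a known non-obstruction remains a candidate.
    """
    return [(label, box) for label, box in detections
            if label not in KNOWN_NON_OBSTRUCTIONS]
```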
[00027] The remaining identified objects (objects not identified as known non-obstruction objects) constitute potential or candidate obstructions depending upon whether such remaining objects are within the path of the tractor and/or implement and depending upon the dimensions of such remaining objects. The example obstruction avoidance systems, obstruction avoidance methods and obstruction avoidance mediums correlate the remaining identified objects to corresponding clusters of points in a 3D point cloud corresponding to the 2D image to determine the location and/or dimensions of the remaining objects. Based upon the determined location and/or dimensions of the objects, the object may be classified as an obstruction for which a response may be determined. The example obstruction avoidance systems, methods and mediums alter a state of the implement coupled to the tractor in response to the object or obstruction candidate being classified as an obstruction.
[00028] In some implementations, the example obstruction avoidance systems, obstruction avoidance methods and obstruction avoidance mediums identify objects known to be non-obstruction objects. Any other object that cannot be identified as a non-obstruction object is correlated to a cluster of points in the 3D point cloud to determine its location and dimension so as to determine whether or not the object is an obstruction for which a response may be determined. In some implementations, the example obstruction avoidance systems, obstruction avoidance methods and obstruction avoidance mediums identify objects in the 2D image that may constitute an obstruction, wherein those objects identified as obstruction candidates are each correlated to a cluster of points in the 3D point cloud to determine the location and dimension of the obstruction candidate so as to determine whether or not the obstruction candidate is an actual obstruction, i.e., whether or not the obstruction candidate will be engaged by the implement given the characteristics of the implement and its current path as well as the location and dimensions of the obstruction candidate.
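One common way to perform the 2D-to-3D correlation described above is to project the 3D points into the image with a pinhole camera model and keep those that land inside the candidate's 2D bounding box. This sketch is an assumption about how such a correlation could be done, not the method this application prescribes; the intrinsics `fx`, `fy`, `cx`, `cy` are illustrative parameters.

```python
import numpy as np

def points_in_box(points, box, fx, fy, cx, cy):
    """Return the 3D points whose pinhole projection falls inside a 2D
    bounding box, i.e. the cluster correlated to a 2D obstruction candidate.

    points: (N, 3) array in the camera frame (x right, y down, z forward).
    box: (u_min, v_min, u_max, v_max) pixel bounds of the 2D candidate.
    fx, fy, cx, cy: camera intrinsics (focal lengths and principal point).
    """
    pts = np.asarray(points, dtype=float)
    z = np.where(pts[:, 2] > 0, pts[:, 2], np.nan)  # drop points behind the camera
    u = fx * pts[:, 0] / z + cx                     # pinhole projection
    v = fy * pts[:, 1] / z + cy
    u0, v0, u1, v1 = box
    mask = (u >= u0) & (u <= u1) & (v >= v0) & (v <= v1)  # NaN compares False
    return pts[mask]
```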
[00029] For example, an object in the 2D image may be identified as a rock. The rock identified in the 2D image is then correlated to a corresponding cluster of points in the 3D point cloud to determine the location and dimension of the rock so as to determine whether or not the rock (a) is located within the path of the tractor/implement and (b) has dimensions that would result in the rock being undesirably engaged by the implement. In some circumstances, the rock may have dimensions that would result in engagement with the implement, but the rock is not at a location in the current path of the implement. In some instances, the rock may be within the current path of the implement, but the rock may have dimensions such that the implement may pass over the rock without damage to the implement. In circumstances where the rock is within the path of the tractor/implement and has dimensions that would result in the rock being engaged or contacted by the implement, the rock may be classified as an obstruction, wherein an appropriate response may be determined.
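The rock example reduces to a two-part test: is the candidate within the swept path, and is it too tall for the implement to pass over? A minimal sketch of that classification rule; the parameter names and path model (a straight corridor of fixed half-width) are illustrative assumptions:

```python
def is_obstruction(lateral_offset, path_half_width, height, clearance):
    """Classify an obstruction candidate as an obstruction when it lies
    within the tractor/implement path AND is too tall to pass under.

    lateral_offset: lateral distance of the candidate from the path centerline.
    path_half_width: half the width swept by the tractor/implement.
    height: candidate height determined from the 3D point cloud.
    clearance: height the implement can pass over without damage.
    """
    in_path = abs(lateral_offset) <= path_half_width
    too_tall = height >= clearance
    return in_path and too_tall
```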
[00030] In some implementations, those objects in the 2D image that cannot be identified as either known non-obstructions (plants, weeds and the like) or known/predetermined potential obstructions or known obstruction candidates (rocks and the like) may nevertheless be treated or identified as obstruction candidates, wherein the unidentified objects in the 2D image are likewise correlated to clusters of points in the 3D point cloud to determine their location and dimension and to classify each of such unidentified objects as either (a) an obstruction for which a response may be needed or (b) not an obstruction in that the unidentified object will not be engaged by the implement when the tractor/implement passes by or over the unidentified object.
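The correlation steps above typically operate on a point cloud with the ground plane removed, as claim 7 and FIGS. 4D and 13-14 describe. A minimal RANSAC-style plane-removal sketch, assuming the ground is the dominant plane in the cloud; a production system would more likely use a library routine (e.g. a point-cloud library's plane segmentation):

```python
import numpy as np

def remove_ground_plane(points, threshold=0.05, iterations=100, seed=0):
    """Remove the dominant plane (the ground) from a 3D point cloud using
    a simple RANSAC plane fit, leaving the clusters above the ground.

    points: (N, 3) array. Returns points farther than `threshold` from the
    best-fit plane.
    """
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(pts), dtype=bool)
    for _ in range(iterations):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                 # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((pts - sample[0]) @ normal)
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return pts[~best_inliers]            # keep everything off the ground plane
```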
[00031] In some implementations, the example obstruction avoidance systems, obstruction avoidance methods and obstruction avoidance mediums identify obstruction candidates by identifying those objects in the 2D image that sufficiently match or correspond to predetermined or known obstructions, wherein only those objects in the 2D image that sufficiently match or correspond to the predetermined or known obstructions are identified as obstruction candidates.
[00032] The 2D image and the 3D point cloud are acquired by the tractor. In some implementations, different sensors are utilized to acquire the 2D image and the 3D point cloud. For example, the 2D image may be acquired with a camera while the 3D point cloud may be acquired with a different sensor, such as a LIDAR. In some implementations, the 2D image and the 3D point cloud may be acquired by a stereo camera carried by the tractor.
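Where a stereo camera supplies both outputs, depth (and hence the 3D point cloud) follows from the standard stereo relation Z = f·B/d. A sketch of that relation; the focal length and baseline values used below are illustrative, not from this application:

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth from stereo disparity using Z = f * B / d.

    disparity_px: pixel disparity of a feature between the left/right images.
    focal_px: focal length in pixels.
    baseline_m: distance between the stereo camera's lenses, in meters.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with an assumed 700 px focal length and 0.12 m baseline, a 50 px disparity puts a point 1.68 m from the camera; back-projecting each pixel at its depth yields the point cloud.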
[00033] In some implementations, the example obstruction avoidance systems, methods and mediums utilize a neural network to identify the obstruction candidate based upon criteria learned from a prior set of training images depicting various objects known to be non-obstruction objects and/or known to be obstruction objects. In yet other implementations, the identification of known non-obstruction objects and/or known obstruction objects is carried out using other image analysis.
[00034] In implementations where the agronomy vehicle comprises an implement moved by a tractor, the state of the implement being altered in response to the determined location and dimension of an obstruction may refer to the position or location at which the implement is supported or otherwise coupled to the tractor. In some implementations, in response to an obstruction having a determined location within the path of the tractor, the tractor may raise the implement to a height so as to not engage the obstruction and subsequently lower the implement back into operation after the obstruction has been passed by the implement. In some implementations, the tractor may redirect itself such that the tractor and/or implement does not engage the obstruction. In some implementations, the change in the state of the implement may refer to a change in an operation of the implement based upon the determined location and/or height of the obstruction. For example, rotating or translating parts of the implement may be repositioned, slowed, sped up or stopped based upon the determined location and/or height of the obstruction.
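The raise-then-lower behavior described above, and the clearance comparison of claim 10, can be sketched as a simple decision rule. The function name, return values and safety margin are illustrative assumptions, not the control interface of this application:

```python
def implement_command(obstruction_height, min_clearance, margin=0.05, passed=False):
    """Decide how to alter a floating implement's state for an obstruction.

    Compares the implement's minimum clearance height with the obstruction
    height determined from the 3D point cloud: raise the implement while an
    obstruction it cannot clear lies ahead, and lower it back into operation
    once the obstruction has been passed.
    """
    if passed:
        return "lower"                 # back into operation after the obstruction
    if obstruction_height + margin > min_clearance:
        return "raise"                 # would otherwise engage the obstruction
    return "hold"                      # implement already clears the obstruction
```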
[00035] For purposes of this disclosure, unless explicitly recited to the contrary, the determination of something "based on" or "based upon" certain information or factors means that the determination is made as a result of or using at least such information or factors; it does not necessarily mean that the determination is made solely using such information or factors. For purposes of this disclosure, unless explicitly recited to the contrary, an action or response "based on" or "based upon" certain information or factors means that the action is in response to or as a result of such information or factors; it does not necessarily mean that the action results solely in response to such information or factors.
[00036] For purposes of this disclosure, the term "coupled" shall mean the joining of two members directly or indirectly to one another. Such joining may be stationary in nature or movable in nature. Such joining may be achieved with the two members, or the two members and any additional intermediate members, being integrally formed as a single unitary body with one another or with the two members or the two members and any additional intermediate member being attached to one another. Such joining may be permanent in nature or alternatively may be removable or releasable in nature. The term "operably coupled" shall mean that two members are directly or indirectly joined such that motion may be transmitted from one member to the other member directly or via intermediate members.
[00037] For purposes of this disclosure, the term "processing unit" shall mean a presently developed or future developed computing hardware that executes sequences of instructions contained in a non-transitory memory. Execution of the sequences of instructions causes the processing unit to perform steps such as generating control signals. The instructions may be loaded in a random-access memory (RAM) for execution by the processing unit from a read only memory (ROM), a mass storage device, or some other persistent storage. In other embodiments, hard wired circuitry may be used in place of or in combination with software instructions to implement the functions described. For example, a controller may be embodied as part of one or more application-specific integrated circuits (ASICs). Unless otherwise specifically noted, the controller is not limited to any specific combination of hardware circuitry and software, nor to any particular source for the instructions executed by the processing unit.
[00038] Disclosed is an example obstruction avoidance system that may include an agronomy vehicle, at least one sensor configured to output signals serving as a basis for a three-dimensional (3D) point cloud and to output signals corresponding to a two-dimensional (2D) image, a processor, and a non-transitory computer-readable medium including instructions to direct the processor to: output control signals to capture a particular 2D image proximate the agronomy vehicle and to obtain signals serving as a basis for a particular 3D point cloud; identify an obstruction candidate in the particular 2D image; correlate the obstruction candidate in the particular 2D image to a portion of the particular 3D point cloud to determine a value for a parameter of the obstruction candidate; classify the obstruction candidate as an obstruction based upon the value of the parameter; and output control signals to alter a state of the agronomy vehicle in response to the obstruction candidate being classified as an obstruction.
[00039] Disclosed is an example obstruction avoidance system. The obstruction avoidance system may include a tractor for use with an implement, at least one sensor configured to output signals serving as a basis for a three-dimensional (3D) point cloud and to output signals corresponding to a two-dimensional (2D) image, a processor, and a non-transitory computer-readable medium including instructions to direct the processor to: output control signals to capture a particular 2D image proximate the tractor and to obtain signals serving as a basis for a particular 3D point cloud; identify an obstruction candidate in the particular 2D image; correlate the obstruction candidate in the particular 2D image to a portion of the particular 3D point cloud to determine a location of the obstruction candidate relative to the tractor and to determine a dimension of the obstruction candidate; classify the obstruction candidate as an obstruction based upon the location and the dimension; and output control signals to alter a state of an implement coupled to the tractor in response to the obstruction candidate being classified as an obstruction.
[00040] Disclosed is an example obstruction avoidance method. The method may include acquiring a two-dimensional (2D) image of a region proximate a tractor; acquiring a three-dimensional (3D) point cloud of the region proximate the tractor; identifying an obstruction candidate in the 2D image; correlating the obstruction candidate identified in the 2D image to a portion of the 3D point cloud; determining a location of the obstruction candidate relative to the tractor and a dimension of the obstruction candidate based upon the portion of the 3D point cloud correlated to the obstruction candidate identified in the 2D image; classifying the obstruction candidate as an obstruction based upon the location and the dimension; and outputting control signals to alter a state of an implement coupled to the tractor in response to the obstruction candidate being classified as an obstruction.
[00041] Disclosed is an example obstruction avoidance non-transitory computer-readable medium. The medium may include instructions for directing a processor to carry out obstruction avoidance operations. The instructions may include: (1) 2D image acquisition instructions for acquiring a two-dimensional (2D) image of a region proximate a tractor; (2) 3D point cloud acquisition instructions for acquiring a three-dimensional (3D) point cloud of the region proximate the tractor; (3) obstruction identification instructions for identifying an obstruction in the 2D image; (4) correlation instructions for correlating the obstruction identified in the 2D image to a portion of the 3D point cloud; (5) obstruction parameter acquisition instructions for determining a location of the obstruction relative to the tractor and a dimension of the obstruction based upon the portion of the 3D point cloud correlated to the obstruction identified in the 2D image; and (6) obstruction response instructions for outputting control signals to alter a state of an implement coupled to the tractor based upon the determined location and the determined dimension of the obstruction.
[00042] FIG. 1 schematically illustrates portions of an example obstruction avoidance system 20. Obstruction avoidance system 20 identifies obstructions in the path of an agronomy vehicle, such as a tractor and/or an implement being pushed, pulled or carried by the tractor, and automatically responds to the upcoming obstruction to avoid or reduce damage to the tractor and/or implement. Obstruction avoidance system 20 utilizes a two-dimensional (2D) image to identify an obstruction candidate from other objects that may exist within the path of the tractor/implement and correlates a three-dimensional (3D) point cloud corresponding to the 2D image with the obstruction candidate identified in the 2D image to determine a value for at least one parameter of the obstruction candidate, such as a value for a location and/or a dimension of the obstruction candidate. System 20 may further classify the obstruction candidate as an actual obstruction based upon the value of the at least one parameter. Obstruction avoidance system 20 may alter a state of the implement coupled to the tractor in response to the obstruction candidate being classified as an obstruction. In some implementations, a state of the implement may be adjusted or altered based upon the determined location and dimension of the obstruction candidate that has been classified as an obstruction. Obstruction avoidance system 20 comprises tractor 24 and obstruction avoidance unit 28.
[00043] Tractor 24 comprises an agronomy vehicle and is configured for
use with an implement, such as implement 25. Implement 25 may comprise a
floating implement. Implement 25 may be cantilevered or suspended at a rear
of tractor 24 by a three-point hitch. In some implementations, implement 25
may be elevated by its own tires, sled or other ground engaging members. In
some implementations, implement 25 may be selectively raised and lowered
using an actuator of tractor 24. In some implementations, implement 25 may
carry its own actuator to raise and lower itself in response to control signals
received from tractor 24. Examples of implement 25 include, but are not
limited to, implements having cutter bars such as a hay bine, chopper or the
like, tillage implements having shovels or blades that intersect the ground, or
other implements that engage the ground or extend in close proximity to the
surface of the ground.
[00044] Tractor 24 comprises at least one sensor 32. The at least one
sensor 32 is configured (a) to output signals serving as a basis for a 3D point
cloud 40, and (b) to output signals corresponding to a 2D image 42. In some
implementations, the at least one sensor 32 comprises a stereo camera
having two or more lenses with a separate image sensor or film frame for
each lens. The stereo camera outputs signals for both the 2D image and the
3D point cloud. In some implementations, the three-dimensional images
captured by the stereo camera may be transformed into the 3D point cloud
with photogrammetry software. In some implementations, a two-dimensional
camera may be used for the 2D image and other sensors, such as a LIDAR
sensor, may be used for the 3D point cloud.
[00045] Obstruction avoidance unit 28 automatically alters the state of
implement 25 based upon oncoming obstructions, that is, obstructions in the
current path of tractor 24 as tractor 24 moves through a field, orchard or
vineyard. Obstruction avoidance unit 28 comprises processing unit 50 and a
non-transitory computer-readable medium 52. Processing unit 50 follows
instructions contained in medium 52.
[00046] Non-transitory computer-readable medium 52 comprises a
persistent storage device storing recorded instructions for processing unit 50.
Examples of medium 52 include, but are not limited to, solid-state memory
(flash memory), disk memory and the like. As shown by FIG. 2, medium 52
comprises 2D image acquisition instructions 60, 3D point cloud acquisition
instructions 62, obstruction candidate identification instructions 64, image-
cloud correlation instructions 66, obstruction candidate parameter acquisition
instructions 68, obstruction classification instructions 69 and obstruction
response instructions 70. Instructions 60-70 direct processing unit 50 to
carry
out the example obstruction avoidance method 100 outlined in FIG. 3A.
[00047] As indicated by block 104 of method 100 in FIG. 3A, 2D image
acquisition instructions 60 direct processing unit 50 to acquire a 2D image of
a
region proximate to tractor 24. This region may be an area in front of tractor
24 and/or to either side of tractor 24. The region may include an area in the
current path of tractor 24 which will pass beneath implement 25 if the current
path of tractor 24 is maintained. Instructions 60 direct processing unit 50 to
output control signals actuating the at least one sensor 32 so as to output
the
signals corresponding to the 2D image 42.
[00048] FIG. 4A illustrates an example 2D image 142 captured by the at
least one sensor 32. In the example illustrated, the 2D image 142 comprises
a ground surface 144 having a ground plane 146, plants 148 (schematically
illustrated) which may be provided in two consecutive rows, and an
obstruction 150 resting upon the ground surface 144 between the plants 148.
[00049] As indicated by block 108 of method 100 in FIG. 3A, 3D point
cloud acquisition instructions 62 direct processing unit 50 to acquire a 3D
point cloud of the region proximate the tractor, the same region for which a
2D image was rendered and acquired. FIG. 4B illustrates an example 3D point
cloud 152 corresponding to the region depicted in 2D image 142. Each
individual point in 3D point cloud 152 may have an associated set of
Cartesian coordinates. In one implementation, the 3D point cloud is
generated from a stereo image captured by a stereo camera and transformed
to the 3D point cloud using photogrammetry software.
[00050] As indicated by block 112 of method 100 in FIG. 3A, obstruction
candidate identification instructions 64 direct processing unit 50 to identify
an obstruction candidate in the 2D image. An obstruction candidate may refer
to any object not intended to lie within the current path of tractor 24 and
implement 25. For example, plants in various crop rows are expected and
intended and may not constitute an obstruction candidate. By contrast, rocks,
logs or other similar
objects may not be intended and may constitute obstructions depending upon
their dimensions and/or locations.
[00051] In some implementations, a particular obstruction is identified in
the 2D image using optical recognition or image analysis, wherein
processing unit 50 analyzes different regions and pixels of the 2D image 142
to identify the outline of different structures. The outline may be compared to
a stored library of objects including obstructions, wherein the object or
structure in the 2D image is compared against the objects in the stored
library to determine whether the structure is an obstruction.
[00052] In some implementations, processing unit 50 and medium 52
may be part of a neural network that is trained to distinguish obstructions
from non-obstructions and is configured to identify an obstruction based upon
previously stored training images of obstructions. FIG. 4C illustrates the
previous two-dimensional image 142 with the identified obstruction candidate
150. It should be noted that the ground surface 144 and the plants 148 are
distinguished from obstruction candidate 150 and not identified or classified
as obstructions. Although the obstruction candidate is identified in the 2D
image, its dimensions and location are not readily available from the 2D
image 142.
[00053] As indicated by block 116 of method 100 in FIG. 3A, image-
cloud correlation instructions 66 direct processing unit 50 to correlate the
obstruction candidate identified in block 112 in the 2D image acquired in
block
104 to a portion (less than whole) of the 3D point cloud acquired in block
108.
As shown by FIG. 4D, in some implementations, to facilitate such correlation,
medium 52 may include instructions that further direct processing unit 50 to
identify and remove the ground plane 146 of 3D point cloud 152 prior to such
correlation. FIG. 4E illustrates correlation of the identified obstruction
candidate 150 from the 2D image 142 to a corresponding portion 160 of 3D
point cloud 152. In one implementation, the 2D image is digitally overlaid
upon the remaining portions of the 3D point cloud. Those portions of the 3D
point cloud overlaid by those portions of the 2D image previously identified
as
an obstruction candidate serve to provide Cartesian coordinates for the
identified obstruction candidate. In the example shown in FIG. 4E, overlay of
the identified obstruction candidate 150 identifies what individual points in
the
3D point cloud correspond to points of the identified obstruction candidate
150.
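The overlay described in this paragraph can be sketched in code. This is a minimal illustration under stated assumptions, not the patented implementation: it assumes each 3D point retains the pixel coordinates `(u, v)` it was reconstructed from (as is natural for a stereo-derived cloud), so points belonging to the obstruction candidate can be selected by the candidate's 2D bounding box. All names here are hypothetical.

```python
def points_in_box(cloud, box):
    """Return the 3D points whose source pixels fall inside a 2D box.

    cloud: list of (u, v, x, y, z) tuples; (u, v) is the pixel the
           point was reconstructed from, (x, y, z) its coordinates.
    box:   (u_min, v_min, u_max, v_max) bounding box of the candidate
           identified in the 2D image.
    """
    u_min, v_min, u_max, v_max = box
    return [(x, y, z) for (u, v, x, y, z) in cloud
            if u_min <= u <= u_max and v_min <= v <= v_max]
```

The returned points are the "portion 160" of the cloud: they carry the Cartesian coordinates that the 2D image alone cannot supply.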
[00054] As indicated by block 120 of method 100 in FIG. 3A, obstruction
parameter acquisition instructions 68 direct processing unit 50 to determine
parameters, such as a location and dimensions of the identified obstruction
candidate. For example, the individual points in the 3D point cloud that
correspond to the identified obstruction candidate as determined in block
116 each have Cartesian coordinates. These Cartesian coordinates may be
used to identify the parameters of the identified obstruction candidate. In
some implementations, given the predetermined mounting of the at least one
sensor 32 and the sensing range or field-of-view of the at least one sensor
32, processing unit 50 may use coordinates of the point cloud corresponding
to the identified obstruction in the 2D image to determine the positioning or
location of the obstruction relative to tractor 24.
[00055] As indicated by block 121 of method 100 in FIG. 3A,
classification instructions 69 direct processing unit 50 to determine whether
the obstruction candidate 150 is expected to be an actual obstruction to
implement 25 based on the values for at least one parameter determined in
block 120. The classification of obstruction candidate 150 as an actual
obstruction may be based upon the current or future path of the implement,
characteristics of the implement such as its height above the ground or soil,
the location of the obstruction candidate 150 relative to the current or future
path of the implement and the dimensions of the obstruction candidate 150
relative to characteristics of the implement. For example, in circumstances
where an
implement floats above the ground, an obstruction candidate 150 may be
classified as an actual obstruction if the height of the obstruction candidate
is
greater than the height at which the implement floats above the ground. In
circumstances where an implement engages the ground with ground
engaging members at predefined spacings (such as with a plow), an
obstruction candidate may be classified as an actual obstruction if a width of
the obstruction candidate is greater than the predefined spacings and is in a
path of one of the members.
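The two example classification rules just given can be expressed as a small predicate. This is a hedged sketch with hypothetical parameter names and units (meters); a real classifier would also account for the implement's path as described above.

```python
def is_obstruction(height, width, lateral_pos, clearance,
                   member_spacing=None, member_positions=()):
    """Classify a candidate per the two example rules in the text.

    - Floating implement: an obstruction if the candidate is taller
      than the height (clearance) at which the implement floats.
    - Ground-engaging implement: an obstruction if wider than the
      spacing between ground-engaging members and lying in the path
      of one of them (member_positions are lateral offsets).
    """
    if height > clearance:
        return True
    if member_spacing is not None and width > member_spacing:
        half = width / 2.0
        for m in member_positions:
            if abs(lateral_pos - m) <= half:
                return True
    return False
```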
[00056] In some implementations, the determination of whether an
obstruction candidate constitutes an actual obstruction may utilize both the
determined location and at least one dimension of the obstruction candidate
determined from the 3D point cloud. In some implementations, the
determination of whether an obstruction candidate constitutes an actual
obstruction may utilize just a determined location or just at least one
dimension of the obstruction candidate. For example, the at least one sensor
may have a field-of-view so as to capture data from only those regions that
will necessarily lie within the forthcoming path of the tractor and implement.
In such implementations, an obstruction candidate depicted in the 2D image,
by default, will be within the forthcoming path of the tractor/implement. In
such implementations, the classification of the obstruction candidate as an
actual obstruction may depend solely upon its dimensions, such as whether
or not the obstruction will have a height that causes the obstruction to be
engaged by the implement as the implement is moved across the obstruction.
[00057] As indicated by block 122 of method 100 in FIG. 3A, obstruction
response instructions 70 direct processing unit 50 to determine whether any
action should be taken and, if so, what action should be taken in
response to the obstruction candidate being classified as an actual
obstruction. In some implementations, the response may be automatic and
may be prescribed independent of the particular dimensions and/or locations
of the particular obstruction. For example, in response to any obstruction,
the
implement may be raised to a preestablished height for a predefined period of
time before being lowered once again.
[00058] In other implementations, the response may vary depending
upon the particular dimensions and/or location of the particular obstruction.
For example, processing unit 50 may use the determined location of the
obstruction to determine whether the obstruction lies in the current
trajectory
or path of tractor 24 and/or implement 25. In some implementations,
implement 25 may have a width greater than the width of tractor 24, wherein
the obstruction may be encountered by implement 25 while being avoided by
tractor 24. In particular implementations, using the determined location of
the
obstruction, processing unit 50 may determine what particular portion of
tractor 24 and/or implement 25 is expected to encounter the obstruction given
the current path or trajectory of tractor 24 and implement 25. Using the
determined dimensions of the obstruction, such as its transverse width and
height, processing unit 50 may determine whether the obstruction will freely
pass beneath the implement or will impact the implement at the current height
of the implement.
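The path check in this paragraph, where an implement wider than the tractor may encounter what the tractor itself avoids, can be sketched with a simple lateral-offset test. This assumes a hypothetical shared centerline frame and symmetric widths; it is an illustration, not the patented method.

```python
def encountered_by(lateral_offset, tractor_width, implement_width):
    """Return which unit's path the candidate lies in, if any.

    lateral_offset:  sideways distance (m) of the candidate from the
                     common centerline of tractor and implement.
    tractor_width,
    implement_width: overall widths (m); the implement may be wider.
    """
    if abs(lateral_offset) <= tractor_width / 2.0:
        return "tractor"
    if abs(lateral_offset) <= implement_width / 2.0:
        # Wider implement may strike what the tractor itself clears.
        return "implement"
    return None
```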
[00059] In some implementations, instructions 70 may direct processing
unit 50 to output control signals to an actuator to alter the trajectory or
path of
tractor 24 and the implement 25 such that the obstruction is completely
avoided, such that the obstruction may pass below selected portions of
implement 25 having a greater height above the ground surface as compared
to other portions of implement 25, or such that the obstruction will encounter
those portions of the implement 25 that may be less susceptible to damage
from an encounter with the obstruction. For example, certain portions of
implement 25 may be more resilient and able to bend without damage, may be more
easily repaired or replaced, or may have a lesser impact on the overall
operational performance of the implement.
[00060] In some implementations, instructions 70 may direct processing
unit 50 to output control signals for an actuator to adjust the positioning of
implement 25 so as to avoid encountering the obstruction or reduce contact
with the obstruction. For example, such control signals may cause an
actuator to move the implement transversely or in a sideways direction
relative to tractor 24 so as to reduce or avoid contact with the obstruction.
Such control signals may cause an actuator to raise implement 25 to reduce
or avoid contact with the obstruction. For example, using the determined
height of the obstruction, processing unit 50 may output control signals
causing an actuator (such as a hydraulic or pneumatic cylinder-piston
assembly) on tractor 24 or on implement 25 to raise the lowermost surface of
implement 25 to a height greater than the determined height of the obstruction.
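The lift computation implied here, raising the lowermost surface above the determined obstruction height, reduces to a small arithmetic helper. The safety margin is a hypothetical parameter, not something stated in the text.

```python
def required_lift(current_surface_height, obstruction_height, margin=0.05):
    """Amount (m) to raise the implement's lowermost surface so that it
    clears the obstruction by `margin` (an assumed safety margin).

    Returns 0.0 when the implement already clears the obstruction.
    """
    target = obstruction_height + margin
    return max(0.0, target - current_surface_height)
```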
[00061] FIG. 3B illustrates an example obstruction avoidance method
130. Method 130 is similar to method 100 except that method 130 comprises
blocks 132 and 134 in place of blocks 121 and 122. In block 132, unit 28
determines a location of the obstruction candidate relative to the tractor and a
dimension of the obstruction candidate based upon the portion of the 3D
point cloud correlated to the obstruction candidate identified in the 2D image.
In block 134, unit 28 outputs control signals to alter a state of the implement
coupled to the tractor based upon the determined location and the determined
dimension of the obstruction candidate classified as the obstruction.
[00062] FIG. 5 schematically illustrates one example of obstruction
avoidance unit 28 of tractor 24 outputting control signals to an actuator 80 on
tractor 24 and/or actuator 82 of implement 25 (both of which are schematically
illustrated) to raise implement 25 from the first lower position 84 shown in
broken lines to the raised position 86, wherein the lowermost surface 87 of
implement 25 is raised to a height above the height of obstruction candidate
150 that has been classified as an actual obstruction 150. As a result, tractor
24 and implement 25, when moving along a path, as indicated by arrow 88,
pass over the obstruction 150. In some implementations, actuator 80 may
comprise a hydraulic cylinder-piston assembly connected to a three-point
hitch or other structure that may pivot or translate so as to raise and lower or
transversely move implement 25. Actuator 82 may comprise a hydraulic
cylinder-piston assembly having ends operably coupled between a wheel axle
and a frame so as to raise and lower the bottom floor of implement 25 relative
to the wheel axle. In some implementations, actuator 80 or actuator 82 may
comprise a pneumatic pump which may increase the inflation of tires to raise
the bottom or floor of the implement 25 to a greater height.
[00063] In some implementations, obstruction avoidance unit 28 may
further determine when implement 25 will have fully passed over obstruction
150 based upon the location of obstruction 150, the speed of tractor 24 and
the dimensions of tractor 24 and implement 25. At such time, processor 50
may output control signals to actuator 80 and/or actuator 82 to return
implement 25 to its original height or original transverse position. In some
implementations, unit 28 may further output control signals which adjust the
operation of implement 25. For example, processing unit 50 may output
control signals based upon the determined location and/or height of the
obstruction such that rotating or translating parts of the implement may be
repositioned, slowed, sped up or stopped.
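The pass-over timing described in this paragraph, deciding when the implement has fully cleared the obstruction from the obstruction's location, the tractor's speed and the machine dimensions, can be sketched as a constant-speed travel-time calculation. The geometry convention (distance measured from the front of the tractor) is an assumption for illustration.

```python
def time_to_clear(distance_to_obstruction, tractor_length,
                  implement_length, speed):
    """Time (s) until a rear-mounted implement has fully passed the
    obstruction, assuming constant forward speed (m/s) and that
    `distance_to_obstruction` is measured from the tractor's front
    (a hypothetical geometry).
    """
    travel = distance_to_obstruction + tractor_length + implement_length
    return travel / speed
```

After this interval elapses, control signals could return the implement to its original height or transverse position, as the text describes.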
[00064] As further shown by FIG. 5, in some implementations,
obstruction avoidance unit 28 may record or store the location at which
obstruction 150 was identified and the associated adjustments to the
operation of implement 25 based upon the presence of the obstruction 150.
The location may be recorded as part of an obstruction/action map 90. The
generated map 90 may be subsequently used by a manager to take remedial
action such as removing obstruction 150 and/or addressing the prior inaction
or altered operations that resulted from the prior presence of obstruction 150
at the determined location. In some implementations, the generated map 90
may be specifically used by tractor 24 or other tractors to automatically
adjust the positioning or operation of implement 25 or other implements when
the tractors and implements once again have a path that intersects the
obstruction 150.
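The obstruction/action map 90 described above can be sketched as a simple record store keyed by geographic location. The record fields and the coordinate tolerance are hypothetical choices for illustration, not the patented data structure.

```python
from dataclasses import dataclass, field

@dataclass
class ObstructionRecord:
    lat: float       # geographic location of the obstruction
    lon: float
    height: float    # determined dimension (m)
    action: str      # e.g. a note of the adjustment that was made

@dataclass
class ObstructionMap:
    records: list = field(default_factory=list)

    def log(self, rec):
        """Record an obstruction and the action taken."""
        self.records.append(rec)

    def near(self, lat, lon, tol=1e-4):
        """Records near a location, e.g. for a revisiting tractor."""
        return [r for r in self.records
                if abs(r.lat - lat) <= tol and abs(r.lon - lon) <= tol]
```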
[00065] FIG. 6 is a diagram schematically illustrating portions of an
example obstruction avoidance system 220. FIG. 6 illustrates an example
where a stereo camera 232 is used to acquire both 2D images and a 3D point
cloud and where the processing unit 50 and the non-transitory computer-
readable medium 52 are part of a larger neural network 254 that learns how to
identify objects in 2D images based upon a set of human or computer defined
obstruction training images 256. Each of the training images may be a 2D
image of a defined or known non-obstruction object (plant, weed and the like
as described above) and/or a defined or known obstruction candidate (rock or
other structure that may cause damage to the implement). In some
implementations, a box or window may be drawn by a human about the
non-obstruction object or known obstruction candidate.
[00066] Processing unit 50, as part of the neural network 254, may
identify common features, factors or criteria with respect to human defined or
computer defined objects in the training images so as to use the same criteria
to identify an obstruction in a sample 2D image. For example, the neural
network may identify particular pixel colors, densities, clusters, boundaries,
shadings, lighting or the like common amongst human defined or computer
defined training portions in the training images and then, through optical
analysis, identify those portions of the sample image having the same
characteristic pixel colors, densities, clusters, boundaries, shadings, lighting
or the like of an obstruction, and identify the selected portion of the sample
2D image comprising the predefined non-obstruction or predefined obstruction
candidate. The remaining components and operations of tractor 24 and
implement 25 may be similar to that of system 20. For example, obstruction
avoidance unit 228 may be similar to obstruction avoidance unit 28 except
that obstruction avoidance unit 228 carries out the identification of the
obstruction candidate using a neural network.
[00067] FIGS. 7 and 8 are perspective views of an example obstruction
avoidance system 320. FIGS. 7 and 8 illustrate an example of how the
schematically illustrated obstruction avoidance system 20 may be embodied
in a tractor. Obstruction avoidance system 320 comprises tractor 324 and
obstruction avoidance unit 328.
[00068] Tractor 324 comprises an agronomy vehicle that may be
employed in various settings such as an agricultural setting, a residential
setting or a construction setting. Tractor 324 may be used for a variety of
agricultural, construction and residential purposes. Tractor 324
may be used to push or pull an implement. Tractor 324 may include
attachments, such as a bucket, blade, backhoe, or the like for digging,
displacing, and/or carrying various materials such as earthen materials,
animal waste and produce. Tractor 324 may include forks or other coupling
mechanisms for engaging pallets, bins, boxes, or the like, wherein the tractor
carries and/or lifts the engaged items.
[00069] Tractor 324 comprises chassis 400, ground propulsion members
402, drive/steering controller 404, input 405, agronomy vehicle cab 406,
drive/steering controller 416, global positioning system (GPS) units 420-1 and
420-2 (collectively referred to as GPS units 420), stereo cameras 422-1, 422-
2 (collectively referred to as cameras 422), three- point hitch 426 and
actuator
428.
[00070] Ground propulsion members 402 comprise members that
engage the underlying terrain and which are driven. In the example
illustrated, ground propulsion members 402 comprise rear wheels 450 and
front wheels 452. In the example illustrated, rear wheels 450 are driven by an
electrical drive while front wheels 452 are manipulated or turned by a steering
actuator. In other implementations, ground propulsion members 402 may
comprise tracks or other ground engaging members.
[00071] Drive/steering controller 416 comprises a processing unit and
associated non-transitory computer-readable medium containing instructions
for directing the processing unit to output control signals for controlling
the
steering and speed at which tractor 324 moves. Such control signals may be
generated in response to a computer program controlling automatic
navigation and automated operations of tractor 324. In some
implementations, the control signals may direct tractor 324 along a predefined
preprogrammed route or path between rows of plants or within a field,
orchard or vineyard. In some modes or in some implementations, such
control signals may be generated in response to inputs received from an
operator remote from tractor 324, not residing in cab 406. In some modes or
in some implementations, such control signals may be generated in response
to inputs received from an operator providing input which is captured by
camera unit 422. In some modes or in some implementations, such control
signals may be generated in response to inputs from an operator residing
within cab 406.
[00072] Cab 406 comprises a compartment in which an operator may be
seated when operating tractor 324. Cab 406 comprises a seat 460, a steering
wheel 462, a control console 464 and a roof 466. Roof 466 extends over
seat 460 and control console 464. In some implementations, roof 466
may be raised and lowered.
[00073] GPS units 420 are supported by roof 466. Each of GPS units
420 comprises a GPS antenna. In the example illustrated, GPS unit 420-1 is
located at a front end of roof 466, forward of a rear axle while GPS unit 420-
2
is located at a rear end of roof 466, rearward of the rear axle. GPS units 420
receive signals from satellites, from which the geographical location of
tractor
324, such as defined by its base link or rear axle center, may be determined.
In some implementations, tractor 324 may comprise a single GPS antenna.
Signals from the GPS unit 420 may be used to map locations of obstructions.
[00074] Cameras 422-1 and 422-2 are supported by roof 466 at a front
and a rear of roof 466, facing in forward and rearward directions,
respectively.
Each of stereo cameras 422 has two or more lenses with a separate image
sensor or film frame for each lens. The stereo camera outputs signals for both
the 2D image and the 3D point cloud. In some implementations, the three-
dimensional images captured by the stereo camera may be transformed into
the 3D point cloud with photogrammetry software. In some implementations,
a two-dimensional camera may be used for the 2D image and other sensors,
such as a LIDAR sensor, may be used for the 3D point cloud. Camera 422-1
captures stereo images in front of tractor 324. Camera 422-2 captures stereo
images towards a rear of tractor 324, towards implement 325.
[00075] Three-point hitch 426 is located at a rear of tractor 324 and is
configured to be coupled to an implement, such as implement 325. Three-
point hitch 426 is configured to be pivoted or articulated so as to raise and
lower implement 325. Actuator 428 comprises a powered component to move
three-point hitch 426 to raise and lower implement 325. In some
implementations, actuator 428 may comprise a hydraulic cylinder-piston
assembly or an electric solenoid.
[00076] Implement 325 comprises a floating implement, an implement
that is supported above the underlying ground surface during operation.
Implement 325 may be similar to implement 25 described above except that
implement 325 is specifically configured to be coupled to tractor 324 by three-
point hitch 426. In other implementations, implement 325 may be configured
to be connected or attached to tractor 324 in other fashions. In the example
illustrated, implement 325 is configured to be pivoted during raising as
indicated by arrow 327.
[00077] In the example illustrated, implement 325 further comprises an
implement component 329 (schematically illustrated). The implement
component 329 may comprise a powered mechanism which carries out
operations, such as a pump for driving fluid when implement 325 is a
sprayer, a cutting blade which is translated or rotated, a harvesting
mechanism or the like. In some implementations, implement component 329
may be actuated between different states and may be turned on, turned off or
have its speed adjusted in response to signals from tractor 324 and/or
obstruction avoidance unit 328.
[00078] Obstruction avoidance unit 328 is similar to obstruction
avoidance unit 28 described above. Obstruction avoidance unit 328
comprises processing unit 50 and computer-readable medium 52 described
above and is configured to carry out method 100. In the example illustrated,
obstruction avoidance unit 328 may store characteristics of implement 325 as
well as the characteristics of other implements, wherein said stored
characteristics may indicate a minimum clearance height for implement 325
above the underlying ground. In some implementations, unit 328 may utilize
images of implement 325 captured by camera 422-2 to determine the identity
of implement 325, wherein the stored minimum height of implement 325 is
retrieved. In some implementations, unit 328 may receive an input for the
minimum clearance height of implement 325 through input 405. In some
implementations, input 405 may comprise an operator input such as a
keyboard, touchpad, button, switch, speech recognition and associated
microphone, or the like.
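The clearance retrieval described here, preferring an operator-entered value via input 405 and otherwise falling back to stored characteristics of the identified implement, can be sketched as a lookup. The implement names and clearance values below are assumed examples, not values from the text.

```python
# Hypothetical stored characteristics: minimum clearance height per
# implement identity, in meters (assumed values for illustration).
IMPLEMENT_CLEARANCE = {
    "haybine": 0.10,
    "sprayer": 0.45,
    "tillage": 0.00,   # ground-engaging: floats with no clearance
}

def clearance_for(implement_id, manual_entry=None):
    """Prefer an operator-entered clearance; otherwise fall back to
    the stored characteristics for the identified implement."""
    if manual_entry is not None:
        return manual_entry
    return IMPLEMENT_CLEARANCE[implement_id]
```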
[00079] In some implementations, instructions 69 may direct processing
unit 50 to determine whether the obstruction candidate is an actual
obstruction based upon the location and dimensions of the obstruction
candidate. Instructions 69 may evaluate the current path or future navigation
of the tractor 324 and the associated implement to determine whether the
location of the obstruction candidate lies within the forthcoming path of the
implement. Instructions 69 may further evaluate whether the obstruction
candidate has a height or width which will cause the obstruction candidate to
be contacted by the implement. The obstruction candidate may be classified
as an obstruction in response to the obstruction candidate having a
determined location that lies within the future path of the implement and the
obstruction candidate having a
dimension which will result in unintended or potentially damaging engagement
with the implement.
[00080] In such implementations, instructions 70 may direct processing
unit 50 to compare the minimum clearance height for implement 325 to the
determined height of an obstruction candidate and/or its location to determine
whether implement 325 should be raised or the extent to which it should be
raised. In some implementations, instructions 70 may utilize the determined
height of the obstruction and/or its location to determine whether the
obstruction candidate is an actual obstruction and to determine whether the
operation of implement component 329 should be adjusted (being temporarily
paused, sped up, slowed down or otherwise changed). After the obstruction
has been passed by implement 325, instructions 70 may direct processing
unit 50 to output control signals lowering the implement to its normal
operating
height.
[00081] As with obstruction avoidance unit 28, obstruction avoidance
unit 328 may be configured to generate and store an obstruction/action map
90. In such implementations, instructions 70 may direct processing unit 50 to
record the geographic location of an obstruction or geographic location at
which a state of implement 325 or its component 329 was altered based upon
the determined location and dimension of the obstruction. A map 90 may
include multiple instances of different obstructions and different actions
taken
in response to such obstructions.
[00082] FIG. 9 is a flow diagram of an example obstruction avoidance
method 500 that may be carried out by system 320. As indicated by block
502 in FIG. 9, a stereo camera, such as camera 422-1 outputs a 2D image
642. FIG. 10 illustrates an example 2D image 642 which depicts obstructions
650.
[00083] As indicated by block 504 in FIG. 9, stereo camera 422-1 further
outputs signals that are utilized to generate a 3D point cloud. As indicated by
block 506, data preprocessing operations such as transforming the point
cloud to the tractor frame and removing invalid points (within the region of
interest) are carried out by obstruction avoidance unit 328.
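The preprocessing of block 506 may be sketched as below, assuming a planar (yaw-only) camera-to-tractor transform and a rectangular region of interest; a full implementation would apply the complete rigid-body transform:

```python
import math

def preprocess(points, yaw, tx, ty, roi):
    """Rotate/translate camera-frame points into the tractor frame and
    keep only valid points inside the region of interest (x_min, x_max,
    y_min, y_max). Invalid (NaN) returns are dropped."""
    x_min, x_max, y_min, y_max = roi
    c, s = math.cos(yaw), math.sin(yaw)
    out = []
    for x, y, z in points:
        if any(math.isnan(v) for v in (x, y, z)):  # drop invalid points
            continue
        xt, yt = c * x - s * y + tx, s * x + c * y + ty
        if x_min <= xt <= x_max and y_min <= yt <= y_max:
            out.append((xt, yt, z))
    return out
```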
[00084] FIG. 11 illustrates an example 3D point cloud 652 viewed along
the z-axis (top-down) corresponding to the 2D image 642 (generated from the
same stereo camera 422-1 at the same time as 2D image 642). FIG. 12
illustrates the example 3D point cloud 652 along the x-axis, illustrating how
tractor 324 may see obstructions.
[00085] As indicated by block 508, unit 328 applies a ground filter
algorithm which separates ground and non-ground points and retains the non-
ground points that represent the objects/obstructions. FIG. 13 illustrates the
example point cloud 652 along the z-axis following application of the ground
filter in block 508. FIG. 14 illustrates the example point cloud 652 along the
x-
axis following the application of the ground filter in block 508.
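One simple form of the ground filter of block 508 is a height threshold over an assumed flat ground plane; production systems typically fit the plane (for example, with RANSAC) rather than assume it, so this is a sketch under that simplifying assumption:

```python
def ground_filter(points, ground_z=0.0, tolerance=0.05):
    """Keep only points clearly above the (assumed flat) ground plane;
    everything within `tolerance` of ground_z is treated as ground."""
    return [p for p in points if p[2] > ground_z + tolerance]
```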
[00086] As indicated by block 510, unit 328 applies a clustering
algorithm to the remaining points in the 3D point cloud to remove any noise
and group such points together to represent individual objects.
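The clustering of block 510 can be illustrated with a greedy Euclidean region-growing sketch; the radius and minimum-cluster-size parameters are illustrative assumptions, and small clusters are discarded as noise:

```python
def euclidean_cluster(points, radius=0.5, min_size=2):
    """Group 3D points within `radius` of each other into clusters,
    discarding clusters smaller than `min_size` as noise."""
    unvisited = list(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            # neighbours of point i not yet assigned to any cluster
            near = [j for j in unvisited
                    if sum((points[i][k] - points[j][k]) ** 2
                           for k in range(3)) <= radius ** 2]
            for j in near:
                unvisited.remove(j)
            cluster.extend(near)
            frontier.extend(near)
        if len(cluster) >= min_size:
            clusters.append([points[i] for i in cluster])
    return clusters
```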
[00087] As indicated by block 514 and block 516, unit 328, as part of a
neural network, further applies a detection model learned from 2D training
images depicting various obstacles or obstructions. Processing unit 50 and
the non-transitory computer-readable medium 52 are part of a larger neural
network that learns how to identify obstructions in 2D images based upon a
set of human or computer defined obstruction training images 256. Each of
the training images may be a 2D image of an object that is predefined as a
non-obstruction or an obstruction candidate. In some implementations, a box
or window may be drawn by a human about the object. Processing unit 50, as
part of the neural network 254, may identify common features, factors or
criteria with respect to human-defined or computer-defined objects in the
training images so as to form the detection model that uses the same criteria
to identify an object in a sample 2D image. For example, the neural network
may identify particular pixel colors, densities, clusters, boundaries,
shadings,
lighting or the like common amongst human defined or computer defined
training portions in the training images and then, through optical analysis,
identify those portions of the sample image having the same characteristic
pixel colors, densities, clusters, boundaries, shadings, lighting or the like
of an
obstruction and identify the selected portion of the sample 2D image
comprising the obstruction.
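The training and detection described above can be caricatured with a deliberately tiny stand-in model that learns a single statistic (mean pixel brightness) from labelled crops. An actual system would use a convolutional detector; every name and threshold here is an illustrative assumption:

```python
def train(labelled_crops):
    """'Learn' the mean brightness of crops labelled as obstructions.
    labelled_crops is a list of (pixel_list, label) pairs."""
    means = [sum(c) / len(c) for c, label in labelled_crops
             if label == "obstruction"]
    return sum(means) / len(means)

def detect(region, model_mean, tol=30):
    """Flag a region whose mean brightness is close to the learned statistic."""
    return abs(sum(region) / len(region) - model_mean) <= tol
```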
[00088] As indicated by block 520, the determination of the presence of
an obstruction candidate in the 2D image or a lack of an obstruction candidate
in the 2D image is stored in remote storage or the "cloud". If an object in
the 2D image cannot be identified as a non-obstruction or as a known or
positive obstruction candidate, the object is identified as and treated as an
obstruction candidate along with any other obstruction candidates in the 2D
image that have been positively identified as obstruction candidates. In some
implementations, the determination may also be stored locally on tractor 324.
[00089] As indicated by block 522, in response to an obstruction
candidate being identified in the 2D image 642 in blocks 514-518, those points
in the 3D point cloud are correlated to the obstruction candidate to detect the
location of the obstruction candidate as well as its dimensions from the 3D
point cloud. FIG. 15 illustrates the detection of the example obstructions in
the 3D point cloud 652, as identified in the example 2D image 642. The
coordinates of the various points in the 3D point cloud 652 indicate both the
geographic location of the identified obstruction as well as its dimensions.
As
further indicated by block 522, unit 328 classifies the obstruction candidate
as
an actual obstruction based upon the determined location and dimensions of
the obstruction candidate.
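The 2D-to-3D correlation of block 522 can be sketched with a pinhole projection: 3D points are projected into the image, points landing inside the 2D detection box are kept, and their coordinates yield a location (range) and a dimension. The camera intrinsics (fx, fy, cx, cy) and the returned fields are assumptions for illustration:

```python
def correlate(points, box, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Keep 3D points whose pinhole projection falls inside the 2D
    detection box (x0, y0, x1, y1); report range and vertical extent."""
    x0, y0, x1, y1 = box
    hits = []
    for x, y, z in points:
        if z <= 0:  # behind the camera
            continue
        u, v = fx * x / z + cx, fy * y / z + cy
        if x0 <= u <= x1 and y0 <= v <= y1:
            hits.append((x, y, z))
    if not hits:
        return None
    zs = [p[2] for p in hits]
    ys = [p[1] for p in hits]
    return {"range": min(zs), "height": max(ys) - min(ys)}
```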
[00090] As indicated by block 526, unit 328 may use the identified
coordinates of the classified obstruction to track the identified obstructions,
monitoring their relative position to tractor 324 as tractor 324 is moving
through the field or vineyard. For example, unit 328 may use signals from
GPS units 420-1 and/or 420-2 to determine the current location or
coordinates of tractor 324. Unit 328 may further utilize such coordinates from
the GPS units at different times or utilize signals from wheel encoders
indicating the current speed of tractor 324 and signals from a potentiometer
indicating the current steering angle of tractor 324 to determine the current
path of tractor 324. As tractor 324 moves along the path, the relative
position
of the obstruction/obstacle to tractor 324 and an implement, such as
implement 325, may change. This change in the relative positioning may be
tracked so as to time any adjustments to the state of the implement to reduce
or avoid damage to the implement that might be caused by the detected
obstruction.
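The path prediction described in this paragraph can be sketched with a kinematic bicycle model fed by speed (from wheel encoders) and steering angle (from the potentiometer). The motion model itself is an assumption; the disclosure does not specify one:

```python
import math

def predict_position(x, y, heading, speed, steer_angle, wheelbase, dt):
    """Advance the tractor pose by dt using a kinematic bicycle model."""
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += (speed / wheelbase) * math.tan(steer_angle) * dt
    return x, y, heading

def relative_to_tractor(obstacle_xy, tractor_xy):
    """Obstacle position expressed relative to the tractor."""
    return (obstacle_xy[0] - tractor_xy[0], obstacle_xy[1] - tractor_xy[1])
```

Repeatedly comparing the predicted pose against the stored obstruction coordinates allows adjustments to the implement state to be timed before the obstruction is reached.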
[00091] As further illustrated by block 528 in FIG. 9, controller 404
and/or unit 328 may carry out tractor controls and may include an automated
floating control feature 530. As indicated by block 532, automatic or
automated floating control may be enabled or disabled. In response to the
floating implement control being disabled, control over the height or other
state of implement 325 may be static or may be under manual control as
indicated by block 534.
[00092] As indicated by block 536, the tracked position of the detected
object serves as an input for decision-making. In response to the obstacle
being within the predicted path of tractor 324, unit 328 outputs control signals
to drive/steering controller 404 as indicated by block 538. As indicated by
block 540, in response to the automatic floating control being enabled and in
response to the control signals in block 538, the state of implement 325 may be
altered based upon the determined location and dimension of the obstruction.
As indicated by block 542, one example altering of the state of implement 325
may be raising, lowering or lateral movement of the hitch 426 of tractor 324
based upon the relative position of the obstacle with respect to the tractor
324, implement 325 or their paths. In other implementations, the state of
implement 325 or any other implement coupled to tractor 324 may be altered
in other fashions.
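The gating of blocks 532-540 can be summarized in a small decision function; the signature and the raise-by-clearance policy are illustrative assumptions:

```python
def hitch_command(auto_float_enabled, obstacle_in_path,
                  obstacle_height, min_clearance, manual_height):
    """Return a target hitch height: manual height when automatic floating
    control is disabled, otherwise raise above an in-path obstacle by the
    minimum clearance margin."""
    if not auto_float_enabled:
        return manual_height          # block 534: static/manual control
    if obstacle_in_path:
        return obstacle_height + min_clearance  # blocks 538-542
    return manual_height
```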
[00093] Although the claims of the present disclosure are generally
directed to an obstruction avoidance system, medium and method, the
present disclosure is additionally directed to the features set forth in the
following definitions.
1. An obstruction avoidance system comprising:
a tractor for use with an implement, the tractor
comprising at least one sensor configured to output
signals serving as a basis for a three-dimensional (3D)
point cloud and to output signals corresponding to a two-
dimensional (2D) image; and
an obstruction avoidance unit comprising:
a processor;
a non-transitory computer-readable medium
containing instructions to direct the processor to:
output control signals to the at least
one sensor to capture a particular 2D image proximate the
tractor;
output control signals to the at least
one sensor to output the signals serving as a basis for a
particular 3D point cloud;
identify an obstruction candidate in
the particular 2D image;
correlate the obstruction candidate in
the particular 2D image to a portion of the particular 3D
point cloud to determine a location of the obstruction
candidate relative to the tractor and to determine a
dimension of the obstruction candidate;
classify the obstruction candidate as
an obstruction based on the location and
the dimension; and
output control signals to alter a state
of an implement coupled to the tractor in response to the
obstruction candidate being classified as an obstruction.
2. The obstruction avoidance system of definition 1, wherein the at
least one sensor comprises a stereo camera configured to output the signals
serving as a basis for a three-dimensional (3D) point cloud and to output the
signals corresponding to a two-dimensional (2D) image.
3. The obstruction avoidance system of definition 1 further
comprising a stored library of objects, wherein the instructions are to direct
the
processor to identify the obstruction candidate in the 2D image by comparing
portions of the 2D image with the objects in the stored library.
4. The obstruction avoidance system of definition 1, wherein the
processing unit and the non-transitory computer-readable medium form a
neural network that learns how to identify objects in the 2D image from a
training set of images of objects.
5. The obstruction avoidance system of definition 1, wherein the
instructions are to direct the processor to remove a ground plane from the 3D
point cloud prior to the correlating of the obstruction candidate identified
in the
2D image to the portion of the 3D point cloud.
6. The obstruction avoidance system of definition 1, wherein the
instructions are to direct the processor to perform clustering on the 3D point cloud.
7. The obstruction avoidance system of definition 1, wherein the
implement comprises a floating implement, wherein the dimension comprises
a height of the obstruction and wherein the control signals raise or lower the
implement based upon the determined location and height of the obstruction.
8. The obstruction avoidance system of definition 1, wherein the
control signals are to raise or lower the implement based upon a comparison
of a minimum clearance height for the implement and the determined height
of the obstruction.
9. The obstruction avoidance system of definition 8 further
comprising a stored library of minimum clearance heights for different
implements.
10. The obstruction avoidance system of definition 1, wherein the
instructions are to direct the processor to capture an image of the implement
coupled to the tractor and to identify the implement based upon the image and
wherein the instructions are to direct the processor to retrieve the minimum
height for the implement from a stored library based upon the identification
of
the implement from the image.
11. The obstruction avoidance system of definition 8 further
comprising an operator input to receive a minimum clearance height for the
implement from an operator.
12. The obstruction avoidance system of definition 1, wherein the
instructions are to direct the processor to record a geographic location of
the
obstruction or a geographic location at which the state of the implement was
altered based upon the determined location and dimension of the obstruction.
13. The obstruction avoidance system of definition 1, wherein the
instructions are to direct the processor to generate and store a map of
obstruction locations or geographic location at which the state of the
implement was altered based upon the determined location and dimension of
the obstruction.
14. The obstruction avoidance system of definition 1, wherein the
instructions are to direct the processor to output control signals to alter a
state
of the implement coupled to the tractor by adjusting an operational state of
the
implement based upon the determined location and dimension of the
obstruction.
15. The obstruction avoidance system of definition 1, wherein the
implement is coupled to the tractor behind the tractor.
16. The obstruction avoidance system of definition 1, wherein the
instructions are to direct the processor to control a duration at which the
state
of the implement is altered based upon a speed of the tractor.
17. A method comprising:
acquiring a two-dimensional (2D) image of a region
proximate a tractor;
acquiring a three-dimensional (3D) point cloud of the
region proximate the tractor;
identifying an obstruction candidate in the 2D image;
correlating the obstruction candidate identified in the 2D
image to a portion of the 3D point cloud;
determining a location of the obstruction candidate
relative to the tractor and a dimension of the obstruction
candidate based upon the portion of the 3D point cloud
correlated to the obstruction identified in the 2D image;
classifying the obstruction candidate as an obstruction
based upon the location and the dimension; and
outputting control signals to alter a state of an implement coupled to the
tractor in response to the obstruction candidate being classified
as an obstruction.
18. The method of definition 17, wherein the 2D image and the
three-dimensional point cloud are obtained from signals output by a stereo
camera carried by the tractor.
19. The method of definition 17 wherein the identification of the
obstruction candidate in the 2D image is performed by a neural network
trained using training images of obstructions.
20. The method of definition 17, wherein a ground plane in the 3D
point cloud is identified and removed prior to the correlating of the
obstruction
candidate identified in the 2D image to the portion of the 3D point cloud.
21. A non-transitory computer-readable medium containing
instructions to direct a processing unit, the instructions comprising:
2D image acquisition instructions for acquiring a two-
dimensional (2D) image of a region proximate a tractor;
3D point cloud acquisition instructions for acquiring a three-
dimensional (3D) point cloud of the region proximate the tractor;
obstruction identification instructions for identifying an
obstruction candidate in the 2D image;
correlation instructions for correlating the obstruction candidate
identified in the 2D image to a portion of the 3D point cloud;
obstruction candidate parameter acquisition instructions for
determining a location of the obstruction candidate relative to the
tractor and a dimension of the obstruction candidate based upon the
portion of the 3D point cloud correlated to the obstruction candidate
identified in the 2D image;
classification instructions for classifying the obstruction
candidate as an obstruction based upon the location of the obstruction
candidate relative to the tractor and the dimension of the obstruction
candidate; and
obstruction response instructions for outputting control signals to
alter a state of an implement coupled to the tractor based upon the determined
location and the dimension of the obstruction.
22. An obstruction avoidance system comprising:
an agronomy vehicle comprising at least one
sensor configured to output signals serving as a basis for
a three-dimensional (3D) point cloud and to output
signals corresponding to a two-dimensional (2D) image;
and
an obstruction avoidance unit comprising:
a processor;
a non-transitory computer-readable medium
containing instructions to direct the processor to:
output control signals to the at least
one sensor to capture a particular 2D image proximate the
tractor;
output control signals to the at least
one sensor to output the signals serving as a basis for a
particular 3D point cloud;
identify an obstruction candidate in
the particular 2D image;
correlate the obstruction candidate in
the particular 2D image to a portion of the particular 3D
point cloud to determine a value of a parameter of the
obstruction candidate from the 3D point cloud;
classify the obstruction candidate as
an obstruction based on the value; and
output control signals to alter a state
of the agronomy vehicle in response to the obstruction
candidate being classified as an obstruction.
23. The obstruction avoidance system of definition 22, wherein the
parameter comprises a location of the obstruction candidate.
24. The obstruction avoidance system of definition 22, wherein the
parameter comprises a dimension of the obstruction candidate.
25. The obstruction avoidance system of definition 22, wherein the
parameter comprises a height of the obstruction candidate.
26. The obstruction avoidance system of definition 22, wherein the
agronomy vehicle comprises a tractor and an implement to be moved by the
tractor.
27. The obstruction avoidance system of definition 22, wherein the
at least one sensor comprises a stereo camera configured to output the
signals serving as a basis for a three-dimensional (3D) point cloud and to
output the signals corresponding to a two-dimensional (2D) image.
28. The obstruction avoidance system of definition 22 further
comprising a stored library of objects, wherein the instructions are to direct
the
processor to identify the obstruction candidate in the 2D image by comparing
portions of the 2D image with the objects in the stored library.
29. The obstruction avoidance system of definition 22, wherein the
processing unit and the non-transitory computer-readable medium form a
neural network that learns how to identify objects in the 2D image from a
training set of images of objects.
30. The obstruction avoidance system of definition 22, wherein the
instructions are to direct the processor to remove a ground plane from the 3D
point cloud prior to the correlating of the obstruction candidate identified
in the
2D image to the portion of the 3D point cloud.
31. The obstruction avoidance system of definition 22, wherein the
instructions are to direct the processor to perform clustering on the 3D point cloud.
32. The obstruction avoidance system of definition 22, wherein the
agronomy vehicle comprises a tractor and an implement, wherein the at least
one sensor is carried by the tractor and wherein the state of the agronomy
vehicle altered in response to the obstruction candidate being classified as
an
obstruction is a state of the implement.
33. The obstruction avoidance system of definition 32, wherein the
implement comprises a floating implement, wherein the dimension comprises
a height of the obstruction and wherein the control signals raise or lower the
implement based upon the determined location and height of the obstruction.
34. The obstruction avoidance system of definition 32, wherein the
control signals are to raise or lower the implement based upon a comparison
of a minimum clearance height for the implement and the determined height
of the obstruction.
35. The obstruction avoidance system of definition 34 further
comprising a stored library of minimum clearance heights for different
implements.
36. The obstruction avoidance system of definition 34, wherein the
instructions are to direct the processor to capture an image of the implement
coupled to the tractor and to identify the implement based upon the image and
wherein the instructions are to direct the processor to retrieve the minimum
height for the implement from a stored library based upon the identification
of
the implement from the image.
37. The obstruction avoidance system of definition 36 further
comprising an operator input to receive a minimum clearance height for the
implement from an operator.
38. The obstruction avoidance system of definition 32, wherein the
instructions are to direct the processor to record a geographic location of
the
obstruction or a geographic location at which the state of the agronomy
vehicle was altered.
39. The obstruction avoidance system of definition 32, wherein the
instructions are to direct the processor to generate and store a map of
obstruction locations or geographic location at which the state of the
implement was altered.
40. The obstruction avoidance system of definition 32, wherein the
instructions are to direct the processor to output control signals to alter a
state
of the implement coupled to the tractor by adjusting an operational state of
the
implement based upon the determined location and dimension of the
obstruction.
41. The obstruction avoidance system of definition 32, wherein the
implement is coupled to the tractor behind the tractor.
42. The obstruction avoidance system of definition 32, wherein the
instructions are to direct the processor to control a duration at which the
state
of the implement is altered based upon a speed of the tractor.
[00094] Although the present disclosure has been described with
reference to example implementations, workers skilled in the art will recognize
that changes may be made in form and detail
without departing from the scope of the claimed subject matter. For example,
although different example implementations may have been described as
including features providing benefits, it is contemplated that the described
features may be interchanged with one another or alternatively be combined
with one another in the described example implementations or in other
alternative implementations. Because the technology of the present
disclosure is relatively complex, not all changes in the technology are
foreseeable. The present disclosure described with reference to the example
implementations and set forth in the following claims is manifestly intended
to
be as broad as possible. For example, unless specifically otherwise noted,
the claims reciting a single particular element also encompass a plurality of
such particular elements. The terms "first", "second", "third" and so on in
the
claims merely distinguish different elements and, unless otherwise stated, are
not to be specifically associated with a particular order or particular
numbering
of elements in the disclosure.