Patent Summary 3239010

(12) Patent Application: (11) CA 3239010
(54) French Title: COMMANDE D'ESPACE D'IMAGE POUR OUTILS ENDOVASCULAIRES
(54) English Title: IMAGE SPACE CONTROL FOR ENDOVASCULAR TOOLS
Status: Application compliant
Bibliographic Data
(51) International Patent Classification (IPC):
  • G16H 30/40 (2018.01)
  • A61B 34/20 (2016.01)
(72) Inventors:
  • BELL, DAVID JAMES (United States of America)
  • SGANGA, JAKE ANTHONY (United States of America)
  • KAHN, GREGORY (United States of America)
(73) Owners:
  • REMEDY ROBOTICS, INC.
(71) Applicants:
  • REMEDY ROBOTICS, INC. (United States of America)
(74) Agent: MERIZZI RAMSBOTTOM & FORSTER
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 2022-08-11
(87) Open to Public Inspection: 2023-06-01
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Application Number: PCT/US2022/040118
(87) PCT Publication Number: US2022040118
(85) National Entry: 2024-05-23

(30) Application Priority Data:
Application No.    Country/Territory              Date
17/810,102         (United States of America)     2022-06-30
63/264,531         (United States of America)     2021-11-24

Abstract

Systems and methods for image space control of a medical instrument are provided. In one example, a system is configured to display a two-dimensional medical image including a view of at least a distal end of an instrument. The system can determine, based on one or more fiducials on the instrument, a roll estimate of the instrument. The system further can receive a user input comprising a heading command to change a heading of the instrument within a plane of the medical image, or an incline command to change an incline of the instrument into or out of the plane of the medical image. Based on the roll estimate and the user input, the system can generate one or more motor commands configured to cause a robotic system coupled to the medical instrument to move the robotic medical instrument.

Claims

Note: The claims are shown in the official language in which they were submitted.


WHAT IS CLAIMED IS:
1. A computer-implemented system comprising at least one processor and at
least one electronic storage medium storing instructions configured to cause
the at least one
processor to:
display, on a graphical user interface, a two-dimensional medical image
including a view of at least a distal end of a medical instrument, the distal
end
including one or more fiducials positioned thereon and that are visible in the
medical
image;
determine, based on the one or more fiducials in the medical image, a roll
estimate of a current roll angle of the medical instrument;
receive a user input from a user input device, the user input comprising at
least one of:
a heading command to change a heading of the medical instrument
within a plane of the medical image, or
an incline command to change an incline of the medical instrument
into or out of the plane of the medical image;
based on the roll estimate and the user input, generate one or more motor
commands configured to cause a robotic system coupled to the medical
instrument to
move the robotic medical instrument according to the user input; and
cause the robotic medical system to move the robotic medical system based on
the one or more motor commands.
2. The system of Claim 1, wherein the one or more motor commands comprise
pullwire commands configured to actuate one or more pullwires of the medical
instrument.
3. The system of Claim 1, wherein the roll estimate is determined based on
a
two-dimensional appearance of the one or more fiducials in the medical image.
4. The system of Claim 3, wherein the at least one processor is configured
to
determine the roll estimate based on a computer vision analysis of the one or
more fiducials
in the medical image.
5. The system of Claim 3, wherein the processor is further configured to:
display, on the graphical user interface, a plurality of sample images, each
sample image comprising a shape corresponding to a sample two-dimensional
projection of the one or more fiducials onto a plane at a different roll
angle; and
receive a sample selection on the user input device, wherein the sample
selection comprises an indication of a sample image that most closely
corresponds to
the two-dimensional appearance of the one or more fiducials in the medical
image;
wherein the roll estimate is determined based on the sample selection.
6. The system of Claim 1, wherein the heading command to change the heading
of the medical instrument comprises an indication to move the distal end of
the medical
instrument to the left or to the right within the plane of the medical image
relative to a current
heading of the medical instrument.
7. The system of Claim 1, wherein the incline command to change the incline
of
the medical instrument comprises an indication to move the distal end of the
medical
instrument into or out of the plane of the medical image relative to the
current heading of the
medical instrument.
8. The system of Claim 1, wherein the heading command to change the heading
of the medical instrument comprises an indication of a desired heading for the
distal end of
the medical instrument within the plane of the medical image.
9. The system of Claim 1, wherein the incline command to change the incline
of
the medical instrument comprises an indication of a desired incline of the
distal end of the
medical instrument into or out of the plane of the medical image.
10. The system of Claim 1, wherein the processor is further configured to,
based
on the one or more fiducials in the medical image, determine a current incline
of the distal
end of the medical instrument into or out of the plane of the medical image.
11. The system of Claim 10, wherein the processor is further configured to
display, on the graphical user interface. an indication of the current incline
of the distal end
of the medical instrument.
12. The system of Claim 1, wherein the processor is further configured to,
based
on the medical image, determine a current heading of the distal end of the
medical instrument
within the plane of the medical image.
13. The system of Claim 12, wherein the processor is further configured to
display, on the graphical user interface, an indication of the current heading
of the distal end
of the medical instrument.
14. The system of Claim 1, wherein the graphical user interface comprises
the
user input device.
15. The system of Claim 3, wherein the one or more fiducials are configured
such
that the two-dimensional appearance of the fiducials within the medical image
is visually
distinguishable for different roll angles and different inclines of the
medical instrument.
16. A method, comprising:
displaying, on a graphical user interface, a two-dimensional medical image
including a view of at least a distal end of a medical instrument, the distal
end
including one or more fiducials positioned thereon and that are visible in the
medical
image;
determining, based on the one or more fiducials in the medical image, a roll
estimate of a current roll angle of the medical instrument;
receiving a user input from a user input device, the user input comprising at
least one of:
a heading command to change a heading of the medical instrument
within a plane of the medical image, or
an incline command to change an incline of the medical instrument
into or out of the plane of the medical image;
based on the roll estimate and the user input, generating one or more motor
commands configured to cause a robotic system coupled to the medical
instrument to
move the robotic medical instrument according to the user input; and
cause the robotic medical system to move the robotic medical system based on
the one or more motor commands.
17. The method of Claim 16, wherein the one or more motor commands comprise
pullwire commands configured to actuate one or more pullwires of the medical
instrument.
18. The method of Claim 16, wherein the roll estimate is determined based
on a
two-dimensional appearance of the one or more fiducials in the medical image.
19. The method of Claim 18, wherein determining the roll estimate is based
on a
computer vision analysis of the one or more fiducials in the medical image.
20. The method of Claim 18, further comprising:
displaying, on the graphical user interface, a plurality of sample images,
each
sample image comprising a shape corresponding to a sample two-dimensional
projection of the one or more fiducials onto a plane at a different roll
angle; and
receiving a sample selection on the user input device, wherein the sample
selection comprises an indication of a sample image that most closely
corresponds to
the two-dimensional appearance of the one or more fiducials in the medical
image;
wherein the roll estimate is determined based on the sample selection.
21. The method of Claim 16, wherein the heading command to change the
heading of the medical instrument comprises an indication to move the distal
end of the
medical instrument to the left or to the right within the plane of the medical
image relative to
a current heading of the medical instrument.
22. The method of Claim 16, wherein the incline command to change the
incline
of the medical instrument comprises an indication to move the distal end of
the medical
instrument into or out of the plane of the medical image relative to the
current heading of the
medical instrument.
23. The method of Claim 16, wherein the heading command to change the
heading of the medical instrument comprises an indication of a desired heading
for the distal
end of the medical instrument within the plane of the medical image.
24. The method of Claim 16, wherein the incline command to change the
incline
of the medical instrument comprises an indication of a desired incline of the
distal end of the
medical instrument into or out of the plane of the medical image.
25. The method of Claim 16, further comprising, based on the one or more
fiducials in the medical image, determining a current incline of the distal
end of the medical
instrument into or out of the plane of the medical image.
26. The method of Claim 25, further comprising, displaying, on the
graphical user
interface, an indication of the current incline of the distal end of the
medical instrument.
27. The method of Claim 16, further comprising, based on the medical image,
determining a current heading of the distal end of the medical instrument
within the plane of
the medical image.
28. The method of Claim 27, further comprising, displaying, on the
graphical user
interface, an indication of the current heading of the distal end of the
medical instrument.
29. The method of Claim 16, wherein the graphical user interface comprises
the
user input device.
30. The method of Claim 18, wherein the one or more fiducials are
configured
such that the two-dimensional appearance of the fiducials within the medical
image is
visually distinguishable for different roll angles and different inclines of
the medical
instrument.

Description

Note: The descriptions are shown in the official language in which they were submitted.


IMAGE SPACE CONTROL FOR ENDOVASCULAR TOOLS
PRIORITY APPLICATIONS
[0001] This application is a continuation in part of U.S.
App. No. 17/810,102,
filed June 30, 2022, which claims priority to U.S. Provisional Application No.
63/202,963,
filed July 1, 2021, and to U.S. Provisional Application No. 63/264,531, filed
November 24,
2021, each of which is incorporated herein by reference.
BACKGROUND
Field
[0002] The present application is directed to control
systems for endovascular and
other intraluminal tools or medical instruments, such as catheters. In some
embodiments, the
devices, systems, and methods described herein can be included in or used in
conjunction
with robotic medical systems configured to facilitate control and operation of
the medical
instrument.
Description
[0003] Endovascular medical procedures are common. During
an endovascular
procedure, a tool or medical instrument that is generally configured as a
long, thin, flexible
body is inserted into and navigated through a lumen or other cavity of the
body.
[0004] In some instances, the tools or medical instruments
are articulable or
controllable, for example, using one or more pull wires, to allow an operator
to navigate the
tool or medical instrument within the body. Such navigation is often
accomplished through
deflection (for example, bending) of the distal tip of the tool or medical
instrument.
[0005] Some tools or medical instruments are configured for
manual control, for
example, using knobs or levers mounted on a proximally located handle of the
tool or
medical instrument. In other instances, the tools or medical instruments can
be configured for
robotic control, for example, control by a robotic medical system. In some
embodiments, an
operator can use the robotic medical system (for example, a controller, user
interface, and/or
the like) to robotically control the tool or medical instrument.
SUMMARY
[0006] This application describes devices, systems, and
methods for controlling
endovascular or other intraluminal tools during a medical procedure. In some
embodiments,
control inputs are provided with respect to a plane of a two-dimensional
medical image such
as an X-ray. For example, control inputs can be provided to adjust a heading
of an instrument
within the plane of the medical image, adjust an incline of the instrument
into or out of the
plane of the medical image, and/or to insert or retract the medical
instrument. Providing a
control scheme in which control inputs are provided by a user with respect to
the plane of the
medical image can advantageously facilitate intuitive and natural control of
the instrument.
In some instances, such a control scheme is referred to herein as "image space
control"
because control inputs are provided with respect to the plane of a two-
dimensional medical
image.
[0007] Articulating the instrument, either to adjust the
heading of the instrument
within the two-dimensional plane of the medical image or to adjust the incline
of the
instrument into or out of the plane of the medical image, typically requires
an accurate
understanding of the current roll angle of the instrument about its
longitudinal axis. During a
medical procedure it can be difficult for a human user controlling the
instrument to keep
track of or understand the current roll of the instrument, especially as the
instrument is
navigated through generally tortuous paths, such as luminal networks of the
body. In some
embodiments, the control scheme described herein allows for determination or
estimation of
the current roll angle of the instrument based on an appearance of one or more
radio-opaque
fiducials that are included on the medical instrument, and which are visible
within the
medical image.
[0008] For example, computer vision can be used to analyze
a two-dimensional
medical image to determine the position and/or orientation (including, in some
examples,
current roll) of a catheter based on radio-opaque markers that are included on
the catheter. In
some instances, five degrees of freedom for the catheter can be determined:
two positional
degrees of freedom (e.g., x and y position) and three degrees of freedom
relating to
orientation (e.g., heading, incline, and roll). Various example configurations
for the radio-
opaque markers are disclosed. The use of other configurations for the radio-
opaque markers
is also possible, and this disclosure should not be limited to only the
disclosed configurations.
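
For illustration only (this data structure is not recited in the application), the five degrees of freedom described above can be collected into a small record, with the marker-detection step left abstract because the application does not prescribe a particular computer vision model:

    from dataclasses import dataclass

    @dataclass
    class CatheterPose:
        """Five-degree-of-freedom pose estimated from a single two-dimensional image."""
        x: float        # tip position along the image x-axis
        y: float        # tip position along the image y-axis
        heading: float  # pointing direction within the image plane (radians)
        incline: float  # angle into or out of the image plane (radians)
        roll: float     # rotation about the catheter's longitudinal axis (radians)

    def estimate_pose(image, detect_markers) -> CatheterPose:
        # 'detect_markers' stands in for any detector (for example, a neural
        # network) that maps the radio-opaque marker appearance in the image to
        # the five quantities above; it is assumed here, not defined.
        return CatheterPose(**detect_markers(image))
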
[0009] The devices, systems, and methods described herein
can provide several
notable advantages over existing technologies. For example, position and
orientation can be
determined using minimally sophisticated medical imaging (e.g., single plane X-
ray).
Suitable medical imaging devices are widely available, allowing the devices,
systems, and
methods described herein to be widely available. Additionally, position and
orientation can
be determined without the additional hardware that is often required by other
systems. For
example, existing systems often determine position and orientation using
electromagnetic
sensors and electromagnetic field generators. Such systems are cumbersome and
relatively
inaccurate, requiring precise registrations with other forms of data (e.g.,
medical imaging
data, computer models, robotic movement data, etc.) in order to be useful.
With the
principles described herein, the need for such systems can be avoided.
Finally, the devices,
systems, and methods of the present application can allow for safer and more
precise control
of a catheter. This can, in turn, facilitate remote or autonomous control of
the catheter.
Additionally, the control schemes described herein can facilitate natural and
intuitive control
of an instrument, allowing the user to provide inputs with respect to the
plane of the medical
image that the user is currently viewing, and in some cases, without requiring
the user to
actively consider the current roll of the instrument. These and other benefits
and advantages
of the application will become more apparent after considering the disclosure
and drawings
in the Detailed Description section below.
[0010] In a first aspect, a computer-implemented system
comprises at least one
processor and at least one electronic storage medium, the electronic storage
medium storing
instructions configured to cause the at least one processor to: receive, from
a medical
imaging device, a two-dimensional medical image including a view of at least a
distal portion
of a medical instrument, the distal portion of the medical instrument
including one or more
fiducials positioned thereon, the one or more fiducials being radio-opaque and
visible in the
medical image; detect, within the medical image, a two-dimensional appearance
of the one or
more fiducials; and based on the two-dimensional appearance of the one or more
fiducials,
determine at least one of: a roll angle of the distal portion of the medical
instrument, and an
incline of the distal portion of the medical instrument.
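
As a hedged geometric illustration of the incline portion of this aspect (the relation below is an assumption for a circumferential ring marker and is not recited in the application): a ring perpendicular to the catheter axis projects onto the image as an ellipse, appearing as a thin line when the tip lies in the image plane and opening toward a circle as the tip inclines out of it, so the minor-to-major axis ratio indicates the magnitude of the incline, while its sign must be resolved by an asymmetric marker as described elsewhere in this application.

    import math

    def incline_magnitude_from_ring(minor_axis_px: float, major_axis_px: float) -> float:
        """Estimate |incline| in radians from the projected shape of a ring marker.
        Assumes the projected major axis equals the ring diameter; a symmetric ring
        cannot distinguish inclination into versus out of the image plane."""
        ratio = max(0.0, min(1.0, minor_axis_px / major_axis_px))
        return math.asin(ratio)

    # A ring that appears half as open as it is wide is inclined about 30 degrees:
    print(round(math.degrees(incline_magnitude_from_ring(5.0, 10.0)), 1))  # 30.0
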
[0011] The system may include one or more of the following
features in any
combination: (a) wherein the at least one processor is configured to detect
the two-
dimensional appearance of the one or more fiducials based on a computer vision
algorithm;
(b) the at least one processor is configured to detect the two-dimensional
appearance of the
one or more fiducials using a neural network; (c) wherein the at least one
processor is further
configured to determine both of the roll angle of the distal portion of the
medical instrument
and the incline of the distal portion of the medical instrument; (d) wherein
the at least one
processor is further configured to determine the incline with respect to an
image plane of the
two-dimensional medical image; (e) wherein the one or more fiducials are
configured such
that the two-dimensional appearance of the fiducials within the medical image
is visually
distinguishable for different roll angles and different inclines of the
medical instrument; (f)
wherein the one or more fiducials are configured such that the two-dimensional
appearance
of the fiducials within the medical image is visually distinguishable for
different roll angles
and different inclines of the medical instrument for incremental changes of
less than 5
degrees, less than 10 degrees, less than 15 degrees, less than 20 degrees,
less than 25 degrees,
less than 30 degrees, less than 35 degrees, or less than 40 degrees; (g)
wherein the one or
more fiducials are configured such that the two-dimensional appearance of the
fiducials
within the medical image is visually distinguishable for different roll angles
and different
inclines of the medical instrument for incremental changes of about 5 degrees,
about 10
degrees, about 15 degrees, about 20 degrees, about 25 degrees, about 30
degrees, about 35
degrees, or about 40 degrees; (h) wherein the at least one processor is
further configured to
detect, within the medical image, a distal tip of the medical instrument, and
based on the
detected distal tip of the medical instrument, determine a two-dimensional
position of the
distal tip of the medical instrument within a plane of the two-dimensional
medical image; (i)
wherein detecting the distal tip of the medical instrument comprises
determining, based on
the medical image, a centerline of the distal portion of the medical
instrument, and
determining an endpoint for the centerline; (j) wherein the at least one
processor is further
configured to detect, within the medical image, a portion of the medical
instrument, and
based on the detected distal portion of the medical instrument, determine a
heading of the
medical instrument within a plane of the two-dimensional medical image; (k)
wherein
determining the heading of the medical instrument comprises determining, based
on the
medical image, a centerline of the distal portion of the medical instrument,
and determining
an endpoint for the centerline, and determining a vector extending from the
endpoint, the
vector being colinear with a distal portion of the centerline; (l) wherein the
medical
instrument comprises an endoluminal medical instrument; (m) wherein the
medical
instrument comprises a catheter; (n) wherein the medical imaging device
comprises an X-ray
device; (o) wherein the processor is further configured to determine one or
more motor
controls configured to cause articulation of the distal portion of the medical
instrument,
wherein the one or more motor controls are determined at least in part based
on the
determined roll angle or the determined incline, and transmit the one or more
motor controls
to a robotic system coupled with the medical instrument, whereby the robotic
system causes
articulation of the medical instrument based on the one or more motor
controls; (p) wherein
the processor is further configured to determine the one or more motor
controls based on a
user input; (q) wherein the processor is further configured to cause the
determined roll angle
or the determined incline to be displayed on a user display; (r) the processor
is further
configured to cause the two-dimensional medical image to be displayed on the
user display;
and/or other features as described throughout this application.
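
Features (i) through (k) above locate the distal tip as an endpoint of a detected centerline and take the heading from a vector colinear with the distal portion of that centerline. A minimal sketch of that geometry follows; the centerline itself would come from an upstream segmentation step not shown here, and approximating the heading from the last two centerline points is an assumption:

    import math

    def tip_and_heading(centerline):
        """Return the tip position and in-plane heading (radians) from a centerline
        given as (x, y) image points ordered from proximal to distal; the heading
        vector is approximated by the last two centerline points."""
        if len(centerline) < 2:
            raise ValueError("need at least two centerline points")
        (x_prev, y_prev), (x_tip, y_tip) = centerline[-2], centerline[-1]
        return (x_tip, y_tip), math.atan2(y_tip - y_prev, x_tip - x_prev)

    tip, heading = tip_and_heading([(100, 220), (120, 210), (140, 195)])
    print(tip, round(math.degrees(heading), 1))
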
[0012] In another aspect, a method includes: receiving,
from a medical imaging
device, a two-dimensional medical image including a view of at least a distal
portion of a
medical instrument, the distal portion of the medical instrument including one
or more
fiducials positioned thereon, the one or more fiducials being radio-opaque and
visible in the
medical image; detecting, within the medical image, a two-dimensional
appearance of the
one or more fiducials; and based on the two-dimensional appearance of the one
or more
fiducials, determining at least one of: a roll angle of the distal portion of
the medical
instrument, and an incline of the distal portion of the medical instrument.
[0013] The method may include one or more of the following
features in any
combination: (a) wherein detecting the two-dimensional appearance of the one
or more
fiducials is based on a computer vision algorithm; (b) detecting the two-
dimensional
appearance of the one or more fiducials using a neural network; (c)
determining both of the
roll angle of the distal portion of the medical instrument and the incline of
the distal portion
of the medical instrument; (d) determining the incline with respect to an
image plane of the
two-dimensional medical image; (e) wherein the one or more fiducials are
configured such
that the two-dimensional appearance of the fiducials within the medical image
is visually
distinguishable for different roll angles and different inclines of the
medical instrument; (f)
wherein the one or more fiducials are configured such that the two-dimensional
appearance
of the fiducials within the medical image is visually distinguishable for
different roll angles
and different inclines of the medical instrument for incremental changes of
less than 5
degrees, less than 10 degrees, less than 15 degrees, less than 20 degrees,
less than 25 degrees,
less than 30 degrees, less than 35 degrees, or less than 40 degrees; (g)
wherein the one or
more fiducials are configured such that the two-dimensional appearance of the
fiducials
within the medical image is visually distinguishable for different roll angles
and different
inclines of the medical instrument for incremental changes of about 5 degrees,
about 10
degrees, about 15 degrees, about 20 degrees, about 25 degrees, about 30
degrees, about 35
degrees, or about 40 degrees; (h) detecting, within the medical image, a
distal tip of the
medical instrument, and based on the detected distal tip of the medical
instrument,
determining a two-dimensional position of the distal tip of the medical
instrument within a
plane of the two-dimensional medical image; (i) wherein detecting the distal
tip of the
medical instrument comprises determining, based on the medical image, a
centerline of the
distal portion of the medical instrument, and determining an endpoint for the
centerline; (j)
detecting, within the medical image, a portion of the medical instrument, and
based on the
detected distal portion of the medical instrument, determine a heading of the
medical
instrument within a plane of the two-dimensional medical image; (k) wherein
determining
the heading of the medical instrument comprises determining, based on the
medical image, a
centerline of the distal portion of the medical instrument, and determining an
endpoint for the
centerline, and determining a vector extending from the endpoint, the vector
being colinear
with a distal portion of the centerline; (l) wherein the medical instrument
comprises an
endoluminal medical instrument; (m) wherein the medical instrument comprises a
catheter;
(n) wherein the medical imaging device comprises an X-ray device; (o)
determining one or
more motor controls configured to cause articulation of the distal portion of
the medical
instrument, wherein the one or more motor controls are determined at least in
part based on
the determined roll angle or the determined incline, and transmitting the one
or more motor
controls to a robotic system coupled with the medical instrument, whereby the
robotic system
causes articulation of the medical instrument based on the one or more motor
controls; (p)
determining the one or more motor controls based on a user input; (q) causing
the determined
roll angle or the determined incline to be displayed on a user display; (r)
causing the two-
dimensional medical image to be displayed on the user display; and/or other
features as
described throughout this application.
[0014] In another aspect, a computer-implemented system is
disclosed. The
system includes at least one processor and at least one electronic storage
medium storing
instructions configured to cause the at least one processor to: display, on a
graphical user
interface, a two-dimensional medical image including a view of at least a
distal end of a
medical instrument, the distal end including one or more fiducials positioned
thereon and that
are visible in the medical image; determine, based on the one or more
fiducials in the medical
image, a roll estimate of a current roll angle of the medical instrument;
receive a user input
from a user input device, the user input comprising at least one of: a heading
command to
change a heading of the medical instrument within a plane of the medical
image, or an
incline command to change an incline of the medical instrument into or out of
the plane of
the medical image; based on the roll estimate and the user input, generate one
or more motor
commands configured to cause a robotic system coupled to the medical
instrument to move
the robotic medical instrument according to the user input; and cause the
robotic medical
system to move the robotic medical system based on the one or more motor
commands.
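
A non-authoritative sketch of how the roll estimate and an image-space command could be combined into motor commands follows; the four-cardinal-pullwire decomposition and the function below are illustrative assumptions rather than the application's control law:

    import math

    def motor_commands(heading_cmd: float, incline_cmd: float, roll: float) -> dict:
        """Map a commanded deflection expressed in the image frame (an in-plane
        heading component and an out-of-plane incline component) to four cardinal
        pullwire commands by rotating the command by -roll into the catheter's own
        frame. Catheter mechanics, limits, and calibration are ignored."""
        cx = heading_cmd * math.cos(-roll) - incline_cmd * math.sin(-roll)
        cy = heading_cmd * math.sin(-roll) + incline_cmd * math.cos(-roll)
        return {
            "pull_right": max(cx, 0.0), "pull_left": max(-cx, 0.0),
            "pull_up": max(cy, 0.0), "pull_down": max(-cy, 0.0),
        }

    # With zero roll, a purely in-plane command maps directly to one pullwire:
    print(motor_commands(1.0, 0.0, 0.0))
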
[0015] The system may include one or more of the following
features in any
combination: (a) wherein the one or more motor commands comprise pullwire
commands
configured to actuate one or more pullwires of the medical instrument; (b)
wherein the roll
estimate is determined based on a two-dimensional appearance of the one or
more fiducials
in the medical image; (c) wherein the at least one processor is configured to
determine the
roll estimate based on a computer vision analysis of the one or more fiducials
in the medical
image; (d) wherein the processor is further configured to display, on the
graphical user
interface, a plurality of sample images, each sample image comprising a shape
corresponding
to a sample two-dimensional projection of the one or more fiducials onto a
plane at a
different roll angle, and receive a sample selection on the user input device,
wherein the
sample selection comprises an indication of a sample image that most closely
corresponds to
the two-dimensional appearance of the one or more fiducials in the medical
image, wherein
the roll estimate is determined based on the sample selection; (e) wherein the
heading
command to change the heading of the medical instrument comprises an
indication to move
the distal end of the medical instrument to the left or to the right within
the plane of the
medical image relative to a current heading of the medical instrument; (f)
wherein the incline
command to change the incline of the medical instrument comprises an
indication to move
the distal end of the medical instrument into or out of the plane of the
medical image relative
to the current heading of the medical instrument; (g) wherein the heading
command to
change the heading of the medical instrument comprises an indication of a
desired heading
for the distal end of the medical instrument within the plane of the medical
image; (h)
wherein the incline command to change the incline of the medical instrument
comprises an
indication of a desired incline of the distal end of the medical instrument
into or out of the
plane of the medical image; (i) wherein the processor is further configured
to, based on the
one or more fiducials in the medical image, determine a current incline of the
distal end of
the medical instrument into or out of the plane of the medical image; (j)
wherein the
processor is further configured to display, on the graphical user interface,
an indication of the
current incline of the distal end of the medical instrument; (k) wherein the
processor is
further configured to, based on the medical image, determine a current heading
of the distal
end of the medical instrument within the plane of the medical image; (l)
wherein the
processor is further configured to display, on the graphical user interface,
an indication of the
current heading of the distal end of the medical instrument; (m) wherein the
graphical user
interface comprises the user input device; (n) wherein the one or more fiducials are
configured such
that the two-dimensional appearance of the fiducials within the medical image
is visually
distinguishable for different roll angles and different inclines of the
medical instrument;
and/or other features as described throughout this application.
[0016] In another aspect, a method, is disclosed which
includes: displaying, on a
graphical user interface, a two-dimensional medical image including a view of
at least a
distal end of a medical instrument, the distal end including one or more
fiducials positioned
thereon and that are visible in the medical image; determining, based on the
one or more
fiducials in the medical image, a roll estimate of a current roll angle of the
medical
instrument; receiving a user input from a user input device, the user input
comprising at least
one of: a heading command to change a heading of the medical instrument within
a plane of
the medical image, or an incline command to change an incline of the medical
instrument
into or out of the plane of the medical image; based on the roll estimate and
the user input,
generating one or more motor commands configured to cause a robotic system
coupled to the
medical instrument to move the robotic medical instrument according to the
user input; and
cause the robotic medical system to move the robotic medical system based on
the one or
more motor commands.
[0017] The method may include one or more of the following
features in any
combination: (a) wherein the one or more motor commands comprise pullwire
commands
configured to actuate one or more pullwires of the medical instrument; (b)
wherein the roll
estimate is determined based on a two-dimensional appearance of the one or
more fiducials
in the medical image; (c) wherein determining the roll estimate is based on a
computer vision
analysis of the one or more fiducials in the medical image; (d) displaying, on
the graphical
user interface, a plurality of sample images, each sample image comprising a
shape
corresponding to a sample two-dimensional projection of the one or more
fiducials onto a
plane at a different roll angle, and receiving a sample selection on the user
input device,
wherein the sample selection comprises an indication of a sample image that
most closely
corresponds to the two-dimensional appearance of the one or more fiducials in
the medical
image, wherein the roll estimate is determined based on the sample selection;
(e) wherein the
heading command to change the heading of the medical instrument comprises an
indication
to move the distal end of the medical instrument to the left or to the right
within the plane of
the medical image relative to a current heading of the medical instrument; (f)
wherein the
incline command to change the incline of the medical instrument comprises an
indication to
move the distal end of the medical instrument into or out of the plane of the
medical image
relative to the current heading of the medical instrument; (g) wherein the
heading command
to change the heading of the medical instrument comprises an indication of a
desired heading
for the distal end of the medical instrument within the plane of the medical
image; (h)
wherein the incline command to change the incline of the medical instrument
comprises an
indication of a desired incline of the distal end of the medical instrument
into or out of the
plane of the medical image; (i) based on the one or more fiducials in the
medical image,
determining a current incline of the distal end of the medical instrument into
or out of the
plane of the medical image; (j) displaying, on the graphical user interface,
an indication of
the current incline of the distal end of the medical instrument; (k) based on
the medical
image, determining a current heading of the distal end of the medical
instrument within the
plane of the medical image; (l) displaying, on the graphical user interface,
an indication of
the current heading of the distal end of the medical instrument; (m) wherein
the graphical
user interface comprises the user input device; (n) wherein the one or more
fiducials are
configured such that the two-dimensional appearance of the fiducials within
the medical
image is visually distinguishable for different roll angles and different
inclines of the medical
instrument; and/or other features as described throughout this application.
[0018] For purposes of this summary, certain aspects,
advantages, and novel
features are described herein. It is to be understood that not necessarily all
such advantages
may be achieved in accordance with any particular embodiment. Thus, for
example, those
skilled in the art will recognize the disclosures herein may be embodied or
carried out in a
manner that achieves one or more advantages taught herein without necessarily
achieving
other advantages as may be taught or suggested herein.
[0019] All of the embodiments described herein are intended
to be within the
scope of the present disclosure. These and other embodiments will be readily
apparent to
those skilled in the art from the following detailed description, having
reference to the
attached figures. The invention is not intended to be limited to any
particular disclosed
embodiment or embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] These and other features, aspects and advantages of
the present application
are described with reference to drawings of certain embodiments, which are
intended to
illustrate, but not to limit, the present disclosure. It is to be understood
that the attached
drawings are for the purpose of illustrating concepts disclosed in the present
application and
may not be to scale.
[0021] FIGs. 1A-1D illustrate an example coordinate system
for three-
dimensional image pose estimation.
[0022] FIG. 2A illustrates a side view of a distal end of
an embodiment of an
endovascular catheter.
[0023] FIG. 2B is a fluoroscopic image illustrating an
endovascular catheter, such
as that shown in FIG. 2A, navigating through a vascular network of a patient,
according to an
embodiment.
[0024] FIGs. 3A-3F illustrate an embodiment of a marker on
a distal end of a
catheter at different orientations, according to an example.
[0025] FIG. 4 illustrates an example model of a vascular
network of a patient.
[0026] FIG. 5A is an example fluoroscopic image of a
catheter navigating
through the vasculature.
[0027] FIG. 5B is another example fluoroscopic image of a
catheter navigating
through the vasculature.
[0028] FIGs. 6A-6F illustrate determination of out-of-plane
angle based on
detection of a shape of the marker within a medical image, according to some
examples.
[0029] FIGs. 7A-7C illustrate an example of a semicircular
marker for a catheter
configured to allow for, among other things, determination of the sign of the
incline of the
catheter.
[0030] FIG. 7D illustrates an additional example of a
catheter including different
marker types that can be configured to allow for, among other things,
determination of the
sign of the incline of the catheter.
[0031] FIGs. 8A-8B illustrate example fluoroscopic images
of a catheter
including an example of a non-circumferential ring configured to allow for
determination of,
among other things, both the sign and magnitude of the incline of the
catheter.
[0032] FIG. 8C illustrates example two-dimensional
appearances of radio-opaque
markers at various heading and roll positions.
[0033] FIG. 9A illustrates examples of projections of three-
dimensional generated
catheters onto real world two-dimensional X-ray images, according to some
examples.
[0034] FIGs. 9B-9C illustrate an example prediction of a
trained deep neural
network for predicting a position of a body of a catheter body, according to
an example.
[0035] FIG. 9D illustrates an example determination of a
centerline of a catheter.
[0036] FIG. 9E illustrates an example determination of a
position and heading of
a catheter.
[0037] FIG. 10 is a block diagram of an embodiment of a
computer system
configured to implement features described herein.
[0038] FIGs. 11A and 11B are side and top views of an
embodiment of a catheter
that includes markers configured to allow for determination of, among other
things, a
catheter roll angle.
[0039] FIGs. 12A-12D show the catheter and markers at
various roll positions.
[0040] FIGs. 13A and 13B are perspective and side views of
another embodiment
of a catheter that includes markers configured to allow for determination of a
catheter roll
angle.
[0041] FIGs. 14A and 14B are perspective and top views of
another embodiment
of a catheter that includes markers configured to allow for determination of a
catheter roll
angle.
[0042] FIGs. 15A-15H illustrate determination of roll angle
based on detection of
a phase of a sinusoid of a helical fiducial within a medical image, according
to some
examples.
[0043] FIG. 16A is a perspective view of another embodiment
of a catheter that
includes a radio-opaque one and one quarter roll helix fiducial configured to
allow for
determination of a catheter roll angle.
[0044] FIGs. 16B-16E show fluoroscopic images of a catheter
for determination
of a catheter roll angle, according to some examples.
[0045] FIG. 16F illustrates the two-dimensional appearance
of a radio-opaque
marker at different roll positions.
[0046] FIGs. 17A-17B illustrate an embodiment of a catheter
that includes radio-opaque braids that can be used to determine roll, according to some examples.
[0047] FIGs. 18A-18B illustrate another embodiment of a
catheter where radio
opaque markers or fiducials can be used to detect planarity.
[0048] FIG. 19 illustrates another embodiment of a catheter
that includes markers that can be used to determine whether the catheter is angled into or out of the
image plane,
according to some examples.
[0049] FIGs. 20A-20D illustrate an embodiment of a
graphical user interface for
providing image space control of a medical instrument.
[0050] FIGs. 21A-21B illustrate two example embodiments for
roll estimate
determination.
[0051] FIGs. 22A-22B illustrate an embodiment of a user
input device for
providing image space control of a medical instrument.
[0052] FIG. 23 illustrates another embodiment of a user
input device for
providing image space control of a medical instrument.
[0053] FIG. 24 is a flowchart depicting an example user
space control method.
DETAILED DESCRIPTION
[0054] This application describes devices, systems, and
methods for detecting or
determining the position and/or orientation of endovascular or other
intraluminal tools or
medical instruments, such as catheters. In some instances, the term "pose" is
used herein to
refer to the position and orientation of a catheter. In some embodiments,
determination of
pose can be made based on a two-dimensional medical image, such as a single
plane X-ray
image, and one or more radio-opaque markers included on a catheter. Computer
vision
models can be employed to detect the radio-opaque markers in the two-
dimensional medical
image and to determine the pose of the catheter therefrom. In some instances,
the pose can be
defined by five degrees of freedom for the catheter. The five degrees of
freedom can include
two positional degrees of freedom (e.g., x and y position) and three degrees
of freedom
relating to orientation (e.g., heading, incline, and roll). In other
embodiments, the pose can
comprise greater (e.g., six) or fewer (e.g., four or fewer) degrees of
freedom. The pose of an
instrument can be defined in many different ways. While this application
primarily describes
examples of pose in terms of x, y, and z for position, and heading, incline,
and roll for
orientation, other methods for describing or defining the pose (e.g.,
alternative coordinate
systems, alternative naming conventions, etc.) are possible, and the
principles of this
application extend to all methods for defining pose. Further, in some
embodiments, the
methods and systems of this application may be used to determine one, more
than one, or all
elements of pose.
[0055] This application also describes devices, systems,
and methods for
controlling endovascular or other intraluminal tools or medical instruments,
such as
catheters, wherein control inputs are provided with respect to a plane of a
two-dimensional
medical image. For example, a user can provide control inputs to change a
heading of an
instrument within the plane of the medical image and/or to change an incline
of the
instrument into or out of the plane of the medical image. A computer system
can determine
appropriate motor commands to cause the desired movement/articulation of the
instrument
based on the control inputs and an estimate of a current roll of the
instrument. The estimate
of the current roll of the instrument can be determined based on an appearance
of one or
more radio-opaque markers or fiducials that are included on a distal end of
the image and
which are visible within the medical image. In some instances, the estimate of
current roll is
determined by the system based on a computer vision analysis of the medical
image. In some
instances, the estimate of the current roll is determined and input by the
user based on a user-
identified appearance of the fiducials.
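
The paragraph above leaves the roll-estimation method open: the estimate may be computed by computer vision or selected by the user from sample appearances. A minimal template-matching sketch is shown below, under the assumption that a routine for rendering the fiducial's two-dimensional projection at a candidate roll angle is available; the same rendered samples could instead be displayed for manual selection by the user:

    import numpy as np

    def estimate_roll(observed, render_fiducial, num_candidates: int = 72) -> float:
        """Return the candidate roll angle (radians) whose rendered fiducial
        projection best matches the observed appearance; 'observed' and the output
        of render_fiducial(roll) are assumed to be 2-D arrays of equal shape."""
        candidates = np.linspace(0.0, 2.0 * np.pi, num_candidates, endpoint=False)
        errors = [float(np.sum((render_fiducial(r) - observed) ** 2)) for r in candidates]
        return float(candidates[int(np.argmin(errors))])
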
[0056] This type of control system is referred to herein as
"image space control"
because control inputs are provided with respect to the plane of the image
(e.g., adjust
heading within the plane or adjust inclination into or out of the plane). This
type of control
system is intuitive as the user may provide such inputs while viewing the
medical image.
That is, the user can provide control inputs relative to the current
appearance of the
instrument within a medical image and without, in some embodiments, needing to
specifically
understand which pull wires of the instrument need to be actuated to achieve a
desired
motion.
[0057] The principles described herein can be applicable to
robotic medical
procedures, for example, where the catheter is robotically controlled by a
robotic medical
system that is configured to insert, retract, roll, and/or articulate the
catheter based on inputs
received from a physician or in an autonomous or semi-autonomous manner. In
some
instances, the principles of this disclosure may also be applicable to manually
controlled
catheters.
[0058] The principles of this disclosure are described
below with primary
reference to examples wherein the medical instrument or tool is an
endovascular catheter
configured to navigate within the vasculature of the patient. These examples,
however,
should not be construed as limiting of the principles of the disclosure. Those
of skill in the
art, upon consideration of the principles disclosed herein, will appreciate
that the devices,
systems, and methods for detecting or determining position and/or orientation
described
herein have application in other contexts. For examples, the principles
described herein can
be useful with other endoluminal, endoscopic, or laparoscopic tools,
instruments, procedures
and/or the like. For ease of illustration, however, a primary example related
to an
endovascular catheter is provided. Accordingly, it should be realized that any
of the
following description of an endovascular catheter or catheter may also be
applied to other
endoluminal, endoscopic, and/or laparoscopic tools or the like. Additionally,
it should be
realized that while this application provides several example configurations
for tools or
medical instruments that include specific configurations of radio-opaque
markers, other
configurations of radio-opaque markers can also be used.
[0059] Safe navigation of a catheter within a patient's
body generally requires an
accurate understanding of the current pose of the catheter. It can be
difficult to gain an
accurate understanding of pose from a single two-dimensional medical image.
For example,
FIG. 2B provides an example X-ray image of a catheter 100 navigating through
an aortic
arch of a patient. From FIG. 2B alone, however, a person would have difficulty
understanding the exact pose of the catheter 100. For example, does the
catheter 100 lie
completely within the plane of the image, or is it inclined into or out of the
plane of the
image? Highly skilled physicians may be able to make educated guesses with
respect to these
questions based on their understanding of human anatomy. Still, however,
uncertainty exists,
which increases the risk of damage to the patient during a medical procedure.
[0060] Perhaps even more critically, from FIG. 2B alone, it
is extremely difficult,
if not impossible, for even skilled physicians to determine the current roll
angle of the
catheter 100 (i.e., the rotational angle of the catheter about its
longitudinal axis).
Understanding the current roll of the catheter 100 can be critical for safe
navigation,
especially considering how most articulable catheters are controlled. Most
articulable
catheters include pullwires that can be actuated (e.g., tensioned or pulled)
to cause deflection
of a distal tip of the catheter. See, for example, the catheter 100 of FIG. 2A
described below.
Commonly, catheters include four pullwires, each configured to cause
deflection of the
catheter in one of four cardinal directions. For example, one pullwire can be
associated with
deflecting the tip of the catheter up, one pullwire can be associated with
deflecting the tip
down, one pullwire can be associated with deflecting the tip right, and one
pullwire can be
associated with deflecting the tip to the left. However, knowing which
pullwire to actuate
to cause a given deflection requires an understanding of the current roll
position of the
catheter. For example, if the distal tip of the catheter is rolled by 90
degrees, actuating the
pullwire generally associated with an upward deflection of the tip would
instead cause the
catheter to articulate (possibly unexpectedly) to the right or left.
Unintended articulation can
frustrate navigation and cause injury to a patient. Moreover, the distal tip
of a catheter often
rolls (in unexpected ways) as the catheter is navigated through complex
anatomy, such as
through a vascular network of a patient, even if roll inputs are not provided
at the proximal
(e.g., external) end of the catheter.
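
To make the preceding point concrete with a small, hypothetical calculation (not taken from the application): if the tip has rolled 90 degrees without the operator realizing it and the wire nominally associated with an upward deflection is pulled, the motion produced in the image frame is the intended direction rotated by the unmodeled roll, which here is a purely sideways deflection.

    import math

    dx, dy = 0.0, 1.0        # intended deflection: straight "up" in the image plane
    roll = math.radians(90)  # unnoticed roll of the distal tip
    ax = dx * math.cos(roll) - dy * math.sin(roll)
    ay = dx * math.sin(roll) + dy * math.cos(roll)
    print(round(ax, 3), round(ay, 3))  # -1.0 0.0: the tip sweeps sideways, not up
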
[0061] As will be described in more detail below, the
systems, methods, and
devices provide for accurate determination of the pose of a catheter
(including its roll) based
on detection of radio-opaque fiducials included on the catheter. In some
embodiments,
detection of the radio-opaque fiducials is achieved using computer vision
analysis of a two-
dimensional medical image of the catheter. The methods and systems described
herein can
also be used with biplane imaging systems to determine six degree of freedom
pose estimates
of the catheter. In such cases, determination of incline may (in some
instances) be determined
from the biplane images, while roll angle can be determined based on computer
vision
analysis of the radio-opaque fiducials included on the catheter.
[0062] FIGs. 1A-1D illustrate an example coordinate system
for three-
dimensional image pose estimation or determination. As noted above, the term
"pose" is used
herein to refer to the different combinations of positions and/or orientations
of an
endovascular tool, intraluminal tool, medical instrument and/or the like.
Position can refer to,
for example, an x, y, and z position (for example, a location within three-
dimensional space).
In some embodiments, position can refer to the x and y position within the two-
dimensional
image plane of a medical image, such as an X-ray. Orientation can be
represented in many
different ways. In general, in this application, orientation is referred to
using three Euler
angles: (1) heading, which can be a measure of angulation or articulation
about the z-axis
and/or where the device is pointing in the image plane; (2) incline, which can
be a measure of
rotation about the x-axis and/or where the device is pointing out of the image
plane (e.g.,
pitch); and (3) roll which can be a measure of rotation about the longitudinal
or central axis
of the catheter.
[0063] FIGs. 1A-1D provide additional illustrations. FIG.
1A represents two-
dimensional position within the x-y plane of the medical image. FIG. 1B
illustrates the
heading angle, measured about the z-axis, which generally corresponds to the
apparent
heading or direction of the catheter within the x-y plane of the image. FIG.
1C illustrates the
incline angle, measured about the x-axis, which generally corresponds to a
measurement of
the angle into or out of the plane of the image. FIG. 1D illustrates the roll
of the catheter,
measured about the catheter's longitudinal or central axis. The methods and
systems
described herein are also applicable to determination of one, more than one,
or all elements
of pose, regardless of the manner in which pose is defined (e.g., regardless
of coordinate
system, nomenclature, etc.).
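
Under one possible convention consistent with FIGs. 1A-1D (this parameterization is an assumption for illustration; as noted above, other coordinate systems and naming conventions are equally valid), the tip's pointing direction follows directly from the heading and incline angles, with roll affecting only the orientation of the catheter about that direction:

    import math

    def tip_direction(heading: float, incline: float):
        """Unit pointing vector with x and y spanning the image plane and z normal
        to it; 'heading' is measured within the image plane and 'incline' tilts the
        vector out of (positive) or into (negative) the plane."""
        return (math.cos(incline) * math.cos(heading),
                math.cos(incline) * math.sin(heading),
                math.sin(incline))

    print(tip_direction(0.0, 0.0))                # (1.0, 0.0, 0.0): in-plane, along +x
    print(tip_direction(0.0, math.radians(30)))   # (~0.87, 0.0, 0.5): tilted out of plane
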
[0064] As will be further described herein, computer vision
algorithms and unique radio-opaque markings or fiducials included on the catheter may be
used to quantitatively estimate the endovascular tool's five-dimensional pose
(for example,
two-dimensional position (e.g., see FIG. 1A), as well as its heading (e.g.,
FIG. 1B), incline
(e.g., FIG. 1C), and roll (e.g., FIG. 1D)). The use of radio-opaque markings
can preserve the
readily visible dimensions, for example, the position of the endovascular tool
on the image
plane (the x and y position) and the direction the endovascular tool points in
the image plane
(the heading), and further, the use of unique radio-opaque markings can
further allow the
sign of the incline dimension, the degree of incline dimension, and the roll
to be
quantitatively estimated or determined. While many different radio-opaque
materials may be
used, in some embodiments, it may be preferable to use platinum, tungsten,
and/or gold
because of their superior X-ray attenuation properties. In some embodiments,
the radio-
opaque material may be a piece of wire, metal, radio-opaque ink and/or the
like that is placed
within a layer of the catheter or other tool. Examples are provided below.
[0065] FIG. 2A illustrates a distal end of an embodiment of
an endovascular
catheter 100. In the illustrated embodiment, the catheter 100 includes a long,
thin, and
flexible body 101 that extends to a distal end 103. The body 101 can be
configured to be
navigated through the patient's vasculature. In some embodiments, a channel
may be formed
through the body 101 such that other tools or instruments can be passed
through the catheter
100 and gain access to the patient's anatomy through an opening that can be
included on the
distal end 103 of the catheter. In some embodiments, one or more tools can be
integrated
directly into the catheter 100 itself.
[0066] To facilitate navigation, in some embodiments (such
as the illustrated
embodiment of FIG. 2A), the catheter 100 or a distal portion thereof can be
configured to be
articulable or deflectable. To achieve articulation, the catheter 100 includes
one or more pull
wires 105 that extend on, in, or through the body 101 and attach at or near
the distal end 103.
Actuation (e.g., pulling or shortening) of the pull wires 105 at or near their
proximal end can be
configured to cause a distal portion of the catheter 100 to deflect or
articulate. Depending on
the number and arrangement of pull wires 105, the catheter 100 can be
configured for one-
way, two-way, three-way, or four-way deflection, although other
configurations providing
different degrees of articulation or deflection are also possible.
[0067] The catheter 100 can be configured such that control
thereof (e.g., control
of the deflection of the distal portion of the catheter 100) can be
accomplished manually or
robotically. For example, in some embodiments that are configured for manual
control, the
body 101 and pull wires 105 extend proximally to a handle located on a
proximal end of the
catheter 100 (not shown). The handle can be configured to be operated by hand
(e.g.,
manually) to actuate the pull wires 105. For example, the handle can include
one or more
manual inputs such as levers, buttons, dials and/or the like that allow a user
to manually
actuate the pull wires 105 to cause deflection of the distal portion of the
catheter 100. In
some robotically controllable embodiments, the body 101 and pull wires 105
extend
proximally to a base located on a proximal end of the catheter 100 (not
shown). The base can
be configured to connect to and be operated by a robotic medical system to
actuate the pull
wires 105. For example, the base can include one or more robotic inputs
configured to
engage with robotic outputs or actuators on the robotic medical system. In
some
embodiments, other methods and configurations for manual and robotic control
may be used.
[0068] FIG. 2A also illustrates that, in some embodiments,
the catheter 100 can
include a marker (or fiducial) 107. In the illustrated embodiment, the marker
107 is
positioned at the distal end 103 of the catheter 100. The marker 107 can be
configured to
facilitate identification of the distal end 103 of the catheter 100 in a
medical image captured
during a medical procedure. For example, the marker 107 can be radio-opaque so
as to be
readily identifiable within a fluoroscopic image. In some embodiments, the
marker 107
comprises a radio-opaque ring positioned on the distal end 103 of the catheter
100. When the
marker 107 comprises a radio-opaque material, it can be more easily identified
within, for
example, a fluoroscopic image such as an X-ray. Identification of the distal
end 103 within a
medical image during a procedure can greatly facilitate navigation and control
of the catheter
100.
[0069] FIG. 2B provides an example medical image of a
catheter 100, such as the
catheter 100 of FIG. 2A, navigating within the vasculature of the patient. In
particular, the
image of FIG. 2B is a fluoroscopic X-ray image. As shown, both blood vessels
109 of the
patient's vasculature, as well as the catheter 100 itself, are visible in the
image. The marker
107 included at the distal end 103 of the catheter 100 helps a user (or a
computer-vision
algorithm) viewing the image to identify the distal end 103 of the catheter
100. Because the
marker 107 comprises a radio-opaque material, it shows up well within the
image.
[0070] In general, during an endovascular procedure, a
physician or other
operator attempts to guide the distal end 103 of the catheter 100 to a
specific location, such
as, for example, a treatment site. For example, one such procedure is a
mechanical
thrombectomy. A large vessel occlusion (LVO) stroke occurs when a blood clot
lodges in at
least one of the internal carotid, proximal middle cerebral artery, proximal
anterior cerebral
artery, basilar artery, or vertebral artery. Such a clot can partially or
completely occlude
downstream blood supply to brain tissue resulting in neuronal infarction and
subsequent
neurological impairment or death. During a mechanical thrombectomy, a
physician gains
access to the patient's vasculature and inserts a catheter, such as catheter
100. The catheter
100 is guided to the obstruction using, for example, one or more medical
images similar to
the one shown in FIG. 2B. Once the distal end 103 of the catheter 100 is
positioned near the
obstruction, tools are passed through the working channel of the catheter 100
to remove the
obstruction.
[0071] While medical images, such as that shown in FIG. 2B,
may help the
physician guide the catheter 100 to the treatment site, it is still difficult
to fully determine the
orientation or pose of the catheter 100 from the medical image alone. This
occurs, for
example and among other reasons, because the medical image often provides only
a two-
dimensional view of the patient's anatomy (e.g., the vasculature) and the
catheter 100, each
of which, in actuality, comprises a three-dimensional shape. In the past (for
example, without
the systems, methods, and devices described herein), physicians needed to rely
on their
knowledge of anatomy as well as various other assumptions (such as an
assumption that the
catheter 100 is located within the vasculature) to interpret the two-
dimensional medical
image in a three-dimensional way. While many specialists are able to do this
to a limited
extent, this can provide a barrier to the availability of such procedures.
Further, because
different patients have different anatomies, the results are not always
precise.
[0072] Considering FIG. 2B, for example, one can relatively
easily understand
the shape of the catheter 100 within the two-dimensional plane of the
image. However,
the shape of the catheter 100 out of the plane of the image (e.g., the incline
of the catheter) is
harder to discern. For example, in FIG. 2B, it is difficult to determine
whether the catheter
100 is moving in a direction that is into or out of the plane of the image.
While an
understanding of anatomy can inform the answer to this question, the result is
still not wholly
determinable. As will be described in more detail below, this application
offers new devices,
systems, and methods for determining an out of the plane angle of the distal
end 103 of the
catheter 100. Increasing a physician's understanding of the out of the plane
angle of the distal
end 103 of the catheter 100 can greatly facilitate navigation and improve the
experience of
driving or controlling the catheter 100.
[0073] As described above, the catheter 100 may include an
articulable portion
that is actuated via pull wires. To accurately control the articulation of the
catheter 100, one
must understand the roll orientation of the catheter 100. As a simplified
example, if the
physician actuates the right most pull wire 105 expecting that this will cause
the catheter 100
to deflect to the right, the catheter 100 may move unexpectedly to the left if
the catheter 100
is rolled 180 degrees such that the right most pull wire 105 is positioned on
the left side of
the catheter 100. The physician may estimate the roll position of the catheter
100 based on
the roll position of the proximal end of the catheter 100. However, due to the
complex shape
of the vasculature, the roll position at the proximal end of the catheter 100
may not (and often
does not) directly correspond to the roll position at the distal end 103 of
the catheter 100.
Thus, to understand how actuation of the pull wires 105 will cause the distal
portion of the
catheter 100 to deflect, one must generally understand the roll position at
the distal portion of
the catheter 100.
[0074] Considering FIG. 2B further, one can see that the
roll position of the distal
portion of the catheter 100 is not readily discernible. As will be described
in more detail
below, this application offers new devices, systems, and methods for
determining the roll
position of the distal end 103 of the catheter 100. Increasing a physician's
understanding of
the roll position of the distal end 103 of the catheter 100 can greatly
facilitate navigation and
improve the experience of driving or controlling the catheter 100.
Out-of-Plane or Incline Angle Detection for Endovascular and Other
Intraluminal Tools
[0075] As discussed in the preceding section, in some
embodiments, such as the
embodiment illustrated in FIG. 2A, a catheter 100 can include a marker 107 (or
a plurality of
markers) positioned on the distal end 103 thereof. In some embodiments, such
as the
illustrated embodiments, the marker 107 comprises a ring positioned on the
distal end 103.
The marker 107 can be radio-opaque such that it can be easily identifiable
within a medical
image, such as, for example, the medical image shown in FIG. 2B. As described
in this
section and throughout this application, the shape of the marker(s) 107 within
the two-
dimensional image can be analyzed to determine the out of the plane angle or
incline of the
catheter 100. In other words, the two-dimensional projection of the three-
dimensional shape
of the marker 107 can be analyzed to determine the out of the plane angle of
the catheter 100.
The term "incline," as referred to herein, can refer to the degree of
angulation of the catheter
towards or away from the X-ray source (e.g., into or out of the imaging
plane), for example,
as shown in FIG. 1C. In some examples, a positive degree of angulation
indicates that the
catheter is angled towards the X-ray emitter, with a maximum incline of +90-
degrees. A
negative degree of angulation indicates that the catheter is angled away from
the X-ray
emitter, with a maximum incline of -90-degrees.
[0076] Considering the example of FIG. 2A, in the
illustrated orientation, the
ring-shaped marker 107 presents or appears as a rectangle having a straight
edge at the distal
most tip within the two-dimensional plane of FIG. 2A. With this orientation,
it can be
determined that the distal portion of the catheter 100 lies in the plane of
FIG. 2A (e.g., in
the illustrated orientation, the catheter 100 is not curving into or out of
the plane of FIG. 2A).
[0077] To further illustrate the principles and concepts,
FIGs. 3A-3F illustrate the
marker 107 in different orientations. In FIGs. 3A-3F, the distal most end of
the marker 107 is
illustrated with a double line. As before, the marker 107 comprises a ring
shape that can be
positioned on the end of a catheter, such as the catheter 100 of FIG. 2A. As
shown in FIG.
3A, one can see that, within the plane of FIG. 3A, the marker 107 is
generally pointed up
and to the left. One can further see that the distal end of the marker 107
presents as an ellipse
within the plane of FIG. 3A. Because the entire ellipse is visible, it can be
determined that the
marker 107 is pointed out of the plane of the page. In contrast, consider FIG.
3F. In FIG. 3F,
within the plane of the page, one can see that the marker 107 is pointed
straight up.
Considering the distal edge of the marker 107, one can only see half of the
ellipse (the other
half being blocked from view). From this, it can be determined that the marker 107
of FIG. 3F is
angled into the page.
[0078] FIG. 3B presents a similar example to FIG. 3F,
except, as shown in FIG.
3B, the entire ellipse is visible. From this it can be determined that the
marker is angled out
of the plane of FIG. 3B. FIG. 3C presents an example where the marker 107 is
pointed up
and to the right within the plane of FIG. 3C and angled slightly out of the
page based on the
shape of the ellipse. FIG. 3D illustrates another example where the marker 107
is angled out
of the plane of the page. Comparing FIGs. 3B and 3D, one can see that the
length of the
minor axis of the ellipse is shorter in FIG. 3D as compared to FIG. 3B. This
indicates that the
degree of the angle out of the page in FIG. 3D is less than the degree of the
angle out of the
page of FIG. 3B.
[0079] If one considers that the marker 107 continues to
turn out of the plane of
the page, the minor axis of the ellipse will continue to increase in length
until the minor and
major axes are equal and the distal end of the marker 107 will present as a
circle in the plane
of the page. FIG. 3E illustrates an example similar to FIG. 2A, in which the
marker is pointed
upwards, within the plane of the page. At this orientation, the minor axis has
decreased to
zero, indicating that the marker 107 is pointed upwards within the plane of
the page.
[0080] From the examples provided, it can be seen that one
can analyze the two-
dimensional shape created by the ring- or circle-shaped marker 107 within a
two-
dimensional imaging plane to determine the orientation of the marker 107 into
and out of the
page. Other shapes for marker 107 are possible. For example, the marker 107
need not
always comprise a ring shape.
[0081] In some embodiments, a computer system, such as, or
which can include, a
computer vision system, can be used to (1) detect the shape (e.g., the
visible ellipse, circle, curve, or line) created by the distal end of the ring-shaped marker 107 within
curve, or line) created by the distal end of the ring-shaped marker 107 within
the two-
dimensional image, and (2) extract or determine the out of the plane angle of
the marker 107
or the tool incline from the detected shape. In some embodiments, the computer
system may
utilize artificial intelligence or machine learning to perform such
functionality. In some
embodiments, for example, a neural network can be trained to detect the shape
created by the
distal end of the ring-shaped marker 107 within the two-dimensional image, and
extract or
determine the out of the plane angle of the marker 107 from the detected
shape. In some
embodiments, computer vision can be used to define the orientation of the tool
along the z-
axis.
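As one illustrative sketch of such a pipeline, the code below fits an ellipse to a pre-segmented marker mask with OpenCV and converts the minor-to-major axis ratio into an unsigned incline estimate, under the simple geometric assumption that the projected ring's axis ratio equals the sine of the incline magnitude. The segmentation step, thresholds, and helper name are assumptions rather than the specific implementation described above.

```python
import cv2
import numpy as np

def incline_magnitude_from_marker(mask: np.ndarray) -> float:
    """Estimate the unsigned incline angle (degrees) of a ring-shaped marker.

    `mask` is a binary image in which the radio-opaque marker has already been
    segmented (e.g., by thresholding or by a segmentation network). The ring
    projects as an ellipse whose minor/major axis ratio equals sin(|incline|):
    a straight line at 0 degrees of incline, a full circle at +/-90 degrees.
    """
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        raise ValueError("no marker contour found")
    contour = max(contours, key=cv2.contourArea)
    if len(contour) < 5:
        raise ValueError("contour too small to fit an ellipse")
    (_, _), (axis_a, axis_b), _ = cv2.fitEllipse(contour)
    minor, major = sorted((axis_a, axis_b))
    ratio = float(np.clip(minor / major, 0.0, 1.0))
    return float(np.degrees(np.arcsin(ratio)))
```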
[0082]
As mentioned above, determination of the out of the plane angle of the
catheter 100 can be important in improving and/or facilitating navigation
through a luminal
network, such as the vasculature. In general, the vasculature of a patient
will not lie within a
single plane. This is apparent considering the model example vasculature
provided in FIG. 4.
As shown in FIG. 4, for example, the arch of the aorta does move slightly from
left to right,
but it moves or arches predominantly from front to back. Thus, if the patient
(on an operating
table) is imaged from the top down, a tool navigating through the arch of the
aorta will be
moving significantly out of the plane of the imaging device. To fully
understand this motion,
the angle of the catheter 100 out of the page can be determined as described
above.
[0083]
FIGs. 5A and 5B provide additional example medical images of a
catheter,
such as the catheter 100 of FIG. 2A, navigating through vasculature. These
figures illustrate
how the catheter 100 moves at an angle into and out of the plane of the
medical image based
on the portion of the anatomy through which the catheter 100 passes. For
example, as shown
in FIG. 5A, which illustrates the catheter 100 navigating through a portion of
the carotid
artery, the distal end 103 of the catheter 100 lies generally within the plane
of the medical
image, and thus the distal end 103 of the marker 107 does not form an ellipse.
In contrast,
FIG. 5B illustrates the catheter 100 navigating through a portion of the
aortic arch. As shown,
the distal end 103 of the marker 107 does form an ellipse, which indicates that the catheter
100 is moving at an angle out of the plane of FIG. 5B.
[0084]
FIGs. 6A-6F provide additional examples of detecting the shape (e.g.,
the
visible ellipse, circle, curve, or line) created by the distal end of the
ring-shaped marker 107
within the two-dimensional image and extracting or determining the out of the
plane angle of
the marker 107 from the detected shape. FIGs. 6A-6C illustrate three example
images, and
FIGs. 6D-6F illustrate the same images with the shape of the ring-shaped
marker 107 that has
been detected and highlighted.
[0085] Alternatively or in addition to detecting the shape
of the distal tip of the
catheter 100 within the plane of the image, other methods or mechanisms may
also be
employed for determining the out of the plane angle of the catheter 100. For
example, in
some embodiments, a degree of angulation may be presumed for each vessel
through which
the catheter 100 passes, for example, based on the general or average
angulation of that
vessel across the population. In some embodiments, the angulation of a vessel
can be
determined based on a CT scan. In some embodiments, an additional medical
image at a
different orientation than the first medical image (e.g., a lateral
angiographic view) can be
provided. In some embodiments, the medical imager can be moved so as to gain
an
understanding of the out of the plane angulation.
[0086] Although many of the preceding examples have
described the use of a
ring-shaped marker 107, other types of markers or fiducials can be used as
described further
below.
Incline Sign Detection for Endovascular and Other Intraluminal Tools
[0087] As discussed in the preceding section, the magnitude
of the incline of
endovascular and other intraluminal tools can, in some embodiments, be
determined by
analysis of the elliptical shape of the tool mouth (or a marker, such as a ring-
shaped marker,
included thereon). However, the sign of the degree of angulation is not always
readily
identifiable solely from analysis of the marker 107 because the elliptical
shape may look the
same when imaged in two dimensions for both positive and negative
angulations
of the same degree of incline. To determine whether the incline is positive or
negative, an
additional method may be desired.
[0088] FIG. 7A illustrates one method for determining the
sign of the incline of
catheter 100 where catheter 100 can include a semicircular marker 110
positioned on the
distal end 103 thereof. The semicircular marker 110 can be radio-opaque such
that it can
easily be identified within a medical image. In the illustrated example, the
semicircular
marker 110 extends around one half of the distal end 103 of the catheter 100.
In other
embodiments, other portions can be used (e.g., 1/4, etc.). In some
embodiments, additional
fiducials, such as helical fiducial 111 may also be used. As described in this
section, the
position of the marker 110 within the two-dimensional image can be analyzed to
determine
the sign of the incline of the catheter 100. The sign of the incline of
catheter 100 can be
determined by whether the semicircular marker 110 appears to be above or below
the distal
end 103 in the two-dimensional image.
[0089] FIGS. 7B-7C illustrate the appearance of the marker
110 on a catheter tip
on an X-ray at different inclines. The catheter itself is not shown in these
figures. As shown,
the semicircular marker 110 in combination with helical fiducial 111 will
produce a different
appearance in the two-dimensional image when inclined at the same angle, but
at different
signed inclines (e.g., whether into or out of the plane of the image). For
example, the
magnitude of the incline could be determined to be 45-degrees using a method
such as the
ring-shaped marker method described above. The semicircular marker 110 in
combination
with helical fiducial 111 could then be used to determine the signed incline.
For example, in
FIG. 7B, the catheter tip is inclined at negative 45-degrees, and in FIG. 7C,
the catheter tip is
inclined at positive 45-degrees.
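A minimal sketch of the above/below test is shown below; it assumes the marker centroid and the tip location have already been detected in image coordinates, and the mapping of "above" to a positive (toward-the-emitter) incline is an illustrative assumption rather than a convention stated in this disclosure.

```python
def incline_sign_from_marker(marker_centroid_y: float, tip_y: float) -> int:
    """Return +1 if the semicircular marker projects above the distal tip in
    image coordinates, otherwise -1.

    Image y grows downward, so "above" means a smaller y value. This literal
    above/below test mirrors the description for FIGs. 7A-7C; which side
    corresponds to a positive incline is an assumption here, and a practical
    implementation would also account for the in-plane heading and roll of
    the catheter.
    """
    return 1 if marker_centroid_y < tip_y else -1
```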
[0090] In some embodiments, a computer system, such as a
computer vision
system, can be used to (1) detect the position of the semicircular marker 110
in combination
with helical fiducial 111 within the two-dimensional image, and (2) extract or
determine the
sign of the tool incline from the detected position. In some embodiments, the
computer
system may utilize artificial intelligence or machine learning to perform such
functionality.
In some embodiments, for example, a neural network can be trained to detect
the position of
the semicircular marker 110 in combination with helical fiducial 111 within
the two-
dimensional image, and extract or determine the sign of the tool incline from
the detected
position. It should be noted that in some embodiments, the machine learning
algorithm does
not hardcode the aforementioned approach. Instead, the machine learning
algorithm trains a
deep neural network to directly predict the incline angle from the input of
the X-ray image.
[0091] FIG. 7D illustrates an additional example of a
catheter 100 including a
helical marker 111 and additional markers 114. In the illustrated embodiment,
the additional
markers 114 include a semi-circular portion that extends partially around the
radius of the
catheter 100 and a tail portion that extends longitudinally along the
catheter.
[0092] FIGs. 8A-8B provide an additional example
of determination of
the sign and magnitude of incline of catheter 100, where the catheter 100 can
include one or
more non-circumferential rings 113. In the illustrated embodiment, the non-
circumferential
rings are semi-circular, extending part way around the catheter 100. In some
embodiments,
the non-circumferential rings 113 can be radio-opaque such that they can easily
be identified
within a medical image. The appearance of the non-circumferential rings 113
can be
analyzed to determine the sign and magnitude of the incline of the catheter
100. The sign and
magnitude of the incline of catheter 100 can be determined by the unique
appearance of the
non-circumferential rings 113 in the two-dimensional image at varying degrees
of incline,
both positive and negative. In some embodiments, non-circumferential rings 113
are
arranged in an asymmetrical design. That is, in some embodiments, the non-
circumferential
rings 113 are each positioned at a different rotational position around the
catheter 100. In the
illustrated embodiments, the non-circumferential rings are positioned at 90-
degree offsets. In
some embodiments, non-circumferential rings 113 are multiple ellipses offset
from each
other.
[0093] FIG. 8C illustrates how an example arrangement of
non-circumferential
rings 113 positioned on a distal end of a catheter may provide a unique
appearance at
different inclination and roll angles. Images are provided at positive,
neutral (i.e., zero), and
negative inclinations, as well as at different roll positions provided in 30-
degree increments.
As shown, each of the 36 different illustrated positions provides a unique
appearance. By
detecting this appearance within a medical image, for example using computer vision, the
incline (including its sign) and the roll of the catheter can be determined.
While FIG. 8C
illustrates how the radio-opaque markers provide different two-dimensional
appearances for
different roll positions at 30-degree increments and for different positive,
neutral (zero), and
negative inclines, the illustrated increments are not intended to be limiting.
[0094] In some embodiments, the radio-opaque markers
provide unique or
visually distinguishable two-dimensional appearances at all different roll or
incline positions.
In some embodiments, the radio-opaque markers provide unique or visually
distinguishable
two-dimensional appearances at different roll or incline positions within
increments of about,
at least, or at most 1 degree, 2 degrees, 3 degrees, 4 degrees, 5 degrees, 7.5
degrees, 10
degrees, 12.5 degrees, 15 degrees, 17.5 degrees, 20 degrees, 25 degrees, 30
degrees, or 40
degrees. That is, in some embodiments, the radio-opaque markers are configured
with a
three-dimensional shape that, when viewed within the two-dimensional plane of
a two-
dimensional medical imaging device, provides a unique or visually
distinguishable
appearance that can be distinguished at the different incremental roll or
incline angles listed
above. The above listed increments can be considered minimum resolutions for
the system or
the minimum change in roll or incline that is detectable by the system.
Tool Position and Heading for Endovascular and Other Intraluminal Tools
[0095] FIG. 9A illustrates examples of projections of three-
dimensional generated
catheters onto real world two-dimensional X-ray images. FIGs. 9B-9C illustrate
an example
prediction of the trained deep neural network for predicting where the
catheter body is. As
noted above, position can refer to translation of endovascular and/or other
intraluminal tools
along the x, y, and z directions. In some embodiments, instead of determining
the three-
dimensional position of the instrument (x, y, and z), only the two-dimensional
(x and y)
position is used. This can be because estimating the z position (i.e., depth)
may require
calibrating the X-ray camera to determine intrinsic parameters thereof.
While this is
feasible, it would likely (1) add a burdensome pre-operative image calibration step, (2) not help much because motion planning is primarily done in two dimensions, and (3) be unnecessary because the C-arm can instead be rotated to obtain a z position estimate.
[0096] In some embodiments, the system may be configured to
predict where the
full tool body is, and then the two-dimensional tip location can be extracted
from this tool body. This approach may be beneficial because the tool body provides a
very strong
training signal for learning deep neural network segmentation models. That is,
in some
instances, it may be easier for a neural network or computer vision algorithm
to detect the
body of a catheter and then extract the location of the tip from there. In
some embodiments,
catheter kinematics are further used to refine this estimate.
[0097] For example, a deep neural network can be used to
estimate the two-
dimensional centerline position of the catheter based on one or more images of
the catheter
navigating within the body. FIG. 9D illustrates an example, in which the
neural network has
identified the catheter within the image, superimposed its estimated
centerline onto the
image, and highlighted the catheter. Once the centerline of the catheter has
been identified
within the image, the two-dimensional position can be directly extracted by
computing the
most distal position along the centerline. Similarly, the heading of the
catheter can also be
directly extracted from the body estimate by computing the vector of the tip
of the body line.
FIG. 9E illustrates an example in which the distal tip position and heading
angle have been
determined and the image has been updated to include a highlight identifying
the position
and an arrow indicating the heading.
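A minimal sketch of this tip and heading extraction is shown below, assuming the centerline has already been produced as an ordered list of (x, y) points from proximal to distal; the look-back distance used to form the heading vector is an illustrative choice.

```python
import numpy as np

def tip_and_heading(centerline: np.ndarray) -> tuple[np.ndarray, float]:
    """Extract the 2D tip position and heading angle from a predicted centerline.

    `centerline` is an (N, 2) array of (x, y) image coordinates ordered from the
    proximal end to the distal tip (e.g., produced by a segmentation network
    followed by skeletonization). The tip is the most distal point, and the
    heading is the direction of the vector from a slightly more proximal point
    to the tip. The 5-point look-back is a smoothing assumption.
    """
    centerline = np.asarray(centerline, dtype=float)
    tip = centerline[-1]
    proximal_of_tip = centerline[max(0, len(centerline) - 6)]
    dx, dy = tip - proximal_of_tip
    heading_deg = float(np.degrees(np.arctan2(dy, dx)))
    return tip, heading_deg
```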
[0098] In some embodiments, a machine learning algorithm
for estimating the
position of a catheter and/or other tool may use the following approach.
First, the image
generation procedure is modified by drawing the catheter on top of tangible X-
ray images
(e.g., as shown in FIG. 9B). This process may have the advantage of training
the deep neural
network with realistic noise and occlusions that would be seen in actual X-
rays, making the
system robust to real world conditions.
[0099] Second, the two-dimensional x and y position is
estimated. In some
embodiments, radio-opaque markings may be added to the tool body, such as, for
example, a
full-length helix, to assist with the identification. In some embodiments, the
three-
dimensional x, y, and z position may be estimated instead. However, estimation
of the Z
position may require calibration of the X-ray camera to obtain its camera
intrinsics, which
requires an additional step of a pre-operation image calibration. In some
instances, the two-
dimensional position estimation will be preferable so that the pre-operation image
calibration step
does not need to be completed, and because the z position may not be necessary
given that
motion planning is primarily conducted in two dimensions. Further, a z
position estimate can
be obtained by rotating the C-arm. Using this method, the two-dimensional x
and y position
of the full tool body may be predicted. Thereafter, the two-dimensional x and
y
location of the tool tip (such as, for example, the distal end 103 of catheter
100) can be
determined. This approach may be used because the tool body provides a very
strong training
signal for learning deep neural network segmentation models. In some
embodiments, catheter
kinematics may be used to further refine the position estimate.
[0100] As noted above, heading can refer to a measure of
angulation or
articulation about the z-axis and/or where the device is pointing in the image
plane. To
determine the heading of endovascular and/or other intraluminal tools, such as
a catheter, the
deep neural network prediction of the catheter body position may be used.
Based on the
prediction of the two-dimensional x and y position of the catheter tip, a
second position
located on the catheter body may be determined. The second position may be an
infinitesimal
distance from the tool tip in a direction along the catheter body. The heading
angle of the
catheter may then be calculated using trigonometry based on the x and y
position of the tool
tip and the second position along the catheter body.
Roll Angle Detection for Endovascular and Other Intraluminal Tools
[0101] As noted above, radio-opaque markers can be placed
at the distal tip 103
of a catheter 100 to improve the visibility of the catheter 100 in a medical
image (see FIG.
2A and 2B). In some embodiments, these markers are symmetric about the tool's
axis and
thus do not provide any information related to the roll angle of the catheter.
As described in
this section, the degree of rotation of a tool about its centerline may be
informed by the
addition of radio-opaque rotation fiducials. Numerous configurations of radio-
opaque
rotation fiducials can be utilized to determine the degree of tool rotation,
provided the
configurations result in a unique X-ray appearance of the tool at differing
degrees of rotation.
While multiple configurations will be disclosed with reference to certain
embodiments, it
will be understood that various changes may be made, and equivalents may be
substituted for
elements thereof without departing from the scope of the disclosure. For
example, markers
108 that are rotationally asymmetric can be utilized on the catheter 100 and
configured to
provide information from which the roll angle of the catheter 100 can be
determined. In some
embodiments, a computer vision algorithm can be used to detect and analyze the
positions of
the markers 108 to determine the rotation or roll of the catheter 100 about
its longitudinal
axis. As described above, the rotation of the catheter 100 determines how an
actuation of a
given pull wire will move the catheter 100. Thus, an accurate understanding of
the roll angle
of the catheter 100 can be important for successful navigation of the catheter
100 through the
vasculature.
[0102] FIGs. 11A and 11B illustrate side and top views
respectively of a distal
portion of a catheter 100 that has been configured with markers 108 from which
the roll
angle of the catheter 100 can be determined. In this embodiment, the catheter
100 includes
markers 108A-108D configured as radio-opaque fiducials (e.g., dots) placed at
different
distances from the distal tip 103, at differing angles/arcs from the center of
the tool. This
configuration allows the catheter 100 to take on a unique appearance depending
on its
rotation around its longitudinal axis. In some embodiments, the markers may be
other radio-
opaque fiducials, such as, for example squares, circles, diamonds, and/or the
like.
[0103] In the illustrated embodiment, a first marker 108A
is positioned a first
distance 2D (twice the diameter of the tool) from the distal tip 103 and at a
45-degree
rotational offset with respect to a first pull wire 105. A second marker 108B
is positioned an
additional distance 2D from the distal tip 103 (4D) and at a 135-degree
rotational offset with
respect to the first pull wire 105. A third marker 108C is positioned an
additional distance 2D
from the distal tip 103 (6D) and at a 225-degree rotational offset with
respect to the first pull
wire 105. A fourth marker 108D is positioned an additional distance 2D from
the distal tip
103 (8D) and at a 315-degree rotational offset with respect to the first pull
wire 105. In some
embodiments, the rotational offset of first marker 108A from the first pull
wire may be, for
example, 0-degrees, 5-degrees, 10-degrees, 15-degrees, 20-degrees, 25-degrees,
30-degrees,
35-degrees, 40-degrees, 45-degrees, 50-degrees, 55-degrees, 60-degrees, 65-
degrees, 70-
degrees, 75-degrees, 80-degrees, 85-degrees, 90-degrees, and/or the like, with
the second
marker 108B, third marker 108C, and fourth marker 108D being an additional 90-
degrees,
180 degrees, and 270-degrees respectively rotationally offset from first
marker 108A. In
some embodiments, the distance 2D is approximately 5mm.
[0104] In the illustrated example, each marker 108 is
positioned at a longitudinal
distance that is twice the diameter D of the catheter 100 below the marker 108
above it (or
below the distal tip 103 for the first marker). This need not be the case in
all embodiments,
and other spacings are possible. In the illustrated example, each marker 108
is positioned at
90-degree offsets and between the adjacent pull wires 105. Again, this need
not be the case in
all embodiments and other spacings are possible. Further, in FIG. 11A, the
catheter 100 is
illustrated as transparent, allowing all four markers 108 to be visible. It
should be
appreciated, however, that in the illustrated example, markers 108A and 108D
are on the
back side of the catheter 100 (the side facing away relative to the
orientation of the image)
and thus would not be visible if the catheter were not illustrated as
transparent.
[0105] With such a configuration or other suitable
configurations, the appearance
of the markers 108A-108D within a medical image provides a unique appearance
from which
the roll of the catheter 100 can be determined. FIGs. 12A-12D provide various
examples of
the catheter 100 at different orientations. In each of FIGs. 12A-12D, both a
model of the
catheter and a corresponding image of the catheter are shown. FIGs. 12A-12D
illustrate that
for each roll position, the appearance of the markers 108A-108D is different,
thus allowing
for roll determination.
[0106] In some embodiments, determining roll from the
markers 108A-108D can
be accomplished as follows: (1) the locations of the markers 108A-108D within
an image can
be determined; in some embodiments, this is accomplished through computer
vision or a
neural network that has been trained to identify the markers 108A-108D; (2)
the centerline of
the catheter 100 can be determined using, for example, computer vision or a
neural network;
(3) the distance (with appropriate sign, positive or negative) between each of
the markers
108A-108D and the centerline can be determined; (4) the signed distance
between the
markers and the centerline can be used to determine the roll angle using
geometry principles.
[0107] Use of four markers 108A-108D, for example, as shown
in FIGs. 11A-
11B and FIGs. 12A-12D, can be advantageous for various reasons. For example,
because
each marker 108A-108D is placed at a different distance from the distal tip
103, they can
each be uniquely identified by that distance. This can allow the machine
learning model
and/or loss to be much more robust and/or generalizable. Use of four markers
108A-108D
may also prevent against or eliminate angle aliasing, for example, allowing
for full
determination of roll between 0 and 360 degrees. Because the markers 108A-108D
are
equidistantly spaced along the longitudinal axis, the markers 108A-108D can be
analyzed to
determine out-of-plane rotations. Further, use of four markers 108A-108D
can allow for
determination of the centerline of the catheter 100. Use of markers 108A-108D
and four pull
wires 105 can also facilitate control and verification.
[0108] In some embodiments, increasing the axial distance
between the markers
108A-108D can increase the signal-to-noise ratio, for example, allowing the
markers 108A-
108D to be more easily identified within the medical image. In some
embodiments, more
than four markers may be used to determine the roll of the catheter 100. In
some
embodiments, less than four markers may be used to determine the roll of the
catheter 100.
[0109] FIGs. 13A and 13B illustrate another embodiment of
markers 115 that can
be used to determine roll angle of a catheter 100. FIG. 13A is a perspective
view of a distal
end of a catheter 100 including markers thereon. FIG. 13B is a side view of
the markers 115.
In the illustrated embodiment of FIGs. 13A and 13B, multiple ring-shaped
markers 115 are
included. Each ring-shaped marker 115 can include a through hole formed
therethrough.
When the ring 115 is rotated such that the holes align, there will be a
visible gap in the ring.
When the ring 115 is rotated such that the holes do not align, the ring will
appear solid. This
distinction may be detectable through a centroid detection algorithm or a
computer vision
algorithm trained for this type of device. In other embodiments, the hole need
not be circular;
the hole may be, for example, any polygonal shape.
[0110] As shown in FIGs. 13A and 13B, multiple rings 115
may be stacked at set
rotations, such that the rotation of the tool can be determined by identifying
which of the
rings 115 presents a visible gap.
[0111] In some embodiments, radio-opaque sleeves or other
features can be
coupled to the pull wires. In such cases, a computer vision algorithm can be
configured to
either detect the features at opposite sides of the catheter or to detect when
the features
overlap. In either case, these features would allow the computer vision system
to assess if the
tool is oriented with each pull wire in plane.
[0112] FIGs. 14A and 14B illustrate another embodiment of
catheter 100 that
includes the use of two markers 117 and 119 of different lengths oriented at
90 degrees with
respect to each other. Because the markers 117 and 119 are of different
length, the relative
position of marker 117 relative to marker 119 can be determined, allowing roll
angle to be
reliably computed using only two markers.
[0113] FIGs. 15A-15H illustrate another embodiment of a
configuration that may
be used to determine the roll of the catheter 100. In this embodiment, a radio-
opaque helix
fiducial 121 is coupled to catheter 100. In some embodiments, the radio-opaque
helix fiducial
121 is approximately sinusoidal such that the phase of the sinusoid can be
used to determine
the degree of roll. In some embodiments, the phase of the sinusoid may be
determined by the
location of the radio-opaque helix fiducial 121 closest to the distal end of
the catheter 100
and by the location of the radio-opaque helix fiducial 121 furthest from the
distal end of the
catheter 100, near the base of the cylindrical band. It should be noted that
in some
embodiments, the machine learning algorithm does not hardcode the phase
determination.
Instead, the machine learning algorithm trains a deep neural network to
directly predict the
roll angle form the input X-ray image. FIGs. 15A-15H illustrate X-ray images
of catheter 100
at different degrees of roll and different degrees of incline.
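As an illustrative sketch only, the phase of the projected helix can be recovered with a linear least-squares fit of sine and cosine components to the measured lateral offsets of the fiducial along the catheter; mapping that phase directly to a roll angle assumes a known reference phase at zero roll.

```python
import numpy as np

def roll_from_helix_phase(axial_positions: np.ndarray,
                          lateral_offsets: np.ndarray,
                          turns_per_length: float) -> float:
    """Estimate roll (degrees) from the projected trace of a helical fiducial.

    The projected helix is approximately sinusoidal:
    offset(s) ~= A * sin(2*pi*f*s + phase), where s is arc length along the
    catheter, f = turns_per_length, and the phase encodes the roll. The phase
    is recovered by linear least squares on sine/cosine components.
    """
    s = np.asarray(axial_positions, dtype=float)
    d = np.asarray(lateral_offsets, dtype=float)
    omega = 2.0 * np.pi * turns_per_length
    design = np.column_stack([np.sin(omega * s), np.cos(omega * s)])
    (a, b), *_ = np.linalg.lstsq(design, d, rcond=None)
    # offset = a*sin(w*s) + b*cos(w*s) = A*sin(w*s + phase), with phase = atan2(b, a)
    return float(np.degrees(np.arctan2(b, a))) % 360.0
```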
[0114] FIG. 16A illustrates another embodiment of a
configuration that may be
used to determine the roll of the catheter 100. In this embodiment, a radio-
opaque one and
one quarter roll helix fiducial 123 is coupled to catheter 100, where the
helix is made slightly
longer than one complete revolution, such that the roll helix fiducial 123 is
approximately
1.25 times the articulation length. The degree of roll of catheter 100 can be
determined
because the roll helix 123 takes on a different appearance depending on the
degree of roll. In
FIG. 16A, the roll helix fiducial can comprise other lengths greater than or
less than one and
one quarters. For example, as shown in FIGs. 16B-16E, at different degrees of
roll, the roll
helix 123 takes on a distinct appearance. In some embodiments, the
articulation length is
approximately two centimeters. Notably, the embodiments illustrated in FIGs.
16A-16E also
include examples of the non-circumferential markers discussed above.
[0115] FIG. 16F illustrates the two-dimensional appearance
(e.g., as within the
plane of a medical image) of the helical fiducial 123 of FIG. 16A at different
roll positions in
30-degree increments. As shown, each roll position provides a unique
appearance which can
be used to determine roll, for example, by a computer vision, neural network,
or machine
learning system. While FIG. 16F illustrates how the radio-opaque markers
provide different
two-dimensional appearances for different roll positions at 30-degree
increments, the
illustrated increments are not intended to be limiting.
[0116] In some embodiments, the radio-opaque markers
provide unique or
visually distinguishable two-dimensional appearances at all different roll
positions. In some
embodiments, the radio-opaque markers provide unique or visually
distinguishable two-
dimensional appearances at different roll positions within increments of
about, at least, or at
most 1 degree, 2 degrees, 3 degrees, 4 degrees, 5 degrees, 7.5 degrees, 10
degrees, 12.5
degrees, 15 degrees, 17.5 degrees, 20 degrees, 25 degrees, 30 degrees, or 40
degrees. The
above listed increments can be considered minimum resolutions for the system
or the
minimum change in roll that is detectable by the system.
[0117] In some embodiments, the roll angle determined based
on the markers of
any of these embodiments can be used by a motion planning algorithm to
determine how it
will move the catheter. In one embodiment, the algorithm can be configured to
rotate the
catheter until the radio-opaque identifiers align with the imaging plane. In
another
embodiment, the algorithm can measure the rotation of the tool using the radio-
opaque
identifiers and update which pull wires it uses to execute a maneuver.
Additional Detail
[0118] FIGs. 17A and 17B illustrate an embodiment where
radio opaque braids
125 can be used to determine roll. For example, the way in which the radio
opaque braids
125 are wound around the catheter 100 may be used to detect out-of-plane
deflection. For
example, one can detect the plane of deflection based on the frequency of the
sinusoidal
shape of the tantalum wire as shown in FIGs. 17A and 17B.
[0119] FIGs. 18A and 18B illustrate that radio opaque
markers or fiducials 127
can be used to detect planarity. For example, the distance observed in a
single plane can be
used to provide information regarding a tool's degree of out-of-plane
deflection. For
example, in FIGs. 18A and 18B, the shorter distance between the two
fiducials 127
indicates bending out of plane, whereas the longer distance indicates the tool
is more planar.
[0120] As shown in FIG. 19 for example, by adding markers
on opposite sides of
the catheter that can be distinguished from one another, one can determine
whether the
catheter is angled into or out of the image plane. In FIG. 19, a double dot
131 is shown on
one side and a single dot 133 on the other side. The first image is a side
view, and the second
shows the device head on (from x-ray view). When angled out of the plane, one
can see the
double dot 131 above the single dot 133. When angled into the plane, the
single dot 133 is
above the double dot 131. One may need to know the roll of the device to use
this
information (for example, to know which one is in front and which one is
behind). With at
least two pairs of front/back markers 90 degrees apart and the roll known,
one could
determine this deflection for any roll value. In some embodiments, alternative
markers could
be used, such as shapes, for example circles, squares, donuts, lines, and/or
the like. In
some embodiments, markers such as a solid ring and/or a partial ring(s) and/or
the like may
be used.
Safeguards Against Unexpected Catheter Motion
[0121] Unexpected motion of the distal end of a catheter
can jeopardize the safety
of endovascular or other procedures. The term "unexpected motion," as referred
to herein,
describes any movement or behavior of the distal end of a catheter that is not
predicted based
on the movement and/or control of the proximal end of the catheter. An example
of
unexpected motion may be a poor torque response where an unexpected roll
motion occurs at
the distal end of a catheter. The unexpected roll motion may occur when the
rotation at the
distal end of the catheter does not correlate to the rotation at the proximal
end. For example,
sometimes when the proximal end of the catheter is rotating, the distal end of
the catheter
may be rotating at a slower rate or may not be rotating at all. As the
proximal end of the
catheter continues to rotate and the distal end rotates at a slower rate or
does not rotate at all,
torque builds up in the catheter. If the proximal end of the catheter
continues to rotate,
eventually the torque in the catheter may cause the distal end of the catheter
to rotate very
quickly to catch up with the rotation imposed on the proximal end of the
catheter and the
quick rotation may be characterized as unexpected motion which could cause
harm to the
patient. For example, the unexpected motion at the distal end could cause
damage to or a tear
in a vessel wall.
[0122] As previously described, the degree of rotation of
the distal end of a tool
about its centerline may be informed by the addition of radio-opaque rotation
fiducials. Using
the methods described herein, unexpected motion at the distal end of a
catheter may be
prevented by tracking and comparing the rotation rate and roll of the distal
and proximal ends
of a catheter. In some embodiments, the comparison may be performed at
discrete steps. In
some embodiments, the comparison may be performed continuously. In some
embodiments,
rotation at the proximal end of a catheter may be prevented when there is a
difference of
more than a specific number of degrees of rotation between the proximal and
distal ends of
the catheter. By preventing further rotation after the difference in rotation
is calculated at a
specific amount, the system may prevent unexpected motion (for example,
snapping and/or
whipping of the distal end) to increase the safety of the procedure.
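A minimal sketch of such a safeguard is shown below; the 90-degree windup threshold and the wrapped comparison are illustrative assumptions rather than values or logic taken from this disclosure.

```python
def allow_proximal_rotation(proximal_roll_deg: float,
                            distal_roll_deg: float,
                            max_windup_deg: float = 90.0) -> bool:
    """Return True if further proximal rotation should be allowed.

    Compares the roll commanded/measured at the proximal end with the roll
    estimated at the distal tip from the fiducials; if the built-up difference
    exceeds `max_windup_deg`, further rotation is blocked to avoid a sudden
    whip of the distal tip. The 90-degree default is an illustrative threshold;
    because the comparison below wraps the difference into (-180, 180], the
    threshold should stay below 180 degrees.
    """
    difference = (proximal_roll_deg - distal_roll_deg + 180.0) % 360.0 - 180.0
    return abs(difference) <= max_windup_deg
```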
[0123] In some embodiments, a computer vision system may be
used to identify
the fiducials to model how far the distal catheter tip has rotated in relation
to how far the
motors controlling rotation at the proximal end of the catheter have moved. In
some
embodiments, this method may be paired with other safety information such as,
for example,
force detection and/or the like. In some embodiments, a similar method may be
applied to
detect discrepancies in expected advancement and retraction of the catheter as
compared to
actual advancement and retraction of the catheter. This method may be used to identify potential obstructions to catheter motion. For example, the actual position of a catheter tip, determined using the methods described herein, can be compared to the expected
position of the catheter tip based on how the catheter was controlled.
Automated Tool Tracking
[0124] The automated identification of a tool, tool tip
and/or tool orientation may
be used to control movement of a decoupled imaging source to maintain optimal
viewing.
For example, in an endovascular procedure, movement of an angiography system
could be
precisely controlled and/or centered on the tool in question without manual
manipulation. In
another example, in a laparoscopic surgery, movement of a camera could be
precisely
controlled and/or centered on the tool in question without manual
manipulation. Being able
to maintain optimal viewing in this manner may have the advantage of
significantly better
imaging for the surgeon. In some cases, automated tool tracking may improve
the speed and
efficiency of procedures, for example, the procedure would be faster and more
efficient
because the surgeon does not have to put down the tools and move the camera
intermittently.
In some cases, automated tool tracking may reduce the number of personnel
required in the
operating room because no one needs to control the camera.
[0125] Automated tool tracking may be most useful for
interventional and/or
surgical applications where the source of input imaging is decoupled from the
navigating
and/or interventional tool. In these procedures, generally an assistant must
manually track the
surgeon's tools which may result in both lag and imprecision. For example, in
an
angiographic procedure, the automated tool tracking system may be used to
maintain the tool
tip halfway across and one third up the image screen for the entire procedure
instead of the
current method of having the physician put down the tools and manually
readjust the screen
to focus on the tool. In another example, in a laparoscopic or thoracoscopic
procedure, the
automated tool tracking system may be used to enable an external camera source
to be
automatically adjusted to maintain the tools in the center of the image, thus
improving the
focus and positioning of the camera.
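For illustration, a simple proportional recentering rule of the kind described above might look like the sketch below; the gain, the absence of rate limiting, and the helper name are assumptions, and a clinical system would add smoothing and safety limits.

```python
import numpy as np

def imager_correction(tip_xy: np.ndarray,
                      image_size: tuple[int, int],
                      target_fraction: tuple[float, float] = (0.5, 2.0 / 3.0),
                      gain: float = 0.2) -> np.ndarray:
    """Compute a proportional pan correction that recenters the tool tip.

    `tip_xy` is the detected tip in pixels and `image_size` is (width, height).
    `target_fraction` places the desired tip location (here halfway across and
    one third up from the bottom of the image, i.e. two thirds down in pixel
    coordinates, since image y grows downward). The returned vector is a
    pixel-space correction that a higher-level controller would translate into
    imager or camera motion.
    """
    width, height = image_size
    target = np.array([target_fraction[0] * width, target_fraction[1] * height])
    error = target - np.asarray(tip_xy, dtype=float)
    return gain * error
```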
Example Endovascular and Other Applicable Procedures
[0126] The various technologies disclosed herein related to
determination of
position and/or orientation determination can be used to facilitate the
treatment of various
diseases and other conditions where a robotic or manual device is advanced
through an
intraluminal (e.g., intravascular) network of a subject to reach the site of
intravascular
pathology (e.g., thrombosis, embolus, occlusion, aneurysm, rupture, bleeding,
dissection,
etc.). In some embodiments, the systems, devices, and methods described herein
can be used
to facilitate one or more endovascular purposes, surgeries, and/or treatments.
For example, in
some embodiments, the systems, processes, and methods described herein can be
used for
one or more of removal of intravascular blockage/reestablishment of perfusion;
treatment of
vessel wall injury (aneurysm and/or dissection); treatment of bleeding:
aneurysm
rupture/trauma; and/or the like. Moreover, in some embodiments, the systems,
devices, and
methods described herein can be used to treat vascular trauma.
[0127] In some embodiments, the systems, devices, and
methods described herein
can be used to facilitate neurovascular applications and/or treatments, such
as for example to
treat subarachnoid hemorrhage, aneurysm, arteriovenous malformation, and/or
the like. In
some embodiments, the systems, devices, and methods described herein can be
used for
cardiovascular applications and/or treatments, such as for example to treat
myocardial
infarction, coronary artery disease, pacemaker insertion, and/or the like. In
some
embodiments, the systems, devices, and methods described herein can be used
for aortic
applications and/or treatments, such as for example to treat aortic
dissection, aortic
aneurysm, and/or the like. In some embodiments, the systems, devices, and
methods
described herein can be used for peripheral emboli applications and/or
treatments. In some
embodiments, the systems, devices, and methods described herein can be used
for vascular
trauma applications and/or treatments. In some embodiments, the systems,
devices, and
methods described herein can be used for venous applications and/or
treatments.
[0128] While the features of this application have largely
been described in the
context of endoluminal or endovascular procedures, the inventions described
herein may also
be practiced fluoroscopically guided procedures, such as endoscopic retrograde
cholangiopancreatography (ERCP), discography and vertebroplasty, orthopedic
and podiatric
surgery, urological procedures including pyelography, intracardiac placement
of intracardiac
devices, ablations, and lumbar punctures.
Image Space Control Systems
[0129] This section describes devices, systems, and methods
for controlling
medical instruments, such as catheters, wherein user inputs are provided with
respect to a
plane of a two-dimensional medical image. For example, a user, such as a
physician that is
controlling a robotic medical instrument or a medical instrument that is
coupled to a robotic
system, can provide user inputs to change a heading of the instrument within
the plane of the
medical image (e.g., as shown in FIG. 1B) and/or to change an incline of the
instrument into
or out of the plane of the medical image (e.g., as shown in FIG. 1C). This
type of control
system can be referred to herein as "image space control" because the user
inputs are
provided with respect to the plane of the image (e.g., adjusting the heading
within the plane
or adjusting the inclination into or out of the plane).
[0130] This type of control system is intuitive as the user
may provide such inputs
while viewing the medical image which includes at least a representation of a
distal portion
of the instrument. That is, the user can provide control inputs relative to
the current
appearance of the instrument within a medical image.
[0131] As discussed above with respect to FIG. 2A, many
articulable medical
instruments or catheters use one or more pull wires to control articulation
(bending or
deflection) of a distal tip of the instrument. For example, some catheters
include four
pullwires, each configured to cause deflection of the catheter in one of four
directions: one
pullwire can be associated with deflecting the tip of the catheter up, one
pullwire can be
associated with deflecting the tip down, one pullwire can be associated with
deflecting the tip
right, and one pullwire can be associated with deflecting the tip left.
However, knowing
which pullwire to actuate to cause a given deflection requires an
understanding of the current
roll position of the catheter. If the distal tip of the catheter is rolled by
90 degrees, actuating
the pullwire generally associated with an upward deflection of the tip would
instead cause the
catheter to articulate (possibly unexpectedly) to the right or left.
[0132] However, without the fiducials described in this
application, which allow
for vision-based determination of, among other things, the current roll of the
instrument (see,
e.g., FIG. 16F), it can be extremely difficult or even impossible to determine
the current roll
of a medical instrument from an X-ray image. Thus, a physician controlling the
instrument
will have great difficulty in deciding which pullwires to actuate in order to
cause a desired
motion.
[0133] The image space control systems and methods
described herein reduce or
eliminate these difficulties and provide a natural and intuitive way to
control a medical
instrument by providing simplified user inputs with respect to a medical
image, such as a
two-dimensional X-ray. Specifically, in some embodiments, a computer system
can
determine appropriate motor commands to cause the desired
movement/articulation of the
instrument based on user provided control inputs and an estimate of a current
roll of the
instrument. The estimate of the current roll of the instrument can be
determined based on an
appearance of one or more radio-opaque markers or fiducials that are included
on a distal end
of the instrument and which are visible within the medical image. In some
instances, the estimate
of current roll is determined by the system based on a computer vision
analysis of the
medical image. In some instances, the estimate of the current roll is
determined and input by
the user based on a user-identified appearance of the fiducials.
[0134] In this way, the user can provide natural and
intuitive inputs with respect
to the current position and orientation of the instrument within a medical
image, and the
system can determine appropriate motor commands (e.g., commands for actuating
one or
more of the pullwires of the instrument) to cause the desired motion. In some
embodiments,
this can allow the user to control the catheter in one or more of the
following three directions:
forward and back (insertion), left and right (heading), and/or into and out of
the image
(incline). These directions move with respect to the plane of the image
regardless of how the X-
ray is moved or how the catheter is rolled in the body. This control mode is
intuitive and
provides a large advancement over the current standard of care, which requires
the user to
frequently guess and check which way the catheter will move on screen. Using
these
controls, the user can easily access tricky vessels and ensure safe navigation
of the
instrument through the vessels in an atraumatic fashion.
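For illustration, the three image-space directions described above can be captured in a simple command structure. The hypothetical Python sketch below (one possible convention among many, not asserted to be that of the disclosed embodiments) expresses insertion, heading, and incline relative to the image plane and converts a heading/incline pair into a unit direction in an image-aligned frame.

    import math
    from dataclasses import dataclass

    # Minimal sketch of an image-space command: every quantity is expressed
    # relative to the X-ray image plane rather than the patient or the table.
    @dataclass
    class ImageSpaceCommand:
        insert_mm: float     # advance (+) or retract (-) along the instrument
        heading_deg: float   # direction within the image plane
        incline_deg: float   # tilt into (+) or out of (-) the image plane

    def command_direction(cmd: ImageSpaceCommand):
        """Unit vector for the commanded tip direction in an image-aligned frame
        (x: image right, y: image up, z: out of the image toward the viewer)."""
        h = math.radians(cmd.heading_deg)
        i = math.radians(cmd.incline_deg)
        return (math.cos(i) * math.cos(h),
                math.cos(i) * math.sin(h),
                math.sin(i))

    # Example: a heading of 355 degrees in the image plane with no incline.
    print(command_direction(ImageSpaceCommand(insert_mm=0.0, heading_deg=355.0, incline_deg=0.0)))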
[0135] A user may provide user inputs in various ways. For
example, in some
embodiments, the user can specify desired targets for insertion, heading,
and/or incline. Once
specified, the system can determine the appropriate motor commands for causing
the
instrument to move from its current position and orientation to the desired
position and
orientation. Providing such absolute targets (e.g., desired targets for
insertion, heading,
and/or incline) may advantageously provide some resiliency and safety in the
event of a lag
in communication between the user and the robotic medical system. This can be
advantageous for situations wherein the user is remotely located from the
robotic system and
patient and communication occurs over a computer network, such as the internet.
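As a sketch of why absolute targets tolerate communication lag, the hypothetical Python fragment below shows that a target-based command is idempotent: a delayed or duplicated message simply re-states the same goal, whereas a stream of incremental commands could over-drive the instrument. Angle wrap-around and safety limits are omitted for brevity, and the function names are illustrative only.

    # Minimal sketch (hypothetical): applying an absolute heading target is
    # idempotent, so repeated or late delivery over a network does no harm.
    def apply_target_command(current_heading_deg, target_heading_deg, max_step_deg=2.0):
        error = target_heading_deg - current_heading_deg
        step = max(-max_step_deg, min(max_step_deg, error))
        return current_heading_deg + step

    heading = 270.0
    for _ in range(60):           # the same target may arrive many times
        heading = apply_target_command(heading, target_heading_deg=355.0)
    print(round(heading, 1))      # settles at 355.0 and stays there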
[0136] As another example, a user may provide user inputs
that are indicated
relative to the current position or orientation of the instrument. For
example, a user can
specify that the instrument adjust its heading to the right relative to the
current heading of the
instrument. While such a system may be less tolerant to high latency and
communication lag,
it still allows the user to navigate in a simple and intuitive manner.
[0137] To enable image space control, it is necessary that
the robotic system that
is controlling the instrument have an accurate estimate of the current roll of
the instrument in
order to determine which pullwires to actuate to cause a desired movement. In
some
embodiments, the roll estimate is determined automatically by the system based on
the
appearance of one or more fiducials on the instrument in the image. In some
embodiments,
the roll estimate may be determined with assistance from the user. For
example, the user may
select or provide a roll estimate by comparing the current appearance of the
one or more
fiducials in the image to one or more sample images representative of the
appearance of the
one or more fiducials at different roll angles.
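As one illustration of how the system-determined variant might be automated, the Python sketch below scores a cropped fiducial patch against pre-rendered template patches at candidate roll angles using normalized cross-correlation. This is a simplifying assumption for exposition; it is not asserted to be the analysis used by the disclosed embodiments, which would also need to account for projection geometry, scale, and image noise.

    import numpy as np

    # Minimal sketch (hypothetical): pick the roll angle whose template best
    # matches the observed appearance of the fiducial in the X-ray image.
    def estimate_roll(fiducial_patch, templates):
        """templates: dict mapping roll angle in degrees -> 2D template array
        of the same shape as fiducial_patch."""
        def ncc(a, b):
            a = (a - a.mean()) / (a.std() + 1e-9)
            b = (b - b.mean()) / (b.std() + 1e-9)
            return float((a * b).mean())
        return max(templates, key=lambda angle: ncc(fiducial_patch, templates[angle]))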
[0138] Another advantage to image space control is that it
continues to function
even if the imaging device, for example, a C-arm of an X-ray machine, is
moved. This is
because control inputs are provided with respect to the plane of the image. If
the imaging
device is moved, the plane will move also, and control inputs will be provided
with respect to
the moved plane.
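This property can be illustrated with a short hypothetical Python sketch: a command expressed in the image frame is mapped into a fixed patient frame through the current imaging-device orientation, so the same image-frame command automatically follows the plane when the C-arm is rotated. The rotation used here is an arbitrary example, not a model of any particular imaging system.

    import math
    import numpy as np

    # Minimal sketch (hypothetical): map an image-frame direction into a fixed
    # patient frame using the current C-arm orientation.
    def image_to_patient(direction_image, carm_rotation):
        """carm_rotation: 3x3 rotation matrix from the image frame to the patient frame."""
        return carm_rotation @ np.asarray(direction_image)

    def carm_rotation_about_patient_z(angle_deg):
        a = math.radians(angle_deg)
        return np.array([[math.cos(a), -math.sin(a), 0.0],
                         [math.sin(a),  math.cos(a), 0.0],
                         [0.0,          0.0,         1.0]])

    cmd_image = (1.0, 0.0, 0.0)   # "to the right of the image"
    print(image_to_patient(cmd_image, carm_rotation_about_patient_z(0)))
    print(image_to_patient(cmd_image, carm_rotation_about_patient_z(30)))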
[0139] FIGs. 20A-20D illustrate an embodiment of a
graphical user interface 200
for providing image space control of a medical instrument. In the illustrated
embodiment, the
graphical user interface is configured to display a two-dimensional medical
image 202, such
as an X-ray. The medical image 202 includes a view of a distal end of a
medical instrument,
such as a catheter 204. The catheter 204 includes one or more fiducials 206
positioned
thereon. The fiducials 206 are visible within the medical image 202. The
fiducials 206 can be
configured as described above in order to allow for vision-based determination
of the
position and orientation (including roll) of the medical instrument. For
example, at least one
fiducial 206 can be configured such that it provides unique two-dimensional
appearances
associated with different roll angles for the catheter 204, for example, as
described above
with reference to FIG. 16F.
[0140] The graphical user interface 200 may also include a
user input device 208.
The user input device 208 is configured to receive user inputs from a user
that are provided
with respect to the two-dimensional medical image 202. For example, in the
illustrated
embodiment, the user input device 208 includes features for allowing a user to
input insert
commands (e.g., to advance or retract the instrument 204), heading commands
(e.g., to alter
the heading of the medical instrument 204 within the plane of the medical
image 202, for
example, to the right or left of the instrument's current heading), and
incline commands (e.g.,
to alter the incline of the medical instrument 204 into or out of the plane of
the medical
image 202). The user input device 208 may include other options as well. For
example, in the
illustrated embodiment, the user input device 208 includes options to inject
contrast, confirm
an entered movement, and to relax the catheter.
[0141] Although the user input device 208 is illustrated as
a component of the
graphical user interface 200, this need not be the case in all embodiments.
For example, in
some embodiments, the user input device 208 can comprise a handheld control.
[0142] Importantly, the user input device 208 allows the
user to provide user
inputs for controlling the instrument 204 with respect to the current
configuration of
the instrument as shown in the two-dimensional medical image 202. For example, as
shown in
FIG. 20B, the user may input a desired heading for the medical instrument via
the heading
input of the user input device. In the illustrated configuration, the user can
input a desired
heading by selecting a target point on the wheel. In FIG. 20B, the desired
heading is shown at
about 355 degrees with a highlighted circle. The current heading is also shown
on the wheel
at about 270 degrees as a lighter circle. The user may also select a desired
inclination using
the incline slider, if desired.
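As a point of illustration, the shortest signed difference between the current heading shown on the wheel (about 270 degrees in FIG. 20B) and the selected target (about 355 degrees) can be computed as in the brief hypothetical Python sketch below; wrap-around at 360 degrees is handled by the modulo operation.

    # Minimal sketch (hypothetical): shortest signed difference, in degrees,
    # between the current heading and the heading selected on the wheel.
    def heading_delta_deg(current_deg, target_deg):
        return (target_deg - current_deg + 180.0) % 360.0 - 180.0

    print(heading_delta_deg(270.0, 355.0))   # 85.0 degrees toward the target in FIG. 20B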
[0143] Continuing this example, with reference to FIG. 20C,
by selecting the
confirm move option, the robotic system can determine appropriate motor
commands to
cause the instrument 204 to move to the desired heading and incline. FIG. 20C
shows the
instrument 204 after movement. FIG. 20D illustrates that, by using the insert
arrows of the
user input device 208, the user can command forward and backward motion of the
instrument
204.
[0144] The graphical user interface 200 and user input
device 208 of FIGs. 20A-
20D provide only one example of how these features may be configured.
Additional
examples are discussed below with reference to FIGs. 22A-23, and further
embodiments will
be apparent to those of ordinary skill in the art upon consideration of this
disclosure.
[0145] In order to generate appropriate motor commands
based on the user inputs
to cause the instrument to move appropriately, it is important that the
current roll of the
instrument be accounted for. This is necessary to ensure that the appropriate
pullwires are
actuated to cause the specified motion. In some embodiments, the system
determines the roll
of the instrument automatically, for example, using computer vision analysis
of the
appearance of one or more of the fiducials in the image as discussed above.
[0146] In other embodiments, the system may determine the
roll of the instrument
based upon a user input. For example, FIGs. 21A-21B illustrate two example
embodiments
for roll estimate determination. In FIG. 21A, the user is presented with both
the medical
image and a plurality of sample images. Each of the sample images illustrates
the appearance
of a fiducial at a specified roll value (e.g., similar to FIG. 16F). The user
is prompted to
select which image most closely corresponds to the appearance of the fiducial
in the medical
image. By increasing the number of sample images, the accuracy of the roll
estimate can be
improved. In some instances, the sample images are presented to the user in a
series of steps
(e.g., a first set of images at roll increments of 30 degrees, a second set of
images at roll
increments of 5 degrees, and a third set of images at roll increments of 1
degree).
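The coarse-to-fine selection just described can be expressed compactly. The hypothetical Python sketch below narrows the candidate roll angles in successive stages (30-, 5-, and 1-degree increments), with the selection at each stage supplied by a callback such as a user prompt; it is a sketch of the stepped presentation, not a specific implementation from this disclosure.

    # Minimal sketch (hypothetical): refine a roll estimate in stages of
    # decreasing increment, centering each stage on the previous selection.
    def refine_roll(pick_best, stages=((0, 360, 30), (-15, 15, 5), (-3, 3, 1))):
        center = 0
        for lo, hi, step in stages:
            candidates = [(center + a) % 360 for a in range(lo, hi + 1, step)]
            center = pick_best(candidates)   # e.g., ask the user to choose a sample image
        return center

    # Example with an oracle standing in for the user; the true roll is 47 degrees.
    true_roll = 47
    angular_error = lambda a: min(abs(a - true_roll), 360 - abs(a - true_roll))
    print(refine_roll(lambda candidates: min(candidates, key=angular_error)))  # 47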
[0147] FIG. 21B provides an alternative embodiment, wherein
a user is asked to
adjust a position of a slider (or provide another input) that causes a sample
image that
includes a representation of the fiducial at a given roll angle to change. The
position of the
slider is adjusted until the sample image corresponds to the appearance of the
fiducial in the
image.
[0148] In either embodiment, once the roll estimate is
determined, this
information can be used in conjunction with the user inputs of heading,
incline, and/or
insertion to provide appropriate motor commands.
[0149] FIGs. 22A-22B illustrate an embodiment of a user
input device for
providing image space control of a medical instrument. In this example, the
user input device
is similar to that which is shown in FIGs. 20A-20D. In this example, the user
input device is
configured to provide absolute or target-based inputs for heading and incline.
That is, using
the wheel for heading and the slider for incline, the user can select desired
angles. By
selecting confirm move, the system can determine the appropriate motor
commands and
cause movement of the instrument until the specified angles are reached. In
some
embodiments, the system only moves while the user holds down the confirm move
button,
although this need not be the case in all embodiments. The mechanisms for
providing
absolute or target-based inputs for heading and incline should not be limited
to only the
wheel and slider shown. Other mechanisms are possible.
[0150] In the example of FIGs. 22A-22B, insert and retract
commands can be
provided using the appropriate arrows. This is an example of a relative input
scheme.
Pressing the up arrow can cause the instrument to advance relative to its
current position, and
pressing the down arrow can cause the instrument to retract relative to its
current position. In
some embodiments, insert commands can be provided in an absolute or target-
based manner.
For example, a user can specify a desired insertion or retraction (e.g., in
terms of millimeters,
pixels, etc.) and the system can generate motor commands configured to cause
movement to
the target.
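Where an insertion target is given in image units rather than millimeters, a conversion is needed; the short hypothetical Python sketch below assumes a known pixels-per-millimeter scale for the image, with the calibration of that scale outside the scope of the sketch.

    # Minimal sketch (hypothetical): convert an insertion target expressed in
    # image pixels into millimeters of instrument advance.
    def insertion_target_mm(target_px, pixels_per_mm=2.5):
        return target_px / pixels_per_mm

    print(insertion_target_mm(50))   # 20.0 mm of insertion for a 50-pixel target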
[0151] FIG. 23 illustrates another embodiment of a user
input device for
providing image space control of a medical instrument. In the example of FIG.
23, arrows are
provided for each of insert, heading, and incline in order to provide relative
control. Pressing
any of the arrows can cause motion in the indicated direction (e.g., insert or
retract; alter
heading in the plane of the image; or alter incline into or out of the plane
of the image)
relative to the current position and/or heading of the instrument.
[0152]               FIG. 24 is a flowchart depicting an example image
space control method
300. The method 300 begins at block 302 at which the system displays a medical
image to
the user. The medical image can be displayed, for example, on a graphical user
interface. The
medical image can be a two-dimensional medical image, such as an X-ray. The
medical
image can include a view of at least a distal end of a medical instrument as
well as one or
more fiducials positioned on the instrument.
[0153] At block 304, a roll estimate for the instrument is
determined based on the
medical image. In some embodiments, the roll estimate is determined based on a
two-
dimensional appearance of the one or more fiducials in the medical image. In
some
embodiments, the roll estimate is determined based on a computer vision
analysis of the one
or more fiducials in the medical image. In other embodiments, the roll
estimate is determined
based on a user input, for example, as described with reference to FIGs. 21A
and 21B. For
example, the user may select which of a plurality of sample images, each
corresponding to a
different roll angle, most closely corresponds to the appearance of the
fiducial in the medical
image.
[0154] Next, at block 306, user inputs for desired motion
are provided. For
example, a user input can be received from a user input device. The user input
can comprise
at least one of a heading command to change a heading of the medical
instrument within a
plane of the medical image, or an incline command to change an incline of the
medical
instrument into or out of the plane of the medical image. In some embodiments,
the user
input comprises an insert or retract command. The user inputs can be provided
relative to the
current position of the instrument or as absolute or desired targets.
[0155] At block 308, based on the roll estimate and the
user input, the method
300 determines one or more motor commands configured to cause a robotic system
coupled
to the medical instrument to move the robotic medical instrument according to
the user input.
In some embodiments, the motor commands comprise pullwire commands configured
to
actuate one or more pullwires of the medical instrument.
[0156] Finally, at block 310, the motor commands are
transmitted to a robotic
system that moves the instrument according to the commands.
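The overall flow of blocks 302-310 can be summarized in a single control step. The hypothetical Python sketch below injects each stage as a callable so that it remains independent of any particular imaging device, user interface, or robot; the function names are placeholders and do not represent a specific implementation of the method.

    # Minimal sketch (hypothetical): one pass through blocks 302-310 of method 300.
    def image_space_control_step(acquire_image, show_image, estimate_roll,
                                 read_user_input, solve_motor_commands, send_to_robot):
        image = acquire_image()                                     # block 302: display the medical image
        show_image(image)
        roll_deg = estimate_roll(image)                             # block 304: roll estimate from fiducials
        command = read_user_input()                                 # block 306: heading / incline / insert
        motor_commands = solve_motor_commands(command, roll_deg)    # block 308: motor commands
        send_to_robot(motor_commands)                               # block 310: transmit to the robotic system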
Computer System
[0157] In some embodiments, the systems, processes, and
methods described
herein are implemented using a computing system, such as the one illustrated
in FIG. 10. The
example computer system 1002 is in communication with one or more computing
systems
1020 and/or one or more data sources 1022 via one or more networks 1018. While
FIG. 10
illustrates an embodiment of a computing system 1002, it is recognized that
the functionality
provided for in the components and modules of computer system 1002 can be
combined into
fewer components and modules, or further separated into additional components
and
modules.
[0158] The computer system 1002 can comprise a pose
determination module
1014 that carries out the functions, methods, acts, and/or processes described
herein. The
module 1014 is executed on the computer system 1002 by a central processing
unit 1006
discussed further below.
[0159] In general the word "module," as used herein, refers
to logic embodied in
hardware or firmware or to a collection of software instructions, having entry
and exit points.
Modules are written in a programming language, such as JAVA, C, C++, and/or the
like. Software
modules can be compiled or linked into an executable program, installed in a
dynamic link
library, or can be written in an interpreted language such as BASIC, PERL,
LUA, PHP, or
Python and/or any such languages. Software modules can be called from other
modules or
from themselves, and/or can be invoked in response to detected events or
interruptions.
Modules implemented in hardware include connected logic units such as gates
and flip-flops,
and/or can include programmable units, such as programmable gate arrays and/or
processors.
[0160] Generally, the modules described herein refer to
logical modules that can
be combined with other modules or divided into sub-modules despite their
physical
organization or storage. The modules are executed by one or more computing
systems and
can be stored on or within any suitable computer readable medium, or
implemented in-whole
or in-part within specially designed hardware or firmware. Not all calculations,
analysis, and/or
optimization require the use of computer systems, though any of the above-
described
methods, calculations, processes, or analyses can be facilitated through the
use of computers.
Further, in some embodiments, process blocks described herein can be altered,
rearranged,
combined, and/or omitted.
[0161] The computer system 1002 includes one or more
processing units (CPU)
1006, which can comprise a microprocessor. The computer system 1002 further
includes a
physical memory 1010, such as random access memory (RAM) for temporary storage
of
information, a read only memory (ROM) for permanent storage of information,
and a mass
storage device 1004, such as a backing store, hard drive, rotating magnetic
disks, solid state
disks (SSD), flash memory, phase-change memory (PCM), 3D XPoint memory,
diskette, or
optical media storage device. Alternatively, the mass storage device can be
implemented in
an array of servers. Typically, the components of the computer system 1002 are
connected to
the computer using a standards-based bus system. The bus system can be
implemented using
various protocols, such as Peripheral Component Interconnect (PCI), Micro
Channel, SCSI,
Industrial Standard Architecture (ISA) and Extended ISA (EISA) architectures.
[0162] The computer system 1002 includes one or more
input/output (I/O)
devices and interfaces 1012, such as a keyboard, mouse, touch pad, and
printer. The I/O
devices and interfaces 1012 can include one or more display devices, such as a
monitor,
which allows the visual presentation of data to a user. More particularly, a
display device
provides for the presentation of GUIs, application software data, and multi-
media
presentations, for example. The I/O devices and interfaces 1012 can also
provide a
communications interface to various external devices. The computer system 1002
can
comprise one or more multi-media devices 1008, such as speakers, video cards,
graphics
accelerators, and microphones, for example.
Computing System Device / Operating System
[0163] The computer system 1002 can run on a variety of
computing devices,
such as a server, a Windows server, a Structured Query Language server, a Unix
Server, a
personal computer, a laptop computer, and so forth. In other embodiments, the
computer
system 1002 can run on a cluster computer system, a mainframe computer system
and/or
other computing system suitable for controlling and/or communicating with
large databases,
performing high volume transaction processing, and generating reports from
large databases.
The computing system 1002 is generally controlled and coordinated by
operating system
software, such as z/OS, Windows, Linux, UNIX, BSD, PHP, SunOS, Solaris, MacOS,
iCloud services or other compatible operating systems, including proprietary
operating
systems. Operating systems control and schedule computer processes for
execution, perform
memory management, provide file system, networking, and I/O services, and
provide a user
interface, such as a graphical user interface (GUI), among other things.
Network
[0164] The computer system 1002 illustrated in FIG. 10 is
coupled to a network
1018, such as a LAN, WAN, or the Internet via a communication link 1016
(wired, wireless,
or a combination thereof). Network 1018 communicates with various computing
devices
and/or other electronic devices. Network 1018 is in communication with one or
more
computing systems 1020 and one or more data sources 1022. The pose
determination module
1014 can access or can be accessed by computing systems 1020 and/or data
sources 1022
through a web-enabled user access point. Connections can be a direct physical
connection, a
virtual connection, or another connection type. The web-enabled user access
point can
comprise a browser module that uses text, graphics, audio, video, and other
media to present
data and to allow interaction with data via the network 1018.
[0165] The output module can be implemented as a
combination of an all-points
addressable display such as a cathode ray tube (CRT), a liquid crystal display
(LCD), a
plasma display, or other types and/or combinations of displays. The output
module can be
implemented to communicate with input devices 1012 and can also include
software with
the appropriate interfaces which allow a user to access data through the use
of stylized screen
elements, such as menus, windows, dialogue boxes, tool bars, and controls (for
example,
radio buttons, check boxes, sliding scales, and so forth). Furthermore, the
output module can
communicate with a set of input and output devices to receive signals from the
user.
Other Systems
[0166] The computing system 1002 can include one or more
internal and/or
external data sources (for example, data sources 1022). In some embodiments,
one or more
of the data repositories and the data sources described above can be
implemented using a
relational database, such as DB2, Sybase, Oracle, CodeBase, and Microsoft SQL
Server as
well as other types of databases such as a flat-file database, an entity
relationship database,
an object-oriented database, and/or a record-based database.
[0167] The computer system 1002 can also access one or more
databases 1022.
The databases 1022 can be stored in a database or data repository. The
computer system 1002
can access the one or more databases 1022 through a network 1018 or can
directly access the
database or data repository through I/0 devices and interfaces 1012. The data
repository
storing the one or more databases 1022 can reside within the computer system
1002.
URLs and Cookies
[0168] In some embodiments, one or more features of the
systems, methods, and
devices described herein can utilize a URL and/or cookies, for example for
storing and/or
transmitting data or user information. A Uniform Resource Locator (URL) can
include a web
address and/or a reference to a web resource that is stored on a database
and/or a server. The
URL can specify the location of the resource on a computer and/or a computer
network. The
URL can include a mechanism to retrieve the network resource. The source of
the network
resource can receive a URL, identify the location of the web resource, and
transmit the web
resource back to the requestor. A URL can be converted to an IP address, and a
Domain
Name System (DNS) can look up the URL and its corresponding IP address. URLs
can be
references to web pages, file transfers, emails, database accesses, and other
applications. The
URLs can include a sequence of characters that identify a path, domain name, a
file
extension, a host name, a query, a fragment, scheme, a protocol identifier, a
port number, a
username, a password, a flag, an object, a resource name and/or the like. The
systems
disclosed herein can generate, receive, transmit, apply, parse, serialize,
render, and/or
perform an action on a URL.
[0169] A cookie, also referred to as an HTTP cookie, a web
cookie, an internet
cookie, and a browser cookie, can include data sent from a website and/or
stored on a user's
computer. This data can be stored by a user's web browser while the user is
browsing. The
cookies can include useful information for websites to remember prior browsing
information,
such as a shopping cart on an online store, clicking of buttons, login
information, and/or
records of web pages or network resources visited in the past. Cookies can
also include
information that the user enters, such as names, addresses, passwords, credit
card
information, etc. Cookies can also perform computer functions. For example,
authentication
cookies can be used by applications (for example, a web browser) to identify
whether the
user is already logged in (for example, to a web site). The cookie data can be
encrypted to
provide security for the consumer. Tracking cookies can be used to compile
historical
browsing histories of individuals. Systems disclosed herein can generate and
use cookies to
access data of an individual. Systems can also generate and use JSON web
tokens to store
authenticity information, HTTP authentication as authentication protocols, IP
addresses to
track session or identity information, URLs, and the like.
Embodiments
[0170] It will now be evident to those skilled in the art
that there have been
described herein methods, systems, and devices for improved routing of
catheters and other
devices to targeted anatomical locations using robotically controlled
assemblies. Although
the inventions hereof have been described by way of several embodiments, it
will be evident
that other adaptations and modifications can be employed without departing
from the spirit
and scope thereof. The terms and expressions employed herein have been used as
terms of
description and not of limitation; and thus, there is no intent of excluding
equivalents, but on
the contrary, it is intended to cover any and all equivalents that may be
employed without
departing from the spirit and scope of the inventions.
[0171] While the disclosure has been described with
reference to certain
embodiments, it will be understood that various changes may be made, and
equivalents may
be substituted for elements thereof without departing from the scope of the
disclosure. In
addition, many modifications will be appreciated to adapt a particular
instrument, situation,
or material to the teachings of the disclosure without departing from the
essential scope
thereof. Therefore, it is intended that the disclosure is not limited to the
particular
embodiment disclosed as the best mode contemplated for carrying out this
disclosure, but
that the disclosure will include all embodiments falling within the scope of
the appended
claims.
[0172] Although several embodiments and examples are
disclosed herein, the
present application extends beyond the specifically disclosed embodiments to
other
alternative embodiments and/or uses of the inventions and modifications and
equivalents
thereof. It is also contemplated that various combinations or subcombinations
of the specific
features and aspects of the embodiments may be made and still fall within the
scope of the
inventions. Accordingly, it should be understood that various features and
aspects of the
disclosed embodiments can be combined with or substituted for one another in
order to form
varying modes of the disclosed inventions. Thus, it is intended that the scope
of the present
inventions herein disclosed should not be limited by the particular disclosed
embodiments
described above but should be determined only by a fair reading of the claims
that follow.
[0173] While the embodiments disclosed herein are
susceptible to various
modifications, and alternative forms, specific examples thereof have been
shown in the
drawings and are herein described in detail. It should be understood, however,
that the
inventions are not to be limited to the particular forms or methods disclosed,
but, to the
contrary, the inventions are to cover all modifications, equivalents, and
alternatives falling
within the spirit and scope of the various embodiments described and the
appended claims.
Any methods disclosed herein need not be performed in the order recited. The
methods
disclosed herein include certain actions taken by a practitioner; however,
they can also
include any third-party instruction of those actions, either expressly or by
implication. For
example, actions such as "advancing a catheter or microcatheter" or "advancing
one portion
of the device (e.g., linearly) relative to another portion of the device to
rotate the distal end of
the device" include instructing advancing a catheter" or "instructing
advancing one portion of
the device," respectively. The ranges disclosed herein also encompass any and
all overlap,
sub-ranges, and combinations thereof. Language such as "up to," "at least,"
"greater than,"
"less than," "between," and the like includes the number recited. Numbers
preceded by a
term such as "about" or "approximately" include the recited numbers. For
example, "about
10 mm" includes "10 mm." Terms or phrases preceded by a term such as
"substantially"
include the recited term or phrase. For example, "substantially parallel"
includes "parallel."

Representative Drawing
A single figure which represents a drawing illustrating the invention.
Administrative Status

2024-08-01: As part of the transition to Next-Generation Patents (NGP), the Canadian Patents Database (CPD) now contains a more detailed Event History, which reproduces the Event Log of our new in-house solution.

Please note that events beginning with "Inactive:" refer to events that are no longer used in our new in-house solution.

For a better understanding of the status of the application or patent shown on this page, the Caution section and the descriptions of Patent, Event History, Maintenance Fees and Payment History should be consulted.

Event History

Description Date
Inactive: Cover page published 2024-05-29
Priority claim requirements determined compliant 2024-05-28
Letter sent 2024-05-28
Compliance requirements determined met 2024-05-28
Priority claim requirements determined compliant 2024-05-23
Letter sent 2024-05-23
Priority claim received 2024-05-23
Inactive: IPC assigned 2024-05-23
Inactive: IPC assigned 2024-05-23
Inactive: First IPC assigned 2024-05-23
Application received - PCT 2024-05-23
National entry requirements determined compliant 2024-05-23
Priority claim received 2024-05-23
Application published (open to public inspection) 2023-06-01

Abandonment History

There is no abandonment history

Fee History

Fee Type | Anniversary | Due Date | Date Paid
Basic national fee - standard | | | 2024-05-23
Registration of a document | | | 2024-05-23
Owners on Record

The current and past owners on record are displayed in alphabetical order.

Current owners on record
REMEDY ROBOTICS, INC.
Past owners on record
DAVID JAMES BELL
GREGORY KAHN
JAKE ANTHONY SGANGA
Past owners that do not appear in the "Owners on Record" list will appear in other documents on file.
Documents



Document Description | Date (yyyy-mm-dd) | Number of Pages | Image Size (KB)
Drawings | 2024-05-28 | 30 | 3,748
Description | 2024-05-28 | 50 | 2,666
Claims | 2024-05-28 | 5 | 191
Abstract | 2024-05-28 | 1 | 18
Description | 2024-05-22 | 50 | 2,666
Claims | 2024-05-22 | 5 | 191
Drawings | 2024-05-22 | 30 | 3,748
Abstract | 2024-05-22 | 1 | 18
Representative drawing | 2024-05-28 | 1 | 83
Cover page | 2024-05-28 | 1 | 119
National entry request | 2024-05-22 | 2 | 73
Miscellaneous correspondence | 2024-05-22 | 2 | 46
Declaration of entitlement | 2024-05-22 | 1 | 30
Assignment | 2024-05-22 | 5 | 171
Priority request - PCT | 2024-05-22 | 68 | 5,641
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 35
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 35
Priority request - PCT | 2024-05-22 | 87 | 6,098
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 35
Patent Cooperation Treaty (PCT) | 2024-05-22 | 2 | 155
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 36
International search report | 2024-05-22 | 3 | 87
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 64
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 37
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 37
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 38
Patent Cooperation Treaty (PCT) | 2024-05-22 | 1 | 37
National entry request | 2024-05-22 | 10 | 243
Courtesy - Letter confirming entry into the national phase under the PCT | 2024-05-22 | 2 | 49
Courtesy - Certificate of registration (related document(s)) | 2024-05-27 | 1 | 382