Patent 3158929 Summary

(12) Patent Application: (11) CA 3158929
(54) English Title: ROBOT TRACKING METHOD, DEVICE, EQUIPMENT, AND COMPUTER-READABLE STORAGE MEDIUM
(54) French Title: PROCEDE, DISPOSITIF ET EQUIPEMENT DE SUIVI DE ROBOT ET SUPPORT DE STOCKAGE LISIBLE PAR ORDINATEUR
Status: Examination
Bibliographic Data
(51) International Patent Classification (IPC):
  • G01C 21/20 (2006.01)
(72) Inventors :
  • ZHENG, XINJIANG (China)
  • LI, MINGHAO (China)
  • FAN, GUOXU (China)
  • ZHAO, JINGQUAN (China)
(73) Owners :
  • 10353744 CANADA LTD.
(71) Applicants :
  • 10353744 CANADA LTD. (Canada)
(74) Agent: HINTON, JAMES W.
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 2020-07-30
(87) Open to Public Inspection: 2021-05-06
Examination requested: 2022-09-16
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/CN2020/105997
(87) International Publication Number: WO 2021082571
(85) National Entry: 2022-04-25

(30) Application Priority Data:
Application No. Country/Territory Date
201911048673.3 (China) 2019-10-29

Abstracts

English Abstract

A robot tracking method, device and equipment, and a computer-readable storage medium, belonging to the field of intelligent robot control. The method comprises: at each moment of tracking, obtaining observation data obtained by observing a robot by at least two ultrasonic arrays (101); and estimating the motion state of the robot at each moment by using a preset augmented IMM-EKF algorithm, specifically including: respectively obtaining, by means of a number m of augmented EKF filters matched with m motion models corresponding to m motion states at moment k, the corresponding state estimation of the robot under each of the motion models at moment k to obtain m state estimations, and performing weighting calculation on the m state estimations to obtain a state estimation result of the robot at moment k, wherein each moment is represented as moment k, and k and m are integers greater than zero (102). The intelligent robot can be stably and effectively tracked when the motion state of the robot is unknown and changeable, the phenomenon of mistracking or tracking loss is reduced, and the method is suitable for application scenarios with multiple ultrasonic arrays.


French Abstract

L'invention concerne un procédé, un dispositif et un équipement de suivi de robot, ainsi qu'un support de stockage lisible par ordinateur, se rapportant au domaine de la commande intelligente de robot. Le procédé consiste : à chaque instant de suivi, à obtenir des données d'observation obtenues par l'observation d'un robot par au moins deux réseaux ultrasonores (101) ; et à estimer l'état de déplacement du robot de chaque instant à l'aide d'un algorithme IMM-EKF augmenté prédéfini, consistant plus particulièrement : à obtenir respectivement, au moyen d'un nombre m de filtres EKF augmentés mis en correspondance avec m modèles de déplacement correspondant à m états de déplacement à un instant k, l'estimation d'état correspondante du robot sous chacun des modèles de déplacement à l'instant k afin d'obtenir m estimations d'état, et à effectuer un calcul de pondération sur les m estimations d'état afin d'obtenir un résultat d'estimation d'état du robot à l'instant k, chaque instant étant représenté par un instant k, et k et m étant des entiers supérieurs à zéro (102). Le robot intelligent peut être suivi de manière stable et efficace lorsque l'état de déplacement du robot est inconnu et modifiable, le phénomène d'un mauvais suivi ou d'une perte de suivi est réduit, et le procédé est approprié pour des scénarios d'application à l'aide de multiples réseaux ultrasonores.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
What is claimed is:
1. A robot tracking method, characterized in that the method comprises:
obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking; and
employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result of the robot at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
2. The method according to Claim 1, characterized in that the step of obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking includes:
obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angles and distance data measured by the at least two ultrasonic arrays.
3. The method according to Claim 2, characterized in that the step, employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result of the robot at timing k, includes:
a robot tracking system creating step: creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
$c_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
where $i, j = 1, 2, \ldots, m$ represents the number of models, $n = 1, 2, \ldots, n$ represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in N$ represents timing, $c_{ij}$ represents a probability of a target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents an ith model state transfer matrix at timing k, $x_k^i$ represents a target state under an ith motion model at timing k, $A_k^n$ represents an observation matrix of an nth array at timing k, $z_k^n$ represents target state observation received by the nth array at timing k, $w_k^i$ represents process noise of model i, $v_k^n$ represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
a model input interacting step: letting $\hat{x}_{k-1|k-1}^i$ be state estimation of dimension-extended EKF filter i at timing k-1, $P_{k-1|k-1}^i$ be corresponding covariance matrix estimation, and $\mu_{k-1}^i$ be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
$\hat{x}_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \hat{x}_{k-1|k-1}^i \mu_{k-1}^{i|j}$;
$P_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \{ P_{k-1|k-1}^i + [\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}][\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}]^T \} \mu_{k-1}^{i|j}$;
where $\mu_{k-1}^{i|j} = c_{ij}\mu_{k-1}^i / \bar{c}_j$ and $\bar{c}_j = \sum_{i=1}^{m} c_{ij}\mu_{k-1}^i$;
a sub-model filtering step: calculating and obtaining corresponding inputs $\hat{x}_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i}$ $(i = 1, 2, \ldots, m)$ at the various dimension-extended EKF filters, and employing the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under the various models;
a model probability updating step: calculating model probabilities of the various models $i = 1, 2, \ldots, m$, the calculation formula being as follows:
$\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
where $\bar{c}_i = \sum_{j=1}^{m} c_{ji}\mu_{k-1}^j$ and $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$;
an estimation fusion outputting step: calculating state estimation and covariance matrix estimation of the target at the current timing according to the updated probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:
$\hat{x}_{k|k} = \sum_{i=1}^{m} \hat{x}_{k|k}^i \mu_k^i$;
$P_{k|k} = \sum_{i=1}^{m} \{ P_{k|k}^i + [\hat{x}_{k|k}^i - \hat{x}_{k|k}][\hat{x}_{k|k}^i - \hat{x}_{k|k}]^T \} \mu_k^i$;
where $\hat{x}_{k|k}$ represents target state estimation at timing k, and $P_{k|k}$ represents target state covariance matrix estimation at timing k.
4. The method according to Claim 3, characterized in that the sub-model filtering step includes:
a state predicting sub-step: with respect to the various models $i = 1, 2, \ldots, m$, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
$\hat{x}_{k|k-1}^i = F_{k-1}^i \hat{x}_{k-1|k-1}^{0i}$;
$P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i$;
a data fusing sub-step: employing a dimension-extended algorithm to perform data fusion, formulae of the various corresponding variables being as follows:
$z_k = [(z_k^1)^T, (z_k^2)^T, \ldots, (z_k^n)^T]^T$;
$A_k = [(A_k^1)^T, (A_k^2)^T, \ldots, (A_k^n)^T]^T$;
$R_k = \mathrm{diag}[R_k^1, R_k^2, \ldots, R_k^n]$;
corresponding to the models $i = 1, 2, \ldots, m$, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
$v_k^i = z_k - A_k \hat{x}_{k|k-1}^i$;
$S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k$;
a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows on the supposition that a condition of Gaussian distribution is abided by:
$\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp[-\frac{1}{2}(v_k^i)^T (S_k^i)^{-1} v_k^i]$;
a filter updating sub-step: corresponding to the models $i = 1, 2, \ldots, m$, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
$K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1}$;
$\hat{x}_{k|k}^i = \hat{x}_{k|k-1}^i + K_k^i v_k^i$;
$P_{k|k}^i = P_{k|k-1}^i - K_k^i A_k P_{k|k-1}^i$.
5. A robot tracking device, characterized in that the device comprises:
a data obtaining module, for obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking; and
a calculating module, for employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
6. The device according to Claim 5, characterized in that the data obtaining module is employed for:
obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angles and distance data measured by the at least two ultrasonic arrays.

7. The device according to Claim 6, characterized in that the calculating module includes a robot tracking system creating module for:
creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
$c_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
where $i, j = 1, 2, \ldots, m$ represents the number of models, $n = 1, 2, \ldots, n$ represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in N$ represents timing, $c_{ij}$ represents a probability of a target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents an ith model state transfer matrix at timing k, $x_k^i$ represents a target state under an ith motion model at timing k, $A_k^n$ represents an observation matrix of an nth array at timing k, $z_k^n$ represents target state observation received by the nth array at timing k, $w_k^i$ represents process noise of model i, $v_k^n$ represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
a model input interacting module, for letting $\hat{x}_{k-1|k-1}^i$ be state estimation of dimension-extended EKF filter i at timing k-1, $P_{k-1|k-1}^i$ be corresponding covariance matrix estimation, and $\mu_{k-1}^i$ be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
$\hat{x}_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \hat{x}_{k-1|k-1}^i \mu_{k-1}^{i|j}$;
$P_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \{ P_{k-1|k-1}^i + [\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}][\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}]^T \} \mu_{k-1}^{i|j}$;
where $\mu_{k-1}^{i|j} = c_{ij}\mu_{k-1}^i / \bar{c}_j$ and $\bar{c}_j = \sum_{i=1}^{m} c_{ij}\mu_{k-1}^i$;
a sub-model filtering module, for calculating and obtaining corresponding inputs $\hat{x}_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i}$ $(i = 1, 2, \ldots, m)$ at the various dimension-extended EKF filters, and employing the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under the various models;
a model probability updating module, for calculating model probabilities of the various models $i = 1, 2, \ldots, m$, the calculation formula being as follows:
$\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
where $\bar{c}_i = \sum_{j=1}^{m} c_{ji}\mu_{k-1}^j$ and $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$; and
an estimation fusion outputting module, for calculating state estimation and covariance matrix estimation of the target at the current timing according to the updated probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:
$\hat{x}_{k|k} = \sum_{i=1}^{m} \hat{x}_{k|k}^i \mu_k^i$;
$P_{k|k} = \sum_{i=1}^{m} \{ P_{k|k}^i + [\hat{x}_{k|k}^i - \hat{x}_{k|k}][\hat{x}_{k|k}^i - \hat{x}_{k|k}]^T \} \mu_k^i$;
where $\hat{x}_{k|k}$ represents target state estimation at timing k, and $P_{k|k}$ represents target state covariance matrix estimation at timing k.
8. The device according to Claim 7, characterized in that the sub-model filtering module includes:
a state predicting sub-module, for, with respect to the various models $i = 1, 2, \ldots, m$, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
$\hat{x}_{k|k-1}^i = F_{k-1}^i \hat{x}_{k-1|k-1}^{0i}$;
$P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i$;
a data fusing sub-module, for employing a dimension-extended algorithm to perform data fusion, formulae of the various corresponding variables being as follows:
$z_k = [(z_k^1)^T, (z_k^2)^T, \ldots, (z_k^n)^T]^T$;
$A_k = [(A_k^1)^T, (A_k^2)^T, \ldots, (A_k^n)^T]^T$;
$R_k = \mathrm{diag}[R_k^1, R_k^2, \ldots, R_k^n]$;
corresponding to the models $i = 1, 2, \ldots, m$, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
$v_k^i = z_k - A_k \hat{x}_{k|k-1}^i$;
$S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k$;
a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows on the supposition that a condition of Gaussian distribution is abided by:
$\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp[-\frac{1}{2}(v_k^i)^T (S_k^i)^{-1} v_k^i]$; and
a filter updating sub-module, for, corresponding to the models $i = 1, 2, \ldots, m$, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
$K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1}$;
$\hat{x}_{k|k}^i = \hat{x}_{k|k-1}^i + K_k^i v_k^i$;
$P_{k|k}^i = P_{k|k-1}^i - K_k^i A_k P_{k|k-1}^i$.
9. A robot tracking equipment, characterized in comprising:
a processor; and
a memory, for storing an executable instruction of the processor; wherein
the processor is configured to execute steps of the robot tracking method according to any one of Claims 1 to 4 via the executable instruction.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, and the steps of the robot tracking method according to any one of Claims 1 to 4 are realized when the computer program is executed by a processor.

Description

Note: Descriptions are shown in the official language in which they were submitted.


CA 03158929 2022-04-25
ROBOT TRACKING METHOD, DEVICE, EQUIPMENT, AND COMPUTER-READABLE STORAGE MEDIUM
BACKGROUND OF THE INVENTION
Technical Field
[0001] The present invention relates to the field of intelligent robot manipulation and control, and more particularly to a robot tracking method, and a corresponding device, equipment, and computer-readable storage medium.
Description of Related Art
[0002] At present, intelligent robots have been applied in various fields such as ocean exploration, security, and medical care, bringing great convenience to the development of science and to people's everyday lives; it is therefore necessary to track robots in real time. However, when an intelligent robot operates underwater or indoors, satellite positioning cannot be employed. The method of visual navigation is advantageous in obtaining complete information and surveying wide ranges, so it plays an important role in robot navigation, but its disadvantages are the long time required to process visual images and its inferior real-time performance. Accordingly, scholars in the field have researched the positioning of moving robots based on radio frequency identification (RFID) technology; it should be pointed out, however, that the precision of such wireless positioning technology is on the order of meters and cannot meet the requirement of high-precision navigation and positioning of indoor robots. An alternative is to use the intelligent robot to transmit ultrasonic waves: by receiving the sound signals emitted by the robot through multiple ultrasonic arrays, acquiring the observed location of the robot after processing, and finally obtaining a location estimate of the robot through filtering by a tracking algorithm, such a method achieves higher tracking precision while satisfying the real-time requirement.
Date Recue/Date Received 2022-04-25
[0003] Common tracking algorithms mainly include the extended Kalman filter, the unscented Kalman filter, and the particle filter, and these algorithms exhibit excellent tracking effects in the case where the target motion model is known and the motion state remains essentially unchanged. However, in the actual target tracking process, the motion model is usually unknown and the motion state of the robot is also frequently subject to change, so the tracking effects of the aforementioned algorithms will be reduced, or even lost. In comparison with a single ultrasonic reception array, a multi-array tracking system can obtain more motion state information of the target, and tracking precision is enhanced by the use of a corresponding fusion algorithm.
SUMMARY OF THE INVENTION
[0004] In order to solve problems pending in the state of the art, embodiments of the present invention provide a robot tracking method and a robot tracking device, whereby tracking precision in tracking a robot indoors is enhanced, the tracking error is small, and the computational load is relatively low, so as to realize stable and effective tracking of an intelligent robot even when its state is unknown and variable, and to reduce the phenomenon of erroneous tracking or failed tracking. The technical solutions are as follows.
[0005] According to one aspect, there is provided a robot tracking method that comprises:
[0006] obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking; and
[0007] employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result of the robot at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
[0008] Further, the step of obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking includes:
[0009] obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angles and distance data measured by the at least two ultrasonic arrays.
[0010] Further, the step of employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result of the robot at timing k, includes:
[0011] a robot tracking system creating step: creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0012] the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
[0013] the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
[0014] $c_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
[0015] where $i, j = 1, 2, \ldots, m$ represents the number of models, $n = 1, 2, \ldots, n$ represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in N$ represents timing, $c_{ij}$ represents a probability of a target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents an ith model state transfer matrix at timing k, $x_k^i$ represents a target state under an ith motion model at timing k, $A_k^n$ represents an observation matrix of an nth array at timing k, $z_k^n$ represents target state observation received by the nth array at timing k, $w_k^i$ represents process noise of model i, $v_k^n$ represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
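The system model above can be made concrete with a small numerical setup. The following sketch is illustrative only: the sampling interval, matrices, and noise levels are assumptions, not values from the patent. It instantiates m = 2 motion models that share a 1-D constant-velocity structure but differ in process noise, n = 2 ultrasonic observation matrices, and a model transition matrix whose rows satisfy $\sum_j c_{ij} = 1$.

```python
# Hypothetical two-model tracking-system setup (all numbers are assumptions).
# State x = [position, velocity]; each array observes position only.

T = 0.1  # assumed sampling interval between timing k and k+1

# Shared state transfer matrix F for a 1-D constant-velocity model
F = [[1.0, T],
     [0.0, 1.0]]

# Process noise covariances Q^i for m = 2 models:
# model 1 = steady motion (low noise), model 2 = maneuvering (high noise)
Q = [[[1e-4, 0.0], [0.0, 1e-4]],
     [[1e-2, 0.0], [0.0, 1e-2]]]

# Observation matrices A^n for n = 2 ultrasonic arrays (position measurement)
A = [[[1.0, 0.0]],
     [[1.0, 0.0]]]

# Model transition probabilities c_ij = P(M_k = M_j | M_{k-1} = M_i);
# each row is a probability distribution over the next model
C = [[0.95, 0.05],
     [0.05, 0.95]]

assert all(abs(sum(row) - 1.0) < 1e-12 for row in C)
```

The diagonal dominance of C encodes the usual assumption that the robot tends to stay in its current motion regime between consecutive timings.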
[0016] a model input interacting step: letting $\hat{x}_{k-1|k-1}^i$ be state estimation of dimension-extended EKF filter i at timing k-1, $P_{k-1|k-1}^i$ be corresponding covariance matrix estimation, and $\mu_{k-1}^i$ be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0017] $\hat{x}_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \hat{x}_{k-1|k-1}^i \mu_{k-1}^{i|j}$;
[0018] $P_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \{ P_{k-1|k-1}^i + [\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}][\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}]^T \} \mu_{k-1}^{i|j}$;
[0019] where $\mu_{k-1}^{i|j} = c_{ij}\mu_{k-1}^i / \bar{c}_j$ and $\bar{c}_j = \sum_{i=1}^{m} c_{ij}\mu_{k-1}^i$;
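The model input interacting (mixing) step can be sketched in a few lines for a scalar state and m = 2 models. This is an illustrative implementation of the mixing formulas only; the function name and all numbers are assumptions, not from the patent.

```python
# Illustrative IMM mixing step for a scalar state and m models.
def imm_mix(x_prev, P_prev, mu_prev, C):
    """Return mixed inputs (x0[j], P0[j]) for each filter j, plus c_bar.

    x_prev[i], P_prev[i]: state estimate and covariance of filter i at k-1
    mu_prev[i]:           probability of model i at k-1
    C[i][j]:              model transition probability c_ij
    """
    m = len(x_prev)
    # normalizers c̄_j = Σ_i c_ij μ^i_{k-1}
    c_bar = [sum(C[i][j] * mu_prev[i] for i in range(m)) for j in range(m)]
    # mixing probabilities μ^{i|j} = c_ij μ^i_{k-1} / c̄_j
    mu_mix = [[C[i][j] * mu_prev[i] / c_bar[j] for j in range(m)]
              for i in range(m)]
    # mixed state: probability-weighted combination of the filter states
    x0 = [sum(mu_mix[i][j] * x_prev[i] for i in range(m)) for j in range(m)]
    # mixed covariance: adds the spread-of-means term to each P^i
    P0 = [sum(mu_mix[i][j] * (P_prev[i] + (x_prev[i] - x0[j]) ** 2)
              for i in range(m)) for j in range(m)]
    return x0, P0, c_bar

x0, P0, c_bar = imm_mix([1.0, 2.0], [0.5, 0.5], [0.6, 0.4],
                        [[0.9, 0.1], [0.1, 0.9]])
```

Each mixed input x0[j] lies between the two filter estimates, pulled toward whichever model is more likely to have transitioned into model j.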
[0020] a sub-model filtering step: calculating and obtaining corresponding inputs $\hat{x}_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i}$ $(i = 1, 2, \ldots, m)$ at the various dimension-extended EKF filters, and employing the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under the various models;
[0021] a model probability updating step: calculating model probabilities of the various models $i = 1, 2, \ldots, m$, the calculation formula being as follows:
[0022] $\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
[0023] where $\bar{c}_i = \sum_{j=1}^{m} c_{ji}\mu_{k-1}^j$ and $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$;
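The probability update is a single normalized product of each model's likelihood with its predicted probability. A minimal sketch (illustrative numbers; in the full filter the likelihoods come from the Gaussian likelihood function of the sub-model filtering step):

```python
# Illustrative model-probability update: μ^i = Λ^i c̄_i / c
def update_model_probs(likelihoods, c_bar):
    # normalizer c = Σ_i Λ^i c̄_i
    c = sum(L * cb for L, cb in zip(likelihoods, c_bar))
    return [L * cb / c for L, cb in zip(likelihoods, c_bar)]

# model 1 explains the current measurement better (higher likelihood),
# so its probability grows
mu = update_model_probs([0.8, 0.2], [0.58, 0.42])
```

After the update the probabilities again sum to one, and weight shifts toward the model whose prediction best matches the stacked ultrasonic measurements.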
[0024] an estimation fusion outputting step: calculating state estimation and covariance matrix estimation of the target at the current timing according to the updated probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:
[0025] $\hat{x}_{k|k} = \sum_{i=1}^{m} \hat{x}_{k|k}^i \mu_k^i$; $P_{k|k} = \sum_{i=1}^{m} \{ P_{k|k}^i + [\hat{x}_{k|k}^i - \hat{x}_{k|k}][\hat{x}_{k|k}^i - \hat{x}_{k|k}]^T \} \mu_k^i$;
[0026] where $\hat{x}_{k|k}$ represents target state estimation at timing k, and $P_{k|k}$ represents target state covariance matrix estimation at timing k.
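The fusion output is the probability-weighted combination of the per-model estimates; the fused covariance also accounts for the disagreement between the models. A scalar sketch with made-up numbers:

```python
# Illustrative estimation-fusion output for a scalar state.
def imm_fuse(x_est, P_est, mu):
    # fused state: x̂ = Σ_i x̂^i μ^i
    x = sum(m_i * x_i for m_i, x_i in zip(mu, x_est))
    # fused covariance: per-model P^i plus the spread of the model means
    P = sum(m_i * (P_i + (x_i - x) ** 2)
            for m_i, x_i, P_i in zip(mu, x_est, P_est))
    return x, P

x, P = imm_fuse([1.0, 3.0], [0.2, 0.2], [0.5, 0.5])  # x = 2.0, P = 1.2
```

Note that the fused covariance (1.2) is much larger than either per-model covariance (0.2): when the models disagree strongly, the spread-of-means term honestly reports the extra uncertainty.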
[0027] Further, the sub-model filtering step includes:
[0028] a state predicting sub-step: with respect to the various models $i = 1, 2, \ldots, m$, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
[0029] $\hat{x}_{k|k-1}^i = F_{k-1}^i \hat{x}_{k-1|k-1}^{0i}$;
[0030] $P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i$;
[0031] a data fusing sub-step: employing a dimension-extended algorithm to perform data fusion, formulae of the various corresponding variables being as follows:
[0032] $z_k = [(z_k^1)^T, (z_k^2)^T, \ldots, (z_k^n)^T]^T$;
[0033] $A_k = [(A_k^1)^T, (A_k^2)^T, \ldots, (A_k^n)^T]^T$;
[0034] $R_k = \mathrm{diag}[R_k^1, R_k^2, \ldots, R_k^n]$;
[0035] corresponding to the models $i = 1, 2, \ldots, m$, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0036] $v_k^i = z_k - A_k \hat{x}_{k|k-1}^i$;
[0037] $S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k$;
[0038] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows on the supposition that a condition of Gaussian distribution is abided by:
[0039] $\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp[-\frac{1}{2}(v_k^i)^T (S_k^i)^{-1} v_k^i]$;
[0040] a filter updating sub-step: corresponding to the models $i = 1, 2, \ldots, m$, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
[0041] $K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1}$;
[0042] $\hat{x}_{k|k}^i = \hat{x}_{k|k-1}^i + K_k^i v_k^i$;
[0043] $P_{k|k}^i = P_{k|k-1}^i - K_k^i A_k P_{k|k-1}^i$.
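The dimension-extended update can be illustrated end to end for the simplest possible case: a scalar state observed directly by n = 2 arrays, so that the stacked measurement is 2-dimensional and S is a 2x2 matrix invertible by hand. This is a sketch under strong simplifying assumptions (the real filter uses the angle/distance observation model and matrix states); all numbers are illustrative.

```python
# Illustrative dimension-extended EKF update: scalar state, two arrays that
# each observe the state directly (A^1 = A^2 = 1), R = diag(r1, r2).
def augmented_ekf_update(x_pred, P_pred, z, r):
    # innovation v = z - A x̂_{k|k-1}, with stacked A = [1, 1]^T
    v = [z[0] - x_pred, z[1] - x_pred]
    # S = A P A^T + R is 2x2 for the stacked measurement
    S = [[P_pred + r[0], P_pred],
         [P_pred, P_pred + r[1]]]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    S_inv = [[S[1][1] / det, -S[0][1] / det],
             [-S[1][0] / det, S[0][0] / det]]
    # gain K = P A^T S^{-1} is a 1x2 row
    K = [P_pred * (S_inv[0][0] + S_inv[1][0]),
         P_pred * (S_inv[0][1] + S_inv[1][1])]
    x_new = x_pred + K[0] * v[0] + K[1] * v[1]
    # P_{k|k} = P - K A P; here A = [1, 1]^T so K·A = K[0] + K[1]
    P_new = P_pred - (K[0] + K[1]) * P_pred
    return x_new, P_new

# prior x = 0, P = 1; both arrays report z = 1 with unit noise variance
x_new, P_new = augmented_ekf_update(0.0, 1.0, [1.0, 1.0], [1.0, 1.0])
```

With both arrays agreeing, the posterior moves two thirds of the way to the measurement and the variance shrinks to one third, more than a single-array update (one half) would achieve; this is the fusion benefit that the dimension extension captures.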
[0044] According to another aspect, there is provided a robot tracking device that comprises:
[0045] a data obtaining module, for obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking; and
[0046] a calculating module, for employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
[0047] Further, the data obtaining module is employed for:
[0048] obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angles and distance data measured by the at least two ultrasonic arrays.
[0049] Further, the calculating module includes a robot tracking system creating module for:
[0050] creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0051] the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
[0052] the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
[0053] $c_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
[0054] where $i, j = 1, 2, \ldots, m$ represents the number of models, $n = 1, 2, \ldots, n$ represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in N$ represents timing, $c_{ij}$ represents a probability of a target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents an ith model state transfer matrix at timing k, $x_k^i$ represents a target state under an ith motion model at timing k, $A_k^n$ represents an observation matrix of an nth array at timing k, $z_k^n$ represents target state observation received by the nth array at timing k, $w_k^i$ represents process noise of model i, $v_k^n$ represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
[0055] a model input interacting module, for letting $\hat{x}_{k-1|k-1}^i$ be state estimation of dimension-extended EKF filter i at timing k-1, $P_{k-1|k-1}^i$ be corresponding covariance matrix estimation, and $\mu_{k-1}^i$ be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0056] $\hat{x}_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \hat{x}_{k-1|k-1}^i \mu_{k-1}^{i|j}$;
[0057] $P_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \{ P_{k-1|k-1}^i + [\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}][\hat{x}_{k-1|k-1}^i - \hat{x}_{k-1|k-1}^{0j}]^T \} \mu_{k-1}^{i|j}$;
[0058] where $\mu_{k-1}^{i|j} = c_{ij}\mu_{k-1}^i / \bar{c}_j$ and $\bar{c}_j = \sum_{i=1}^{m} c_{ij}\mu_{k-1}^i$;
[0059] a sub-model filtering module, for calculating and obtaining corresponding inputs $\hat{x}_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i}$ $(i = 1, 2, \ldots, m)$ at the various dimension-extended EKF filters, and employing the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under the various models;
[0060] a model probability updating module, for calculating model probabilities of the various models $i = 1, 2, \ldots, m$, the calculation formula being as follows:
[0061] $\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
[0062] where $\bar{c}_i = \sum_{j=1}^{m} c_{ji}\mu_{k-1}^j$ and $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$; and
[0063] an estimation fusion outputting module, for calculating state estimation and covariance matrix estimation of the target at the current timing according to the updated probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:
[0064] $\hat{x}_{k|k} = \sum_{i=1}^{m} \hat{x}_{k|k}^i \mu_k^i$;
[0065] $P_{k|k} = \sum_{i=1}^{m} \{ P_{k|k}^i + [\hat{x}_{k|k}^i - \hat{x}_{k|k}][\hat{x}_{k|k}^i - \hat{x}_{k|k}]^T \} \mu_k^i$;
[0066] where $\hat{x}_{k|k}$ represents target state estimation at timing k, and $P_{k|k}$ represents target state covariance matrix estimation at timing k.
[0067] Further, the sub-model filtering module includes:
[0068] a state predicting sub-module, for, with respect to the various models $i = 1, 2, \ldots, m$, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
[0069] $\hat{x}_{k|k-1}^i = F_{k-1}^i \hat{x}_{k-1|k-1}^{0i}$;
[0070] $P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i$;
[0071] a data fusing sub-module, for employing a dimension-extended algorithm to perform data fusion, formulae of the various corresponding variables being as follows:
[0072] $z_k = [(z_k^1)^T, (z_k^2)^T, \ldots, (z_k^n)^T]^T$;
[0073] $A_k = [(A_k^1)^T, (A_k^2)^T, \ldots, (A_k^n)^T]^T$;
[0074] $R_k = \mathrm{diag}[R_k^1, R_k^2, \ldots, R_k^n]$;
[0075] corresponding to the models $i = 1, 2, \ldots, m$, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0076] $v_k^i = z_k - A_k \hat{x}_{k|k-1}^i$;
[0077] $S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k$;
[0078] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows on the supposition that a condition of Gaussian distribution is abided by:
[0079] $\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp[-\frac{1}{2}(v_k^i)^T (S_k^i)^{-1} v_k^i]$; and
[0080] a filter updating sub-module, for, corresponding to the models $i = 1, 2, \ldots, m$, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
[0081] $K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1}$;
[0082] $\hat{x}_{k|k}^i = \hat{x}_{k|k-1}^i + K_k^i v_k^i$;
[0083] $P_{k|k}^i = P_{k|k-1}^i - K_k^i A_k P_{k|k-1}^i$.
[0084] According to still another aspect, there is provided a robot tracking equipment that comprises:
[0085] a processor; and
[0086] a memory, for storing an executable instruction of the processor; wherein
[0087] the processor is configured to execute, via the executable instruction, steps of the robot tracking method according to any of the aforementioned solutions.
[0088] According to yet another aspect, there is provided a computer-readable storage medium, and the computer-readable storage medium stores a computer program; the steps of the robot tracking method according to any of the aforementioned solutions are realized when the computer program is executed by a processor.
[0089] The technical solutions provided by the embodiments of the present invention bring about the following advantageous effects.
[0090] 1. By arranging multiple ultrasonic arrays, observation data is obtained at each timing of tracking the robot, and each step of the iterative process performs measurement dimension extension through a preset dimension-extended IMM-EKF algorithm on the basis of the IMM-EKF algorithm, so that more target motion state information is obtained; such a solution is applicable to multiple ultrasonic arrays.
[0091] 2. The primary observation data is fully utilized, the fusion effect is optimized, tracking precision in tracking a robot indoors is enhanced, the tracking error is small, and the computational load is relatively low, so as to realize stable and effective tracking of an intelligent robot even when its state is unknown and variable, and to reduce the phenomenon of erroneous tracking or failed tracking.
[0092] 3. The dimension-extended IMM-EKF algorithm is employed to track the robot, whereby the influence of reverberation and noise on the tracking precision is effectively weakened, the tracking error is rendered apparently lower than that of the traditional IMM-EKF algorithm, and excellent robustness is also exhibited in tracking scenarios where observation data is missing.
BRIEF DESCRIPTION OF THE DRAWINGS
[0093] To more clearly describe the technical solutions in the embodiments of
the present
invention, drawings required to illustrate the embodiments are briefly
introduced below.
Apparently, the drawings introduced below are merely directed to some
embodiments of
the present invention, while persons ordinarily skilled in the art may further
acquire other
drawings on the basis of these drawings without spending creative effort in
the process.
[0094] Fig. 1 is a flowchart illustrating the robot tracking method provided
by an embodiment
of the present invention;
[0095] Fig. 2 is a flowchart illustrating sub-steps of step 102 in Fig. 1;
[0096] Fig. 3 is a flowchart illustrating sub-steps of step 1023 in Fig. 2;
[0097] Fig. 4 is a flow block diagram illustrating the robot tracking method
provided by an
embodiment of the present invention;
[0098] Fig. 5 is a view schematically illustrating the process of calculating
state calculation result
in the robot tracking method provided by an embodiment of the present
invention;
[0099] Fig. 6 is a view schematically illustrating the structure of the robot
tracking device
provided by an embodiment of the present invention;
[0100] Fig. 7 is a view schematically illustrating the formation of the robot
tracking equipment
provided by an embodiment of the present invention;
[0101] Fig. 8 is a view illustrating the effect comparison of tracking tracks
between the robot
tracking solution provided by the embodiments of the present invention and a
currently
available solution; and
[0102] Fig. 9 is a view illustrating the effect comparison of tracking errors
between the robot
tracking solution provided by the embodiments of the present invention and a
currently
available solution.
DETAILED DESCRIPTION OF THE INVENTION
[0103] To make more lucid and clear the objectives, technical solutions and
advantages of the
present invention, the technical solutions in the embodiments of the present
invention will
be clearly and comprehensively described below with reference to the
accompanying
drawings in the embodiments of the present invention. Apparently, the
embodiments as
described are merely partial, rather than the entire, embodiments of the
present invention.
Any other embodiments makeable by persons ordinarily skilled in the art on the
basis of
the embodiments in the present invention without creative effort shall all
fall within the
protection scope of the present invention. The wordings of "plural", "a
plurality of" and
"multiple" as used in the description of the present invention mean "two or
more", unless
definitely and specifically defined otherwise.
[0104] In the robot tracking method, and corresponding device, equipment and
computer-
readable storage medium provided by the embodiments of the present invention,
by
arranging multiple ultrasonic arrays, observation data is obtained at each
timing of
tracking the robot, each step of the iterative process is performed with
measurement and
dimension extension through a preset dimension-extended IMM-EKF algorithm on
the
basis of the IMM-EKF algorithm, more target motion state information is
obtained, such
a solution is applicable to multiple ultrasonic arrays, the primary
observation data is fully
utilized, the fusion effect is optimized, tracking precision in tracking a
robot indoors is
enhanced, the tracking error is small, and the computational amount is
relatively low, so
as to realize stable and effective tracking of an intelligent robot also when
its state is
unknown and variable, and to reduce the phenomenon of erroneous tracking or
failed
tracking. Therefore, the robot tracking method is applicable to the
application field of
intelligent robots manipulation and control, and particularly applicable to
the application
scenario with multiple ultrasonic arrays.
[0105] The robot tracking method, and corresponding device, equipment and
computer-readable
storage medium provided by the embodiments of the present invention are
described in
greater detail below in conjunction with specific embodiments and accompanying
drawings.
[0106] Fig. 1 is a flowchart illustrating the robot tracking method provided
by an embodiment
of the present invention. Fig. 2 is a flowchart illustrating sub-steps of step
102 in Fig. 1.
Fig. 3 is a flowchart illustrating sub-steps of step 1023 in Fig. 2.
[0107] As shown in Fig. 1, the robot tracking method provided by an embodiment
of the present
invention comprises the following steps.
[0108] 101 - obtaining observation data of at least two ultrasonic arrays on a
robot at each timing
of tracking.
[0109] The step of obtaining observation data of at least two ultrasonic
arrays on a robot at each
timing of tracking includes:
[0110] obtaining observation data z_k^1, z_k^2, …, z_k^n of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and z_k^1, z_k^2, …, z_k^n are all vectors of measured robot angles and distance data.
[0111] As is noticeable, the process of step 101 can further be realized via
other modes besides
the mode specified in the above step, and the specific modes are not
restricted in the
embodiments of the present invention.
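As an illustrative sketch only (the specification does not fix the internal layout of z_k^n), the per-array observation at timing k can be packed as an angle/distance vector; the function name and the numeric values below are assumptions, not taken from the text:

```python
import numpy as np

# Hypothetical packing of one ultrasonic array's measurement at timing k:
# the observation z_k^n is a vector of the measured robot angle and distance.
def make_observation(angle_rad, distance_m):
    return np.array([angle_rad, distance_m])

# Two arrays observing the robot at the same timing (n = 2).
z1 = make_observation(0.52, 3.1)
z2 = make_observation(-0.35, 2.7)
observations = [z1, z2]
```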
[0112] 102 - employing a preset dimension-extended IMM-EKF algorithm to
estimate a motion
state of the robot at each timing, obtaining state estimation to which the
robot corresponds
under each motion model at timing k through m number of dimension-extended EKF
filters that match m number of motion models corresponding to m number of
motion
states at timing k, obtaining m number of states, and performing weighted
calculation on
the m number of states to obtain a state estimation result of the robot at
timing k, wherein
each timing is expressed by timing k, and k, m are each an integer greater
than 0.
[0113] As shown in Fig. 2, step 102 further includes the following sub-steps:
[0114] 1021 - a stochastic hybrid system creating step, i.e. a robot tracking system creating step: creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0115] the motion equation: x_{k+1}^i = F_k^i x_k^i + w_k^i;
[0116] the observation equation: z_k^n = A_k^n x_k + v_k^n;
[0117] C_{ij} = P(M_k = M^j | M_{k-1} = M^i);
[0118] where i, j = 1, 2, …, m represents the number of models, n = 1, 2, …, n represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, k ∈ N represents timing, C_{ij} represents a probability of a target transferring from model i at timing k-1 to model j at timing k, F_k^i represents an ith model state transfer matrix at timing k, x_k^i represents a target state under an ith motion model at timing k, A_k^n represents an observation matrix of an nth array at timing k, z_k^n represents target state observation received by the nth array at timing k, w_k^i represents process noise of model i, v_k^n represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and a covariance of Q_k^i and R_k^n respectively;
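To make the system model of step 1021 concrete, the sketch below shows one possible motion model (a constant-velocity state transfer matrix F for a state [x, vx, y, vy]) and a two-model transfer matrix C_{ij}; the sampling interval and the probability values are illustrative assumptions, not values from the specification:

```python
import numpy as np

T = 0.1  # assumed sampling interval (s); not specified in the text

# Constant-velocity state transfer matrix F for a state [x, vx, y, vy]:
# one of the m motion models that the dimension-extended EKF filters match.
F_cv = np.array([[1.0, T, 0.0, 0.0],
                 [0.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 1.0, T],
                 [0.0, 0.0, 0.0, 1.0]])

# Model transfer probabilities C[i, j] = P(M_k = M^j | M_{k-1} = M^i)
# for m = 2 models; the 0.95/0.05 entries are illustrative assumptions.
C = np.array([[0.95, 0.05],
              [0.05, 0.95]])
assert np.allclose(C.sum(axis=1), 1.0)  # each row is a probability distribution
```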
[0119] 1022 - a model input interacting step: letting x_{k-1|k-1}^i be state estimation of dimension-extended EKF filter i at timing k-1, P_{k-1|k-1}^i be corresponding covariance matrix estimation, and μ_{k-1}^i be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0120] x_{k-1|k-1}^{0j} = Σ_{i=1}^m x_{k-1|k-1}^i μ_{k-1}^{i|j};
[0121] P_{k-1|k-1}^{0j} = Σ_{i=1}^m { P_{k-1|k-1}^i + [x_{k-1|k-1}^i − x_{k-1|k-1}^{0j}][x_{k-1|k-1}^i − x_{k-1|k-1}^{0j}]^T } μ_{k-1}^{i|j};
[0122] where μ_{k-1}^{i|j} = C_{ij} μ_{k-1}^i / c̄_j, c̄_j = Σ_{i=1}^m C_{ij} μ_{k-1}^i;
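The interaction formulae of step 1022 can be sketched directly in code. This is a generic rendering of the standard IMM mixing step under the formulas stated above, not code from the specification:

```python
import numpy as np

def imm_mixing(xs, Ps, mu, C):
    """Model input interaction (step 1022): mix the m per-filter estimates.

    xs: list of m state estimates x^i_{k-1|k-1}
    Ps: list of m covariance estimates P^i_{k-1|k-1}
    mu: model probabilities mu^i_{k-1}, shape (m,)
    C:  model transfer matrix, C[i, j] = P(M_k = M^j | M_{k-1} = M^i)
    """
    m = len(xs)
    cbar = C.T @ mu                    # cbar[j] = sum_i C[i, j] * mu[i]
    mu_ij = (C * mu[:, None]) / cbar   # mu_ij[i, j] = C[i, j] * mu[i] / cbar[j]
    x0, P0 = [], []
    for j in range(m):
        xj = sum(mu_ij[i, j] * xs[i] for i in range(m))
        Pj = sum(mu_ij[i, j] * (Ps[i] + np.outer(xs[i] - xj, xs[i] - xj))
                 for i in range(m))
        x0.append(xj)
        P0.append(Pj)
    return x0, P0, cbar
```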
[0123] 1023 - a sub-model filtering step: calculating and obtaining corresponding inputs x_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i} (i = 1, 2, …, m) at various dimension-extended EKF filters, and employing the obtained measurement z_k^1, z_k^2, …, z_k^n to perform corresponding state estimation updates under various models;
[0124] 1024 - a model probability updating step: calculating model probabilities on various models i = 1, 2, …, m, the calculation formula being as follows:
[0125] μ_k^i = Λ_k^i c̄_i / c;
[0126] where c̄_i = Σ_{j=1}^m C_{ji} μ_{k-1}^j, c = Σ_{i=1}^m Λ_k^i c̄_i; and
[0127] 1025 - an estimation fusion outputting step: calculating state estimation and covariance matrix estimation of the target at the current timing according to updated probabilities, state estimations and estimation covariance matrix estimations of the various models, the calculation formulae being as follows:
[0128] x_{k|k} = Σ_{i=1}^m x_{k|k}^i μ_k^i;
[0129] P_{k|k} = Σ_{i=1}^m { P_{k|k}^i + [x_{k|k} − x_{k|k}^i][x_{k|k} − x_{k|k}^i]^T } μ_k^i;
[0130] where x_{k|k} represents target state estimation at timing k, and P_{k|k} represents target state covariance matrix estimation at timing k.
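Steps 1024 and 1025 can be sketched as follows. This is a generic rendering of the stated formulas, with the likelihoods Λ assumed to be supplied by the sub-model filters and c̄ by the interaction step:

```python
import numpy as np

def update_model_probs(lams, cbar):
    """Model probability update (step 1024):
    mu_k^i = Lambda_k^i * cbar_i / c, with c = sum_i Lambda_k^i * cbar_i."""
    unnorm = np.asarray(lams) * np.asarray(cbar)
    return unnorm / unnorm.sum()

def imm_fuse(xs, Ps, mu):
    """Estimation fusion output (step 1025): probability-weighted state and
    covariance combining the m per-model estimates."""
    m = len(xs)
    x = sum(mu[i] * xs[i] for i in range(m))
    P = sum(mu[i] * (Ps[i] + np.outer(x - xs[i], x - xs[i])) for i in range(m))
    return x, P
```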
[0131] As shown in Fig. 3, the aforementioned sub-model filtering step further
includes the
following sub-steps:
[0132] 1023a - a state predicting sub-step: with respect to the various models i = 1, 2, …, m, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
[0133] x_{k|k-1}^i = F_{k-1}^i x_{k-1|k-1}^{0i};
[0134] P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i;
[0135] 1023b - a data fusing sub-step: employing a dimension-extended algorithm to perform data fusion, formulae of various corresponding variables being as follows:
[0136] z_k = [(z_k^1)^T, (z_k^2)^T]^T;
[0137] A_k = [(A_k^1)^T, (A_k^2)^T]^T;
[0138] R_k = diag[R_k^1, R_k^2];
[0139] corresponding to the models i = 1, 2, …, m, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0140] v_k^i = z_k − A_k x_{k|k-1}^i;
[0141] S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k;
[0142] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows in the supposition that a condition of Gaussian distribution is abided by:
[0143] Λ_k^i = (1 / √|2π S_k^i|) exp[−(1/2) (v_k^i)^T (S_k^i)^{-1} v_k^i]; and
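The dimension extension of sub-step 1023b amounts to stacking the per-array measurements, observation matrices, and noise covariances, then forming the residual, innovation covariance, and Gaussian likelihood. The sketch below assumes NumPy and generic shapes; it is an illustration of the stated formulas rather than the specification's own code:

```python
import numpy as np

def extend_dimension(z_list, A_list, R_list):
    """Stack per-array quantities: z_k = [(z^1)^T, (z^2)^T]^T, A_k likewise,
    and R_k = diag[R^1, R^2] as a block-diagonal matrix."""
    z = np.concatenate(z_list)
    A = np.vstack(A_list)
    R = np.zeros((z.size, z.size))
    off = 0
    for Rn in R_list:
        d = Rn.shape[0]
        R[off:off + d, off:off + d] = Rn
        off += d
    return z, A, R

def residual_cov_likelihood(z, A, R, x_pred, P_pred):
    """Residual v, innovation covariance S, and Gaussian likelihood Lambda."""
    v = z - A @ x_pred
    S = A @ P_pred @ A.T + R
    lam = np.exp(-0.5 * v @ np.linalg.solve(S, v)) \
          / np.sqrt(np.linalg.det(2.0 * np.pi * S))
    return v, S, lam
```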
[0144] 1023c - a filter updating sub-step: corresponding to the models i = 1, 2, …, m, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
[0145] K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1};
[0146] x_{k|k}^i = x_{k|k-1}^i + K_k^i v_k^i;
[0147] P_{k|k}^i = P_{k|k-1}^i − K_k^i A_k P_{k|k-1}^i.
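Sub-step 1023c is the standard Kalman-style update applied per model with the dimension-extended quantities; a minimal sketch under that reading:

```python
import numpy as np

def ekf_update(x_pred, P_pred, A, v, S):
    """Filter update (sub-step 1023c): gain, state update, error covariance."""
    K = P_pred @ A.T @ np.linalg.inv(S)   # K_k^i = P A^T S^{-1}
    x = x_pred + K @ v                    # x_{k|k}^i = x_{k|k-1}^i + K v
    P = P_pred - K @ A @ P_pred           # P_{k|k}^i = P - K A P
    return x, P
```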
[0148] Fig. 4 is a flow block diagram illustrating the robot tracking method
provided by an
embodiment of the present invention, Fig. 5 is a view schematically
illustrating the
process of calculating state calculation result in the robot tracking method
provided by an
embodiment of the present invention, and the two Figures together demonstrate
a mode
of execution in which two ultrasonic arrays are selected and used.
[0149] As is noticeable, the process of step 102 can further be realized via
other modes besides
the mode specified in the above step, and the specific modes are not
restricted in the
embodiments of the present invention.
[0150] Fig. 6 is a view schematically illustrating the structure of the robot tracking device provided by an embodiment of the present invention. As shown in Fig. 6, the robot tracking device provided by an embodiment of the present invention comprises a data obtaining module 1 and a calculating module 2.
[0151] The data obtaining module 1 is employed for obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking. Specifically, the data obtaining module 1 is employed for: obtaining observation data z_k^1, z_k^2, …, z_k^n of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and z_k^1, z_k^2, …, z_k^n are all vectors of robot angles and distance data measured by the at least two ultrasonic arrays.
[0152] The calculating module 2 is employed for: employing a preset dimension-
extended IMM-
EKF algorithm to estimate a motion state of the robot at each timing,
obtaining state
estimation to which the robot corresponds under each motion model at timing k
through
m number of dimension-extended EKF filters that match m number of motion
models
corresponding to m number of motion states at timing k, obtaining m number of
states,
and performing weighted calculation on the m number of states to obtain a
state
estimation result at timing k, wherein each timing is expressed by timing k,
and k, m are
each an integer greater than 0.
[0153] Specifically, the calculating module 2 includes a robot tracking system
creating module
21, a model input interacting module 22, a sub-model filtering module 23, a
model
probability updating module 24 and an estimation fusion outputting module 25.
[0154] The robot tracking system creating module 21 is employed for:
[0155] creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0156] the motion equation: x_{k+1}^i = F_k^i x_k^i + w_k^i;
[0157] the observation equation: z_k^n = A_k^n x_k + v_k^n;
[0158] C_{ij} = P(M_k = M^j | M_{k-1} = M^i);
[0159] where i, j = 1, 2, …, m represents the number of models, n = 1, 2, …, n represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, k ∈ N represents timing, C_{ij} represents a probability of a target transferring from model i at timing k-1 to model j at timing k, F_k^i represents an ith model state transfer matrix at timing k, x_k^i represents a target state under an ith motion model at timing k, A_k^n represents an observation matrix of an nth array at timing k, z_k^n represents target state observation received by the nth array at timing k, w_k^i represents process noise of model i, v_k^n represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and a covariance of Q_k^i and R_k^n respectively.
[0160] The model input interacting module 22 is employed for:
[0161] letting x_{k-1|k-1}^i be state estimation of dimension-extended EKF filter i at timing k-1, P_{k-1|k-1}^i be corresponding covariance matrix estimation, and μ_{k-1}^i be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0162] x_{k-1|k-1}^{0j} = Σ_{i=1}^m x_{k-1|k-1}^i μ_{k-1}^{i|j};
[0163] P_{k-1|k-1}^{0j} = Σ_{i=1}^m { P_{k-1|k-1}^i + [x_{k-1|k-1}^i − x_{k-1|k-1}^{0j}][x_{k-1|k-1}^i − x_{k-1|k-1}^{0j}]^T } μ_{k-1}^{i|j};
[0164] where μ_{k-1}^{i|j} = C_{ij} μ_{k-1}^i / c̄_j, c̄_j = Σ_{i=1}^m C_{ij} μ_{k-1}^i.
[0165] The sub-model filtering module 23 is employed for:
[0166] calculating and obtaining corresponding inputs x_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i} (i = 1, 2, …, m) at various dimension-extended EKF filters, and employing the obtained measurement z_k^1, z_k^2, …, z_k^n to perform corresponding state estimation updates under various models.
[0167] The model probability updating module 24 is employed for: calculating
model
probabilities on various models i=1, 2, ...m, the calculation formula being as
follows:
[0168] μ_k^i = Λ_k^i c̄_i / c;
[0169] where c̄_i = Σ_{j=1}^m C_{ji} μ_{k-1}^j, c = Σ_{i=1}^m Λ_k^i c̄_i.
[0170] The estimation fusion outputting module 25 is employed for:
[0171] calculating state estimation and covariance matrix estimation of the
target at the current
timing according to update probabilities, state estimations and covariance
matrix
estimations of the various models, the calculation formulae being as follows:
[0172] x_{k|k} = Σ_{i=1}^m x_{k|k}^i μ_k^i;
[0173] P_{k|k} = Σ_{i=1}^m { P_{k|k}^i + [x_{k|k} − x_{k|k}^i][x_{k|k} − x_{k|k}^i]^T } μ_k^i;
[0174] where x_{k|k} represents target state estimation at timing k, and P_{k|k} represents target state covariance matrix estimation at timing k.
[0175] Further, the sub-model filtering module 23 includes a state predicting
sub-module 231, a
data fusing sub-module 232 and a filter updating sub-module 233.
[0176] The state predicting sub-module 231 is employed for:
[0177] with respect to the various models i=1, 2... m, calculating
corresponding prediction states
and prediction covariance matrixes respectively, the calculation formulae
being as
follows:
[0178] x_{k|k-1}^i = F_{k-1}^i x_{k-1|k-1}^{0i};
[0179] P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i.
[0180] The data fusing sub-module 232 is employed for:
[0181] employing a dimension-extended algorithm to perform data fusion,
formulae of various
corresponding variables being as follows:
[0182] z_k = [(z_k^1)^T, (z_k^2)^T]^T;
[0183] A_k = [(A_k^1)^T, (A_k^2)^T]^T;
[0184] R_k = diag[R_k^1, R_k^2];
[0185] corresponding to the models i = 1, 2, …, m, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0186] v_k^i = z_k − A_k x_{k|k-1}^i;
[0187] S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k;
[0188] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows in the supposition that a condition of Gaussian distribution is abided by:
[0189] Λ_k^i = (1 / √|2π S_k^i|) exp[−(1/2) (v_k^i)^T (S_k^i)^{-1} v_k^i].
[0190] The filter updating sub-module 233 is employed for:
[0191] corresponding to the models i=1, 2... m, respectively calculating their
respective filter
gains, state estimation updates, and error covariance matrixes, the
calculation formulae
being as follows:
[0192] K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1};
[0193] x_{k|k}^i = x_{k|k-1}^i + K_k^i v_k^i;
[0194] P_{k|k}^i = P_{k|k-1}^i − K_k^i A_k P_{k|k-1}^i.
[0195] Fig. 7 is a view schematically illustrating the formation of the robot tracking equipment provided by an embodiment of the present invention. As shown in Fig. 7, the robot tracking equipment provided by an embodiment of the present invention comprises a processor 3 and a memory 4; the memory 4 is for storing an executable instruction of the processor 3, and the processor 3 is configured to execute, via the executable instruction, the steps of the robot tracking method according to any of the aforementioned solutions.
[0196] An embodiment of the present invention further provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program; when the computer program is executed by a processor, the steps of the robot tracking method according to any of the aforementioned solutions are realized.
[0197] As should be noted, when the robot tracking device provided by the aforementioned embodiment triggers a robot tracking business, the division into the aforementioned various functional modules is merely by way of example; in actual application, the functions may be assigned, as required, to different functional modules for completion, that is to say, the internal structure of the device may be divided into different functional modules to complete the entire or partial functions described above. In addition, the robot tracking device, robot tracking equipment and computer-readable storage medium that trigger the robot tracking business provided by the aforementioned embodiments pertain to the same conception as the robot tracking method that triggers the robot tracking business as provided by the method embodiment; see the corresponding method embodiment for their specific realization processes, while no repetition will be made in this context.
[0198] All of the aforementioned optional technical solutions can be randomly
combined to form
the optional embodiments of the present invention, while no redundancy is made
in this
context on a one-by-one basis.
[0199] To enunciate the advantages of the robot tracking solution provided by
the embodiments
of the present invention in tracking automated equipment indoors, measurement
data of
the robot is processed through the robot tracking method, the IMM-EKF method
and the
weighted IMM-EKF provided by the embodiments of the present invention, to
realize
state estimation of the robot, and the results are as shown in Fig. 8.
[0200] Fig. 8 is a view illustrating the effect comparison of tracking tracks
between the robot
tracking solution provided by the embodiments of the present invention and a
currently
available solution. Fig. 9 is a view illustrating the effect comparison of
tracking errors
between the robot tracking solution provided by the embodiments of the present
invention
and a currently available solution.
[0201] As shown in Fig. 9, to further depict the performances of the different
methods, tracking
errors of the above estimation results are calculated to evaluate the
performances. The
error formula of state estimation at timing t_k is as follows:
[0202] RMSE = √((x̂_k − x_k)^2 + (ŷ_k − y_k)^2);
[0203] where (x̂_k, ŷ_k) represents position coordinates obtained by the target state estimation at
timing t_k, and (x_k, y_k) represents the actual position of the target at timing t_k.
[0204] The following Table 1 shows target average tracking errors of three
methods, as shown
below:
[0205]
Tracking Algorithm              Tracking Error (m)
Dimension-extended IMM-EKF      0.15
Weighted IMM-EKF                0.22
IMM-EKF                         3.9
[0206] Table 1
[0207] As can be seen, the tracking precision of the robot tracking method provided by the embodiments of the present invention is apparently better than that of the IMM-EKF algorithm; moreover, in comparison with the weighted IMM-EKF algorithm, the tracking error is reduced by approximately 50%.
[0208] In summary, in comparison with the prior-art technology, the robot
tracking method, and
corresponding device, equipment and computer-readable storage medium provided
by the
embodiments of the present invention achieve the following advantageous
effects.
[0209] 1. By arranging multiple ultrasonic arrays, observation data is obtained at each timing of tracking the robot, and each step of the iterative process performs measurement and dimension extension through a preset dimension-extended IMM-EKF algorithm on the basis of the IMM-EKF algorithm, so that more target motion state information is obtained; such a solution is applicable to multiple ultrasonic arrays.
[0210] 2. The primary observation data is fully utilized, the fusion effect is
optimized, tracking
precision in tracking a robot indoors is enhanced, the tracking error is
small, and the
computational amount is relatively low, so as to realize stable and effective
tracking of
an intelligent robot also when its state is unknown and variable, and to
reduce the
phenomenon of erroneous tracking or failed tracking.
[0211] 3. The dimension-extended IMM-EKF algorithm is employed to track the intelligent robot, whereby the influence of reverberation and noise on the tracking precision is effectively weakened, the tracking error is rendered apparently lower than that of the traditional IMM-EKF algorithm, and excellent robustness is also exhibited in tracking scenarios where observation data is missing.
[0212] As understandable by persons ordinarily skilled in the art, realization
of the entire or
partial steps of the aforementioned embodiments can be completed by hardware,
or by a
program instructing relevant hardware, the program can be stored in a computer-
readable
storage medium, and the storage medium can be a read-only memory, a magnetic
disk, or
an optical disk, etc.
[0213] The embodiments of the present application are described with reference
to flowcharts
and/or block diagrams of the method, device (system), and computer program
product
embodied in the embodiments of the present application. As should be
understood, each
flow and/or block in the flowcharts and/or block diagrams, and any combination
of flow
and/or block in the flowcharts and/or block diagrams can be realized by
computer
program instructions. These computer program instructions can be supplied to a
general
computer, a dedicated computer, an embedded processor or a processor of any
other
programmable data processing device to form a machine, so that the
instructions executed
by the computer or the processor of any other programmable data processing
device
generate a device for realizing the functions designated in one or more
flow(s) of the
flowcharts and/or one or more block(s) of the block diagrams.
[0214] These computer program instructions can also be stored in a computer-
readable memory
enabling a computer or any other programmable data processing device to
operate by a
specific mode, so that the instructions stored in the computer-readable memory
generate
a product containing instructing means, and this instructing means realizes
the functions
designated in one or more flow(s) of the flowcharts and/or one or more
block(s) of the
block diagrams.
[0215] These computer program instructions can also be loaded onto a computer
or any other
programmable data processing device, so as to execute a series of operations
and steps
on the computer or the any other programmable device to generate computer-
realized
processing, so that the instructions executed on the computer or the any other
programmable device provide steps for realizing the functions designated in
one or more
flow(s) of the flowcharts and/or one or more block(s) of the block diagrams.
[0216] Although preferred embodiments in the embodiments of the present
application have
been described so far, it is still possible for persons skilled in the art to
make additional
modifications and amendments to these embodiments upon learning of the basic
inventive
conception. Accordingly, the attached Claims are meant to cover the preferred
embodiments and all modifications and amendments that fall within the scope of
the
embodiments of the present application.
[0217] Apparently, persons skilled in the art can make various amendments and
modifications to
the present invention without departing from the spirit and scope of the
present invention.
Thus, if such amendments and modifications to the present invention fall
within the
Claims of the present invention and equivalent technology, the present
invention is also
meant to cover these amendments and modifications.
[0218] What is described above is merely directed to preferred embodiments of
the present
invention, and they are not meant to restrict the present invention. Any
amendment,
equivalent replacement and improvement makeable within the spirit and scope of
the
present invention shall all be covered within the protection scope of the
present invention.