Patent 3126592 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 3126592
(54) English Title: METHOD AND SYSTEM FOR HIGH-SPEED DUAL-VIEW BAND-LIMITED ILLUMINATION PROFILOMETRY
(54) French Title: METHODE ET SYSTEME DE PROFILOMETRIE PAR ILLUMINATION HAUTE VITESSE A BANDE LIMITEE AVEC DEUX OBJECTIFS
Status: Examination Requested
Bibliographic Data
(51) International Patent Classification (IPC):
  • G01B 11/245 (2006.01)
  • G01B 11/25 (2006.01)
(72) Inventors :
  • LIANG, JINYANG (Canada)
  • JIANG, CHENG (Canada)
  • KILCULLEN, PATRICK (Canada)
(73) Owners :
  • INSTITUT DE LA RECHERCHE SCIENTIFIQUE (Canada)
(71) Applicants :
  • INSTITUT DE LA RECHERCHE SCIENTIFIQUE (Canada)
(74) Agent: LAVERY, DE BILLY, LLP
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2021-08-02
(41) Open to Public Inspection: 2022-02-03
Examination requested: 2023-12-13
Availability of licence: N/A
(25) Language of filing: French

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
63/060,630 United States of America 2020-08-03

Abstracts

English Abstract


A system and a method for 3D imaging of an object, the method comprising projecting sinusoidal fringe patterns onto the object using a projecting unit and capturing fringe patterns deformed by the object, alternately by at least a first camera and a second camera, and recovering a 3D image of the object pixel by pixel from mutually incomplete images provided by the first camera and the second camera, by locating a point in images of the second camera that matches a selected pixel of the first camera; determining estimated 3D coordinates and wrapped phase based on calibration of the cameras, determining a horizontal coordinate on the plane of a projector of the projecting unit based on calibration of the projector, and using a wrapped phase value to recover a 3D point of 3D coordinates (x, y, z).

Claims

Note: Claims are shown in the official language in which they were submitted.


Claims
1. A system for 3D imaging of an object, the system comprising:
a projection unit; said projection unit comprising a light source and a
projector; and
at least two cameras, said cameras being positioned on a same side of said
projector;
wherein said projection unit projects sinusoidal fringe patterns onto the
object and the cameras
alternatively capture, point by point, fringe patterns deformed by the object,
depth information being encoded into
the phase of the deformed fringe patterns, and the object being recovered by
phase demodulation and
reconstruction.
2. The system of claim 1, wherein said light source is a high-coherent
light source of a
power of at least 50 mW.
3. The system of any one of claims 1 and 2, wherein said cameras have an
imaging speed
of at least 2k frames/second, and image resolution at least 1000x800 pixels.
4. The system of any one of claims 1 to 3, wherein said projection unit
comprises a spatial
light modulator, and said spatial light modulator has a refreshing rate of at
least 4 kHz and on board memory of at
least 1 MB.
5. The system of any one of claims 1 to 4, wherein said projection unit
comprises a spatial
light modulator, and said spatial light modulator is one of: digital
micromirror device, a liquid crystal display and a
binary fringe mask with a motorized translation stage.
6. The system of any one of claims 1 to 5, wherein said projection unit
comprises a pattern
conversion unit and said pattern conversion unit comprises a 4f imaging system
and a low-pass filter.
7. The system of any one of claims 1 to 5, wherein said projection unit
comprises a pattern
conversion unit and said pattern conversion unit comprises a 4f imaging system
and a low-pass filter, a ratio
between focal length of a first lens and a focal length of a second lens of
the 4f imaging system being selected in
a range between 0.75 and 1.5, and a diameter of the low pass filter is
selected in a range between 150 µm and 300 µm.
8. The system of any one of claims 1 to 5, wherein said projection unit
comprises a pattern
conversion unit and said pattern conversion unit comprises a 4f imaging system
and a low-pass filter, said low
pass filter being one of: a pinhole and a slit.
9. The system of any one of claims 1 to 5, wherein said projection unit
comprises a pattern
conversion unit and said pattern conversion unit comprises a 4f imaging system
and a low-pass filter, said low-pass filter being a pinhole, said pinhole having a minimal diameter determined by: D = λf1/pf, where pf is a period of the fringes, λ is a wavelength of the light source, and f1 is a focal length of a first lens of the 4f imaging system.
10. The system of any one of claims 1 to 9, wherein said projector is
selected with a focal
length in a range between 18 and 55 mm, a F number in a range between 3.5 and
5.6, and a magnification ratio
in a range between 5 and 10 times.
11. A method for 3D imaging of an object, comprising:
projecting sinusoidal fringe patterns onto the object using a projecting unit
and capturing fringe
patterns deformed by the object, alternatively by at least a first camera and
a second camera, and
recovering a 3D image of the object pixel by pixel from mutually incomplete
images provided by
the first camera and the second camera, by locating a point in images of the
second camera that matches a
selected pixel of the first camera; determining estimated 3D coordinates and
wrapped phase based on calibration
of the cameras, determining an horizontal coordinate on the plane of a
projector of the projecting unit based on
calibration of the projector, and using a wrapped phase value to recover a 3D
point of 3D coordinates (x, y, z).
12. The method of claim 11, comprising calibrating the first and second
cameras and the
projecting optics, and recovering 3D information using a coordinate-based
method, wherein, to the point on the
object with the 3D coordinates (x, y, z) correspond two independent
coordinates, (u, v) for the cameras and
(u", v") for the projecting optics.
13. The method of claim 11, comprising calibrating the first and second
cameras and the
projecting optics, using images captured by the first camera to provide camera
coordinates of a point on the object,
and determining an epipolar line on the second camera, recovering an
horizontal coordinate in images captured
by the second camera along the epipolar line; by substituting the projector in
place of the second camera, using
intensity values of the camera coordinates of the point of the second camera
across a sequence of images to
recover information about a coordinate of the projector; and extracting 3D
information of the object pixel by pixel
based on interlaced image acquisition by incorporating the camera coordinates
(u,v) of the point on the 3D object
and its corresponding projector coordinates.
14. The method of claim 11, comprising:
obtaining a threshold intensity from a selected background region in images
captured by the
first camera, using the threshold intensity to eliminate pixels in the images
captured by the first camera to obtain
a binary quality map of the first camera, and selecting a pixel of an
estimated corresponding pixel in images
captured by the second camera within the quality map of the first camera;
the selected pixel of the first camera determining an epipolar line containing
a matching point
within the images of the second camera, extracting candidates for the matching
point in the second camera images
that satisfy an intensity matching condition.
15. The method of claim 11, comprising:
obtaining a threshold intensity from a selected background region in images
captured by the
first camera, using the threshold intensity to eliminate pixels in the images
captured by the first camera to obtain
a binary quality map of the first camera, and selecting a pixel in images of
the second camera within the quality
map of the first camera as a candidate;
the selected pixel of the first camera determining an epipolar line containing
the matching point
within the images of the second camera, extracting candidates for the matching
point in the second camera images
that satisfy an intensity matching condition;
determining penalty scores for each obtained candidate; and
determining final 3D coordinates, by unwrapping the phase of the candidates
and obtaining a
coordinate on the plane of the projector.
16. The method of claim 11, comprising:
obtaining a threshold intensity from a selected background region in images
captured by the
first camera, using the threshold intensity to eliminate pixels in the images
captured by the first camera to obtain
a binary quality map of the first camera, and selecting a pixel in images of
the second camera within the quality
map of the first camera as a candidate;
the selected pixel of the first camera determining an epipolar line containing
the matching point
within the images of the second camera, extracting candidates for the matching
point in the second camera images
that satisfy an intensity matching condition and quality map constraint,
transformation constraint and phase sign
constraint; the quality map constraint requiring that the candidates for the
matching point in the second camera
images fall within the quality map of the second camera; the transformation
constraint requiring that the candidates
occur within a segment of the epipolar line; and the phase sign constraint
requiring that the selected point of the
first camera and the candidates have a same sign of respective wrapped phases;
determining penalty scores for each obtained candidate; and
determining final 3D coordinates, by unwrapping the phase of the candidates
and obtaining a
coordinate on the plane of the projector.
17. The method of claim 11, comprising positioning the cameras on a same
side of the
projector.
18. The method of claim 11, wherein the projection unit comprises a light
source, a spatial
light modulator, a pattern conversion unit, the spatial light modulator
generating sinusoidal fringes from beams
from the light source using pre-defined binary patterns, and the pattern
conversion unit converting the binary
patterns to grayscale fringes; the cameras capturing alternatively deformed
structured images; depth information
being encoded into the phase of the deformed images, and the object being
recovered by phase demodulation
and reconstruction.
19. The method of claim 11, wherein the projection unit comprises a light
source, a spatial
light modulator and a pattern conversion unit, the spatial light modulator
generating sinusoidal fringes from beams
from the light source using pre-defined binary patterns, and the pattern
conversion unit converting the binary
patterns to grayscale fringes; the cameras capturing alternatively deformed
structured images; depth information
being encoded into the phase of the deformed images, and the object being
recovered by phase demodulation
and reconstruction, the light source being a high-coherent light source of a
power of at least 50 mW, the cameras
having an imaging speed of at least 2k frames/second and image resolution at
least 1000x800 pixels, the spatial
light modulator having a refreshing rate of at least 4 kHz and on board memory
of at least 1 Mb, and the projector
being selected with a focal length in a range between 18 and 55 mm, a F number
in a range between 3.5 and 5.6,
and a magnification ratio in a range between 5 and 10 times.
20. The method of claim 11, wherein the projection unit comprises a light
source, a spatial
light modulator and a pattern conversion unit, the spatial light modulator
generating sinusoidal fringes from beams
from the light source using pre-defined binary patterns, and the pattern
conversion unit converting the binary
patterns to grayscale fringes; the cameras capturing alternatively deformed
structured images; depth information
being encoded into the phase of the deformed images, and the object being
recovered by phase demodulation
and reconstruction, the light modulator being one of: a digital micromirror
device, a liquid crystal display and a
binary fringe mask with a motorized translation stage, and the pattern
conversion unit comprising a 4f imaging
system and a low-pass filter, with a ratio between a focal length of a first
lens and a focal length of a second lens
of the 4f imaging system comprised in a range between 0.75 and 1.5, and a
diameter of the low pass filter being
in a range between 150 µm and 300 µm.

Description

Note: Descriptions are shown in the official language in which they were submitted.


TITLE OF THE INVENTION
Method and system for high-speed dual-view band-limited illumination
profilometry
FIELD OF THE INVENTION
[0001] The present invention relates to three-dimensional imaging. More
specifically, it is concerned with a
system and a method for high-speed dual-view band-limited illumination
profilometry.
BACKGROUND OF THE INVENTION
[0002] Three-dimensional (3D) surface imaging has been extensively applied
in a number of fields in
industry, entertainment, and biomedicine. Among developed methods, structured-
light profilometry has gained
increasing popularity in measuring dynamic 3D objects due to high measurement
accuracy and high imaging
speeds. Phase-shifting fringe projection profilometry (PSFPP) for instance
uses a set of sinusoidal fringe patterns
as the basis for coordinate encoding, and, in contrast to other methods such
as binary pattern projection for
example, the pixel-level information carried by the phase of the fringe
patterns is insensitive to variations in
reflectivity across the object's surface, which results in
high accuracy in 3D measurements. The
sinusoidal fringes are typically generated using digital micromirror devices
(DMDs). Each micromirror on the digital
micromirror devices can be independently tilted to either +12° or −12° from the normal to its surface to generate
binary patterns at up to tens of kilohertz. Although they are binary amplitude
spatial light modulators, it was shown
that digital micromirror devices can be used to generate grayscale fringe
patterns at high speeds. The average
reflectance rate of each micromirror can be controlled by a conventional dithering method to form a grayscale image. However, the projection rate of fringe patterns is limited to hundreds of
hertz. To improve the projection speed,
binary defocusing methods have been developed to produce a quasi-sinusoidal
pattern by slightly defocusing a
single binary digital micromirror device pattern. Nonetheless, the image is
generated at a plane unconjugated to
the digital micromirror device, which compromises the depth-sensing range and
is less convenient to operate with
fringe patterns of different frequencies. Recently, band-limited illumination
was developed to control the system
bandwidth by placing a pinhole low-pass filter at the Fourier plane of a 4f
imaging system. Both the binary
defocusing method and the band-limited illumination scheme allow generating
one grayscale sinusoidal fringe
pattern from a single binary digital micromirror device pattern. Thus, the
fringe projection speed matches the
refreshing rate of the digital micromirror device.
[0003] High-speed image acquisition is indispensable to digital micromirror
device-based phase-shifting
fringe projection profilometry. In standard phase-shifting fringe projection
profilometry methods, extra calibration
patterns must be used to avoid phase ambiguity, which reduces the overall 3D
imaging speed. A solution to this
problem is to use multiple cameras to simultaneously capture the full sequence
of fringe patterns. The enriched
observation of the 3D object eliminates the necessity of calibration patterns
in data acquisition and phase
unwrapping. This advancement, along with the incessantly increasing imaging
speeds of cameras, has endowed
multi-view phase-shifting fringe projection profilometry systems with image
acquisition rates that keep up with the
refreshing rates of digital micromirror devices.
[0004]
Current multi-view phase-shifting fringe projection profilometry systems are
still limited, mainly in two
aspects. First, each camera must capture the full sequence of fringe patterns.
This requirement imposes
redundancy in data acquisition, which ultimately clamps the imaging speeds of
systems. Given the finite readout
rates of camera sensors, a sacrifice of the field of view (FOV) is inevitable
for higher imaging speeds. Advanced
signal processing approaches, such as image interpolation and compressed
sensing, applied to mitigate this trade-
off typically involve high computational complexity and reduced image quality.
Second, generally the cameras are
placed on different sides of the projector, and this arrangement may induce a
large intensity difference from the
directional scattering light and the shadow effect from the occlusion by local
surface features, both of which reduce
the reconstruction accuracy and exclude application to non-Lambertian surfaces.
[0005]
There is a need in the art for a method and a system for high-speed dual-view
band-limited
illumination profilometry.
SUMMARY OF THE INVENTION
[0006]
More specifically, in accordance with the present invention, there is provided
a system for 3D
imaging of an object, the system comprising a projection unit; and at least
two cameras; the projection unit
comprising a light source and a projector; the cameras being positioned on a
same side of the projector; wherein
the projection unit projects sinusoidal fringe patterns onto the object and
the cameras alternatively capture, point
by point, fringe patterns deformed by the object, depth information being
encoded into the phase of the deformed
fringe patterns, and the object being recovered by phase demodulation and
reconstruction.
[0007] There is further provided a method for 3D imaging of an object,
comprising projecting sinusoidal fringe
patterns onto the object using a projecting unit and capturing fringe patterns
deformed by the object, alternatively
by at least a first camera and a second camera, and recovering a 3D image of
the object pixel by pixel from mutually
incomplete images provided by the first camera and the second camera, by
locating a point in images of the second
camera that matches a selected pixel of the first camera; determining
estimated 3D coordinates and wrapped phase
based on calibration of the cameras, determining a horizontal coordinate on
the plane of a projector of the
projecting unit based on calibration of the projector, and using a wrapped
phase value to recover a 3D point of 3D
coordinates (x, y, z).
[0008] Other objects, advantages and features of the present invention will
become more apparent upon reading
of the following non-restrictive description of specific embodiments thereof,
given by way of example only with
reference to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] In the appended drawings:
[0010] FIG. 1A is a schematic view of a system according to an embodiment
of an aspect of the present
invention;
[0011] FIG. 1B is a timing diagram and acquisition sequence with exposure
time te of the camera of the
system of FIG. 1A;
[0012] FIG. 2 is a flow chart of a coordinate-based 3D point determination
method with illustrative data
according to an embodiment of an aspect of the present invention;
[0013] FIG. 3A shows 3D images of the planar surfaces (top images) and
measured depth difference (bottom
images) at different exposure times, the boxes representing the selected
regions for analysis;
[0014] FIG. 3B shows depth resolution versus exposure time;
[0015] FIG. 4A shows reconstructed results of letter toys, with two
perspective views (top row); selected
depth profiles marked by dashed lines (bottom row) and close-up views;
[0016] FIG. 4B shows two perspective views of the reconstruction results of
three toy cubes;
[0017] FIG. 5A shows reconstructed 3D images of a moving hand at five time
points;
[0018] FIG. 5B shows movement traces of four fingertips, marked in the
first panel in FIG. 5A;
[0019] FIG. 5C shows a front view of the reconstructed 3D image of bouncing balls at different time points;
[0020] FIG. 5D shows the evolution of 3D positions of the bouncing balls,
marked in the third panel in FIG.
5C;
[0021] FIG. 6A is a schematic view of an experimental setup; the field of
view being marked by the dashed
box;
[0022] FIG. 6B shows four reconstructed 3D images of the cup of FIG. 6A
driven by a 500-Hz sound signal;
[0023] FIG. 6C shows evolution of the depth change of five points marked in
FIG. 6B with the fitted result;
[0024] FIG. 6D shows evolution of the averaged depth change with the fitted
results under driving
frequencies of 490 Hz, 500 Hz, and 510 Hz; error bar: standard deviation of Δz
calculated from the five selected
pixels;
[0025] FIG. 6E shows the response of the depth displacements to sound
frequencies, the curve being a
fitted result of a Lorentz function;
[0026] FIG. 7A shows six reconstructed 3D images showing a glass cup broken
by a hammer;
[0027] FIG. 7B shows the evolution of 3D velocities of four selected
fragments of the broken cup of FIG. 7A
marked in the fourth and fifth panels in FIG. 7A; and
[0028] FIG. 7C shows the evolution of the corresponding 3D accelerations of
the four selected fragments.
DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
[0029] The present invention is illustrated in further details by the
following non-limiting examples.
[0030] A system for band-limited illumination profilometry (BLIP) with
temporally interlaced acquisition (TIA)
according to an embodiment of an aspect of the present invention generally
comprises a projection unit to project
pre-defined fringe patterns onto the surface of the measured object, the
fringe patterns being distorted and reflected
by the object surface, point by point, and cameras capturing the distorted
fringe images, point by point.
[0031] In FIG. 1, the projecting unit comprises a high-coherent light
source 10, a spatial light modulator 18,
a pattern conversion unit 21, and projecting optics 30.
[0032] After expansion and collimation by a beam expander 12, the laser
beam from a 200-mW continuous-wave laser source 10, of wavelength λ = 671 nm (MRL-III-671, CNI Lasers), is directed by mirrors 14 and 16 to a 0.45" digital micromirror device 18 (AJD-4500, Ajile Light Industries) at an incident angle of about 24° to the normal of the surface of the digital micromirror device 18, for sinusoidal fringe generation using four phase-shifting
binary patterns, generated by an error diffusion algorithm from their
corresponding grayscale sinusoidal patterns,
loaded onto the digital micromirror device 18. The pattern conversion unit 21,
comprising a 4f imaging system 25
with a low pass filter 24 such as a pinhole, converts the binary patterns to
grayscale fringes at the intermediate
image plane 28.
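As an illustration of the pattern preparation described above, the following sketch generates a grayscale sinusoidal fringe and converts it to a binary DMD pattern by error diffusion. The patent only states that "an error diffusion algorithm" is used, so the classic Floyd–Steinberg weights are assumed here; the pattern size and fringe period are illustrative placeholders, not values from the patent.

```python
import numpy as np

def sinusoidal_fringe(height, width, period_px, phase_shift):
    """Grayscale sinusoidal fringe pattern in [0, 1], fringes along the x axis."""
    u = np.arange(width)
    row = 0.5 + 0.5 * np.cos(2 * np.pi * u / period_px - phase_shift)
    return np.tile(row, (height, 1))

def floyd_steinberg_dither(gray):
    """Convert a grayscale image in [0, 1] to a binary (0/1) DMD pattern by
    diffusing the quantization error to the neighbouring pixels."""
    img = gray.astype(np.float64).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            new = 1.0 if img[y, x] >= 0.5 else 0.0
            out[y, x] = int(new)
            err = img[y, x] - new
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out

# Four phase-shifting binary patterns (shifts of k*pi/2); size and period are illustrative only
patterns = [floyd_steinberg_dither(
                sinusoidal_fringe(912, 1140, period_px=36, phase_shift=k * np.pi / 2))
            for k in range(4)]
```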
[0033] The minimal pinhole diameter D for all spatial frequency content of the sinusoidal fringe pattern to pass through the system is determined by the system bandwidth as follows:
D = λf1 / pf,   (1)
[0034] where pf = 324 µm is the period of the fringes composed by the digital micromirror device pixels and f1 is the focal length of lens 22. With lenses 22 and 26 of the 4f imaging system 25 having focal lengths f1 = 120 mm and f2 = 175 mm respectively, the minimal pinhole diameter is D = 248.52 µm. In an experiment, a 300 µm-diameter pinhole was selected.
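A quick numerical check of Relation (1) with the values quoted above (a sketch; the wavelength λ = 671 nm comes from the laser described earlier):

```python
# Minimal pinhole diameter from Relation (1): D = lambda * f1 / pf
wavelength = 671e-9   # laser wavelength (m)
f1 = 120e-3           # focal length of the first 4f lens (m)
pf = 324e-6           # fringe period at the DMD plane (m)

D = wavelength * f1 / pf
print(f"Minimal pinhole diameter: {D * 1e6:.2f} um")
# ~248.52 um, so the selected 300 um pinhole passes all fringe content
```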
[0035] A projector lens 30 (AF-P DX NIKKOR, Nikon) projects the output
fringe patterns on a 3D object 32.
[0036] Deformed structured images are captured alternately by two high-
speed CMOS cameras 34, 36
(CP70-1HS-M-1900, Optronis) placed side by side, i.e. on a same side of the
projector. Depending on their roles in
image reconstruction, the cameras are referred to as the main camera 34 and
the auxiliary camera 36 respectively,
as will be described hereinbelow. Synchronized by the trigger signal of the
digital micromirror device, each camera
captures half of the sequence (FIG. 1B). The acquired images from each camera
are transferred to a computer 38
via a CoaXPress cable connected to a frame grabber (Cyton-CXP, Bitflow).
[0037] The light source 10 is a high coherent light source of power at
least 50 mW, selected depending on
the sensitivity of cameras 34 and 36, with a laser wavelength comprised in the range between about 380 nm and about 750 nm in the case of visible-light cameras, and in the range between about 800 nm and about 1100 nm in the case of near-infrared (NIR) cameras.
[0038] The high-speed cameras 34, 36 may be cameras with global shutter, of
imaging speed of at least
about 2k frames/second, with image resolution at least about 1000x800 pixels.
[0039] The spatial light modulator 18 has a refreshing rate of at least
about 4 kHz, on board memory of at
least about 1 Mb, and is selected to work at the corresponding wavelength of
the light source. It may be a liquid
crystal display or a binary fringe mask with a motorized translation stage for
example.
[0040] The pattern conversion unit 21 may comprise a 4f imaging system 25
with lenses of different focal
lengths and the low-pass filter 24 may be a slit. The focal lengths of the two
lenses are selected with a ratio (focal length of the first lens / focal length of the second lens) comprised in the range between about 0.75 and about 1.5. The diameter of the low pass filter is selected in the range between about 150 µm and about 300 µm.
[0041] The projecting optics 30 is selected with a focal length in the
range between about 18 and about 55
mm, an F number in the range between about 3.5 and about 5.6, and a magnification ratio in a range between about 5 and about 10 times.
[0042] The imaging speed and field of view may be further improved by using
more than two cameras, in
such a way as to separate the workload across an array of cameras, for example to trace and recognize hand gestures in
3D space to provide information for human-computer interaction.
[0043] The system thus projects sinusoidal fringe patterns onto the object
and captures the corresponding
deformed patterns modulated by the object surfaces. The depth information is
encoded into the phase of the
distorted fringe images. For phase demodulation and reconstruction of the 3D
object, the retrieved phase
distribution corresponding to the object height is mathematically wrapped to principal values of the arctangent function, ranging between −π and π; consequently, phase discontinuities occur at the limits every time the unknown true phase changes by 2π, which is referred to as the phase ambiguity problem, resulting from the periodic nature of the sinusoidal signal. A unique pixel correspondence
between the cameras and the projector
is obtained by phase unwrapping.
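For orientation, the following sketch shows generic four-step phase-shifting demodulation, in which the wrapped phase is the principal value of an arctangent in (−π, π]. It is not the interlaced dual-camera variant detailed below, where the four intensities are split between the main and auxiliary cameras.

```python
import numpy as np

def wrapped_phase_four_step(I0, I1, I2, I3):
    """Generic four-step phase-shifting demodulation.

    Assumes I_k = Ib + Iv * cos(phi - k*pi/2), k = 0..3, all observed at the
    same pixel.  Returns the wrapped phase in (-pi, pi]; the 2*pi ambiguity
    (phase wrapping) must still be resolved by phase unwrapping.
    """
    return np.arctan2(I1 - I3, I0 - I2)
```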
[0044] According to an aspect of the present disclosure, a method to
recover the 3D image of the object
pixel by pixel from the mutually incomplete images provided by the cameras
generally comprises locating a point
(ua', va') in the images of the auxiliary camera 36 that matches a selected
pixel (um, vm) of the main camera 34;
determining estimated 3D coordinates and wrapped phase from knowledge of the
cameras calibration, determining
the horizontal coordinate on the plane of the projector from knowledge of the
projector calibration, and using the
wrapped phase value to recover the 3D point of 3D coordinates (x,y,z) with the
coordinate-based method.
[0045] System calibration
[0046] To recover the object's 3D information, the method relies on a
coordinate-based understanding of
the spatial relationship between the projector 30 and the cameras 34, 36 in
image formation. The projection of the
3D coordinates (x,y,z) of the 3D point onto the camera coordinates (u, y) is
described in a pinhole model using
extrinsic parameters R and T describing the rotation and translation of
coordinates, respectively, and intrinsic
parameters characterizing the properties of the cameras in image formation,
with fu and fõ the effective focal
lengths along each axis of the sensor of the cameras; upp and vpp the
coordinates of the principal point of the
cameras; and a accounting for pixel skewness, as follows:
U
S12i =
1
x (2)
[fou a uppl _
[f i , vpp [ R T]4
0 0 1 1
[0047] Column vectors [u, y, 1]T and [x,y,z,1]T represent the camera
coordinates (u, v) and the 3D
coordinates (x, y, z) in homogeneous coordinates, which allow for the
numerical extraction of the camera
coordinates (u, y) from Relation (2) through a scalar factor s.
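A small sketch of Relation (2) as it might be used in code, forming the 3×4 projection matrix from intrinsic and extrinsic parameters and projecting a 3D point to pixel coordinates. All numerical values below are placeholders for illustration, not calibration values from the patent.

```python
import numpy as np

def projection_matrix(fu, fv, upp, vpp, alpha, R, T):
    """3x4 projection matrix P = K [R | T] from Relation (2)."""
    K = np.array([[fu, alpha, upp],
                  [0.0, fv, vpp],
                  [0.0, 0.0, 1.0]])
    return K @ np.hstack([R, T.reshape(3, 1)])

def project(P, X):
    """Project a 3D point X = (x, y, z) to camera coordinates (u, v)."""
    uvw = P @ np.append(X, 1.0)   # s * [u, v, 1]^T
    return uvw[:2] / uvw[2]       # divide out the scalar factor s

# Placeholder calibration values for illustration only
P_main = projection_matrix(fu=2400.0, fv=2400.0, upp=590.0, vpp=430.0, alpha=0.0,
                           R=np.eye(3), T=np.zeros(3))
print(project(P_main, np.array([0.05, -0.02, 0.60])))
```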
[0048] The cameras and the projector are calibrated to determine the values of
the extrinsic and intrinsic
parameters using a checkerboard. Since direct image acquisition is not possible for a projector, projector-centered images of the calibration object, obtained by the phase-based mapping method, are sent to a calibration toolbox and processed in the same manner as for the cameras.
[0049] Coordinate-based 3D point determination
[0050] 3D information is recovered from the calibrated imaging system using
a coordinate-based method.
To a point on the 3D object with the 3D coordinates (x, y, z) correspond two
independent coordinates, (u, v) for the cameras and (u″, v″) for the projector.
[0051] In a calibrated phase-shifting fringe projection profilometry
system, any three of these coordinates
[IL, y, u", v"} can be determined and a linear system of the form E = M [x, y,
z]T is derived. The elements of
E and M are obtained by using the calibration parameters of each device, the
scalar factors and the three
determined coordinates among u, y, u" and y". Thus, 3D information of an
object point can be extracted via
matrix inversion.
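One way to set up and solve the linear system E = M [x, y, z]^T described above, assuming the camera supplies (u, v) and the projector supplies the horizontal coordinate u″; this is a sketch, where P_cam and P_proj are 3×4 projection matrices such as those built in the previous snippet.

```python
import numpy as np

def triangulate(P_cam, u, v, P_proj, u_pp):
    """Recover (x, y, z) from camera pixel (u, v) and projector column u''.

    Each known coordinate c observed through a 3x4 matrix P gives one linear
    equation (P_row - c * P_3) . [x, y, z, 1]^T = 0; three such equations form
    M [x, y, z]^T = E, solved here by least squares.
    """
    rows = [P_cam[0] - u * P_cam[2],
            P_cam[1] - v * P_cam[2],
            P_proj[0] - u_pp * P_proj[2]]
    A = np.array(rows)
    M, E = A[:, :3], -A[:, 3]
    X, *_ = np.linalg.lstsq(M, E, rcond=None)
    return X  # (x, y, z)
```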
[0052] Returning to the system discussed in relation to FIG. 1A, first,
images from the calibrated main
camera 34 are used to provide the camera coordinates (u, v) of a point on the
3D object. Along with the calibration
parameters of the system, an epipolar line is determined on the calibrated
auxiliary camera 36. The horizontal
coordinate in the images of the auxiliary camera 36 is recovered using search-
based algorithms along the epipolar
line, in stereo vision. Second, by substituting the calibrated projector in
place of the auxiliary camera 36, the intensity
values of the pixel (u, v) of the auxiliary camera 36 across a sequence of images are used by structured light methods to recover information about a coordinate of the calibrated projector. The object's 3D information is extracted pixel by pixel based on interlaced image acquisition by incorporating the camera coordinates (u, v) of a point on the 3D object and its corresponding projector coordinates, using a
triangulation method to solve the point
in 3D space.
[0053] Data acquisition
[0054] For data acquisition, four fringe patterns with phases equally shifted by π/2 illuminate the 3D object. The intensity value Ik(u, v) for the pixel (u, v) in the kth image acquired by the calibrated main camera 34 is obtained as follows:
Ik(u, v) = Ib(u, v) + Iv(u, v) cos[φ(u, v) − kπ/2],   (3)
[0055] where k ∈ [0, 3], Ib(u, v) is the background intensity, Iv(u, v) is the variation of intensity and φ(u, v) is the depth-dependent phase.
[0056] Relation (3) allows analyzing two types of intensity matching conditions for the order of pattern projection shown in FIG. 1B. For a pixel (ua', va') in the images of the auxiliary camera 36 that perfectly corresponds with the coordinates of a selected pixel (um, vm) in the images of the main camera 34, Relation (3) yields:
I0(um, vm) + I2(ua', va') = I1(um, vm) + I3(ua', va').   (4)
[0057] Rearrangement of Relation (4) leads to the equivalent relation, selected as the intensity matching condition:
I0(um, vm) − I1(um, vm) = I3(ua', va') − I2(ua', va').   (5)
[0058] Each side of Relation (5) contains images captured by the same
camera and represents a residual fringe component of sinusoidal characteristics, which allows increasing the efficiency of line-constrained searches
by regularizing local maxima and minima in the patterns and by including
additional phase information. Moreover,
by considering the right-hand side as a continuously varying function along
the epipolar line determined on the
calibrated auxiliary camera 36, Relation (5) and bi-linear interpolation
allows for the selection of discrete candidates
with sub-pixel accuracy.
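A sketch of the candidate search implied by Relation (5): ΔIm = I0 − I1 at the selected main-camera pixel is compared with the profile of I3 − I2 sampled by bilinear interpolation along the epipolar line in the auxiliary images, and sub-pixel sign changes of the difference are taken as candidates. The helper names, the supplied list of epipolar-line samples, and the omission of exact-zero and boundary handling are illustrative choices, not from the patent.

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of img at non-integer column x, row y (no bounds checks)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0] + dx * (1 - dy) * img[y0, x0 + 1]
            + (1 - dx) * dy * img[y0 + 1, x0] + dx * dy * img[y0 + 1, x0 + 1])

def intensity_matching_candidates(I0, I1, I2, I3, um, vm, epiline_points):
    """Sub-pixel x-positions along the epipolar line where I3 - I2 (auxiliary
    camera) equals I0 - I1 (main camera), i.e. Relation (5) is satisfied."""
    delta_m = I0[vm, um] - I1[vm, um]
    profile = np.array([bilinear(I3, x, y) - bilinear(I2, x, y)
                        for x, y in epiline_points])
    residual = profile - delta_m
    candidates = []
    for i in range(len(residual) - 1):
        if residual[i] * residual[i + 1] < 0:
            # linear interpolation of the zero crossing between samples i and i+1
            t = residual[i] / (residual[i] - residual[i + 1])
            x = epiline_points[i][0] + t * (epiline_points[i + 1][0] - epiline_points[i][0])
            candidates.append(x)
    return candidates
```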
[0059] FIG. 2 is a flowchart of a method for coordinate-based 3D point determination according to an aspect of the present disclosure, with coordinates of the point to be matched for the main camera (um, vm); coordinates of the estimated corresponding point for the auxiliary camera (ue', ve'); recovered 3D coordinates (x, y, z); horizontal distance between the candidates and the estimated corresponding point ri; phase value of the selected point in the main camera obtained by the Fourier transform profilometry method ωm; phase value of the candidate points in the auxiliary camera obtained by the Fourier transform profilometry method ωai'; phase value obtained by the phase-shifting method φ; phase value determined on the projector's plane φpi″; 3D points determined by candidates Pi; principal point of the main camera Pm; principal point of the auxiliary camera Pa; ΔIm = I0(um, vm) − I1(um, vm); and intensity profile of I3 − I2 along the epipolar line ΔIep.
[0060] In a quality map determination step (see Step I in FIG. 2), (I0 + I1)/2 and (I2 + I3)/2 are obtained from the images captured by the main camera and the auxiliary camera, respectively. Then, a threshold intensity, obtained from a selected background region, is used to eliminate pixels with low intensities and obtain a binary quality map. Subsequently, after such thresholding of the intensity map, only pixels (um, vm) that fall within the quality map of the main camera are considered for 3D information recovery.
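A minimal sketch of the quality-map step: the per-camera mean intensity is thresholded against a level taken from a user-selected background region. The background-region coordinates and the safety margin below are placeholders, not values from the patent.

```python
import numpy as np

def quality_map(frame_a, frame_b, background_slice, margin=1.2):
    """Binary quality map for one camera from its two phase-shifted frames.

    frame_a, frame_b : the two frames captured by that camera (e.g. I0, I1 for
    the main camera or I2, I3 for the auxiliary camera).
    background_slice : numpy index selecting a background-only region.
    margin : multiplicative safety factor on the background level (assumed).
    """
    mean_img = 0.5 * (frame_a + frame_b)
    threshold = margin * mean_img[background_slice].mean()
    return mean_img > threshold

# Example: quality map of the main camera, background taken from the top-left corner
# qmap_main = quality_map(I0, I1, (slice(0, 50), slice(0, 50)))
```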
[0061]
In a candidate discovery step (see Step II in FIG. 2), the selected pixel (um,
vm) of the main camera
determines an epipolar line containing the matching point within the images of
the auxiliary camera. Then, the
candidates (uai', vai') for the matching point in the auxiliary camera images, the subscript "i" denoting the ith candidate, are extracted that satisfy the intensity matching condition determined by
Relation (5) above in addition to a
quality map constraint, a transformation constraint and a phase sign
constraint.
[0062]
The quality map constraint requires that the candidates (uai', vai') for the
matching point in the
auxiliary images fall within the quality map of the auxiliary camera.
[0063]
The transformation constraint requires that candidates occur within a segment
of the epipolar line
determined by a fixed two-dimensional projective transformation or homography
that approximates the location of
the matching point (ue' , ve') within the images of the auxiliary camera as
follows:
s'[ue', ve', 1]^T = H [um, vm, 1]^T,   (6)
[0064] where s' is a scalar factor representing extraction of the pair of coordinates of the estimated corresponding point (ue', ve') from its homogeneous coordinates [ue', ve', 1]^T. H is obtained by applying Relation (6) to four points chosen as the corners of a flat rectangular plane when imaged by both cameras at the approximate center of the measurement volume. [um, vm, 1]^T are the homogeneous coordinates of the selected pixel (um, vm) of the main camera. Once the coordinates of the estimated corresponding point (ue', ve') are determined, the search along the epipolar line is confined to the segment occurring over the horizontal interval [ue' − r0, ue' + r0], where r0 is an experiment-dependent constant. In general, r0 is selected as small as possible while still covering the targeted depth range. For the presently described experiments, the value of r0 was set to 40 pixels.
[0065]
The phase sign constraint requires that the selected point (um, vm) of the
main camera and
candidates (uai', vai') have the same sign of their wrapped phases ωm and ωai', respectively. Estimates of the wrapped phases are obtained using Fourier transform profilometry. In particular, the intensity If(um, vm) of the selected pixel (um, vm) of the main camera in the filtered image is obtained by band-pass filtering the left-hand side of Relation (5), I0 − I1, as follows:
If(um, vm) = (√2/2) Iv(um, vm) exp{j[φ(um, vm) + π/4]},   (7)
[0066] The wrapped phase estimation ωm of the selected point (um, vm) is obtained as follows:
ωm = tan⁻¹{ Im[If(um, vm)] / Re[If(um, vm)] },   (8)
[0067] where Im[·] and Re[·] denote the imaginary and real part of a complex variable, respectively. The same band-pass filtering applied to the right-hand side of Relation (5), I3 − I2, yields a filtered intensity If'(uai', vai') and the estimate of the wrapped phase ωai' of the candidate (uai', vai'), as follows:
ωai' = tan⁻¹{ Im[If'(uai', vai')] / Re[If'(uai', vai')] }.   (9)
[0068] The phase sign constraint requires that the wrapped phase estimation
ωm of the selected point (um, vm) and the wrapped phase estimation ωai' of the candidate (uai', vai') have the same sign in the interval (−π, π].
[0069] Other Fourier transform profilometry methods may be used for wrapped phase value extraction.
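A sketch of the Fourier-transform-profilometry estimate used in Relations (7)–(9): the residual fringe image (I0 − I1 for the main camera, or I3 − I2 for the auxiliary one) is band-pass filtered around its carrier frequency in the Fourier domain, and the wrapped phase is read from the angle of the complex result. The Gaussian pass band and its width are assumptions, not stated in the patent.

```python
import numpy as np

def ftp_wrapped_phase(residual, carrier_fx, sigma=0.02):
    """Wrapped-phase estimate of a residual fringe image by Fourier transform
    profilometry.

    residual   : 2D array, e.g. I0 - I1 or I3 - I2.
    carrier_fx : fringe carrier frequency along x, in cycles per pixel.
    sigma      : width of the (assumed) Gaussian pass band, cycles per pixel.
    """
    F = np.fft.fft2(residual)
    fx = np.fft.fftfreq(residual.shape[1])
    fy = np.fft.fftfreq(residual.shape[0])
    FX, FY = np.meshgrid(fx, fy)
    # keep only the +carrier lobe; the result is a complex "analytic" fringe image
    mask = np.exp(-((FX - carrier_fx) ** 2 + FY ** 2) / (2 * sigma ** 2))
    analytic = np.fft.ifft2(F * mask)
    # wrapped phase in (-pi, pi]; includes the pi/4 offset of Relation (7),
    # which cancels when comparing main and auxiliary estimates
    return np.angle(analytic)
```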
[0070] The output of the candidate discovery step is a pool of candidates
for further evaluation and the
method proceeds to matching point selection. If no candidate is found, the
candidate discovery step is re-initiated
for the next pixel in the main camera, until a candidate is obtained, and the
method proceeds to the matching point
selection.
[0071] In the matching point selection step (see Step III in FIG. 2),
penalty scores for each candidate
obtained from the candidate discovery step are determined. A first and primary
criterion compares the phase values
of the candidates using two methods. First, the phase of the candidate is
obtained from the intensities of the
candidate (uai', vai') and of the pixel (um, vm) of the selected point as
follows:
φai' = tan⁻¹{ [I1(um, vm) − I3(uai', vai')] / [I0(um, vm) − I2(uai', vai')] }.   (10)
[0072] Meanwhile, for each candidate (uai', vai'), the coordinate triple (um, vm, uai') and knowledge of the camera calibration allow determining an estimated 3D point Pi by using the stereo vision method. In addition, with the knowledge of the projector calibration, a point with coordinates (upi″, vpi″) on the plane of the projector is determined for each candidate. Then, an unwrapped phase value φpi″ is obtained by:
φpi″ = (2π/p) (upi″ − u0″),   (11)
[0073] where u0″ is a horizontal datum coordinate on the plane of the projector associated with the zero phase, and p is the fringe period in units of projector pixels. Since these independently obtained phase values must agree if the candidate correctly matches (um, vm), a penalty score Ai, as a normalized difference of these two phase values, is obtained as follows:
Ai = |R(φai' − φpi″)| / π,   (12)
[0074] where the rewrapping function R(·) computes the rewrapped difference between the wrapped and unwrapped phase values.
[0075] To improve the robustness of the method, two additional criteria are
implemented using data available
from the candidate discovery step. Bi is a normalized distance score favoring candidates located closer to the estimated matching point (ue', ve'), which is obtained by:
Bi = |ue' − uai'| / r0.   (13)
[0076] Moreover, Ci is a normalized difference of wrapped phase values obtained by using the wrapped phases ωm and ωai', as follows:
Ci = |R(ωm − ωai')| / π.   (14)
[0077] A total penalty score Si for each candidate is then determined as a weighted linear combination of the three individual scores as follows:
Si = η1·Ai + η2·Bi + η3·Ci,   (15)
[0078] where the normalized weights [η1, η2, η3] = [0.73, 0.09, 0.18] are empirically selected to lead to the results that are most consistent with physical reality. Finally, the candidate with the minimum total penalty score Si is selected as the matching point (ua', va'), and its phase values obtained by using Relations (10) and (11) are denoted as φa' and φp″, respectively.
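The scoring of Relations (12)–(15) and the selection of the best candidate can be sketched as follows. The rewrapping convention R(x) used here maps differences into (−π, π], and the candidate record layout is an assumption made for illustration.

```python
import numpy as np

def rewrap(x):
    """Rewrapping function R(.): map a phase difference into (-pi, pi]."""
    return -((-x + np.pi) % (2 * np.pi) - np.pi)

def total_penalty(phi_ps, phi_proj, u_cand, u_est, omega_m, omega_cand,
                  r0=40.0, weights=(0.73, 0.09, 0.18)):
    """Relations (12)-(15): weighted penalty score for one candidate."""
    A = abs(rewrap(phi_ps - phi_proj)) / np.pi     # phase-shifting vs projector-plane phase
    B = abs(u_est - u_cand) / r0                   # distance to the estimated matching point
    C = abs(rewrap(omega_m - omega_cand)) / np.pi  # FTP phase consistency
    eta1, eta2, eta3 = weights
    return eta1 * A + eta2 * B + eta3 * C

def select_matching_point(candidates):
    """candidates: list of dicts with keys phi_ps, phi_proj, u, u_est, omega_m, omega."""
    scores = [total_penalty(c["phi_ps"], c["phi_proj"], c["u"], c["u_est"],
                            c["omega_m"], c["omega"]) for c in candidates]
    return candidates[int(np.argmin(scores))]
```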
[0079] In a final step of 3D point recovery (see Step IV in FIG. 2), the method determines the final 3D coordinates. First, the phase of the candidate φa' is unwrapped as φa' + 2πq, where q is an integer such that φp″ − (φa' + 2πq) ∈ (−π, π]. Then, the coordinate on the plane of the projector is obtained with sub-pixel resolution as follows:
up″ = u0″ + p (φa'/2π + q),   (16)
from which the final 3D coordinates (x, y, z) are obtained using calibration information associated with the coordinate triple (um, vm, up″).
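The last step can be sketched by choosing the fringe order q that makes the unwrapped candidate phase agree with the projector-plane phase, converting it to the sub-pixel projector column of Relation (16), and reusing the triangulation helper from the earlier snippet. The datum column u0_pp is an assumed calibration constant.

```python
import numpy as np

def recover_3d_point(phi_a, phi_p_pp, u0_pp, p, um, vm, P_cam, P_proj):
    """Step IV: unwrap the candidate phase, get the projector column, triangulate.

    phi_a    : wrapped phase of the selected candidate, Relation (10)
    phi_p_pp : unwrapped phase on the projector plane, Relation (11)
    u0_pp    : datum projector column associated with zero phase (assumed known)
    p        : fringe period in projector pixels
    """
    # integer q such that phi_p_pp - (phi_a + 2*pi*q) lies in (-pi, pi]
    q = int(np.round((phi_p_pp - phi_a) / (2 * np.pi)))
    u_pp = u0_pp + p * (phi_a / (2 * np.pi) + q)     # Relation (16), sub-pixel column
    # triangulate() is the least-squares helper defined in the earlier sketch
    return triangulate(P_cam, um, vm, P_proj, u_pp)  # final (x, y, z)
```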
[0080] Results
[0081] FIGs. 3 show quantification of the depth resolution of the method.
To quantify the depth resolution
with different exposure times, two stacked planar surfaces offset by about 9
were imaged. Reconstructed results
at four representative exposure times te are shown in FIG. 3A. One area on
each surface, marked as white boxes
in full lines in FIG. 3A, was selected in the reconstructed image. The depth
information on the x axis was obtained
by averaging the depth values along the y axis. The difference in depths
between these two surfaces is denoted
by zd. In addition, the noise is defined as the averaged values of the
standard deviation in depth from both surfaces.
The depth resolution is defined as the value of zd that equals two times the noise level
of the system. As shown in the four
plots of FIG. 3A, the reconstruction results deteriorate with shorter exposure
times, manifested by increased noise
levels and more points incapable of retrieving 3D information. As a result,
the depth resolution degrades from 0.06
mm at te = 950 µs to 0.45 mm at te = 150 µs (FIG. 3B). At exposure time te = 100 µs,
the method fails in 3D
measurements. The region of unsuccessful reconstruction prevails across most
of the planar surfaces. The noise
dominates the obtained depth difference, which is attributed to the low signal-
to-noise ratio in the captured images.
[0082] To examine the feasibility of the method, various static 3D objects
were imaged. First, two sets of 3D
distributed letter toys that composed the words of "LACI" and "INRS" were
imaged. FIGs. 4 show static 3D objects.
Shown in FIG. 4A, the two perspective views of the reconstructed results
reveal the 3D position of each letter toy.
The detailed surface structures are illustrated by the selected depth profiles
(white dashed lines in FIG. 4A). A
proof-of-concept experiment was also conducted on three cube toys with fine
structures, with a depth of about 4
mm, on the surfaces. As can be seen in FIG. 4B, the detailed structural
information of these cube toys is recovered.
[0083] Imaging of dynamic 3D objects
[0084] To verify high-speed 3D surface profilometry, the method was used to
image two dynamic scenes: a
moving hand and three bouncing balls. The fringe patterns were projected at 4
kHz. The exposure times of both
cameras were te = 250 µs. Under these experimental conditions, a 3D imaging speed of 1 thousand frames per second (kfps), a field of view (FOV) of 150 mm x 130 mm, corresponding to 1180
x 860 pixels in captured images,
and a depth resolution of 0.24 mm were achieved.
[0085] FIG. 5A shows the reconstructed 3D images of the moving hand at five
time points from 0 ms to 60
ms with a time interval of 15 ms. The high-speed 3D imaging allowed tracking
the movements of four fingertips. As
shown in FIG. 5B, all the four fingers have apparent movement in both the x
axis and the z axis but stay relatively
stationary in the y axis, which agrees with the experimental condition.
[0086] In the second experiment, three white balls, each of which was
marked by a different letter on its
surface, bounced in an inclined transparent container. FIG. 5C shows five
representative reconstructed images
from 8 ms to 28 ms with a time interval of 5 ms. The changes of the letter "C"
on B1 and the letter "L" on B2, marked
in the third panel of FIG. 5C, clearly show the rotation of the two balls. The
method enabled tracking the 3D centroids
of each ball over time. As shown in FIG. 5D, B1 collides with B2 at 16 ms,
resulting in a sudden change in the
moving directions. This collision temporarily interrupted the free fall of B1,
represented by the two turning points in
the curve of evolution along the y-axis (second panel of FIG. 5D). The
collision also changed the moving direction
of B2, making it touch the base at 27 ms and then bounce up. In this scene, B3
maintained its movement in a single
direction in both the x axis and the z axis. It fell onto the base and
bounced back at 16 ms, resulting in a turning
point in its y-t curve. Because of the inclined bottom plane, the y-value of
B3 at 16 ms was smaller than that of B2
at 27 ms.
[0087] Application to the study of sound-induced vibration on glass
[0088] To highlight the broad utility of the method, sound-induced
vibration on glass was imaged. In an
experiment (FIG. 6A), a glass cup was fixed on a table. A function generator
drove a speaker to produce single-
frequency sound signals, from 450 Hz to 550 Hz with a step of 10 Hz through a
sound channel placed close to the
wall of the cup. To image the vibration dynamics, fringe patterns were
projected at 4.8 kHz. The cameras had an
exposure time of te = 205 µs. This configuration enabled a 3D imaging speed of
1.2 kfps, a field of view (FOV) of
120 mm x 110 mm, corresponding to 960 x 800 pixels in captured images, and a
depth resolution of 0.31 mm. FIG.
6B shows four representative 3D images of the instantaneous shapes of the
glass cup driven by the 500-Hz sound
signal, showing the dynamic of structural deformation of the glass cup. The
evolution of depth changes was
analyzed using five selected points, marked by PA to PE in the first panel of
FIG. 6B. As shown in FIG. 6C, the depth
changes of the five points are in accordance, which is attributed to the
rigidness of the glass.
[0089] Time histories of averaged depth displacements under different sound
frequencies were further
analyzed. FIG. 6D shows the results at the driving frequencies of 490 Hz, 500
Hz, and 510 Hz. Each result was
fitted by a sinusoidal function with a frequency of 490.0 Hz, 499.4 Hz, and
508.6 Hz, respectively. These results
show that the rigid glass cup vibrated in compliance with the driving
frequency. Moreover, the amplitudes of fitted
results, Azfit, were used to determine the relationship between the depth
displacement and the sound frequency
(FIG. 6E). This result was fitted by the Lorentz function, which determined
the resonant frequency of this glass cup
to be 499.0 Hz.
[0090] Application to the study of glass breakage
[0091] To further apply the method to recording non-repeatable 3D dynamics,
the process of glass breaking
by a hammer was imaged. As displayed in FIG. 7A, the growth of cracks and the
burst of fragments with different
shapes and sizes were clearly shown in the reconstructed 3D images. The time
courses of velocities of four
fragments, marked by FA to FD in FIG. 7A, are plotted in FIG. 7B. The velocities in the y axis are considerably smaller than in the other two directions, which indicates that the impact of the hammer force was exerted in the x-z plane. vy of fragments FA and FC shows that they moved upward until 15 ms and fell afterward. vy of fragments FB and FD reveals that they fell onto the remaining base of the cup at 15 ms and kept sliding down on the surface. The data of vz illustrates that FA and FC moved closer to the cameras, being directly driven by the hammer's force. However, FB and FD, which collided with other pieces, maintained their positive directions in vz to move away from the cameras. The corresponding accelerations are displayed in FIG. 7C, which indicates the influence of both the main strike and the ensuing collisions among different fragments. At 14 ms, the collision with other fragments, which applied an impact along the +x direction, dominated the acceleration direction for all four tracked fragments. In contrast, at 15 ms, another collision produced an impact in the −x direction, causing a sharp decrease in the acceleration for FA and FC. In addition, the direction of acceleration of one of the sliding fragments along the y-axis changed several times, which is attributed to several collisions with the base of the glass cup while sliding down.
[0092] There is thus presented a method with a kfps-level 3D imaging speed
over a field of view of up to
150 mm x 130 mm. The method implements temporally interlaced acquisition in
multi-view 3D phase-shifting fringe
projection profilometry systems, which allows each camera to capture half of
the sequence of phase-shifting fringes.
Leveraging the characteristics indicated in the intensity matching condition
[Relation (5)], the method applies
constraints in geometry and phase to find the matching pair of points in the
main and auxiliary cameras and guides
phase unwrapping to extract the depth information. The method was shown to
allow the 3D visualization of glass
vibration induced by sound and the glass breakage by a hammer.
[0093] There is thus presented a system and a method for high-speed dual-
view band-limited illumination
profilometry using temporally interlaced acquisition. As people in the art will
now be in a position to appreciate,
temporally interlaced acquisition eliminates the redundant capture of fringe
patterns in data acquisition. The roles
of the main camera and the auxiliary camera are interchangeable and the
present method may be adapted to a
range of multi-view phase-shifting fringe projection profilometry systems.
Moreover, temporally interlaced
acquisition reduces the workload for both cameras by half. For the given
bandwidth of the camera's interface, this
more efficient use of cameras can either increase the 3D imaging speed for a
fixed field of view or enlarge the field
of view with a maintained 3D imaging speed. Both advantages shed light on
implementing the present method with
an array of cameras to simultaneously accomplish high-accuracy and high-speed 3D imaging over a larger field of view. Also, the two cameras deployed in the present method are placed on a
same side relative to the projector,
which circumvents the intensity difference induced by the directional
scattering light from the 3D object and reduces
shadow effect by occlusion occurring when placing the cameras on different
sides of the projector. As a result,
robust pixel matching in the image reconstruction algorithm allows recovering
3D information on non-Lambertian
surfaces.
[0094] The imaging speed and field of view may be optimized by separating
the workload to four cameras,
by using a faster digital micromirror device, and by using a more powerful
laser. The image reconstruction toward
real-time operation may be increased by further adapting the 3D point recovery
method to four cameras and by
using parallel computing to accelerate the calculation.
[0095] The present method may be integrated in structured illumination
microscopy and frequency-resolved
multi-dimensional imaging. The present method may also be implemented in the
study of the dynamic
characterization of glass in its interaction with the e)dernal forces in non-
repeatable safety test analysis. As another
example, the present method may be used to trace and recognize the hand
gesture in 3D space to provide
information for human-computer interaction. Furthermore, in robotics, the
present method may provide a dual-view
3D vision for object tracking and reaction guidance. Finally, the present
method can be used as an imaging
accelerometer for vibration monitoring in rotating machinery and for behavior
quantification in biological science.
[0096] Temporally interlaced acquisition thus integrated in a dual-view
phase-shifting fringe projection
profilometry system allows each camera to capture half of the sequence of phase-
shifting fringes. Leveraging the
characteristics indicated in the intensity matching condition, the method
applies constraints in geometry and phase
to find the matching pair of points in the main and auxiliary cameras and
guides phase unwrapping to extract the
depth information.
[0097] The present method and system eliminate the redundant capture of
fringe patterns in data acquisition,
which lifts the long-standing limitation in imaging speed for multi-view phase-
shifting fringe projection profilometry,
and allows reducing the workload of cameras, which enables the enhancement of
either the 3D imaging speed or
the imaging field of view. Dynamic 3D imaging of over 1 thousand frames per
second on a field of view of up to
150x130 mm2, corresponding to 1180 x 860 pixels in captured images, was
demonstrated. Moreover, by putting
the two cameras side by side on a same side of the projector, the present
method and system circumvent the
influence of directional scattering light and occlusion effect for more robust
reconstruction, thereby expanding the
application range of multi-view phase-shifting fringe projection profilometry
to non-Lambertian surfaces.
[0098] The present method and system may be adapted into other multi-view
3D profilometers, thus opening
new opportunities to blur-free 3D optical inspection and characterization with
high speeds, large fields of view, and
high accuracy. The present method and system provide a versatile tool for
dynamic 3D metrology with potential
applications in advanced manufacturing, such as characterization of glass in
non-repeatable safety tests and high-
speed vibration monitoring in rotating machinery. The present compact and
symmetric system may be embedded
in the vision system of robots to track objects, to recognize the gesture for
human-computer interaction, and to
guide reactions.
[0099] The scope of the claims should not be limited by the embodiments set
forth in the examples, but
should be given the broadest interpretation consistent with the description as
a whole.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Administrative Status

Title Date
Forecasted Issue Date Unavailable
(22) Filed 2021-08-02
(41) Open to Public Inspection 2022-02-03
Examination Requested 2023-12-13

Abandonment History

There is no abandonment history.

Maintenance Fee

Last Payment of $100.00 was received on 2023-12-11


 Upcoming maintenance fee amounts

Description Date Amount
Next Payment if small entity fee 2025-08-05 $50.00
Next Payment if standard fee 2025-08-05 $125.00

Note: If the full payment has not been received on or before the date indicated, a further fee may be required, which may be one of the following:

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee 2021-08-03 $408.00 2021-08-02
Maintenance Fee - Application - New Act 2 2023-08-02 $100.00 2023-07-05
Maintenance Fee - Application - New Act 3 2024-08-02 $100.00 2023-12-11
Request for Examination 2025-08-05 $816.00 2023-12-13
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
INSTITUT DE LA RECHERCHE SCIENTIFIQUE
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD .

If you have any difficulty accessing content, you can call the Client Service Centre at 1-866-997-1936 or send them an e-mail at CIPO Client Service Centre.


Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
New Application 2021-08-02 8 248
Abstract 2021-08-02 1 19
Description 2021-08-02 18 897
Claims 2021-08-02 5 232
Drawings 2021-08-02 8 802
Representative Drawing 2021-12-29 1 12
Cover Page 2021-12-29 1 46
Request for Examination 2023-12-13 4 95