Patent 2686904 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 2686904
(54) English Title: HAND-HELD SELF-REFERENCED APPARATUS FOR THREE-DIMENSIONAL SCANNING
(54) French Title: APPAREIL A MAIN AUTOREFERENCE POUR EFFECTUER DES BALAYAGES TRIDIMENSIONNELS
Status: Granted and Issued
Bibliographic Data
(51) International Patent Classification (IPC):
  • G01B 11/25 (2006.01)
  • G01B 11/245 (2006.01)
  • G01C 11/02 (2006.01)
(72) Inventors :
  • TUBIC, DRAGAN (Canada)
  • HEBERT, PATRICK (Canada)
  • ST-PIERRE, ERIC (Canada)
  • GAGNE, PIERRE-LUC (Canada)
  • CARON, ANTOINE THOMAS (Canada)
  • BEAUPRE, NICOLAS (Canada)
(73) Owners :
  • CREAFORM INC.
(71) Applicants :
  • CREAFORM INC. (Canada)
(74) Agent: FASKEN MARTINEAU DUMOULIN LLP
(74) Associate agent:
(45) Issued: 2012-04-24
(22) Filed Date: 2009-12-02
(41) Open to Public Inspection: 2011-06-02
Examination requested: 2011-11-03
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract

A method and hand-held scanning apparatus for three-dimensional scanning of an object are described. The hand-held self-referenced scanning apparatus has a light source for illuminating retro-reflective markers, the retro-reflective markers being provided at fixed positions on or around the object, a photogrammetric high-resolution camera, a pattern projector for providing a projected pattern on a surface of the object, at least a pair of basic cameras, the basic cameras cooperating with light sources, the projected pattern and at least a portion of the retro-reflective markers being apparent on the 2D images, and a frame for holding all components in position within the hand-held apparatus, the frame having a handle, the frame allowing support and free movement of the scanning apparatus by a user.


French Abstract

La présente divulgation porte sur la description d'une méthode et d'un appareil portatif de balayage pour le balayage tridimensionnel d'un objet. L'appareil portatif à auto-référence comprend les éléments qui suivent. Une source lumineuse pour éclairer des marqueurs rétroréfléchissants, ces marqueurs rétroréfléchissants occupant des positions fixes sur l'objet ou autour de celui-ci; une caméra photogrammétrique à haute résolution, un projecteur de trace pour fournir une trace projetée sur la surface de l'objet; au moins une paire de caméras de base coopérant avec les sources lumineuses, la trace projetée et au moins une partie des marqueurs rétroréfléchissants étant apparents sur les images bidimensionnelles; une carcasse pour maintenir en place tous les éléments dans l'appareil portatif, la carcasse étant pourvue d'une poignée, et cette carcasse assurant le support et le mouvement libre de l'appareil de balayage par un utilisateur.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS

The embodiments of the invention for which protection is sought are as follows:

1. A hand-held self-referenced scanning apparatus for three-dimensional scanning of an object, said hand-held scanning apparatus comprising:
a first light source for illuminating at least one marker of a set of retro-reflective markers, wherein each marker of said set of retro-reflective markers is provided at a fixed position one of on and near said object, said first light source being adapted to illuminate said marker at a photogrammetric modeling distance of more than 75 cm to said object;
a photogrammetric high-resolution camera for acquiring and storing at least one photogrammetric image of said object, said photogrammetric high-resolution camera cooperating with said first light source, wherein at least a portion of said set of retro-reflective markers is apparent on said photogrammetric image, said photogrammetric high-resolution camera being adapted to acquire said photogrammetric image at said photogrammetric modeling distance;
a pattern projector for providing a projected pattern on a surface of said object;
at least a pair of second light sources, each said second light sources for illuminating at least one marker of said set of retro-reflective markers, said second light source being adapted to illuminate said marker at a surface scanning distance of less than 60 cm from said object;
at least a pair of cameras for acquiring and storing 2D images of said object, one 2D image from each camera, each said camera cooperating with one of said second light sources, wherein said projected pattern and at least a portion of said set of retro-reflective markers is apparent on said 2D images, a spatial relationship between said pair of cameras being known, said cameras being adapted to acquire said 2D images at said surface scanning distance;
a frame for holding in position said first light source, said photogrammetric high-resolution camera, said pattern projector, said cameras and said at least a pair of second light sources within said hand-held apparatus, said frame having a handle, said frame allowing support and free movement of said scanning apparatus by a user;
whereby said user may freely handle said hand-held self-referencing scanning apparatus to build a 3D model of said markers and then obtain dense sets of 3D surface points from said 2D images.
2. The hand-held self-referenced scanning apparatus as claimed in claim 1, further comprising a control circuit for controlling operation of said first light source, said photogrammetric high-resolution camera, said pattern projector, said at least said pair of second light sources and said at least said pair of cameras, said control circuit being mounted to said frame.

3. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said frame is adapted to allow single-handed handling of said scanning apparatus.

4. The hand-held self-referenced scanning apparatus as claimed in claim 1, further comprising a casing for said scanning apparatus, said casing protecting said frame, said photogrammetric high-resolution camera, said pattern projector, said cameras and said at least a pair of second light sources and providing said handle.

5. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said first light source is coaxial with said photogrammetric camera.

6. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said second light source is coaxial with said camera.

7. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said first light source is a ring light source surrounding said photogrammetric high-resolution camera.

8. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said at least two second light sources are ring light sources each surrounding one of said at least a pair of cameras.

9. The hand-held self-referenced scanning apparatus as claimed in claim 2, further comprising switches for actuating said controller.

10. The hand-held self-referenced scanning apparatus as claimed in claim 9, wherein said switches are provided at separate locations on said frame.
11. The hand-held self-referenced scanning apparatus as claimed in claim 7, wherein said first light source is a ring having at least two Light Emitting Diodes (LED).
12. The hand-held self-referenced scanning apparatus as claimed in claim 8, wherein said second light sources are rings each having at least two Light Emitting Diodes (LED).

13. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said pattern projector is a laser pattern projector and wherein said projected pattern is a laser pattern.

14. The hand-held self-referenced scanning apparatus as claimed in claim 13, wherein said laser pattern projector projects a laser crosshair pattern on said object.

15. The hand-held self-referenced scanning apparatus as claimed in claim 2, further comprising at least one status indicator activated by said controller for indicating a status of said scanning apparatus.

16. The hand-held self-referenced scanning apparatus as claimed in claim 1, further comprising a hub for handling transmission of said 2D images and said photogrammetric image.

17. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said photogrammetric modeling distance is 1.75 m.
18. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said 3D scanning distance is 30 cm.
19. The hand-held self-referenced scanning apparatus as claimed in claim 1, wherein said frame has two perpendicular axes, said cameras being provided at opposite ends of one of said two perpendicular axes, said pattern projector and said photogrammetric camera being provided at opposite ends of another one of said two perpendicular axes, said cameras, said pattern projector, said photogrammetric camera all facing a same direction.
20. A method for three-dimensional scanning of an object, comprising:
obtaining at least one photogrammetric image of said object from a hand-held self-referenced scanning apparatus having:
a first light source for illuminating at least one marker of a set of retro-reflective markers, wherein each marker of said set of retro-reflective markers is provided at a fixed position one of on and near said object, said first light source being adapted to illuminate said marker at a photogrammetric modeling distance of more than 75 cm to said object;
a photogrammetric high-resolution camera for acquiring and storing at least one photogrammetric image of said object, said photogrammetric high-resolution camera cooperating with said first light source, wherein at least a portion of said set of retro-reflective markers is apparent on said photogrammetric image, said photogrammetric high-resolution camera being adapted to acquire said photogrammetric image at said photogrammetric modeling distance;
a pattern projector for providing a projected pattern on a surface of said object;
at least a pair of second light sources, each said second light sources for illuminating at least one marker of said set of retro-reflective markers, said second light source being adapted to illuminate said marker at a surface scanning distance of less than 45 cm from said object;
at least a pair of cameras for acquiring and storing 2D images of said object, one 2D image from each camera, each said camera cooperating with one of said second light sources, wherein said projected pattern and at least a portion of said set of retro-reflective markers is apparent on said 2D images, a spatial relationship between said pair of cameras being known, said cameras being adapted to acquire said 2D images at said surface scanning distance;
a frame for holding in position said first light source, said photogrammetric high-resolution camera, said pattern projector, said cameras and said at least a pair of second light sources within said hand-held apparatus, said frame having a handle, said frame allowing support and free movement of said scanning apparatus by a user;
extracting markers from said at least one photogrammetric images;
building a 3D model of said markers;
obtaining at least two 2D images of said object using said scanning apparatus;
positioning said at least two 2D images of said object in said 3D model of said markers;
extracting dense sets of 3D surface points from said at least two self-referenced 2D images;
whereby said user may freely handle said hand-held self-referencing scanning apparatus to build a 3D model of said markers and then obtain dense sets of 3D surface points from said 2D images.
21. The method as claimed in claim 20, further comprising detecting addition of additional retro-reflective markers on said object in said 2D images and updating said 3D model of markers to add said additional markers.

22. The method as claimed in claim 20, wherein said obtaining at least two 2D images is continuously repeated at predetermined rate once said obtaining is activated, while said scanning apparatus is in operation, wherein said scanning apparatus is displaced by said user while said scanning apparatus is in operation.

23. The method as claimed in claim 20, wherein said obtaining 2D images is performed on non-overlapping sections of said object.

Description

Note: Descriptions are shown in the official language in which they were submitted.


HAND-HELD SELF-REFERENCED APPARATUS FOR
THREE-DIMENSIONAL SCANNING
TECHNICAL FIELD
The present invention generally relates to the field of three-dimensional scanning of an object's surface geometry and, more particularly, to a portable three-dimensional scanning apparatus for hand-held operations.
BACKGROUND OF THE ART
Three-dimensional scanning and digitization of the surface geometry of objects is commonly used in many industries and services, and its applications are numerous. A few examples of such applications are: 3D inspection and measurement of shape conformity in industrial production systems, digitization of clay models for industrial design and styling applications, reverse engineering of existing parts with complex geometry, interactive visualization of objects in multimedia applications, three-dimensional documentation of artwork and artefacts, and human body scanning for biometry or better adaptation of orthoses.
The shape of an object is scanned and digitized using a ranging sensor that measures the distance between the sensor and a set of points on the surface. Different principles have been developed for optical range sensors (see F. Blais, "A Review of 20 Years of Range Sensor Development", in proceedings of SPIE-IS&T Electronic Imaging, SPIE Vol. 5013, 2003, pp. 62-76) that make it possible to capture a dense set of measurements on the object surface. From these measurements, three dimensional coordinates of points on the target surface are obtained in the sensor reference frame. From a given viewpoint, the ranging sensor can only acquire distance measurements on the visible portion of the surface. To digitize the whole object, the sensor is therefore moved to a plurality of viewpoints in order to acquire sets of range measurements that cover the entire surface. A model of the object's surface geometry can be built from the whole set of range measurements provided in the same global coordinate system.
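As an illustration of the last point, the short sketch below (Python with NumPy) shows how range measurements expressed in each sensor reference frame can be brought into a single global coordinate system once the 6-degree-of-freedom pose of the sensor is known for each viewpoint. The poses and point coordinates are invented example values, not data taken from this patent.

# Illustrative sketch only: merging per-viewpoint range measurements into one
# global coordinate system, assuming the sensor pose is known for each viewpoint.
import numpy as np

def pose_matrix(rotation_deg_z, translation):
    """Build a 4x4 homogeneous transform: rotation about Z, then translation."""
    a = np.radians(rotation_deg_z)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:3, 3] = translation
    return T

def to_global(points_sensor, sensor_pose):
    """Map Nx3 points from the sensor frame to the global frame."""
    homogeneous = np.c_[points_sensor, np.ones(len(points_sensor))]
    return (sensor_pose @ homogeneous.T).T[:, :3]

# Two viewpoints, each with a few range measurements in its own sensor frame.
view1_points = np.array([[0.10, 0.00, 0.30], [0.12, 0.02, 0.31]])
view2_points = np.array([[0.05, -0.01, 0.29], [0.07, 0.01, 0.30]])
view1_pose = pose_matrix(0.0,  [0.0, 0.0, 0.0])
view2_pose = pose_matrix(30.0, [0.2, 0.0, 0.0])

model = np.vstack([to_global(view1_points, view1_pose),
                   to_global(view2_points, view2_pose)])
print(model)  # all measurements now share one global coordinate system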
While acquiring the measurements, the sensor can be moved around the object using a mechanical system or can be hand-held for more versatility. Portable hand-held systems are especially useful for rapid scanning and for objects that are scanned on site. Using a hand-held system, the main challenge is to continuously estimate the position and orientation (6 degrees of freedom) of the apparatus in a global coordinate system fixed relative to the object. This can be accomplished using a positioning device coupled to the range scanner. The positioning device can be electromagnetic (see for example products by Polhemus), mechanical (see for example products by Faro), optical (see for example products by Steinbichler) or ultrasonic (see Arsenault et al., "Portable apparatus for 3-dimensional scanning", US Patent No. 6,508,403 B2, Jan. 21, 2003). Using a positioning device significantly increases the complexity and cost of the apparatus. It is also cumbersome or, in some cases, noisy enough to limit the quality of the integrated data.
To avoid the usage of an external positioning device, an alternative consists of using the 3D measurements collected on a rigid object in order to compute the relative position and orientation between the apparatus and the object. It is even possible to hold and displace the object in hand while scanning (see S. Rusinkiewicz, O. Hall-Holt and M. Levoy, "Real-Time 3D Model Acquisition", in ACM Transactions on Graphics, vol. 21, no. 3, July 2002, pp. 438-446; F. Blais, M. Picard and G. Godin, "Accurate 3D Acquisition of Freely Moving Objects," in proc. of the Second International Symposium on 3D Data Processing, Visualization and Transmission, Thessaloniki, Greece, September 6-9, 2004, NRC 47141). This idea of integrating the computation of the position directly into the system while exploiting measurements is interesting, but these systems depend completely on the geometry of the object and it is not possible to ensure that an accurate estimate of the pose be maintained. For example, objects whose geometry varies smoothly or objects with local symmetries, including spherical, cylindrical or nearly planar shapes, lead to non-constant quality in positioning.
To circumvent this limitation, one can exploit principles of photogrammetry by using fixed points or features that can be re-observed from various viewpoints in the scene. These positioning features can be natural points in the scene but in many cases their density or quality is not sufficient and positioning markers are set in the scene. One may thus collect a set of images and model the 3D set of positioning features in a same global coordinate system. One can further combine this principle using a camera with a 3D surface scanner. The complementarity of photogrammetry and range sensing has been exploited (see products by GOM mbH, CogniTens Ltd. and Steinbichler Optotechnik GmbH) where a white light projector is used with cameras that observe the illuminated scene including positioning features (markers). Using this type of system, a photogrammetric model of the set of markers is measured and built beforehand, using a digital camera. Then, the 3D sensor apparatus is displaced at a set of fixed positions to measure the surface geometry. The range images can be registered to the formerly constructed model of positioning features since the 3D sensor apparatus can detect the positioning markers.
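A minimal sketch of how such a registration step can be performed is given below, assuming the 3D positions of at least three markers are known both in the previously built model and in the current sensor frame. It uses the standard SVD-based rigid alignment (Kabsch/Horn), a generic technique chosen for illustration and not necessarily the algorithm used by the products cited above; the marker coordinates are invented example values.

# Illustrative sketch: rigid registration of sensor-frame marker positions to a
# previously built marker model using the SVD-based (Kabsch/Horn) method.
import numpy as np

def rigid_transform(src, dst):
    """Return R (3x3) and t (3,) such that R @ src_i + t ~ dst_i."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Marker model in the global frame, and the same markers seen in the sensor frame.
model_markers = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                          [0.0, 0.4, 0.0], [0.2, 0.2, 0.3]])
angle = np.radians(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
sensor_markers = (model_markers - [0.1, 0.2, 0.8]) @ R_true  # simulated observation

R, t = rigid_transform(sensor_markers, model_markers)
print(np.allclose(R @ sensor_markers.T + t[:, None], model_markers.T))  # True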
An interesting idea is to integrate within the same system a hand-held scanner projecting a laser light pattern along with the capability of self-referencing while simultaneously observing positioning features. Hebert (see P. Hebert, "A Self-Referenced Hand-Held Range Sensor", in proc. of the 3rd International Conference on 3D Digital Imaging and Modeling (3DIM 2001), 28 May - 1 June 2001, Quebec City, Canada, pp. 5-12) proposed to project laser points on the object to be scanned with an external fixed projector to help position the hand-held sensor. This type of system can be improved by making it capable of building a model of the positioning feature points dynamically. Moreover, the 3D range scanner device was improved to simultaneously capture 3D surface points along with positioning features obtained from retro-reflective markers (see US Patent Publication No. 2008/0201101).
While the sensor projects a laser pattern to recover dense 3D surface points, it also projects light from LEDs in order to recover a signal on the light detectors, arising from the reflection of light on the retro-reflective markers that are fixed in the observed scene. The system can then simultaneously build a 3D model of these reference markers for positioning while acquiring a dense set of 3D surface measurements.
Nevertheless, there are constraints related to the usage of such a hand-held system. In order to obtain high accuracy for 3D surface measurements, the sensing device must acquire data while being as close as possible to the surface to be scanned. This imposes a reduced field of view on the object and, consequently, the distance between retro-reflective markers must be reduced. For larger objects such as vehicles or architectural structures that exceed a working volume of one cubic meter, this becomes suboptimal when it is necessary to scan the whole object surface or when it is necessary to scan sections of these objects in a common global coordinate system. In practice, positioning errors accumulate and affect the accuracy of the recovered 3D surface model.
SUMMARY
For large objects, it is a great advantage to capture large sets of positioning markers in one or a few snapshots. Using a high-resolution photogrammetric camera, one can actually obtain higher accuracy on the sets of markers integrated into a single model, after bundle adjustment. These markers are typically stuck on the surface of the object at a distance of 10 to 15 cm between each other. The photogrammetric camera makes it possible to capture an accurate model when viewed at a farther distance such as, for example, more than 1.5 m. Using this 3D model of the positioning markers, one can then use a self-referencing laser scanner to gather sets of high density 3D surface points on the surface of an object. The density of these 3D surface points may easily exceed several points per square millimeter. For that purpose, the laser scanner calculates its position and orientation from the observed positioning markers and matches them with the 3D model of these markers. The laser scanner then benefits from an accurate model of the markers while measuring dense sets of points at short distance, typically from 250 to 400 millimeters to the object's surface. Indeed, when using triangulation-based 3D sensors, the accuracy of these dense sets of points is inversely proportional to the square of the distance between the sensor and the object's surface.
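The inverse-square relationship mentioned above can be made concrete with the usual first-order error model for triangulation, delta_z ~ z^2 x delta_d / (f x b). The sketch below evaluates it for a few distances; the focal length and disparity noise are assumed example values (the 0.19 m baseline echoes the 190 mm figure given later in the detailed description), so the numbers only illustrate the quadratic growth of the depth error with distance and are not specifications of this apparatus.

# Illustrative sketch: first-order depth-error model for a triangulation-based
# sensor, delta_z ~= z**2 * delta_d / (f * b). Focal length and disparity noise
# are assumed example values, not device specifications.
f_pixels = 1500.0          # focal length in pixels (assumed)
baseline_m = 0.19          # baseline between the two cameras, in meters
disparity_noise_px = 0.1   # 1-sigma disparity noise, in pixels (assumed)

for z_m in (0.3, 0.6, 1.2, 2.4):
    delta_z_m = (z_m ** 2) * disparity_noise_px / (f_pixels * baseline_m)
    print(f"z = {z_m:4.1f} m  ->  expected depth error ~ {delta_z_m * 1000:.2f} mm")
# Doubling the distance multiplies the expected depth error by four.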
The present invention thus makes it possible to improve the accuracy of the hand-held self-referencing device described in US Patent Publication No. 2008/0201101 for large objects by removing the conflict between these two objectives. The 3D model of the positioning markers can be captured at a farther distance from the object using the observations from the photogrammetric camera, before exploiting this recovered information to help position the self-referenced 3D scanner at short distance from the surface. Consequently, the accuracy of the final high-density surface model can be increased. Additionally, the present invention can reduce acquisition time when only separated surface sections of a large object are scanned within the same global coordinate system.
By combining a photogrammetric camera and two additional cameras coupled with a pattern projector in a same scanning apparatus, it is possible to easily and rapidly capture large sets of reference markers from wide snapshots of the object and then capture accurate and dense sets of 3D surface points at closer range while always exploiting the reference marker model. The capture of large sets of reference markers using the photogrammetric camera contributes to the improved accuracy of the final surface model.
One additional feature arising from the integrated hand-held device is the possibility to capture the reference marker model in the first stage and then selectively acquire dense surface points on some local surface sections of the object. These local surface sections are automatically calculated within the same global coordinate system and thus make it possible to extract additional surface markers on the object. This added capability can be useful, for example, in the field of 3D inspection in the automotive industry where large objects such as cars or trucks are assembled.
A method and hand-held scanning apparatus for three-dimensional scanning of an object are described. The hand-held self-referenced scanning apparatus has a light source for illuminating retro-reflective markers, the retro-reflective markers being provided at fixed positions on or around the object, a photogrammetric high-resolution camera, a pattern projector for providing a projected pattern on a surface of the object, at least a pair of basic cameras, the basic cameras cooperating with light sources, the projected pattern and at least a portion of the retro-reflective markers being apparent on the images, and a frame for holding all components in position within the hand-held apparatus, the frame having a handle, the frame allowing support and free movement of the scanning apparatus by a user.
One aspect of the invention provides a hand-held self-referenced scanning apparatus. The hand-held scanning apparatus comprises a first light source for illuminating at least one marker of a set of retro-reflective markers, wherein each marker of the set of retro-reflective markers is provided at a fixed position one of on and near the object, the first light source being adapted to illuminate the marker at a photogrammetric modeling distance of more than 75 cm to the object; a photogrammetric high-resolution camera for acquiring and storing at least one photogrammetric image of the object, the photogrammetric high-resolution camera cooperating with the first light source, wherein at least a portion of the set of retro-reflective markers is apparent on the photogrammetric image, the photogrammetric high-resolution camera being adapted to acquire the photogrammetric image at the photogrammetric modeling distance; a pattern projector for providing a projected pattern on a surface of the object; at least a pair of second light sources, each the second light sources for illuminating at least one marker of the set of retro-reflective markers, the second light source being adapted to illuminate the marker at a surface scanning distance of less than 45 cm from the object; at least a pair of basic cameras for acquiring and storing 2D images of the object, one 2D image from each basic camera, each the basic camera cooperating with one of the second light sources, wherein the projected pattern and at least a portion of the set of retro-reflective markers is apparent on the 2D images, a spatial relationship between the pair of basic cameras being known, the basic cameras being adapted to acquire the 2D images at the surface scanning distance; a frame for holding in position the first light source, the photogrammetric high-resolution camera, the pattern projector, the basic cameras and the at least a pair of second light sources within the hand-held apparatus, the frame having a handle, the frame allowing support and free movement of the scanning apparatus by a user; whereby the user may freely handle the hand-held self-referencing scanning apparatus to build a 3D model of the markers and then obtain dense sets of 3D surface points from the 2D images.
Another aspect of the invention provides a method for three-dimensional scanning of an object. The method comprises obtaining at least one photogrammetric image of the object from a hand-held self-referenced scanning apparatus, extracting markers from the at least one photogrammetric images; building a 3D model of the markers; obtaining at least two 2D images of the object using the scanning apparatus; positioning the at least two 2D images of the object in the 3D model of the markers; extracting dense sets of 3D surface points from the at least two self-referenced 2D images; whereby the user may freely handle the hand-held self-referencing scanning apparatus to build a 3D model of the markers and then obtain dense sets of 3D surface points from the 2D images.
BRIEF DESCRIPTION OF THE DRAWINGS
Having thus generally described the nature of the invention, reference will now be made to the accompanying drawings, in which:
FIG. 1 is a front perspective view of an example embodiment;
FIG. 2 is a rear top view of the example embodiment of FIG. 1;
FIG. 3 comprises FIGs. 3A, 3B, and 3C which are a rear perspective view, a front view and a front perspective view of an example of the internal structure of the example embodiment of FIG. 1;
FIG. 4 is a flow chart of main steps of an embodiment;
FIG. 5 comprises FIGs. 5A and 5B, in which FIG. 5A is a schematic representation of data obtained after the capture of a photogrammetric model of FIG. 4 and FIG. 5B is a schematic representation of data obtained after the capture of dense sets of 3D surface points of FIG. 4; and
FIG. 6 comprises FIGs. 6A and 6B which illustrate an example embodiment in use with an object on which are affixed sets of positioning markers, non-coded markers being used in FIG. 6A and a combination of coded and non-coded markers being used in FIG. 6B.
It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
DETAILED DESCRIPTION
Referring now to FIG. 1, a 3-D scanning apparatus is generally shown at 10. The 3-D scanning apparatus 10 comprises a photogrammetric high-resolution camera with optics and filter 12, hereinafter referred to as photogrammetric camera 12. Different manufacturers provide cameras with adequate performance (Sony, for example). Although there is no restriction on the position of the high-resolution camera, it can, for example, be positioned in the center of the scanning apparatus while aiming forward. The resolution of the high-resolution camera can, for example, exceed two megapixels, and the focal length of the optics should be small enough to provide a wide field of view, typically higher than 50 degrees. The optics can be manufactured by Pentax, for example. This provides a 1.4 m horizontal field of view at a distance of 1.5 m. The filter is adapted to the light emitting diodes (LEDs) shown at 14. Typically the wavelength is set in the range of the red visible spectrum, but this is not a restriction. These LEDs project light towards the scene and the light is then reflected on the retro-reflective markers before coming back towards the LEDs. The reflected light is captured by the photogrammetric camera in order to produce a signal from the retro-reflective markers. In FIG. 1, eight LEDs are drawn at 14 but this number can be different as long as the recovered signal is strong enough to be extracted from the image background. In FIG. 1, the LEDs 14 are shown as being provided on a ring light which surrounds the photogrammetric camera 12. As will be understood, the LEDs 14 should be close to the photogrammetric camera 12 to ensure that the reflected light is captured by the photogrammetric camera 12, but they need not be provided on a ring surrounding the photogrammetric camera 12. The LEDs 14 may be nearly coaxial with the photogrammetric camera 12. As will also be understood, the LEDs 14 could be replaced by another light source which would illuminate the retro-reflective markers on the object.
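The field-of-view figures quoted above can be checked with simple trigonometry: the lateral coverage at a given distance is 2 x d x tan(FOV/2). The small sketch below uses the roughly 50-degree horizontal field of view and the 1.5 m distance from the paragraph above; it is an arithmetic check only.

# Illustrative check of the field-of-view figures quoted above:
# lateral coverage = 2 * distance * tan(horizontal_fov / 2).
import math

def coverage(distance_m, horizontal_fov_deg):
    return 2.0 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)

print(round(coverage(1.5, 50.0), 2))  # ~1.40 m horizontal coverage at 1.5 m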
Still referring to FIG. 1, the 3-D scanning apparatus is complemented with two sets of basic cameras, optics, and filters shown at 16, hereinafter referred to as basic cameras 16. To each basic camera 16, LEDs are added for the same purpose of recovering signal from the retro-reflective markers. The number of LEDs, shown at 18, is smaller in this case since these basic cameras 16 operate at closer range from the object. The LEDs at 14 and 18 can be of the same type; for example, all emitting red light. As will also be understood, the LEDs 18 could be replaced by another light source which would illuminate the retro-reflective markers on the object.
Typically, the standoff distance when operating with these cameras is 30 centimeters. These basic cameras are monochrome and their field of view is typically set to 60 degrees. Their resolution is also typically set to at least 0.3 megapixels. While used for positioning, these basic cameras 16 also capture the laser pattern that is projected by the laser pattern projector shown at 20. The laser pattern projector 20 can be a class II laser, which is eye-safe. Coherent Inc. is an example of a manufacturer of these laser projectors. It can project a red crosshair pattern. The fan angle of the laser pattern projector 20 can be 45 degrees. The combination of these basic cameras along with the laser pattern projector follows the description set in US Patent Publication No. 2008/0201101.
FIG. 2 depicts a rear view of the 3-D scanning apparatus shown at 10. In this example embodiment, an IEEE 1394a cable provides a power and data link between the scanning apparatus and a computing device. The connector is shown at 30. The switches are shown at 22. Moreover, five status LEDs are mounted on the scanning apparatus. The three LEDs at 36 are activated to display whether the sensor is too close, too far or at adequate range from the object. In the first two cases a red LED is activated; when the range is adequate, the green LED located in the middle of the three-LED arrangement is activated. The LED at 32 indicates whether the power is on or off. Finally, the LED at 34 is activated when the device is recording 3-D measurements using the laser pattern.
The dimensions of the scanning apparatus are 172 x 260 x 216 millimeters. Besides the basic cameras 16, the associated LEDs 18, the photogrammetric camera 12 and the associated LEDs shown at 14, the laser pattern projector 20, and the switches shown at 22 and 24, the scanning apparatus integrates a control circuit for activating the three cameras, their associated LEDs, the laser projector and the status LEDs. It also recovers input signal from the switches. Finally, the scanning apparatus also integrates a hub for pipelining the images recovered from the cameras into the IEEE 1394a cable.
In FIGs. 3A, 3B, and 3C, the internal structure of the scanning apparatus shows the components that are mounted on a T-shaped frame shown at 112. The two basic cameras 104 are separated by a baseline 116 typically set to 190 millimeters. The distance 118 between this baseline axis and the center of the photogrammetric camera may be arbitrary; a typical value of 60 millimeters is shown. Finally, the distance 114 between the same axis and the laser projector 106 is, for example, set such that the triangle composed of the two basic cameras and the laser projector is isosceles. The orientation of the basic cameras and the laser projector can also be adjusted in such a way that their optical axes converge at a single point on the surface of the object. In this example embodiment, this distance is set to 300 millimeters. These figures also show the two circuit boards mounted within the casing: the camera hub shown at 110 and the control circuit shown at 108.
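As an illustration of the convergence adjustment described above, the sketch below computes the toe-in angle each basic camera would need for its optical axis to cross the central axis at the stated convergence distance, using the 190 mm baseline and 300 mm distance given in this paragraph. This is plain geometry offered for illustration, not a calibration procedure taken from this patent.

# Illustrative geometry check: toe-in angle for each basic camera so that both
# optical axes converge on the central axis at the chosen standoff distance.
import math

baseline_mm = 190.0     # separation between the two basic cameras (from the text)
convergence_mm = 300.0  # distance at which the optical axes should cross (from the text)

toe_in_deg = math.degrees(math.atan((baseline_mm / 2.0) / convergence_mm))
print(f"each camera is angled inward by about {toe_in_deg:.1f} degrees")  # ~17.6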
In a typical setup, some coded retro-reflective markers are stuck on the object but they need not be coded. These coded markers are standard and well known to anyone skilled in the art of photogrammetry. The coded retro-reflective markers are provided at a typical distance of 500 mm to 600 mm between each other and they are distributed uniformly on the object to capture. They facilitate matching between images at the step of producing a 3D model of the positioning markers. When the object stays fixed in the scene, markers need not be stuck only on the object to be scanned. They can also be provided in the surrounding environment of the object to capture. Besides the coded retro-reflective markers, non-coded retro-reflective markers are also stuck on or around the object at closer distance. These markers make it possible for the laser scanner to self-reference. For the non-coded markers, the average distance between each other is 10 to 15 centimeters. Typically, the user sticks the retro-reflective markers on the object before collecting a set of images using the scanning apparatus.
The 3-D scanning apparatus operates in two different modes.
In the first mode, only the photogrammetric camera is activated to capture images all around the object or for a specific section of the object. In FIG. 1, the photogrammetric camera 12 is activated simultaneously with the LEDs 14 by pressing a switch button 24. The control circuit board receives the signal from the switch button and activates both the photogrammetric camera and the LEDs. On the scanning apparatus, additional switch buttons 22 are mounted on the device in order to facilitate hand-held manipulation. They are connected to the control circuit board and their role is the same as that of the switch button 24. This multi-switch system significantly improves the ergonomics of the hand-held device. Images can be captured at middle range from the object, that is, approximately at a distance of 1.75 m. This part of the process, namely the 3D positioning markers modeling, aims at obtaining a sparse 3D model of the object - or the scene - based on photogrammetric techniques such as bundle adjustment, which is well known to anyone skilled in the art. The model consists of a set of the 3D positioning markers distributed all around the object. These markers can be recognized individually when coded markers are used. Otherwise, they are identified based on their geometric distribution in space. Several algorithms have been developed for this latter approach (see US Patent Publication No. 2008/0201101).
The captured images pass through a hub that feeds them into the IEEE 1394a cable for wired transmission to a computing device that is typically outside the acquisition sensor; they could also be transmitted wirelessly by a transmitter. The images are processed to extract the signal produced by the reflected light on the reference markers and, finally, a bundle adjustment algorithm is applied to provide the 3D model of photogrammetric reference markers. Once this 3D model is calculated, the user may visualize it on a screen display. FIG. 5A illustrates such a model. The user then activates the second mode of the scanning apparatus. In this mode, the basic cameras 16, the LEDs 18 and the laser pattern projector 20 are activated for acquiring a dense set of points on the object's surface.
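A minimal sketch of the kind of marker extraction alluded to above is given below: bright, compact blobs produced by the retro-reflective markers are isolated by thresholding and their centroids are returned as 2D observations. The threshold, minimum blob size and synthetic test image are assumed values chosen for illustration; the actual extraction and bundle adjustment software of the apparatus is not described in this document.

# Illustrative sketch: extract 2D centroids of bright retro-reflective marker
# blobs from a grayscale image by thresholding and connected-component labelling.
import numpy as np
from scipy import ndimage

def marker_centroids(image, threshold=200, min_pixels=3):
    mask = image >= threshold                 # keep only very bright pixels
    labels, count = ndimage.label(mask)       # group them into blobs
    centroids = []
    for blob_id in range(1, count + 1):
        ys, xs = np.nonzero(labels == blob_id)
        if len(xs) >= min_pixels:             # reject isolated noisy pixels
            centroids.append((xs.mean(), ys.mean()))
    return centroids

# Synthetic image with two bright square "markers" on a dark background.
img = np.zeros((120, 160), dtype=np.uint8)
img[20:25, 30:35] = 255
img[80:86, 100:106] = 255
print(marker_centroids(img))  # approximate centers of the two blobs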
In this second mode, the scanning apparatus 10 acquires two images at a time to simultaneously capture the signal from the reflected light emitted by the laser pattern projector 20 and the light emitted by the LEDs shown at 18 and reflected on the retro-reflective markers. For this latter purpose, the principle is the same as in the first stage with the photogrammetric camera. LEDs 18 are thus provided around or nearly coaxial with the two basic cameras 16. Images are sent to a computing device through the same hub and cable. They can also be transmitted wirelessly. Image processing and, optionally, 3-D modeling of the reference markers are performed before calculating the 6-degree-of-freedom (3 translations and 3 angular parameters) pose of the sensor and calculating the dense set of 3D surface points. For a given image frame, the dense set of 3D surface points is obtained from the reflected pattern on the object. As opposed to 3-D points located at the reference marker positions, the signal recovered from the reflected light originating from the laser pattern projector makes it possible to recover points at much higher density on the surface of the object, typically several points per square millimeter on the object's surface. FIG. 5B illustrates the results while this acquisition process is ongoing. For reaching higher accuracy, the hand-held sensor can be positioned at close range from the object, typically between 250 and 400 mm. For the purpose of this second mode, a method is described in US Patent Publication No. 2008/0201101.
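To make the triangulation of the dense surface points more concrete, the sketch below recovers the depth of a single laser-pattern point seen by two rectified cameras with a known baseline, using the standard relation z = f x b / disparity. A rectified (parallel-axis) geometry is assumed for simplicity, which differs from the converged arrangement described earlier, and all numeric values except the 190 mm baseline are invented for illustration.

# Illustrative sketch: depth of one laser-pattern point from a rectified stereo
# pair with known baseline, using z = f * b / disparity. Values are assumed
# examples, not the calibration of the described apparatus.
f_pixels = 700.0     # focal length in pixels (assumed)
baseline_m = 0.19    # camera baseline in meters (190 mm, from the text)
cx = 320.0           # principal point x-coordinate in pixels (assumed)

x_left, x_right = 580.0, 137.0           # the same laser point in both images
disparity = x_left - x_right
z = f_pixels * baseline_m / disparity    # depth along the optical axis, ~0.30 m
x = (x_left - cx) * z / f_pixels         # lateral position in the left camera frame
print(f"triangulated point at roughly x = {x:.3f} m, z = {z:.3f} m")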
It is worth noting that, using the latter referenced method, additional positioning markers can be added to the scene while operating in this second mode. In this case, the system integrates the new markers online while acquiring the dense set of surface points. This optional stage 84 is represented in FIG. 4 along with the capture of a photogrammetric model of reference markers using high-resolution snapshots 80 and the capture of dense sets of 3D surface points along with reference markers for self-referencing 82. The result after step 80 is depicted in FIG. 5A, where the 3D model of the positioning markers is shown at 72 in a global coordinate system shown at 70. In FIG. 5B, a dense set of points obtained during laser scanning of the surface is shown at 74 along with the reflected crosshair laser pattern shown at 76.
FIG. 6A and FIG. 6B illustrate two situations: in FIG. 6A no coded reference markers are used, while in FIG. 6B a combination of coded 64 and non-coded 60 reference markers is used. FIGs. 6A and 6B further depict the projected laser pattern 58, which is a crosshair in this illustrated case. The crosshair is reflected from the object 62. In FIGs. 6A and 6B, a representation of the scanning apparatus is seen from the back. The laser pattern projector 20 projects the crosshair onto the object. This pattern is observed by the two basic cameras 16 along with the reference markers. A model of the reference markers has previously been built using the photogrammetric camera 12.
While illustrated in the block diagrams as groups of discrete components communicating with each other via distinct data signal connections, it will be understood by those skilled in the art that the illustrated embodiments may be provided by a combination of hardware and software components, with some components being implemented by a given function or operation of a hardware or software system, and many of the data paths illustrated being implemented by data communication within a computer application or operating system. The structure illustrated is thus provided for efficiency of teaching the described embodiment.
The embodiments described above are intended to be exemplary only. The scope of the invention is therefore intended to be limited solely by the appended claims.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01:As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refer to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Change of Address or Method of Correspondence Request Received 2020-01-17
Common Representative Appointed 2019-10-30
Common Representative Appointed 2019-10-30
Change of Address or Method of Correspondence Request Received 2019-08-14
Inactive: Correspondence - Transfer 2018-01-26
Appointment of Agent Request 2017-02-28
Revocation of Agent Request 2017-02-28
Grant by Issuance 2012-04-24
Inactive: Cover page published 2012-04-23
Pre-grant 2012-02-09
Inactive: Final fee received 2012-02-09
Notice of Allowance is Issued 2011-12-07
Letter Sent 2011-12-07
Notice of Allowance is Issued 2011-12-07
Inactive: Approved for allowance (AFA) 2011-12-01
Letter Sent 2011-11-15
Letter sent 2011-11-15
Advanced Examination Determined Compliant - paragraph 84(1)(a) of the Patent Rules 2011-11-15
Inactive: Advanced examination (SO) 2011-11-03
Request for Examination Received 2011-11-03
Amendment Received - Voluntary Amendment 2011-11-03
Request for Examination Requirements Determined Compliant 2011-11-03
Inactive: Advanced examination (SO) fee processed 2011-11-03
All Requirements for Examination Determined Compliant 2011-11-03
Application Published (Open to Public Inspection) 2011-06-02
Inactive: Cover page published 2011-06-01
Revocation of Agent Request 2010-06-07
Appointment of Agent Request 2010-06-07
Inactive: Office letter 2010-06-04
Inactive: Office letter 2010-06-04
Revocation of Agent Requirements Determined Compliant 2010-06-04
Appointment of Agent Requirements Determined Compliant 2010-06-04
Appointment of Agent Request 2010-06-01
Revocation of Agent Request 2010-06-01
Revocation of Agent Requirements Determined Compliant 2010-05-11
Inactive: Office letter 2010-05-11
Inactive: Office letter 2010-05-11
Appointment of Agent Requirements Determined Compliant 2010-05-11
Revocation of Agent Request 2010-04-30
Appointment of Agent Request 2010-04-30
Inactive: IPC assigned 2010-03-30
Inactive: IPC assigned 2010-03-30
Inactive: First IPC assigned 2010-03-30
Inactive: IPC assigned 2010-03-30
Inactive: Office letter 2010-03-09
Letter Sent 2010-03-09
Inactive: Declaration of entitlement - Formalities 2010-02-10
Inactive: Single transfer 2010-02-10
Correct Inventor Requirements Determined Compliant 2010-01-11
Inactive: Filing certificate - No RFE (English) 2010-01-11
Reinstatement Requirements Deemed Compliant for All Abandonment Reasons 2010-01-05
Inactive: Filing certificate - No RFE (English) 2009-12-29
Filing Requirements Determined Compliant 2009-12-29
Application Received - Regular National 2009-12-29

Abandonment History

There is no abandonment history.

Maintenance Fee

The last payment was received on 2011-01-05

Note: If the full payment has not been received on or before the date indicated, a further fee may be required, which may be one of the following:

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
CREAFORM INC.
Past Owners on Record
ANTOINE THOMAS CARON
DRAGAN TUBIC
ERIC ST-PIERRE
NICOLAS BEAUPRE
PATRICK HEBERT
PIERRE-LUC GAGNE
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


Document Description    Date (yyyy-mm-dd)    Number of pages    Size of Image (KB)
Description 2009-12-02 13 679
Claims 2009-12-02 6 232
Abstract 2009-12-02 1 21
Drawings 2009-12-02 8 203
Representative drawing 2011-05-05 1 24
Cover Page 2011-05-11 1 57
Claims 2011-11-03 5 198
Cover Page 2012-03-29 2 63
Filing Certificate (English) 2010-01-11 1 166
Courtesy - Certificate of registration (related document(s)) 2010-03-09 1 103
Acknowledgement of Request for Examination 2011-11-15 1 176
Commissioner's Notice - Application Found Allowable 2011-12-07 1 163
Correspondence 2009-12-29 1 18
Correspondence 2010-02-10 3 100
Correspondence 2010-03-09 1 15
Correspondence 2010-04-30 1 34
Correspondence 2010-05-11 1 31
Correspondence 2010-05-11 1 20
Correspondence 2010-06-01 2 78
Correspondence 2010-06-04 1 18
Correspondence 2010-06-04 1 14
Correspondence 2010-06-07 5 173
Fees 2011-01-05 1 202
Correspondence 2012-02-09 2 56