Patent 2340501 Summary

(12) Patent Application: (11) CA 2340501
(54) English Title: SYSTEM, METHOD, AND PROGRAM PRODUCT FOR AUTHENTICATING OR IDENTIFYING A SUBJECT THROUGH A SERIES OF CONTROLLED CHANGES TO BIOMETRICS OF THE SUBJECT
(54) French Title: SYSTEME, METHODE ET PROGICIEL D'AUTHENTIFICATION OU D'IDENTIFICATION D'UN SUJET PAR L'INTERMEDIAIRE D'UNE SERIE DE CHANGEMENTS CONTROLES APPORTES A LA BIOMETRIE DU SUJET
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • A61B 5/117 (2016.01)
  • A61B 5/1171 (2016.01)
  • G06F 17/40 (2006.01)
  • G06K 9/62 (2006.01)
(72) Inventors :
  • BOLLE, RUDOLF M. (United States of America)
  • DORAI, CHITRA (United States of America)
  • RATHA, NALINI K. (United States of America)
(73) Owners :
  • INTERNATIONAL BUSINESS MACHINES CORPORATION (United States of America)
(71) Applicants :
  • INTERNATIONAL BUSINESS MACHINES CORPORATION (United States of America)
(74) Agent: NA
(74) Associate agent: NA
(45) Issued:
(22) Filed Date: 2001-03-12
(41) Open to Public Inspection: 2001-09-28
Examination requested: 2003-08-12
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
09/546,834 United States of America 2000-03-28

Abstracts

English Abstract





This invention defines novel biometrics, called resultant biometrics. These resultant biometrics are a combination of traditional biometrics and some controlled change to the traditional biometric. That is, they are sequences of biometric signals over a short interval of time where the signals are modified according to some pattern. Resultant finger- or palm prints, for example, are consecutive print images where the subject exerts force, torque and/or rolling (controlled change) over an image acquisition interval of time. The physical way the subject distorts the images is the behavioral part of the resultant biometric; the finger or palm print is the physiological part of the resultant biometric. An undistorted print image, in combination with an expression of the distortion trajectory (which can be computed from the sequence of distorted print images), forms a more compact representation of the resultant fingerprint. A template representing the resultant print biometric is derived from the traditional template representing the finger or palm print plus a template representing the trajectory. Other traditional biometrics also lend themselves to temporal modification and are described in the invention.



Claims

Note: Claims are shown in the official language in which they were submitted.





The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:

1. A biometrics system comprising:
an acquisition device for acquiring a set of one or more biometrics from a subject over a time period along with a controlled change of one or more of the respective biometrics; and
a storage process for storing and associating the biometric and the respective controlled change, the combined biometric and the respective controlled change over a time period being a resultant biometric.

2. A system, as in claim 1, where the biometrics include any one or more of the following: a physiological biometric, a behavioral biometric, a fingerprint, a face, a palm print, an iris, a retina, a foot print, a gait, a signature, a key stroke pattern, and a voice.

3. A system, as in claim 1, where the controlled change is performed by the subject.

4. A system, as in claim 1, where the controlled change is induced by a mechanism external to the subject.

5. A system, as in claim 4, where the mechanism is any one or more of the following: a light change, a light frequency change, and a light intensity change.

6. A system, as in claim 1, where the controlled change includes any one or more of the following: a distortion to the biometric, a force, a pressure, a motion, a torque, a frequency change, a gesture, an energy change, a loudness, an accentuation, and a pattern.

7. A system, as in claim 1, where the biometric is a fingerprint and the controlled change is any one or more of the following: a finger motion, a finger torque, a finger pressure, and a finger force.




8. A system, as in claim 1, where the biometric is a face and the controlled change is a face motion.

9. A system, as in claim 1, where the biometric is a face and the controlled change is a face distortion.

10. A system, as in claim 1, where the biometric is a palm and the controlled change is a palm motion.

11. A system, as in claim 1, where the biometric is a voice and the controlled change is any one or more of the following: a loudness, a frequency, a pattern, and an intonation.

12. A system, as in claim 1, where the biometric is a gait and the controlled change is any one or more of the following: a stop pattern, a speed, a sway, a course, a carriage, a hop, a skip, and a stride.

13. A system, as in claim 1, where the biometric is a signature and the controlled change is any one or more of the following: a slant, a loop, a stretch, a size, and a spacing.

14. A method, performed by a biometrics system, comprising the steps of:
acquiring a set of one or more biometrics from a subject over a time period along with a controlled change of one or more of the respective biometrics; and
storing and associating the biometric and the respective controlled change, the combined biometric and the respective controlled change being a resultant biometric.

15. A computer system comprising:
means for acquiring a set of one or more biometrics from a subject over a time period along with a controlled change of one or more of the respective biometrics; and
means for storing and associating the biometric and the respective controlled change, the combined biometric and the respective controlled change being a resultant biometric.


16. A computer program product that performs the steps of:
acquiring a set of one or more biometrics from a subject over a time period along with a controlled change of one or more of the respective biometrics; and
storing and associating the biometric and the respective controlled change, the combined biometric and the respective controlled change being a resultant biometric.

Description

Note: Descriptions are shown in the official language in which they were submitted.



CA 02340501 2001-03-12
SYSTEM, METHOD, AND PROGRAM PRODUCT FOR AUTHENTICATING OR
IDENTIFYING A SUBJECT THROUGH A SERIES OF CONTROLLED CHANGES TO
BIOMETRICS OF THE SUBJECT
FIELD OF THE INVENTION
This invention relates to the field of biometrics, i.e., physiological or behavioral characteristics of a subject that more or less uniquely relate to the subject's identity. More specifically, this invention relates to a new type of biometric which is produced by a subject through a series of controlled changes to a traditional biometric.
BACKGROUND OF THE INVENTION
Fingerprints have been used for identifying persons in a semiautomatic fashion for at least fifty years for law enforcement purposes and have been used for several decades in automatic authentication applications for access control such as building access and computer login. Signature recognition for automatically authenticating a person's identity has been used for at least fifteen years, mainly for banking applications. In an automatic fingerprint or signature identification system, the first stage is the signal acquisition stage, where a subject's fingerprint or signature is acquired. There are several techniques to acquire fingerprints, including scanning an inked fingerprint and inkless techniques using optical, capacitive and other semiconductor-based sensing techniques. The acquired signal is processed and matched against a stored template that is a machine representation of the fingerprint. The image processing techniques typically locate ridges and valleys in the fingerprint and derive templates from the ridge and valley pattern of a fingerprint image.
Signatures, on the other hand, are typically sensed through the use of pressure-sensitive writing pads or with electro-magnetic writing recording devices. More advanced systems use special pens that compute the pen's velocity and acceleration. The recorded signal can be simply a list of (x, y) coordinates, in the case of static signature recognition, or the coordinates can be a function of time (x(t), y(t)) for dynamic signature recognition. The template representing a signature is more directly related to the acquired signal than a fingerprint template is. An example is a representation of a signature in terms of a set of strokes between extremes, where for each stroke the acceleration is encoded. For examples of signature authentication see

V. S. Nalwa, "Automatic on-line signature verification," Proceedings of the IEEE, pp. 215-239, Feb. 1997.

YOR9-2000-0158
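The dynamic-signature recording described above, coordinates as a function of time (x(t), y(t)), can be reduced to a simple temporal template. The sketch below is an illustrative simplification: the function name and the per-interval speed encoding are assumptions for this document, not the stroke-and-acceleration scheme of the cited work.

```python
def signature_template(trace):
    """Reduce a pen trace to per-interval speeds.

    trace: list of (t, x, y) samples recorded while signing.
    Returns one speed value per sampling interval -- a toy stand-in
    for the stroke-based templates the text describes.
    """
    template = []
    for (t1, x1, y1), (t2, x2, y2) in zip(trace, trace[1:]):
        dt = t2 - t1
        dist = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        template.append(dist / dt)
    return template
```

A static-signature template, by contrast, would discard the time stamps and keep only the (x, y) coordinates.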
Recently, biometrics such as fingerprints, signatures, faces and voices are increasingly being used for authenticating a user's identity, for example, for access to medical dossiers, ATM access, access to Internet services and other such applications.
With the rapid growth of the Internet, many new e-commerce and e-business applications are being developed and deployed. For example, retail purchasing and travel reservations over the Internet using a credit card are very common commercial applications. Today, users are recognized with a userID and password, for identification and authentication, respectively. Very soon, more secure methods for authentication and possibly identification involving biometrics, such as fingerprints, signatures, voice prints, iris images and face images, will be replacing these simple methods of identification. An automated biometrics system involves acquisition of a signal from the user that more or less uniquely identifies the user. For example, for fingerprint-based authentication a user's fingerprint needs to be scanned and some representation needs to be computed and stored. Authentication is then achieved by comparing the representation extracted from the user's newly acquired fingerprint image with a stored representation extracted from an image acquired at the time of enrollment. In a speaker verification system a user's speech signal is recorded and some representations computed and stored. Authentication is then achieved by comparing the representation extracted from a speech signal recorded at access or logon time with the stored representation. Similarly, for signature verification, a template is extracted from the digitized signature and compared to previously computed templates.
Biometrics are distinguished into two broad groups: behavioral and physiological biometrics. Physiological biometrics are the ones that are relatively constant over time, such as fingerprint and iris. Behavioral biometrics, on the other hand, are subject to possibly gradual change over time and/or more abrupt changes in short periods of time. Examples of these biometrics are signature, voice and face. (Face is often regarded as a physiological biometric since the basic features cannot be changed that easily; however, aging, haircuts, beard growth and facial expressions do change the global appearance of a face.) The field of the present invention relates to physiological and behavioral biometrics, and more particularly, the invention relates to behavioral changes, that is, a series of user-controlled changes, to physiological or behavioral biometrics that can be used as a new type of biometrics.
One of the main advantages of Internet-based commerce/business solutions is that they are accessible from remote, unattended locations including users' homes. Hence, the biometrics signal has to be acquired from a remote user in an unsupervised manner. So, a fingerprint or a palm-print reader, a signature digitizer or a camera for acquiring face or iris images is attached to the user's home computer. This, of course, opens up the possibility of fraudulent unauthorized system access attempts. Maliciously intended individuals or organizations may obtain biometrics signals from genuine users by intercepting them from the network or obtaining the signals from other applications where the user uses her/his biometrics. The recorded signals can then be reused for unknown, fraudulent purposes such as to impersonate a genuine, registered user of an Internet service. The simplest method is that a signal is acquired once and reused several times. Perturbations can be added to this previously acquired signal to generate a biometrics signal that looks "fresh." If the complete fingerprint or palm print is known to the perpetrator, a more sophisticated method would be to fabricate, from materials like silicone or latex, an artificial ("spoof") three-dimensional copy of the finger or palm. Finger- and palm-print images of genuine users can then be produced by impostors without much effort. A transaction server, an authentication server or some other computing device then has the burden of ensuring that the biometrics signal transmitted from a client is a current and live signal, and not a previously acquired or otherwise constructed or obtained signal. Using artificial body parts, many fingerprint and palm-print readers produce images that look very authentic to a lay person when the right material is used to fabricate these body parts. The images will, in many cases, also appear real to the image processing components of the authentication systems. Hence, it is very difficult to determine whether the static fingerprint or palm-print images are produced by a real finger or palm or by spoof copies. Other physiological biometrics suffer from the same limitations: the iris of a genuine user can be photographed and used for unauthorized access. A good iris recognition system detects the rapid fluctuations of the iris diameter, but even that phenomenon can be mimicked with iris image sequences. Similar methods can, of course, be used for the face biometric too.
PROBLEMS WITH THE PRIOR ART
Fingerprints and, to a lesser extent, palm prints are used more and more for authenticating a user's identity for access to medical dossiers, ATM access and other such applications. A problem with this prior art method of identification is that it is possible to fabricate three-dimensional spoof fingerprints or palm prints. Silicone, latex, urethane and other materials can be used to fabricate these artificial body parts, and many image acquisition devices simply produce a realistic looking impression of the ridges on the artificial body parts which is hard to distinguish from a real impression. A contributing factor is that a fingerprint or palm-print impression obtained is the static depiction of the print at some given instant in time. The fingerprint is not a function of time. Similar problems exist with biometrics like face and iris. A problem here is that static two-dimensional or three-dimensional spoof copies of the biometrics can be fabricated and used to spoof biometric security systems since these biometrics are not functions of time.
Another problem with the prior art is that only one static fingerprint or palm-print image is collected during acquisition of the biometrics signal. This instant image may be a distorted depiction of the ridges and valleys on the finger or palm because the user exerts force or torque with the finger with respect to the image acquisition device (fingerprint or palm-print reader). A problem is that, without collecting more than one image or modifying the mechanics of the sensor, it cannot be detected whether the image is acquired without distortion. An additional problem with the prior art is that there is only one choice for the image that can be used for person identification. Of course, for non-contact biometrics, like face and iris, distortion of the biometrics pattern cannot be detected or is very hard to detect.


Allen Pu and Demetri Psaltis
User identification through sequential input of fingerprints
US Patent Number 5933515, August 1999.
The method presented by Pu and Psaltis in their patent US 5933515 uses multiple fingers in a sequence which the user remembers and which is known to the user only. If the fingers are indexed, say, from left to right as finger 0 through finger 9, the sequence is nothing more than a PIN. If one would consider the sequence plus the fingerprint images as a single biometric, the sequence is a changeable and non-static part of the biometric. However, it is not a series of controlled changes to an existing biometric: the pattern of each of the fingers is not changed; rather, the pattern of the sequence can be changed. A problem is that anyone can watch the fingerprint sequence, probably more easily than observing PIN entry because fingerprint entry is a slower process. Moreover, it requires storing each of the fingerprints of the subject for comparison.
Another problem with the prior art is that in order to assure authenticity of the biometrics signal, the sensor (fingerprint or palm-print reader, face imaging camera) needs to have embedded computational resources for body part authentication and sensor authentication. Body part authentication is commonly achieved by pulse and body temperature measurement. Sensor authentication can be achieved with two-directional communication between the sensor and the authentication server in the form of a challenge-and-response question session.
A potentially big problem with prior art palm and fingerprints is that if the user somehow loses a fingerprint or palm print impression, or the template representing the print, and this ends up in the wrong hands, the print is compromised forever since one cannot change prints. Prints of other fingers can then be used, but that can only be done a few more times.
A problem with prior art systems that use static fingerprints is that there is no additional information associated with the fingerprint which can be used for its additional discriminating power. That is, individuals that have fingerprints that are close in appearance can be confused because the fingerprints are static and no additional information is available to distinguish between these prints.

Traditional fingerprint databases may be searched by first filtering on fingerprint type (loop, whorl, ...). A problem with this prior art is that there are few fingerprint classes because the fingerprint images are static snapshots in time and no additional information is associated with the fingerprints.
A final problem with any of the prior art biometrics is that they are not backward compatible with other biometrics. The use of, say, faces for authentication is not backward compatible with fingerprint databases.
OBJECTS OF THE INVENTION
An object of the invention is a new type of biometrics which is produced by a subject through a series of controlled changes to an existing biometric.

Another object of the invention is a biometric that is modified through a series of user-controlled changes, which has both a physiological and a temporal characteristic.

Another object of the invention is a biometric that is modified through a series of user-controlled changes, which has a physiological, physical (e.g., force, torque, linear motion, rotation), and/or temporal characteristic.

An object of this invention is an efficient way to modify compromised biometrics.

An object of this invention is a biometric that is modified through a series of user-controlled changes, a combination of a traditional biometric with a user-selected behavioral biometric.

A further object of this invention is a biometric that is harder to produce with spoof body parts.


SUMMARY OF THE INVENTION
The present invention achieves these and other objectives by defining a new class of biometrics, called resultant biometrics: biometrics that are modified through a series of user-controlled changes.

A biometric is a more or less unique characteristic associated with a person. There exist physiological and behavioral biometrics. Physiological biometrics are characteristics that do not change, or change very little, over time, while behavioral biometrics are characteristics which may change over time, and may change abruptly, because they depend on a person's mood, mental or physical state. Examples of physiological biometrics are fingerprint, iris and face; examples of behavioral biometrics are voice and signature.

The biometrics introduced in this invention are a combination of physiological or behavioral biometrics signals produced by a subject by modifying (through a controlled change over a time period) the appearance of the physiological or behavioral biometrics using behavioral biometrics and/or physical elements. In a preferred embodiment, for resultant fingerprints and palm prints, the appearance of the print is changed by changing force, torque and roll while scanning the prints. This results in a sequence of fingerprint or palm print images where the finger or palm, and hence the impressions of the prints, are continuously, elastically deformed according to the force, torque and roll, i.e., physical elements, and/or behavioral biometrics (e.g., the motion of the signature or gesture).
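A minimal sketch of the stored association this summary describes, a frame sequence plus the controlled change captured alongside it; the class and field names are illustrative, not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ResultantBiometric:
    """A resultant biometric: consecutive biometric signals plus the
    controlled change (here force, torque, roll) over the capture interval."""
    frames: List[bytes]                           # consecutive print images
    trajectory: List[Tuple[float, float, float]]  # (force, torque, roll) per frame

    def is_consistent(self) -> bool:
        # every frame should have a matching sample of the controlled change
        return len(self.frames) == len(self.trajectory)
```

The behavioral component (the trajectory) and the physiological component (the frames) are stored and matched together, which is what distinguishes a resultant biometric from a single static impression.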
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 gives prior art examples of traditional biometrics.
Figure 2 shows a block diagram of an automated biometrics system for authentication (Figure 2A) and a block diagram of an automated biometrics system for identification (Figure 2B).
Figure 3 shows various possibilities for combining two biometrics at the system level, where Fig. 3A is combining through an ANDing component, Fig. 3B is combining through an ORing component, Fig. 3C is combining through ADDing and Fig. 3D is sequential combining.
Figure 4 is a generic block diagram conceptually showing the combining of one biometric with a user action (another biometric) to obtain a biometric that is modified through a series of user-controlled changes.
Figure 5 is an example of a resultant fingerprint biometric where the user can rotate the finger on the scanner according to a pattern.
Figure 6 is an example of a resultant fingerprint biometric where the user has four degrees of freedom to move the finger on the scanner.
Figure 7 is an example of resultant palm-print image sequence generation.
Figure 8 is an example of resultant face image sequence generation.
Figure 9 shows a block diagram of the behavioral component extraction of a resultant biometric. As an example, the rotation extraction of resultant rotated fingerprints of Fig. 5 is shown.
Figure 10 shows the local flow computation on a block-by-block basis from the input resultant fingerprint image sequence.
Figure 11 explains the computation of the curl or the spin of the finger as a function of time, which is the behavioral component of the resultant fingerprint.
DETAILED DESCRIPTION OF THE INVENTION
This invention introduces a new biometric, called a resultant biometric. A resultant biometric is a sequence of consecutive physiological or behavioral biometrics signals recorded at some sample rate, producing the first biometrics signal, plus a second biometric, the behavioral biometric, which is the way the physiological or behavioral biometric is transformed over some time interval. This transformation is the result of a series of user-controlled changes to the first biometric.
Traditional biometrics, such as fingerprints, have been used for (automatic) authentication and identification purposes for several decades. Signatures have been accepted as a legally binding proof of identity, and automated signature authentication/verification methods have been available for at least 20 years. Figure 1 gives examples of these biometrics. On the top-left, a signature 110 is shown and on the top-right a fingerprint impression 130 is shown. The bottom-left shows a voice print; the bottom-right an iris pattern.


Biometrics can be used for automatic authentication or identification of a
(human) subject. Typically,
the subject is enrolled by offering a sample biometric when opening, say, a
bank account or
subscribing to an Internet service. From this sample biometrics, a template is
derived that is stored
and used for matching purposes at the time the user wishes to access the
account or service. In the
present preferred embodiment, a template for a resultant biometric is a
combination of a traditional
template of the biometrics and a template describing the changing appearance
of this biometric over
time.
Resultant fingerprints and palm prints are described in further detail. A finger or palm print template is derived from a selected impression in the sequence where there is no force, torque or rolling exerted. The template of the trajectory is a quantitative description of this motion trajectory over the period of time of the resultant fingerprint. Matching of two templates, in turn, is a combination of traditional matching of fingerprint templates plus dynamic string matching of the trajectories, similar to signature matching. This string matching is well known in the prior art. Resultant fingerprints sensed while the user only exerts torque are described in greater detail.
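The dynamic string matching of the trajectories can be illustrated with dynamic time warping, a common choice for comparing sequences performed at different speeds; the patent does not name a specific algorithm, so this is an assumed stand-in:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D trajectories
    (e.g., torque over time), tolerant of differences in pacing."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Two recordings of the same controlled change performed at slightly different speeds then score near zero, while a different trajectory scores high.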
A biometric more or less uniquely determines a person's identity; that is, given a biometric signal, the signal is either associated with one unique person or narrows down significantly the list of people with whom this biometric is associated. Fingerprints are an excellent biometric, since never in history have two people with the same fingerprints been found; on the other hand, biometrics signals such as shoe size and weight are poor biometrics signals since these signals obviously have little discriminatory value. Biometrics can be divided up into behavioral biometrics and physiological biometrics. Behavioral biometrics depend on a person's physical and mental state and are subject to change, possibly rapid change, over time. Behavioral biometrics include signatures 110 and voice prints 120 (see Fig. 1). Physiological biometrics, on the other hand, are subject to much less variability. For a fingerprint, the basic flow structure of ridges and valleys, see the fingerprint 130 in Fig. 1, is essentially unchanged over a person's life span. As an example of another biometric, the circular texture of a subject's iris, 140 in Fig. 1, is believed to be even less variable over a subject's life span. Hence, there exist behavioral biometrics, e.g., 110 and 120, which to a certain extent are under the control of the subjects, and there exist physiological biometrics whose appearance cannot be influenced (the iris 140) or can be influenced very little (the fingerprint 130). The signature and voice print on the left are behavioral biometrics; the fingerprint and iris image on the right are physiological biometrics.
Referring now to Fig. 2A. A typical, legacy prior-art automatic fingerprint authentication system has a fingerprint image (biometrics signal) as input 210 to the biometrics matching system. This system consists of three other stages 215, 220 and 225, comprising: signal processing 215 for feature extraction, template extraction 220 from the features, and template matching 225. Along with the biometrics signal 210, an identifier 212 of the subject is input to the matching system. During the template matching stage 225, the template associated with this particular identifier is retrieved from some database of templates 230 indexed by identities (identifiers). Depending on whether there is a Match/No Match between the extracted template 220 and the retrieved template from database 230, a 'Yes/No' 240 answer is the output of the matching system. Matching is typically based on a similarity measure: if the measure is significantly large, the answer is 'Yes,' otherwise the answer is 'No.' The following reference describes examples of the state of the prior art:

N. K. Ratha, S. Chen and A. K. Jain, "Adaptive flow orientation based feature extraction in fingerprint images," Pattern Recognition, vol. 28, no. 11, pp. 1657-1672, Nov. 1995.
Note that system 200 is not limited to fingerprint authentication; this system architecture is valid for any biometric. The biometric signal 210 that is input to the system can be acquired either local to the application on the client or remotely, with the matching application running on some server. Hence architecture 200 applies to all biometrics and networked or non-networked applications.
System 200 in Fig. 2A is an authentication system; system 250 in Fig. 2B is an identification system. A typical, legacy prior-art automatic biometrics signal identification system takes only a biometric signal 210 as input. Again, the system consists of three other stages 215, 220 and 225, comprising: signal processing 215 for feature extraction, template extraction 220 from the features, and template matching 225. However, in the case of an identification system 250, only a biometric signal 210 is input to the system. During the template matching stage 225, the extracted template is matched to all (template, identifier) pairs stored in database 230. If there exists a match between the extracted template 220 and a template associated with an identity in database 230, this identity is the output 255 of the identification system 250. If no match can be found in database 230, the output identity 255 could be set to NIL. Again, the biometric signal 210 can be acquired either local to the application on the client or remotely, with the matching application running on some server. Hence architecture 250 applies to networked or non-networked applications.
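The identification flow of Fig. 2B differs only in that the extracted template is compared against every stored pair; a hedged sketch with illustrative names, rendering NIL as `None`:

```python
def identify(signal, templates, extract, similarity, threshold):
    """Identification (Fig. 2B): return the identity whose enrolled
    template best matches the fresh signal, or None (NIL) when no
    match clears the threshold."""
    fresh = extract(signal)
    best_id, best_score = None, threshold
    for identifier, enrolled in templates.items():
        score = similarity(fresh, enrolled)
        if score >= best_score:
            best_id, best_score = identifier, score
    return best_id
```

Unlike authentication, the cost grows with the size of database 230, since every enrolled template is scored.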
Biometric signals can be combined (integrated) at the system level and at the
subject level. The latter
is the object of this invention. The former is summarized in Fig. 3 for the
purposes of comparing the
different methods and for designing decision methods for integrated subject-
level biometrics
(resultant biometrics). Four possibilities for combining (integrating) two
biometrics are shown:
~ 5 Combining through ANDing 210 (Fig. 3A), Combining through ORing 220 (Fig.
3B), Combining
through ADDing 230 (Fig. 3C) and serial or sequential combining 240 (Fig 3D).
Two biometrics BX
(250) and By (260) of a subject Zare used for authentication as shown in Fig.
3. However, more than
two biometrics of a subject can be combined in a straightforward fashion.
These biometrics can be
the same, e.g., two fingerprints, or they can be different biometrics, e.g.,
fingerprint and signature.
2o The corresponding matchers for the biometrics Bx and By, are matcher A 202
and matcher B 204 in
Fig. 3, respectively. These matchers compare the template of the input
biometrics 250 and 260 with
stored templates and either give a 'Yes/No' 214 answer as in systems 210 and
220 or score values,
S1 (231) and S2 (233), as in systems 230 and 240.
System 210, combining through ANDing, takes the two 'Yes/No' answers of
matcher A 202 and
matcher B 204 and combines the result through the AND gate 212. Hence, only if
both matchers 202
and 204 agree, the 'Yes/No' output 216 of system 210 is 'Yes' (the biometrics
both match and
subject Z is authenticated) otherwise the output 216 is 'No' (one or both of
the biometrics do not
match and subject Z is rejected). System 220, combining through ORing, takes
the two 'Yes/No'
answers of matchers A 202 and B 204 and combines the result through the OR
gate 222. Hence, if
one of the matchers' 202 and 204 'Yes/No' output 214 is 'Yes,' the 'Yes/No'
output 216 of system
220 is 'Yes' (one or both of the biometrics match and subject Z is
authenticated). Only if both
'Yes/No' outputs 214 of the matchers 202 and 204 are 'No' is the 'Yes/No' output
216 of system 220
'No' (neither biometric matches and subject Z is rejected).
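As a minimal sketch (an illustration, not taken from the patent's text), the decision-level combining of gates 212 and 222 reduces to boolean operations on the two matchers' 'Yes/No' outputs:

```python
def combine_and(match_a: bool, match_b: bool) -> bool:
    """System 210: output 216 is 'Yes' only if BOTH matchers 202 and 204 say 'Yes'."""
    return match_a and match_b

def combine_or(match_a: bool, match_b: bool) -> bool:
    """System 220: output 216 is 'Yes' if EITHER matcher 202 or 204 says 'Yes'."""
    return match_a or match_b

# ANDing makes acceptance stricter (fewer false accepts); ORing makes it
# more forgiving (fewer false rejects).
assert combine_and(True, False) is False
assert combine_or(True, False) is True
```

The choice between the two gates is thus an application-level trade-off between false accepts and false rejects.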
For system 230, combining through ADDing, matcher A 202 and matcher B 204
produce matching
scores S1 (231) and S2 (233), respectively. Score S1 expresses how similar
the template extracted from
biometrics Bx (250) is to the template stored in matcher A 202, while score S2
expresses how similar
the template extracted from biometrics By (260) is to the template stored
in matcher B 204. The
ADDer 232 gives as output the sum of the scores 231 and 233, S1 + S2. In 234,
this sum is compared
to a decision threshold T; if S1 + S2 > T (236), the output is 'Yes' and the
subject Z with biometrics
Bx and By is authenticated, otherwise the output is 'No' (238) and the subject
is rejected.
System 240 in Fig. 3 combines the biometrics Bx (250) and By (260) of a
subject Z sequentially. The
first biometric Bx (250) is matched against the template stored in matcher A
(202), resulting in
matching score S1 (231). The resulting matching score is compared to threshold
T1 (244), and when
test 244 fails and the output is 'No' (238), the subject Z is rejected.
Otherwise biometric By (260)
is matched against the template stored in matcher B (204). The output score S2
(233) of this matcher
is compared to threshold T2 (246). If the output is 'Yes,' i.e., S2 > T2
(236), subject Z is authenticated.
Otherwise, when the output is 'No' (238), subject Z is rejected.
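The score-based systems 230 (ADDing) and 240 (sequential) can be sketched the same way; the callables passed to the sequential version stand in for matchers A and B, so that matcher B only runs when test 244 passes. Names and threshold values here are illustrative assumptions, not from the patent:

```python
def combine_add(s1: float, s2: float, t: float) -> bool:
    """System 230: ADDer 232 sums the scores; test 234 authenticates when S1 + S2 > T."""
    return (s1 + s2) > t

def combine_sequential(matcher_a, matcher_b, t1: float, t2: float) -> bool:
    """System 240: match Bx first (test 244, S1 > T1); only then match By (test 246, S2 > T2)."""
    if matcher_a() <= t1:
        return False            # test 244 fails: reject without invoking matcher B
    return matcher_b() > t2     # test 246: S2 > T2 authenticates

# The lazy second call is the point of sequential combining: a rejected
# first biometric never triggers the (possibly expensive) second matcher.
```

A distinction worth noting: ADDing needs both scores before deciding, while sequential combining can reject early on the first score alone.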
Figure 4 is a generic flow diagram for combining a biometric with a user
action, i.e., combining
biometrics at the subject level. The user action, just like the movement of a
pen to produce a
signature, is the second, behavioral biometric. The user 410 offers a
traditional biometric 420 for
authentication or identification purposes. Such a biometric could be a
fingerprint, iris or face.
However, rather than holding the biometric still, as in the case of
fingerprint or face, or keeping the
eyes open, as in case of iris recognition, the user performs some specific
action 430, a(t) with the
biometrics. This action is performed over time 432, from time 0 (434) to some
time T (436). Hence,
the action a(t) is some one-dimensional function of time 430 and acts upon the
traditional biometric
420. Note that this biometric is the actual biometric of user 410 and not a
machine readable
biometrics signal (i.e., in the case of fingerprints, it is the three-
dimensional finger with the print on
it). The constraints on the action 430 are specified, but within these
constraints the user 410
can define the action. (For example, the constraints for producing a signature
are that the user can
move the pen over the paper in the x- and y-directions but cannot move the pen
in the z-direction.) That is, the
action 430 in some sense transforms the biometric of the user over time. It is
this transformed
biometric 450 that is input to the biometric signal recording device 460. The
output 470 of this
device is a sequence (series) of individually transformed biometrics signals
B(t) 480 from time 0
(434) to some time T (436). In the case of fingerprints, these are fingerprint
images, in the case of
face, these are face images. This output sequence 470, is the input 485 to
some extraction algorithm
490. The extraction algorithm computes from the sequence of transformed
biometrics the pair (a'(t),
B), 495, which is itself a biometric. The function a'(t) is some behavioral
way of transforming
biometric B over a time interval [0, T] and is related to the function a(t)
which is chosen by the user
(very much like a user would select a signature). The biometric B can be
computed from the pair
(a'(t), B): that is, where a(t) 430 is zero, where there is no action of the
user, the output 470 is an
undistorted digitization of biometric 420. In general, it can be computed
where in the signal 480 the
biometric 420 is not distorted.
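That last observation can be sketched directly: given the recovered behavioral function a'(t) and the signal sequence B(t) 480, the frames where the action is (near) zero are undistorted digitizations of the underlying biometric 420. The tolerance eps and the placeholder frame labels below are assumptions for illustration:

```python
def undistorted_frames(signals, action, eps=1e-3):
    """Return the signals B(t) at times where the recovered action a'(t)
    is (near) zero, i.e., where the biometric is not distorted."""
    return [b for b, a in zip(signals, action) if abs(a) < eps]

frames = ["B0", "B1", "B2", "B3"]   # placeholder biometric signals B(t)
action = [0.0, 0.4, -0.2, 0.0]      # recovered behavioral component a'(t)
assert undistorted_frames(frames, action) == ["B0", "B3"]
```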
Refer now to Fig. 5. This figure is an example of a resultant fingerprint
biometric where the user can
rotate the finger on the fingerprint reader 510 (without sliding over the
glass platen). This rotation
can be performed according to some user defined angle a as a function of time
a(t). An example of
producing a resultant fingerprint is given in Fig. 5. The user puts the finger
540 on the fingerprint
reader 510 in hand position 520. Then from time 0 (434) to time T (436), the user rotates finger 540
user rotates finger 540
over the glass platen of fingerprint reader 510 according to some angle a as a
function of time a(t).
The rotation takes place in the horizontal plane, the plane parallel to the
glass platen of the
fingerprint reader. The rotation function in this case is the behavioral part
of the resultant fingerprint
and is defined by the user. (If this portion of the resultant biometric is
compromised, the user can
redefine this behavioral part of the resultant fingerprint.) First the user
rotates by angle 550 to the
left, to the hand position 525. Then the user rotates by angle 555 to the
right, resulting in final hand
position 530. During this operation over time interval [0, T], the fingerprint
reader has as output 470
a series of transformed (distorted) fingerprint images. This output 470 is a
sequence of transformed
biometrics 480 (fingerprints), as in Fig. 4, which are the input to the
extraction algorithm 490 (Fig.
4). This algorithm computes, given the output 470, the angle a as a function
of time a(t) 560 over
the time interval 0 (434) to time T (436). The resultant fingerprint in this
case is (a(t), F), with F the
undistorted fingerprint image. The undistorted fingerprint image is found at
times 434, 570 and 436
where the rotation angle a is zero. A preferred method for extracting the
rotation angles from the
distorted fingerprint images is described in Figs. 9-11.
Figure 6 is an example of a resultant fingerprint biometric where the user has
four degrees of
freedom, instead of one degree of freedom in Fig. 5, to move the finger on the
scanner. Again, as in
Fig. 5, the user has the ability to rotate the finger 540 around the z-axis
(the axis perpendicular to the
glass platen of the fingerprint reader 510). This is depicted by rotation a,
first along 550 to the left
and then along 555 to the right. This brings the hand position from 520 to 525
to the final hand
position 530. Also, at any given angle a, the user can perform a rotation β
610 around the axis 620
of the finger. Finally, the user can exert a force f 630 parallel to the glass
platen of fingerprint reader
510. This force can be constrained to be only along the direction of the finger
632, or can be
unconstrained 634. In the former case, the number of degrees of freedom for
moving the finger (without sliding
over the glass platen) is three; in the latter there are four degrees of
freedom. The angles a and β plus
force f can be combined and referred to as motion m. During the user
operations over time interval
[0, T], the fingerprint reader has as output 470 a sequence of transformed
(distorted) fingerprint
images. This output 470 is a sequence of transformed biometrics 480
(fingerprints), as in Fig. 4,
which are the input to an extraction algorithm 490 (Fig. 4). This algorithm
computes, given the
output 470, the angles a and β as a function of time over the interval 0
(434) to time T (436). For
example, the function for the angle a, a(t) is the function 560 of Fig. 5.
Moreover, the algorithm
computes the force 630. In the case of force constrained to be along the
finger direction 632, f(t) will
be a one-dimensional function. For the case that the force may be exerted
along the glass platen of
reader 510 in any direction, f(t) will be a two-dimensional function. The
force will then have a
2o component in the x-direction and a component in the y-direction. The
resultant fingerprint for this
case is (a(t), /3(t), f(t) F) or (m (t), F) with F the undistorted fingerprint
image. The undistorted
fingerprint image is found at times 434, 570 and 436 where the reconstructed
motion m (690) is zero.
Figure 7 gives an example of the same principle as fingerprints for palm
prints. The palm print reader
710 with glass platen 720 can, for example, be mounted next to a door. Only
authorized users with
matching palm print templates will be allowed access. The user will put
his/her hand 730 on the palm
print reader platen 720. As with the resultant fingerprints of Figs. 6 and 7,
the user will not keep the
palm biometric still but rather make movements with the palm. In the case of
Fig. 7, rotation of the
palm around the axis perpendicular to the glass platen is the behavioral part
of the resultant
palm-print biometric. The user could, for instance, rotate the hand to the
right 740, followed by a
rotation of the hand to the left 744, followed by a rotation of the hand to
the right 748 again. As in
Fig. 5, during these operations over some time interval [0, T], the palm print
reader has as output a
sequence of transformed (distorted) palm print images. This output is a
sequence of transformed
biometrics 480 (palm prints), as in Fig. 4, which are the input to an
extraction algorithm 490 as in
Fig. 4. The algorithm computes, given the output of palm print reader 710, the
palm rotation angle
a as a function of time a(t) 560 over the time interval 0 (434) to time T
(436). The resultant palm
print in this case will be (a(t), P), with P the undistorted palm print image.
The undistorted palm
print image is found at times 434, 570 and 436 where the rotation angle a is
zero.
Figure 8 describes a facial resultant biometric. Here subject 800 is posing in
front of a camera to be
identified or authenticated through both recognition of the physiological face
biometric plus an
additional behavioral component. This behavioral component is introduced by
head motion of the
subject. This motion produces a sequence of face images as a function of time,
Face-Image(t). When
the subject's face is in canonical position, the head is embedded in
coordinate system 805 with the
Y-Axis 810 along the length of the head. The X-axis 820 is parallel to the
line connecting the ears,
while the Z-Axis 830 is parallel to the perpendicular to the frontal plane of
the face. The subject now
can generate a resultant biometric, Face-Image(t), by panning the face around
the Y-Axis, resulting
in a pan 840 as a function of time, Pan(t). The subject further can tilt 850
the face by bending the
head in the plane 860 that is spanned by the Y-Axis 810 and the pan
direction. This tilting 850 results
in another function of time, Tilt(t). Hence, the resultant biometric in this
case is a face image at some
time t0, Face-Image(t0), a frontal depiction of the face, plus the pan and
tilt, Pan(t) and Tilt(t),
respectively. The face images in the sequence are mathematical transformations
of the image of the
face. Other distortions of the face image through other means are envisioned
by the present
invention.
In Fig. 9, a block diagram 900 of a generic process for extracting the
behavioral component from a
resultant biometric is given. The input 910 is a sequence of biometric signals
B(t). In block 920,
subsequent biometric signals, B(t+1) and B(t), are processed through inter-
signal analysis. Block
930 uses this analysis to extract the change, a(t+1) - a(t), in the
behavioral component. In turn, this
gives the output a(t) as a function of time 940, where a(t) is the behavioral
component of the resultant
biometric B(t). Added in Fig. 9 are the specific steps (inter-image flow
analysis and affine motion
parameter estimation) for estimating the finger rotation from a sequence of
distorted fingerprint
images produced as in Fig. 5. These are further detailed in Figs. 10 and 11.
Rotation from one fingerprint image to the next can be estimated using the
steps illustrated in Fig.
10. The images, B(t) and B(t+1), 1010 and 1015, are divided up into 16 x 16
blocks 1020, 1022,
1024, ..., 1028 as given by the MPEG compression standard. Given a fingerprint
image sequence
B(t), of which two images (1010 and 1015) are shown in Fig. 10, the inter-
image flow (u, v) 1040
for each block (of size 16 x 16) 1030 present in an image is computed. This
represents the motion
that may be present in any image B(t) 1010 with respect to its immediate next
image B(t+1) 1015
in the sequence. A flow characterization [u(x,y), v(x,y)] 1050 as a function
of (x, y) 1060 and t 1070
of an image sequence is then a uniform motion representation amenable for
consistent interpretation.
This motion representation 1050 can be computed from the raw motion
vectors encoded in the
MPEG-1 or MPEG-2 image sequences. If the input is uncompressed, the flow field
can be estimated
using motion estimation techniques known in the prior art.
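A naive exhaustive block-matching sketch of the per-block flow (u, v) 1040 described above is given below. The block size of 16 matches the MPEG macroblocks the text refers to; the search radius and the sum-of-absolute-differences criterion are assumptions for illustration. Real deployments would reuse the MPEG motion vectors or a hierarchical estimator, as the cited references discuss:

```python
import numpy as np

def block_flow(img_t, img_t1, block=16, radius=4):
    """Estimate the inter-image flow (u, v) for each 16 x 16 block of img_t
    by exhaustive sum-of-absolute-differences search in img_t1."""
    h, w = img_t.shape
    flow = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = img_t[by:by + block, bx:bx + block].astype(np.int64)
            best_sad, best_uv = None, (0, 0)
            for v in range(-radius, radius + 1):
                for u in range(-radius, radius + 1):
                    y, x = by + v, bx + u
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = img_t1[y:y + block, x:x + block].astype(np.int64)
                        sad = int(np.abs(ref - cand).sum())
                        if best_sad is None or sad < best_sad:
                            best_sad, best_uv = sad, (u, v)
            flow[(bx, by)] = best_uv  # motion of this block from B(t) to B(t+1)
    return flow

# A block whose content shifted 2 pixels to the right between B(t) and
# B(t+1) yields (u, v) = (2, 0).
```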
The following references describe the state of the prior art in MPEG
compression, an example of
prior art optical flow estimation in image sequences, and an example of
prior art of directly
extracting flow from MPEG-compressed image sequences respectively:
B.G. Haskell, A. Puri and A.N. Netravali,
Digital Video: An Introduction to MPEG-2,
Chapman and Hall, 1997.
J. Bergen, P. Anandan, K. Hanna and R. Hingorani,
Hierarchical model-based motion estimation,
Second European Conference on Computer Vision, pp. 237-252, 1992.
Chitra Dorai and Vikrant Kobla,
Extracting Motion Annotations from MPEG-2 Compressed Video for HDTV Content
Management Applications,
IEEE International Conference on Multimedia Computing and Systems, pp. 673-678,
1999.
Refer now to Fig. 11. By examining the flow [u(x,y), v(x,y)] 1050 in the
blocks 1020, 1022, 1024,...,
1028, a largest connected component of zero-motion blocks, pictured by pivotal
region 1110 in Fig.
11 is determined. Further analysis is performed on the flow around this
region. Using the flow
computed for each image in the given image sequence, motion parameters from
the fingerprint region
are computed by imposing an affine motion model on the image-to-image flow and
sampling the
non-zero motion blocks radially around the bounding box 1120 of region 1110.
Affine motion A
1130 can transform shape 1140 into shape 1145 in Fig. 11B and quantifies
translation 1150, rotation
1152 and shear 1154 due to image flow. Six parameters, a1, ..., a6, are estimated
in this process, where
a1 and a4 correspond to translation, a3 and a5 correspond to rotation, and a2
and a6 correspond to shear.
These parameters are estimated for each sampling around bounding box 1120.
Average curl is
computed in each frame t, C(t) = -a3 + a5. The curl in each frame
quantitatively provides the extent
of rotation, or the spin of the finger skin around the pivotal region. That
is, an expression C(t) of the
behavioral component of the resultant fingerprint computed from flow vectors
[u(x,y), v(x,y)] 1050
is obtained. The magnitude of the average translation vector T(t) = (a1, a4)
of the frame is also
computed.
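A least-squares sketch of the affine fit and the curl computation follows, under the usual affine flow parameterization u = a1 + a2*x + a3*y, v = a4 + a5*x + a6*y implied by the translation/rotation/shear assignments above. The sampling around bounding box 1120 is assumed to have already produced the (x, y) points and their flow vectors:

```python
import numpy as np

def affine_params(points, flows):
    """Least-squares fit of the affine flow model
    u = a1 + a2*x + a3*y,  v = a4 + a5*x + a6*y.
    points: (N, 2) array of (x, y); flows: (N, 2) array of (u, v)."""
    x, y = points[:, 0], points[:, 1]
    design = np.column_stack([np.ones_like(x), x, y])
    a1, a2, a3 = np.linalg.lstsq(design, flows[:, 0], rcond=None)[0]
    a4, a5, a6 = np.linalg.lstsq(design, flows[:, 1], rcond=None)[0]
    return a1, a2, a3, a4, a5, a6

def curl(params):
    """C(t) = -a3 + a5: the spin of the finger skin around the pivotal region."""
    a1, a2, a3, a4, a5, a6 = params
    return -a3 + a5

# Sanity check: a pure rotation by a small angle w about the origin has flow
# u = -w*y, v = w*x, so a3 = -w, a5 = w, and the curl is 2*w.
```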
For all the resultant biometrics discussed and envisioned, we have a
traditional behavioral or
physiological biometric. For representation (template) purposes and for
matching purposes of that
part of resultant biometrics, these traditional biometrics are well understood
in the prior art. (See the
above Ratha, Chen and Jain reference for fingerprints.) For the other part of
the resultant biometrics,
the behavioral part, we are left with some one-dimensional or higher-
dimensional function a(t) of
time, a user action. Matching this part amounts to matching this function a(t)
with a stored template
a'(t). Such matching is again well-understood in the prior art and is
routinely done in the area of
signature verification. The following reference gives examples of approaches
for matching.
V. S. Nalwa, "Automatic on-line signature verification," Proceedings of the IEEE,
pp. 215-239, Feb. 1997.
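As an illustration of such matching (dynamic time warping is a standard tool in on-line signature verification, though the patent does not prescribe a specific method), the recovered one-dimensional behavioral function a(t) can be compared to the stored template a'(t) while allowing the user's action to vary in speed:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences, tolerating
    differences in the speed at which the user performs the action."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

template = [0.0, 0.5, 1.0, 0.5, 0.0]
observed = [0.0, 0.5, 0.5, 1.0, 0.5, 0.0]   # same action, slower in the middle
assert dtw_distance(template, observed) == 0.0
```

The resulting distance would then be thresholded, or converted into a score, exactly as for any other matcher output in Fig. 3.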
Now the resultant biometric, after matching with a stored template, has either
two 'Yes/No' (214 in
Fig. 3) answers or two scores S1 and S2 (231 and 233 in Fig. 3). Any of the
methods for combining
the two biometrics discussed in Fig. 3 can be used to combine the traditional
and user-defined
biometrics of a resultant biometric to arrive at a matching decision.

Administrative Status

Title Date
Forecasted Issue Date Unavailable
(22) Filed 2001-03-12
(41) Open to Public Inspection 2001-09-28
Examination Requested 2003-08-12
Dead Application 2006-03-13

Abandonment History

Abandonment Date Reason Reinstatement Date
2005-03-14 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Registration of a document - section 124 $100.00 2001-03-12
Application Fee $300.00 2001-03-12
Maintenance Fee - Application - New Act 2 2003-03-12 $100.00 2003-01-03
Request for Examination $400.00 2003-08-12
Maintenance Fee - Application - New Act 3 2004-03-12 $100.00 2003-12-22
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
INTERNATIONAL BUSINESS MACHINES CORPORATION
Past Owners on Record
BOLLE, RUDOLF M.
DORAI, CHITRA
RATHA, NALINI K.
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Representative Drawing 2001-09-13 1 6
Abstract 2001-03-12 1 35
Description 2001-03-12 19 991
Claims 2001-03-12 3 89
Drawings 2001-03-12 12 167
Cover Page 2001-09-20 1 47
Assignment 2001-03-12 7 299
Prosecution-Amendment 2003-08-12 1 39