Patent 2264029 Summary


(12) Patent Application: (11) CA 2264029
(54) English Title: APPARATUS FOR THE IRIS ACQUIRING IMAGES
(54) French Title: APPAREIL DE PRISE D'IMAGES D'IRIS
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • A61B 3/14 (2006.01)
  • A61B 5/1171 (2016.01)
  • G06K 9/20 (2006.01)
(72) Inventors :
  • NEGIN, MICHAEL (United States of America)
  • CHMIELEWSKI, THOMAS A., JR. (United States of America)
  • SAINSBURY, ROBIN (United States of America)
  • SALGANICOFF, MARCOS (United States of America)
  • HANNA, KEITH JAMES (United States of America)
  • MANDELBAUM, ROBERT (United States of America)
  • MISHRA, DEEPAM (United States of America)
(73) Owners :
  • SENSAR, INC. (United States of America)
  • SARNOFF CORPORATION (Not Available)
(71) Applicants :
  • SENSAR, INC. (United States of America)
  • THE SARNOFF CORPORATION (United States of America)
(74) Agent: DIMOCK STRATTON LLP
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 1997-08-22
(87) Open to Public Inspection: 1998-03-05
Examination requested: 1999-02-24
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US1997/014873
(87) International Publication Number: WO1998/008439
(85) National Entry: 1999-02-24

(30) Application Priority Data:
Application No. Country/Territory Date
08/702,923 United States of America 1996-08-26

Abstracts

English Abstract




At least one wide field of view camera (3) and an associated illuminator (6, 10) obtain sufficient images of a person to be identified so that x, y, z coordinates can be established for the expected position of that person's eye. The coordinates are used to direct a narrow field of view camera (16) and associated illuminators (21-23) to take an image of the eye (76, 78) that can be used to identify the person using iris verification and recognition algorithms. These illuminators are positioned and illuminated to eliminate or minimize specularities and reflections that obscure the iris (76).


French Abstract

Au moins une caméra (3) à large champ de vision et un illuminateur associé (6, 10) permettent d'obtenir des images d'une personne d'une qualité suffisamment bonne pour pouvoir identifier cette dernière. Des coordonnées X, Y et Z peuvent alors être établies de manière à correspondre à la position attendue de l'oeil de ladite personne. Ces coordonnées sont utilisées pour diriger une caméra (16) à champ de vision étroit et les illuminateurs (21-23) qui y sont associés de façon à prendre une image de l'oeil (76, 78) qui puisse être utilisée pour identifier ladite personne au moyen d'algorithmes de vérification et de reconnaissance de l'iris. Ces illuminateurs sont placés et illuminés de manière à éliminer, ou au moins réduire au minimum, les réflexions, spéculaires ou autres, qui obscurcissent l'iris (76).

Claims

Note: Claims are shown in the official language in which they were submitted.





We Claim:
1. An apparatus for acquiring images of irises comprising:
at least one camera positioned to take an image of an eye so that the
image will contain a representation of the iris which is of sufficient resolution to be
used for iris verification and identification; and
at least one illuminator positioned to illuminate the iris and comprised
of a plurality of light emitting elements which can be selectively illuminated and
concurrently illuminated.

2. The apparatus of claim 1 also comprising a second illuminator
positioned to illuminate the iris, the second illuminator being positioned apart from the
first illuminator.

3. The apparatus of claim 2 wherein the second illuminator is comprised
of a plurality of light emitting elements which can be selectively illuminated.

4. The apparatus of claim 1 wherein light is reflected from the iris to the
at least one camera along a camera axis and the light travels from the at least one
illuminator along a path which intersects the camera axis.




5. The apparatus of claim 1 wherein the at least one illuminator is
comprised of at least one array of light emitting diodes mounted on a base.

6. The apparatus of claim 5 wherein the array of light emitting diodes is
comprised of:
a. a first set of light emitting diodes which is attached to the base in a
manner to emit light along a first path; and
b. a second set of light emitting diodes which is attached to the base in a
manner to emit light along a second path which is not parallel to the first path.

7. The apparatus of claim 6 wherein the first set of light emitting diodes
and the second set of light emitting diodes can be separately illuminated.

8. The apparatus of claim 5 also comprising an array bracket to which
the base of the at least one array of light emitting diodes is pivotally attached for
rotation about an array axis.

9. The apparatus of claim 8 also comprising a second bracket to which
the array bracket is movably attached in a manner to permit movement of the array
along a line normal to the array axis.




10. The apparatus of claim 5 wherein at least some of the light emitting
diodes emit light of a wavelength which is different from light wavelengths emitted
from other light emitting diodes.

11. The apparatus of claim 1 wherein the at least one illuminator can
emit at least one of infrared light, visible light, near infrared light, a select band of light
frequencies, and both visible light and infrared light.

12. The apparatus of claim 1 wherein the at least one illuminator can emit light
of varying intensity.

13. The apparatus of claim 12 also comprising a power source
connected to the at least one illuminator which can emit light of varying intensity and a
controller connected to the power source for changing power output to the at least one
illuminator thereby changing intensity of the light which is emitted by that illuminator.

14. The apparatus of claim 1 wherein at least one of the illuminators
can emit different wavelengths of light.

15. The apparatus of claim 2 also comprising a controller connected to
the at least one illuminator and the second illuminator, the controller containing a
program for selectively illuminating the illuminators according to a dynamically
predetermined pattern.

16. The apparatus of claim 1 also comprising a wide field of view
camera positioned to take an image which includes the eye.

17. The apparatus of claim 16 also comprising a wide field of view
illuminator.

18. The apparatus of claim 17 wherein the wide field of view
illuminator is comprised of a ring of light emitting elements arranged around the wide
field of view camera.

19. The apparatus of claim 17 also comprising a hood surrounding the
wide field of view camera to prevent light from the illuminators from directly entering
the camera.

20. The apparatus of claim 1 also comprising a power source connected
to the at least one of the illuminators and a timer connected to the power supply which
timer turns off the power source after a selected period of time.




21. The apparatus of claim 1 also comprising an ambient light sensor
and a controller connected to the ambient light sensor and the at least one illuminator,
the controller containing a program for shutting off the illuminators when the ambient
light is at a predetermined level.

22. The apparatus of claim 1 also comprising a motor connected to the
at least one illuminator.

23. The apparatus of claim 1 also comprising an optical system which
receives light reflected from the iris and directs the light to the at least one camera.

24. The apparatus of claim 1 wherein the at least one camera is movable.

25. A method for acquiring an image of an iris comprising the steps of:
locating a three dimensional coordinate position of an eye containing the
iris to be imaged;
positioning a camera so that at least some light reflected from the iris
will be reflected to the camera;
illuminating the eye with at least one illuminator so that light is reflected
from the iris to the camera along a camera axis and the light travels from the at least
one illuminator along at least one path which intersects the camera axis wherein the at
least one illuminator is comprised of a plurality of light emitting elements which can be
selectively illuminated; and
creating at least one image of the iris with the camera during
illumination which image is of sufficient resolution to be used for iris verification and
identification.

26. The method of claim 25 wherein the at least one illuminator is
comprised of a first illuminator and a second illuminator which are illuminated
sequentially and the camera creates a first image and a second image during the
sequential illumination which images are used to create the image of sufficient
resolution to distinguish among identifying features within the iris.
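Claim 26 above describes creating a first and a second image under sequential illumination from spaced-apart illuminators and combining them into one usable image. One plausible way to combine such a pair — not specified by the patent — is a per-pixel minimum, since a specular glint saturates a region in one frame but not in the frame lit from the other direction. A minimal Python sketch, with frames as nested lists of 0-255 gray values:

```python
def merge_frames(frame_a, frame_b):
    """Combine two frames of equal size, each lit by a different
    illuminator. A specular highlight appears bright in one frame
    but not the other, so the per-pixel minimum keeps iris texture
    while discarding glints. This combination rule is an assumption
    for illustration, not the patent's disclosed algorithm."""
    return [
        [min(a, b) for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]
```

Pixels that agree between the frames pass through unchanged; a saturated glint in either frame is replaced by the darker, glint-free value from the other.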

27. The method of claim 25 wherein the at least one illuminator is
comprised of two sets each set containing a plurality of light emitting elements which
sets are selectively and sequentially illuminated.

28. The method of claim 25 wherein the at least one illuminator can emit at
least one of infrared light, visible light, near infrared light, a select band of light
frequencies, and both visible light and infrared light.




29. The method of claim 25 wherein the image is comprised of a
plurality of pixels and that portion of the image which contains the iris is comprised of
at least 200 pixels.

30. The method of claim 25 wherein the three dimensional coordinate
position of the eye is located by:
a. using a first wide field of view camera to create a first image
of a region in which the eye is believed to be located;
b. using a second wide field of view camera spaced apart from
the first wide field of view camera to create a second image of a region in which
the eye is believed to be located; and
c. combining the first image and the second image in a manner
to establish the three dimensional coordinate position of the eye.

31. The method of claim 30 wherein the images are combined using
stereographic image analysis.

32. The method of claim 30 also comprising the step of illuminating the
region using nearly on axis illumination.

33. The method of claim 25 wherein the camera is mounted within an
optical subsystem containing a pan/tilt mirror from which reflected light is directed
to the camera and also comprising the step of adjusting the pan/tilt mirror toward
the three dimensional coordinate position of the eye to direct light reflected from
the iris to the camera.

34. The method of claim 25 wherein the camera is gimbal mounted and
also comprising the step of adjusting the camera toward the three dimensional
coordinate position of the eye to direct light reflected from the iris to the camera.


Description

Note: Descriptions are shown in the official language in which they were submitted.

WO 98/08439    CA 02264029 1999-02-24    PCT/US97/14873

TITLE
APPARATUS FOR THE IRIS ACQUIRING IMAGES

BACKGROUND OF THE INVENTION

1. Field of the Invention

The invention relates to a method and apparatus for illuminating the eye to obtain an image of the iris.

2. Background of the Invention

There are several methods known as biometrics for recognizing or identifying an individual. These methods include analyzing a signature, obtaining and analyzing an image of a fingerprint and imaging and analyzing the retinal vascular patterns of a human eye. Recently the art has used the iris of the eye, which contains a highly detailed pattern that is unique for each individual and stable over many years, as a non-contact, non-obtrusive biometric. This technique is described in United States Patent No. 4,641,349 to Flom et al. and United States Patent No. 5,291,560 to Daugman. The systems described in these references require the person being identified to hold at least one of their eyes in a fixed position with respect to an imaging camera which takes a picture of the iris. While this procedure is satisfactory for some applications, it is not satisfactory for quick transactional activities such as using an automated teller machine, unobtrusive access control or automated dispensing. Other examples are immigration control, point of sale verification, welfare check dispensing, internet banking, bank loan or account opening and other financial transactions.

The iris identification techniques disclosed by Flom and Daugman require a clear, well-focused image of the iris portion of the eye. Once that image is obtained, a comparison of that image with a coded file image of the iris of the person to be identified can be accomplished quite rapidly.
However, prior to the present invention there has not been an optical system which could rapidly acquire a sufficiently clear image of an iris of the person to be identified unless that person positioned his eye in a fixed position relatively close to an imaging camera. There is a need for a system which will rapidly obtain a clear picture of the iris of a person or animal remotely from the optical system and in an uncertain position. This system would be particularly useful to identify users of automated teller machines as well as individuals seeking access to a restricted area or facility, or in other applications requiring user identification. The system could also be used to identify patients, criminal suspects and others who are unable or unwilling to be otherwise identified.

Automated teller machines, often called ATMs, are widely used for banking transactions. Users are accustomed to receiving relatively fast verification of their identity after inserting their identification card and entering an identification number. However, anyone who knows the identification number associated with a given card can use that card. Should a robber learn the identification number by watching the owner use the card, finding the number written on the card or otherwise, he can easily draw funds from the owner's account. Consequently, banks have been searching for other more reliable ways of verifying the identity of ATM users.

Since the iris identification methods disclosed by Flom et al. have proved to be very reliable, the use of iris identification to verify the identity of ATM users and other remote user recognition or verification applications has been proposed. However, for such use to be commercially available, there must be a rapid, reliable and unobtrusive way to obtain iris images of sufficient resolution to permit verification and recognition from an ATM user standing in front of the teller machine.
To require the user to position his head a predetermined distance from the camera, such as by using an eyepiece or other fixture, or without fixturing, is impractical. Thus, there is a need for a system which rapidly locates the iris of an ATM user and obtains a quality image of the iris that can be used for verification and identification. This system should be suitable for use in combination with an access card or without such a card. The system should also be able to obtain such an image from users who are wearing eyeglasses or contact lenses or ski masks or other occluding apparel.

SUMMARY OF THE INVENTION

We provide a method and apparatus which can obtain a clear image of an iris of a person to be identified whose head is located in front of the portion of our optical system which receives light reflected from the iris. The system includes at least one camera with or without ambient illumination and preferably at least one or more illuminators. We also prefer to provide a pan/tilt mirror, or gimbal device, and at least one lens. Light reflected from the subject is captured by the gimbaled camera or mirror and directed through the lens to the camera. In a preferred embodiment, a narrow field of view (NFOV) camera receives the light reflected from the pan/tilt mirror through the lens, or directly via a gimbal mounted camera. A second camera and preferably a third camera are provided to obtain a wide field of view (WFOV) image of the subject. In some cases, the WFOV cameras may be superfluous, if the user is always known to be in the field of view of the NFOV camera or could be located by moving the NFOV camera. Images from these WFOV cameras are processed to determine the coordinates of the specific location of interest, such as the head and shoulders and the iris of a person to be identified.
Based upon an analysis of those images the pan/tilt mirror or gimbal is adjusted to receive light reflected from the iris or other area of interest and direct that reflected light to a narrow field of view camera. That camera produces an image of sufficient quality to permit iris identification.

The preferred embodiment contains a wide field of view illuminator which illuminates the face of the person to be identified. The illuminator preferably contains a plurality of infrared light emitting diodes positioned around the lens of the wide field of view camera or cameras.

We also prefer to provide two or more narrow field of view illuminators, each comprised of an array of light emitting diodes. These arrays are mounted so as to be rotatable about both a horizontal axis and a vertical axis. By using at least two arrays we are able to compensate for specular reflection and reflection from eyeglasses or contact lenses or other artifacts which obscure portions of the iris.

We further prefer to construct the arrays so that one set of light emitting diodes have centerlines normal to the base, a second set of light emitting diodes have centerlines at an acute angle to the base, and a third set of light emitting diodes have centerlines at an obtuse angle relative to the base. This provides the array a wider field of illumination. We further prefer to provide a control system which enables us to separately illuminate each group of light emitting diodes. The control system may permit selective activation of individual diodes. An alternative is to provide a single illumination that is directed in a coordinated manner with the image steering device.

An image processor is provided to analyze the images from the wide field of view camera and thereby specify the location of a point or area of interest on the object or person being identified. A preferred technique for identifying the position of the user is stereographic image analysis.
Alternatively, visible or non-visible range imaging or distance finding devices such as ultrasonic, radar, spread spectrum microwave or thermal imaging or sensing, or other optical means could be used.

The present system is particularly useful for verifying the identity of users of automated teller machines. The system can be readily combined with most conventional automated teller machines and many other financial transaction machines. Image acquisition and identification can generally be accomplished in less than five seconds, and in less than two seconds in many cases.

Other configurations of cameras such as one NFOV camera, one WFOV and one NFOV camera, two NFOV cameras, multiple NFOV cameras and multiple WFOV cameras can be utilized for other special purpose applications such as more or less restricted movement or position scenarios. For example, iris imaging in a telephone booth or for telephone hands free use, multiple iris imaging in a crowd of people, iris imaging of people in a moving or stationary vehicle, iris imaging of a race horse, or a point of sale site use.

Other objects and advantages will become apparent from a description of certain present preferred embodiments shown in the drawings.

BRIEF DESCRIPTION OF THE FIGURES

Figure 1 is a front view of a present preferred embodiment of our device for obtaining images of irises.

Figure 2 is a side view of the embodiment of Figure 1.

Figure 3 is a top plan view of a first present preferred embodiment of our narrow field of view illuminator.

Figure 4 is a side view of the illuminator of Figure 3 with the area of illumination shown in chainline.

Figure 5 is a side view similar to Figure 4 of a second present preferred embodiment of our narrow field of view illuminator.

Figure 6 is a top view of a first present preferred illuminator bracket.

Figure 7 is a side view showing the illuminator bracket of Figure 6 with the position of the illuminator shown in chainline.

Figure 8 is a top view of a second present preferred illuminator bracket.

Figure 9 is a side view showing the illuminator bracket of Figure 8 with the position of the illuminator shown in chainline.

Figure 10 is a block diagram showing a preferred control architecture for the embodiment of Figure 1.

Figure 11 is a front view showing the eyes and glasses of the person to be identified on which a reflection of the wide field of view illuminator appears.

Figure 12 is a front view showing the eyes and glasses of the person to be identified on which a reflection from the narrow field of view illuminators appears.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring to Figures 1 and 2, a present preferred embodiment of our device is contained in housing 1 which is 14.5 inches wide, 15 inches high, and 18 inches deep. A housing of this size can be easily placed within or near an automated teller machine or other limited access machine or entry place. When incorporated within or placed near the housing of an automated teller machine our unit is located behind a light transmissive bezel. Typically, the bezel would be smoked or other visually opaque glass or a comparable plastic which obscures our device from being easily seen. Our device would be positioned so as to be about eye level of most users. In Figure 2 we show the head of a person to be identified in front of our device.

In the preferred embodiment of our device shown in Figures 1 and 2, we provide two wide field of view (WFOV) cameras 3 each having a lens 2 and 4. The orientation and placement of lens 2 and 4 and other components may be changed to accommodate available space or other applications. A wide field of view illuminator 6 surrounds the lens. Hoods 5 and 7 are provided around the lens 2 and 4 to prevent light emitted from the illuminator 6 from passing directly into the camera. The wide field of view illuminator is comprised of sets of light emitting diodes 10.
For ease of construction, these sets of diodes may be mounted on small circuit boards 12. These boards are then mounted on a housing 13 which surrounds the lens 2 and 4. A sufficient number of LED containing circuit boards 12 are provided and positioned to illuminate the head of the person to be identified who is standing in front of our device as shown in Figure 2. Consequently, we prefer that the wide field of view illuminator provide a field of illumination which encompasses a region of about two feet in diameter at a distance of about one foot from the illuminator. This field of illumination is indicated by the solid lines extending from the wide field of view illuminator 6 in Figure 2.

Portions of the wide field of view illuminator are positioned around the WFOV camera to provide nearly on-axis illumination. On axis illumination, nearly on axis illumination, and oblique illumination allow surface features to be imaged with minimal shadows which can generate false edges or other artifacts. Such illumination of the eye produces a shadow free image with good surface features. This type of lighting may cause the pupil to be bright, making the iris easier to locate, although this feature can be disabled if desired. Any shadows produced by other light sources and camera angles are minimized or washed out. Illumination control of this type can be used to stimulate parasympathetic autonomic nervous system reflexes that cause eye blinks or pupil variation or other reactions. These changes may be useful to determine subject awareness, to establish certain life signs, and to reduce pupil size for improving imaging resolution.

Light from the wide field of view illuminator is reflected from the user's face into the lens of the wide field of view camera lens 2 and 4. This enables the WFOV cameras to create images from which an x, y, z coordinate location can be determined for one of the user's eyes.
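The x, y, z determination just described — comparing a feature seen by the two WFOV cameras — can be illustrated with a small triangulation sketch. This is not the patent's implementation; it assumes, purely for illustration, a rectified stereo pair with a known baseline and focal length, and that the same eye feature has been matched in both images:

```python
def eye_xyz(u_left, v_left, u_right, baseline_m, focal_px, cx, cy):
    """Triangulate a 3D point from a rectified stereo pair.

    (u_left, v_left) and u_right are pixel coordinates of the same
    eye feature in the left and right WFOV images; baseline_m is the
    camera separation in meters; focal_px, cx, cy are camera
    intrinsics. All names and values here are illustrative
    assumptions, not taken from the patent.
    """
    disparity = u_left - u_right           # pixels; larger = closer
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    z = focal_px * baseline_m / disparity  # depth along the optical axis
    x = (u_left - cx) * z / focal_px       # horizontal offset from center
    y = (v_left - cy) * z / focal_px       # vertical offset from center
    return x, y, z
```

With an assumed 10 cm baseline, an 800-pixel focal length, and a 40-pixel disparity, the eye would be located about 2 m from the cameras.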
The WFOV cameras may also be used to provide security video images for monitoring transactions or other activities. We take an image of the right eye. However, either the left eye or the right eye or both eyes can be selected. The WFOV cameras could utilize a number of techniques such as stereo, stereo with structured light, and depth from focus with or without structured light to determine both the x-y location and distance to the object of interest. We prefer to use stereo processing techniques which compare at least a portion of the images from the two wide field of view cameras to determine the x, y, z coordinate locations. We can also provide a gaze director 9 which assists in identifying the location and motion of the eye by attracting the attention of the user.

After we know the position of the selected eye we must illuminate the iris portion of the eye and obtain an iris image which can be used for iris verification and recognition. We prefer to be able to obtain an image having approximately 200 pixels across the iris portion of the image so that the image recognition algorithms used to perform identification and verification can operate reliably.

The iris recognition algorithms compare features of the iris in the image with the same features in a file image. Verification is considered to have been made when there is a match of a predetermined number of the compared features. For some situations a match of at least 75% of the compared features may be required. It is therefore important that there be no specularities, spurious light reflections or dark shadows covering a significant portion of the image. When the user is wearing glasses, this can easily occur. To overcome this problem we provide at least two spaced apart illuminators for use in conjunction with the NFOV camera. These illuminators in conjunction with the WFOV camera may be selectively switched to an orchestrated combination to enhance these image structures.
For example, a structured specularity can be created on eyeglasses to make eye finding easier, and then disabled for iris image acquisition.

In the embodiment shown in Figure 1 a single NFOV camera 16 is positioned behind mirror 18. Light emitted from one or more of the NFOV illuminators 21, 22 or 23 is reflected from the selected eye to an optical subsystem 30 which directs the reflected light to the NFOV camera. We can provide a sensor 14 which senses the level of ambient light surrounding the person to be identified. Information from this sensor can be used to determine what, if any, illumination must be provided by the illuminators 21, 22 and 23. Alternatively, any one or a plurality of the cameras themselves may be used for such light sensing.

The optical subsystem 30 in the embodiment shown in Figure 1 contains a pan/tilt mirror attached to rod 33 which extends from motor 34. This enables the pan/tilt mirror to be rotated about a tilt axis corresponding to a centerline through rod 33. Motor 34 is mounted on arm 35 which is pivotably attached to base 36 by rod 37. This arm 35 can be moved around a pan axis corresponding to a centerline through rod 37. Light emitted from any of the NFOV illuminators 21, 22, or 23 is reflected from the subject iris to the pan/tilt mirror 32. That mirror is positioned to direct the reflected light to mirror 18 from which the light is reflected to NFOV camera 16. The lens of the NFOV camera can be moved to change the focus or zoom, and the aperture of the camera is adjustable. One can also mount NFOV camera 16 on a movable platform so that the NFOV camera can be turned toward the eye. Then, the portions of the optical subsection shown in Figure 1 may not be needed. Our preferred optical subsystem has five degrees of freedom: the pan axis, the tilt axis, the focus axis, the aperture axis and the zoom axis. A system with fewer degrees of freedom could also be used.
The pan and tilt axes are used to position the pan/tilt mirror 32 so that the correct narrow field is imaged onto the sensing array of NFOV camera 16. The ability to control movements along the focus axis, aperture axis and zoom axis allows us to be certain that the imaged object is in focus. In some cases, only the NFOV camera is needed and the WFOV camera may optionally not be used.

The design of the optics resolution, magnification, focusing and size of the imager dictates the distance between the camera and lens and the distance between the lens and object to be imaged. The size of the imager is of paramount importance in defining the distance from lens to imager and contributes to the depth of focus. Those versed in the art will recognize that NFOV camera 16 may be solid state or of vidicon nature and that the sensing array size can vary from generic industry sizes of 1/4, 1/3, 1/2, 2/3 or 1 inch diagonal measurement. In an optical system, the introduction of a mirror in an optical path allows the path to be redirected without affecting the optical path length. The use of these mirrors allows the optical path to be folded back on itself, thus reducing the overall required physical length needed to implement the optical design. Those skilled in the art will recognize that a gimbaled camera can also be used to perform image steering.

In the embodiment of Figure 1 the illuminators are positioned to provide a field of illumination illustrated by the dotted lines in Figure 2. These fields must be directed and sized so that light will be reflected from the eye of the user to the pan/tilt mirror 32. To achieve this result we provide three illuminators 21, 22 and 23 placed at different locations on the housing 1. These illuminators are oriented to direct light to the areas where the user's eye is most likely to be, based upon information received from the WFOV cameras or other position detectors that could be used.
The illuminators may be mounted in a permanent location and orientation or placed on manually adjustable or motorized brackets such as those shown in Figures 6, 7, 8 and 9. The illuminators could also be attached to a sliding mechanism for translation along an axis.

The light source preferably is a light emitting diode or other device that emits infrared or near infrared light or a combination of these. A lens and diffuser (not shown) can be used to guarantee uniform illumination. We have found infrared light to be particularly useful because it penetrates eyeglasses and sunglasses more easily than visible light or colored light within the visible spectrum. Infrared light is also invisible to the user and extremely unobtrusive. Optical filters may be placed in the light path in front of the camera to reduce any undesirable ambient light wavelengths that corrupt the desired images. Different wavelength priority filters may be used to permit different wavelengths to be used to optimize each camera's performance. For example, longer wave IR could be used for WFOV camera imaging and shorter IR could be used for NFOV camera imaging. Then the NFOV cameras would not respond to WFOV illumination. This "speed of light" processing can be used to great advantage. If desired, the LED light source could be strobed. Strobing provides the capability to freeze motion. Strobing also provides the capability to overwhelm ambient light by using a high intensity source for a brief period of time, and exposing the camera accordingly to wash out the background ambient illumination which would otherwise cause interference.
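The ambient-washout effect of strobing described above can be quantified with a back-of-the-envelope ratio: a strobe delivers its energy in a short pulse, while ambient light accumulates over the whole exposure. The quantities and numbers below are illustrative assumptions, not figures from the patent:

```python
def strobe_advantage(strobe_irradiance, ambient_irradiance,
                     exposure_s, strobe_s):
    """Ratio of strobe-delivered to ambient-delivered energy at the
    sensor over one exposure. A ratio well above 1 means the strobe
    washes out the ambient background. All inputs are hypothetical
    illustrative values, in consistent units."""
    strobe_energy = strobe_irradiance * min(strobe_s, exposure_s)
    ambient_energy = ambient_irradiance * exposure_s
    return strobe_energy / ambient_energy
```

For example, a source 100 times brighter than ambient, pulsed for 1 ms within a 10 ms exposure, still delivers 10 times the ambient energy to the sensor.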
Strobing NFOV and WFOV illuminators, perhaps at different times, and allowing the cameras to integrate photons over appropriate, possibly disparate, time slices permits optimum usage of the specific spatial and time characteristics of each device.

As shown in Figures 1, 3, 4 and 5, the NFOV illuminators are preferably comprised of a 6 x 6 array of light emitting diodes 20 mounted on a circuit board 24. If the light emitting diodes are mounted normal to the circuit board the illuminator will illuminate an area of illumination having some diameter b as indicated in Figure 4. We have discovered that the area of illumination can be increased using the exact same components as are used for the illuminator of Figure 4 by repositioning or otherwise reconfiguring the light emitting diodes. This is shown in the embodiment of Figure 5 by larger diameter a. In that illuminator the top two rows of diodes are positioned at an acute angle relative to the board 24 and the lower two rows of diodes are mounted at an obtuse angle. Circular ring illuminators or other shapes may also be used which could be placed around the NFOV lens. Other optical elements such as polarizers or birefringent analysis devices may be used to minimize artifacts or enhance other biologically relevant characteristics such as corneal curvature, eye separation, iris diameter, skin reflectance and scleral vasculature.

In Figures 6 and 7 we show a bracket which we have used to attach the NFOV illuminators 21, 22, and 23 to the housing 1. That bracket 30 has a U-shaped base 31 having a hole 32 through which a screw attaches the bracket to the housing. A first pair of gripper arms 33 and 34 with attached pin 42 are pivotably attached to one upright of the base. A similar second pair of gripper arms 35 and 36 are pivotably connected to the opposite upright through collar 37. The illuminator indicated in chain line in Figure 7 is held between the gripper arms by screws or pins 38.
A series of holes (not visible) are provided along the uprights so that the gripper arms can be positioned at any selected one of several positions. A locking tab 39 extends from the collar 37 into an adjacent slot in the upright to prevent rotation of the collar. Set screw 38 is tightened against the pin 42 extending from gripper arms 35 and 36 to prevent rotation of the gripper arms.

A second bracket 50 which we have used to attach the NFOV illuminators 21, 22, and 23 to the housing 1 is shown in Figures 8 and 9. That bracket 50 has a base 51 which attaches to the housing. A rod 52 extends upward from the base 51. Collar 53 slides along rod 52 and can be held at any desired location on the rod by set screw 54. Rod 55 extends from collar 53 and holds carrier 56. This carrier is slidably attached to the rod 55 in the same manner as collar 53. The illuminator indicated in chain line in Figure 9 is attached to the carrier 56 by fasteners or snap fit on pins 57 extending from the carrier. Both this bracket 50 and the other illustrated bracket 30 permit the attached illuminator to be repositioned or adjusted along a pan axis and a tilt axis.

We prefer to connect each array of light emitting diodes through a distribution board 60 to an illumination controller 62 as shown in Figure 10. Since there can be one or more illuminator arrays these arrays are designated as ILLUMINATOR 1 through ILLUMINATOR X in the drawing. The distribution board 60 and illumination controller 62 enable us to selectively light the WFOV illuminator 6 and the NFOV illuminators 21, 22 and 23 in the embodiment of Figure 1. Furthermore, we can selectively illuminate sets of light emitting diodes within each array or selectively light individual diodes.

Each set of light emitting diodes may emit a different wavelength of light. It has been noted that the irises of different people respond better to certain wavelengths and worse to other wavelengths of light.
This could be accomplished using illuminators with different wavelength LEDs or populating a single illuminator with different wavelength LEDs next to each other. These could be strobed and the better image selected. Additionally, LEDs of different beam widths could be mounted side by side or in different illuminators for illumination intensity control or for specularity control - the tighter the beamwidth, the smaller the size of the specularity.

We can also control the duration and the intensity of the light which is emitted. To prevent burnout of the illuminators caused by prolonged illumination we can provide timers 63 for each illuminator as indicated by the dotted blocks labeled "T" in Figure 10. The timers will cut the power to the array after a predetermined period of illumination.

The WFOV cameras 3 provide images to an image processor 64 which we call the PV-I. That processor 64 tells the computer 65 the x, y, z coordinates of the selected eye or eyes of the person to be identified. The image processor may also assess the quality of the image and contain algorithms that compensate for motion of the subject. This processor may also perform image enhancement. The PC 65 has access to information from the ambient light level detector 69 and the NFOV camera. These data can be used to modify illumination strategies. The x, y, z coordinates for the expected position of the eye enable the computer to direct the illumination controller as to which illuminators should be lighted and to direct the pan/tilt controller 66 to properly position the pan/tilt unit 67 so that a useful image of the iris can be obtained. These functions may be selected by results from the WFOV camera processing. Commands are sent from the computer 65 to the motors to change the location of the pan/tilt axes or to adjust focus.
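The conversion from an expected eye position to pan/tilt commands can be sketched with simple geometry. The sketch below is an assumption-laden simplification: it places the pan/tilt pivot at the coordinate origin and ignores the folded optical path and any calibration offsets, all of which a real controller would account for.

```python
# Simplified geometric sketch: turn an expected eye position (x, y, z) in
# the WFOV frame into pan and tilt angles aiming the optical axis at it.
# Assumes the pivot sits at the origin; offsets and the mirror fold are
# deliberately omitted for clarity.

import math

def pan_tilt_from_xyz(x, y, z):
    """Return (pan, tilt) in degrees for a point at (x, y, z)."""
    pan = math.degrees(math.atan2(x, z))                  # left/right angle
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))  # up/down angle
    return pan, tilt

# An eye 0.3 m up and 0.3 m out lies 45 degrees above the horizontal axis:
pan, tilt = pan_tilt_from_xyz(x=0.0, y=0.3, z=0.3)
print(round(pan, 1), round(tilt, 1))  # prints: 0.0 45.0
```

In the patent's architecture these angles would be handed to the pan/tilt controller 66 as set points rather than applied directly to the motors.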
In the simplest case, one may consider that a WFOV image is acquired, the data is processed and then passed through the image processor 64 and computer 65 to the pan/tilt controller 66 or a gimbaled controller. In order to minimize motion time and control settling time, there can be simultaneous motion in the optical subsystem along all five axes.

The pan/tilt controller 66 accepts macro level commands from the computer and generates the proper set points and/or commands for use by the illumination control or each axis supervisor. The intermediate continuous path set points for the axis are generated here and then sent to each axis supervisory controller. A command interpreter decodes the commands from the image analysis and formats responses using positioning information from the optical devices. A real time interrupt produces a known clock signal every n milliseconds. This signal is a requirement for the implementation of a sampled data system for the position controller of each axis and allows synchronization via the supervisory controller for continuous path motion. A diagnostic subsystem performs health checks for the control system.

Besides the choreography of the five axes, the microprocessor controller must also provide illumination control. The illumination controller will accept commands similar to the commands associated with motion control to timely activate, or to synchronously activate with the camera frame taking, selected illuminators.

Images from the WFOV are transmitted as analog signals to the image processor 64. The image processor preferably contains two pyramid processors, a memory capable of storing at least two frames, one LUT, ALU device, a digitizer which digitizes the analog video signal, a Texas Instruments TMS 320 C-31 or C-32 processor and a serial/parallel processor. The image is processed using the pyramid processors as described in United States Patent No. 5,359,574 to van der Wal.
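The stereo computation this hardware supports can be illustrated with the standard pinhole model. This is a hedged sketch: the pinhole back-projection below is the textbook relation z = f·B/d, not the patent's specific algorithm, and every number in it is an invented example.

```python
# Illustrative sketch of recovering range from stereo disparity, the step
# that yields x, y, z coordinates from the two WFOV cameras. Uses the
# standard pinhole model z = f * B / d; all parameter values are invented.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo range: focal length in pixels, baseline in meters,
    disparity in pixels; returns range in meters."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_px * baseline_m / disparity_px

def xyz_from_pixel(u, v, cx, cy, focal_px, baseline_m, disparity_px):
    """Back-project pixel (u, v) to camera-frame (x, y, z) coordinates,
    given the principal point (cx, cy)."""
    z = depth_from_disparity(focal_px, baseline_m, disparity_px)
    x = (u - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return x, y, z

# A 600 px focal length, 0.1 m baseline and 100 px disparity put the
# subject roughly 0.6 m from the cameras:
print(round(depth_from_disparity(600.0, 0.1, 100.0), 3))  # prints 0.6
```

The resulting (x, y, z) is exactly the quantity the text says is handed onward to direct the NFOV optical system.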
The Texas Instruments processor computes disparities between images. The WFOV images define a region or point in the field of view of the WFOV cameras where the subject's right eye or left eye or both are located. Using stereo processing techniques on the disparities will result in x, y, z coordinates for points on the subject relative to the WFOV cameras. That information is then further processed to define an area of interest such as the head or an eye. The coordinates of the area of interest are used to direct the NFOV optical system. These position coordinates are transferred from the image processor to a NFOV image and iris image processor 65. This unit 65 contains a 486, PENTIUM or other microprocessor system and associated memory. In the memory are programs and algorithms for directing the optical platform and doing iris identification. Additionally, WFOV video images can be stored as a security video record.

The focus axes of the NFOV system may be controlled in an open loop fashion. In this case, the x,y,z coordinate from stereo processing defines via table lookup or analytic computation the focus axis position so that the lens properly focuses the NFOV camera on the object of interest. A closed loop focus method could also be used. In this case, NFOV video would be processed by image processor 64 to obtain a figure of merit defining if the axis was in focus. From the figure of merit the axis could be commanded forward or backward and then a new image acquired. The process would continue in a closed loop form until the image is in focus. Other information such as iris size and location as measured in camera units, and eye separation from WFOV camera images can be combined with stereo and other focus information into multivariate features that can be used to refine range information by fusion of direct or derived sensory information.
This sensor fusion may encompass other information as well.

Since the object of interest, namely the eye, may be moving, there is a requirement that the NFOV camera track the trajectory seen by the WFOV. When motion ceases to blur the image, a quality image may be acquired via the NFOV camera and optics. By tracking the eye, the optics directing light to the NFOV camera are aligned so that when it is desired to obtain an iris quality image little or no additional motion may be required.

In this case, the x,y,z coordinates from analysis of the WFOV images are sent to the NFOV controller at some uniform sample rate (such as every 100 ms). A continuous path algorithm such as described in Robotic Engineering: An Integrated Approach, by Klafter, Chmielewski and Negin (Prentice Hall, 1989) would be used to provide intermediate sets of {p,t,f,a,z} set points to the axis so that the axes remain in motion during the tracking phase. To define the last end position, either a macro level command can be given or the same {p,t,f,a,z} can be continually sent at the sample periods.

It is important to recognize that as the NFOV axes move, the associated imager may not have sufficient time to perform the required integration to get a non-blurred image. Additionally, depending on the camera used (interlaced or progressive scan) there may be field to field displacement or horizontal displacement of the image, all of which can be wholly or partially corrected by computation. Thus, it is easily seen why the WFOV camera provides the information necessary for directing the NFOV stage. It should be noted that certain eye tracking algorithms (such as those based on specularity or iris configuration or pattern matching) may be capable of providing sufficient information (even if the image is slightly blurred due to focus or exhibits some blur caused by motion) to provide a reasonable estimate of the eye location in the NFOV camera.
Hence, it is conceptually possible to use the WFOV data for coarse movement and the processed NFOV data (during motion) as additional information for finer resolution. This fusion of data can provide a better estimate than one WFOV camera image alone in positioning the NFOV image to acquire a quality iris image.

To acquire a quality iris image, the NFOV axes must settle to a point where the residual motion is less than that which can be detected by the imager. Once this occurs, any remaining images must be purged from the imager (typically there is a delay between an image integrated and the readout via RS170) and the proper integration time allowed to acquire a non-blurred image. See Robotic Engineering: An Integrated Approach for a timing scenario. This can be accomplished in a number of ways, the simplest being a time delay which occurs after the cessation of motion until a good quality RS170 image is captured. Multiple iris images which may be partially obscured may be collected and fused into a single composite, less obscured iris image using normalization and fusion methods.

We have found that any light source will cause a reflection on eyeglasses of the person to be identified. The eyes as seen from the WFOV cameras 3 are shown in Figure 11. There is a reflection 70 from the WFOV illuminator 6 on both lenses 72 of the person's eyeglasses 74. The reflection 70 partially covers the iris 76 of the person's eye making iris identification difficult if not impossible. To overcome this problem we use illuminators 21, 22 and 23 located off-axis from the optical axis of the NFOV camera 16. By carefully positioning and sometimes using only some of the light emitting diodes we can achieve adequate illumination without creating an obscuring reflection.
This result is shown in Figure 12 where light from only a few light emitting diodes in the NFOV illuminators 21, 22 and 23 has created a reflection 80 that appears in the image. That reflection does not cover any part of the iris 76. In some cases, especially with glasses, the WFOV specularity makes finding the head and the eye more expeditious.

Multiple illuminators also enable us to determine the shape of eyeglasses worn by the subject in the images. We illuminate the eyeglasses sequentially using two spaced apart illuminators. The specularity will be in one position during the first illumination and in a different position during the second illumination. The amount of specularity change is then calculated to determine appropriate eyeglass shape. From that information we can determine the minimum movement of illumination required to move the specularity off of the iris.

A calibration procedure must be used to correlate the center of the NFOV camera's field of view with pan/tilt and focus axis positions for a series of coordinates in 3 dimensional space as defined by the wide field of view. Given a set of WFOV coordinates {x,y,z} defining the position of a user's eye somewhere in the working volume in front of the cameras, a transformation or table lookup can be used to define the coordinates of the pan, tilt and focus {p,t,f} axes that make the center of the NFOV camera's field of view coincident with the x,y coordinates and in focus on the z plane. We prefer to use a series of targets to assist in calibration. These targets have partially filled circles corresponding to iris positions at known locations on a page. The targets are placed a known distance from the housing and the device is activated to attempt to find an iris and produce a calibration image.

When NFOV and WFOV cameras are used, they must be calibrated together.
This may be accomplished by manual or automatic procedures; optical targets or projected targets may be used. An automatic procedure would require a calibration object to be automatically recognized by the computational support equipment operating on the camera images. Another possibility is to use the NFOV camera motion capabilities or other motion generation capability to project a target into the calibration volume, and then recognize the target.

Although we have shown certain present preferred embodiments of our compact image steering and focusing device and methods of using that device, it should be distinctly understood that our invention is not limited thereto but may be variously embodied within the scope of the following claims.
Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Administrative Status

Title Date
Forecasted Issue Date Unavailable
(86) PCT Filing Date 1997-08-22
(87) PCT Publication Date 1998-03-05
(85) National Entry 1999-02-24
Examination Requested 1999-02-24
Dead Application 2003-12-12

Abandonment History

Abandonment Date Reason Reinstatement Date
2002-12-12 R30(2) - Failure to Respond
2003-08-22 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Request for Examination $400.00 1999-02-24
Application Fee $300.00 1999-02-24
Maintenance Fee - Application - New Act 2 1999-08-23 $100.00 1999-08-23
Registration of a document - section 124 $100.00 1999-09-20
Registration of a document - section 124 $100.00 1999-09-20
Maintenance Fee - Application - New Act 3 2000-08-22 $100.00 2000-08-21
Maintenance Fee - Application - New Act 4 2001-08-22 $100.00 2001-08-16
Maintenance Fee - Application - New Act 5 2002-08-22 $150.00 2002-08-07
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
SENSAR, INC.
SARNOFF CORPORATION
Past Owners on Record
CHMIELEWSKI, THOMAS A., JR.
DAVID SARNOFF RESEARCH CENTER, INC.
HANNA, KEITH JAMES
MANDELBAUM, ROBERT
MISHRA, DEEPAM
NEGIN, MICHAEL
SAINSBURY, ROBIN
SALGANICOFF, MARCOS
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents





List of published and non-published patent-specific documents on the CPD.



Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Representative Drawing 1999-05-07 1 5
Abstract 1999-02-24 1 68
Description 1999-02-24 22 871
Claims 1999-02-24 8 192
Drawings 1999-02-24 7 121
Claims 1999-02-25 8 208
Cover Page 1999-05-07 2 55
Cover Page 1999-10-29 3 121
Fees 2000-08-21 1 35
Correspondence 1999-04-13 1 31
Prosecution-Amendment 1999-02-24 11 310
Prosecution-Amendment 1999-02-24 1 19
PCT 1999-02-24 5 183
Assignment 1999-02-24 4 141
Correspondence 1999-09-20 3 94
Assignment 1999-09-20 17 679
Correspondence 1999-09-20 4 161
Prosecution-Amendment 1999-10-29 2 72
Assignment 1999-02-24 6 227
PCT 1999-02-24 15 725
Prosecution-Amendment 2002-08-12 2 71
Fees 2002-08-07 1 36
Fees 1999-08-23 1 38
Fees 2001-08-16 1 36