Patent 2140681 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 2140681
(54) English Title: WIDE AREA COVERAGE INFRARED SEARCH SYSTEM
(54) French Title: SYSTEME DE RECHERCHE INFRAROUGE A GRANDE COUVERTURE
Status: Deemed expired
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04N 5/45 (2011.01)
  • H04N 7/18 (2006.01)
  • H04N 5/45 (2006.01)
(72) Inventors :
  • CHEVRETTE, PAUL C. (Canada)
(73) Owners :
  • HER MAJESTY THE QUEEN, IN RIGHT OF CANADA, AS REPRESENTED BY THE MINISTER OF NATIONAL DEFENCE (Canada)
(71) Applicants :
  • HER MAJESTY THE QUEEN, IN RIGHT OF CANADA, AS REPRESENTED BY THE MINISTER OF NATIONAL DEFENCE (Canada)
(74) Agent:
(74) Associate agent:
(45) Issued: 2003-09-23
(22) Filed Date: 1995-01-20
(41) Open to Public Inspection: 1995-12-18
Examination requested: 2001-09-28
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
08/262,061 United States of America 1994-06-17

Abstracts

English Abstract




An imaging system wherein two field-of-view (FOV)
images of a scene are obtained and displayed as a single image
so that both FOVs can be viewed simultaneously by an observer.
This imaging system is particularly directed to infrared (IR)
imaging of a scene which is useful for search-and-rescue (SAR)
operations. One IR image sensor provides a low resolution, high
sensitivity, wide field-of-view (WFOV) image of the scene which
is displayed as a large, low resolution, WFOV image on a display
apparatus. A second IR sensor provides a high resolution narrow
field-of-view (NFOV) image of a selected portion of the scene
wherein the second sensor's NFOV optics can be directed to any
particular area within the WFOV. That smaller NFOV image is
also displayed, by the display apparatus, within the WFOV
displayed image and it is displayed at the same area in the WFOV
image as that actually covered by the NFOV, the NFOV image
displacing the WFOV image of that area with the two images being
fused together into a single image by the display apparatus.
This is rather similar in operation to an observer's eye which
has a high resolution central NFOV vision and a very large FOV
peripheral vision that has low resolution but a high sensitivity
to detection of any anomaly. By using an eye movement tracking
system to aim the NFOV image detector and to set the position at
which the NFOV image is displayed, an observer's central vision can always
be directed to the high resolution NFOV image with that
observer's peripheral vision simultaneously viewing the WFOV low
resolution image.


Claims

Note: Claims are shown in the official language in which they were submitted.



THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE PROPERTY
OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:

1. An imaging system comprising at least one imaging
device to obtain a low resolution wide field-of-view (WFOV)
image of a scene and means to simultaneously obtain a high
resolution narrow field-of-view (NFOV) image of a selected
portion of the scene wherein that selected portion can be
selectively directed to any area within the WFOV, the system
including an image display arrangement with a means to display
the low resolution WFOV image, a means to display the high
resolution NFOV image within the WFOV displayed image at the
same area in the WFOV image as that covered by the NFOV and a
means to blank out that portion of the WFOV displayed image in
which the NFOV image is present with the two images being fused
together in a single image by the display apparatus.



2. An imaging system as defined in Claim 1, wherein said
imaging device to obtain the WFOV image is a camera with a low
resolution, high sensitivity, infrared (IR) image sensor and the
means to simultaneously obtain a high resolution NFOV image is a
second, high resolution, IR image camera with optics that can be
directed towards said selected portion by first servo-
mechanisms.



3. An imaging system as defined in Claim 2, wherein the
NFOV displayed image position in the WFOV displayed image is
controlled by movements of the high resolution display's
position under the control of second servomechanisms, control of
the first and second servomechanisms being slaved to an eye
motion tracking system with the first and second servo-
mechanisms movements being synchronized.



4. An imaging system as defined in Claim 2, wherein the
NFOV displayed image position in the WFOV displayed image is
controlled by movements of the high resolution display's
position under the control of second servomechanisms, a joystick
controlling the first and second servomechanisms whose movements
are synchronized.



5. An imaging system as defined in Claim 2, wherein optics
associated with said second, high resolution, IR image camera
contain a steering mirror to direct an image from said selected
portion to a second IR image sensor, the steering mirror's
position being determined by said first servomechanisms.



6. An imaging system as defined in Claim 5, wherein the
steering mirror directs a NFOV image from said selected portion
through a lens to a 45° folding mirror having a central
aperture, the folding mirror reflecting the NFOV image to an
image-derotator which reflects the NFOV image back through said
aperture and through further lenses to an input lens for said
second IR image sensor.



7. An imaging system as defined in Claim 6, wherein said
input lens is fixed to a microscanning mechanism that can
displace the NFOV image by a fraction of a pixel pitch with
respect to the second IR image sensor.



8. An imaging system as defined in Claim 6, wherein the
image-derotator is a 90° roof mirror image-derotator.



9. An imaging system as defined in Claim 8, wherein the
input lens is fixed to a microscanning mechanism that can
displace the NFOV image by a fraction of a pixel pitch with
respect to the second IR image sensor.



10. An imaging system as defined in Claim 6, wherein the
NFOV displayed image position in the WFOV displayed image is
controlled by movements of the high resolution NFOV display's
position under the control of second servomechanisms, control of
the first and second servomechanisms being slaved to an eye
motion tracking system with the first and second servomechanisms
movements being synchronized.



11. An imaging system as defined in Claim 1, wherein the
image display arrangement comprises a signal processor for
electronic processing of WFOV signals from said one imaging
device and further NFOV signals from the means to simultaneously
obtain a high resolution NFOV image; processed WFOV signals from
said signal processor being directed to a WFOV image projector
having a first lens that projects a WFOV image onto a viewing
screen, processed NFOV signals from said signal processor being
directed to a high resolution NFOV image projector having a
second lens that projects a NFOV image to a steering mirror that
is rotatable about two axes by servomechanisms which are
controlled by signals from said signal processor, the steering
mirror reflecting the NFOV projected image onto a selected
portion of said viewing screen where both images can be viewed
simultaneously, the NFOV projected image's position being
applied to said signal processor which blanks out a portion of
the WFOV projected image that corresponds with the area covered
by the NFOV projected image on the viewing screen.



12. An imaging system as defined in Claim 11, wherein an
image-derotator is located between the NFOV image display
projector and said second lens in the NFOV image projector, the
NFOV projected image being directed to the image-derotator and
from the image-derotator to said second lens, the image-
derotator being controlled by said signal processor from signals
of said NFOV projected image's position.



13. An imaging system as defined in Claim 12, wherein the
WFOV image projector comprises a low resolution liquid crystal
display (LCD) screen to which the processed WFOV signals are
applied and a first projection light source directed to that LCD
screen to project a WFOV image from the low resolution screen to
said first lens and the NFOV image projector comprises a high
resolution LCD screen to which the processed NFOV signals are
applied and a second projection light source directed to the
high resolution LCD screen to project a NFOV image from that
screen to said second lens.



14. An imaging system as defined in Claim 13, wherein an
eye motion tracking system is directed to an observer of said
viewing screen and signals from said eye motion tracking system
are applied to said signal processor which controls, in
accordance with those signals, the servomechanisms that position
the NFOV image projector's steering mirror such that the
projected NFOV image's position on said viewing screen is
located at an area towards which the observer's eyes are
directed, the signal processor simultaneously directing the
means to obtain a NFOV image to a selected portion of the scene
which corresponds to that of the NFOV projected image area in
the WFOV projected image on the viewing screen.



15. An imaging system as defined in Claim 2, wherein the
image display arrangement comprises a signal processor for
electronic processing of WFOV signals from said one imaging
device and further NFOV signals from the means to simultaneously
obtain a high resolution NFOV image; processed WFOV signals from
said signal processor being directed to a WFOV image projector
having a first lens that projects a WFOV image onto a viewing
screen, processed NFOV signals from said signal processor being
directed to a high resolution NFOV image projector having a
second lens that projects a NFOV image to a steering mirror that
is rotatable about two axes by servomechanisms which are
controlled by signals from said signal processor, the steering
mirror reflecting the NFOV projected image onto a selected
portion of said viewing screen where both images can be viewed
simultaneously, the NFOV projected image's position being
applied to said signal processor which blanks out a portion of
the WFOV projected image that corresponds with the area covered
by the NFOV projected image on the viewing screen.



16. An imaging system as defined in Claim 15 wherein an
image-derotator is located between the NFOV image display
projector and said second lens in the NFOV image projector, the
NFOV projected image being directed to the image-derotator and
from the image-derotator to said second lens, the image-
derotator being controlled by said signal processor from signals
of said NFOV projected image's position.



17. An imaging system as defined in Claim 16, wherein the
WFOV image projector comprises a low resolution liquid crystal
display (LCD) screen to which the processed WFOV signals are
applied and a first projection light source directed to that LCD
screen to project a WFOV image from the low resolution screen to
said first lens and the NFOV image projector comprises a high
resolution LCD screen to which the processed NFOV signals are
applied and a second projection light source directed to the
high resolution LCD screen to project a NFOV image from that
screen to said second lens.




18. An imaging system as defined in Claim 17, wherein an eye
motion tracking system is directed to an observer of said
viewing screen and signals from said eye motion tracking system
are applied to said signal processor which controls, in
accordance with those signals, the servomechanisms that position
the NFOV image projector's steering mirror such that the
projected NFOV image's position on said viewing screen is
located at an area towards which the observer's eyes are
directed, the signal processor simultaneously directing the
second, high resolution, IR image camera's optics to a selected
portion of the scene which corresponds to that of the NFOV
projected image area in the WFOV projected image on the viewing
screen.



19. An imaging system as defined in Claim 3, wherein the
image display arrangement comprises a signal processor for
electronic processing of WFOV signals from said one imaging
device and further NFOV signals from the means to simultaneously
obtain a high resolution NFOV image; processed WFOV signals from
said signal processor being directed to a WFOV image projector
having a first lens that projects a WFOV image onto a viewing
screen, processed NFOV signals from said signal processor being
directed to a high resolution NFOV image projector having a
second lens that projects a NFOV image to a steering mirror that
is rotatable about two axes by servomechanisms which are
controlled by signals from said signal processor, the steering
mirror reflecting the NFOV projected image onto a selected
portion of said viewing screen where both images can be viewed
simultaneously, the NFOV projected image's position being
applied to said signal processor which blanks out a portion of
the WFOV projected image that corresponds with the area covered
by the NFOV projected image on the viewing screen.



20. An imaging system as defined in Claim 19, wherein an
image-derotator is located between the NFOV image projector and
said second lens, the NFOV projected image being directed to the
image-derotator and from the image-derotator to said second
lens, the image-derotator being controlled by said signal
processor from signals of said NFOV projected image's position.


Description

Note: Descriptions are shown in the official language in which they were submitted.




FIELD OF THE INVENTION
The present invention relates to an imaging system and
in particular to an infrared imaging system having two
fields-of-view (FOVs) displayed as a single image, one FOV being a wide
field-of-view (WFOV) infrared (IR) image having a low resolution
and the other FOV being a narrow field-of-view (NFOV) high
resolution infrared (IR) image which is displayed within the
WFOV image at the same area in the WFOV image as that covered by
the NFOV, the NFOV image replacing the WFOV image at that area
with the two FOV images being fused together in the displayed
image.



BACKGROUND TO THE INVENTION
In Search and Rescue (SAR) operations, a search is
normally performed visually from airplanes or helicopters.
However, that type of visual search is commonly interrupted at
night or when adverse weather conditions exist. The capability
of SAR operations at night or during adverse weather conditions
can be made more practical and greatly improved by the use of
special equipment such as infrared (IR) or thermal imagers.
However, the performance of existing systems is limited by the
technology presently in use. Current airborne
IR systems, for instance Forward Looking Infrared (FLIR)
systems, use a scanning mechanism which has to be compatible
with a standard video format. These IR systems usually possess
two fields-of-view (FOV) imagers, a wide FOV (WFOV) one for
search and detection which has a low resolution and a narrow FOV
(NFOV) one with a high resolution which is used for recognition
and/or identification. However, since the scanning mechanism
for these IR systems must be compatible with the standard video
format, these IR detectors' dwell time (i.e. exposure time) is
limited to a few microseconds. This fixed integration time
combined with the lower resolution of the WFOV mode reduces the
system sensitivity which leads to a decreased detection
probability and detection range. The NFOV mode inherently
performs better than the WFOV for detection but only if it is
looking in the right direction. Since the NFOV only covers a
small search area, its use increases the search time required to
cover the same area as that provided by a WFOV whereas the use
of a low resolution WFOV imaging system decreases the
probability of detection.
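As a rough illustration of this trade-off (the calculation is not part of the patent), the nominal fields of view quoted later in the description, a WFOV of 60° or more and a NFOV of about 8°, imply that on the order of fifty NFOV frames would be needed to tile a single WFOV frame:

    # Rough illustration only: number of NFOV looks needed to tile one WFOV frame,
    # using the nominal FOVs quoted later in this description (60 deg WFOV, ~8 deg
    # NFOV) and ignoring overlap, scan geometry and projection effects.
    wfov_deg = 60.0
    nfov_deg = 8.0
    looks = (wfov_deg / nfov_deg) ** 2
    print("~%.0f NFOV frames per WFOV frame" % looks)  # ~56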
New IR detectors or sensors, which are now available,
can solve the inherent problems associated with scanning
mechanism used in current IR systems. These recently developed
sensors are known as Focal Plane Arrays (FPA) and allow for the
simultaneous recording of all the pixels forming the image of a
scene over a period of time for one frame of the image, as in a
photographic camera. This is in contrast to building up an
image by the sequential scanning of a single or a combination of
detectors across the scene since the integration time for each
pixel in the scanning process is dependent on the scanning rate.
These FPA sensors' mode of operation dramatically increases the
sensitivity of the IR imaging systems because of the increased
signal integration time for each pixel as compared to the
integration time for scanning systems.
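The sensitivity argument can be made concrete with a back-of-the-envelope sketch; the square-root scaling assumed here applies to shot-noise-limited detection and is an assumption of this illustration, not a statement from the patent:

    # Illustrative only: relative SNR gain from a longer per-pixel integration time,
    # assuming shot-noise-limited detection where SNR scales as sqrt(t_int).
    # The few-microsecond dwell figure comes from the text above; the 1 s staring
    # time is the upper figure quoted later for the stabilized WFOV FPA sensor.
    import math
    t_scan = 5e-6   # s, per-pixel dwell of a scanned detector ("a few microseconds")
    t_stare = 1.0   # s, staring time of a motion-compensated FPA
    print("SNR improvement ~%.0fx" % math.sqrt(t_stare / t_scan))  # a few hundred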
The optimization of SAR operations involves selecting
search patterns for maximizing the search area with the minimum
loss of detection probability. These search patterns are
dependent on an aircraft speed and flight altitude which have to
be determined from atmospheric conditions and the system's
search FOV. Those are factors which determine the area that
should be covered on each airplane pass. This optimization of
SAR operations also involves the close relationship between the
search pattern imposed onto the imaging system, so as to cover
as wide an area as possible and reduce the search time, and the
visual search process of the human observer on the type of
display presenting the information from the imaging system. In
any given SAR operation, different parameters must be optimized
in order to ensure that the imaging system and the human observer
interface in a manner that lets them perform properly together. The parameters chosen
are ones which minimize the search time and optimize the visual
search process of the human observer, i.e. the information
displayed must not overload the observer.
A factor to be considered in interfacing a human
observer with a display arrangement is that the human eye has
basically two FOVs, a narrow central one and a large peripheral
FOV. The human eye is provided with a narrow (~7°) central FOV
with a high resolution along with a very large (~180°)
peripheral FOV of low resolution but very high sensitivity,
particularly in the dark. That high resolution narrow central
FOV is used to examine, recognize and identify objects. The
peripheral, low resolution, FOV is used for the detection of any
anomaly or new object coming into a scene since this peripheral
vision is very sensitive to any movement that takes place in its
FOV.
U.S. Patent 5,262,871 by Joseph Wilder et al recognized
that the human eye is able to foveate, or focus its attention,
on a region of interest in its field of view at a high
resolution while absorbing information at low resolution in
areas peripheral to that region and describes an image sensor
with those properties that is useful in high speed processing of
visual data for industrial tasks such as inspecting bottles
which are passing a high speed filling line. In the image
sensor described by Wilder et al, the device includes a means to
randomly address individual pixels wherein relatively large
groups of pixels can be read out simultaneously to provide high
speed data capture although at a lower resolution. This reduces
the amount of data that has to be processed in an inspection
task and is useful to rapidly scan a scene being viewed in order
to locate an area of interest. The random addressing of pixels
enables the readout of pixels located in selected regions of
interest and, once an area of interest is located, the number of
pixels read out at a time may be reduced to provide a higher
resolution, lower speed, readout of the areas of interest. This
image sensor array can be selectively operated so that selected
portions can be read out with varying resolution levels.
Although U.S. Patent 5,262,871 recognizes and uses a sensor with
some features similar to an eye, its image sensor operates by
selectively varying the readout rate for the various pixels or
groups of pixels in order to obtain a selective resolution on
particular regions.



SUMMARY OF THE INVENTION
It is an object of the present invention to optimize
the close relationship between an imaging system and its
interface with the visual search process of a human observer on
a display which presents images of information from the imaging
system to the observer.
An imaging system, according to one embodiment of the
present invention, includes at least one imaging device to
obtain a low resolution wide field-of-view (WFOV) image of a
scene and means to simultaneously obtain a high resolution
narrow field-of-view (NFOV) image of a selected portion of the
scene wherein that selected portion can be selectively directed
to any area within the WFOV, the system including an image
display arrangement with a means to display the low resolution
WFOV image, a means to display the high resolution NFOV image
within the WFOV displayed image at the same area in the WFOV
image as that covered by the NFOV and a means to blank out that
portion of the WFOV displayed image in which the NFOV image is
present with the two images being fused together in a single
image by the display apparatus.
An imaging system, according to a further embodiment of
the invention, wherein the imaging device to obtain the WFOV
image is a camera with a low resolution, high sensitivity,
infrared (IR) image sensor and the means to simultaneously
obtain a high resolution NFOV image is a second, high
resolution, IR image camera with optics that can be directed
towards said selected portion by servo-mechanisms.

BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described in more detail with
reference to the accompanying drawings, in which:
Figure 1 illustrates the operation of an imaging system
according to the present invention having a narrow field-of-view
detector covering a smaller area of a scene which is being
viewed by a wide field-of-view detector;
Figure 2 is a perspective view of a narrow field-of-
view imaging device that could be used in Figure 1 in accordance
with one embodiment of the present invention and a steering
mechanism for the device;
Figure 3 is a schematic diagram, parts being in cross-
section, of a top view of the narrow field-of-view image device
shown in Figure 2; and
Figure 4 illustrates an image viewing apparatus
according to one embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The optimization of the search process aboard an
aircraft is an aspect of any Search and Rescue (SAR) operation
that is required in order to save effort and reduce the search
time. Different parameters must, therefore, be optimized to
ensure that the search imaging system and the human observer
will operate together in an integrated manner. This involves
the optimization of the close relationship between the search
pattern imposed onto the imaging system in order to cover as
wide an area as possible and the visual search process of the
human observer on the display presenting the information from
the imaging system. The aircraft's speed and flight altitude,
which depend on atmospheric conditions, the imaging system
search FOV and the area to be covered in each airplane pass are
factors which all have to be considered in the optimization of
search patterns to maximize the search area with a minimum loss
of detection probability.
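A sketch (not from the patent) of the coverage geometry this paragraph describes qualitatively: for an assumed nadir-looking sensor, the ground swath per pass and the area coverage rate follow from flight altitude, search FOV and ground speed.

    import math

    def coverage_rate(altitude_m, fov_deg, speed_mps):
        """Return (swath width in metres, area coverage rate in km^2 per hour)."""
        swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
        rate_km2_per_h = swath * speed_mps * 3600.0 / 1e6
        return swath, rate_km2_per_h

    # Illustrative numbers only; the patent does not specify a flight profile.
    swath, rate = coverage_rate(altitude_m=300.0, fov_deg=60.0, speed_mps=70.0)
    print("swath ~%.0f m, coverage ~%.0f km^2/h" % (swath, rate))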
In the visual search process of the human observer, a
factor that should be considered is that the human eye has
essentially two fields-of-view (FOV), a narrow central FOV and a
very large peripheral FOV. The central FOV covers an angle of
about 7° and has a high resolution which is used to examine,
recognize and identify objects whereas the peripheral vision FOV
extends ~180°, but has a low resolution. However, the very
large peripheral vision FOV, although having a low resolution,
has a very high sensitivity (particularly in the dark) for the
detection of any anomaly or new object coming into the scene
since it is very sensitive to any movement which occurs.
Figure 1 illustrates the basic operation of an imaging
system according to the present invention wherein scene 100
represents a wide area of search and 200 is a small narrow
region of interest inside of 100. The image of scene 100 is
obtained by an infrared (IR) image sensor 1 (3 to 5 μm or 8 to
12 μm operating wavelength) and optical lens 2 but at a low
resolution in order to cover this wide field-of-view (WFOV).
That low resolution image sensor's data is then processed and
presented to a display such as illustrated in Figure 4 where it
can be observed by an operator. The IR WFOV imaging device has
a field-of-view (FOV) of 60° or more with low resolution but
high sensitivity achieved by an operator controlled integration
or staring time for the IR focal plane array (FPA) sensor 1.
That staring time could be up to 1 second if compensation is
provided for platform movement. Image processing algorithms can
be implemented to improve the detection of movements in the
WFOV.
The image of the narrow field-of-view (NFOV) area 200
of the scene is obtained by an IR FPA image sensor 10 and
optical imaging device 12. This NFOV covers a much smaller area
(e.g. about 8°) with a high resolution using currently available
IR FPA of 256 x 256 detectors. That high (enhanced) resolution
can be obtained with the implementation of known techniques, to
the sensor, such as a microscanning technique to increase the
number of samples within the image and cover the gaps between
the detectors. One microscanning technique is described in a
paper entitled "Sampling effects in CdHgTe focal plane arrays-
practical results" by R.J. Dann et al which was published in
SPIE, Vol. 685, Infrared Technology XII (1986) on pages 123 to
127. That microscanning technique will be described in more
detail later and is generally illustrated in Figure 1 by the
arrows Mx and My. Other techniques to increase image resolution
could be used such as a dithering motion of an image sensor as
described in U.S. Patent 5,301,042 which issued to D.L. Blanding
on 5 April, 1994.
The NFOV is displaceable within and moveable over the
entire large search WFOV area under control of an operator
through rotation (steering) of NFOV reflecting mirror 14 about
two axes shown by arrows Rx and Ry. The steering mirror 14
reflects an image of a portion 200 of the WFOV search area 100
in Figure 1 to the high resolution NFOV image sensor 10. The
NFOV image sensor 10 can, as a result of rotation of mirror 14,
obtain a NFOV image of any particular portion 200 of the scene
100 simultaneously with a WFOV image of scene 100 being obtained
by the WFOV low resolution image sensor 1. The portion of the
scene 100 towards which the NFOV image sensor is directed is
under the control of an operator either through a joystick, via
servomechanisms, or through coupling the control of the steering
mechanism to the observer's eye movements when viewing the
displayed images by means of an eye motion tracking system. The
latter technique is preferred since displacement of the NFOV
must be done at a speed adequate to match eye movements in order
to minimize discomfort to the observer. The WFOV and NFOV
images are obtained simultaneously and presented to the same
display screen with the NFOV image displacing the same portion
of the scene in the WFOV image as that observed by the NFOV.
The NFOV and WFOV images are fused together on the same display,
thus constantly keeping the large low resolution WFOV image of a
search area within the eyesight of an observer while the
observer may use the high resolution NFOV to view details of an
area of interest in the WFOV once a detection, such as motion,
has been observed. A system for displaying a NFOV and WFOV to
an observer simultaneously is illustrated in Figure 4. A
perspective view of a NFOV imaging system, i.e. camera, and
steering mechanism, according to one preferred embodiment of the
invention, is illustrated in Figure 2 with a top schematic view
of the NFOV opto-mechanical layout and lens system being shown
in Figure 3. The NFOV optical view 200' is reflected by a
steering mirror 14, rotatable about two axes x and y, through a
lens L1 to a 45° folding mirror M2 having a central aperture 16.
The two-axis steering mirror 14 allows IR radiation from any
NFOV portion (about 8°) of the WFOV scene to enter the NFOV
optical system from anywhere within the WFOV of a search scene.
The use of a single steering mirror for two-axis positioning
reduces the number of optical components at the cost of image
rotation around the elevation axis x which is parallel to the
optical axis. This image rotation necessitates the use of an
image-derotator M1 which can be a simple 90° roof mirror that is
rotatable about an axis 15 in directions shown by arrows Rdr.
The folding mirror M2 with aperture 16 reflects the image to the
image-derotator and is necessary since the use of a mirror
image-derotator M1 sends optical IR rays back along the optical
axis. These reflected rays, from the image-derotator, are
directed through aperture 16 of mirror M2 where they are focused
by lenses L2 and L3 onto the detector's input lens L4 for the
image sensor. The main imaging optics is constituted by an
afocal telescope formed by lenses L1, L2, L3 and the detector's
input lens L4. This telescope arrangement allows for adjustment
of the NFOV through the interchange of one of the lenses L1 or
L3 without modifying the detector's input lens L4 which, in this
embodiment, is fixed to a microscanning mechanism.
A two-mirror steering system could be designed which
would not produce any image rotation such as that caused by the
use of a single steering mirror 14 with two-axis positioning.
However, a two mirror system would require an intermediate field
lens in order to reduce the size of one of the mirrors because
of the wide angle coverage. This would increase the number of
refractive surfaces as well as making the optical system larger
and more cumbersome. Other types of steering mechanism could be
used such as counter-rotating prisms or split cubes. However,
these types have less desirable chromatic characteristics and
their costs of fabrication are generally higher. Other types of
mirror image-derotator, i.e. refractive types of derotators,
could be used rather than a roof-mirror M1. However, a simple
90° roof mirror image-derotator is preferred in the sense that
this type has only two reflecting surfaces, can be made
of large aperture at low costs, possesses a large angular
coverage and does not introduce any chromatic aberration into
the optical system.
The lenses L1, L2 and L3 direct the NFOV rays from
steering mirror 14 and image derotator M1 to the input lens L4
for the NFOV IR image sensor. The input lens L4 is fixed to a
microscanning mechanism, which is used to increase the
resolution of the sensor, and has a focal length compatible with
the distance between the dewar window 60 and the actual detector
array (10' in Figure 3) position within the dewar. L2 is a
field lens to keep L3 and L4 within reasonable dimensions and
above all to position the optical exit aperture at the entrance
aperture of the detector cold shield to avoid vignetting.
A microscanning arrangement is used to increase the
resolution of the high resolution NFOV IR image sensor which is
accomplished by allowing multiple images to be formed on a
mosaic of the image sensor's detector elements with a small
displacement being created between each of the images. In
microscanning arrangements, an image scene is displaced by some
fraction of a pixel pitch on the detector array for each field.
This displacement of the image with respect to the array may be
done by small movements of either the lens that focuses the
image on the image sensor or by small movements of the image
sensor. If a 2 x 2 microscan is applied, for instance, the
first field records the image at a first reference position on
the pixel array. The image is then displaced, with respect to
the array, by half a pixel pitch to the right to record a second
image and then a half a pixel pitch vertically to record a third
image. In this 2 x 2 microscan, the image is next displaced by
half a pixel pitch to the left in order to record a fourth image
and then half a pixel pitch vertically to return the image to
its original, first reference, position. These four microscan
images are then merged together as interlaced fields to form a
regular full frame image containing the four fields. In a
similar manner, the image could be displaced by one third of a
pixel pitch for each step to implement a 3 x 3 microscan or one
fourth of a pixel pitch for a 4 x 4 microscan. These microscan
patterns have been described in the previously mentioned paper
by R.J. Dann et al wherein a single plane mirror actuated by
piezo-ceramic transducers was used for image displacement.
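A minimal sketch of how the four 2 x 2 microscan fields described above could be interleaved into one doubled-resolution frame; the field ordering, array sizes and index convention are assumptions made for illustration:

    import numpy as np

    def merge_2x2_microscan(f0, f1, f2, f3):
        # Interleave four half-pixel-shifted fields (each h x w) into a 2h x 2w frame.
        # Assumed order: f0 reference, f1 shifted right, f2 shifted right and down,
        # f3 shifted down, matching the sequence described above.
        h, w = f0.shape
        frame = np.zeros((2 * h, 2 * w), dtype=f0.dtype)
        frame[0::2, 0::2] = f0
        frame[0::2, 1::2] = f1
        frame[1::2, 1::2] = f2
        frame[1::2, 0::2] = f3
        return frame

    # Example with random data standing in for 256 x 256 FPA fields:
    fields = [np.random.rand(256, 256) for _ in range(4)]
    print(merge_2x2_microscan(*fields).shape)  # (512, 512)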
An opto-mechanical component, according to one
embodiment of the present invention, establishes a two-axis
microscanning of the image over the NFOV detector array using an
appropriate holder and precision sliding of the lens L4
(see arrows Mx and My in Figures 1 and 2) with piezoelectric
elements being used for micro-positioning the lens. The holder
and opto-mechanical mechanism preferably have a step rate
requirement for positioning the lens at 240 Hz with a rest time
of >3ms between steps.
An image display arrangement, according to one
embodiment of the invention, for displaying and fusing together
a NFOV image and a WFOV image into a single image by a display
device so that both images can be viewed simultaneously, by an
observer, is illustrated in Figure 4.
The low resolution WFOV image signals are
electronically processed in a signal processor 20 with its WFOV
signal output being directed, along transmission line 22, to a
WFOV image projector 25 as shown in Figure 4. In that WFOV
image projector 25, a projection light source 31 is directed
towards a liquid crystal display (LCD) screen 30, to which the
WFOV processed image signals are applied, and projects a WFOV
image from LCD screen 30 through a lens 40 onto a diffuser
screen 60. That WFOV, low resolution, image projected onto
screen 60 can then be viewed by an observer 50 through a bi-
ocular lens 46. A bi-ocular lens is one in which both eyes of
an observer look through the same large lens and these are
used, for example, in night vision goggles or driver's
periscopes. The low resolution WFOV image on screen 60 will, to
the observer, appear through the bi-ocular lens 46 as a virtual
image in the plane 110. A circle 112 in the WFOV image
projected onto screen 60 will, therefore, appear as an enlarged
circle 112' in the virtual image plane 110.
The high resolution NFOV image signals are
electronically processed in signal processor 20 with its NFOV
signal output being directed, along transmission line 21, to the
NFOV image projector 24. In that NFOV projector 24, a
projection light source 33 is directed to a high resolution LCD
screen 32 (1024 x 1024 pixels required for a 4 x 4 microscan
with a 256 x 256 IR FPA), to which the high resolution processed
NFOV image signals are applied. Light source 33 projects a high
resolution NFOV image from LCD screen 32 through an image-
derotator 34 and lens 42 to a steering mirror 44. The steering
mirror 44 is rotatable about two axes, shown by arrows Rx and Ry,
by servomechanisms under control of signals from the signal
processor transmitted to the NFOV projector 24 via transmission
line 21'. The mirror 44 reflects and directs the NFOV projected
image onto a selected portion 210 of the screen 60. That NFOV
image 210 can then be viewed, by the observer 50, through the
bi-ocular lens 46 as a NFOV virtual image 210' in the virtual
image plane 110. That virtual image 210' will appear, to the
observer, as a NFOV image inside of the observed WFOV image with
the signal processor blanking out the portion of the WFOV image
in which the NFOV image appears. The signal processor 20 can
rotate steering mirror 44, under control of observer 50, to
project the NFOV image 210 onto any portion of the screen 60.
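A purely digital analogue of this blanking-and-fusion step is sketched below; the patent performs it optically with two projectors under signal-processor control, and the array sizes, placement and subsampling used here are assumptions for illustration only:

    import numpy as np

    def fuse_fovs(wfov, nfov, top, left, scale):
        # Blank the WFOV pixels under the NFOV footprint, then overwrite them with
        # the NFOV image subsampled onto the WFOV pixel grid.
        out = wfov.copy()
        h, w = nfov.shape[0] // scale, nfov.shape[1] // scale
        out[top:top + h, left:left + w] = 0
        out[top:top + h, left:left + w] = nfov[::scale, ::scale]
        return out

    wfov = np.random.rand(480, 640)   # low resolution WFOV frame (assumed size)
    nfov = np.random.rand(512, 512)   # high resolution NFOV frame (assumed size)
    print(fuse_fovs(wfov, nfov, top=100, left=200, scale=8).shape)  # (480, 640)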
The use of a single steering mirror 44 will cause image
rotation which is counteracted, in this case, by an image-
derotator 34 formed by rotating prisms 34 that are rotatable in
the directions of arrows Rdr under control of the signal
processor 20. Various types of image-derotators could be used,
as previously mentioned in the description of the NFOV image
camera, including electronic modifications of the NFOV image
signals applied to LCD 32. The image-derotator 34 and steering
mirror 44 will be located outside of the WFOV projection beam's
path. The LCD screens and projection lamps in the WFOV and NFOV
projections illustrate one type of system but projection cathode
ray tubes (CRTs) could be used instead.
Although a joystick has been mentioned as one means for
positioning the NFOV image, the use of an eye motion tracking
system 26 with LEDs and detectors 28, as illustrated in Figure
4, is preferred since it frees the observer 50 from the need to
manually position the NFOV image. Signals from the eye motion
tracking system 26 are applied, by transmission line 23, to the
signal processor 20 which then positions the steering mirror 44
and image-derotator 34 to project a NFOV image onto a selected
portion of screen 60 in accordance with those signals. The
signal processor 20 will also, at the same time, direct the
optics (steering mirror 14) of the NFOV image camera to that
selected portion of the WFOV scene. The eye motion tracking
system 26 will, as a result, allow the position of the NFOV
image, as seen by observer 50, to be moved synchronously with
any movements of the observer's eyes so that the observer's
central, high resolution, vision is always directed to the NFOV
displayed image while the observer's peripheral vision is
simultaneously viewing the low resolution WFOV image.
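The slaving loop described in this paragraph can be summarized in a small sketch: a gaze point reported by the eye motion tracking system is converted into one angular command that is sent to both the display steering mirror 44 and the camera steering mirror 14. The linear screen-to-angle mapping and the identical command for both mirrors are simplifying assumptions; the patent only requires that the two servomechanism movements be synchronized.

    def slave_to_gaze(gaze_x, gaze_y, screen_w, screen_h, wfov_deg=60.0):
        # Map a gaze point on the display to synchronized mirror commands (degrees).
        # Linear mapping about the screen centre; assumed, not taken from the patent.
        nx = gaze_x / screen_w - 0.5
        ny = gaze_y / screen_h - 0.5
        az = nx * wfov_deg
        el = ny * wfov_deg * (screen_h / screen_w)  # assume FOV scales with aspect ratio
        command = (az, el)
        return command, command   # same command for display mirror 44 and camera mirror 14

    display_cmd, camera_cmd = slave_to_gaze(820, 240, screen_w=1024, screen_h=768)
    print(display_cmd, camera_cmd)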
Various modifications may be made to the preferred
embodiments without departing from the spirit and scope of the
invention as defined in the appended claims. Imaging systems,
for instance, such as a helmet mounted display could be used
wherein the IR WFOV image detector and display is coupled to the
observer's head movements and the high resolution IR NFOV image
detector and display is coupled to the observer's eye movements.
Although the invention has been described with respect to IR
imaging for an SAR operation, the same principle could be used
for other purposes in which it is necessary to view an image
such as in ultrasonic examination or inspection systems.





Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Administrative Status

Title Date
Forecasted Issue Date 2003-09-23
(22) Filed 1995-01-20
(41) Open to Public Inspection 1995-12-18
Examination Requested 2001-09-28
(45) Issued 2003-09-23
Deemed Expired 2013-01-21

Abandonment History

There is no abandonment history.

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $0.00 1995-01-20
Maintenance Fee - Application - New Act 2 1997-01-20 $100.00 1996-11-07
Maintenance Fee - Application - New Act 3 1998-01-20 $100.00 1997-10-21
Maintenance Fee - Application - New Act 4 1999-01-20 $100.00 1998-10-28
Maintenance Fee - Application - New Act 5 2000-01-20 $150.00 1999-11-09
Maintenance Fee - Application - New Act 6 2001-01-22 $150.00 2000-10-24
Request for Examination $400.00 2001-09-28
Maintenance Fee - Application - New Act 7 2002-01-21 $150.00 2001-10-01
Registration of a document - section 124 $0.00 2001-11-09
Maintenance Fee - Application - New Act 8 2003-01-20 $150.00 2002-10-21
Final Fee $300.00 2003-06-17
Maintenance Fee - Patent - New Act 9 2004-01-20 $150.00 2003-10-31
Maintenance Fee - Patent - New Act 10 2005-01-20 $250.00 2004-11-18
Maintenance Fee - Patent - New Act 11 2006-01-20 $250.00 2005-11-23
Maintenance Fee - Patent - New Act 12 2007-01-22 $250.00 2006-11-01
Maintenance Fee - Patent - New Act 13 2008-01-21 $250.00 2007-11-07
Maintenance Fee - Patent - New Act 14 2009-01-20 $250.00 2009-01-19
Maintenance Fee - Patent - New Act 15 2010-01-20 $450.00 2010-01-18
Maintenance Fee - Patent - New Act 16 2011-01-20 $450.00 2011-01-17
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
HER MAJESTY THE QUEEN, IN RIGHT OF CANADA, AS REPRESENTED BY THE MINISTER OF NATIONAL DEFENCE
Past Owners on Record
CHEVRETTE, PAUL C.
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.



Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Representative Drawing 1999-12-07 1 19
Representative Drawing 2003-03-04 1 15
Cover Page 2003-08-21 2 63
Cover Page 1996-02-06 1 16
Abstract 1995-12-18 1 41
Description 1995-12-18 16 641
Claims 1995-12-18 8 281
Drawings 1995-12-18 4 80
Correspondence 2009-05-13 1 14
Fees 2002-10-21 1 38
Assignment 1995-01-20 6 298
Prosecution-Amendment 2001-09-28 1 55
Prosecution-Amendment 2001-10-03 2 74
Prosecution-Amendment 2002-06-17 2 59
Prosecution-Amendment 2002-08-29 4 171
Correspondence 2003-06-17 1 30
Fees 2001-10-01 1 36
Fees 2003-10-31 1 28
Fees 1999-11-09 1 37
Fees 1998-10-28 1 44
Fees 2000-10-24 1 36
Fees 1997-10-21 1 43
Fees 2004-11-18 1 27
Fees 2005-11-23 1 27
Correspondence 2005-12-01 1 28
Prosecution-Amendment 2005-12-14 1 12
Correspondence 2005-12-14 3 155
Fees 2006-11-01 3 41
Fees 2007-11-07 1 28
Correspondence 2009-01-06 1 25
Assignment 2009-02-18 2 72
Fees 2009-01-19 1 30
Fees 2010-01-18 1 32
Fees 2011-01-17 1 31
Correspondence 2012-03-16 1 31
Correspondence 2012-07-27 2 80
Fees 1996-11-07 1 42