Patent 2850811 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2850811
(54) English Title: CURVED SENSOR ARRAY CAMERA
(54) French Title: CAMERA A CAPTEUR RECOURBE
Status: Deemed Abandoned and Beyond the Period of Reinstatement - Pending Response to Notice of Disregarded Communication
Bibliographic Data
(51) International Patent Classification (IPC):
  • G02B 15/15 (2006.01)
  • G03B 11/00 (2021.01)
  • G03B 17/02 (2021.01)
  • G03B 30/00 (2021.01)
  • G03B 37/02 (2021.01)
  • H01L 27/146 (2006.01)
(72) Inventors :
  • SUTTON, GARY (United States of America)
  • LOCKIE, DOUGLAS GENE (United States of America)
  • BARTON, WILLIAM MAYNARD, JR. (United States of America)
(73) Owners :
  • GARY SUTTON
  • DOUGLAS GENE LOCKIE
  • WILLIAM MAYNARD BARTON, JR.
(71) Applicants :
  • GARY SUTTON (United States of America)
  • DOUGLAS GENE LOCKIE (United States of America)
  • WILLIAM MAYNARD BARTON, JR. (United States of America)
(74) Agent: MILTONS IP/P.I.
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2013-12-27
(41) Open to Public Inspection: 2015-06-27
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract


One embodiment of the present invention comprises methods and apparatus for a camera that includes a curved sensor. In another embodiment, the camera includes mechanical image stabilization. Yet another embodiment utilizes electronic image stabilization. Another embodiment incorporates optical image stabilization. In yet another embodiment, a camera with a conventional sensor includes an automatically-controlled lens shade which is mounted on the outside of the camera enclosure. This automatically-controlled lens shade extends for telephoto shots, and retracts for wider angle shots.


Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
1. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a curved sensor; said curved sensor including a plurality of mini-sensors which are separated by gaps;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element;
a signal processor connected to said curved sensor;
said signal processor being connected to an optical image stabilization circuit;
said signal processor recording a first exposure while said optical image stabilization circuit is active; said first exposure including only those portions of a first image which register with said plurality of mini-sensors;
said signal processor recording a second exposure while said optical image stabilization circuit is active; said second exposure being taken later in time than said first exposure; said second exposure including only those portions of a second image which register with said plurality of mini-sensors;
said first and second exposures then being compared by said signal processor to detect missing portions in each of said first and said second exposures;
a composite image then being produced by said signal processor using said first and said second exposures.
2. An apparatus as recited in Claim 1, in which:
said sensor generally includes a plurality of segments.
3. An apparatus as recited in Claim 2, in which:
said plurality of segments are disposed to approximate a curved surface.
4. An apparatus as recited in Claim 1, in which:
said curved sensor has a two dimensional profile which is not completely colinear with a straight line.

5. An apparatus as recited in Claim 1, in which:
said curved sensor is fabricated from ultra-thin silicon.
6. An apparatus as recited in Claim 5, in which said ultra-thin silicon ranges
from 10
to 250 microns in one dimension.
7. An apparatus as recited in Claim 1, in which:
said curved sensor is fabricated from polysilicon.
8. An apparatus as recited in Claim 1, in which:
said plurality of pixels are arranged on said curved sensor in varying
density.
9. An apparatus as recited in Claim 1, in which:
said sensor is configured to have a relatively higher concentration of pixels
generally
near the center of said sensor.

10. An apparatus as recited in Claim 1, in which:
said sensor is configured to have a relatively lower concentration of pixels
generally
near an edge of said sensor.
11. An apparatus as recited in Claim 1, in which:
said plurality of segments forms a gap between each of said plurality of
segments;
and
said gap is used as a pathway for an electrical connector.
12. An apparatus as recited in Claim 1, in which:
said sensor includes a wiring backplane disposed behind said sensor;
each of said plurality of facets is connected to a via to connect said via to
said wiring
backplane.

13. An apparatus as recited in Claim 1, in which:
said sensor is configured to have a relatively higher concentration of pixels
generally
near the center of said sensor.
14. An apparatus as recited in Claim 1, in which:
said sensor is configured to have a relatively lower concentration of pixels
generally
near an edge of said sensor.
15. An apparatus as recited in Claim 1, in which:
said relatively high concentration of pixels generally near the center of said
sensor
enables zooming into a telephoto shot using said relatively high concentration
of
pixels generally near the center of said sensor only, while retaining
relatively high
image resolution.

16. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a sensor;
said sensor being mounted inside said enclosure;
said sensor being aligned with said optical element;
said sensor being deliberately moved during the collection of said stream of
radiation
to enhance said image;
said sensor including a plurality of pixels;
said plurality of pixels are arranged on said curved sensor in varying
density;
said sensor having a plurality of connections through a plurality of vias to a
wiring
backplane;
said wiring backplane having an output to a signal processor for recording an
image.

17. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a sensor;
said sensor being mounted inside said enclosure;
said sensor being aligned with said optical element;
said sensor including a plurality of pixels;
said plurality of pixels are arranged on said curved sensor in varying
density;
said sensor having a plurality of connections through a plurality of vias to a
wiring
backplane;
said wiring backplane having an output to a signal processor for recording an
image;
and
an electronic stabilization circuit attached to said signal processor for
producing an
enhanced image.

18. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a sensor;
said sensor being mounted inside said enclosure;
said sensor being aligned with said optical element;
said sensor having a plurality of connections through a plurality of vias to a
wiring
backplane;
said wiring backplane having an output to a signal processor for recording an
image.

19. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a sensor;
said sensor being mounted inside said enclosure;
said sensor being aligned with said optical element;
said sensor including a plurality of pixels;
said plurality of pixels are arranged on said curved sensor in varying
density;
said sensor having a plurality of connections through a plurality of vias to a
wiring
backplane;
said wiring backplane having an output to a signal processor for recording an
image;
and
an electronic stabilization circuit attached to said signal processor for producing an enhanced image.

20. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a sensor;
said sensor being mounted inside said enclosure;
said sensor being aligned with said optical element;
a zoom lens; said zoom lens being mounted on said enclosure;
a zoom lens control mechanism; said zoom lens control mechanism being
connected
to said zoom lens; and
an automatically controlled lens shade; said automatically controlled lens
shade being
connected to said zoom lens control mechanism so that said automatically
controlled
lens shade is extended for telephoto exposures and is retracted for wide angle
exposures;
said automatically controlled lens shade being mounted on the exterior of said
enclosure.

21. An apparatus as recited in Claim 20, further comprising:
a scattered light sensor mounted outside the sensor frame; the output of said scattered light sensor detects the size of the telephoto image; and
a motor;
said motor connected to said scattered light sensor;
said motor connected to said lens shade; when said scattered light sensor
detects light
scattered outside said sensor, said motor extends said lens shade to shield
said optical
element from stray light.
22. An apparatus as recited in Claim 21, further comprising:
a gear mechanism; said gear mechanism being mounted inside said enclosure for
moving said zoom lens between telephoto and wide angle positions.

23. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a sensor;
said sensor being mounted inside said enclosure;
said sensor being aligned with said optical element;
a zoom lens; said zoom lens being mounted on said enclosure;
a manual zoom lens control mechanism; said manual zoom lens control mechanism
being connected to said zoom lens;
a manually controlled lens shade; said manually controlled lens shade being
mounted
on the exterior of said enclosure and over said optical element;
said manually controlled lens shade being connected to said manual zoom
control
mechanism so that said manually controlled lens shade is extended for
telephoto
shots, and is retracted for wide angle shots.

24. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a curved sensor;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element;
said curved sensor being deliberately moved during the collection of said
stream of
radiation to enhance said image;
said sensor having a plurality of connections through a plurality of vias to a
wiring
backplane;
said wiring backplane having an output to a signal processor for recording an
image.
25. An apparatus as recited in Claim 12, in which:
said sensor is a curved sensor.

26. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation; and
a curved sensor;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element;
an electronic image stabilization sensor, said electronic image stabilization
sensor
being mounted inside said enclosure; said electronic image stabilization
sensor for
sensing unwanted motion of said enclosure when an exposure is taken;
an actuator; said actuator being electrically connected to said electronic
image
stabilization sensor; said actuator being mechanically coupled to said curved
sensor;
said actuator for moving said curved sensor to counteract unwanted motion of
said
enclosure sensed by said electronic image stabilization sensor;
said sensor having a plurality of connections through a plurality of vias to a
wiring
backplane;
said wiring backplane having an output to a signal processor for recording an
image.

27. An apparatus as recited in Claim 26, in which:
said sensor includes a plurality of pixels;
said plurality of pixels are arranged on said curved sensor in varying
density.

28. A method comprising the steps of:
providing a camera; said camera including a sensor; said camera including an
optical
train; said sensor including a plurality of facets generally bounded by a
plurality of
gaps; said camera including an optical train motion means for intentionally
imparting
movement to said optical train;
recording a first exposure;
activating said optical train motion means to intentionally impart movement to
said
optical train while said second exposure is taken;
taking a second exposure;
comparing said first and said second exposures to detect any missing portions
of the
desired image due to said plurality of gaps in said sensor; and
composing a complete image using both said first and said second exposures.
29. A method as recited in Claim 28, in which:
said optical train motion means for intentionally imparting movement to said
optical
train imparts motion to said curvilinear sensor.

30. A method as recited in Claim 28, in which:
said sensor includes a wiring backplane disposed behind said sensor;
each of said plurality of facets is connected to a via to connect said via to
said wiring
backplane.
31. A method as recited in Claim 28, in which:
said sensor is configured to have a relatively higher concentration of pixels
generally
near the center of said sensor.
32. A method as recited in Claim 28, in which:
said sensor is configured to have a relatively lower concentration of pixels
generally
near an edge of said sensor.
33. A method as recited in Claim 28, in which:
said relatively high concentration of pixels generally near the center of said
sensor
enables zooming into a telephoto shot using said relatively high concentration
of
pixels generally near the center of said sensor only, while retaining
relatively high
image resolution.

34. A method comprising the steps of:
providing a camera; said camera including a sensor; said camera including an
optical
train; said sensor including a plurality of facets generally bounded by a
plurality of
gaps; said camera including an optical train motion means for intentionally
imparting
movement to said optical train;
an electronic image stabilization sensor; said electronic image stabilization
sensor
being mounted inside said enclosure; said electronic image stabilization
sensor for
sensing unwanted motion of said enclosure when an exposure is taken;
an actuator; said actuator being electrically connected to said electronic
image
stabilization sensor; said actuator being mechanically coupled to said curved
sensor;
said actuator for moving said curved sensor to counteract unwanted motion of
said
enclosure sensed by said electronic image stabilization sensor;
recording a first exposure;
activating said optical train motion means to intentionally impart movement to
said
optical train before said second exposure is taken;
taking a second exposure;
comparing said first and said second exposures to detect any missing portions
of the
desired image due to said plurality of gaps in said sensor; and
composing a complete image using both said first and said second exposures.

35. A method as recited in Claim 34, in which:
said optical train motion means for intentionally imparting movement to said
optical
train imparts motion to said curvilinear sensor.
36. A method as recited in Claim 34, in which:
said sensor includes a wiring backplane disposed behind said sensor;
each of said plurality of facets is connected to a via to connect said via to
said wiring
backplane.
37. A method as recited in Claim 34, in which:
said sensor is configured to have a relatively higher concentration of pixels
generally
near the center of said sensor.
38. A method as recited in Claim 34, in which:
said sensor is configured to have a relatively lower concentration of pixels
generally
near an edge of said sensor.

39. A method as recited in Claim 34, in which:
said relatively high concentration of pixels generally near the center of said
sensor
enables zooming into a telephoto shot using said relatively high concentration
of
pixels generally near the center of said sensor only, while retaining
relatively high
image resolution.

40. A method comprising the steps of:
providing a camera;
said camera including a curved sensor;
said curved sensor including a plurality of mini-sensors;
said camera including an optical train;
a signal processor connected to said curved sensor;
aggregating a plurality of output signals from a neighboring group of said
plurality
of mini-pixels formed on said curved sensor by adding said plurality of output
signals
from said neighboring group of said plurality of pixels, so that the combined
output
is treated by said signal processor as the output of one pixel to improve low
light
performance.

41. A method comprising the steps of:
providing a camera;
said camera including a curved sensor;
said curved sensor including a plurality of mini-sensors;
said camera including an optical train;
a signal processor connected to said curved sensor; and
eliminating redundant pixel storage in exchange for detail loss.

42. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation;
said enclosure being filled with an insulating gas;
a curved sensor; said curved sensor including a plurality of mini-sensors
which are
separated by gaps;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element; and
a signal processor connected to said curved sensor for recording an output.
43. An apparatus as recited in Claim 42, in which:
said insulating gas is Argon.

44. An apparatus as recited in Claim 42, in which:
said insulating gas is Krypton.
45. An apparatus as recited in Claim 42, in which:
said insulating gas is Xenon.

47. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation;
a curved sensor; said curved sensor being produced from Graphene;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element; and
a signal processor connected to said curved sensor for recording an output.

48. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation;
a curved sensor; said curved sensor being produced from Stressed Silicon;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element; and
a signal processor connected to said curved sensor for recording an output.

49. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation;
a curved sensor; said curved sensor being produced from Strained Silicon;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element; and
a signal processor connected to said curved sensor for recording an output.

50. An apparatus comprising:
an enclosure;
an optical element;
said optical element being mounted on said enclosure;
said optical element for conveying a stream of radiation;
a curved sensor; said curved sensor including a plurality of petal-shaped
segments
joined together and shaped so that they overlap;
said curved sensor being mounted inside said enclosure;
said curved sensor being aligned with said optical element; and
a signal processor connected to said curved sensor for recording an output.

51. An apparatus comprising:
a camera enclosure;
an objective lens; said objective lens being mounted on said camera enclosure;
a plurality of mini-sensors;
said plurality of mini-sensors being disposed within said camera enclosure;
said plurality of mini-sensors being arranged along a first arc to form a
curved array;
a separating and concentrating optical element for splitting and focusing rays
of light
emerging from said objective lens onto said plurality of mini-sensors; said
separating
and concentrating optical element being disposed between said objective lens
and said
plurality of mini-sensors; said separating and concentrating optical element
being
aligned along a second arc which is parallel to said first arc;
a signal processor;
each of said plurality of mini-sensors having an output;
said output of each of said plurality of mini-sensors being connected to a signal processor.

52. An apparatus as recited in Claim 23, in which:
said zoom lens includes a zoom lens barrel; said zoom lens barrel being
connected
to said manually controlled lens shade for controlling the position of said
lens shade.
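The following sketch is an editor's illustration only and forms no part of the claims. Assuming NumPy arrays registered to a common image frame, it models the two-exposure gap compensation recited in Claims 1, 28 and 34: image portions lost to the gaps between mini-sensors in one exposure are filled in from a second exposure taken after the sensor or optical train has been moved. The function name and the boolean gap masks are hypothetical.

import numpy as np

def compose_exposures(first, second, first_gap_mask, second_gap_mask):
    # first, second: 2-D arrays holding the two successive exposures.
    # *_gap_mask: True where that exposure captured nothing because the
    # image fell on a gap between mini-sensors.
    composite = np.where(first_gap_mask, second, first)
    # Pixels missing from both exposures remain unknown; zero them in this sketch.
    composite[first_gap_mask & second_gap_mask] = 0
    return composite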

Description

Note: Descriptions are shown in the official language in which they were submitted.


CURVED SENSOR ARRAY CAMERA
BACKGROUND OF THE INVENTION
I. A Brief History of Cameras
Evolution of the Three Primary Camera Types
Current photographic cameras evolved from the first "box" and "bellows"
models into three basic formats by the late twentieth century.
The rangefinder came first. It was followed by the SLR, or single lens reflex,
and finally the Compact "Point and Shoot" cameras. Most portable cameras today
use
rangefinder, SLR or "Point and Shoot" formats.
Simple Conventional Cameras
Figure 1 is a simplified view of a conventional camera, which includes an
enclosure, an objective lens and a flat section of photographic film or a flat
sensor.
A simple lens with a flat film or sensor faces several problems. Light travels
over a longer pathway to the edges of the film or the sensor's image area,
diluting
those rays. Besides being weaker, as those rays travel farther to the sensor's
edges,
they suffer more "rainbow effect," or chromatic aberration.
Figure 2 presents a simplified view of the human eye, which includes a curved
surface for forming an image. The human eye, for example, needs only a cornea
and
a single lens to form an image. But on average, one human retina contains
twenty-
five million rods and six million cones. Today's high end cameras use lenses
with
from six to twenty elements. Only the rarest, most expensive cameras have as
many
pixels as the eye has rods and cones, and none of these cameras capture images
after
sunset without artificial light.
The eagle's retina has eight times as many retinal sensors as the human eye.
They are arranged on a sphere the size of a marble. The eagle's rounded
sensors
make simpler optics possible. No camera that is commercially available today has a pixel count which equals a fourth of the count of sensors in an
eagle's eye.
The eagle eye uses a simple lens and a curved retina. The best conventional
cameras
use multiple element lenses with sophisticated coatings, exotic materials and
complex
formulas. This is all to compensate for their flat sensors. The eagle sees
clearly at
noon, in daylight or at dusk with simpler, lighter and smaller optics than any
camera.
Rangefinder Cameras
Rangefinder cameras are typified by a broad spectrum from the early LEICA™
thirty-five millimeter cameras, for professionals, to the later "INSTAMATIC™" film
types for the masses. (Most of KODAK's™ INSTAMATIC™ cameras did not focus,
so they were not true rangefinders. A few "Instamatic type" models focused, and had
a "viewing" lens separated from the "taking" lens, qualifying them as rangefinders.)
Rangefinder cameras have a "taking" lens to put the image on the film (or
sensor today) when the shutter opens and closes; mechanically or digitally.
These
cameras use a second lens for viewing the scene. Focusing takes place through
this
viewing lens which connects to, and focuses, the taking lens.
Since the taking lens and the viewing lens are different, and have different
perspectives on the scene being photographed, the taken image is always
slightly
different than the viewed image. This problem, called parallax, is minor in
most
situations but becomes acute at close distances.
Longer telephoto lenses, which magnify more, are impractical for rangefinder
formats. This is because two lenses are required; they are expensive and require more
require more
side-to-side space than exists within the camera body. That's why no long
telephoto
lenses exist for rangefinder cameras.
Some rangefinder cameras use a frame in the viewfinder which shifts the
border to match that of the taking lens as the focus changes. This aligns the
view with
the picture actually taken, but only for that portion that's in focus.
Backgrounds and
foregrounds that are not in focus shift, so those parts of the photographed
image still
vary slightly from what was seen in the viewfinder.
A few rangefinder cameras do exist that use interchangeable or attachable
lenses, but parallax remains an unsolvable problem and so no manufacturer has
ever
successfully marketed a rangefinder camera with much beyond slightly wide or
mildly
long telephoto accessories. Any added rangefinder lens must also be
accompanied
by a similar viewfinder lens. If not, what is viewed won't match the
photograph taken
at all. This doubles the lens cost.
A derivation of the rangefinder, with the same limitations for accessory
lenses,
was the twin lens reflex, such as those made by ROLLEIWERKE™ cameras.
Compact, or "Point and Shoot" Cameras
Currently, the most popular format for casual photographers is the "Point and
Shoot" camera. They emerged first as film cameras but are now nearly all
digital.
Many have optical zoom lenses permanently attached with no possibility for
interchanging optics. The optical zoom, typically, has a four to one range,
going from
slight wide angle to mild telephoto perspectives. Optical zooms don't often go
much
beyond this range for acceptable results and speed. Some makers push optical
zoom
beyond this four to one range, but the resulting images and speeds
deteriorate. Others
add digital zoom to enhance their optical range; causing results that most
trade editors
and photographers currently hate, for reasons described in following
paragraphs.
There are no "Point and Shoot" cameras with wide angle lenses as wide as the
perspective is for an eighteen millimeter SLR lens (when used, for relative
comparison, on the old standard thirty-five millimeter film SLR cameras.)
There are
no "Point and Shoot" cameras with telephoto lenses as long as a two hundred
millimeter SLR lens would have been (if on the same old thirty-five millimeter
film
camera format.)
Today, more photographs are taken daily by mobile phones and PDAs than by
conventional cameras. These will be included in the references herein as
"Point and
Shoot Cameras."
Single Lens Reflex (SLR) Cameras
Single lens reflex cameras are most commonly used by serious amateurs and
professionals today since they can use wide selections of accessory lenses.
With 35 mm film SLRs, these lenses range from 18 mm "fisheye" lenses to
1,000 mm super-telephoto lenses, plus optical zooms that cover many ranges in
between.
With most SLRs there's a mirror behind the taking lens which reflects the
image into a viewfinder. When the shutter is pressed, this mirror flips up and
out of
the way, so the image then goes directly onto the film or sensor. In this way,
the
viewfinder shows the photographer almost the exact image that will be taken,
from
extremes in wide vistas to distant telephoto shots. The only exception to an
"exact"
image capture comes in fast action photography, when the delay caused by the
mirror
movement can result in the picture taken being slightly different than that
image the
photographer saw a fraction of a second earlier.
This ability to work with a large variety of lenses made the SLR a popular
camera format of the late twentieth century, despite some inherent
disadvantages.
Those SLR disadvantages are the complexity of the mechanism, requiring
more moving parts than with other formats, plus the noise, vibration and delay
caused
by the mirror motion. Also, lens designs are constrained, due to the lens
needing to
be placed farther out in front of the path of the moving mirror, which is more
distant
from the film or sensor, causing lenses to be heavier, larger and less optimal
without
lens fogging. There is also the introduction of dust, humidity and other
foreign
objects into the camera body and on the rear lens elements when lenses are
changed.
Dust became a worse problem when digital SLRs arrived, since the sensor is
fixed, unlike film. Film could roll away the dust speck so only one frame was
affected. With digital cameras, every picture is spotted until the sensor is
cleaned. Recent designs use intermittent vibrations to clear the sensor. This
doesn't
remove the dust from the camera and fails to remove oily particles. Even more
recent
designs, recognizing the seriousness of this problem, have adhesive strips
inside the
cameras to capture the dust if it is vibrated off from the sensor. These
adhesive strips,
however, should be changed regularly to be effective, and, camera users
typically
would require service technicians to do this.
Since the inherent function of an SLR is to use interchangeable lenses, the
problem continues.
Extra weight and bulk are added by the mirror mechanism and viewfinder
optics to SLRs. SLRs need precise lens and body mounting mechanisms, which
also have mechanical and often electrical connections between the SLR lens and
the SLR body. This further adds weight, complexity and cost.
Some of these "vibration" designs assume all photos use a horizontal format,
with no adhesive to catch the dust if the sensor vibrates while in a vertical
position,
or, when pointed skyward or down.
Optical Zoom Lenses
Optical zoom lenses reduce the need to change lenses with an SLR. The
photographer simply zooms in or out for most shots. Still, for some
situations, an
even wider or longer accessory lens is required with the SLR, and the
photographer
changes lenses anyway.
Many "Point and Shoot" cameras today have zoom lenses as standard;
permanently attached. Nearly all SLRs offer zoom lenses as accessories. While
optical technology continues to improve, there are challenges to the zoom
range any
lens can adequately perform. Other dilemmas with zoom lenses are that they
are
heavier than their standard counterparts, they are "slower," meaning less
light gets
through, limiting usefulness, and zoom lenses never deliver images that are as sharp, or with color fidelity as good, as a comparable fixed focal length lens. And
again, the
optical zoom, by moving more elements in the lens, introduces more moving
parts,
which can lead to mechanical problems with time and usage, plus added cost.
Because optical zooms expand mechanically, they often function like an air
pump,
sucking in outside air while zooming to telephoto and squeezing out air when
retracting for wider angle perspectives. This can easily introduce humidity
and
sometimes dust to the inner elements.
II. The Limitations of Conventional Mobile Phone Cameras
The Gartner Group has reported that over one billion mobile phones were sold
worldwide in 2009. A large portion of currently available mobile phones
include a
camera. These cameras are usually low quality photographic devices with simple
planar arrays situated behind a conventional lens. The quality of images that
may be
captured with these cell phone cameras is generally lower than that which may
be
captured with dedicated point-and-shoot or more advanced cameras. Cell phone
cameras usually lack advanced controls for shutter speed, telephoto or other
features.
Conventional cell phone and PDA cameras suffer from the same four
deficiencies.
1. Because they use flat digital sensors, the optics are deficient,
producing poor quality pictures. To get normal resolution would
require larger and bulkier lenses, which would cause these compact
devices to become unwieldy.
2. Another compromise is that these lenses are slow, gathering less
light. Many of the pictures taken with these devices are after sunset or
indoors. This often means flash is required to enhance the illumination.
With the lens so close to the flash unit, as is required in a compact
device, a phenomenon known as "red-eye" often occurs. (In darkened
situations, the pupil dilates in order to see better. In that situation, the
flash often reflects off the subject's retina, creating a disturbing "red
eye" image. This is so common that some camera makers wired their
devices so a series of flashes go off before the picture is taken with
flash, in an attempt to close down the pupils. This sometimes works and
always disturbs the candid pose. Pencils to mark out "red eye" are
available at retail. There are "red eye" pencils for humans and even
"pet eye" pencils for animals. Some camera software developers have
written algorithms that detect "red eye" results and artificially remove
the "red eye," sometimes matching the subject's true eye color, but
more often not.)
3. Flash photography shortens battery life.
4. Flash photography is artificial. Faces in the foreground can be
bleached white while backgrounds go dark. Chin lines are pronounced,
and it sometimes becomes possible to see into a human subject's
nostrils, which is not always pleasing to viewers.
Current sales of high definition television sets demonstrate the growing
public
demand for sharper images. In the past, INSTAMATIC cameras encouraged more
picture-taking, but those new photographers soon tired of the relatively poor
image
quality. Thirty-five millimeter cameras, which were previously owned mostly by
professionals and serious hobbyists, soon became a mass market product.
With unprecedented numbers of photos now being taken with mobile phones,
and the image quality being second-rate, this cycle is likely to repeat.
The development of a system that reduces these problems would constitute a
major technological advance, and would satisfy long-felt needs in the imaging
business.
SUMMARY OF THE INVENTION
One embodiment of the present invention comprises methods and apparatus
for a camera that includes a curved sensor. In another embodiment, the camera
includes mechanical image stabilization. Yet another embodiment utilizes
electronic
image stabilization. Another embodiment incorporates optical image
stabilization.
In yet another embodiment, a camera with a conventional sensor includes an
automatically-controlled lens shade which is mounted on the outside of the
camera
enclosure. This automatically-controlled lens shade extends for telephoto
shots, and
retracts for wider angle shots. In yet another embodiment, the camera utilizes
an
arcuate array of mini-sensors, together with a corrective optical element.
An appreciation of the other aims and objectives of the present invention, and
a more complete and comprehensive understanding of this invention, may be
obtained
by studying the following description of a preferred embodiment, and by
referring to
the accompanying drawings.
A BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 depicts a generalized conventional camera with flat film or a flat
sensor.
Figure 2 is a simplified depiction of the human eye.
Figure 3 provides a generalized schematic diagram of a digital camera with a
curved sensor manufactured in accordance with one embodiment of the present
invention.
Figures 4A, 4B, and 4C offer an assortment of views of a generally curved
sensor.
Figure 5 depicts a sensor formed from nine planar segments or facets.
Figure 6 reveals a cross-sectional view of a generally curved surface
comprising a number of flat facets.
Figure 7 provides a perspective view of the curved surface shown in Figure 6.
Figure 8 offers a view of one method of making the electrical connections for
the sensor shown in Figures 6 and 7.
Figures 9A and 9B portray additional details of the sensor illustrated in
Figure
7, before and after enlarging the gaps above the substrate, so the flat
surface can be
bent.
Figures 10A and 10B supply views of sensor connections.
Figures 11A and 11B depict a series of petal-shaped segments of ultra-thin
silicon that are bent or otherwise formed to create a generally dome-shaped
surface.

Figure 12 furnishes a detailed view of an array of sensor segments.
Figure 13 is a perspective view of a curved shape that is produced when the
segments shown in Figure 12 are joined.
Figures 14A, 14B and 14C illustrate an alternative method of the invention
that
uses a thin layer of semiconductor material that is formed into a generally
dome-
shaped surface using a mandrel.
Figures 14D, 14E and 14F illustrate methods for forming a generally dome-
shaped surface using a mandrel.
Figure 14G shows the dome-shaped surface after sensors have been deployed
on its surface.
Figure 15A shows a camera taking a wider angle photo image.
Figure 15B shows a camera taking a normal perspective photo image.
Figure 15C shows a camera taking a telephoto image.
Figures 16 and 17 illustrate the feature of variable pixel density by
comparing
views of a conventional sensor with one of the embodiments of the present
invention,
where pixels are more concentrated in the center.
Figures 18, 19, 20 and 21 provide schematic views of a camera with a
retractable and extendable shade. When the camera is used for wider angle
shots, the
lens shade retracts. For telephoto shots, the lens shade extends. For normal
perspectives, the lens shade protrudes partially.
Figures 22 and 23 supply two views of a composite sensor. In the first view,
the sensor is aligned in its original position, and captures a first image. In
the second
view, the sensor has been rotated, and captures a second image. The two
successive
images are combined to produce a comprehensive final image.
Figures 24A and 24B offer an alternative embodiment to that shown in Figures
22 and 23, in which the sensor position is displaced diagonally between
exposures.
Figures 25A, 25B, 25C and 25D offer four views of sensors that include gaps
between a variety of arrays of sensor facets.
Figures 26, 27 and 28 provide illustrations of the back of a moving sensor,
revealing a variety of connecting devices which may be used to extract an
electrical
signal.
Figure 29 is a block diagram that illustrates a wireless connection between a
sensor and a processor.
Figure 30 is a schematic side sectional view of a camera apparatus in
accordance with another embodiment of the present invention.
Figure 31 is a front view of the sensor of the camera apparatus of Fig. 30.
Figure 32 is a block diagram of a camera apparatus in accordance with a
further embodiment of the present invention.
Figures 33, 34, 35, 36 and 37 provide various views of an electronic device
which incorporates a curved sensor.
Figures 38 through 50 illustrate a method to capture more detail from a scene
than the sensor is otherwise capable of recording.
Figure 51 presents a schematic illustration of an optical element which moves
in a tight circular path over a stationary flat sensor.
Figure 52 is an overhead view of the optical element and sensor shown in
Figure 51.
Figure 53 presents a schematic illustration of an optical element which moves
over a stationary curved sensor.
Figure 54 is an overhead view of the optical element and sensor shown in
Figure 53.
Figure 55 presents a schematic illustration of a method for imparting motion
to a flat sensor, which moves beneath a stationary optical element.
Figure 56 is an overhead view of the optical element and sensor shown in
Figure 55.
Figure 57 presents a schematic illustration of a method for imparting circular
motion to a sensor, such as the ones shown in Figures 55 and 56.
Figure 58 is a perspective illustration of the components shown in Figure 57.
Figure 59 presents a schematic illustration of a method for imparting motion
to a curved sensor, which moves beneath a stationary optical element.
Figure 60 is an overhead view of the optical element and sensor shown in
Figure 59.
Figure 61 is a schematic illustration of a method for imparting circular
motion
to an optical element.
Figure 62 presents nine sequential views of a flat sensor as it moves in a
single
circular path.
Figure 63 is a schematic representation of a flat sensor arrayed with pixels.
In
Figure 63, the sensor resides in its original position. In Figures 64 and 65,
the sensor
continues to rotate through the circular path.
Figure 66 shows a combination of a flat sensor and a lens.
Figure 67 shows a combination of a curved sensor with gaps and a lens.
Figures 68 and 69 provide two successive views of a first exposure taken by
a camera without image stabilization.
Figures 70 and 71 provide two successive views of a first exposure taken by
a camera with image stabilization.
Figure 72 presents an unaided eye's view of a cat.
Figures 73 and 74 offer two successive views of a first and a second exposure
of the cat, which are superimposed over the mini-sensors and gaps within the
camera.
Figure 75 reveals the final composite image of the cat.
Figure 76 is a schematic diagram of one embodiment of the invention which
depicts optical image stabilization.
Figure 77 is a schematic diagram of another embodiment of the invention
which depicts electronic image stabilization.
Figure 78 is a schematic diagram of another embodiment of the invention
which illustrates a lens shade motor control.
Figure 79 is a schematic diagram of another embodiment of the invention
which portrays a manual zoom and lens shade control.
Figures 80 through 83 are schematic diagrams which illustrate lens shade
control mechanisms.
Figure 84 is a schematic diagram which depicts a manual zoom and lens shade
control.
Figures 85, 86 and 87 illustrate binning and compression methods.
Figure 88 depicts an arcuate array of mini-sensors, together with a corrective
optical element.

A DETAILED DESCRIPTION OF PREFERRED
& ALTERNATIVE EMBODIMENTS
Section 1. Overview of the Invention
The present invention provides methods and apparatus related to a camera
having a non-planar, curved or curvilinear sensor. The present invention may
be
incorporated in a mobile communication device. In this Specification, and in
the
Claims that follow, the terms "mobile communication device" and "mobile
communication means" are intended to include any apparatus or combination of
hardware and/or software which may be used to communicate, which includes
transmitting and/or receiving information, data or content or any other form of signals
or intelligence.
Specific examples of mobile communication devices include cellular or
wireless telephones, smart phones, personal digital assistants, laptop or
netbook
computers, iPads™ or other readers/computers, or any other generally portable
device
which may be used for telecommunications or viewing or recording visual
content.
Unlike conventional cellular telephones which include a camera that utilizes
a conventional flat sensor, the present invention includes a curved or
otherwise
non-planar sensor. In one embodiment, the non-planar surfaces of the sensor
used
in the present invention comprise a plurality of small flat segments which
altogether
approximate a curved surface. In general, the sensor used by the present
invention
occupies three dimensions of space, as opposed to conventional sensors, which
are
planes that are substantially and generally contained in two physical
dimensions.
The present invention may utilize sensors which are configured in a variety of
three-dimensional shapes, including, but not limited to, spherical,
paraboloidal and
ellipsoidal surfaces.
In the present Specification, the terms "curvilinear," "curved," and "concave"
encompass any line, edge, boundary, segment, surface or feature that is not
completely colinear with a straight line. The term "sensor" encompasses any
detector,
imaging device, measurement device, transducer, focal plane array, charge-
coupled
device (CCD), complementary metal-oxide semiconductor (CMOS) or photocell that
responds to an incident photon of any wavelength.
While some embodiments of the present invention are configured to record
images in the optical spectrum, other embodiments of the present invention may
be
used for a variety of tasks which pertain to gathering, sensing and/or
recording other
forms of radiation. Embodiments of the present invention include systems that
gather
and/or record color, black and white, infra-red, ultraviolet, x-rays or any
other stream
of radiation, emanation, wave or particle. Embodiments of the present
invention also
include systems that record still images or motion pictures.
Section 2. Specific Embodiments of the Invention
Figure 3 provides a generalized schematic diagram of a digital camera 10 with
a curved sensor 12 sub-assembly which may be incorporated into a mobile
communication device. A housing 14 has an optical element 16 mounted on one of
its walls. The objective lens 16 receives incoming light 18. In this
embodiment, the
optical element is an objective lens. In general, the sensor 12 converts the
energy of
the incoming photons 18 to an electrical output 20, which is then fed to a
signal or
photon processor 22. The signal processor 22 is connected to user controls 24,
a
battery or power supply 26 and to a solid state memory 28. Images created by
the
signal processor 22 are stored in the memory 28. Images may be extracted or
downloaded from the camera through an output terminal 30, such as a USB port.
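Purely as a hedged illustration of the signal chain that Figure 3 describes (photons to sensor 12, electrical output 20 to signal processor 22, storage in memory 28, download through output terminal 30), the toy Python model below may help; every class and method name here is an assumption made for this sketch and does not come from the patent.

class CameraModel:
    def __init__(self):
        self.memory = []                                  # solid state memory 28

    def sense(self, incoming_photons):
        # Curved sensor 12: photon energy becomes an electrical output 20.
        return [0.9 * p for p in incoming_photons]        # assumed conversion gain

    def process(self, electrical_output):
        # Signal processor 22 builds an image record from the sensor output.
        return {"pixels": electrical_output}

    def expose(self, incoming_photons):
        self.memory.append(self.process(self.sense(incoming_photons)))

    def download(self):
        # Output terminal 30 (e.g. a USB port) hands stored images to a host.
        return list(self.memory)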
Embodiments of the present invention include, but are not limited to, mobile
communication devices with a camera that incorporate the following sensors:
1. Curved sensors: Generally continuous portions of spheres, or
revolutions of conic sections such as parabolas or ellipses or other
non-planar shapes. Examples of a generally curved sensor 12 appear
in Figures 4A, 4B and 4C. In this specification, various embodiments
of curved sensors are identified with reference characters 12, 12a, 12b,
12c, and so on.
2. Faceted sensors: Aggregations of polygonal facets or
segments.
Any suitable polygon may be used, including squares, rectangles,
triangles, trapezoids, pentagons, hexagons, septagons, octagons or
others. Figure 5 exhibits a sensor 12a comprising nine flat polygonal
segments or facets 32a. For some applications, a simplified assembly
of a few flat sensors might lose most of the benefit of a smoother curve,
while achieving a much lower cost. Figures 6 and 7 provide side and
perspective views of a generally spherical sensor surface 12b
comprising a number of flat facets 32b. Figure 7 shows exaggerated
gaps 34 between the facets. The facets could each have hundreds,
thousands or many millions of pixels. In this specification, the facets
of the sensor 12 are identified with reference characters 32, 32a, 32b,
32c and so on.
Figure 8 offers a view of the electrical connections 36 for the curved sensor
12b shown in Figure 7. The semiconductor facet array is disposed on the
interior
surface. The exterior surface may be a MYLAR™, KAPTON™ or similar wiring
backplane formed in a curved shape. Vias provide electrical connections
between the
facet array and the wiring backplane. In one embodiment, two to two thousand
or
more electrical pathways may connect the facet array and the wiring backplane.
For one embodiment of the invention, several methods are currently available
to produce "bendable" silicon:
"Japanese chemical company Teijin, in cooperation with
California-based NanoGram, has developed a technology that makes it
possible to produce bendable silicon semiconductor chips. The key
factor was the usage of tiny silicon particles which are tens of
nanometers in diameter (and a nanometer is one billionth of a meter)."
See website for Techcrunch, 19 August 2009.
In their article entitled Bendable GaAs metal-semiconductor
field-effect transistors formed with printed GaAs wire arrays on plastic
substrates, published on 15 August 2005, Sun et al. disclose that
"Micro/nanowires of GaAs with integrated ohmic contacts have been
prepared from bulk wafers by metal deposition and patterning, high-
temperature annealing, and anisotropic chemical etching. These wires
provide a unique type of material for high-performance devices that can
be built directly on a wide range of unusual device substrates, such as
plastic or paper. In particular, transfer printing organized arrays of these
wires at low temperatures onto plastic substrates yield high-quality
bendable metal-semiconductor field-effect transistors."

According to the website Engadget, "researchers from IMEC have
developed bendable microprocessor by layering a plastic substrate, gold
circuits, organic dielectric, and a pentacene organic semiconductor to
create an 8-bit logic circuit with 4000 transistors."
In another embodiment of the invention, the sensor may be formed from
stressed or strained Silicon.
Figure 9 provides a detailed view of facets on the curved sensor 12b. In
general, the more polygons that are employed to mimic a generally spherical
surface,
the more the sensor will resemble a smooth curve. In one embodiment of the
invention, a wafer is manufactured so that each camera sensor has tessellated
facets.
Either the front side or the rear side of the wafer of sensor chips is
attached to a
flexible membrane that may bend slightly (such as MYLAR™ or KAPTON™), but
which is sufficiently rigid to maintain the individual facets in their
respective
locations. A thin line is etched into the silicon chip between each facet, but
not
through the flexible membrane. The wafer is then shaped into a generally
spherical
surface. Each facet is manufactured with vias formed through the wafer to
connect
a rear wiring harness. This harness may also provide mechanical support for
the
individual facets.
Figures 9A and 9B furnish a view of the facets 32b which reside on the
interior
of the curved sensor, and the electrical interconnects that link the sensor
facets with
the wiring backplane.
Figures 10A and 10B illustrate a wiring backplane 38 which may be used to
draw output signals from the facets on the sensor.
Figures 11A and 11B show a generally hemispherical shape 40 that has been
formed by bending and then joining a number of ultra-thin silicon petal-shaped
segments 42. These segments are bent slightly, and then joined to form the
curved
sensor.
Figure 12 provides a view of one embodiment of the petal-shaped segments 42.
Conventional manufacturing methods may be employed to produce these segments.
In one embodiment, these segments are formed from ultra-thin silicon, which
are able
to bend somewhat without breaking. In this Specification, and in the Claims
that
follow, the term "ultra-thin" denotes a range extending generally from 10 to
250
microns. In another embodiment, pixel density is increased at the points of
the
segments, and are gradually decreased toward the base of each segment. This
embodiment may be implemented by programming changes to the software that
creates the pixels.
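As a hedged sketch of the variable pixel density just described, the function below assumes a simple linear falloff from the points of the segments (the centre of the assembled dome) to their bases; the falloff law, the default densities and the name density_at are illustrative assumptions only.

def density_at(r, peak=1.0, edge=0.25, r_max=1.0):
    # r = 0 at the centre of the assembled dome (segment points);
    # r = r_max at the outer edge (segment bases).
    r = min(max(r, 0.0), r_max)
    return peak - (peak - edge) * (r / r_max)

# Example: density_at(0.0) -> 1.0, density_at(0.5) -> 0.625, density_at(1.0) -> 0.25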
Figure 13 offers a perspective view of one embodiment of a curved shape that
is formed when the segments shown in Figure 12 are joined. The sensors are
placed
on the concave side, while the electrical connections are made on the convex
side.
The number of petals used to form this non-planar surface may comprise any
suitable
number. Heat or radiation may be employed to form the silicon into a desired
shape.
The curvature of the petals may be varied to suit any particular sensor
design.
In one alternative embodiment, a flat center sensor might be surrounded by
these "petals" with squared-off points.
Figures 14A, 14B and 14C depict an alternative method for forming a curved
sensor. Figure 14A depicts a dome-shaped first mandrel 43a on a substrate 43b.
In
Figure 14B, a thin sheet of heated deformable material 43c is impressed over
the first
mandrel 43a. The central area of the deformable material 43c takes the shape
of the
first mandrel 43a, forming a generally hemispherical base 43e for a curved
sensor, as
shown in Figure 14C.
Figures 14D, 14E and 14F depict an alternative method for forming the base
of a curved sensor. In Figure 14D, a second sheet of heated, deformable
material 43f
is placed over a second mandrel 43g. A vacuum pressure is applied to ports
43h,
which draws the second sheet of heated, deformable material 43f downward into
the
empty region 43i enclosed by the second mandrel 43g. Figure 14E illustrates
the next
step in the process. A heater 43j increases the temperature of the second
mandrel 43g,
while the vacuum pressure imposed on ports 43h pulls the second sheet of
heated,
deformable material 43f down against the inside of the second mandrel 43g.
Figure
14F shows the resulting generally hemispherical dome 43k, which is then used
as the
base of a curved sensor.
Figure 14G shows a generally hemispherical base 43e or 43k for a curved
sensor after sensor pixels 431 have been formed on the base 43e or 43k.
Digital Zoom
Figure 15A shows a camera taking a wide angle photo. Figure 15B shows the
same camera taking a normal perspective photo, while Figure 15C shows a telephoto
telephoto
view. In each view, the scene stays the same. The view screen on the camera
shows
a panorama in Figure 15A, a normal view in Figure 15B, and detail from the
distance
in Figure 15C. Just as with optical zoom, digital zoom shows the operator
exactly the
scene that is being processed from the camera sensor.
Digital zoom is software-driven. The camera either captures only a small
portion of the central image, the entire scene or any perspective in between.
The
monitor shows the operator what portion of the overall image is being
recorded.
When digitally zooming in to telephoto in one embodiment of the present
invention,
which uses denser pixels in its center, the software can use all the data.
Since the
center has more pixels per area, the telephoto image, even though it is
cropped down
to a small section of the sensor, produces a crisp image. This is because the
pixels are
more dense at the center.
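A minimal sketch of the digital-zoom crop described above, assuming a rectangular NumPy readout array: for a telephoto view only the densely pixelated central region is kept, at full resolution. The function name and parameters are assumptions for illustration.

import numpy as np

def digital_zoom_crop(frame, zoom):
    # Keep the central 1/zoom of the frame in each dimension; with a
    # centre-dense sensor this crop still carries a high pixel count,
    # which is why the telephoto image stays crisp.
    h, w = frame.shape[:2]
    ch, cw = max(1, int(h / zoom)), max(1, int(w / zoom))
    top, left = (h - ch) // 2, (w - cw) // 2
    return frame[top:top + ch, left:left + cw]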
When the camera has "zoomed back" into a wide angle perspective, the
software can compress the data in the center to approximate the density of the
pixels
in the edges of the image. Because so many more pixels are involved in the
center of
this wide angle scene, this does not affect wide angle image quality. Yet, if
uncompressed, the center pixels represent unnecessary and invisible detail
captured,
and require more storage capacity and processing time. Current photographic
language might describe the center section as being processed "RAW" or
uncompressed
when shooting telephoto but being processed as "JPEG" or other compression
algorithm in the center when the image is wide angle.
Digital zoom is currently disdained by industry experts. When traditional
sensors capture an image, digital zooming creates images that break up into
jagged
lines, forms visible pixels and yields poor resolution.
Optical zoom has never created images as sharp as fixed focal length lenses
are capable of producing. Optical zooms are also slower, letting less light
through the
optical train.
Embodiments of the present invention provide lighter, faster, cheaper and more
dependable cameras. In one embodiment, the present invention provides digital
zoom. Since this does not require optical zoom, it uses inherently lighter
lens designs
with fewer elements.
In various embodiments of the invention, more pixels are concentrated in the
center of the sensor, and fewer are placed at the edges of the sensor. Various
densities
may be arranged in between the center and the edges. This feature allows the
user to
zoom into a telephoto shot using the center section only, and still have high
resolution.
In one embodiment, when viewing the photograph in the wide field of view,
the center pixels are "binned" or summed together to normalize the resolution
to the
value of the outer pixel density.
When viewing the photograph in telephoto mode, the center pixels are utilized
in their highest resolution, showing maximum detail without requiring any
adjustment
of lens or camera settings.
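The handling described in the last two paragraphs can be summarized in a short sketch. It assumes, purely for illustration, that the dense center region arrives as its own array and that simple block averaging is an acceptable way to normalize its resolution to the outer pixel density; the function names and the 4:1 density ratio are not taken from the patent.

```python
import numpy as np

def bin_pixels(region, factor):
    """Average factor x factor blocks so the dense center region matches the
    coarser pixel pitch of the outer portion of the sensor."""
    h, w = region.shape
    h2, w2 = h - h % factor, w - w % factor      # trim to a multiple of the bin size
    return (region[:h2, :w2]
            .reshape(h2 // factor, factor, w2 // factor, factor)
            .mean(axis=(1, 3)))

def render_center(center, center_density, outer_density, telephoto):
    """Telephoto: use every center pixel at full resolution.
    Wide angle: bin the center down to the outer pixel density."""
    if telephoto:
        return center
    return bin_pixels(center, max(1, center_density // outer_density))

# Illustrative 1200 x 1200 center patch at four times the outer density.
center_patch = np.random.rand(1200, 1200)
print(render_center(center_patch, 4, 1, telephoto=False).shape)   # (300, 300)
print(render_center(center_patch, 4, 1, telephoto=True).shape)    # (1200, 1200)
```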
The digital zoom feature offers extra wide angle to extreme telephoto zoom.
This feature is enabled due to the extra resolving power, contrast, speed and
color
resolution lenses are able to deliver when the digital sensor is not flat, but
curved,
somewhat like the retina of a human eye. The average human eye, with a cornea and a single lens element, uses 25 million rods and 6 million cones to
capture
images. This is more image data than is captured by all but a rare and
expensive
model or two of the cameras that are commercially available today, and those
cameras
typically must use seven to twenty element lenses, since they are constrained
by flat
sensors. These cameras cannot capture twilight images without artificial lighting, or without boosting the ISO, which loses image detail. These high-end cameras currently
use
sensors with up to 48 millimeter diagonal areas, while the average human
eyeball has
a diameter of 25 millimeters. Eagle eyes, which are far smaller, have eight
times as
many sensors as a human eye, again showing the optical potential that a curved
sensor
or retina provides. Embodiments of the present invention are more dependable,
cheaper and provide higher performance. Interchangeable lenses are no longer
necessary, which eliminates the need for moving mirrors and connecting mechanisms.
Further savings are realized due to simpler lens designs, with fewer elements, because flat film and sensors, unlike curved surfaces, are at varying distances and angles from the light coming from the lens. This causes chromatic aberrations and varying intensity across the sensor. Over the last two centuries, lens designs have mitigated these problems almost entirely, but with huge compromises.
Those compromises include limits on speed, resolving power, contrast, and
color
resolution. Also, the conventional lens designs require multiple elements,
some
aspheric lenses, exotic materials and special coatings for each surface.
Moreover,
there are more air to glass surfaces and more glass to air surfaces, each
causing loss
of light and reflections.
Variable Density of Pixels
In some embodiments of the present invention, the center of the sensor, where
the digitally zoomed telephoto images are captured, is configured with dense
pixelation, which enables higher quality digitally zoomed images.
Figures 16 and 17 illustrate this feature, which utilizes a high density
concentration of pixels 48 at the center of a sensor. By concentrating pixels
near the
central region of the sensor, digital zoom becomes possible without loss of
image
detail. This unique approach provides benefits for either flat or curved
sensors. In
Figure 16, a conventional sensor 46 is shown, which has pixels 48 that are
generally
uniformly disposed over the surface of the sensor 46. Figure 17 shows a sensor
50
produced in accordance with the present invention, which has pixels 48 that
are more
densely arranged toward the center of the sensor 50.
In another embodiment of the invention, suitable software compresses the
dense data coming from the center of the image when the camera senses that a
wide
angle picture is being taken. This feature greatly reduces the processing and
storage
requirements for the system.
Lens Shade
Other embodiments of the invention include a lens shade, which senses the
image being captured, whether wide angle or telephoto. When the camera senses
a
wide angle image, it retracts the shade, so that the shade does not get into
the image
area. When it senses the image is telephoto, it extends, blocking extraneous
light
from the non-image areas, which can cause flare and fogged images.
Figures 18 and 19 provide views of a camera equipped with an optional
retractable lens shade. For wide angle shots, the lens shade is retracted, as
indicated
by reference character 52. For telephoto shots, the lens shade is extended, as
indicated by reference character 54.
Figures 20 and 21 provide similar views to Figures 18 and 19, but of a camera
with a planar sensor, indicating that the lens shade feature is applicable
independently.
Dust Reduction
Embodiments of the present invention reduce the dust problem that plagues
conventional cameras since no optical zoom or lens changes are needed.
Accordingly,
the camera incorporated into the mobile communication device is sealed. No
dust
enters to interfere with image quality. An inert, desiccated gas, such as Argon, Xenon
Xenon
or Krypton may be sealed in the lens and sensor chambers within the enclosure
14,
reducing oxidation and condensation. If these gases are used, the camera gains
some
benefits from their thermal insulating capability and temperature changes will
be
moderated, and the camera can operate over a wider range of temperatures.
Improved Optical Performance
The present invention may be used in conjunction with a radically high speed
lens, useable for both surveillance without flash (or without floods for
motion) or fast
action photography. This becomes possible again due to the non-planar sensor, which brings faster lens designs, such as f/0.7 or f/0.35, within practical reach, since the restraints posed by a flat sensor (or film) are now gone.
All these enhancements become practical since new lens formulas become
possible. Current lens design for flat film and sensors must compensate for
the
"rainbow effect" or chromatic aberrations at the sensor edges, where light
travels
farther and refracts more. Current lens and sensor designs, in combination
with
processing algorithms, have to compensate for the reduced light intensity at
the edges.
These compensations limit the performance possibilities.
Since the camera lens and body are sealed, an inert gas like Argon, Xenon or
Krypton may be inserted, e.g., injected during final assembly, reducing
corrosion and
rust. The camera can then operate in a wider range of temperatures. This is
both a
terrestrial benefit, and, is a huge advantage for cameras installed on
satellites.
Rotating & Shifted Sensors
Figures 22 and 23 illustrate a series of alternative sensor arrays with sensor segments 32c separated by gaps 34, which are necessary when tilting each outer row inward, row by row, further and further, to form the overall concave shape of the sensor; the gaps facilitate easier sensor assembly. In this embodiment,
a still
camera which utilizes this sensor array takes two pictures in rapid
succession. A first
sensor array is shown in its original position 74, and is also shown in a
rotated
position 76. The position of the sensor arrays changes between the times the
first and
second pictures are taken. Software is used to recognize the images missing
from the
first exposure, and stitches that data in from the second exposure. The change
in the
sensor motion or direction shift may vary, depending on the pattern of the
sensor
facets.
A motion camera can do the same, or, in a different embodiment, can simply
move the sensor and capture only the new image using the data from the prior
position
to fill in the gaps in a continuous process.
This method captures an image using a moveable sensor with gaps between the
smaller sensors that make up its concave shape. This method makes fabricating
much
easier, because the spaces between segments become less critical. So, in one
example, a square sensor in the center is surrounded by a row of eight more
square
sensors, which, in turn, is surrounded by another row of sixteen square
sensors. The
sensors are sized to fit the circular optical image, and each row curves in
slightly
more, creating the non-planar total sensor.
In use, the camera first takes one picture. The sensor immediately rotates or
shifts slightly and a second image is immediately captured. Software can tell
where
the gaps were and stitches the new data from the second shot into the first.
Or,
depending on the sensor's array pattern, it may shift linearly in two
dimensions, and
possibly move in an arc in the third dimension to match the curve.
This concept makes the production of complex sensors easier. The complex
sensor, in this case, is a large sensor comprising multiple smaller sensors.
When such
a complex sensor is used to capture a focused image, the gaps between each
sensor
lose data that is essential to make the complete image. Small gaps reduce the
severity
of this problem, but smaller gaps make the assembly of the sensor more
difficult.
Larger gaps make assembly easier and more economical, but, create an even less
complete image. The present method, however, solves that problem by moving the
sensor after the first image, and taking a second image quickly. This gives
the
complete image and software can isolate the data that is collected by the
second image
that came from the gaps and splice it into the first image.
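A minimal sketch of this two-exposure splicing follows, assuming the gap layout is available as a boolean mask and that the shift between exposures is known exactly. The array names, the 6 x 6 scene and the one-pixel shift are hypothetical and only illustrate the bookkeeping, not the patent's implementation.

```python
import numpy as np

def stitch_two_exposures(first, second, gap_mask, shift):
    """Fill the facet-gap pixels of the first exposure with data from a second
    exposure taken after the image was shifted by `shift` (rows, cols)."""
    # Register the second exposure back into the first exposure's frame.
    registered = np.roll(second, (-shift[0], -shift[1]), axis=(0, 1))
    complete = first.copy()
    complete[gap_mask] = registered[gap_mask]
    return complete

# Toy example: a 6 x 6 scene, a one-pixel-wide gap between two facets, and a
# second exposure taken with the scene shifted one pixel to the right.
scene = np.arange(36, dtype=float).reshape(6, 6)
gap_mask = np.zeros((6, 6), dtype=bool)
gap_mask[:, 3] = True                                 # data lost in any one exposure
first = np.where(gap_mask, np.nan, scene)
second = np.where(gap_mask, np.nan, np.roll(scene, 1, axis=1))
print(stitch_two_exposures(first, second, gap_mask, shift=(0, 1)))
```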
The same result may be achieved by a moving or tilting lens element or a
reflector that shifts the image slightly during the two rapid sequence
exposures. In
this embodiment, the camera uses, but changes in a radical way, an industry
technique
known as "image stabilization." The camera may use image stabilization in both
the
first and second images. This method neutralizes the effect of camera motion
during
an exposure. Such motion may come from hand tremors or engine vibrations.
However, in this embodiment, after the first exposure, the camera will reverse
image
stabilization and introduce "image de-stabilization" or "intentional jitter"
to move the
image slightly over the sensor for the second exposure. This, with a sensor
fixed in
its position, also gives a shift to the second exposure so the gaps between
the facets
from the first exposure can be detected, and, the missing imagery recorded and
spliced
into the final image.
In one example shown in Figure 23, the sensor rotates back and forth. In an
alternative embodiment, the sensor may shift sideways or diagonally. The
sensor may
also be rotated through some portion of arc of a full circle. In yet another
embodiment, the sensor might rotate continuously, while the software combines
the
data into a complete image.
Figures 24A and 24B also show a second set of sensors. The sensor is first
shown in its original position 78, and is then shown in a displaced position
80.
Sensor Grid Patterns
Figures 25A, 25B, 25C and 25D reveal four alternative grid patterns for four
alternative embodiments of sensors 82, 84, 86 and 88. The gaps 34 between the
facets
32e, 32f, 32g and 32h enable the manufacturing step of forming a curved
sensor.
Electrical Connections to Sensors
Figures 26, 27 and 28 provide views of alternative embodiments of electrical
connections to moving sensors.
Figure 26 shows a sensor 90 that has a generally spiral-shaped electrical connector
92. The conductor is connected to the sensor at the point identified by
reference
character 94, and is connected to a signal processor at the point identified
by reference
character 96. This embodiment of an electrical connection may be used when the
sensor is rotated slightly between a first and second exposure, as illustrated
in Figure
23. This arrangement reduces the flexing of the conductor 92, extending its
life. The
processor may be built into the sensor assembly.
Figure 27 shows the back of a sensor 102 with an "accordion" shape conductor
100, which is joined to the sensor at point A and to a processor at point B.
This
embodiment may be used when the sensor is shifted but not rotated between a
first
and second exposure, as illustrated in Figure 24.
This type of connection, like the coiled wire connection, makes a back-and-forth sensor connection durable.
Figure 28 shows the back of a sensor 114 having generally radially extending conductors. The conductors each terminate in a brush B, which is able to contact a ring. The brushes move over and touch the ring, collecting an output from the rotating sensor, and then transmit the output to the processor at the center C. This
embodiment may be used when the sensor is rotated between exposures. In
addition,
this connection makes another embodiment possible: a continuously rotating
sensor.
In that embodiment, the sensor rotates in one direction constantly. The
software
detects the gaps, and fills in the missing data from the prior exposure.
Wireless Connection
Figure 29 offers a block diagram of a wireless connection 118. A sensor 12
is connected to a transmitter 120, which wirelessly sends signals to a
receiver 122.
The receiver is connected to a signal processor 124.
In summary, the advantages offered by the present invention include, but are
not limited to:
High resolution digital zoom
Faster
Lighter
Cheaper
Longer focusing ranges
More reliable
Lower chromatic aberration
More accurate pixel resolution
Eliminate need for flash or floodlights
Zooming from wide angle to telephoto
Section 3. Additional Embodiments
A mobile communication device including a camera 150 having many of the
preferred features of the present invention will now be described with
reference to
Figures 30 and 31.
It will be understood that numerous conventional features such as a battery,
shutter release, aperture monitor and monitor screen have been omitted for the
purposes of clarity.
The camera comprises an hermetically-sealed enclosure 154 accommodating
a generally curved sensor 160 and a lens 156. Enclosure 154 is filled with
Argon,
Xenon or Krypton. A front view of the sensor 160 is illustrated schematically
in Fig.
31 and comprises a plurality of flat square pixel elements or facets 162
arranged to
be relatively inclined so as to form an overall curved configuration. To
minimize the
area of the substantially triangular gaps 164 which result between the
elements 162,
the center square 170 is the largest, and the adjacent ring of eight squares
172 is made
of slightly smaller squares so that they touch or nearly touch at their
outermost
corners. The next ring of sixteen squares 176 has slightly smaller squares
than the
inner ring 172.
The center square 170 has the highest density of pixels; note that this square
alone is used in the capture of telephoto images. The squares of inner ring
172 have
medium density pixelation, which for normal photography gives reasonable
definition.
The outer ring 176 of sixteen squares has the least dense pixel count.
In this embodiment, the gaps 164 between the elements 162 are used as
pathways for electrical connectors.
The camera 150 further comprises a lens shade extender arrangement 180
comprising a fixed, inner shade member 182, a first movable shade member 184 and a second, radially outermost, movable shade member 186. When the operator is
taking a wide angle photograph, the shade members are in a retracted
disposition as
shown in Fig. 30; only stray light from extremely wide angles is blocked. In
this
mode, to reduce data processing time and storage requirements, the denser
pixel data
from the central portions 170, 172 of the curved sensor can be normalized
across the
entire image field to match the less dense pixel counts of the edge facets 176
of the
sensor.
For a normal perspective photograph, the shade member 184 is extended so
that stray light from outside of the viewing area is blocked. In this mode, a portion of the data from facets 172 of the curved sensor is compressed. To reduce processing time and storage requirements, the data from the center area 170, with its higher density of pixels, can be normalized across the entire image field.
When the user digitally zooms to a telephoto perspective, shade member
186 is extended. In this mode, only the center portion 170 of the curved
sensor 160
is used. Since only that sensor center is densely covered with pixels, the
image
definition will be crisp.
Photographers generally zoom to fill the frame and to block out distractions.
The lens shade works on a wide range of settings, and has an infinite number
of
positions between the widest angle and the narrowest telephoto positions. An
alternative embodiment utilizes a single shade element.
Other alternative
embodiments may include two or more elements. The embodiments that use
multiple
shade elements have a telephoto element inside the other elements.
In operation, camera 150 uses two exposures to fill in any gaps within the
sensor's range, i.e., to obtain the pixel data missing from a single exposure
due to the
presence of gaps 164. For this purpose, the camera deploys one of two methods.
In
the first, as previously described, the sensor moves and a second exposure is
taken in
rapid succession. The processing software detects the image data that was
missed in
the first exposure, due to the sensor's gaps, and "stitches" that missing data
into the
first exposure. This creates a complete image. The process is run continuously
for
motion pictures, with the third exposure selecting missing data from either
the
preceding or the following exposure, again to create a complete image.
In the second method, a radical change to the now-standard process known in
the industry as "image stabilization" is used. For the first exposure, the
image is
stabilized. Once recorded, this "image stabilization" is turned off, the image
is shifted
by the stabilization system, and the second image is taken while it is re-
stabilized. In
this method, a complete image is again created, but without any motion
required of
the sensor.
The dashed lines shown in Figure 30 indicate the two-dimensional motion of
the lens for one embodiment of the focusing process.
In another embodiment of the invention that includes intentional jittering,
the
lens does not move back and forth, but, rather, tilts to alter the position of
the image
on the sensor.
The above-described camera 150 has numerous advantages. The sealing of the
enclosure 154 with a gas like argon prevents oxidation of the parts and
provides
thermal insulation for operation throughout a broader range of temperature.
Although the center square 170 with a high pixel density is relatively
expensive, it is relatively small and it is only necessary to provide a single
such
square, thus keeping down the overall cost. A huge cost advantage is that it
provides
an acceptable digital zoom without the need for accessory lenses. Accessory
lenses
cost far, far more than this sensor, and are big, heavy and slow. The outer
ring 176
has the smallest squares and the lowest pixel count and so they are relatively
inexpensive. Thus, taking into account the entire assembly of squares, the
total cost
of the sensor is low, bearing in mind it is capable of providing an acceptable
performance over a wide range of perspectives.
Numerous modifications may be made to the camera 150. For example,
instead of being monolithic, lens 156 may comprise a plurality of elements.
The enclosure 154 may be sealed with another inert gas, or a non-reactive gas such as Nitrogen, Krypton, Xenon or Argon; or it may not be sealed at all.
The pixels or facets 170, 172, 176 may be rectangular, hexagonal or of any
other suitable shape. Squares and rectangles are easiest to manufacture.
Although a
central pixel and two surrounding "square rings" of pixels are described, the
sensor
may comprise any desired number of rings.
In Figure 32, there is shown a block diagram of a camera 250 having many of
the features of the camera 150 of Figures 30 and 31. A non-planar sensor 260
has a
central region 270 with high pixel density and a surrounding region comprising
facets
272 with low pixel density. A shutter control 274 is also illustrated. The
shutter
control 274, together with a focus/stabilization actuating mechanism 290 for lens 256 and a lens shade actuator 280, is controlled by an image sequence processor
200.
The signals from pixels in facets 270, 272 are supplied to a raw sensor
capture device
202. An output of device 202 is supplied to a device 206 for effecting
pixel
density normalization, the output of which is supplied to an image processing
engine
208. A first output of engine 208 is supplied to a display/LCD controller 210.
A
second output of engine 208 is supplied to a compression and storage
controller 212.
The features and modifications of the various embodiments described may be
combined or interchanged as desired.
Section 4. Mobile Communicator with a Curved Sensor Camera
Figures 33, 34, 35 and 36 present views of one embodiment of the invention,
which combines a curved sensor camera with a mobile communication device. The
device may be a cellular telephone; laptop, notebook or netbook computer; or
any
other appropriate device or means for communication, recordation or
computation.
Figure 33 shows a side view 300 of one particular embodiment of the device,
which includes an enhanced camera 150 for still photographs and video on both
the
front 305a and the back 305b sides. A housing 302 encloses a micro-controller
304,
a display screen 306, a touch screen interface 308a and a user interface 308b.
A
terminal for power and/or data 310, as well as a microphone, is located near
the
bottom of the housing 302. A volume and/or mute control switch 318 is mounted
on
one of the slender sides of the housing 302. A speaker 314 and an antenna 315
reside
inside the upper portion of the housing 302.
Figures 34 and 35 offer perspective views 330 and 334 of an alternative
embodiment 300a. Figures 36 and 37 offer perspective views 338 and 340 of yet
another alternative embodiment 300b.
Section 5. Method to Capture More Detail from a Scene than the Sensor is
Otherwise Capable of Recording
This alternative method uses multiple rapid exposures with the image moved
slightly and precisely for each exposure.
In the illustrated example, four exposures are taken of the same scene, with
the
image shifted by 1/2 pixel in each of four directions for each exposure. (In
practice,
three, four, five or more exposures might be used with variations on the
amount of
image shifting used.)
For this example, Figure 38 shows a tree. In this example, it is far from the
camera, and takes up only four pixels horizontally and the spaces between
them, plus
five pixels vertically with spaces.
(Cameras are currently available at retail with 25 Megapixel resolution, so
this
tree image represents less than one millionth of the image area and would be
undetectable by the human eye without extreme enlargement.)
Figure 39 represents a small section of the camera sensor, which might be
either flat or curved. For the following explanation, vertical rows are
labeled with
letters and horizontal rows are labeled with numbers. The dark areas represent
spaces
between the pixels.
Figure 40 shows how the tree's image might be first positioned on the pixels.
Note that only pixels C2, C3, D3, C4, D4, B5, C5 and D5 are "more covered than
not"
by the tree image. Those, then, are the pixels that will record its image.
Figure 41 then shows the resulting image that will represent the tree from
this
single exposure. The blackened pixels will be that first image.
Figure 42, however, represents a second exposure. Note that the image for this
exposure has been shifted by 1/2 pixel to the right. This shift might be done
by moving
the sensor physically, or, by reversing the process known in the industry as
"image
stabilization." Image stabilization is a method to eliminate blur caused by
camera
movement during exposures. Reversing that process to move the image focused on
the sensor, for the additional exposures, and reversing only between those
exposures,
is a unique concept and is claimed for this invention.
With Figure 42, the resulting pixels that are "more covered than not" by the
image are D2, C3, D3, C4, D4, (E4 might go either way), C5, D5 and E5.
This results in a data collection for this image as shown by Figure 43.
Figure 44 represents a third exposure. This time the image is moved up from
exposure 2 by 1/2 pixel. The results are that the tree is picked up on pixels
D2, C3, D3,
C4, D4, E4 and D5.
This third exposure, then, is represented by data collected as shown in Figure
45.
Figure 46 continues the example. In this case, the image is now shifted to the
left by 1/2 pixel from the third exposure. The result is that imagery is
caught by pixels
C2, C3, D3, B4, C4, D4 and C5.
Figure 47 represents that fourth recorded image.
Now the camera has four views of the same tree image.
Current image stabilization neutralizes tiny hand tremors and even some motor
or other vibrations during a single exposure, eliminating blur. That
capability
suggests moving the image to second, third and fourth or more positions can
occur
quickly.
Pixel response times are also improving regularly, to the point that digital
cameras that were formerly only still cameras, have, for the most part, also
become
motion picture cameras in subsequent model enhancements. This also suggests
that
rapid multiple exposures can be done; particularly since this is the essence
of motion
photography.
What has not been done or suggested is changing the mode of the image
stabilization mechanism so that it moves the image slightly, and by a
controlled
amount, for each of the multiple exposures, while stabilizing the image during
each
exposure.
Alternatively, moving the sensor slightly for the same effect is also a novel
method.
Software that interprets the four captured images is part of this invention's claims. The software "looks" at Figures 45 and 47, and concludes that whatever this
image is, it has a stub centered at the bottom. Because this stub is missing
from
Figures 41 and 43, the software concludes that it is one pixel wide and is a
half pixel
long.
The software looks at all four figures and determines that whatever this is, it has
has
a base that's above that stub, and that base is wider than the rest of the
image, going
three pixels horizontally. This comes from line five in Figures 41 and 43 plus
line
four in Figures 45 and 47.
The software looks at lines three and four in Figure 41 and Figure 43 and concludes that there is a second tier above the broad base in this image, whatever it is, that is two pixels wide and two pixels tall.
But, the software also looks at line three in Figure 45 and Figure 47,
confirming that this second tier is two pixels wide, but, that it may only be
one pixel
tall.
The software averages these different conclusions and makes the second tier 1-1/2 pixels tall.
The software looks at line two in all four images and realizes that there is a still narrower image atop the second tier. This image is consistently one pixel
wide
and one pixel high, sits atop the second tier but is always centered over the
widest
bottom tier, and the stub when the stub appears.
Figure 48 shows the resulting data image recorded by taking four images, each 1/2 pixel apart from the adjoining exposures. Note that since the data contains four times as much information, the composite image, whether on screen or printed out, can resolve detail down to 1/4 fractions of pixels. This shows detail that the sensor was
incapable of capturing with a single exposure.
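The patent's software reasons about the four captures rule by rule; a generic way to merge exposures with known half-pixel offsets, shown below only as an assumed illustration, is to place each capture on a grid of twice the linear resolution at its offset and average the overlapping contributions. The offsets and the random stand-in data are illustrative, and edge wrap-around is ignored for brevity.

```python
import numpy as np

def combine_half_pixel_exposures(exposures, offsets):
    """Merge exposures with known placement offsets (in half-pixel units) onto
    a grid with twice the linear resolution by upsampling and averaging."""
    h, w = exposures[0].shape
    accum = np.zeros((2 * h, 2 * w))
    for img, (dr, dc) in zip(exposures, offsets):
        up = np.kron(img, np.ones((2, 2)))            # each pixel covers 2 x 2 cells
        accum += np.roll(up, (dr, dc), axis=(0, 1))   # shift by half-pixel steps
    return accum / len(exposures)

# Four exposures at four relative half-pixel positions.
offsets = [(0, 0), (0, 1), (-1, 1), (-1, 0)]
exposures = [np.random.randint(0, 2, (6, 8)) for _ in offsets]  # stand-in captures
print(combine_half_pixel_exposures(exposures, offsets).shape)   # (12, 16)
```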
Figure 49 shows the original tree image, as it would be digitally recorded in
four varying exposures on the sensor, each positioned 1/2 pixel apart. Figure
49 shows
the tree itself, and the four typical digital images that would be recorded by
four
individual exposures of that tree. None look anything like a tree.
The tree is captured digitally four times. Figure 50 shows how the original
tree
breaks down into the multiple images, and, how the composite, created by the
software from those four images, starts to resemble a tree. The resemblance is
not
perfect, but is closer. Considering that this represents about 0.000001% of
the image
area, this resemblance could help some surveillance situations.
Section 6. Alternative Method for Forming a Curved Sensor
One embodiment of this new method proposes to create a concave mold to
shape the silicon after heating the wafer to a nearly molten state. Gravity
then settles
the silicon into the mold. In all of these methods, the mold or molds could be
chilled
to maintain the original thickness uniformly by reducing the temperature
quickly.
Centrifuging is a second possible method. The third is air pressure relieved
by
porosity in the mold. A fourth is steam, raised in temperature by pressure
and/or a
liquid with a very high boiling point. A fifth is simply pressing a
convex
mold onto the wafer, forcing it into the concave mold, but again, doing so
after raising
the silicon's temperature.
Heating can occur in several ways. Conventional "baking" is one. Selecting
a radiation frequency that affects the silicon significantly more than any of
the other
materials is a second method. To enhance that second method, a lampblack-like material that absorbs most of the radiation might be placed on the side of the silicon that's to become convex, and is removed later. It absorbs the radiation, possibly burning off in the process, but heats the thickness of the wafer unevenly, warming the
convex
side the most, which is where the most stretching occurs. A third method might
be
to put this radiation-absorbing material on both surfaces, so that the concave side, which absorbs compressive stresses, and the convex side, which is pulled by tensile stresses, are each heated to manage these changes without fracturing.
A final method is simply machining, polishing or laser etching away the excess
material to create the curved sensor. In the first embodiment, the curved
surface is
machined out of the silicon or other ingot material. The ingot would be
thicker than
ordinary wafers. Machining could be mechanical, by laser, ions or other
methods.
In the second embodiment, the wafer material is placed over a pattern of
concave discs. Flash heating lets the material drop into the concave shape.
This may
be simply gravity induced, or, in another embodiment, may be centrifuged.
Another
enhancement may be to "paint" the backside with a specific material that
absorbs a
certain frequency of radiation to heat the backside of the silicon or other
material
while transmitting less heat to the middle of the sensor. This gives the
silicon or other
material the most flexibility across the side being stretched to fit the mold
while the
middle, is less heated, holding the sensor together and not being compressed
or
stretched, but only bent. In another embodiment, the front side is "painted"
and
irradiated, to allow that portion to compress without fracturing. In another
embodiment, both sides are heated at the same time, just before reforming.
Radiation
frequency and the absorbent "paint" would be selected to minimize or eliminate
any
effect on the dopants.
Section 7. Improving Image Details
In another embodiment of the invention, a generally constant motion is
deliberately imparted to a sensor and/or an optical element while multiple
exposures
are taken. In another embodiment, this motion may be intermittent. Software
then
processes the multiple exposures to provide an enhanced image that offers
greater
definition and edge detail. The software takes as many exposures as the user
may
predetermine.
In this embodiment, the sensor is arrayed with pixels having a variable
density,
with the highest density in the center of the sensor. When the sensor rotates,
the
motion on the outer edges is far greater than at the center, so with a
consistent pixel
density across the sensor, either too little would change in the center, or
too much
would change at the outer edges at any given speed. Varying pixel density
solves
that. By taking pictures with less than a pixel diameter of motion, enhanced
detail is
captured in the composite image.
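One way to see why the pixel pitch must grow with radius: a rotation by dtheta moves a pixel at radius r by roughly r * dtheta, so keeping that motion below the local pitch requires a pitch roughly proportional to the radius. The sketch below only illustrates this relation; the micron and millimeter values are assumed, not taken from the patent.

```python
def max_rotation_for_subpixel_motion(radius_mm, pitch_um):
    """Largest rotation (radians) between exposures that keeps a pixel at the
    given radius moving by less than its own pitch: radius * dtheta < pitch."""
    return (pitch_um * 1e-3) / radius_mm

# A pitch that grows in proportion to the radius tolerates the same rotation
# everywhere: 2 um pixels at 1 mm and 20 um pixels at 10 mm both allow about
# 0.002 rad (roughly 0.11 degrees) of rotation per exposure.
print(max_rotation_for_subpixel_motion(1.0, 2.0))     # 0.002
print(max_rotation_for_subpixel_motion(10.0, 20.0))   # 0.002
```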
Fixed Sensor with Moving Image
In one alternative embodiment of the invention, a stationary flat or curved
sensor may be used to collect data or to produce an image using an image which
moves in a circular motion. In one implementation of this embodiment, the
circular
path of the image has a diameter which is generally less than the width of a
pixel on
the sensor. In one embodiment, the circular path has a diameter which is half
the
width of a pixel. In this embodiment, pixel density is constant across the
sensor. If
the image were a picture of a clock, it would move constantly in a small
circle, with
the number 12 always on top and the number 6 always on the bottom. The present
invention includes both embodiments: one in which the sensor moves under the
objective lens, and another in which the image moves over the sensor.
Moving Sensor with Stationary Image
In yet another alternative embodiment of the invention, a flat or curved
sensor
which generally constantly moves in a tight circle may be used to collect data
or to
produce an image. In one implementation of this embodiment, the circular path
of the
moving sensor has a diameter which is generally less than the width of a pixel
on the
sensor. In one embodiment, the circular path has a diameter which is half the
width
of a pixel.
The advantages of these embodiments include:
Elimination of any reciprocal movement
No vibration
No energy loss from stop and go motions
Figure 51 presents a schematic illustration 342 of an optical element 344
which
moves over a flat sensor 346. The optical element 344 moves in a tight
circular path
over the flat sensor to move the incoming light over the surface of the flat
sensor
along a tight circular path 348. In this embodiment, the optical element is
shown as
an objective lens. In other embodiments, any other suitable lens or optical
component
may be employed. In an alternative embodiment, the optical element 344 may
tilt or
nutate back and forth in a generally continuous or intermittent motion that
moves the
image in a tight circle over the surface of the stationary flat sensor 346.
Figure 52 is an overhead view 350 of the same optical element 344 which
moves over the same stationary flat sensor 346 as shown in Figure 51. The
optical
element 344 moves in a tight circular path over the sensor 346 to move the
incoming
light over the surface of the flat sensor 346.
Figure 53 furnishes a schematic illustration 352 of an optical element 344
which moves over a stationary curved sensor 354.
Figure 54 is an overhead view 356 of the same optical element 344 and sensor
354 shown in Figure 53.
Figure 55 supplies a schematic illustration 358 of one method for imparting
motion to a flat sensor 360 as it moves beneath a stationary optical element
362.
Figure 56 is an overhead view 364 of the same stationary optical element 362
and sensor 360 as shown in Figure 55.
Figure 57 is an illustration 365 that reveals the details of the components which impart the spinning motion to the sensor 360 shown in Figures 55 and 56. The flat sensor 360 is attached to a post or connector 367 which is mounted on a spinning disc 366 positioned below the sensor 360. The attachment is made at an off-center location 368 on the disc. The disc is rotated by an electric motor 370, which is positioned below the disc. The axis 372 of the motor is not aligned with the attachment point 368 of the connecting post 367.
Figure 58 offers a perspective view of the components shown in Figure 57.
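Assuming the sensor translates without rotating, the off-center attachment makes every point of the sensor trace a circle whose radius equals the eccentricity of the connecting post. The sketch below only illustrates that geometry; the micron values are hypothetical.

```python
import math

def sensor_offset(eccentricity_um, disc_angle_rad):
    """Offset of the sensor from its mean position after the drive disc has
    turned by disc_angle_rad; the post sits eccentricity_um from the motor
    axis, so the sensor (assumed not to rotate) orbits on that radius."""
    return (eccentricity_um * math.cos(disc_angle_rad),
            eccentricity_um * math.sin(disc_angle_rad))

# A 1 um eccentricity gives a 2 um diameter orbit, which is half the width of
# a 4 um pixel and so matches the sub-pixel circular path described above.
for step in range(4):
    print(sensor_offset(1.0, step * math.pi / 2))
```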
Figure 59 offers a schematic depiction 374 of a stationary optical element 362
which resides over a curved sensor 376 which moves below the fixed optical
element
362.
Figure 60 is an overhead view of the optical element 362 and sensor 376
shown in Figure 59.
Figure 61 furnishes an illustration 378 of a method for imparting a circular
motion to an optical element 344 like the one shown in Figures 51 and 52. The
optical element 344 is surrounded by a band 380, which provides pivoting
attachment
points 382 for a number of springs 384. Two of the springs are attached to
cams 386
and 388, and the cams are mounted on electric motors 390 and 392, respectively. When the cams rotate, the springs connected to the band which surrounds the optical element move the optical element. The two cams are out of phase by ninety degrees to
provide
circular motion.
Figure 62 presents a series 394 of nine simplified views of a flat sensor as
it
moves through a single orbit in its circular path. In one embodiment, the
circular path
is less than one pixel in diameter. In each view, an axis of rotation C is
shown, which
lies near the lower left corner of the square sensor. A radius segment is
shown in each
successive view, which connects the axis of rotation to a point on the top
side of each
square. In each view, the square sensor has moved forty-five degrees in a
clockwise
direction about the axis of rotation, C. In each view, a dotted-line version
of the
original square is shown in its original position. The radius segments are
numbered
r1 through r9, as they move through each of the eight steps in the circle.
In alternative embodiments, the sensor depicted in Figure 62 may be
configured in a rectangular or other suitable planar shape. In another
alternative
embodiment, the sensor may be curved or hemispherical. The motion may be
clockwise, counter-clockwise or any other suitable displacement of position
that
accomplishes the object of the invention.
Figure 63 is a schematic representation of a flat sensor 396 arrayed with pixels 398.
In Figure 63, the sensor resides in its original position. In Figures 64 and
65, the
sensor continues to rotate through the circular path. As the sensor rotates
multiple
exposures are taken, as determined by software. In this embodiment, the outer
and
inner rows of pixels each move by the same number of pixel spaces.
This embodiment enhances detail in an image beyond a sensor's pixel count,
and may be used in combination with the method described in Section 5, above,
"Method to Capture More Detail from a Scene than the Sensor is Otherwise
Capable
of Recording."
While pixel density is increasing on sensors rapidly, when pixels are reduced
in size such that each pixel can sense only a single photon, the limit of
pixel density
has been reached. Sensitivity is reduced as pixels become smaller.
This embodiment may be utilized in combination with methods and apparatus
for sensor connections described in U.S. Patent No. 8,248,499.
In yet another embodiment, miniature radios may be used to connect the output
of the sensor to a micro-processor.
Section 8. Method to Create Complete Image from Digital Sensors Containing
Gaps
In another embodiment of the invention, a complete image is produced from
digital sensors that contain gaps. In yet another embodiment, a complete image
is
produced from an array of sensors that are physically spaced apart or
separated. In
either of these two embodiments, the sensors operate behind a single optical
path.
In the first embodiment, a camera includes a generally concave sensor which
is formed so that it includes gaps 34 between facets 32, as shown in Figure 7.
A first
exposure is taken while the image stabilization feature is activated. Image
stabilization is described above in Sections II, III and V. The image
stabilization
feature is then de-activated, and then re-activated. A second exposure is then
taken
while the image stabilization feature is active. The signal processor 22,
which runs
a software program, then picks up the image data missing from the first
exposure, and
stitches it into the first exposure, creating a complete image. This process
may be
used generally continuously to create motion pictures or videos.
Figures 59 and 60 illustrate the difference between sensors that have gaps,
and
those that do not. Figure 59 shows a lens and sensor combination 400. The lens
402
is positioned near flat sensor 404. The lens 402 has a central axis 406. A
light ray
408 enters the lens 402, and is refracted. The light ray which emerges from the
other
side of the lens 402 impinges on the flat sensor 404. In Figure 60, a
different
combination of elements 410 includes the same lens 402 and a sensor with gaps
412.
The incident light 408 enters the lens 402, and, after emerging from the other
side of
the lens 402, strikes one portion of the sensor 412.
As shown in Figures 59 and 60, using a flat sensor 404 causes the incident
light
408 to impinge on the outside portion of the flat sensor due to the refraction
through
the lens 402. By using a more "curved" sensor as shown in Figure 60, the
incident
ray impinges upon the portion of the sensor at an angle which is closer to the
normal,
and also strikes the portion of the sensor nearer its center. This feature
provides
an enhanced image.
Figures 68 and 69 offer views of a scene that is photographed with a handheld
camera that does not include an image stabilization system. In Figure 68, an
image
frame 416 shows a boy 418 and a baseball 420 in two successive views separated
by
a short period of time. During that period of time, the user's hand shakes
slightly. In
Figure 68, an exposure begins. A short time later, as shown in Figure 69, the
exposure ends. The resulting photographic image is blurry, due to the slight
jitter of
the handheld camera.
Figures 70 and 71 offer views of the same scene as shown in Figures 68 and 69,
but which is photographed with a handheld camera that includes image
stabilization.
In Figure 70, the image frame 416 shows the boy 418 and the baseball 420 in
two
successive views separated by a short period of time. During that period of
time, the
user's hand shakes slightly. In Figure 70, an exposure begins. A short time
later, as
shown in Figure 71, the exposure ends. The resulting photographic image is
sharp,
since the slight jitter of the handheld camera is counteracted by the image
stabilization
system.
Figures 72 through 75 further illustrate this embodiment of the invention,
which includes optical image stabilization. Figure 72 shows the unaided eye's
view
432 of a cat. In Figure 73, an image of the cat is superimposed over a portion
of a
camera's sensor 434. The sensor 434 includes four generally square mini-
sensors 436
which are separated by gaps 438.
In Figure 73, the camera takes a first exposure while optical image
stabilization
is active. The first exposure records only those portions of the cat's image
440 which
register with the mini-sensors 436. The other portions of the entire cat's
image which
are not recorded in this first exposure are those which are superimposed over
the gaps
438, and are shown as cross-hatched "missing portions" 442 of the image.
In Figure 74, the camera takes a second exposure while optical image
stabilization is active. The cat has moved or has changed position in the time
between
the beginning of the first and second exposures. This movement may simply be
the
jitter created by a user's hand. The second exposure records only those
portions of
the cat's image 444 which register with the mini-sensors 436. The other
portions of
the entire cat's image that are not recorded in this second exposure are those
which
are superimposed over the gaps 438, and are shown as cross-hatched "missing-
portions" 446 of the image.
After the camera records the first and second exposures, electronic
stabilization
software, which is stored in the camera's memory, is executed on the camera's
processor. This software compares the two exposures, pixel by pixel, and
detects the
missing portions in each exposure. The software then creates a composite image
450,
as shown in Figure 75, which "stitches together" the originally recorded and
missing
portions to produce a complete image.
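The patent does not specify how the two exposures are registered before stitching; one common approach, assumed in the sketch below, is phase correlation, after which the gap pixels of the first exposure are filled from the registered second exposure. The function names are illustrative only.

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the (row, col) translation of exposure b relative to exposure a
    by phase correlation; returns the roll needed to register b onto a."""
    f = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-9)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

def composite_from_two_exposures(first, second, gap_mask):
    """Fill the gap pixels of the first exposure with registered data from the
    second; gap_mask marks the pixels lost to the gaps between mini-sensors."""
    shift = estimate_shift(np.where(gap_mask, 0.0, first),
                           np.where(gap_mask, 0.0, second))
    registered = np.roll(second, shift, axis=(0, 1))
    stitched = first.copy()
    stitched[gap_mask] = registered[gap_mask]
    return stitched
```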
Electronic image stabilization is well known in the art.
According to Wikipedia, electronic image stabilization "reduces blurring associated with
the
motion of a camera during exposure." In some cameras, a gyroscope is used to
sense
camera rotation, which causes angular error. The gyroscopes measure the
rotation,
and send information to an actuator which moves the sensor in the camera to
counteract the rotation. In another embodiment, an angular rate sensor may be
used
to measure and to compensate for unwanted camera motion while an exposure is
taken. An Image Stabilizer Primer is available at the web site for Videomaker,
and is
also described at the websites operated by Nikon and Canon. Yu et al. disclose
a
Summarization of Electronic Image Stabilization in their paper published at
the 7th
International Conference on Computer-Aided Industrial Design and Conceptual
Design in 2006.
This embodiment of the invention provides the following benefits:
simpler, smaller optics;
optics that capture more light; and
missing data from the gaps are captured.
Section 9. Image Stabilization Methods
Figure 76 supplies a schematic view of a camera 452 that incorporates both a
curved sensor and optical image stabilization. An objective lens 16 resides on
an
enclosure 14. Inside the camera, a curved sensor 12 is positioned to receive
light
beams from the objective lens 16 above it. The curved sensor 12 includes a
number
of mini-sensors 436. The output of the mini-sensors 436 is connected to an
optical
image stabilization circuit 454, which is also connected to a signal processor
22.
Figure 77 shows a camera 456 that incorporates electronic image stabilization.
An objective lens 16 is situated on an enclosure 14. Inside the camera, a
conventional
flat sensor 458 is positioned to receive light gathered by the objective lens
16. The flat
sensor 458 includes vias 36 that allow for connections to a wiring backplane
38. An
electronic image stabilization circuit 460 is connected to an electronic image
stabilization sensor 462, which detects any unwanted rotation of the camera.
The
electronic image stabilization circuit 460 is also connected to an actuator
464, which
physically adjusts the position of the sensor 458 to counteract any unwanted
rotation.
Section 10. Lens Shade Motion Control Mechanisms
Figure 78 is a schematic diagram of a camera 465 with automatic lens shade
control. A zoom lens 466 is mounted over the objective lens 16 which is
affixed to
the enclosure 14. A lens shade 186 extends over both the objective and zoom
lenses
16 & 466. A zoom lens control mechanism 467 is connected to a scattered light
sensor 468, and operates the zoom lens 466. The scattered light sensor 468 is
disposed
beyond the outermost edge of the flat sensor 458. When the scattered light
sensor 468
detects too much scattered light, it sends a signal to a lens shade control
motor 469,
and the lens shade 186 is extended.
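The control loop implied by Figure 78 can be sketched as follows. The threshold value, the ShadeMotor stand-in and the function name are assumptions made for illustration; they represent the scattered light sensor 468 and lens shade control motor 469 only schematically.

```python
class ShadeMotor:
    """Stand-in for the lens shade control motor."""
    def extend(self):
        print("lens shade extended")

    def retract(self):
        print("lens shade retracted")

SCATTER_THRESHOLD = 0.2   # illustrative fraction of full-scale reading

def update_lens_shade(scatter_reading, motor):
    """Extend the shade when the scattered light sensor reads high (telephoto
    framing); retract it so it cannot intrude into wide angle shots."""
    if scatter_reading > SCATTER_THRESHOLD:
        motor.extend()
    else:
        motor.retract()

update_lens_shade(0.35, ShadeMotor())   # prints "lens shade extended"
```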
Figure 79 reveals a diagram of a camera 470 with a manual zoom and lens
shade control. As in Figure 78, the objective lens 16 is located on the
enclosure 14. A zoom lens 466 is mounted over the objective lens 16. A flat
sensor
458 receives light from the objective lens 16. A manual zoom control knob 471
is
mounted on the enclosure, and is connected to a manually controlled lens shade
472,
which is also mounted on the enclosure 14, and which extends over both the
objective
and the zoom lenses 16 & 466. In an alternative embodiment, the lens shade
control
is mechanically linked to the zoom lens barrel.
Figure 80 depicts a first embodiment 474 of lens shade control. A zoom control
button 476 is connected, in series, to a first motor 478, a first gear
mechanism 480 and
one or more lenses 482. The zoom control button 476 is also connected, in
series, to
a second motor 484, a second gear mechanism 486 and a lens shade 488 which is
controlled in concert with a zoom lens.
Figure 81 depicts a second embodiment 490 of lens shade control. A zoom
control button 476 is connected, in series, to a first motor 478, a twin
track gear
mechanism 492, one or more lenses 482 and a lens shade 488.
Figure 82 depicts a third embodiment 494 of lens shade control. A zoom
control button 476 is connected, in series, to a first motor 478, a double
lever arm
496, one or more lenses 482 and a lens shade 488.
Figure 83 depicts a fourth embodiment 498 of lens shade control. A zoom control button 476 is connected, in series, to a first motor 478, a single
arm 500,
one or more lenses 482 and a lens shade 488.
Figure 84 presents a view of a manual zoom and lens shade controller 502. A
pantograph 503 is connected, in series, to the enclosure 14, the zoom lens 466
and to
a lens shade 488.
Section 11. Binning & Compression
Figures 85, 86 and 87 depict methods for binning and compression. Figure
85 is a view 504 of a tiny fraction of a digital photo, which covers only
forty-eight
pixels. The black spot 506 on the white background 508 may be thought of as a
peppercorn on a white tablecloth. The pixels are indicated by the horizontal
and
vertical axes, labeled A through H, and 1 through 6, respectively.
Figure 86 provides another view 512 of the scene 504 shown in Figure 85, but
with grid lines 514 that show the boundaries of the forty-eight pixels. In one
embodiment of the invention, a compression algorithm is used, and the signal
processor stores and prints this tiny section of the image, going left to
right and top
to bottom as (Al-C2 white, D2-E2 black, F2-B3 white, C3-F3 black, G3-B4 white,
C4-F4 black, G4-05 white, D5-E5 white, F5-H6 white.) The alternative is to
store
each bit alone, which in this case would mean (Al white, B1 white, Cl white,
D1
white, El white, Fl white, G1 white, Hi white, A2 white, 82 white, C2 white,
D2
black, E2 black, F2 white, G2 white, H2 white, etc.) Using this alternative
method,
much more data needs to be processed and stored, but the quality of the image
is not
improved.
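The compression described for Figure 86 is essentially run-length encoding along the reading order. A minimal sketch follows, using a smaller two-row strip as stand-in data; the helper name is illustrative.

```python
def run_length_encode(pixels):
    """Collapse consecutive pixels of the same tone into (start, end, tone)
    runs, reading left to right and top to bottom as in Figure 86."""
    runs, start = [], 0
    for i in range(1, len(pixels) + 1):
        if i == len(pixels) or pixels[i][1] != pixels[start][1]:
            runs.append((pixels[start][0], pixels[i - 1][0], pixels[start][1]))
            start = i
    return runs

# Toy strip: columns A-H, rows 1-2, with a black run at D2-E2.
labels = [f"{c}{r}" for r in (1, 2) for c in "ABCDEFGH"]
tones = ["white"] * 11 + ["black"] * 2 + ["white"] * 3
print(run_length_encode(list(zip(labels, tones))))
# [('A1', 'C2', 'white'), ('D2', 'E2', 'black'), ('F2', 'H2', 'white')]
```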
Figure 87 supplies another view 516, which illustrates how the amount of data that is needed to represent the image changes when a binning method is employed instead of a compression method. When implementing the binning method, the data
produced by neighboring pixels are aggregated together to generate a virtually
larger
pixel. The binning method generally produces less detail, and results in more
sensitive
response per "virtual" pixel, so the performance of the camera in lower light
conditions is improved. In Figure 87, the nearest four pixels are joined, and
so the
signal processor stores and prints the image as (A1-A3 white, C3-E3 black, G3-H5 white). In Figure 87, a pixel code is used to identify each virtual pixel. As an example, the first virtual pixel in the top left of Figure 87 is identified as pixel code A1-A3. The pixel code is generated by concatenating the horizontal and vertical coordinates 518 of the top left corner and the lower left corner of each virtual pixel. Accordingly, the codes for the four virtual pixels in the top row of virtual pixels shown in Figure 87 are A1-A3, C1-C3, E1-E3 and G1-G3.
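A sketch of the 2x2 binning itself, using the peppercorn scene of Figures 85 through 87 as input; the function name and the 0/1 encoding of white and black pixels are assumptions made for illustration.

```python
import numpy as np

def bin_2x2(image):
    """Sum each 2 x 2 block of pixels into one "virtual" pixel, trading detail
    for sensitivity as described for Figure 87."""
    h, w = image.shape
    trimmed = image[:h - h % 2, :w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# The 6 x 8 scene of Figure 85: 1 = black, 0 = white, rows 1-6, columns A-H.
scene = np.zeros((6, 8), dtype=int)
scene[1, 3:5] = 1    # D2-E2
scene[2, 2:6] = 1    # C3-F3
scene[3, 2:6] = 1    # C4-F4
scene[4, 3:5] = 1    # D5-E5
print(bin_2x2(scene))   # 3 x 4 grid of summed values per virtual pixel
```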
The method illustrated in Figure 87 not only improves performance under low
light conditions, but also increases image detail, since more photons are
captured per
virtual pixel. The processing time and storage needs are reduced. This method
also
reduces the noise created in low light situations when the ISO, or sensitivity
of the
pixels, is heightened.
Section 12. Arcuate Array of Mini-Sensors
Figure 88 offers a view 520 of another embodiment of the invention, which
includes an arcuate array of individual mini-sensors combined with a
corrective
optical element. The arcuate array 522 includes a number of mini-sensors 524,
which
each have an output 526 that is fed to a signal processor 22. Each mini-sensor
524 is
aligned along a curve or arc, and is disposed inside the camera. The mini-sensors 524 are separated by gaps, allowing for ease of construction. A corrective optical
element
528 is disposed above the arcuate array 522. The corrective optical element
528
includes a number of segments 530. Each of these segments 530 directs light
that
emerges from the objective lens 16 to one mini-sensor 524 in the array 522.
This embodiment of the invention achieves all the benefits of a curved or
concave sensor, without the need to bend the sensor material and without any
moving
parts. When a single flat sensor is used in a camera, the light rays travel further and bend more sharply to reach the edges of the flat sensor. The result is weaker light at the edges, with more chromatic aberrations (rainbow effects).
In this embodiment, the light rays entering the camera strike the sensors at
nearly identical distances from the objective lens. The light rays also strike
the
sensor at closer to a right angle on average. This embodiment enables lens
designers
to create faster lenses. Faster lenses capture more photons, which eliminates
the
need for flash in many low light conditions.
In an alternative embodiment, a number of these arrays may be deployed in
parallel.
SCOPE OF THE CLAIMS
Although the present invention has been described in detail with reference to
one or more preferred embodiments, persons possessing ordinary skill in the
art to
which this invention pertains will appreciate that various modifications and
enhancements may be made without departing from the spirit and scope of the
Claims
that follow. The various alternatives for providing a Curved Sensor Array
Camera
that have been disclosed above are intended to educate the reader about
preferred
embodiments of the invention, and are not intended to constrain the limits of
the
invention or the scope of Claims.
LIST OF REFERENCE CHARACTERS
10 Camera with curved sensor
12 Curved sensor
14 Enclosure
16 Objective lens
18 Incoming light
20 Electrical output from sensor
22 Signal processor
24 User controls
26 Battery
28 Memory
30 Camera output
32 Facet
34 Gap between facets
36 Via
38 Wiring backplane
40 Curved sensor formed from adjoining petal-shaped segments
42 Petal-shaped segment
43a First Mandrel
43b Substrate
43c First sheet of deformable material
43d Dome portion of deformable material over mandrel
43e Hemispherical base for curved sensor
43f Second sheet of deformable material
43g Second mandrel
43h Ports
43i Empty region
43j Heater
43k Hemispherical base for curved sensor
43l Sensor pixels formed on the base 43e or 43k
44 Camera monitor
46 Conventional sensor with generally uniform pixel density
48 Pixels
50 Sensor with higher pixel density toward center
52 Shade retracted
54 Shade extended
56 Multi-lens camera assembly
58 Objective lens
60 Mirrored camera/lens combination
62 Primary objective lens
64 Secondary objective lens
66 First sensor
68 Second sensor
70 Mirror
72 Side-mounted sensor
74 Sensor in original position
76 Sensor in rotated position
78 Sensor in original position
80 Sensor in displaced position
82 Alternative embodiment of sensor
84 Alternative embodiment of sensor
86 Alternative embodiment of sensor
88 Alternative embodiment of sensor
90 View of rear of one embodiment of sensor
92 Spiral-shaped conductor
94 Connection to sensor
96 Connection to processor
98 View of rear of one embodiment of sensor
100 Accordion-shaped conductor
102 Connection to sensor
104 Connection to processor
106 View of rear of one embodiment of sensor
108 Radial conductor
110 Brush
112 Brush contact point
114 Annular ring
116 Center of sensor, connection point to processor
118 Schematic view of wireless connection
120 Transmitter
122 Receiver
124 Processor
150 Camera
154 Enclosure
156 Lens
160 Sensor
162 Facets
164 Gaps
170 Center square
172 Ring of squares
176 Ring of squares
180 Shade extender arrangement
182 Inner shade member
184 Movable shade member
186 Outer, movable shade members
190 Lens moving mechanism
200 Image sequence processor
202 Sensor capture device
204 Auto device
206 Pixel density normalization device
208 Image processing engine
210 Display/LCD controller
212 Compression and storage controller
250 Camera
256 Lens
260 Sensor
270 Central region facet
272 Surrounding region facets
274 Shutter control
280 Lens shade actuator
290 Focus/stabilization actuator
292 Lens moving
300 First embodiment of combined device
300a First embodiment of combined device
300b First embodiment of combined device
302 Housing
304 Micro-controller
305a Front side
305b Back side
306 Display screen
308a Touch screen interface
308b User interface
310 Terminal for power and/or data
314 Speaker
315 Antenna
330 View of alternative embodiment
334 View of alternative embodiment
338 View of alternative embodiment
340 View of alternative embodiment
342 Schematic illustration of moving lens with fixed flat sensor
344 Moving lens
346 Fixed flat sensor
348 Light path
350 Overhead view of Figure 51
352 Schematic illustration of moving lens with fixed curved sensor
354 Fixed curved sensor
356 Overhead view of Figure 53
358 Schematic illustration of fixed lens with moving flat sensor
360 Moving flat sensor
362 Fixed lens
364 Overhead view of Figure 55
365 Schematic depiction of components that impart circular motion to sensor
366 Spinning disc
367 Connecting post
368 Attachment point
370 Electric motor
372 Axis of motor
373 Perspective view of Figure 57
374 Schematic view of fixed lens over moving curved sensor
376 Moving curved sensor
377 Overhead view of Figure 59
378 Schematic illustration of components for imparting motion to lens
380 Band
382 Springs
384 Springs connected to cams
386 First cam
388 Second cam
390 First electric motor
392 Second electric motor
394 Series of nine views of rotating sensor
396 Sensor
398 Pixels
400 Lens and sensor combination
402 Lens
404 Flat sensor
406 Central axis
408 Light ray
410 Combination of elements
412 Gaps
414 First exposure
416 Image frame
418 Boy's hand at beginning of exposure
420 Baseball at beginning of exposure
422 Exposure ends
424 Boy's hand at end of exposure
426 Baseball at end of exposure
428 Image at beginning of exposure with image stabilization
430 Image at end of exposure with image stabilization
432 Eye's view of cat
434 Camera sensor
436 Mini-sensor
438 Gaps
440 Cat's image
442 Missing portions of image
444 Portions of cat's image which register with mini-sensors
446 Cross-hatched missing portions
448 Missing portion of image in second exposure
450 Composite image
452 Camera with optical image stabilization
454 Optical image stabilization circuit
456 Camera with electronic image stabilization
458 Flat sensor
460 Electronic image stabilization circuit
462 Electronic image stabilization sensor
464 Actuator
466 Camera with manual zoom and lens shade control
468 Zoom lens
470 Manual zoom control
472 Manually controlled lens shade
474 First embodiment of lens shade control
476 Zoom control
478 Motor
480 First gear mechanism
482 Lens element
484 Motor
486 Second gear mechanism
488 Lens shade
490 Second embodiment of lens shade control
492 Twin track gear mechanism
494 Third embodiment of lens shade control
496 Lever arm
498 Fourth embodiment of lens shade control
500 Single arm lens shade controller
502 Manual zoom and lens shade controller
504 View of black object on white background
506 Black object
508 White background
510 Horizontal and vertical axes
512 View of black object on white background with grid lines
514 Grid lines
516 View of black object on white background showing binned virtual pixels
518 Axes for binned virtual pixels
520 Arcuate array of mini-sensors with corrective optical element
522 Array of mini-sensors
524 Mini-sensor
526 Mini-sensor output
528 Corrective optical element
530 Segment of corrective optical element

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01: As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refer to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Inactive: IPC expired 2023-01-01
Inactive: IPC assigned 2021-03-08
Inactive: IPC assigned 2021-03-08
Inactive: IPC assigned 2021-03-08
Inactive: IPC assigned 2021-03-08
Inactive: IPC removed 2020-12-31
Inactive: IPC removed 2020-12-31
Application Not Reinstated by Deadline 2016-12-29
Time Limit for Reversal Expired 2016-12-29
Deemed Abandoned - Failure to Respond to Maintenance Fee Notice 2015-12-29
Inactive: Cover page published 2015-07-20
Application Published (Open to Public Inspection) 2015-06-27
Small Entity Declaration Request Received 2014-06-03
Inactive: IPC assigned 2014-06-03
Inactive: Reply to s.37 Rules - Non-PCT 2014-06-03
Inactive: IPC assigned 2014-05-26
Inactive: First IPC assigned 2014-05-26
Inactive: IPC removed 2014-05-20
Inactive: IPC assigned 2014-05-20
Inactive: IPC assigned 2014-05-20
Inactive: IPC assigned 2014-05-20
Inactive: IPC assigned 2014-05-20
Inactive: Filing certificate - No RFE (bilingual) 2014-05-20
Application Received - Regular National 2014-05-14
Inactive: Correspondence - Formalities 2014-05-12
Small Entity Declaration Determined Compliant 2013-12-27
Inactive: Pre-classification 2013-12-27

Abandonment History

Abandonment Date Reason Reinstatement Date
2015-12-29

Fee History

Fee Type Anniversary Year Due Date Paid Date
Application fee - small 2013-12-27
Owners on Record

Note: Records show the ownership history in alphabetical order.

Current Owners on Record
GARY SUTTON
DOUGLAS GENE LOCKIE
WILLIAM MAYNARD, JR. BARTON
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

List of published and non-published patent-specific documents on the CPD.

Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Description 2013-12-26 84 2,459
Claims 2013-12-26 32 510
Abstract 2013-12-26 1 15
Drawings 2013-12-26 78 2,121
Representative drawing 2015-05-31 1 10
Filing Certificate 2014-05-19 1 178
Reminder of maintenance fee due 2015-08-30 1 112
Courtesy - Abandonment Letter (Maintenance Fee) 2016-02-08 1 171
Correspondence 2014-05-11 4 141
Correspondence 2014-06-02 19 461