Patent 2207326 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2207326
(54) English Title: DATA RECOGNITION SYSTEM
(54) French Title: SYSTEME DE RECONNAISSANCE DE DONNEES
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • A21B 1/40 (2006.01)
  • G01J 3/46 (2006.01)
  • G01N 33/10 (2006.01)
  • G06N 3/00 (2006.01)
  • G06T 1/00 (2006.01)
  • G06T 7/00 (2017.01)
  • G06F 19/00 (2006.01)
  • G06F 15/18 (2006.01)
  • G06T 7/00 (2006.01)
(72) Inventors :
  • WESCOTT, CHARLES TASMAN (Australia)
  • HAMEY, LEONARD GEORGE CHADBORN (Australia)
(73) Owners :
  • WESCOTT, CHARLES TASMAN (Not Available)
  • HAMEY, LEONARD GEORGE CHADBORN (Not Available)
(71) Applicants :
  • ARNOTT'S BISCUITS LIMITED (Australia)
(74) Agent: BARRIGAR & MOSS
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 1995-12-01
(87) Open to Public Inspection: 1996-06-20
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/AU1995/000813
(87) International Publication Number: WO1996/018975
(85) National Entry: 1997-06-09

(30) Application Priority Data:
Application No. Country/Territory Date
PN 0023 Australia 1994-12-13

Abstracts

English Abstract




The present invention is directed towards a method of determining the state of
a substance (2) having a color property dependent on the state of the
substance. The method includes the following steps: forming a pixel image of
the substance (2) using a camera (3); projecting the pixel image into a three-
dimensional color space using a neural network (7); and comparing the
projection (37) with a projection of a second position of the substance (2) to
determine the state of the substance (2) using a second neural network (9).


French Abstract

La présente invention concerne un procédé de détermination de l'état d'une substance (2) ayant, dans le domaine de la couleur, une propriété qui dépend de l'état de cette substance. Ce procédé comprend les étapes suivantes: formation d'une image de pixels de la substance (2) à l'aide d'une caméra (3), projection de l'image de pixels dans un espace en couleurs tridimensionnel en utilisant un réseau neuronal (7) et comparaison entre la projection (37) et la projection d'une deuxième position de la substance (2) pour déterminer l'état de cette dernière à l'aide d'un deuxième réseau neuronal (9).

Claims

Note: Claims are shown in the official language in which they were submitted.





CLAIMS
1. A method of baking foodstuffs by determining the state of a substance
having a color property dependent on said state, said method comprising the steps of:
forming a pixel image of said substance, wherein said pixel image comprises a
plurality of pixels;
projecting said pixel image into a three dimensional color space;
comparing said projection with a projection of a second portion of said
substance, having a predetermined state, so as to determine the state of said substance;
and
controlling an oven used to bake said foodstuffs dependent on the state of said
substance.

2. The method as claimed in claim 1, wherein said comparing step
further comprises the steps of:
deriving a series of state data points from said second portion of said
substance; and
producing a comparison of said projection with said state data points so as to
determine the state of said substance.

3. The method as claimed in claim 2, wherein said deriving step further
comprises the step of:
deriving said series of state data points by means of a self organising feature
map.

4. The method as claimed in claim 2, wherein said producing step further
comprises the step of:
producing a data histogram from said projected pixel image and said state data
points, said histogram indicating the number of projected pixels in the neighbourhood
of each of said state data points.

5. The method as claimed in claim 4, wherein said producing step further
comprises:
deriving a neural network input to a neural network using said data histogram,
said neural network being trained to recognise neural network inputs derived from
histograms of projected pixel images of substances having a predetermined state and to
generate a signal indicative of said predetermined state; and
outputting a signal from said neural network indicative of said state of the
projected pixel image of said data histogram.




6. A method of baking foodstuffs by determining the state of a first data
sample having, collectively, a series of characteristics, said method comprising the steps
of:
projecting said first data sample comprising a plurality of data points into a
multi-dimensional space;
comparing said projection with a projection of a second data sample having a
set of predetermined characteristics, so as to determine the state of said first data
sample characteristics; and
controlling an oven used to bake said foodstuffs dependent on the state of said
first data sample characteristics.

7. The method as claimed in claim 6, wherein said comparing step
further comprises the steps of:
deriving a series of state data points from said second data sample; and
producing a comparison of said projection with said state data points so as to
determine the state of said first data sample characteristics.

8. The method as claimed in claim 7, wherein said deriving step further
comprises the step of:
deriving said series of state data points by means of a self organising feature
map.

9. The method as claimed in claim 8, wherein said producing step further
comprises the step of:
producing a data histogram from said first data sample and said state data
points, said histogram indicating the number of first samples in the neighbourhood of
each of said state data points.

10. The method as claimed in claim 9, wherein said producing step further
comprises the steps of:
deriving a neural network input to a neural network using said data histogram,
said neural network being trained to recognise neural network inputs derived from
histograms of data samples having a predetermined series of characteristics and to
generate a signal indicative of said predetermined state, and
outputting a signal from said neural network indicative of said state of the
projected data samples of said data histogram.

11. A system for baking foodstuffs by determining the state of a substance
having a color property dependent on said state, said system comprising:




an imaging device for forming a pixel image of said substance, wherein said
pixel image comprises a plurality of pixels;
means for projecting said pixel image into a three dimensional color space;
means for comparing said projection with a projection of a second portion of
said substance, having a predetermined state, so as to determine the state of said
substance; and
means for controlling an oven used to bake said foodstuffs dependent on the
state of said substance.

12. The system as claimed in claim 11, wherein said means for comparing
said projection further comprises:
means for deriving a series of state data points from said second portion of said
substance; and
means for producing a comparison of said projection with said state data points
so as to determine the state of said substance.

13. The system as claimed in claim 12, wherein said deriving means
further comprises:
means for generating a self organising feature map to derive said series of state
data points.

14. The system as claimed in claim 12, wherein said means for producing
said comparison further comprises:
means for producing a data histogram from said projected pixel image and said
state data points, said histogram indicating the number of projected pixels in the
neighbourhood of each of said state data points.

15. The system as claimed in claim 14, wherein said producing means
further comprises:
means for deriving a neural network input to a neural network using said data
histogram, said neural network being trained to recognise neural network inputs derived
from histograms of projected pixel images of substances having a predetermined state
and to generate a signal indicative of said predetermined state; and
means for outputting a signal from said neural network indicative of said state
of the projected pixel image of said data histogram.

16. A system for baking foodstuffs by determining the state of a first data
sample having, collectively, a series of characteristics, said system comprising:

means for projecting said first data sample comprising a plurality of data points
into a multi-dimensional space;
means for comparing said projection with a projection of a second data sample
having a set of predetermined characteristics, so as to determine the state of said first
data sample characteristics; and
means for controlling an oven used to bake said foodstuffs dependent on the
state of said first data sample characteristics.

17. The system as claimed in claim 16, wherein said means for comparing
said projection further comprises:
means for deriving a series of state data points from said second data sample;
and
means for producing a comparison of said projection with said state data points
so as to determine the state of said first data sample characteristics.

18. The system as claimed in claim 17, wherein said deriving means
further comprises:
means for generating a self organising feature map to derive said series of state
data points.

19. The system as claimed in claim 18, wherein said means for producing
said comparison further comprises:
means for producing a data histogram from said first data sample and said state
data points, said histogram indicating the number of first samples in the neighbourhood
of each of said state data points.

20. The system as claimed in claim 19, wherein said means for producing
said histogram further comprises:
means for deriving a neural network input to a neural network using said data
histogram, said neural network being trained to recognise neural network inputs derived
from histograms of data samples having a predetermined series of characteristics and
to generate a signal indicative of said predetermined state, and
means for outputting a signal from said neural network indicative of said state
of the projected data samples of said data histogram.

21. An assembly for baking foodstuffs, comprising:
an oven; and
the system for baking foodstuffs in accordance with any one of claims
11 to 20.

Description

Note: Descriptions are shown in the official language in which they were submitted.


DATA RECOGNITION SYSTEM
Field of the Invention
The present invention relates to recognition of data characteristics within an
image and more particularly to the recognition of characteristics within a baking
process.
Background of the Invention
There are several factors which influence the results produced by a baking
process. For example, in the production of bread or biscuits or other such foodstuffs, a
number of factors including the temperature of the oven, the baking period, and the
size, shape, thickness and moisture level of the substances (e.g. dough) utilised
influence the final product produced at the end of the baking process. Further, these
baked foodstuffs are often produced on a commercial scale, in large volumes, using a
mass production process.
At the end of the production process, the goods are normally inspected to ensure
quality control. Inspection often takes the form of looking at a particular batch of
goods coming off a production line in order to ensure their suitability.
Often the inspection process is undertaken by a human observer. It has been
found, in practice, that human observers provide subjective judgements that are prone
to both long- and short-term variations. Additionally, human observers are only
capable of working at certain fixed speeds and are prone to adverse conditions such as
boredom, tiredness and/or sickness.
Summary of the Invention
It is an object of the present invention to provide a method of determining the
state of an object having visual data properties that are subject to variation.
In accordance with a first aspect of the present invention, there is provided a
method of determining the state of a substance having a color property dependent on
said state, said method comprising the steps of:
forming a pixel image of said substance;
projecting said pixel image into a three dimensional color space; and

comparing said projection with a projection of a second portion of said
substance, having a predetermined state, so as to determine the state of said substance.
In accordance with a second aspect of the present invention there is disclosed
an apparatus which is able to operate in accordance with the above method.
In accordance with a third aspect of the present invention, there is provided a
method of determining the state of a first data sample having collectively a series of
characteristics, said method comprising the steps of:
projecting said first data sample into a multi-dimensional space; and
comparing said projection with a projection of a second data sample having a
set of predetermined characteristics, to determine the state of said first data sample
characteristics.
Brief Description of the Drawings
The preferred embodiment of the present invention is described with reference to
the accompanying drawings, in which:
Fig. 1 illustrates the steps of the preferred embodiment;
Fig. 2 illustrates the input scanned data format utilised by the preferred
embodiment;
Figs. 3-8 illustrate the progression of the color change with increased baking
levels;
Fig. 9 illustrates a Kohonen self-organising feature map;
Fig. 10 illustrates a first baking curve; and
Fig. 11 illustrates a second baking curve for a different biscuit product.
Detailed Description
Although the preferred embodiment will be described with reference to baking,
it will be apparent to a person skilled in the art that the invention is not limited thereto
and applies to all forms of visualisation where the state of a substance can be
determined from its appearance.

In the preferred embodiment of the present invention, a detailed analysis of the
color characteristics of the baked product is undertaken by a computer system integrated
into the production process.
A mass production system 1 for producing baked foodstuffs according to the
preferred embodiment of the invention is illustrated schematically in Fig. 1. The
system 1 comprises a food processing apparatus (e.g. oven) 4, substance
characterisation apparatus 3, 6, 7, 8, 9, and control system 11. Baked products 2 are
imaged by a camera 3 as the products exit the oven 4. The image taken by camera 3 is
subjected to a number of processing steps preferably implemented using a
microcomputer, which will be described in more detail hereinafter. The processing
includes an image processing or calibration step 6, a first neural network processing
step 7, an image processing step 8, and a second neural network step 9 to produce an
output indication 10 that indicates the extent of the baking of the baked products 2.
The output 10 provides an indication of whether the products 2 are under- or over-
baked and can be utilised as an input to the control system 11 to determine if the oven
parameters such as heat or time need to be adjusted.
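The control policy itself is left open (human or automatic adjustment); purely as an illustration, a trivial set-point rule driven by the output 10 might take the following form in Python, where the class coding (negative for under-baked, positive for over-baked) and the 2 degree step are assumptions rather than values from the description.

```python
# Hypothetical mapping from the output indication 10 to an oven adjustment for
# control system 11; the class coding and the step size are illustrative only.
def adjust_oven(bake_state: int, current_temp_c: float, step_c: float = 2.0) -> float:
    """Return a new oven set-point: lower it for over-baked product, raise it for
    under-baked product, and leave it unchanged when baking is on target."""
    if bake_state > 0:        # over-baked
        return current_temp_c - step_c
    if bake_state < 0:        # under-baked
        return current_temp_c + step_c
    return current_temp_c
```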
The camera 3 utilised in the preferred embodiment is a three-chip charge
coupled device (CCD) camera which produces a 512 x 512 array of red (R), green (G),
and blue (B), or RGB, pixel values. This array of pixel values is stored in a "frame
grabber" board mounted within the microcomputer. The baking products 2 were
imaged using two daylight-color-balanced THORN (Registered Trade Mark) fluorescent
lamps for direct illumination and a black curtain to exclude ambient illumination. It
should be apparent to a person skilled in the art that other forms of illumination may be
used, and/or the background curtain may be eliminated in an industrial production
process, although some other form of ambient light shielding may be utilised without
departing from the scope and spirit of the invention. In a first sample, the baked
products imaged were "SAO" (Registered Trade Mark) biscuits produced in a
production line of the Arnott's Biscuits Limited biscuit factory.

In order to ensure proper color calibration of the imaged biscuits, the biscuits
were imaged with a color calibration chart 12 of the form shown in Fig. 2. The
scanned image 12' includes biscuit area 13, background 14, reference white background
15, grey scale 16, and reference colors 17. The scanned image 12' and image 12 were
then utilised to color calibrate the pixels of the biscuit 13. The process of color
calibration to ensure color consistency of sampled images is a well known process to
those skilled in the art of computer imaging. For a discussion of the color calibration
process, reference is made to Novak, C.L. and S.A. Shafer 1990, "Supervised Color
Constancy Using a Color Chart," Technical Report CMU-CS-140, Carnegie Mellon
University School of Computer Science. The paper by Novak et al. sets out more
extensive means of color calibration. However, it should be noted that under
appropriate circumstances of illumination and instrumentation control, calibration can
be dispensed with.
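The calibration arithmetic itself is not spelt out here; as a minimal sketch only, a simple white-patch correction using the reference white background 15 could be written as follows in Python with NumPy, where the function name, the mask argument and the target value of 255 are assumptions, and the fuller supervised colour-constancy method of Novak and Shafer would be used where higher accuracy is needed.

```python
import numpy as np

def white_patch_calibrate(image, white_mask):
    """Scale each RGB channel so the mean of the reference white area maps to 255.

    image:      (H, W, 3) RGB image containing the calibration chart.
    white_mask: (H, W) boolean mask selecting the reference white background 15.
    """
    white = image[white_mask].reshape(-1, 3).mean(axis=0)   # mean RGB of the white patch
    gains = 255.0 / np.clip(white, 1e-6, None)              # per-channel correction gains
    return np.clip(image * gains, 0, 255).astype(np.uint8)
```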
On completion of the color calibration process (block 6 of Fig. 1), each biscuit
image can be stored as an RGB file. The RGB color space is a well known color space,
in common use with computer systems. However, it will be apparent to a person
skilled in the art that other color systems such as HSV, XYZ or CIE color space
coordinates may also be utilised.
Referring now to Figs. 3 to 8, a projection of the pixel values obtained for
biscuit 13 (Fig. 2) is shown for different stages of the baking process.
Fig. 3 shows a projection 20 of the pixel image of a "raw" biscuit onto an RGB
cube 21.
Figs. 4 to 8 show corresponding projections for biscuits varying from
underbaked biscuits (Fig. 4) to overbaked biscuits (Fig. 8). It can be seen from
comparing Figs. 4 to 8 that the color characteristics of a biscuit change as that biscuit is
subjected to increased levels of baking. The particular progression of pixel data of
Figs. 4 to 8 can therefore be utilised to determine the baking state of any particular
sample biscuit.

In order to utilise the progression from Fig. 4 to Fig. 8 in the baking process, it
is necessary to succinctly and compactly describe an approximation to the data samples
of Figs. 4 to 8. This can be done through the production of a "baking curve" which is
a one-dimensional representation of the important color variations within the three-
dimensional data space of Figs. 4 to 8. One method of production of a baking curve is
to utilise a Kohonen self-organising feature map which is an unsupervised learning
technique that is effective in extracting structure from complex experimental data which
bears a highly non-linear relationship. A detailed description of Kohonen's self-
organising feature map is provided in Neural Network Architectures: An Introduction
by Judith E. Dayhoff, published 1990 by Van Nostrand Reinhold at pages 163-191, the
contents of which are hereby incorporated by cross-reference.
Turning now to Fig. 9, there is shown an example of a self-organising map
(SOM) as utilised in the preferred embodiment. The SOM 30 has three input nodes 31
which correspond to the red, green and blue color components of pixel values from the
digitised color image of the biscuit 13 (Fig. 2). The SOM 30 includes N output nodes
32. Every input node 31 is connected by means of edges e.g. 33, to each of the output
nodes 1 to N. The use of a one-dimensional SOM 30 means that, upon training, the
SOM network 30 will map the entire set of RGB pixel values 31 from a biscuit image
to a one-dimensional array of points or output nodes 32. This will have two main
effects: firstly, it will reduce the input data into a data space of lower
dimensionality; and secondly, the interrelationships between the most relevant points in
the input data 31 will be retained intact in the output format of the network 30.
If the input 31 is denoted E as follows:
E = (Red, Green, Blue), Eqn 1
and if output node i is connected to the input 31 by a series of edges having weights
denoted Ui, where Ui takes the following form:
Ui = (Wi,red, Wi,green, Wi,blue), Eqn 2
where it is assumed that the weights are initially randomly assigned to edges and range
over the same set of data values as the pixel component values (in the preferred
embodiment from 0 to 255), then the SOM is trained by finding, for each pixel, the
output node 32 having weights Ui which are closest to the input pixel E. The degree of
"closeness" is normally measured by means of a Euclidean distance measure. The
closest output node, having subscript c, can be denoted mathematically as follows:
||E - Uc|| = min_i ||E - Ui||. Eqn 3

The best or winning node c is then altered in conjunction with nodes within the
neighbourhood of c (for example, nodes c - 1 and c + 1). The alteration, for each
output node j in the neighbourhood of c (denoted Nc) proceeds by first calculating ΔUj
as follows:
ΔUj = α(t) × (E - Uj), Eqn 4
and then deriving a new set of weights Uj(t+1) as shown in equation 5:
Uj(t+1) = Uj(t) + ΔUj. Eqn 5

Those output nodes that are not in a predetermined neighbourhood around the chosen
node c are left unaltered.
The function α(t) in equation 4 is known as the "learning rate" and is a
monotonically decreasing function, with an alpha function of the following form being
suitable:
α(t) = α0 (1 - t/T), Eqn 6
where α0 takes on values in the range of 0.02 to 0.05, t is the current training iteration,
and T is the total number of training iterations to be done. The width d of the
neighbourhood Nc can also be chosen to vary in a similar manner as set out in
equation 7:
d(t) = d0 (1 - t/T), Eqn 7
where d0 can be chosen to initially be, say, a third of the width of the output nodes.
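As a minimal sketch of the training loop defined by Eqns 1 to 7 (not code from the patent), the following Python/NumPy function trains a 1-D SOM over RGB pixels; the name train_som and its default arguments are assumptions, with α0 and the initial neighbourhood width chosen from the ranges suggested above, and it could be called as train_som(pixels, n_nodes=10, n_passes=20) to mirror the ten-node, 20-pass example described below.

```python
import numpy as np

def train_som(pixels, n_nodes=10, n_passes=20, alpha0=0.05, d0=None, seed=0):
    """Train a 1-D Kohonen SOM over RGB pixel values (Eqns 1 to 7).

    pixels: (N, 3) array of R, G, B values in the range 0..255.
    Returns the (n_nodes, 3) node weights U, i.e. the points of the baking curve.
    """
    rng = np.random.default_rng(seed)
    pixels = np.asarray(pixels, dtype=float)
    # Weights initially random over the same range as the pixel values (Eqn 2).
    U = rng.uniform(0.0, 255.0, size=(n_nodes, 3))
    if d0 is None:
        d0 = n_nodes / 3.0               # initial neighbourhood width: a third of the output nodes
    T = n_passes * len(pixels)           # total number of training iterations
    t = 0
    for _ in range(n_passes):
        for E in rng.permutation(pixels):            # pixels presented in shuffled order
            alpha = alpha0 * (1.0 - t / T)           # learning rate, Eqn 6
            d = int(round(d0 * (1.0 - t / T)))       # shrinking neighbourhood width, Eqn 7
            # Winning node c: weights closest to E in Euclidean distance (Eqn 3).
            c = int(np.argmin(np.linalg.norm(U - E, axis=1)))
            lo, hi = max(0, c - d), min(n_nodes, c + d + 1)
            # Update c and its neighbourhood Nc (Eqns 4 and 5); other nodes are unaltered.
            U[lo:hi] += alpha * (E - U[lo:hi])
            t += 1
    return U
```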
As an example of a process carried out in accordance with the preferred
embodiment, four biscuit samples of "SAO" from Arnott's Biscuits Limited were
scanned to yield 47,967 pixels per biscuit with three separate R, G, and B values for
each pixel. The four sets of pixels were then shuffled in a random sequence and used

as training input to an SOM of the form of Fig. 9 having ten output nodes for a total of
20 training passes.
Referring now to Fig. 10, there is shown a plot of the ten output node weight
values Ui within a three-dimensional color cube 35. The ten points (e.g., 36) are
shown joined together by curve 37. The curve 37 is hereby denoted to be the final
"baking curve" of the input data.
Turning now to Fig. 11, the process was repeated for a second form of biscuit,
comprising Arnott's "MILK COFFEE" (Registered Trade Mark) biscuit for a 15 node
output SOM and the results 41 are shown plotted within the color cube 40, with the
training data being passed through the SOM a total of 50 times. It can be seen that the
structure of the baking curve 41 of Fig. 11 is similar to that of the baking curve 37 of
Fig. 10, with the two curves 37, 41 occupying a different portion of the color cube
35, 40 and reflecting the differences in ingredients between the products. The color
curve 41 of Fig. 11 is also shorter than that of Fig. 10 as the "MILK COFFEE" form
of biscuit exhibits more consistent browning than the "SAO" form of biscuits, which
have blisters that can cause uneven browning in color.
Once trained, the SOM 30 of Fig. 9 can be utilised as neural network 7 of Fig.
1 to produce, for each input pixel, an output node indicator having the closest position
to the input pixel. The closest matching output node 32 (Fig. 9) for each pixel of an
image can be subjected to image processing 8 (Fig. 1) which can take the form of
histogramming, thereby producing a histogram profile of the scanned baking product 2.
A second neural network 9, which takes the form of a supervised feed forward
neural network, can then be subjected to "training" by imaging a large number of
biscuits 2 having known baking characteristics, feeding the images through SOM 7, and
forming a histogram 8. The histogram 8 can then form the input data to a supervised
back propagation neural network which can be trained, in the normal manner, to
classify the color level of the baking product (e.g. the biscuit) 2. The samples can be
continuously fed through the steps 6 to 9 until the neural network 9 is properly trained
to produce output 10 indicating the level of baking.
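As an illustration only of this training stage, the fragment below uses scikit-learn's MLPClassifier, a supervised feed-forward network trained by back-propagation, as a stand-in for neural network 9; the placeholder training arrays, the three-class label coding and the ten-unit hidden layer are assumptions, and in practice each row of X would be the histogram 8 of a biscuit with a known baking state.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Stand-in training data: in the described system each row of X would be the
# histogram 8 computed for one imaged biscuit, and y its known baking label
# (here 0 = under-baked, 1 = correctly baked, 2 = over-baked, an assumed coding).
rng = np.random.default_rng(0)
X = rng.random((300, 15))            # placeholder histogram vectors
y = rng.integers(0, 3, size=300)     # placeholder bake-state labels

# A small supervised feed-forward network trained by back-propagation,
# standing in for neural network 9 of Fig. 1.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X, y)

# At run time the histogram of a freshly imaged biscuit is classified and the
# result used as the output indication 10 for control system 11.
bake_state = clf.predict(X[:1])[0]
```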



The output 10 can then be utilised by control system 11 which can take the form
of human or automatic process adjustment to adjust the conditions within oven 4 to
improve the baking products 2.
The specified system provides automatic segmentation of the biscuit subject from
diverse backgrounds. This is accomplished by the histogramming process which
"weighs" each image pixel as a reducing function of its distance from the histogram
points. Thus, pixels that are significantly distant from the baking curve discovered by
the self-organising map are down-weighted to the extent that they make little or no
contribution to the overall histogram. In practice, this means that it is not necessary for
biscuits to be imaged with a specially prepared background. Any background with
colors sufficiently dissimilar to the biscuit colors under consideration will suffice. This
is of practical benefit when applying the system to on-line monitoring and oven control,
as the imaging background may well be a conveyor belt of inconsistent color.
The computation of the histogram 8 from the map produced by the SOM is
performed in detail as follows. In principle, the purpose is to obtain histograms in
which each bin represents the weighted count of pixels falling within a fuzzy portion of
the baking curve 35, 40. The fuzzy portion is defined by a Gaussian weighting function
with parameters σx and σy. The parameter σx denotes the spread of the Gaussian
weighting function about the baking curve, and it is determined by consideration of the
likely color variation around the curve. It is chosen sufficiently small to enable
the automatic segmentation process previously described. The parameter σy denotes
the spread of the Gaussian weighting function along the baking curve for a particular
histogram bin. Normally, σy is greater than σx.
A practical implementation of the above technique involves the following steps:
a. The SOM nodes are interpolated to obtain a large number of sampling
points. The number of sampling points is determined by σx so as to limit to an
acceptable level the aliasing effect caused by sampling the baking curve at discrete
points. The Nyquist result in sampling theory applies in this step.

b. The biscuit pixels are histogrammed at the interpolated sampling points.
The distance of each pixel (in RGB color space) from each sampling point is computed,
and the Gaussian function with spread σx is used to compute the weighted contribution
of that pixel to the histogram bin at that particular sampling point.
c. The histogram produced in (b) is treated as a 1-D signal and filtered with a
second Gaussian function with a spread σy. In the process, it is subsampled to
the number of input nodes of the feed forward neural network. The value of σy is
chosen so as to limit to an acceptable level the aliasing caused by this further
subsampling.
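A compact sketch of steps (a) to (c) is given below in Python with NumPy and SciPy; the parameter values for σx, σy, the number of sampling points and the number of histogram bins are illustrative assumptions, not values from the description.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def baking_curve_histogram(pixels, som_nodes, n_samples=100, sigma_x=8.0,
                           sigma_y=2.0, n_bins=15):
    """Sketch of steps (a)-(c): Gaussian-weighted histogram along the baking curve.

    pixels:    (N, 3) RGB pixel values of the imaged biscuit.
    som_nodes: (M, 3) ordered SOM node weights (the baking curve points).
    Returns an (n_bins,) vector suitable as input to the feed-forward network.
    """
    # (a) Interpolate the SOM nodes to a denser set of sampling points along the curve.
    m = len(som_nodes)
    u = np.linspace(0, m - 1, n_samples)
    samples = np.stack([np.interp(u, np.arange(m), som_nodes[:, k]) for k in range(3)],
                       axis=1)                                              # (n_samples, 3)

    # (b) Histogram the pixels at the sampling points: each pixel contributes to every
    #     bin with a Gaussian weight (spread sigma_x) of its RGB distance to that point.
    dists = np.linalg.norm(pixels[:, None, :] - samples[None, :, :], axis=2)  # (N, n_samples)
    weights = np.exp(-0.5 * (dists / sigma_x) ** 2)
    hist = weights.sum(axis=0)                                              # (n_samples,)

    # (c) Treat the histogram as a 1-D signal, filter it with a second Gaussian
    #     (sigma_y here is expressed in sampling-point units along the curve) and
    #     subsample it to the network input size.
    smoothed = gaussian_filter1d(hist, sigma_y)
    idx = np.linspace(0, n_samples - 1, n_bins).round().astype(int)
    return smoothed[idx]
```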
The system may be applied to multi-dimensional data of diverse kinds such as
combinations of color, visual texture and/or three-dimensional structure sensing. In
some such situations, the set of state points may require a 2-D or 3-D SOM whereas the
baking curve requires only a 1-D SOM.
It will be obvious to those skilled in the art that the steps 6 to 9 can be
implemented in many different ways, including dedicated neural network hardware and
associated computer hardware, or in the form of a software simulation of the neural
network system. The preferred method of implementation of steps 6 to 9 is in the form
of software implementation on a standard microcomputer system as this allows for easy
alteration when it is desired to alter the form of baking products.
The foregoing describes only one embodiment of the present invention, and
modifications obvious to those skilled in the art can be made thereto without departing
from the scope of the invention.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Administrative Status

Title Date
Forecasted Issue Date Unavailable
(86) PCT Filing Date 1995-12-01
(87) PCT Publication Date 1996-06-20
(85) National Entry 1997-06-09
Dead Application 1999-09-10

Abandonment History

Abandonment Date Reason Reinstatement Date
1998-09-10 FAILURE TO RESPOND TO OFFICE LETTER
1998-12-01 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $300.00 1997-06-09
Maintenance Fee - Application - New Act 2 1997-12-01 $100.00 1997-09-18
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
WESCOTT, CHARLES TASMAN
HAMEY, LEONARD GEORGE CHADBORN
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD .



Document Description    Date (yyyy-mm-dd)    Number of pages    Size of Image (KB)
Drawings 1997-06-09 11 140
Representative Drawing 1997-09-19 1 6
Cover Page 1997-09-19 1 41
Abstract 1997-06-09 1 53
Claims 1997-06-09 4 197
Description 1997-06-09 9 423
PCT 1997-06-09 17 609
Assignment 1997-06-09 3 108
Correspondence 1997-08-22 1 30