Patent 2054851 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract is posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 2054851
(54) English Title: AUTOMATED POINT-OF-SALE MACHINE
(54) French Title: MACHINE AUTOMATISEE DE POINT DE VENTE
Status: Term Expired - Post Grant Beyond Limit
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06K 7/10 (2006.01)
(72) Inventors :
  • SCHNEIDER, HOWARD (Canada)
(73) Owners :
  • FUJITSU FRONTECH NORTH AMERICA INC.
(71) Applicants :
  • FUJITSU FRONTECH NORTH AMERICA INC. (United States of America)
(74) Agent: ROBIC AGENCE PI S.E.C./ROBIC IP AGENCY LP
(74) Associate agent:
(45) Issued: 1999-06-29
(22) Filed Date: 1991-11-01
(41) Open to Public Inspection: 1993-05-02
Examination requested: 1996-02-20
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract


An automated retail point-of-sale machine is disclosed having the ability to
allow consumers to check out their purchases with a minimum of direct human
assistance. The machine is designed to work with products whether or not they
are labelled with machine readable bar codes. The machine possesses security
features which deter customers from fraudulently bagging items by comparing
the weight changes on the packing scale with the product number related
information in the case of labelled products. In the case of nonlabelled
products, experienced customers can identify the product through a series of
menu choices while beginner customers can allow the supervisory employee
to enter a product number or abbreviated code, with additional visual and/or
dimensional sensory information about the contents being relayed to the supervisory
employee. The machine allows high shopper efficiency by minimizing customer
handling of products by positioning the packing scale adjacent to the scanner
and typically not requiring further handling of the purchased items until
checkout is completed.


Claims

Note: Claims are shown in the official language in which they were submitted.


THE EMBODIMENTS OF THE INVENTION IN WHICH AN
EXCLUSIVE PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED
AS FOLLOWS:
1. A self-service checkout system comprising:
(a) a robot module;
(b) a laser bar code scanner mounted in said robot module for generating
a first electrical signal corresponding to the bar code scanned;
(c) a packing scale mounted in said robot module for generating a second
electrical signal corresponding to the weight on said packing scale where said
packing scale is mounted in proximity to the said laser bar code scanner such
that a customer can scan and bag a product with one motion;
(d) attachments on the said packing scale to hold bags open and in place;
(e) a first video display mounted in said robot module;
(f) first user interface means operating in proximity to said first video
display generating a third electrical signal;
(g) a sensor mounted above the said packing scale where said sensor
generates a fourth electrical signal representative of the external
characteristics of the contents of the packing bags;
(h) a supervisor module to be used by a supervisory employee to supervise
the operation of said robot module;

(i) second user interface means mounted in the said supervisor
module generating a fifth electrical signal;
(j) a second video display mounted in said supervisor module;
(k) an electronic computer having access to a product lookup
table and receiving said first, second, third, fourth and
fifth electrical signals, and sending a sixth electrical
signal to said first video display and a seventh
electrical signal to said second video display;
(l) a computer program causing said electronic computer in the
case of a product containing a machine readable bar code,
to look up, in response to said first electrical signal,
in the said product lookup table the allowable weight for
the product and to verify correspondence with the weight
addition on the said packing scale as indicated by the
said second electrical signal, and in the case of a
product without a valid machine readable bar code to
present the customer, via said sixth electrical signal via
said first video display, with a series of choices to
identify the product, via said first user interface means
via said third electrical signal, including the option of
requesting the said supervisory employee, via said seventh
electrical signal via said second display means, to
identify the product via said second user interface means
via said fifth electrical signal and optionally in
response to said sensed external characteristics as
indicated by said fourth electrical signal;
and

(m) a storage scale mounted in close proximity to the said packing scale so
that when the said packing scale becomes filled, products and their bags can
be transferred to said storage scale which generates an eighth electrical signal
which is received and surveyed by the said electronic computer to ensure that
no unauthorized products are fraudulently placed on or in the bags on the said
storage scale.
2. The self-service checkout system of claim 1 containing a receipt printer
attached to the said electronic computer to produce a printed list of the
customer's purchases and total payment requested.
3. The self-service checkout system of claim 1 whereby said electronic
computer contains a human voice generating circuit.
4. The self-service checkout system of claim 1 whereby the said robot
module contains a payment reader capable of reading forms of payment
characterized by credit cards, debit cards and currency, where such payment
reader generates an electrical signal which is received and surveyed by said
electronic computer.

5. The self-service checkout system of claim 1 containing a television
camera and monitor to allow the supervisory employee to verify that before
the customer removes his products from the said robot module that no
products have been fraudulently put aside and containing a monitor visible to
the customer to make the customer aware that his/her actions are being
surveyed.
6. The self-service checkout system of claim 1 whereby the supervisor
module contains a cash drawer.
7. The self-service checkout system of claim 1 whereby the robot module
contains angled, sealed surfaces.
8. The self-service checkout system of claim 1 where the said sensor
mounted above the packing scale generates high resolution color images of the
product in the packing bags.

9. The self-service checkout system of claim 1 where the said sensor
mounted above the packing scale contains ultrasonic transducers generating
said fourth electrical signal which is representative of the distances from the
said sensor to the top of the contents in the packing bags and thus allows the
said electronic computer to compute the increase in volume of the contents
of the bags on the said packing scale after an item is placed in said bags and
to verify correspondence of the thus net volume of the product with the
volume specified in the said product lookup table for that particular product.
10. A self-service checkout system comprising:
(a) a robot module;
(b) a laser bar code scanner mounted in said robot module for generating
a first electrical signal corresponding to the bar code scanned;
(c) a packing scale mounted in said robot module for generating a second
electrical signal corresponding to the weight on said packing scale where said
packing scale is mounted in proximity to the said laser bar code scanner such
that a customer can scan and bag a product with one motion;
(d) attachments on the said packing scale to hold bags open and in place;
(e) a first video display mounted in said robot module;
(f) first user interface means operating in proximity to said first video
display generating a third electrical signal;

(g) a sensor mounted above the said packing scale where said sensor
generates a fourth electrical signal representative of the external
characteristics of the contents of the packing bags;
(h) a supervisor module to be used by a supervisory employee to supervise
the operation of said robot module;
(i) second user interface means mounted in the said supervisor module
generating a fifth electrical signal;
(j) a second video display mounted in said supervisor module;
(k) an electronic computer having access to a product lookup table and
receiving said first, second, third, fourth and fifth electrical signals, and sending
a sixth electrical signal to said first video display and a seventh electrical signal
to said second video display;
(l) a computer program causing said electronic computer in the case of a
product containing a machine readable bar code, to look up, in response to
said first electrical signal, in the said product lookup table the allowable
weight for the product and to verify correspondence with the weight addition
on the said packing scale as indicated by the said second electrical signal, and
in the case of a product without a valid machine readable bar code to present
the customer, via said sixth electrical signal via said first video display, with a
series of choices to identify the product, via said first user interface means via
said third electrical signal, including the option of requesting the said
supervisory employee, via said seventh electrical signal via said second display

means, to identify the product via said second user interface means via said
fifth electrical signal and optionally in response to said sensed external
characteristics as indicated by said fourth electrical signal;
and
(m) in proximity to the said packing scale a three-dimensional array of light
beams and light detectors generating an eighth electrical signal which is
received by the said electronic computer where interruption of the said light
beams by the customer's hand transferring a product to the packing scale and
by the customer's empty hand leaving the packing scale causes the said
electronic computer to subtract the computed dimensions of the customer's
hand alone from the computed dimensions of the customer's hand holding the
product and to verify correspondence of the thus net dimensions of the
product with the dimensions specified in the said product lookup table for that
particular product.

Description

Note: Descriptions are shown in the official language in which they were submitted.


AUTOMATED POINT-OF-SALE MACHINE

FIELD OF THE INVENTION

The present invention relates to retail point-of-sale systems which allow
the customer to check out purchased items with a minimum of operator
intervention while preventing customer fraud.

BACKGROUND OF THE INVENTION

In most retail environments the customer selects various items for
purchase and brings these items to an operator for checkout. The operator
enters the price of each item selected, as well as a code particular to the item,
into a point-of-sale terminal which then calculates the total amount the
customer must pay. After payment is received the point-of-sale terminal
calculates any change owing to the customer and produces a written receipt
for the customer. Over the last two decades many retail products have been
manufactured to contain a machine readable bar code. In response, many
retail environments have incorporated an optical scanner into their point-of-
sale systems. The operator is able to save time by scanning purchased items
rather than having to manually key in price and product information. When
the operator scans a product the optical scanner sends a signal corresponding
to the product number to the data processing component of the point-of-sale
terminal system. In the latter resides a product lookup table which quickly
provides the price and the description of the scanned item.

Many inventions have been proposed over the last two decades to
automate the point-of-sale terminal by having the customer scan the item
himself/herself and then place the item on a checkout weighing receptacle.
Since many items have predetermined weights, the point-of-sale terminal
system need only compare the actual weight of the product placed on the
checkout weighing device with the weight given by the product lookup table
(ie, along with the price and description information) to assure that the item
placed on the checkout weighing receptacle is indeed the item scanned.

One early prior art system for automated checkout is described in
Ehrat U.S. Pat. No. 3,836,755. Ehrat's invention consists of a shopping cart
which contains a scanning and weighing apparatus and which in conjunction
with an evaluation system evaluates the correspondence of weight with
product designation. Another prior art system for automated checkout is
described in Clyne U.S. Pat. No. 4,373,133. Clyne's invention consists of
providing each customer's shopping cart with an electronic recording unit
which is used by the customer to scan each item selected for purchase. The
recording unit can contain a product lookup table to enable it to obtain weight
and price information. When the customer wishes to check out, his/her
collection of items is weighed to verify that the actual total weight corresponds
with the total weight calculated by the electronic recording unit. One
important limitation of Ehrat's and Clyne's inventions is their poor ability to
deal with products not having a machine readable code. Another limitation is
the risk of customer fraud if the customer easily substitutes a more expensive
item having the same weight.

Improved systems for automated checkout are described in
Mergenthaler U.S. Pat. No. 4,779,706, Johnson U.S. Pat. No. 4,787,467 and
Humble U.S. Pat. No. 4,792,018. The Mergenthaler and Johnson inventions
are quite similar. At a self-service station customers scan and weigh items
(where weight is automatically checked against product code) and then place
items into a new cart (Johnson) or a bag (Mergenthaler) which is on a
weighing receptacle. The new cart or new bags are then brought to a checkout
station where it is verified that the weight of the cart or bags has not changed.
The Humble invention passes items on a conveyer through a light tunnel after
scanning. Not only is weight determined and verified against product number,
but the product's dimensions can also be determined and verified against
product number thereby making substitution of similar weight items difficult.
The customer's items accumulate at the end of the light tunnel where they
must later be bagged and presented to an operator for payment. To prevent
customers from not scanning items and placing them at the end of the light
tunnel for bagging, the Humble invention suggests the use of an electronic
surveillance system in the pedestrian passage about the system.

The above inventions all have serious limitations with respect to
customer fraud, shopping efficiency, non-coded products and use by non-
experienced users. In the Mergenthaler and Johnson patents, customer fraud
remains an important problem as customers can scan a cheap item at the self-
service station, discard it and immediately substitute a more expensive
similarly weighing item. Despite the Humble patent's use of the light tunnel
to determine item shape in addition to weight, the customer need only place
an item at the bagging area without scanning it. The electronic surveillance
system suggested by the Humble patent is not economical for retail
environments such as supermarkets. As noted in the Shapiro article,
"shoppers could conceivably put groceries directly from their carts into their
shopping bags." In the Mergenthaler and Johnson patents, little attention is
paid to shopper efficiency (as opposed to operator efficiency). Customers must
handle items repeatedly to place them from one weighing station to another.
The Humble invention also does a poor job with respect to shopper efficiency.
After having scanned and placed all the purchased items on the conveyor, the
customer must once again handle all the items during the bagging operation.
The Johnson invention does make a limited provision for items not possessing
a machine readable code by allowing customers to enter a code or price value.
However, the items are not verified in any way by the invention. The Humble
invention pays more attention to products not containing a machine readable
code. Customers are presented with a selection on a computer screen and the
invention attempts to verify that the dimensions of the item correspond with the
selection made. However, such correspondence is very limited. As a result, as
the Shapiro article points out, "Fruits and vegetables present considerable
problems.. an employee is stationed in the produce department to weigh fruit
and affix a coded label for the system to read." The Johnson and
Mergenthaler inventions pay scant attention to user friendliness -- an
important consideration for non-experienced users. The Humble invention
pays more attention to user friendliness with the incorporation of a touch-
activated display screen. Nonetheless, as the Shapiro article notes, "..not
delivered the promised labor savings.. CheckRobot says one cashier can handle
three to eight lanes. But because of the need to help confused customers.. a
cashier assigned to every two lanes and other employees hover around the
machines to help customers."

SUMMARY OF THE INVENTION

The present invention describes a method and apparatus which allows
consumers to check out their purchases with a minimum of direct human
assistance. The present invention possesses significant improvements with
respect to the prior art in the areas of customer fraud, shopping efficiency,
non-coded products and use by non-experienced users.

The present invention consists of two major modules - the self service
unit utilized by the customer, herein referred to as the 'robot module', and the
unit utilized by the store employee to supervise the operations of several robot
modules, herein referred to as the 'supervisor module'. The customer presents
himself/herself at any available robot module with the items he/she has
selected for purchase. The customer scans a product item and then places it
into a bag resting on a scale, herein referred to as the 'packing scale'. The
electronic signals from the scanner and the scale go to an electronic computer
which contains (or can access) a product lookup table allowing the increase
of weight on the packing scale to be verified against product number. The
customer repeats this operation for all remaining items. If a weight change
does not correspond with the product number then the customer will receive
an audio and/or visual prompt to this effect from the robot module. Prompts
typically are simultaneously transmitted to the supervisor module. A
bidirectional intercom system allows the supervisory employee to immediately
help the customer with any difficulties and if necessary, via the supervisor
module keyboard, directly enter commands or product information. When the
customer has scanned and bagged all items selected for purchase, the
customer goes to the supervisor module to pay, or if the robot module is so
equipped, as it would typically be in the case of debit or credit cards, the
customer remains at the robot module for payment. In either case, the
customer is instructed to leave the bag on the packing scale alone. Removing
the bag from the packing scale will cause a change in weight (or similarly,
adding a nonscanned item to the bag will cause a change in weight) that will
be noticed by the computer and cause a warning to be given. Only after the
computer receives a signal that payment has been received will it allow the
bag from the packing scale to be removed without a warning prompt
occurring. Note that the customer has handled each item only one time. The
customer scans and then directly bags the item. Neither the item nor the bag is
handled again until checkout is finished, thus allowing a high shopper
efficiency. A small exception occurs if the customer has items too numerous
to fit in the bag(s) on the packing scale, in which case full bags are slid several
inches to an adjacent larger 'storage scale' where weight changes are
monitored by the computer.
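
The weight check just described amounts to comparing the weight added to the packing scale against the weight stored for the scanned product number in the product lookup table. The sketch below is purely illustrative and is not part of the patent text; the table layout, field names and tolerance scheme are assumptions made for the example.

    # Illustrative sketch (not from the patent): verify the weight added to the
    # packing scale against the product lookup table entry for the scanned code.
    # The table layout and tolerance values are assumptions for this example.

    PRODUCT_LOOKUP = {
        # product_number: (description, price, expected_weight_g, tolerance_g)
        "061200001234": ("Canned soup", 1.29, 305.0, 15.0),
        "061200005678": ("Bag of rice", 4.49, 2000.0, 40.0),
    }

    def verify_scanned_item(product_number, weight_before_g, weight_after_g):
        """Return True if the weight added to the packing scale matches the
        scanned product number within its allowed tolerance."""
        entry = PRODUCT_LOOKUP.get(product_number)
        if entry is None:
            return False  # unknown code: prompt the customer and the supervisor
        _, _, expected_g, tolerance_g = entry
        weight_added = weight_after_g - weight_before_g
        return abs(weight_added - expected_g) <= tolerance_g

    # Example: a 310 g can added after scanning code 061200001234 is accepted.
    assert verify_scanned_item("061200001234", 1200.0, 1510.0)

A failed check of this kind would trigger the audio and/or visual prompt described above.
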
To prevent the customer from scanning one item and substituting a
more expensive item into the bag on the packing scale and to prevent the
customer from placing a nonscanned item into his/her bags after payment, the
present invention incorporates several innovative features. The robot module
is physically constructed to contain no openings nor any folds nor any flat
surfaces, except the limited but prominent surface adjacent to the scanner,
where fraudulently substituted items could be discarded. The robot module
contains a closed circuit video camera and video monitor to psychologically
deter the customer from fraudulent activity. As well, a signal from the closed-
circuit video camera showing the areas covering the floor, the shopping cart
and the flat scanner area, is presented to the supervisory employee via the
supervisor module after payment is received. The supervisory employee must
press a key on the supervisor module keyboard to accept the video image (or
avoid pressing a 'reject' button) to allow the computer to allow the customer
to remove his/her bags without the occurrence of an audiovisual warning. Note
that the present invention requires the supervisory employee to observe the
video image for only a second unlike the constant monitoring that is required
of typical video surveillance systems.

Before the customer uses the robot module, he/she presses a button or
switch indicating the level of experience he/she has with this type of
automated point-of-sale machine. For 'beginner' customers, when they have
an item not containing a machine readable bar code, as indicated by pressing
a 'no bar code' button on the robot module, they will be instructed to place
the item directly into the bag on the packing scale where its image (and/or
possibly ultrasonic dimensions and/or dimensions obtained by breaking a light
curtain above the bag) is sent to the supervisor module. The supervisory
employee receives a prompt to examine the image and to enter the product
number or a corresponding abbreviation of the new item. In the case of the
'experienced' customer, the computer monitor of the robot module will
present the customer with a menu selection in order for the customer to
qualitatively identify the product and optionally identify its quantity. After
identification, typically involving pressing a button corresponding to a choice
on a sub-menu, the customer is instructed to place the item in the bag on the
packing scale. An image of the bag's new contents along with the customer's
identification are presented to the supervisory employee via the supervisor
module for verification. In the case of both the 'beginner' and 'experienced'
customers, the weight change on the packing scale is evaluated by the
computer with reference to the product number ultimately chosen to see if the
weight change is reasonable. If the weight increase differs by more than the
allowed tolerance for that product, then the supervisory employee will receive
a prompt to inspect the transmitted video image with more care. Note that
with only a small investment of the supervisory employee's time and with little
confusion to the inexperienced user, a product not bearing a machine
readable code is accurately identified. In particular, note that the customer is
not obligated to key in a series of product number digits to identify the
product.
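
The decision flow for an item without a bar code can be summarized as follows. This sketch is illustrative only; the callback names and the way quantity scales the tolerance are assumptions, not details taken from the patent.

    # Illustrative sketch (not from the patent): resolving a product number for
    # an item with no machine readable bar code, then sanity-checking the weight.
    # Callback and field names are assumptions made for this example.

    def handle_uncoded_item(customer_level, weight_added_g, lookup,
                            ask_customer_menu, ask_supervisor,
                            prompt_supervisor_image):
        if customer_level == "beginner":
            # The supervisory employee identifies the product from the image.
            product_number, quantity = ask_supervisor()
        else:
            # The experienced customer identifies the product via menu choices.
            product_number, quantity = ask_customer_menu()

        expected_g, tolerance_g = lookup[product_number]
        if abs(weight_added_g - quantity * expected_g) > quantity * tolerance_g:
            # Weight does not correspond: ask the supervisor to inspect the image.
            prompt_supervisor_image(product_number, quantity)
        return product_number, quantity
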
As mentioned above, in the case of nonlabelled products, an image and
possibly the dimensions of the product are transmitted to the supervisor
module for approval by the supervisory employee. For beginner customers, the
supervisory employee will actually identify the product and if necessary its
quantity (ie, enter the product number or an abbreviation thereof and if
necessary the quantity) while experienced customers are expected to identify
the product typically through a series of menus displayed on a video display.
Occasionally the customer will be expected to identify the quantity of the
product as well, eg, "4 apples." For the experienced customer, the supervisory
employee then will verify that the customer has correctly identified the
product and its quantity. As mentioned above, the weight of the product is
nonetheless evaluated by the computer to make sure that the weight increase
on the packing scale corresponds reasonably with the product and its quantity.
If poor correspondence is determined by the computer, then the supervisory
employee will be prompted to verify the transmitted image with more care.
Note that for both types of customers, and especially for the experienced
customer, only a small amount of the supervisory employee's time is required.
The supervisory employee is not expected to constantly watch a video screen
as is typically done in closed-circuit television surveillance systems. Rather, the
supervisory employee receives the occasional prompt during a customer's
order to look at the video screen for a moment for those products not bearing
machine readable product codes. To maximize labor savings it is often
advantageous to have one supervisory employee monitor as many as eight
robot modules. In such a case, should two or more customers have
nonlabelled products for verification by the supervisory employee at the same
time, assuming that the customers are experienced customers and have
identified the product, then it is useful after a certain period of time has
elapsed, eg, 3 seconds, to verify the product solely on its weight. For the
occasional time when the supervisory employee is busy, this scheme maintains
shopper efficiency without reducing overall security very much. It is possible
to extend this scheme even further to maximize labor savings even more. By
using additional sensory modalities in conjunction with the transmitted video
images, it is possible to have one supervisory employee monitor more robot
modules without reducing shopper efficiency or overall security. By
determining the dimensions of the product being placed into the bags on the
packing scale, for the majority of nonlabelled products it will be sufficient to
verify the dimensions and the weight of the product against its product code
information to assure that the experienced customer is accurately and honestly
identifying the product. Only for those cases where the computer has
determined that the correspondence of measured dimensions and measured
weight is poor, will it be necessary to use the supervisory employee's time to
examine the transmitted image to make a final decision. Two methods of
determining dimensions are readily available for use with the robot module.
One method consists of placing in proximity to the packing scale a three-
dimensional array of light beams and light detectors. The dimensions of the
customer's hand holding the product and the dimensions of the customer's
empty hand returning from the packing scale can be easily computed by the
computer by following which light beams have been interrupted. Thus, by
subtracting the dimensions of the empty hand from the dimensions of the
hand plus product, net dimensions of the product can be calculated. Another
method of determining dimensions involves placing ultrasonic transducers
above the packing scale. The ultrasonic transducers and appropriate circuitry
can measure the distance from their fixed position to the top of the contents
in the packing scale bag(s). Thus, by observing the change in distances from
the ultrasonic transducers to the tops of the contents in the packing scale
bag(s), the computer can calculate net volume changes. This net measured
volume can then be verified against the product number's stored volume
limits.
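
Both sensing methods reduce to simple arithmetic on the sensor readings, as sketched below. This example is illustrative only; the beam pitch, grid cell area and data representations are assumptions rather than values given in the patent.

    # Illustrative sketch (not from the patent): net product dimensions from the
    # light-beam array, and net volume from the ultrasonic distance readings.

    def net_extent_mm(beams_broken_with_product, beams_broken_empty_hand,
                      beam_pitch_mm=10.0):
        """Net product extent along one axis: beams interrupted by the hand
        holding the product, minus beams interrupted by the empty hand."""
        net_beams = len(beams_broken_with_product) - len(beams_broken_empty_hand)
        return max(net_beams, 0) * beam_pitch_mm

    def net_volume_cm3(distances_before_mm, distances_after_mm, cell_area_cm2=4.0):
        """Net volume added to the bag, from ultrasonic distances (sensor to top
        of contents) sampled over a grid before and after the item is placed."""
        total = 0.0
        for before, after in zip(distances_before_mm, distances_after_mm):
            rise_cm = max(before - after, 0.0) / 10.0  # contents rose this much
            total += rise_cm * cell_area_cm2
        return total

    # Example: contents rise 50 mm under four 4 cm^2 cells -> 80 cm^3 added,
    # which would then be checked against the product's stored volume limits.
    print(net_volume_cm3([300, 300, 300, 300], [250, 250, 250, 250]))
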
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a perspective view showing the exterior configuration of a
preferred embodiment of the 'robot module' portion of the invention.

FIG. 2 is a perspective view showing the exterior configuration of a
preferred embodiment of the 'supervisor module' of the invention.

FIG. 3 is a block diagram of the invention.

FIG. 4a-4d is a flow-chart showing the logic steps associated with the
invention.

DESCRIPTION OF THE PREFERRED EMBODIMENT

External Configuration

Turning now to FIGS. 1 and 2 there is shown a preferred embodiment
of the automatic POS machine. Figure 1 shows the portion of the machine
used by the consumer to checkout his/her purchases. This portion of the
machine will herein be referred to as the 'robot module'. Figure 2 shows the
portion of the machine used by the store employee to supervise the operations
of several 'robot modules'. This portion of the machine will herein be referred
to as the 'supervisor module'. Figure 2 depicts a supervisor module which is
capable of supervising two robot modules.

Robot Module

The robot module, as shown in Figure 1, instructs the consumer via a
centrally located video display terminal 11. To communicate with the robot
module, the consumer can press buttons 1 through 10. In the embodiment
shown here the video display terminal would typically be a high resolution
color graphical video display terminal and the buttons would be color coded
switches. The buttons would be lined up precisely with the video display
terminal 11 so that they could be used for many different functions. In other
embodiments, the labelling or the quantity of the buttons could differ from the
present embodiment. As well the video display terminal could be monochrome
rather than color, and its size and location could differ from the present
embodiment. It is possible, in a different embodiment of the present invention,
to replace or supplement the combination of buttons 1 to 10 and the video
display terminal 11, with a touch-sensitive video display terminal. Other
embodiments of the present invention are also possible whereby the buttons
1 to 10 are replaced by other means of user interface, eg, voice recognition
circuitry, interruption of beams of light by a pointing finger, joystick, etc.

The robot module also instructs the consumer via a speaker system 12.
Speaker system 12 consists of one or more audio speakers attached to one or
more audio amplifiers. The speaker system 12 receives computer generated
voice signals and computer generated tonalities from the computer portion 66
of the automatic POS machine. Speaker system 12 also receives speech
signals from the microphone 61 at the supervisor module. Likewise, the
consumer can communicate by voice with the employee supervising the
automatic POS machine via microphone 13. Note that in the present
embodiment microphone 13 attaches to the robot module via a flexible neck
161.

Sign 141 provides the consumer with information regarding the
operation of the automatic POS machine, as well as advertising for services
and products offered by the store.

Laser scanner 14 is capable of interpreting a bar coded label on a retail
product. Bar coded labels, as one skilled in the art knows, represent digits and
occasionally alphanumeric symbols, by a series of thin and thick bars. Many
products sold at retail stores possess a bar coded label representing the
manufacturer's product number for that product. Laser scanners are
commercially available which scan with a moving laser beam the bar coded
label on a product and produce an electrical signal representing that product's
code number. An area 16 prior to the laser scanner allows consumers to
prepare products for scanning. In FIG. 1, a shopping basket 15 is shown
resting on area 16.

After a consumer scans a purchased item over the laser scanner 14, the
consumer places the item into the plastic or paper bag 21 held in place by bag
holders 19 and 20. Bag holders 19 and 20, as well as portions of bag 21, lie on
platform 22. Platform 22 lies on a weighing scale 23, herein referred to as the
'packing scale'.

For the sake of simplicity, in the embodiment being discussed here, 18
is considered to be a sensor transmitting only images of the contents of bag
21 to the supervisor module. Thus, in the embodiment being discussed here,
sensor 18 will be referred to also as 'sensor/video camera' 18. However, as
mentioned above, sensor 18 may in other embodiments contain a three-
dimensional array of light beams and detectors which measure the dimensions
of the customer's hand and product going to the bag 21 and the customer's
empty hand returning from bag 21 thus allowing computation of the net
dimensions of the product. Sensor 18 may also contain a plane of ultrasonic
transducers which measure the distance from the fixed position of sensor 18
to the top of the contents of the bag 21. By noting the change in these
distances after a product is placed in bag 21, it is possible to compute the
volume of the product. Other embodiments of the present invention are thus
possible where sensor 18 consists of a video camera and/or a light-beam
dimension computing array and/or an ultrasonic transducer volume computing
plane.

After bag 21 is full, it can be transferred by the consumer to platform
28. In FIG. 1, such a bag 24 is shown resting on platform 28. Note also that
platform 28 contains a pole 26 which in turn contains hooks 27. Additional
bags can be hung on hooks 27. Platform 28 lies on a weighing scale 29, herein
referred to as the 'storage scale'.

Pole 30 is attached to the cabinet 162 of the robot module (it does not
make any contact whatsoever with platform 28). Mounted on the top of pole
30 is a surveillance camera 32 and a surveillance monitor 31. Surveillance
camera 32 transmits video images of the consumer and the immediate region
around the consumer. These images are sent to the supervisor module as well
as being displayed on the surveillance monitor 31. Thus, the consumer can see
images of himself/herself on monitor 31 and thus is aware that his/her actions
are being monitored by the supervisory employee.

Cabinet 162 and cabinet 17 of the robot module do not have openings.
As well, platforms 22 and 28 occupy most of the horizontal space over cabinet
162. An important feature of the present invention is that it is difficult for a
customer to leave aside an item he/she does not scan so as to avoid paying for
the item by simply bagging the item when the order is completed and he/she
is taking the bags from platforms 22 and 28. Any item the customer places on
platforms 22 or 28 will cause a weight change to be detected by the packing
scale 23 or the storage scale 29. If the item has not been scanned, the machine
will prompt the customer to remove the item, as discussed later below. If the
customer leaves an item on the laser scanner 14 or on the surface 16 adjacent
to the laser scanner, the supervisory employee will be able to see these items
via the video image recorded by camera 32.

The surface 16 adjacent to the laser scanner 14 is a useful feature of
the present invention. Surface 16 allows the customer to place a shopping
basket 15 adjacent to the laser scanner 14. In the case whereby the customer
uses a shopping cart, surface 16 serves as a small area where the customer can
unload items from the shopping cart before deciding exactly which items
should be scanned first.

A key feature of the present invention is the proximity of the laser
scanner 14 to the packing scale 23. This proximity allows the customer to scan
and then bag an item in one single step.

Supervisor Module

The supervisor module, as shown in FIG. 2, allows a store employee
to supervise the operation of the robot module of FIG. 1. Together, FIGS. 1
and 2, ie, the robot module and the supervisor module, constitute an
embodiment of the present invention. As mentioned above, the present
embodiment depicts a supervisor module which is capable of supervising two
different robot modules. However, other embodiments can be envisioned
which allow the store employee to supervise a greater number of robot modules.

Since the supervisor module shown in FIG. 2 is intended to supervise
the operation of two robot modules, the present embodiment of the supervisor
module contains two of all parts. An exception is that it contains only one
microphone 61 which must be shared between two robot modules via
microphone switch buttons 62 and 63. From the point of view of reliability there
are advantages to keeping the supervisory equipment required for each of
the two robots separate. For example, if one set of supervisory equipment
fails, then only one robot will be inoperable since the other set of supervisory
equipment is working. However, for reasons of economy, it is possible to
envision other embodiments of the supervisor module which share many
supervisor components to supervise the operations of many robot modules.

Since the supervisor module contains two sets of symmetrical
components, we shall arbitrarily decide to consider the components on the
left-hand side of the page as being the components which connect with the
particular robot module shown in FIG. 1.

Video monitor 51 displays the video images transmitted by video
cameras 18 and/or 32. Video monitor switch 60 controls whether the monitor
displays the image from sensor/video camera 18 and/or the image from video
camera 32. As is apparent from FIG. 1, sensor/video camera 18 allows the
supervisory employee to see the contents of the sac 21 on the packing scale
23. Similarly, video camera 32 allows the supervisory employee to see the
actions of the consumer and the area immediately around the consumer.

Video display terminal 53 generally displays the same information
shown on video display terminal 11. Thus, the supervisory employee can see
what actions the consumer is being instructed to perform at that moment, as
well as the summary information about the order (eg, total cost, items
purchased, etc) normally displayed to the consumer. Occasionally, video
display terminal 53 may contain information not shown on video display
terminal 11; generally this is information required by the supervisory employee
but not by the consumer, eg, an acceptable weight tolerance for a certain
product. In other embodiments of the present invention whereby it is desired
to economize as much as possible on components required for the supervisor
module, video display terminal 53, as well as video monitor 51, would contain
alternating or reduced size or summarized images and information from
several different robot modules.

Microphone 61 allows the supervisory employee to talk with the
consumer. Note that in the present embodiment of the invention, there is only
one microphone for the two robots served by the supervisor module. The
supervisory employee must press microphone switch 62 on the supervisor
keyboard 57 to transmit a message to the speaker system 12 of the specific
robot module shown in FIG. 1.

Receipt printer 55 prints a receipt for the consumer. If a separate
receipt printer is used for each robot, as shown in the present embodiment,
then every time the consumer scans an item and places it in sac 21, it makes
sense to print out the item purchased and its price. When the consumer has
finished his/her order, the receipt will have already largely been completed
thus saving time. As well, if there are any problems during the order, the
operator can examine the receipt to very quickly see what items have been
purchased (although the latter information is also generally available via the
video display terminal 53). Receipt printers, as one skilled in the art knows,
are available commercially from many different manufacturers with many
different features. Some receipt printers have the ability to print in color,
while others may have the ability to print bar coded coupons. In general,
receipt printers print a 40 column or narrower receipt for the consumer, as
opposed to the 80 or 132 column printers used by many data processing
systems.

Operator keyboard 57 consists of a group of buttons which the
supervisory employee uses to control the robot. For example, if a product
which has no bar coded label is placed in sac 21, then the supervisory
employee may be expected to enter a code and/or approve the item via the
operator keyboard 57. Other embodiments of the present invention are also
possible whereby the operator keyboard 57 is replaced by other means of user
interface, eg, voice recognition circuitry, interruption of beams of light by a
pointing finger, joystick, etc.

Cash drawer 64 is a metal cash drawer which can be opened by the
computer in the cabinet of the supervisor module. For example, if a consumer
intends to pay in cash and his/her order is finished, then the consumer would
walk over to the supervisor module and give the supervisory employee cash. The
supervisory employee would enter the amount of cash into the computer via
the operator keyboard 57. The computer would then open the cash drawer 64
to deposit the payment and to make change, if necessary, for the consumer.
In the embodiment of the present invention shown in Figure 2, a separate
cash drawer is used for each robot that the supervisor module supervises.
However, one can also produce an embodiment of the present invention
whereby one cash drawer is shared by several robots. Similarly, although not
shown in FIGS. 1 or 2, one skilled in the art is aware that other means of
paying for purchases are in commercial existence. These means include
cheques, credit cards, debit cards, store vouchers, and store cards. Apparatus
to process such means of payment, as well as apparatus that automatically
reads legal currency and provides coin change, is commercially available and
can be built into the robot module of FIG. 1 to allow the consumer to
automatically pay for his/her order. For example, a commercially available
credit card reader apparatus could be attached to pole 30. The consumer
would place his/her credit card in such apparatus at the end of the order to
pay for the order without any assistance by the human supervisory employee.
Similarly, it is possible to envision a commercially available currency reader
to be attached to pole 30 to allow the consumer to pay for the order with cash
without any assistance by the human supervisory employee.

Functional Description

Turning now to FIG. 3, there is shown a block diagram corresponding
to the preferred embodiment of the automatic POS machine shown in FIGS. 1
and 2. The components of the robot module and the components of the
supervisor module (ie, the portion of the supervisor module devoted to that
robot) are connected by a cable 140. In the preferred embodiment, cable 140
is composed of video cable capable of transmitting higher bandwidth video
signals, lower capacity audio cable and data communication cable for
transmitting the data processing signals to and from the communication ports
109 and the keyboard encoder 122.

Note that FIG. 3 is composed of three largely independent systems.
These can be considered as the 'video system', the 'audio system' and the
'information system'.

The 'video system' of the robot module consists of the color
sensor/video camera 18, the black and white surveillance video camera 32, and the
black and white video monitor 31 which displays the image from camera 32.
(If in another embodiment sensor 18 consists of dimensional measuring and
volume measuring sensors as well as a video camera, then please note that
only the video camera portion would be part of the 'video system'. The
dimensional and volume measuring sensors would interface with the
'information system'.) Signals from the color camera 18 and the surveillance
camera 32 are sent to the supervisor module. At the supervisor module,
monitor switch 60 allows the supervisory employee to decide whether to
display on video monitor 51 the image from the camera 18 and/or the image
from the surveillance camera 32. One purpose of the 'video system' is to allow
the supervisory employee to see what items are being placed in the sac 21 on
the packing scale 23. Occasionally items may not have a bar coded label and
the supervisory employee may be expected to enter a code or to approve a
product number chosen by the consumer. As well, it is useful for the
supervisory employee to occasionally check if the contents of the bag
correspond with the products scanned (in addition to the automatic weight
checking that the machine performs for all products). Another purpose of the
'video system' is to allow the supervisory employee to see what the consumer
is doing. If the consumer requires assistance and speaks to the supervisory
employee via the microphone 13, the supervisory employee will be better able
to aid the consumer since the employee can see via video monitor 51 what the
consumer is doing right or wrong. Another purpose of the 'video system' is to
psychologically deter the consumer from trying to defraud the
machine. By displaying the video image of the consumer on video monitor 31
located in the robot module, the consumer is constantly reminded that his/her
actions are being monitored and thus is less likely to try to defraud the
machine.

The 'audio system' of the robot module consists of microphone 13
which attaches to preamplifier 101, and speaker system 12 driven by audio
amplifiers 102, 103, and 104. The 'audio system' of the supervisor module
consists of microphone 61 which attaches to microphone switch 62 which
attaches to preamplifier 127 and speaker system 126 which is driven by audio
amplifiers 123, 124, and 125. One purpose of the 'audio system' is to allow two
way audio communication between the consumer and the supervisory
employee. The consumer can ask questions, for example, via microphone 13
which attaches to preamplifier 101 and whose signal is reproduced by speaker
system 126 of the supervisor module. The supervisory employee can respond
to questions via microphone 61 which is switched to a particular robot module
via switch 62 and which then attaches to preamplifier 127 whose signal is
reproduced by speaker system 12 of the robot module. Speaker systems 12
and 126 also receive and reproduce digitized voice and tonality signals from
the 'information system'. For example, if the 'information system' wants the
user to place sac 21 on the storage scale 29, the 'information system', via the
voice digitizer circuit 121, will send a human sounding voice to the robot
module and the supervisor module speaker systems 12 and 126. This voice
would instruct the consumer, for example, to place sac 21 on storage scale 29.
For example, if the consumer presses an incorrect button, the 'information
system' may send a thudding tonality signal via the tone circuit 116 to speaker
systems 12 and 126.

The remainder of the components shown in FIG. 3 can be taken to
make up the 'information system'. The 'information system' is controlled by
the CPU (Central-Processing-Unit) 120. Many powerful, compact and yet
economical CPU's are commercially available. As one skilled in the art
recognizes, CPU 120 can retrieve computer programs from magnetic disk
drive 118 and from ROM (read-only-memory) program memory 117. Magnetic
disk drive 118 is also used to store information such as product codes of the
store's inventory, prices, other product specific information, images of
products, images intended to help the user use the machine, and digitized
representations of various words and phrases. For timely operations, it is
advantageous for CPU 120 to process data stored temporarily in the RAM
(random-access-memory) 119. As one skilled in the art knows, it is possible to
construct CPU 120, RAM 119, and program and data storage circuit
equivalent to magnetic disk drive 118 and ROM 117, from discrete transistors,
resistors, capacitors and interconnecting wires. However, advances in
technology have allowed the thousands of transistors required for an
appropriate CPU 120, an appropriate RAM 119, an appropriate ROM 117
and an appropriate magnetic disk drive 118 to be placed on a relatively small
number of integrated circuits. Advances in technology have also allowed one
or two small rotating rigid magnetic platters to form the mechanical basis for
an appropriate magnetic disk drive 118. As one skilled in the art knows, the
algorithm which controls the CPU 120 can be implemented with discrete
transistors, resistors, capacitors or can be implemented entirely in the ROM
117. However, due to advances in technology, as one skilled in the art is
aware, algorithms controlling CPU's are largely kept on magnetic disk
(occasionally tape) drives. By keeping algorithms stored on magnetic disk
drives, future modification becomes simple as it is easy to read and write
programs from and to magnetic disk drives. As well, due to advances in
technology, many of the algorithms for controlling what is often described as
the 'low-level functions', ie, the creation and movement of the data
communication signals, are commercially available from numerous sources. In
the present invention, it would seem that the algorithm, or program,
controlling the operation of CPU 120 is somewhat removed from the physical
basis of the invention. However, in reality, it is simply that current technology
makes it economically advantageous to use several layers of algorithms,
whereby the lower layers are inexpensive, generically available algorithms.

Although, as mentioned above, the 'video system', the 'audio system'
and the 'information system' are largely independent, the 'information system'
does in fact send audio signals to the 'audio system'. CPU 120 can instruct
tone circuit 116 to produce various tones, eg, beeps, thuds, alarm tones, which
are then sent to the speaker system 126 in the supervisor module and the
speaker system 12 in the robot module. Similarly CPU 120 can instruct the
voice digitizer circuit 121 to reconstruct various digitized words or phrases,
whose digital representations are currently in RAM 119, and to send the
reconstructed audio signal to the speaker system 126 in the supervisor module
and the speaker system 12 in the robot module.

CPU 120 can instruct the graphical processing circuitry 132 to display
characters representing prices, product descriptions, etc, in various colors, on
the supervisor module's video display terminal 53 and simultaneously on the
robot module's video display terminal 11. CPU 120 can also instruct the
graphical processing circuitry 132 to reconstruct various digitized video images,
whose digital representations are currently in RAM 119, and to display these
images on video display terminals 53 and 11. Such images can consist of
illustrations showing the customer how to use the machine, eg, scanning
products, placing products in the bags, pressing buttons, etc; images
corresponding to products being scanned or those which the customer must
select from; images consisting of characters in fonts which are generally larger
than is usual for characters to be displayed on video display terminals.

The customer can communicate with the 'information system' via
buttons (generally momentary contact switches) 1 to 10, strategically located
around the video display terminal 11. For example, if a product does not have
a bar coded product code, it is necessary for the customer to press one of the
above buttons to indicate this to the 'information system'. Similarly, the
supervisory employee can communicate with the 'information system' via the
supervisor keyboard 57. For example, if the supervisory employee must
visually approve a product which does not have a bar coded product code,
then he/she will have to press an appropriate button on the supervisor
keyboard 57. Buttons 1 to 10 and the supervisor keyboard 57 attach directly,
or send an encoded data signal, to keyboard encoder 122. Keyboard encoder
122 transforms the signals from buttons 1 to 10 and from the supervisor
keyboard 57 into data signals compatible with CPU 120, to which the
keyboard encoder 122 is attached.

CPU 120 communicates with modem 108, the laser scanner 14, the
packing scale 23, the storage scale 29, the government regulated weight display
105, the lane status lamp 106, the receipt printer 55 and the cash drawer 64
via the communication ports circuitry 109 and respectively individual
communication ports 110, 111, 112, 113, 114, and 115. Note that in the shown
configuration communication port 114 sends signals to relay board 107 which
in turn controls the weight display 105 and the lane status lamp 106. Note also
that in the shown configuration, communication port 115 communicates
indirectly with the cash drawer 64 via the receipt printer 55. If the receipt
printer 55 receives a predetermined unique string of character(s), then it will
in turn send a signal to cash drawer 64 causing it to open.
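
The port assignments described above can be summarized as a simple mapping. The sketch below is illustrative only; the device labels are taken from the text, but the drawer-opening control string is an assumption, since the patent says only that a predetermined unique string of characters is sent to the printer.

    # Illustrative sketch (not from the patent): the communication port layout
    # described above, and the indirect cash drawer control via the printer.

    COMM_PORTS = {
        110: "modem 108",
        111: "laser scanner 14",
        112: "packing scale 23",
        113: "storage scale 29",
        114: "relay board 107 (weight display 105, lane status lamp 106)",
        115: "receipt printer 55 (cash drawer 64 opens via the printer)",
    }

    DRAWER_OPEN_STRING = b"\x1b\x70"  # hypothetical control sequence

    def open_cash_drawer(write_to_port):
        """Open cash drawer 64 indirectly: send the predetermined string to the
        receipt printer on port 115, which then signals the drawer to open."""
        write_to_port(115, DRAWER_OPEN_STRING)
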
The functions of laser scanner 14, packing scale 23 and storage scale
29 have been discussed above. Laser scanner 14 will read a bar coded label
placed in the path of its laser beam and will convert the information conveyed
by the bar coded label into a representation of the product code which can be
sent to the CPU 120 via port 111. Packing scale 23 will convert the weight of
the products placed on its weighing platform 22 into a data signal which can
be sent to the CPU 120 via port 112. Note that packing scale 23 sends a signal
to the government regulated weight display 105. In many localities, the law
requires that customers be shown the weight registered by a scale which is to
be used to weigh products whose price is determined by weight. In cases
where the customer is not required to see the actual weight on the scale, or
if the weight is shown instead on video display terminal 11, CPU 120 is able
to turn off the government regulated weight display via port 114 and relay
board 107. CPU 120 is also able to turn on and off, via port 114 and relay
board 107, lane status lamp 106. Lane status lamp 106 is an optional feature
not shown in FIG. 1. Lane status lamp 106 is a lamp which is generally
mounted on pole 30 or on top of camera 18 and indicates to customers that
the lane is available for service. Although not shown in the present
configuration, it would be possible to include several such lamps and place
them on top of the storage scale 29, the packing scale 23 and other locations to
help the customer use the machine properly. For example, when the customer
is to move sac 21 from the packing scale to the storage scale 29, the CPU
120 could cause a lamp mounted on the storage scale to turn on so as to
prompt the customer.
Modem 108 allows the 'information system' to communicate with other computer systems. Modem 108 attaches to CPU 120 via communication port 110 and communication circuitry 109. As one skilled in the art is aware, numerous commercially available modems exist which transmit data signals over ordinary phone wires, over specialized phone wires, over local computer networks, asynchronously, synchronously, to microcomputers, to minicomputers and to mainframe computers. A typical use of the present invention will be to have numerous robot-supervisor modules report to a centralized computer system. In such a case, the modem 108 would transmit inventory changes to the central computer system. In such a system the central computer system would transmit price changes and new product information to the CPU 120 via the modem 108. As well, the computer program controlling the CPU 120, stored on magnetic disk drive 118, could be changed by the central computer system via appropriate commands to the CPU 120 via modem 108.
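As a rough illustration of the exchange described above, the sketch below encodes an inventory report for upload and applies a price/product update received from the central system. The message format is an assumption; no particular protocol is prescribed by the text.

```python
# Sketch: store/central-computer exchange over modem 108.
# The JSON message layout is hypothetical.
import json

def build_inventory_report(sales: dict) -> bytes:
    """Encode inventory changes (product code -> units sold) for upload."""
    return json.dumps({"type": "inventory_delta", "sold": sales}).encode()

def apply_central_update(message: bytes, product_table: dict) -> None:
    """Apply price changes / new product records received from the central system."""
    update = json.loads(message)
    for code, record in update.get("products", {}).items():
        product_table.setdefault(code, {}).update(record)

table = {}
apply_central_update(b'{"products": {"0001": {"price": 1.99}}}', table)
print(table)  # -> {'0001': {'price': 1.99}}
```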
Logic Description
FIG. 4 is a flow-chart describing the overall function of the 'information system' of the present invention. As mentioned above, current technology makes it economically advantageous to use several layers of algorithms, whereby the lower layers are inexpensive, generically available algorithms. The high-level algorithm shown in FIG. 4 along with textual discussion of this algorithm is sufficient to allow one skilled in the art to construct a working automatic point-of-sale machine. One skilled in the art will also realize that the algorithm shown in FIG. 4 is only one of many possible algorithms which could be used to control the function of the automatic point-of-sale machine.

Referring now to Section A of FIG. 4, this shows the highest algorithm level and is appropriately called the 'Main Algorithm'. When power is applied to the automatic point-of-sale machine and hence to the 'information system' of the latter, the 'Main Algorithm' commences with an initialization routine. The initialization routine, like all the routines shown in FIG. 4, is actually an algorithm. This algorithm is a layer below the 'Main Algorithm' and itself makes use of other algorithms on again even lower levels and so on. The lowest layer of algorithms are those that present and receive 1's and 0's from the CPU 120. Only the high level algorithms are shown in FIG. 4 since many of the lower level algorithms are common, commercially available algorithms, or simple variants thereof, which one skilled in the art would already be familiar with. The initialization routine would typically call other algorithms to initialize the communication port circuitry 109, to transfer files from the magnetic disk drive 118 to RAM 119, etc.
After initialization, the video display terminal 11 displays a graphical message to the customer to press any button to begin checkout of one's order. The CPU 120 is instructed to wait for a button 1 to 10 to be pressed. If a customer wishes to use the automatic point-of-sale machine, then he/she will press any button to commence operations. At this point the algorithm instructs the CPU 120 to collect various information from the customer. One useful piece of information is whether the customer has used this machine previously or if he/she is a beginner. The next step is to prompt the customer, via digitized images on the video display terminal 11 and via digitized human-sounding voice phrases from speaker system 12, to place a bag in the bag holders 19 and 20. This prompting algorithm would then have the user press a button to indicate that the bag is in place.
The 'Main Algorithm' now checks three conditions (each, of course, composed of numerous sub-conditions): Has an unauthorized weight change occurred on packing scale 23 or on the storage scale 29? Has the laser scanner 14 read a bar code? Has the user pressed any button 1 to 10 or has the supervisory employee pressed any key on the supervisor keyboard 57?
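The three checks made at point 'B' can be expressed as a simple dispatch over the current readings. The following is a minimal sketch of that decision, assuming an illustrative weight tolerance and hypothetical event names for the sub-algorithms.

```python
# Sketch of the condition checks made at point 'B' of the 'Main Algorithm'.
# Threshold and return values are illustrative, not specified in the text.

WEIGHT_TOLERANCE = 0.02  # kg; stands in for the predetermined error margin

def next_event(expected_packing, expected_storage,
               packing_weight, storage_weight,
               scanned_code=None, key=None):
    """Return which sub-algorithm the 'Main Algorithm' should run next."""
    if (abs(packing_weight - expected_packing) > WEIGHT_TOLERANCE or
            abs(storage_weight - expected_storage) > WEIGHT_TOLERANCE):
        return "weight_change_algorithm"
    if scanned_code is not None:
        return "scan_algorithm"
    if key is not None:
        return "key_press_algorithm"
    return "keep_polling"

# Example: an item was placed in the bag without being scanned first.
print(next_event(1.00, 0.00, 1.45, 0.00))  # -> weight_change_algorithm
```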
Let us consider the case whereby the customer tries to steal an item by placing it directly into sac 21 without scanning it first. When the 'Main Algorithm' checks to see if an unauthorized weight change has occurred, it calls lower algorithms which provide the current weight on the packing scale 23 and on the storage scale 29. If the current weight on a particular scale differs by greater than a predetermined error margin, then weight has been added to or removed from the scale, whichever the case may be. Thus, the 'Main Algorithm' will consider the condition of whether an unauthorized weight change has occurred to be true and will, as shown, transfer control to the 'Weight Change Algorithm'. Section B of FIG. 4 is a flow-chart of the 'Weight Change Algorithm'. In the above case where the customer placed an object into the sac 21 without scanning it in an attempt to avoid paying for the item, the 'Main Algorithm' would have determined that unauthorized weight had been added to the packing scale 23. Thus the 'Weight Change Algorithm' would display an appropriate digitized video image on the video display terminal 11 and play an appropriate digitized human audio message from speaker system 12 prompting the customer to remove the item from the sac 21. At the end of the prompt, the 'Weight Change Algorithm' checks to see if the weight on the packing scale 23 is back to the previous weight, ie, has the item been removed. If it is back to the previous weight then the 'Weight Change Algorithm' ends and control is transferred back to point 'B' on the 'Main Algorithm'. If the weight has not returned back to the previous value, or if the customer has tried to remove a different item resulting in a lower weight but one not equal to the previous value, then the visual and audio prompt is repeated. Note that the supervisory employee can press a button on the supervisor keyboard 57 to leave the 'Weight Change Algorithm' and return back to point 'B' on the 'Main Algorithm'.
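A minimal sketch of this prompt-and-recheck loop is given below. The tolerance value is illustrative, and the prompting is represented by a simple print; the callables stand in for the lower-level scale-reading and supervisor-override algorithms.

```python
# Sketch of the 'Weight Change Algorithm' (Section B of FIG. 4):
# prompt until the packing scale returns to its previous weight, or until
# the supervisory employee overrides.

WEIGHT_TOLERANCE = 0.02  # kg; illustrative error margin

def weight_change_algorithm(previous_weight, read_weight, supervisor_override):
    """read_weight() returns the current packing-scale weight;
    supervisor_override() is True if a supervisor key was pressed."""
    while True:
        print("Please remove the unscanned item from the bag.")  # visual/audio prompt
        if abs(read_weight() - previous_weight) <= WEIGHT_TOLERANCE:
            return "resolved"             # weight is back; return to point 'B'
        if supervisor_override():
            return "supervisor_override"  # supervisor returns control to point 'B'

# Example: the customer removes the item on the second prompt.
readings = iter([1.45, 1.00])
print(weight_change_algorithm(1.00, lambda: next(readings), lambda: False))
```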
Let us assume that the customer has taken the item in question out of the sac 21 in the above case. Thus control has passed back to the 'Main Algorithm' where the latter is continually examining whether an unauthorized weight change has occurred, whether a bar code has been scanned or whether a key has been pressed. Now let's assume that the customer scans the item over the laser scanner 14 and then places the item in the sac 21. The laser scanner 14 will convert the bar code into the corresponding product code and send this code via the port 111 and the communication port circuitry 109 to the CPU 120. Thus the condition 'Scan Received' will become true and thus, as shown in FIG. 4, control will go to the 'Scan Algorithm'.
Section C of FIG. 4 is a flow-chart of the 'Scan Algorithm'. The 'Scan Algorithm' first takes the product code and looks up information for this product code. Lower level algorithms are used to maintain a database of all product items and to allow quick retrieval from such a database. The product information for a given product code would typically consist of price, description, weight, weight tolerances to accept, tax information, inventory information, and discount information. The 'Scan Algorithm' then calls an algorithm which waits for an increase in weight on the packing scale 23. When this weight increase has occurred and the weight reading from scale 23 is considered stable, the 'Scan Algorithm' considers the condition of whether the weight increase on packing scale 23 is within the weight range specified by the product information for that product. If the weight increase is considered within range, then the 'Scan Algorithm' goes to the next step where it causes receipt printer 55 to add the product to the receipt. The product description and price, as well as the current total price of the order, are displayed on the video display terminal 11 (as well as video display terminal 53). The 'Scan Algorithm' then ends and control is transferred back to point 'B' on the 'Main Algorithm'. If, on the other hand, the weight increase is not within the specified range, the 'Scan Algorithm' will transfer control to the 'Weight Change Algorithm'. As described above, the 'Weight Change Algorithm' will prompt the user to remove the item from the grocery sac.
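A minimal sketch of this lookup-and-verify step is shown below. The product table entry and the sample product code are illustrative; the text only states that the lookup record holds price, description, weight, weight tolerances, tax, inventory and discount information.

```python
# Sketch of the 'Scan Algorithm' (Section C of FIG. 4).
# PRODUCT_TABLE contents and the sample code are illustrative.

PRODUCT_TABLE = {
    "061200001235": {"description": "1L milk", "price": 1.89,
                     "weight": 1.06, "tolerance": 0.05},
}

def scan_algorithm(product_code, weight_increase):
    """Return ('accepted', line item) if the observed weight increase on
    packing scale 23 matches the product's stored weight; otherwise hand
    control to the 'Weight Change Algorithm'."""
    info = PRODUCT_TABLE[product_code]
    if abs(weight_increase - info["weight"]) <= info["tolerance"]:
        receipt_line = f"{info['description']}  {info['price']:.2f}"
        return "accepted", receipt_line        # print, display, back to 'B'
    return "weight_change_algorithm", None     # out-of-range weight added

print(scan_algorithm("061200001235", 1.04))  # -> ('accepted', '1L milk  1.89')
```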
Let us assume that control has passed back to the 'Main Algorithm' where the latter is continually examining whether an unauthorized weight change has occurred, whether a bar code has been scanned or whether a key has been pressed. Now let's assume that the customer has an item which has no bar code label. When the 'Main Algorithm' is continually examining whether an unauthorized weight change has occurred, whether a bar code has been scanned or whether a key has been pressed, it displays on the video display terminal 11 ten arrows pointing to the ten buttons 1 to 10. Each arrow is labelled. For example, let us consider an embodiment of the present invention whereby the arrow to button 1 is labelled 'HELP', the arrow to button 2 is labelled 'NO BAR CODE', the arrow to button 3 is labelled 'CHANGE BAG', the arrow to button 4 is labelled 'END ORDER', the arrow to button 5 is labelled 'COUPON' and the arrows to buttons 6 to 10 are not labelled. The customer will thus press button 2, which corresponds to the label 'NO BAR CODE' on the video display terminal 11. The customer then places the item in sac 21.
The condition 'Key Pressed' will become true after the customer presses button 2 ('NO BAR CODE'). Thus, control will pass from the 'Main Algorithm' to the 'Key Press Algorithm'. Section D of FIG. 4 is a flow-chart of the 'Key Press Algorithm'. As shown in this figure, since the condition 'No Bar Code Key Pressed' is true, the 'Key Press Algorithm' calls the 'No Code Algorithm'. In the case of a user using the automatic point-of-sale machine for one of his/her first times, the 'No Code Algorithm' alerts the supervisory employee with a visual message on video display terminal 53 and an audio message from speaker system 126 that an item having no bar code has been placed in sac 21. The supervisory employee will examine the video image of sac 21 transmitted by camera 18 and displayed on video monitor 51 and, via the supervisor keyboard 57, key in the product code or a product description which a lower-level algorithm can use to determine the product code. In the case of an experienced customer, the 'No Code Algorithm' will present the customer with a menu of choices. Such a menu consists of a graphical image displayed on video display terminal 11 consisting of ten arrows pointing to the ten buttons 1 to 10, each with a label of a product choice or another sub-menu to choose from. After the customer has chosen the product, the supervisory employee is prompted to examine the video image of the sac 21 transmitted by camera 18 to video monitor 51 and to approve or reject the choice. If the customer made a mistake or intentionally chose a cheaper product, the rejection by the supervisory employee will cause the 'No Code Algorithm' to start over again. In any case, when the 'No Code Algorithm' is successfully completed, control transfers back to point 'B' on the 'Main Algorithm'.
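The two paths of the 'No Code Algorithm', beginner and experienced, can be sketched as below. The parameter names, return values and sample product codes are illustrative stand-ins for the lower-level algorithms and menu contents.

```python
# Sketch of the 'No Code Algorithm': a beginner's item is keyed in by the
# supervisory employee; an experienced customer picks from menus on the
# video display terminal 11, subject to supervisor approval.

def no_code_algorithm(experienced, customer_choice=None,
                      supervisor_entry=None, supervisor_approves=True):
    if not experienced:
        # Supervisor inspects the camera 18 image on monitor 51 and keys in
        # the product code (or a description resolved to a code).
        return {"product_code": supervisor_entry, "entered_by": "supervisor"}
    # Experienced customer: menu choice must be approved against the image.
    if supervisor_approves:
        return {"product_code": customer_choice, "entered_by": "customer"}
    return "restart"   # mistaken or fraudulent choice: start over

print(no_code_algorithm(False, supervisor_entry="4011"))        # beginner path
print(no_code_algorithm(True, customer_choice="4011",
                        supervisor_approves=False))             # rejected choice
```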
Let us consider the other buttons which the customer can press. As mentioned above, let us consider an embodiment of the present invention whereby the arrow to button 1 is labelled 'HELP', the arrow to button 2 is labelled 'NO BAR CODE', the arrow to button 3 is labelled 'CHANGE BAG', the arrow to button 4 is labelled 'END ORDER', the arrow to button 5 is labelled 'COUPON' and the arrows to buttons 6 to 10 are not labelled. If button 1 ('HELP') is pressed then control is transferred to the 'Key Press Algorithm' which in turn calls the 'Help Algorithm'. The 'Help Algorithm' alerts the supervisory employee and prompts the customer to speak into microphone 13. Microphones 13 and 61 and speaker systems 12 and 126 allow the customer and the supervisory employee to carry on a two-way conversation. As well, the supervisory employee can press the monitor switch 60 to display the image from camera 32 on video monitor 51, which is the video image of the customer and his/her immediate surroundings. Section D of FIG. 4 shows that after the 'Help Algorithm' is finished, control returns to point 'B' on the 'Main Algorithm'. This is the general case, although not shown is the possibility for the supervisory employee to branch to different parts of the 'Main Algorithm' as well as various lower level algorithms.
We have already considered the case of pressing the 'NO BAR CODE' button 2. Let us now consider the case of pressing the 'CHANGE BAG' button 3. If the customer has a large order requiring several bags, then when the customer wants to use a new bag, he/she should press the 'CHANGE BAG' button 3. Control is transferred from the 'Main Algorithm' to the 'Key Press Algorithm' and in turn to the 'Change Bag Algorithm'. The 'Change Bag Algorithm' prompts the customer to transfer bag 21 to platform 28 or the hooks 27 on the platform 28 of the storage scale 29. The customer is prompted via a digitized video image on the video display terminal 11 and via a digitized human-sounding voice from the speaker system 12. The customer is asked to transfer bag 21 to the storage scale 29 and then place a new bag on the bag holders 19 and 20 of packing scale 23. The customer is asked to press any button 1 to 10 when ready. At this point the 'Change Bag Algorithm' verifies that the weight increase on storage scale 29 is equal to the previous weight on packing scale 23. If the customer tried to add an extra non-scanned item to the storage scale during the changing of bags or tried to swap an inexpensive item with a more expensive non-scanned item, then there will generally be a weight discrepancy and the 'Change Bag Algorithm' will ask the user to correct the situation repeatedly until the weight on the storage scale is within a predetermined tolerance range. When the 'Change Bag Algorithm' is successfully completed, control passes back to point 'B' on the 'Main Algorithm'.
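The weight check at the heart of the 'Change Bag Algorithm' amounts to comparing the increase on the storage scale with the weight previously on the packing scale. The following is a minimal sketch with an illustrative tolerance.

```python
# Sketch of the weight check in the 'Change Bag Algorithm': the increase on
# storage scale 29 must equal the weight previously on packing scale 23.

TOLERANCE = 0.02  # kg; illustrative

def change_bag_ok(packing_weight_before, storage_before, storage_after):
    """True if the full bag was moved to the storage scale without items
    being added or swapped during the change of bags."""
    transferred = storage_after - storage_before
    return abs(transferred - packing_weight_before) <= TOLERANCE

print(change_bag_ok(3.20, 0.00, 3.21))  # True: bag transferred as expected
print(change_bag_ok(3.20, 0.00, 3.90))  # False: extra, non-scanned weight
```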
Let us now consider the case of pressing the 'END ORDER' button 4. When the customer has completed scanning and bagging his/her order, he/she should press the 'END ORDER' button 4. Control is transferred from the 'Main Algorithm' to the 'Key Press Algorithm' and in turn to the 'End Order Algorithm'. The 'End Order Algorithm' prompts the customer, via the video display terminal 11 and speaker system 12, for any final information required such as delivery choices and payment modalities. The typical embodiment of the present invention then instructs the customer to pay the human supervisory employee. However, it is not hard to imagine other embodiments which use commercially available magnetic credit card readers for credit or debit card payment, commercially available electronic debit card readers for debit card payment or commercially available currency readers for automatic cash payment. In the typical embodiment, after the supervisory employee has received payment, the customer is given the receipt for the order. If a cash payment was made then the 'End Order Algorithm' will instruct the port 115 to signal the receipt printer 55 to open the cash drawer 64. The 'End Order Algorithm' then makes sure that there have been no unauthorized weight changes on packing scale 23 or storage scale 29. The customer is now free to remove his/her bags from the packing scale 23 and the storage scale 29. Note that when the 'End Order Algorithm' finishes, control returns to point 'A' on the 'Main Algorithm', ie, the automatic point-of-sale machine waits for the next order.
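The bookkeeping of the 'End Order Algorithm' can be sketched as follows; the payment-mode names, tolerance and drawer callback are illustrative.

```python
# Sketch of the 'End Order Algorithm': open cash drawer 64 for a cash
# payment and confirm no unauthorized weight change before the bags leave.

TOLERANCE = 0.02  # kg; illustrative

def end_order(payment_mode, expected_packing, expected_storage,
              packing_weight, storage_weight, open_drawer):
    if payment_mode == "cash":
        open_drawer()  # port 115 -> receipt printer 55 -> cash drawer 64
    weights_ok = (abs(packing_weight - expected_packing) <= TOLERANCE and
                  abs(storage_weight - expected_storage) <= TOLERANCE)
    # On success, control returns to point 'A': wait for the next order.
    return "point A" if weights_ok else "weight_change_algorithm"

print(end_order("cash", 3.2, 5.0, 3.2, 5.0, lambda: print("drawer open")))
```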
Let us now consider the case of pressing the 'COUPON' button 5. When the customer has a discount coupon for a particular product or perhaps a general credit voucher, he/she should press the 'COUPON' button 5. Control is transferred from the 'Main Algorithm' to the 'Key Press Algorithm' and in turn to the 'Coupon Algorithm'. In the case of a user using the automatic point-of-sale machine for one of his/her first times, the 'Coupon Algorithm' will simply have the receipt printer 55 print a short note or a symbol that will alert the cashier at the time of payment that there is a credit adjustment to be made. In the case of a more experienced user, the 'Coupon Algorithm' will prompt the user to enter the amount of the coupon or voucher via a human-sounding voice from speaker system 12 and via a graphical message displayed on the video display terminal 11. The image on the video display terminal 11 will consist of the arrows pointing to the ten buttons 1 to 10 labelled '1' to '10' so that the customer is able to use buttons 1 to 10 to enter the monetary amount of the coupon or the voucher. In the future, coupons that have bar codes on them will become more widespread. For the case of such coupons, the customer need only scan the coupon over the laser scanner 14 instead of having to enter the coupon amount. After the 'Coupon Algorithm' has successfully finished, control passes back to point 'B' on the 'Main Algorithm'. Note that the graphical image displayed on the video display terminal 11 changes back to the usual image that displays arrows pointing to the buttons labelled 'HELP', 'NO BAR CODE', 'CHANGE BAG', 'END ORDER' and 'COUPON', as discussed above.
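The three coupon-handling branches described above, printed cashier note, keyed-in amount, and scanned bar-coded coupon, are sketched below with illustrative return values.

```python
# Sketch of the 'Coupon Algorithm' branches: beginners get a printed note
# for the cashier, experienced users key in the amount with buttons 1 to 10,
# and bar-coded coupons are simply scanned.

def coupon_algorithm(experienced, keyed_amount=None, coupon_bar_code=None):
    if coupon_bar_code is not None:
        return {"action": "lookup_coupon", "code": coupon_bar_code}
    if not experienced:
        return {"action": "print_cashier_note"}       # adjust at payment time
    return {"action": "credit", "amount": keyed_amount}

print(coupon_algorithm(False))                        # beginner
print(coupon_algorithm(True, keyed_amount=0.50))      # keyed-in 50-cent coupon
```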
In the embodiment of the present invention that is being considered here, buttons 6 to 10 have no particular label or significance for the 'Main Algorithm' at point 'B' of the algorithm, FIG. 4. If one of the buttons 6 to 10 is pressed, the condition 'Key Pressed' becomes true so that control is passed to the 'Key Press Algorithm'. However, none of the primary conditions of the 'Key Press Algorithm' becomes true, so control passes back to point 'B' of the 'Main Algorithm' without any particular operations occurring. (Of course, one can envision equivalents of the present embodiment of the invention where pressing such a key causes a prompt such as a thudding sound from speaker 12 to occur.)

It is occasionally necessary for the supervisory employee to enter a product for a customer or make a correction. If the supervisory employee presses a key on the supervisor keyboard 57 then control passes to the 'Key Press Algorithm' and in turn to the 'Operator Algorithm'. The 'Operator Algorithm' consists of a series of conditional tests, similar to the structure of the 'Key Press Algorithm', which acts appropriately depending on which key on the supervisor keyboard 57 was pressed. For example, if the supervisory employee pressed a key to allow the customer to remove an item from the sac 21 he/she decided at the last minute he/she did not want to purchase, then the 'Operator Algorithm' would call a lower-level 'Remove Item Algorithm' which would in turn call lower-level algorithms to reduce the total amount of the order, to print a correction on the receipt via receipt printer 55, to verify the new weight on packing scale 23, etc.
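The button dispatch of the 'Key Press Algorithm' and the correction path of the 'Operator Algorithm' can be sketched together. Button numbering follows the labelled embodiment discussed above; handler names, the supervisor-key convention and the receipt note format are illustrative.

```python
# Sketch of the 'Key Press Algorithm' dispatch and the lower-level
# 'Remove Item Algorithm' called by the 'Operator Algorithm'.

BUTTON_HANDLERS = {
    1: "help_algorithm",
    2: "no_code_algorithm",
    3: "change_bag_algorithm",
    4: "end_order_algorithm",
    5: "coupon_algorithm",
}

def key_press_algorithm(key):
    if isinstance(key, int):                 # customer buttons 1 to 10
        # Buttons 6 to 10 have no assigned handler: fall through to point 'B'.
        return BUTTON_HANDLERS.get(key, "return_to_point_B")
    return "operator_algorithm"              # any supervisor keyboard 57 key

def operator_remove_item(order_total, item_price):
    """Reduce the total, note the correction for the receipt, and expect the
    packing-scale weight to drop by the removed item's weight."""
    return {"new_total": round(order_total - item_price, 2),
            "receipt_note": f"VOID -{item_price:.2f}"}

print(key_press_algorithm(7))             # -> return_to_point_B
print(operator_remove_item(24.37, 3.49))  # corrected order total
```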
An embodiment of the present invention may concisely be described as a self-service checkout system comprising: (a) a robot module; (b) a laser bar code scanner mounted in said robot module for generating a first electrical signal corresponding to the bar code scanned; (c) a packing scale mounted in said robot module for generating a second electrical signal corresponding to the weight on said packing scale where said packing scale is mounted in proximity to the said laser bar code scanner such that a customer can scan and bag a product with one motion; (d) attachments on the said packing scale to hold bags open and in place; (e) a video display mounted in said robot module; (f) user interface means operating in proximity to said video display generating a third electrical signal; (g) a sensor mounted above the said packing scale where said sensor generates a fourth electrical signal representative of the external characteristics of the contents of the packing bags; (h) a supervisor module to be used by a supervisory employee to supervise the operation of said robot module; (i) user interface means mounted in the said supervisor module generating a fifth electrical signal; (j) a video display mounted in said supervisor module; (k) an electronic computer having access to a product lookup table and receiving said first, second, third, fourth and fifth electrical signals; and (l) a computer program causing said electronic computer, in the case of a product containing a machine readable bar code, to look up in the said product lookup table the allowable weight for the product and to verify correspondence with the weight addition on the said packing scale, and in the case of a product without a valid machine readable bar code to present the customer with a series of choices to identify the product including the option of requesting the said supervisory employee to identify the product.
The high-level algorithms shown in FIG. 4 along with textual discussion of these algorithms are intended not as a comprehensive discussion of the algorithms used in an embodiment of the present invention, but only to be sufficient to allow one skilled in the art to construct a working automatic point-of-sale machine. One skilled in the art will be capable of producing or obtaining the lower-level algorithms dictated by the algorithms shown in FIG. 4. The set of algorithms shown in FIG. 4 is only one of many possible sets of algorithms which could be used to control the function of the automatic point-of-sale machine. Using no more than routine experimentation it is possible to produce many equivalent sets of algorithms. Similarly, using no more than routine experimentation it is possible to add numerous features to the set of algorithms shown in FIG. 4. For example, a feature could be added to the 'Scan Algorithm' shown in Section C of FIG. 4, whereby if the product information indicated that the product was heavy or of large size, then the customer would be prompted to place the product directly on the storage scale 29 instead of the packing scale 23. This algorithm could also be modified so that if the product information indicated that another product had a similar weight, then the supervisory employee should be prompted to verify that the correct product has been placed in the sac 21 or on the storage scale 29, whichever the case may be. The 'No Code Algorithm' could be given a feature such that if the supervisory employee is very busy or cannot respond within several seconds, then for the case of an experienced customer who has indicated the product placed in sac 21 via buttons 1 to 10, in response to choices presented on the video display terminal 11, the product will by default be approved, so that the customer does not have to wait an unreasonable amount of time for the supervisory employee to approve or reject the item.
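The timeout-based default approval suggested here can be sketched in a few lines; the timeout length and return values are illustrative, not specified in the text.

```python
# Sketch of the suggested 'No Code Algorithm' refinement: if the supervisory
# employee does not respond within a few seconds, an experienced customer's
# menu identification is approved by default.

APPROVAL_TIMEOUT_S = 5.0  # illustrative

def resolve_no_code_item(experienced, supervisor_response, waited_seconds):
    """supervisor_response is 'approve', 'reject' or None (no response yet)."""
    if supervisor_response is not None:
        return supervisor_response
    if experienced and waited_seconds >= APPROVAL_TIMEOUT_S:
        return "approve"          # default approval after the timeout
    return "keep_waiting"

print(resolve_no_code_item(True, None, 6.2))   # -> approve
print(resolve_no_code_item(False, None, 6.2))  # -> keep_waiting
```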
As mentioned above, for the sake of simplicity, in the embodiment being discussed here, 18 is considered to be a sensor transmitting only images of the contents of bag 21 to the supervisor module. However, as mentioned above, sensor 18 may in other embodiments contain a three-dimensional array of light beams and detectors which measure the dimensions of the customer's hand and product going to the bag 21 and the customer's empty hand returning from bag 21, thus allowing computation of the net dimensions of the product. Sensor 18 may also contain a plane of ultrasonic transducers which measure the distance from the fixed position of sensor 18 to the top of the contents of the bag 21. By noting the change in these distances after a product is placed in bag 21, it is possible to compute the volume of the product. In an embodiment where sensor 18 consists of a video camera and a light-beam dimension computing array and an ultrasonic transducer volume computing plane, the measured dimensions and volume will be verified against dimensions and volume stored for a particular product, as indicated by the product lookup table. Dimensions and volume may be verified for every single item placed in bag 21, or, as mentioned earlier, dimensions and volume may be used along with weight to determine that a non-labelled product identified by an experienced user has in fact been correctly identified, and for the small minority of cases where measured weight, dimensions and volume don't reasonably correspond with the stored values, an image of bag 21 is verified by the supervisory employee.
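One way to read the volume computation described above is that each transducer's drop in distance, multiplied by the area that transducer covers, sums to an estimate of the added product's volume. The sketch below assumes a regular transducer grid; the cell area and readings are illustrative.

```python
# Sketch of the volume estimate from the ultrasonic plane of sensor 18:
# the per-cell decrease in distance to the top of the bag contents, times
# the area each transducer covers, approximates the new item's volume.

CELL_AREA_CM2 = 4.0  # area covered by one transducer (2 cm x 2 cm), assumed

def product_volume_cm3(distances_before, distances_after):
    """Approximate the volume of the newly added product from the per-cell
    change in distance (in cm) between two ultrasonic readings."""
    return sum(max(before - after, 0.0) * CELL_AREA_CM2
               for before, after in zip(distances_before, distances_after))

before = [30.0, 30.0, 30.0, 30.0]   # cm, before the item is placed in bag 21
after = [30.0, 22.0, 22.0, 30.0]    # a product now sits under two cells
print(product_volume_cm3(before, after))  # -> 64.0 cm^3
```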
Those skilled in the art will be able to ascertain, using no more than routine experimentation, other equivalents for the method and apparatus above described. Such equivalents are to be included within the scope of the following claims.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01: As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refers to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Inactive: IPC expired 2022-01-01
Inactive: Expired (new Act pat) 2011-11-01
Letter Sent 2011-02-25
Inactive: Office letter 2010-11-22
Inactive: Correspondence - MF 2010-08-10
Letter Sent 2009-12-20
Inactive: Correspondence - Formalities 2009-10-21
Inactive: Single transfer 2009-10-16
Inactive: Late MF processed 2008-11-18
Letter Sent 2008-11-03
Inactive: Office letter 2006-11-27
Inactive: Corrective payment - s.78.6 Act 2006-11-17
Inactive: Office letter 2006-11-01
Inactive: Corrective payment - s.78.6 Act 2006-10-04
Inactive: First IPC derived 2006-03-11
Letter Sent 2005-01-25
Inactive: Correspondence - Formalities 2004-11-04
Inactive: Correspondence - Transfer 2004-11-04
Letter Sent 2004-10-01
Inactive: Office letter 2004-10-01
Inactive: Single transfer 2004-09-01
Inactive: Single transfer 2004-08-31
Inactive: Entity size changed 2002-10-08
Grant by Issuance 1999-06-29
Inactive: Cover page published 1999-06-28
Inactive: Final fee received 1999-03-10
Pre-grant 1999-03-10
Notice of Allowance is Issued 1999-01-26
Letter Sent 1999-01-26
Notice of Allowance is Issued 1999-01-26
Inactive: Status info is complete as of Log entry date 1999-01-22
Inactive: Application prosecuted on TS as of Log entry date 1999-01-22
Inactive: IPC removed 1998-11-26
Inactive: First IPC assigned 1998-11-26
Inactive: IPC assigned 1998-11-26
Inactive: IPC removed 1998-11-26
Inactive: Approved for allowance (AFA) 1998-11-26
All Requirements for Examination Determined Compliant 1996-02-20
Request for Examination Requirements Determined Compliant 1996-02-20
Application Published (Open to Public Inspection) 1993-05-02

Abandonment History

There is no abandonment history.

Maintenance Fee

The last payment was received on 1998-10-08

Note : If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Fee History

Fee Type Anniversary Year Due Date Paid Date
MF (application, 6th anniv.) - small 06 1997-11-03 1997-10-09
MF (application, 7th anniv.) - small 07 1998-11-02 1998-10-08
Final fee - small 1999-03-10
MF (patent, 8th anniv.) - small 1999-11-01 1999-10-29
MF (patent, 9th anniv.) - standard 2000-11-01 2000-10-18
MF (patent, 10th anniv.) - standard 2001-11-01 2001-10-19
MF (patent, 11th anniv.) - standard 2002-11-01 2002-10-01
MF (patent, 12th anniv.) - standard 2003-11-03 2003-10-16
Registration of a document 2004-08-31
Registration of a document 2004-09-01
MF (patent, 13th anniv.) - standard 2004-11-01 2004-10-14
MF (patent, 14th anniv.) - standard 2005-11-01 2005-10-27
2006-10-04
MF (patent, 15th anniv.) - standard 2006-11-01 2006-10-25
2006-11-17
MF (patent, 16th anniv.) - standard 2007-11-01 2007-11-01
MF (patent, 17th anniv.) - standard 2008-11-03 2008-11-18
Reversal of deemed expiry 2008-11-03 2008-11-18
Registration of a document 2009-10-16
MF (patent, 18th anniv.) - standard 2009-11-02 2009-10-20
MF (patent, 19th anniv.) - standard 2010-11-01 2010-11-01
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
FUJITSU FRONTECH NORTH AMERICA INC.
Past Owners on Record
HOWARD SCHNEIDER
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD .

If you have any difficulty accessing content, you can call the Client Service Centre at 1-866-997-1936 or send them an e-mail at CIPO Client Service Centre.


Document Description  Date (yyyy-mm-dd)  Number of pages  Size of Image (KB)
Description 1994-03-30 47 1,626
Cover Page 1994-03-30 1 14
Abstract 1994-03-30 1 29
Claims 1994-03-30 7 204
Drawings 1994-03-30 7 110
Claims 1998-11-18 7 225
Cover Page 1999-06-21 1 43
Representative drawing 1999-06-02 1 13
Representative drawing 1999-06-21 1 10
Commissioner's Notice - Application Found Allowable 1999-01-26 1 163
Courtesy - Certificate of registration (related document(s)) 2004-10-01 1 128
Courtesy - Certificate of registration (related document(s)) 2005-01-25 1 105
Maintenance Fee Notice 2008-11-27 1 172
Late Payment Acknowledgement 2008-11-27 1 165
Courtesy - Certificate of registration (related document(s)) 2009-12-18 1 103
Fees 2003-10-16 1 28
Fees 2001-10-19 1 33
Fees 1999-10-29 1 31
Correspondence 1999-03-10 1 33
Fees 1998-10-08 1 31
Fees 2002-10-01 1 31
Fees 1997-10-09 1 36
Fees 2000-10-18 1 32
Correspondence 2004-10-01 1 25
Correspondence 2004-11-04 2 43
Fees 2004-10-14 1 28
Fees 2005-10-27 1 29
Correspondence 2006-11-01 1 22
Fees 2006-10-25 1 43
Correspondence 2006-11-27 1 13
Fees 2007-11-01 1 45
Fees 2008-11-18 1 40
Correspondence 2009-10-21 3 91
Correspondence 2010-08-10 1 45
Correspondence 2010-11-22 1 17
Fees 2010-11-01 1 33
Correspondence 2011-02-25 1 14
Correspondence 2010-11-23 2 81
Fees 2010-11-01 1 34
Fees 1996-09-19 1 43
Fees 1995-10-26 1 29
Fees 1994-10-26 1 30
Fees 1993-10-27 1 39
Courtesy - Office Letter 1993-02-11 1 51
Courtesy - Office Letter 1996-02-01 1 14
PCT Correspondence 1996-02-15 1 24
Courtesy - Office Letter 1996-03-16 1 15
Courtesy - Office Letter 1996-03-15 1 14
Courtesy - Office Letter 1996-05-09 1 41
Prosecution correspondence 1996-02-20 1 49
Prosecution correspondence 1998-09-03 2 40
Prosecution correspondence 1995-01-12 1 17
PCT Correspondence 1993-01-13 1 28