Patent 2360940 Summary

(12) Patent Application: (11) CA 2360940
(54) English Title: INTERACTIVE ENTERTAINMENT SYSTEMS AND METHODS
(54) French Title: SYSTEMES ET PROCEDES DE DIVERTISSEMENT INTERACTIF
Status: Deemed Abandoned and Beyond the Period of Reinstatement - Pending Response to Notice of Disregarded Communication
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06F 03/00 (2006.01)
  • A63F 03/00 (2006.01)
  • G10L 13/00 (2006.01)
(72) Inventors :
  • SCHMIDT, CHRISTOPHER H. (United States of America)
  • JORDON, ADAM C. (United States of America)
  • PIERNOT, PHILIPPE P. (United States of America)
  • CESAROTTI, WILLIAM A. (United States of America)
  • PAPADOPOULOS, DESPINA (United States of America)
  • PETRAVIC, ROBIN G. (United States of America)
  • LAITURI, DAVID W. (United States of America)
  • WIRTSCHAFTER, JENNY DANA (United States of America)
(73) Owners :
  • INTERLEGO AG
(71) Applicants :
  • INTERLEGO AG (Switzerland)
(74) Agent: SMART & BIGGAR LP
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 2000-01-31
(87) Open to Public Inspection: 2000-08-03
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2000/002342
(87) International Publication Number: US2000002342
(85) National Entry: 2001-07-23

(30) Application Priority Data:
Application No. Country/Territory Date
09/306,647 (United States of America) 1999-05-06
60/118,384 (United States of America) 1999-02-01

Abstracts

English Abstract


Disclosed is an interactive entertainment system. The system includes a base
(104a, 104b) that represents an entertainment device, a physical sensor object
(106) that is separate from the base, a detector (102) arranged to detect the
presence, location and identity of the sensor object (106) relative to the
base (104a, 104b), and an interactive audiovisual program (112) configured to
output audiovisual segments related to the entertainment device represented by
the base (104a, 104b). Detection of the physical sensor object (106) causes
the interactive audiovisual program (112) to output an audiovisual segment
that is based at least in part upon the detected position of the sensor object
(106) relative to the base (104a, 104b) and that includes information that is
not visible on the base (104a, 104b) but which is deemed detectable by the
sensor object (106).


French Abstract

L'invention concerne un système de divertissement interactif comprenant une base (104) qui représente un dispositif de divertissement, un objet capteur physique (106) séparé de la base, un détecteur (102) décelant la présence, l'emplacement et l'identité de l'objet capteur par rapport à la base, et un programme audiovisuel interactif (112) conçu pour fournir des segments audiovisuels liés au dispositif de divertissement représenté par la base. La détection de l'objet capteur physique fait que le programme audiovisuel interactif produit un segment audiovisuel fondé au moins en partie sur la position décelée de l'objet capteur par rapport à la base et concernant une information non visible sur la base mais jugée détectable par l'objet capteur.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
What is claimed is:
1. An interactive entertainment system comprising:
a base (104) that represents an entertainment device;
a physical sensor object (106) that is separate from the base;
a detector (102) arranged to detect the presence, location and identity of the sensor object relative to the base; and
an interactive audiovisual program (112, 304, 306, 308) configured to output audiovisual segments related to the entertainment device represented by the base,
wherein detection of the physical sensor object causes the interactive audiovisual program to output an audiovisual segment that is based at least in part upon the detected position of the sensor object relative to the base and that includes information that is not visible on the base but which is deemed detectable by the sensor object.
2. An interactive entertainment system as recited in claim 1 wherein the sensor object is not itself displayed in the audiovisual segments.
3. An interactive entertainment system as recited in claims 1 or 2 wherein the entertainment device is selected from the group consisting of a game, a toy, a book and a story.
4. An interactive entertainment system as recited in any of claims 1-3 wherein the sensor object has a physical form that represents its function.
5. An interactive entertainment system as recited in any of claims 1-4 wherein the audiovisual segment output by the audiovisual program includes an enhanced view of a portion of the base located at the detected position of the sensor object.
6. An interactive entertainment system as recited in claim 5 wherein the enhanced view is a magnified view of the base portion located at the detected position of the sensor object.
7. An interactive entertainment system as recited in claim 5 wherein the enhanced view is an x-ray view of the base portion located at the detected position of the sensor object.
8. An interactive entertainment system as recited in claim 5 wherein the enhanced view is a lighted view of the base portion located at the detected position of the sensor object.

9. An interactive entertainment system as recited in claim 5 wherein the enhanced view is a decoded view of the base portion located at the detected position of the sensor object.
10. An interactive entertainment system as recited in any of claims 5-9 wherein the enhanced view includes an image of at least a portion of the sensor object.
11. An interactive entertainment system as recited in any of claims 1-10 wherein the audiovisual segment output by the audiovisual program includes an audio segment that is related to a portion of the base located at the detected position of the sensor object.
12. An interactive entertainment system as recited in claim 11 wherein the base portion includes a representation of an audio tape and the audio segment represents sounds that are produced from the audio tape.
13. An interactive entertainment system as recited in claim 11 wherein the base portion includes a representation of a structure that may hide audible objects or people and the audio segment represents sounds that are produced from the audible objects or people.
14. An interactive entertainment system as recited in any of claims 1-13 wherein the audiovisual segments include a video clip having a start scene based on a first position of the physical sensor object and an end scene based on a second position of the physical sensor object that is different from the first position.
15. An interactive entertainment system comprising:
a base that represents an entertainment device;
a physical character object that represents a character within the interactive entertainment system;
a detector arranged to detect the presence, location and identity of the character object relative to the base;
a gesture recognizer arranged to recognize gesture movements of the physical character object based upon a detected state of the character object; and
an interactive audiovisual program configured to output audiovisual segments related to the entertainment device represented by the base, wherein recognition of a selected gesture by the gesture recognizer causes the interactive audiovisual program to output an audiovisual segment that is based at least in part upon the detected gesture.
16. An interactive entertainment system as recited in claim 15 wherein the audiovisual segments include a video clip having a start scene based on a first position of the character object and an end scene based on a second position of the character object that is different from the first position.
17. An interactive entertainment system as recited in claims 15 or 16 wherein the audiovisual segments are related to the entertainment device represented by the base, wherein recognition of a selected gesture by the gesture recognizer causes the interactive audiovisual program to output an audiovisual segment that includes the character represented by the character object performing an action associated with the detected gesture.
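As a rough illustration of the gesture-recognition element of claims 15-17, a recognizer might classify a short trail of detected character-object positions. All thresholds, gesture names, and function names below are invented for this sketch; they are not part of the patent.

```python
# Hypothetical sketch of a gesture recognizer (claims 15-17): classify a
# character object's recent detected positions into a named gesture.
# Thresholds and gesture names are invented for illustration only.
def recognize_gesture(trail):
    """trail: list of (x, y) detected positions, oldest first."""
    if len(trail) < 2:
        return "idle"
    dx = trail[-1][0] - trail[0][0]
    dy = trail[-1][1] - trail[0][1]
    if abs(dx) < 0.1 and abs(dy) < 0.1:
        return "idle"  # object essentially did not move
    if abs(dy) > abs(dx):
        return "jump" if dy > 0 else "duck"
    return "walk_right" if dx > 0 else "walk_left"

# A rising trail classifies as a jump; per claim 17, the program would then
# output a segment of the character performing the recognized action.
print(recognize_gesture([(0.0, 0.0), (0.1, 0.5), (0.1, 1.2)]))  # -> jump
```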
18. An interactive entertainment system comprising:
a base that represents an entertainment device selected from the group consisting of a game, a toy, and a book;
a plurality of physical objects, the physical objects including environmental objects, character objects and sensor objects;
a detector arranged to detect the presence, location and identity of the physical objects relative to the base; and
an interactive audiovisual program configured to output audiovisual segments related to the entertainment device represented by the base based at least in part upon the detected position of the physical objects relative to the base; and
wherein detection of a selected sensor object causes the interactive audiovisual program to output an audiovisual segment that includes information that is not visible on the base but which is deemed detectable by the sensor object, detection of a character object causes the interactive audiovisual program to include the detected character in an outputted audiovisual segment, and detection of an environmental object causes the interactive audiovisual program to include the detected environmental object in an outputted audiovisual segment.
19. An interactive entertainment system as recited in claim 18 wherein the sensor objects are not themselves displayed in the audiovisual segments.
20. An interactive entertainment system as recited in claims 18 or 19 wherein the base is a book having a plurality of pages.
21. An interactive entertainment system as recited in claims 18 or 19 wherein the base is a game board.
22. A method of interfacing with a book system having a plurality of regions, the method comprising:
scanning the book system to extract data, the extracted data including a position and an identifier of a physical object that is part of the book system;
identifying a region that is nearest to the physical object based on the extracted data;
running an audiovisual program based at least in part on the determinations of the identifier of the physical object and the position of the physical object in relation to the identified region.
23. A method as recited in claim 22, wherein the extracted data also includes a position and an identifier of an interactive device that is part of the book system, the method further comprising identifying a state of the interactive device based on the extracted data, and wherein the audiovisual program is also based on the identified state of the interactive device.
24. A method as recited in claim 23, wherein the interactive device is associated with the physical object.
25. A method as recited in any of claims 22-24, wherein the extracted data also includes a position and an identifier of an interactive device that is part of the book system, the method further comprising:
identifying a state of the interactive device based on the extracted data; and
running an audiovisual program based on the identified state of the interactive device.
26. A method as recited in any of claims 22-25, wherein the book system includes a plurality of pages, and the physical object is in the form of a tool that may be placed on a portion of a selected page that is near the identified region.
27. A method as recited in claim 26, wherein the physical object is a magnifying device and when the magnifying device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes a magnified view of the portion of the page that is near the identified region.
28. A method as recited in claim 26, wherein the physical object is an x-ray device and when the x-ray device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes an x-ray view of the portion of the page that is near the identified region.
29. A method as recited in claim 26, wherein the physical object is a decoding device and when the decoding device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes a text, auditory, or visual object that is not shown within the portion of the page that is near the identified region.
30. A method as recited in claim 26, wherein the physical object is a listening device and when the listening device is placed on the portion of the page that is near the identified region, the audiovisual program that is run generates one or more sounds.
31. A method as recited in claim 26, wherein the physical object is a selection device and when the selection device is placed on the portion of the page that is near the identified region, the audiovisual program includes displaying the portion of the page and a response that is based on the portion of the page.
32. A method as recited in claim 26, wherein the physical object has an associated image and when the physical object is placed on the portion of the page that is near the identified region, the audiovisual program includes displaying the image associated with the physical object.
33. A method as recited in claim 32, wherein the audiovisual program also includes displaying a second image in addition to the image associated with the physical object.
34. A method as recited in claim 32, wherein the audiovisual program also includes morphing of the image associated with the physical object into a second image.
35. A method as recited in claim 26, wherein the physical object is a morphing device and when the morphing device is placed on the portion of the page that is near the identified region, the audiovisual program includes morphing of the portion of the page into an image that is not included within the portion of the page that is near the identified region.
36. An interactive entertainment system as recited in any of claims 22-35, wherein the audiovisual program displays a video clip having a start scene based on a first position of the physical object and an end scene based on a second position of the physical object that is different from the first position.
37. A computer system for interacting with a book system, the computer system comprising:
a data input device arranged to receive data that is extracted from the book system, the extracted data including at least a position and an identification of a selected physical object that is part of the book system;
a data interpreter arranged to identify a region that is nearest to the selected physical object based on the extracted data;
a display device configured to output an interactive visual image based at least in part on the determinations of the position and identifier of the selected physical object and the nearest region and associated identifier.
38. A computer system as recited in claim 37, wherein the extracted data also includes a position and an identifier of an interactive device that is part of the book system, and the data interpreter is also arranged to identify a state of the interactive device based on the extracted data, and the display device is further configured to run an audiovisual program based on the identified state of the interactive device.
39. A computer system as recited in claims 37 or 38, wherein the book system includes a plurality of pages, and the physical object is in the form of a tool that may be placed on a portion of a selected page that is near the identified region.
40. A computer system as recited in claim 39, wherein the physical object is a magnifying device and when the magnifying device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes a magnified view of the portion of the page that is near the identified region.
41. A computer system as recited in claim 39, wherein the physical object is a decoding device and when the decoding device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes a text, auditory, or visual object that is not shown within the portion of the page that is near the identified region.
42. A computer system as recited in claim 39, wherein the physical object is an x-ray device and when the x-ray device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes an x-ray view of the page portion that is not shown within the page portion itself.
43. A computer system as recited in claim 39, wherein the physical object is a listening device and when the listening device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes playing of an audio file.
44. A computer system as recited in claim 39, wherein the physical object is a selection device and when the selection device is placed on the portion of the page that is near the identified region, the audiovisual program includes displaying the portion of the page and a response that is based on the portion of the page.
45. A computer system as recited in claim 39, wherein the physical object has an associated image and when the physical object is placed on the portion of the page that is near the identified region, the audiovisual program includes displaying the image associated with the physical object.
46. A computer system as recited in claim 39, wherein the physical object is a morphing device and when the morphing device is placed on the portion of the page that is near the identified region, the audiovisual program includes morphing of the portion of the page into an image that is not included within the portion of the page that is near the identified region.
47. A computer readable medium containing program instructions for interfacing with a book system having a plurality of regions, the computer readable medium comprising:
computer code for scanning the book system to extract data, the extracted data including a position and an identifier of a physical object that is part of the book system;
computer code for identifying a region that is nearest to the physical object based on the extracted data;
computer code for running an audiovisual program based at least in part on the determinations of the identifier of the physical object and the position of the physical object in relation to the identified region; and
a computer readable medium for storing the computer readable codes.
48. A computer readable medium as recited in claim 47, wherein the extracted data also includes a position and an identifier of an interactive device that is part of the book system, and the computer readable medium further comprising:
computer code for identifying a state of the interactive device based on the extracted data; and
computer code for running an audiovisual program based on the identified state of the interactive device.
49. A computer readable medium as recited in claims 47 or 48, wherein the book system includes a plurality of pages, and the physical object is in the form of a tool that may be placed on a portion of a selected page that is near the identified region.
50. A computer readable medium as recited in claim 49, wherein the physical object is a magnifying device and when the magnifying device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes a magnified view of the portion of the page that is near the identified region.
51. A computer readable medium as recited in claim 49, wherein the physical object is a decoding device and when the decoding device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes a text, auditory, or visual object that is not shown within the portion of the page that is near the identified region.
52. A computer readable medium as recited in claim 49, wherein the physical object is a listening device and when the listening device is placed on the portion of the page that is near the identified region, the audiovisual program that is run includes playing of an audio file.
53. A computer readable medium as recited in claim 49, wherein the physical object is a selection device and when the selection device is placed on the portion of the page that is near the identified region, the audiovisual program includes displaying the portion of the page and a response that is based on the portion of the page.
54. A computer readable medium as recited in claim 49, wherein the physical object has an associated image and when the physical object is placed on the portion of the page that is near the identified region, the audiovisual program includes displaying the image associated with the physical object.
55. A computer readable medium as recited in claim 49, wherein the physical object is a morphing device and when the morphing device is placed on the portion of the page that is near the identified region, the audiovisual program includes morphing of the portion of the page into an image that is not included within the portion of the page that is near the identified region.
56. A book system comprising:
a plurality of pages, each page having one or more regions;
a physical object movable over the pages; and
a position sensing device that may be coupled with a computer system, the position sensing device being configurable by the computer system to detect a position of the pages, an identity of the physical object, and a position of the physical object,
wherein the computer system is also programmed with instructions to configure the position sensing device to detect the positions of the pages and physical object and the identity of the physical object, the computer system being also programmed with instructions to determine which region is nearest to the physical object and to generate an audiovisual program based on the pages' positions, the physical object's identity and position, and the nearest region.
57. A book system as recited in claim 56, wherein the position sensing device is in the form of an electromagnetic sensing mechanism.
58. A book system as recited in claim 57, wherein each page and the physical object has an associated resonator circuit, and the position sensing device includes an antenna through which an excitation signal having a predetermined frequency may be transmitted to activate at least one of the associated resonator circuits such that the resonator circuit(s)'s position(s) and identity(s) may be detected.
59. A method as recited in claim 58, wherein each resonator circuit associated with each page responds to a different frequency value.
60. A method as recited in claim 59, wherein at least some of the resonator circuits associated with each page are positioned over a different portion of the antenna.
61. A book system as recited in claim 57, wherein the pages form a first resonator circuit and the physical object has a second resonator circuit, and the position sensing device includes an antenna through which an excitation signal having a predetermined frequency may be transmitted to activate at least one of the resonator circuit(s) such that the resonator circuit(s)'s position(s) and identity(s) may be detected.
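Claims 58-59 describe identifying each page and object by which resonator circuit responds to the transmitted excitation frequency. The toy lookup below illustrates that identification step; the frequency table, tolerance, and identities are invented for this sketch and are not taken from the patent:

```python
# Hypothetical sketch of claims 58-59: each page or object carries a
# resonator tuned to its own frequency, so the frequency that responds to
# the excitation signal identifies it. Table values are invented.
RESONATOR_TABLE = {
    125_000: ("page", 1),
    130_000: ("page", 2),
    140_000: ("object", "magnifier"),
}

def identify(response_hz, tolerance_hz=500):
    """Match a detected resonance to the nearest known frequency, if close."""
    best = min(RESONATOR_TABLE, key=lambda f: abs(f - response_hz))
    if abs(best - response_hz) <= tolerance_hz:
        return RESONATOR_TABLE[best]
    return None  # no known resonator responded at this frequency

# A response near 130 kHz identifies page 2.
print(identify(129_800))  # -> ('page', 2)
```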
62. A book system as recited in claim 61, wherein each page has an associated coil that is serially coupled to another coil of another page, and the coils of the page are serially coupled to a resonator capacitor such that as a selected one of the pages is turned, the associated coils reverse direction.
63. A book system as recited in claim 57, further including a base resonator, wherein each page has an associated coil, and each coil may be positioned over the base resonator, and the position sensing device includes an antenna through which an excitation signal having a predetermined frequency may be transmitted to cause the base resonator to generate a detected signal that is affected by coils of associated pages that are placed near the base resonator such that the position of the pages with respect to the base resonator may be determined.
64. A book system as recited in claim 57, further including a base resonator, wherein each page has an associated conductive strip, and each conductive strip may be positioned over the base resonator, and the position sensing device includes an antenna through which an excitation signal having a predetermined frequency may be transmitted to cause the base resonator to generate a detected signal that is affected by one or more conductive strip(s) of associated pages that are positioned near the base resonator such that the position of the pages with respect to the base resonator may be determined.
65. A method as recited in claim 56, wherein the position sensing device includes a capacitance measuring device that measures the capacitance between two contacts to determine the positions of the pages.
66. A method as recited in claim 65, wherein each page includes a conductive strip that may be positioned to cover at least an area that is above and between the two contacts to affect the measured capacitance.
67. A method as recited in claim 66, wherein each conductive strip may be positioned to cover the entire area that is above and between the two contacts to affect the measured capacitance.
68. A method as recited in claim 56, wherein the position sensing device includes a resistance measuring device that measures the resistance between a plurality of contact pairs to determine the positions of the pages.
69. A method as recited in claim 68, wherein each page includes a conductive strip that may be positioned between an associated one of the contact pairs such that a short is formed between the associated contact pairs.
70. A method as recited in claim 68, wherein each conductive strip includes two plugs and the associated contact pairs include two receptacles for receiving the two plugs.
71. A method as recited in claim 68, wherein each conductive strip includes two receptacles and the associated contact pairs include two plugs for inserting into the two receptacles.
72. A method as recited in claim 56, wherein the position sensing device includes a plurality of photodetector devices that are each associated with a page, and each page is shaped such that the page may be positioned to cover the associated photodetector device such that the positions of the pages may be determined.
73. A method as recited in claim 56, wherein the position sensing device includes a magnetic measuring device and each page includes an associated magnetic tab that may be positioned near the magnetic measuring device when the page is turned in a predetermined direction.
74. A method as recited in any of claims 56-73, wherein the position sensing device is capable of detecting an x, y, and z position.
75. A method as recited in claim 74, wherein the position sensing device is also capable of detecting a rotation angle position.
76. A book system as recited in any of claims 56-75, further comprising:
a second physical object movable over the pages; and
wherein the computer system is also programmed with instructions to configure the position sensing device to detect the position and the identity of the second physical object and to determine which region is nearest to the second physical object and to generate a second audiovisual program based on the pages' positions, the second physical object's identity and position, and the nearest region to the second physical object.
77. A method of interfacing with a book system having a plurality of pages, a first physical object, and a second physical object, wherein each physical object may be placed near one of a plurality of pages, the method comprising:
scanning the book system to extract data, the extracted data including at least a first position and a first identifier of the first selected physical object if the first physical object is placed near one of the pages, the extracted data including a second position and a second identifier of the second physical object if the second physical object is placed near one of the pages;
identifying a first region that is nearest to the first selected physical object based on the extracted data if the first physical object is placed near one of the pages;
identifying a second region that is nearest to the second physical object based on the extracted data if the second physical object is placed near one of the pages;
running an audiovisual program based at least in part on the first position and first identifier of the first physical object and the identifier of the first region if the first physical object is placed near one of the pages; and
running an audiovisual program based at least in part on the second position and second identifier of the second physical object and the identifier of the second region if the second physical object is placed near one of the pages.
78. A method as recited in claim 77, wherein the extracted data also includes a position and an identifier of an interactive device that is part of the book system, the method further comprising:
identifying a state of the interactive device based on the extracted data; and
running an audiovisual program based at least in part on the identified state of the interactive device.
79. A method as recited in claims 77 or 78, wherein the first physical object is in the form of a tool that may be placed on a portion of a selected page that is near the identified first region.

80. A method as recited in claim 79, wherein the first physical object is a
magnifying device and when the magnifying device is placed on the portion of
the
page that is near the identified first region, the audiovisual program that is
run
includes a magnified view of the portion of the page that is near the
identified first
region.
81. A method as recited in claim 79, wherein the first physical object is an x-
ray device and when the x-ray device is placed on the portion of the page that
is near
the identified first region, the audiovisual program that is run includes an x-
ray view
of the portion of the page that is near the identified first region.
82. A method as recited in claim 79, wherein the first physical object is a
decoding device and when the decoding device is placed on the portion of the
page
that is near the identified first region, the audiovisual program that is run
includes a
text, auditory, or visual object that is not shown within the portion of the
page that is
near the identified first region.
83. A method as recited in claim 79, wherein the first physical object is a
listening device and when the listening device is placed on the portion of the
page that
is near the identified first region, the audiovisual program that is run
includes playing
of an audio file.
84. A method as recited in claim 79, wherein the first physical object is a
selection device and when the selection device is placed on the portion of the
page
that is near the identified first region, the audiovisual program includes
displaying the
portion of the page and a response that is based on the portion of the page.
85. A method as recited in claim 79, wherein the first physical object has an
associated image and when the physical object is placed on the portion of the
page
that is near the identified first region, the audiovisual program includes
displaying the
image associated with the physical object.
86. A method as recited in claim 79, wherein the first physical object is a
morphing device and when the morphing device is placed on the portion of the
page
that is near the identified first region, the audiovisual program includes
morphing of
the portion of the page into an image that is not included within the portion
of the
page that is near the identified first region.
87. A computer readable medium containing program instructions for
interfacing with a book system having a plurality of pages, a first physical
object, and
a second physical object, wherein each physical object may be placed near one
of a
plurality of pages, the computer readable medium comprising:

computer code for scanning the book system to extract data, the extracted data
including at least a first position and a first identifier of the first
physical object if the first physical object is placed near one of the pages, the
extracted data
including a second position and a second identifier of the second physical
object if the
second physical object is placed near one of the pages;
computer code for identifying a first region that is nearest to the first
physical object based on the extracted data if the first physical object is
placed near
one of the pages;
computer code for identifying a second region that is nearest to the second
physical object based on the extracted data if the second physical object is
placed
near one of the pages;
computer code for running an audiovisual program based at least in part on the
first position and first identifier of the first physical object and the
identifier of the
first region if the first physical object is placed near one of the pages;
computer code for running an audiovisual program based at least in part on
the second position and second identifier of the second physical object and
the
identifier of the second region if the second physical object is placed near
one of the
pages; and
a computer readable medium for storing the computer readable codes.
88. A computer system as recited in claim 37, further comprising a gesture
recognizer for identifying and interpreting gesture movements of the
physical
object.

Description

Note: Descriptions are shown in the official language in which they were submitted.


CA 02360940 2001-07-23
WO 00/45250 PCT/US00/02342
INTERACTIVE ENTERTAINMENT SYSTEMS AND METHODS
BACKGROUND OF THE INVENTION
This invention relates generally to computer interactions with a physical
system and more particularly to interactions with a physical book. The present
invention is also applicable to other types of entertainment systems.
In general terms, a conventional book is a bundle of text that provides the
reader with a story having a specific set of characters and events. As one
reads a
book, the reader may imagine how the characters look and how they react to
different
events. The conventional book may also include pictures that provide the
reader with
visual depictions of characters and events from the story.
A reader typically reads a conventional book from start to finish, wherein
events of the story unfold sequentially in a predetermined order that is
laid out by
the text description. In other words, there is no deviation from the story
line; the
book's story or text remains fixed. In sum, conventional books are not
interactive and
do not allow the reader any input as to how the story is presented.
Recently, some books have been designed that provide limited interactive
capabilities. For example, some children's books include sound files that may
be
selected and played with the push of a button that is built into the book.
These sounds
typically correspond to the story. For example, barnyard animal noises may be
provided with a story about a farm.
Although conventional books provide limited interactivity, improved
mechanisms for interfacing with a book to provide improved interactivity would
be
desirable. Such mechanisms would preferably allow the user to choose the
direction
of the story or to select events that are added to the story.

SUMMARY OF THE INVENTION
Accordingly, the present invention provides a system and method for
interfacing with a book, or any other type of physical entertainment device,
in a
complex manner. The entertainment device includes physical objects that may be
placed on or associated with a physical base, such as a book or a game board
or
platform. A user may move the physical objects over the base and/or add and
remove
physical objects to and from the base. When the user interacts with the
physical
objects (e.g., moves an object in relation to the base or turns a page of the
book), a
corresponding audiovisual program is executed to display or play one or more
audiovisual segments that add to what is presented within the physical
entertainment
device.
In one embodiment of the invention, an interactive entertainment system is
disclosed. The system includes a base that represents an entertainment device,
a
physical sensor object that is separate from the base, a detector arranged to
detect the
presence, location and identity of the sensor object relative to the base, and
an
interactive audiovisual program configured to output audiovisual segments
related to
the entertainment device represented by the base. Detection of the physical
sensor
object causes the interactive audiovisual program to output an audiovisual
segment
that is based at least in part upon the detected position of the sensor object
relative to
the base and that includes information that is not visible on the base but
which is
deemed detectable by the sensor object.
In a preferred embodiment, the sensor object is not itself displayed in the
audiovisual segments. In yet another preferred embodiment, the entertainment
device
is either a game, a toy, a book or a story. Alternatively, the sensor object
has a
physical form that represents its function. Preferably, the audiovisual
segment output
by the audiovisual program includes an enhanced view of a portion of the base
located at the detected position of the sensor object, and/or includes an
audio segment
that is related to a portion of the base located at the detected position of
the sensor
object.
In an alternative embodiment, the entertainment system includes a base that
represents an entertainment device, a physical character object that
represents a
character within the interactive entertainment system, a detector arranged to
detect the
presence, location and identity of the character object relative to the base,
and a
gesture recognizer arranged to recognize gesture movements of the physical
character
object based upon a detected state of the character object. The entertainment
system

further includes an interactive audiovisual program configured to output
audiovisual
segments related to the entertainment device represented by the base. The
recognition
of a selected gesture by the gesture recognizer causes the interactive
audiovisual
program to output an audiovisual segment that is based at least in part upon
the
detected gesture.
In yet another embodiment, the interactive entertainment system includes a
base that represents an entertainment device in the form of a game, a toy, or
a book
and a plurality of physical objects. The physical objects include
environmental
objects, character objects and sensor objects. The entertainment system also
includes
a detector arranged to detect the presence, location and identity of the
physical objects
relative to the base and an interactive audiovisual program configured to
output
audiovisual segments related to the entertainment device represented by the
base
based at least in part upon the detected position of the physical objects
relative to the
base. Detection of a selected sensor object causes the interactive audiovisual
program
to output an audiovisual segment that includes information that is not visible
on the
base but which is deemed detectable by the sensor object, detection of a
character
object causes the interactive audiovisual program to include the detected
character in
an outputted audiovisual segment, and detection of an environmental object
causes
the interactive audiovisual program to include the detected environmental
object in an
outputted audiovisual segment.
In an alternative embodiment, a method of interfacing with a book system
having a plurality of regions is disclosed. The book system is scanned to
extract data.
In this embodiment, the extracted data includes a position and an identifier
of a
physical object that is part of the book system. A region that is nearest to
the physical
object is identified based on the extracted data. An audiovisual program is
run based
at least in part on the determinations of the identifier of the physical
object and the
position of the physical object in relation to the identified region.
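The "nearest region" determination described above is not tied to any particular algorithm in this document; as a purely illustrative sketch, assuming each region is known by an invented name and (x, y) center point, it might be computed as a simple closest-center search:

```python
import math

# Hypothetical illustration only: region names and coordinates are invented,
# and the (x, y) object position is assumed to come from the extracted data.
REGIONS = {
    "barn_picture": (2.0, 3.0),
    "story_text": (5.5, 1.0),
    "page_corner": (7.0, 0.5),
}

def nearest_region(obj_x, obj_y, regions=REGIONS):
    """Return the name of the region whose center is closest to the object."""
    return min(
        regions,
        key=lambda name: math.hypot(regions[name][0] - obj_x,
                                    regions[name][1] - obj_y),
    )
```

For example, an object detected at (2.1, 2.8) would be associated with the "barn_picture" region, and the audiovisual program would then be selected using that region together with the object's identifier.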
In an apparatus aspect of the invention, a computer system for interacting
with
a book system is disclosed. The computer system includes a data input device
arranged to receive data that is extracted from the book system. The extracted
data
includes at least a position and an identification of a selected physical
object that is
part of the book system. The computer system also includes a data interpreter
arranged to identify a region that is nearest to the selected physical object
based on
the extracted data and a display device configured to output an
interactive visual
image based at least in part on the determinations of the position and
identifier of the
selected physical object and the nearest region and associated identifier.

In yet another aspect, a computer readable medium containing program
instructions for interfacing with a book system having a plurality of regions
is
disclosed. The computer readable medium includes computer code for scanning
the
book system to extract data. The extracted data includes a position and an
identifier
of a physical object that is part of the book system. The computer readable
medium
also includes computer code for identifying a region that is nearest to the
physical
object based on the extracted data, computer code for running an audiovisual
program
based at least in part on the determinations of the identifier of the physical
object and
the position of the physical object in relation to the identified region, and
a computer
readable medium for storing the computer readable codes.
In an apparatus aspect, a book system is disclosed. The book system includes
a plurality of pages with each page having one or more regions, a physical
object
movable over the pages, and a position sensing device that may be coupled with
a
computer system. The position sensing device is configurable by the computer
system to detect a position of the pages, an identity of the physical object,
and a
position of the physical object. The computer system is also programmed with
instructions to configure the position sensing device to detect the positions
of the
pages and physical object and the identity of the physical object. The
computer
system is also programmed with instructions to determine which region is
nearest to
the physical object and to generate an audiovisual program based on the pages'
positions, the physical object's identity and position, and the nearest
region. In a
preferred embodiment, the position sensing device is in the form of an
electromagnetic sensing mechanism.
In yet another embodiment, a method of interfacing with a book system
having a plurality of pages, a first physical object, and a second physical
object is
disclosed. Each physical object may be placed near one of a plurality of
pages. The
book system is scanned to extract data, wherein the extracted data includes
at least a
first position and a first identifier of the first physical object if
the first
physical object is placed near one of the pages and/or a second position and a
second
identifier of the second physical object if the second physical object is
placed near
one of the pages. A first region that is nearest to the first selected
physical object is
identified based on the extracted data if the first physical object is placed
near one of
the pages. A second region that is nearest to the second physical object is
identified
based on the extracted data if the second physical object is placed near one
of the
pages. An audiovisual program is run based at least in part on the first
position and
first identifier of the first physical object and the identifier of the first
region if the
first physical object is placed near one of the pages, and an audiovisual
program is run

based at least in part on the second position and second identifier of the
second
physical object and the identifier of the second region if the second physical
object is
placed near one of the pages.
In an alternative embodiment, a computer readable medium containing
program instructions for interfacing with a book system having a plurality of
pages, a
first physical object, and a second physical object is disclosed. Each
physical object
may be placed near one of a plurality of pages, and the computer readable
medium
includes (i) computer code for scanning the book system to extract data with
the
extracted data including at least a first position and a first identifier of
the first
physical object if the first physical object is placed near one of
the pages and
including a second position and a second identifier of the second physical
object if the
second physical object is placed near one of the pages, (ii) computer code for
identifying a first region that is nearest to the first physical
object based on
the extracted data if the first physical object is placed near one of the
pages, (iii)
computer code for identifying a second region that is nearest to the second
physical
object based on the extracted data if the second physical object is placed
near one of
the pages, (iv) computer code for running an audiovisual program based at
least in
part on the first position and first identifier of the first physical object
and the
identifier of the first region if the first physical object is placed near one
of the pages,
(v) computer code for running an audiovisual program based at least in part
on the
second position and second identifier of the second physical object and the
identifier
of the second region if the second physical object is placed near one of the
pages, and
(vi) a computer readable medium for storing the computer readable codes.

BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a top view of a book system in accordance with one embodiment of
the present invention.
Figure 2A is a diagrammatic representation of a first example of a page
detection mechanism that utilizes electromagnetic detection in accordance with
one
embodiment of the present invention.
Figure 2B is a diagrammatic representation of a second example of a page
detection mechanism that utilizes electromagnetic detection in accordance with
one
embodiment of the present invention.
Figure 2C is a diagrammatic representation of a third example of a page
detection mechanism that utilizes electromagnetic detection in accordance with
one
embodiment of the present invention.
Figure 2D is a diagrammatic representation of a first example of a page
detection mechanism that utilizes capacitance value detection in accordance
with one
embodiment of the present invention.
Figure 2E is a diagrammatic representation of a second example of a page
detection mechanism that utilizes capacitance value detection in accordance
with one
embodiment of the present invention.
Figure 2F is a diagrammatic representation of a page detection mechanism
that utilizes resistance value detection in accordance with one embodiment of
the
present invention.
Figure 3 is a digital photograph of a book system having a plurality of physical
objects in accordance with one embodiment of the present invention.
Figure 4A is a digital photograph of a book system with a decoding device
positioned over a page of the book in accordance with one embodiment of the
present
invention.
Figure 4B is a screen shot generated by an audiovisual program that is based
on the position of the decoding device of Figure 4A in accordance with one
embodiment of the present invention.

Figure 4C is a digital photograph of a book system with a second decoding
device positioned over a page of the book in accordance with one embodiment of
the
present invention.
Figure 5A is a digital photograph of a book system with a magnifying device
positioned over a page of the book in accordance with one embodiment of the
present
invention.
Figure 5B is a screen shot generated by an audiovisual program that is based
on the position of the magnifying device of Figure 5A in accordance with one
embodiment of the present invention.
Figure 6A is a digital photograph of a book system with a listening device
positioned over a page of the book in accordance with one embodiment of the
present
invention.
Figure 6B is a screen shot generated by an audiovisual program that is based
on the position of the listening device of Figure 6A in accordance with one
embodiment of the present invention.
Figure 7A is a digital photograph of a book system with a selection device
positioned over a page of the book in accordance with one embodiment of the
present
invention.
Figure 7B is a screen shot generated by an audiovisual program that is based
on the position of the selection device of Figure 7A in accordance with one
embodiment of the present invention.
Figure 8A is a digital photograph of a book system with an image card
positioned over a page of the book in accordance with one embodiment of the
present
invention.
Figure 8B is a screen shot generated by an audiovisual program that is based
on the position of the image card of Figure 8A in accordance with one
embodiment of
the present invention.
Figure 9 is a flowchart illustrating a process of interfacing with a book
system,
such as the book system in Figure 1, in accordance with one embodiment of the
present invention.
Figure 10 is a flow chart illustrating the operation of Figure 9 of
interpreting
the extracted data in accordance with one embodiment of the present invention.

Figure 11 is a flow chart illustrating the operation of Figure 9 of executing
the
audiovisual program in accordance with one embodiment of the present
invention.
Figure 12 is a digital photograph of a game system with an environmental
object, a character object, and a sensor object positioned over a base in
accordance
with one embodiment of the present invention.
Figure 13A is a screen shot illustrating a display output from an audiovisual
program based on the character object, the environmental object, and the sensor
object of Figure 12 in accordance with one embodiment of the present
invention.
Figure 13B is a screen shot illustrating a display output from an audiovisual
program based on a gesture via the character object of Figure 12 in accordance
with one
embodiment of the present invention.
Figure 14 is a flowchart illustrating the process of analyzing a gesture of
the
character object of Figure 12 in accordance with one embodiment of the present
invention.

DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
Reference will now be made in detail to the specific embodiments of the
invention. Examples of these specific embodiments are illustrated in the
accompanying drawings. While the invention will be described in conjunction
with
these specific embodiments, it will be understood that it is not intended to
limit the
invention to the described embodiments. On the contrary, it is intended to
cover
alternatives, modifications, and equivalents as may be included within the
spirit and
scope of the invention as defined by the appended claims. In the following
description, numerous specific details are set forth in order to provide a
thorough
understanding of the present invention. The present invention may be practiced
without some or all of these specific details. In other instances, well known
process
operations have not been described in detail in order not to unnecessarily
obscure the
present invention.
In general terms, one of the embodiments of the current invention includes a
method and apparatus for interfacing with a book in a complex manner. As the
reader
turns pages of the book, for example, the page positions are detected and a
corresponding audiovisual program is executed on a computer system. That is,
visual
images are displayed and/or sounds are played. The audiovisual program is
related to
the text of the physical book and adds to the book reading experience.
The book system may also include tools that may be used with the physical
book and allow the audiovisual program to be manipulated or changed. By way of
example, as a particular tool (e.g., a magnifying tool) is placed on a
particular portion
of a particular page (e.g., a picture) of the book, a corresponding
audiovisual program
(e.g., an enlarged view of a portion of the picture) may be displayed.
In other words, as the user interacts with the book by using the tools or
turning
pages, the user is provided with a corresponding audiovisual interaction on
the
computer, which interaction greatly enhances the book experience. In sum, the
book
system includes mechanisms that detect pages and/or tool movements such that
events or characters are inserted within an audiovisual representation of the
story, in
addition to the story represented by text and images on the pages of the
physical
book.
Figure 1 shows an interactive book system 100 in accordance with one
embodiment of the present invention. The book system includes a physical
book
unit 103, a computer system 110, and an I/O board 108 for interfacing between
the
physical book unit 103 and computer system 110. As shown, the physical book
unit

103 includes a printed circuit (PC) board 101 having an antenna 102, a
physical book
having a plurality of pages 104, and one or more tools 106. Some of the tools
106
may also include an interactive device 107, such as a button. The book unit
103 may
also include a fixed interactive device 109, such as a switch or button (i.e., the device
has a fixed resting position).
The computer system 110 includes, among other things, a driver 114,
application program 112, display 116, and speakers 118. Several well known
components of a computer system, such as a processor and memory unit, are not
described or included within Figure 1 so as to not obscure the invention.
Although the computer system 110 is shown as being a separate component
from the physical book unit 103, of course, the computer system 110 may be
integrated with the physical book unit 103. Additionally, it should be
understood that
other types of interfaces may be used, such as a television system or set-top
box. Also,
the I/O board 108 may be integrated with the physical book unit 103.
The interactive book system 100 may include any suitable mechanisms for
detecting the presence, positions, and identities of the various tools (e.g.,
106) and
pages (e.g., 104) associated with the book unit 103. For example, the
detection
mechanism may detect how many pages are turned to the right side of the book
(e.g.,
page 104a) and how many pages are turned to the left side of the book (e.g.,
page
104b). The detection mechanism may also be capable of identifying which tools
have
been placed on the pages of the book unit and on which page and which portion
of the
page a tool has been placed.
To detect and identify the pages and tools of the book unit 103, at least some
of the tools and the pages will include a detectable marker. When the markers
are
placed or moved within the book unit 103 (e.g., by placing a tool on a page or
by
turning a page), the positions of the markers may be sensed by the detection
mechanism. Preferably, the markers also identify the tool or page. That is,
the
detection mechanism is able to distinguish between the different markers of
the
different tools and pages.
In the illustrated embodiment, the book unit includes an electromagnetic
detection system to detect the tools and pages of the book and to input the
detected
data to a computer. Any suitable electromagnetic detection system may be
implemented. Of course, it should be recognized that other detection systems
may be
used, such as optical sensors or electronic sensors.

In the illustrated electromagnetic detection system, the book unit 103
includes
a number of resonator circuits (not shown) that are detectable by the antenna
102
when the resonators are activated by an excitation signal having a particular
resonator
frequency. The resonator circuits may be arranged in any suitable manner to
facilitate
detection of the tools and pages of the book unit 103.
In one embodiment, each tool 106 and page 104 includes a simple resonator
circuit (e.g., a coil in series with a capacitor) that resonates at a
particular resonating
frequency. An excitation signal having a frequency that activates one or more
resonator circuit(s) is transmitted on antenna 102. A resonating signal is
induced on antenna 102 by the resonator(s) when the resonator(s) resonate at
the frequency of the excitation signal. In sum, the excitation signal on
antenna 102 is affected by the activated resonator(s) and their corresponding
positions relative to the antenna.
In one embodiment, the alteration of the excitation signal may be analyzed to
determine which resonator(s) are resonating, and their corresponding positions.
Preferably, the excitation signal is stopped such that the resonating signal
may be
independently analyzed. That is, when the antenna's excitation signal is
stopped, the
activated resonator(s) continue to produce a "ringing" signal that continues to
be
induced on the antenna 102 and, thus, may be clearly detected on the antenna
102
without interference by the excitation signal.
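The excite-then-listen cycle described above may be sketched in outline as follows. This is an illustrative sketch only: the `antenna` object and its `transmit`, `stop`, and `sample` methods are hypothetical stand-ins for the real antenna electronics and I/O hardware.

```python
import time

def ring_down_amplitude(antenna, frequency, excite_s=0.005, listen_s=0.005):
    """Drive the antenna at `frequency`, stop the excitation, then sample the
    residual "ringing" signal from any resonator tuned near `frequency`."""
    antenna.transmit(frequency)      # excitation phase: energize matching resonators
    time.sleep(excite_s)
    antenna.stop()                   # silence the antenna so the ring-down
    return antenna.sample(listen_s)  # can be measured without interference
```

A strong sampled amplitude after the excitation stops indicates that a resonator tuned to the excitation frequency is present near the antenna.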
The antenna 102 may be any suitable form for sensing positions of various
resonator circuits. In one embodiment, the antenna is in the form of a loop
that is
embedded within the PC board 101. The antenna 102 is placed such that
resonator
circuits and their positions may be sensed by the antenna 102. In other words,
a
resonating signal induced within the antenna 102 depends on the activated
resonator(s)' position with respect to the antenna 102. In sum, a resonator's
position,
as well as the associated tool or page position, may be determined relative to
the
antenna 102 by analyzing the resonating signal induced within the antenna 102.
The book system may include any number and kind of tools. Tools are
defined as objects that may be used to interact with portions of the book unit
103.
The tool may represent a visual or auditory sensor. That is, as the tool is
moved over
a selected position of the book unit, additional images and/or sounds related
to the
selected position are revealed through the audiovisual program. For example, a
tool
may be used to magnify certain graphic portions of a page, which magnified
view is
generated by the audiovisual program. By way of another example, a tool may be
used as a listening device to reveal sounds related to portions of the book
unit. In
addition to tools, the book system may also include interactive devices, which
are

defined as objects that have more than one state. For example, an interactive
device
may be in the form of a button, switch, or knob.
The tools and interactive devices may be fixed or movable within the book
unit 103. An interactive device may be associated with a particular tool or
may be
an independent object within the book unit 103. Each tool, page, and
interactive
device may be configured with a resonator that resonates at a distinct
frequency. Of
course, if an interactive device is associated with a tool, the pair may have
a single
resonator.
In this embodiment, positions of the tools and/or interactive devices may be
separately detected by transmitting excitation signals having different
frequencies
through antenna 102. A particular excitation signal with a selected frequency
activates a corresponding tool's or interactive device's resonator, which
produces a
distinct resonating signal in antenna 102. Thus, each object may be identified
by its
particular resonating frequency.
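The frequency-based identification just described can be pictured as a sweep over the known excitation frequencies, reporting which objects respond. The table of frequencies and object names below is invented for illustration and is not taken from this document:

```python
# Hypothetical mapping from resonant frequency (Hz) to object identity.
RESONATOR_TABLE = {
    100_000: "magnifying tool",
    110_000: "listening tool",
    120_000: "page 1",
}

def identify_objects(measure, threshold=0.5):
    """`measure(freq)` returns the ring-down amplitude at `freq`; return the
    names of all objects whose resonator produced a strong response."""
    return [name for freq, name in RESONATOR_TABLE.items()
            if measure(freq) > threshold]
```

Each excitation frequency activates at most one object's resonator, so a strong response at a given frequency identifies that object.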
Alternatively, the resonator circuits of the tools may resonate at a same
resonator frequency, but the resonators are positioned at varying "z"
positions within
the tool itself. Each tool is uniquely identified by the associated z position
of the
resonator. This embodiment, as well as several other embodiments, for
arranging and
identifying resonators having a same frequency are described in U.S. patent
application number 09/144,951, entitled "Detecting Physical Object States
Using
ElectroMagnetic Sensors" by Marcos R. Vescovi, et al., filed on 1 September
1998,
which application is herein incorporated by reference in its entirety.
Any suitable mechanisms may be implemented to convert the resonating or
detected signals into signals that are appropriate for input into a computer
system. As
shown, the detected signal is sent through the I/O board 108 through driver
114 to
application program 112. The I/O board 108 is arranged to convert the detected
signals to digital signals (e.g., a binary pulse wave). The application
software 112
analyzes the detected signals to determine positions and identifications of
the various
objects of the book unit 103, as well as to generate a corresponding
audiovisual
program. The driver 114 is arranged to allow communication between the
application
software 112 and the I/O board 108.
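The signal path through the I/O board, driver, and application software described above can be summarized in outline. All class and method names here are invented for illustration; the actual digitization and interpretation steps are hardware- and application-specific:

```python
# Hypothetical sketch of the signal path: the I/O board digitizes the raw
# antenna signal, the application interprets it into object positions and
# identities, and an audiovisual response is then triggered.
def process_detected_signal(raw_samples, io_board, application):
    digital = io_board.digitize(raw_samples)      # analog -> binary pulses
    objects = application.interpret(digital)      # positions + identities
    application.run_audiovisual_program(objects)  # display images / play sound
    return objects
```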
In general terms, positional data is determined based on the detected signal,
and the positional data is processed and used to generate or trigger an
audiovisual
program. Examples of methods for generating audiovisual programs based on
positional data are described further in U.S. Patent Application No.
09/018,023 filed

February 2, 1998, entitled "Computer Method and Apparatus for Interacting with
a
Physical System" by Piernot, et al, which application is herein incorporated
by
reference in its entirety.
Specifically, the detected signal includes positional data regarding the
resonator circuits of one or more physical object(s) of the book unit 103, such
as the
tools, interactive devices, and/or pages. That is, positions of one or more
physical
object(s) may be ascertained by analyzing the detected signal. In one
embodiment,
the positional data includes six degrees of position states: x, y, and z
position,
rotational angle, tilt, and yaw. One example of an electromagnetic sensing
system
that senses six degrees of positional data is described in International PCT
Application Number PCT/GB96/02896 published on 29 May 1997 by Dames, et al,
which is herein incorporated by reference in its entirety. Of course, the
positional
data may include any subset of these six degrees of position states.
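As a minimal illustration of the six position states named above, the positional data for one object could be carried in a small record such as the following; the field names are our own labels and are not part of this document:

```python
from dataclasses import dataclass

# Container for the six position states: x, y, z position, rotational
# angle, tilt, and yaw. Any subset of these fields may be populated.
@dataclass
class ObjectPose:
    x: float
    y: float
    z: float
    rotation: float  # rotational angle, degrees
    tilt: float      # degrees
    yaw: float       # degrees
```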
Several detection mechanisms for detecting positional information regarding a
plurality of physical objects will now be described in reference to Figures 2A
through
2F. As illustrated, each physical object is in the form of a page of the book,
and the
positional information may include information about whether or not the page
has
been turned, whether or not the page is currently being turned, and/or an
exact
position (e.g., an angle relative to the book cover) of the page. Of course,
the
following mechanisms may be applied, with minor modifications, to detecting
tool
positions and/or interactive device positions.
Figure 2A is a diagrammatic representation of a first example of a page
detection mechanism that utilizes electromagnetic detection in accordance with
one
embodiment of the present invention. As shown, a plurality of pages 104 are
positioned over an antenna 102 that is embedded in PC board 101. Of course,
the
antenna 102 may stand alone without the PC board 101.
Each page 104 has an embedded resonator 202 having a particular resonating
frequency. Each resonator 202 may be positioned on any suitable location of
the
associated page such that page position may be accurately determined (e.g., which pages are being viewed within the physical book). In the illustrated
embodiment,
each resonator 202 is positioned over a different portion of the antenna 102
(see axis
204 for each resonator), depending on whether or not the page is turned.
As shown, when resonator 202a of page 104a is positioned over a left portion
of the antenna, page 104a is open and has not been turned. In contrast, when
resonator 202a of page 104a is positioned on the right portion of the antenna,
page
104a has been turned such that a new page is displayed (i.e., page 104b).
Additionally, when resonator 202a of page 104a is positioned between the left
and
right portions of the antenna, page 104a is being turned (e.g., moving from the left side to the right side of the book, or vice versa).
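The left/middle/right classification above can be sketched as a simple threshold test on the resonator's detected position along the antenna; the coordinate scale and the one-third split points are illustrative assumptions:

```python
def page_state(x: float, left_edge: float = 0.0, right_edge: float = 1.0) -> str:
    """Classify a page from its resonator's x position over the antenna.

    Left portion  -> page has not been turned ("open")
    Right portion -> page has been turned ("turned")
    In between    -> page is mid-turn ("turning")
    The one-third split of the antenna span is an assumption for illustration.
    """
    span = right_edge - left_edge
    if x < left_edge + span / 3:
        return "open"
    if x > right_edge - span / 3:
        return "turned"
    return "turning"
```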
In the embodiment of Figure 2A, each resonator 202 resonates at a particular
frequency of an excitation signal that is transmitted through antenna 102. For
example, resonator 202a of page 104a may be activated by transmitting at a
first
frequency, and resonator 202b of page 104b may be activated by transmitting at
a
second different frequency. When the excitation signal is deactivated, the
resonating
signal of one of the resonators 202 may be detected by antenna 102.
When more than one page is oriented in a same direction (e.g., pages 104a,
104b, and 104c are all turned to the left side of the book), the resonators
202 of these
pages may overlap each other. However, as more resonators overlap, they tend
to
interfere with one another and it may be difficult to analyze the detected
signal from
an individual resonator. Thus, preferably, each resonator 202 is positioned over a different portion of antenna 102, as indicated by each axis 204.
Figure 2B is a diagrammatic representation of a second example of a page
detection mechanism that utilizes electromagnetic detection in accordance with
one
embodiment of the present invention. As shown, the pages 104 are positioned
over a
base 206 having an embedded resonator 210. The resonator 210 and base 206 are
positioned over antenna 102. Each page includes an associated coil 208 (e.g., without a capacitor or resonator).
As pages are turned, some of the coils 208 are positioned adjacent to the
resonator 210 such that the detected signal is affected by the adjacent coils
208. In
other words, the resonating signal of resonator 210 depends on how many coils
208
are positioned within a certain distance of the resonator 210. Each coil 208
may be
positioned adjacent to resonator 210 (i.e., the page is turned to the left
side of the
book), positioned near resonator 210 (i.e., as the page is being turned), or
positioned
far from resonator 210 (i.e., the page is turned to the right side).
As shown, the resonator 210 is located within the left portion of the book. As
shown, pages 104a through 104c are flipped to the left side of the book and, thus, coils 208a through 208c are located in proximity to resonator 210. In
contrast, pages
104d through 104f are flipped to the right of the book and, thus, coils 208d
through
208f are located far from resonator 210. In this arrangement, there is mutual
inductance between the coils of pages 104a through 104c and resonator 210,
which
mutual inductance reduces the amplitude of the resonator signal. The amount of
this
reduction may then be analyzed to determine how many coils, as well as pages,
are
present over the resonator 210.
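The amplitude-reduction analysis can be illustrated with a simplified model. The assumption that each adjacent coil attenuates the resonator amplitude by a fixed fraction is mine for illustration; the application does not specify a calibration model:

```python
import math

def estimate_coil_count(measured: float, baseline: float,
                        per_coil_drop: float = 0.1) -> int:
    """Estimate how many coils overlap the resonator from amplitude loss.

    Assumed model: each nearby coil multiplies the resonator amplitude
    by (1 - per_coil_drop), so
        measured = baseline * (1 - per_coil_drop) ** n.
    Solving for n and rounding yields the coil (and hence page) count.
    """
    if measured >= baseline:
        return 0  # no measurable reduction: no coils over the resonator
    n = math.log(measured / baseline) / math.log(1.0 - per_coil_drop)
    return round(n)
```

In practice the per-coil attenuation would be determined empirically for the particular coil and resonator geometry.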
In an alternative embodiment, the coils may be replaced with strips of
conductive material. Each strip of conductive material from each page may alter the detected signal in a quantifiable manner. Thus, the detected signal may be analyzed
analyzed
to determine how many strips of conductive material are affecting the signal
and
thereby to determine how many pages have been turned.
Figure 2C is a diagrammatic representation of a third example of a page
detection mechanism that utilizes electromagnetic detection in accordance with
one
embodiment of the present invention. As shown, each page has an associated
coil
211. The coils 211 are serially coupled together, along with a resonating
capacitor
212, to form a resonator. As a page is turned, its associated coil is also flipped over such that the coil winds in the reverse direction.
Depending on the direction of the coils, each coil either subtracts or adds to
the amplitude of the resonating signal. For example, for each page that is turned to the right side of the book (as shown, pages 104d through 104f), the coil of the turned page (coils 211d through 211f) may subtract from the detected signal's amplitude.
In this example, for each page that is not turned and remains on the left of
the book
(as shown, pages 104a through 104c), each associated coil (coils 211a through 211c)
continues to contribute to the detected signal's amplitude. Thus, the detected
signal
may be analyzed to determine how many coils are contributing to the signal and
thereby determine how many pages have been turned or not turned.
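A minimal sketch of this add/subtract analysis, under the assumption (mine, for illustration) that each coil contributes exactly one signed unit of amplitude and that the total page count is known:

```python
def pages_turned(amplitude_units: float, total_pages: int) -> int:
    """Infer how many pages are turned from the net resonating amplitude.

    Assumed model: each unturned coil contributes +1 unit and each turned
    (flipped, reverse-wound) coil contributes -1 unit, so
        amplitude_units = (total_pages - turned) - turned.
    Solving for `turned` gives the number of turned pages.
    """
    turned = (total_pages - amplitude_units) / 2
    return round(turned)
```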
Although page detection has been described in terms of implementing
electromagnetic detection technology, of course, other types of sensing
technologies
may be utilized that are suitable for detecting pages. For example, each page
may
contribute to a capacitance value between two points. In other words, strips
of
conductive material may be configured on each page such that the capacitance
value
between two points is increased or decreased as pages of the book are turned.
Figure 2D is a diagrammatic representation of a first example of a page
detection mechanism that utilizes capacitance value detection in accordance
with one
embodiment of the present invention. As shown, a first point 214a and a second
point
214b are located on a base 206. The capacitance value between these two points
214
is affected by how many pages are positioned over and between the points 214.
As
shown, each page 104 has an associated conductive strip 216 that may be
positioned
over the two points 214.
In the illustrated embodiment, the points 214 are located on the right portion
of the book. Thus, as a page is turned to the right side of the book, the
associated
conductive strip 216 of that page contributes to the capacitance value as
measured
between the two points 214. As shown, conductive strips 216d, 216e, and 216f
of
pages 104d, 104e, and 104f contribute to the capacitance value since these
pages are
positioned over the two points 214.
The conductive strips 216 may be positioned in any suitable position that
affects the capacitance value between the two points 214. In the embodiment of
Figure 2D, the strips 216 stretch between the two points 214 when the
corresponding
pages are positioned over the two points 214. The strips 216 may also overlap
the
two points 214 or cover an area that is larger than the area between the two
points
214. Alternatively, the strips 216 may cover an area that is smaller than the
area
between the two points 214.
Figure 2E is a diagrammatic representation of a second example of a page
detection mechanism that utilizes capacitance value detection in accordance
with one
embodiment of the present invention. As shown, each page 104 has an associated
conductive strip 218, wherein the strips may be interdigitated with each
other. That
is, the strips 218 may be positioned over different areas between the points
220a and
220b. Like the previous example, each strip 218 affects the measured
capacitance
value between points 220a and 220b. Since the capacitance value depends on how
many pages are positioned over the points 220, it may be determined how many pages are turned to one side of the book (i.e., the side that includes the two measurement points 220).
Any number of detection technologies may be implemented to detect the page
positions. Figure 2F is a diagrammatic representation of a page detection
mechanism
that utilizes resistance value detection in accordance with one embodiment of
the
present invention. In general terms, each page has an associated conductive
strip or
wire 224 that shorts one of a plurality of contact pairs (e.g., 222c).
As shown, the contact pairs 222 are positioned along the right edge of the
book. As each page is turned to the right side of the book, the associated
conductive
strip of the turned page touches between a pair of contacts 222 and creates a
short
circuit. For example, when page 104f is turned to the right side of the book,
the
conductive strip 224f of page 104f shorts contact pair 222f. Likewise, the
conductive
strip 224c shorts contact pair 222c when page 104c is turned to the right.
Page turning may be determined by measuring the resistance value between
each pair of contact pairs 222. If the resistance value is relatively large,
the
corresponding page has not been turned. If the resistance value is zero, the
corresponding page has been turned.
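This resistance test can be sketched as follows, assuming one resistance reading per contact pair and an illustrative short-circuit threshold:

```python
def read_turned_pages(resistances: list[float],
                      short_threshold: float = 1.0) -> list[bool]:
    """Map per-contact-pair resistance readings to page-turned flags.

    A near-zero resistance means the page's conductive strip shorts its
    contact pair, i.e. the page has been turned; a large (open-circuit)
    resistance means it has not. The threshold (in ohms) is an assumption.
    """
    return [r <= short_threshold for r in resistances]
```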
The conductive strips 224 may be arranged on each page in any suitable
manner so as to short two contact points 222. As shown in Figure 2F, the
conductive
strips 224 are formed such that they rest between a pair of contacts 222 when
the
corresponding page is positioned directly over the contact pair 222.
Alternatively, the
conductive strips or wires may include plugs that are inserted into a pair of
conductive
holes as the page is turned. This embodiment provides a mechanism for ensuring that a conductive path is formed between the two points. Without the plugs, the
conductive
strip may not align correctly with the two contacts if, for example, the page
is
crumpled.
In an alternative embodiment, the book system may include a magnetic
sensing device, and each page may include a magnetic tab that may be sensed by
the
magnetic sensing device at certain page positions. For example, each page may
have
a magnetic tab positioned in a different location. The magnetic sensing device generates a signal that is affected by magnetic tabs that are nearby. Thus, by
analyzing the signal of the magnetic sensing device, it may be determined
whether or
not a page is positioned near the magnetic sensing device.
By way of another example, photodetectors may be placed within a base of the
book. For example, the photodetectors may be placed down the outside edge of
one
side of the base. Each page is configured into a suitable shape such that a
corresponding photodetector is covered or uncovered by the page, depending on
whether the page is turned. Similar to the embodiment of Figure 2F, the pages may have tabs (i.e., in place of each conductive strip 224) that each cover or uncover a corresponding photodetector (i.e., in place of each contact pair 222).
Any of the above described page detection mechanisms may be incorporated
within the book system of the present invention. In addition, other detection
mechanisms may be implemented for detecting other movable physical objects.
For
example, the book system may include any number of physical objects that may
be
placed and moved over the pages of the book system.
Figure 3 is a digital photograph of a book system 103 having a plurality of
physical objects in accordance with one embodiment of the present invention.
As
shown, a book 104 rests on a base 101. The book system 103 also includes a
number
of moveable physical objects or tools (e.g., 106a through 106e), as well as a
fixed
physical object 106f. The tools may be placed within particular areas of
certain pages
of the physical book.
The tools may be used to accomplish tasks that are integrated into the story
line, for example. In one embodiment, the tools are used to obtain clues
regarding the
identity of a spy. Accordingly, a number of tools are described in reference
to
Figures 4A through 8B that help the user to determine the identity of a spy.
However,
different tools may be used with books having different story content. By way
of
example, a set of translation tools may be provided with a Spanish language
textbook.
In the illustrated embodiment of Figure 4A, the user may move decoding
device 106b over particular text portions 702a of a particular page 104b. As
shown in
Figure 4B, a decoded message 706 is displayed within audio-visual display
window
704. In this example, the user is given instructions on what to do next.
Specifically,
the user is instructed to go to the next page and use an audio tape reading tool
to obtain
audio clues. As shown in Figure 4C, a sound decoding tool 106a is moved over
one
of a plurality of tape strip representations (e.g., 708a through 708j).
In one embodiment, as the user moves the sound decoding tool 106a over a
selected tape, an audio sample is played to indicate clues for the user as to
the identity
of the spy. Sounds may be played when the user moves from left to right along
a tape
strip 708, from right to left, or any combination thereof. In this embodiment, an
audio segment is played in a forward direction as the tool 106a is moved from
left to
right, while the same audio segment is played in reverse as the tool 106a is
moved
from right to left.
In sum, the audiovisual program may include techniques for recognizing
gestures, such as the tape reading gesture of the sound decoding tool 106a.
Any
suitable techniques may be implemented for recognizing this gesture, as well
as other
gestures. For example, when the sound decoding tool 106a is moved within a
tape
strip 708, an index to a sound file (e.g., a wave file) is calculated based on
the
position of the tool 106a along the strip 708. That is, if the tool 106a is
positioned on
the far left side of the strip 708, the index references the beginning of the
sound file.
As the tool 106a moves to the middle of the strip 708, the index also moves (and plays sounds) to the middle of the sound file, and so on to the end of the sound file.
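The position-to-index calculation described here can be sketched as a linear mapping; the parameter names and sample-index granularity are assumptions for illustration:

```python
def sound_index(x: float, strip_left: float, strip_right: float,
                n_samples: int) -> int:
    """Map the decoding tool's position along a tape strip to a sound-file index.

    The far left of the strip references the beginning of the sound file and
    the far right references the end; playback direction then follows the
    direction of motion (forward for left-to-right, reversed for right-to-left).
    """
    frac = (x - strip_left) / (strip_right - strip_left)
    frac = min(1.0, max(0.0, frac))  # clamp positions outside the strip
    return round(frac * (n_samples - 1))
```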
Other tools may be provided that allow the user to search for hidden clues
that
are triggered by moving a particular tool over particular portions of a
particular page.
As shown in Figure 5A, a magnifying device 106c may be placed over a particular portion of an image 710a of page 104e. As shown in Figure 5B, when the magnifier device 106c is placed over a pyramid image 710a, a magnified view 710b is
displayed
in the audiovisual display 704.
Not only does the audiovisual display 704 include the image that is actually
shown on the physical book (i.e., 710a), but it also shows an additional
magnified
image 710b. Thus, by using the provided tools, the user gains access to a
wealth of
audiovisual material, in addition to the text and images that are provided
within the
physical book.
Similarly, a morphing tool (not shown) may be provided. When the morphing
tool is placed over a particular image within the physical book pages, the
particular
image is displayed by the audiovisual program and morphed into another image.
By
way of example, when the user places the morphing tool over a photo of a person of age
12, an image of the young person may be displayed and morphed into an older
person, such as the same person at age 50.
In addition to displaying additional visual images, sound may also be played
(e.g., through a computer's speakers) as the user moves a particular tool over
a
particular portion of a page. As shown in Figure 6A, a listening device 106d
is placed
on a particular window of a building on one of the pages 104. Appropriate sounds that may be
heard through a building's window are then played within the audiovisual
program
704. As shown in Figure 6B, a voice signal 712 is represented and displayed
within
the audiovisual program 704. Additionally, a person's voice is played through
the
computer speakers, for example. In the illustrated embodiment, the voice may
describe important clues or indicate where to find other clues that will then
help the
user to determine the identity of the spy.
In the illustrated embodiment, a tool is also provided that allows the user to
select certain image objects that are shown on a particular page of the book.
As
shown in Figure 7A, a selection tool 106c is used to select a particular
person within a
group of people as depicted within a photo on pages 104g and 104h. A
corresponding
audiovisual display may be generated based on the selected person. Within a
context
of a mystery, the audiovisual program may then indicate whether the selected
person
is the spy. As shown in Figure 7B, the audiovisual display 704 indicates that
the
selected person 714 is not the spy.
The tools may also be in the form of visual images that are provided
separately from the pages of the book. These separate images may be placed on
certain pages of the book. As shown in Figure 8A, a plurality of cards (e.g.,
106e and
106g) having different faces are provided. In this embodiment, the user
selects a
particular card or person 106g and places it within the center of page 104i.
In the abstract, this feature allows the user to add visual images to the
story
line. In this specific example, the user selects a card having a person that
they have
identified as the spy. An audiovisual program is then generated that indicates
whether
the user has correctly identified the spy. As shown in Figure 8B, the
audiovisual
display 704 indicates that the user has correctly identified the spy 716.
As described above, when a user moves a tool over a page of the book,
changes a state of an interactive device, or turns a page, a corresponding
audiovisual
program is generated. Any suitable mechanisms may be implemented for
generating
audiovisual programs that correspond to positions of various physical objects
within
the book system. In other words, the interfacing software for the physical
book may
include any suitable operations that accomplish the goal of generating
appropriate
audiovisual displays and/or sounds that correspond to a user's interactions
with a
physical book system.
Figure 9 is a flowchart illustrating a process 300 of interfacing with a book
system, such as the book system in Figure 1, in accordance with one embodiment
of
the present invention. Initially, a book system is provided in operation 302.
In
operation 304, portions of the book system are scanned to extract data. The
extracted
data includes information regarding the physical objects (e.g., tools, pages,
and
interactive devices) of the book system. For example, the extracted data
includes
positions and orientation for each physical object. Also, the extracted data
may
include identifiers associated with each physical object.
In a book system that includes electromagnetic sensing technology, the data is
extracted by initiating an excitation signal at a predetermined frequency on
an
antenna. The excitation signal is then stopped, and it is determined whether a
detected signal is present on the antenna from one or more resonators within
the book
system. The detected signal may include positional information related to the
responding resonators of the book system. The predetermined frequency may
correspond to an identity of a responding resonator. A succession of
excitation
signals having different predetermined frequencies may be initiated on the
antenna to
detect the presence, position, and identities of multiple physical objects
having
different resonating frequencies.
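The succession of excitation signals can be sketched as a frequency sweep; the `excite` and `listen` hooks below are hypothetical stand-ins for the unspecified antenna hardware:

```python
def sweep(frequencies, excite, listen):
    """Detect which resonators are present by sweeping excitation frequencies.

    `excite(f)` drives the antenna at frequency f and then stops the
    excitation; `listen()` returns the ring-down response (or None if no
    resonator answers). Both are hypothetical hardware hooks. Because each
    physical object's resonator has its own frequency, the frequency that
    produces a response identifies the responding object.
    """
    detected = {}
    for f in frequencies:
        excite(f)            # transmit, then deactivate the excitation signal
        response = listen()  # a matching resonator keeps ringing briefly
        if response is not None:
            detected[f] = response
    return detected
```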
After the data is extracted from portions of the book, the extracted data is
interpreted in operation 306. For example, the extracted data may be analyzed
to
determine the relative positions and identities of various physical objects
within the
book unit. The extracted data may also be analyzed to determine the relative
positions and identities of the detected physical objects and various regions
of the
book unit. Operation 306 will be described in further detail in reference to
Figure 10.
After the data is interpreted, an audiovisual program is run that is based on
the
extracted data and any previously extracted and recorded data in operation
308. For
example, the relative position of regions of a particular page and the
previous physical
object positions may have been previously recorded and thereby accessed to
generate
a new audiovisual program. Operation 308 is explained in more detail below
with
reference to Figure 11.
The audiovisual program may include merely one visual frame or may include
one frame of a sequence of frames from a video. For example, the operation of
running an audiovisual program may initiate a continuous audiovisual sequence
(e.g.,
a QuickTime movie) or merely continue at a particular point within the movie.
By
way of another example, the audiovisual program may have only one frame. By
way
of another example, the audiovisual program may include an interactive game,
wherein the player is directed to perform certain tasks with the book and
physical
objects.
After the audiovisual program is initiated or continued, the extracted data
information is recorded in operation 310. After the extracted data is
recorded, it is
determined whether the end of the book has been reached in operation 312. This
may
be accomplished in any suitable manner. For example, when the book is closed,
this
may be detected and interpreted as the end of the book. If the end of the book
has not
been reached, the process returns to operation 304, where portions of the book
system
are scanned once again to extract data. After portions of the book system are
scanned
once again, operations 304 through 312 are repeated and may use the previously
recorded data. When the end of the book is reached, the process 300 ends until
it is
initiated again (e.g., when the book is opened).
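The scan/interpret/run/record loop of Figure 9 can be sketched as follows, with each operation supplied as a hypothetical callable:

```python
def run_book(scan, interpret, run_av, record, book_closed):
    """Top-level interfacing loop sketched from Figure 9 (hooks are hypothetical).

    Repeatedly: scan the book system for data (operation 304), interpret it
    (operation 306), run the audiovisual program using current and previously
    recorded data (operation 308), record the extracted data (operation 310),
    and stop when the end of the book is reached, e.g. the book is closed
    (operation 312).
    """
    history = []
    while True:
        data = scan()                      # operation 304
        interpretation = interpret(data)   # operation 306
        run_av(interpretation, history)    # operation 308
        record(data)                       # operation 310
        history.append(data)
        if book_closed(data):              # operation 312
            break
    return history
```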
Figure 10 is a flow chart illustrating the operation 306 of Figure 9 of
interpreting the extracted data in accordance with one embodiment of the
present
invention. Initially, the positions of any physical objects that are present
are
determined in operation 402. The book system may be designed such that
physical
objects may be placed on the pages of the book one at a time, or so that two
or more
physical objects may be placed simultaneously on the book pages. The
identifiers of
any physical objects are determined in operation 404. For example, it may be
determined that a physical object is in the form of a magnifying device.
It is then determined whether any physical objects are associated with a region or "hot spot" on a particular page in operation 406. In other words, certain areas of each page may be defined as interactive zones. That is, when a physical object is placed within a hot spot, a corresponding audiovisual program is generated.
Alternatively, an entire page may be defined as an interactive zone. The hot
spots
that are associated with a physical object are then identified in operation
408. For
example, the page number and portion associated with the physical object may be identified to determine what type of interaction will take place with the particular physical object. By way of specific example, a portion of a particular page may be identified as being magnified (i.e., by the audiovisual program) when a
magnifying
tool is placed on it.
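Operations 406 and 408 amount to a point-in-region test; the following minimal sketch uses illustrative rectangular hot spots (the layout and identifiers are assumptions):

```python
def find_hot_spot(obj_pos, hot_spots):
    """Return the id of the hot spot containing a physical object, if any.

    `hot_spots` maps a hot-spot id to an axis-aligned rectangle
    (x0, y0, x1, y1) on a page. An entire page can be defined as one
    interactive zone by making its rectangle cover the whole page.
    """
    x, y = obj_pos
    for spot_id, (x0, y0, x1, y1) in hot_spots.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return spot_id
    return None  # object is not over any interactive zone
```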
The states of any interactive devices are then determined in operation 410.
The interactive devices may be in any suitable form for indicating a different
state of
a physical object. For example, a physical object may include a button which
may be
pushed in or let out. It is then determined whether the button is pushed in
operation
410. The process then returns to operation 308 of Figure 9 where an
audiovisual
program is executed that is based on the interpretation of the extracted data.
It should be well understood by those skilled in the art that operations 402 through 410 may be executed in any order that is appropriate for the
particular application. For example, operation 402 may be executed subsequent
to
operation 404. Additionally, other information from the extracted data may be
interpreted or not interpreted depending on the particular requirements of the
particular application. For example, if the book system does not include any
interactive devices, operation 410 is not required. By way of another example,
an
operation to determine the identification of the book itself may be included
in order to
aid in choosing which audiovisual program will be run in operation 308 of
Figure 9.
After the extracted data is interpreted, operation 308 of Figure 9 is
executed.
Figure 11 is a flow chart illustrating the operation 308 of Figure 9 for
executing the audiovisual program in accordance with one embodiment of the
present
invention. Initially, it is determined whether there are any physical objects
associated
with any hot spot regions in operation 502. A hot spot is defined as a
particular
portion on a particular page or an entire page. The hot spot regions may be
defined as
regions where moveable objects may be placed or where fixed interactive
devices,
such as a fixed push button, are located. If there are no physical objects
associated
with a hot spot region, the process returns to operation 310 of Figure 9, and
the
extracted data is recorded.
If there are physical objects associated with a hot spot, a physical object is
selected in operation 504. A physical object may be selected in any suitable
manner.
For example, a physical object which was placed upon the page first may be
selected
first. A corresponding audiovisual program is then executed in operation 506.
The
audiovisual program is based on the selected physical object, the physical
object
identifier, a state of any associated interactive device, and the associated
hot spot
identifier. In other words, the audiovisual program depends on which physical
object
is selected and where such physical object is positioned in relation to hot
spot regions.
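Operations 502 through 508 can be sketched as a dispatch loop; keying the program table on (object id, hot-spot id, device state) is an illustrative assumption:

```python
def run_programs(objects, programs):
    """Run the audiovisual program for each object on a hot spot (Figure 11).

    Each object is a dict holding its identifier, its associated hot-spot id
    (None when it is on no hot spot), and the state of any interactive device;
    `programs` maps (object id, hot-spot id, device state) to a program name.
    All names here are illustrative assumptions.
    """
    executed = []
    for obj in objects:                # operations 504/508: select each in turn
        if obj["hot_spot"] is None:    # operation 502: skip objects off hot spots
            continue
        key = (obj["id"], obj["hot_spot"], obj.get("state"))
        program = programs.get(key)
        if program is not None:        # operation 506: run the matching program
            executed.append(program)
    return executed
```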
Although not shown in Figure 11, the audiovisual program may also be based
on whether or not a page is currently being turned, regardless of whether or
not a
physical object is present. In one embodiment, if it is determined that a page
is being
turned, an interstitial audiovisual segment is generated. By way of a specific
example, if a first page having a jungle scene is turned to a second page
having a
desert scene, the interstitial audiovisual segment may include a video clip
that shows
a character leaving the jungle via an airplane, flying on the airplane from
the jungle to
the desert, and landing within the desert in the airplane.
After the audiovisual program runs its course, it is then determined whether
there are more physical objects to select in operation 508. If there are more physical objects to select, another is selected in operation 504. Another audiovisual
program is then
run in operation 506. However, if there are no more physical objects to
select, the
process returns to operation 310 of Figure 9 and the extracted data is
recorded for the
next execution of process 300.
Although the present invention has been described as being applicable to
interfacing with a book, of course, many of the described features may be
applied to
other types of entertainment devices, such as a board game or toy. For
example, some
of the described tools may be used within other types of entertainment
systems.
Figures 12, 13A, and 13B include digital photographs of an alternative
embodiment
of the present invention.
As shown in Figure 12, a game system 1200 includes a plurality of physical
objects (e.g., 1206a, 1204a, and 1202a) that may be positioned over a playing
surface
or base 1201. The base 1201 may be any shape and/or layout that is suitable
for a
particular entertainment device. In the embodiment shown, the base 1201 is a
board
game that includes flat portions and a raised portion (e.g., a bridge 1203).
The
surface pattern on the base 1201 may represent any suitable playing surface.
In the
embodiment shown, the base 1201 represents a backyard area.
The game system 1200 may include physical objects in the form of any of the
described tools of the present invention. Additionally, the game system may
include
any other suitable types of physical objects. In the illustrated embodiment,
the
physical objects may be categorized into three general types: character
objects,
environmental objects, and sensor objects. A character object is a physical
object that
represents a character within the game. For example, a character object may
represent
a person, animal, or any animated object. Generally, a character interacts
with other
game objects during the course of the game. Accordingly, the audiovisual
program
will include segments that show detected characters interacting with other
items based
in part on the position of the detected character object.
An environmental object generally represents a secondary or background item
or an environmental aspect of the game. In other words, the audiovisual program may also be based on the detected position of an environmental object, but the environmental object primarily influences the environment of the game, which is typically based more on the game's characters and their interactions with each other and/or their surrounding environment. For example, an environmental
object
may be represented in the audiovisual segment(s) as an ancillary object in
relation to a
character object, which is typically the main focus. By way of another
example, an
environmental object may affect conditions or parameters of the audiovisual
segments. Several examples of environmental objects are further described in
U.S.
Patent Application No. 09/018,023 filed February 2, 1998, entitled "Computer Method and Apparatus for Interacting with a Physical System" by Piernot, et al., which application is herein incorporated by reference in its entirety.
The sensor object is generally a physical object that, when detected, causes the audiovisual program to generate images and/or sounds within one or more audiovisual segment(s) that are not perceived within the physical game system
(e.g.,
not displayed as part of the base pattern). For example, as described above,
visual or
auditory clues may be presented through the audiovisual program. By way of
another
example, the audiovisual segment(s) may include an image or sound that is
related to
a pattern on the base. In one embodiment, the audiovisual segment displays an
object of the type that would be detected by the sensing device represented by
the sensor object in use. For example, when a sensor object represents a
magnifying
glass, the audiovisual segment will include a magnified view of a portion of
the base
that is located under the sensor object.
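By way of illustration only, the magnifying-glass behavior described above may be sketched as follows. Every name in this sketch (base_pattern_at, sensor_segment, the grid model of the base, and the returned strings) is an assumption introduced for clarity, not a feature of the described system:

```python
# Illustrative sketch only: the grid model of the base, the function names,
# and the returned strings are assumptions, not part of the described system.

def base_pattern_at(base, x, y):
    """Return the base pattern cell at grid position (x, y)."""
    return base[y][x]

def sensor_segment(base, sensor_type, x, y):
    """Choose the audiovisual content revealed by a sensor object at (x, y)."""
    cell = base_pattern_at(base, x, y)
    if sensor_type == "magnifying_glass":
        # A magnifying-glass sensor reveals a magnified view of the base
        # portion located under it.
        return f"magnified view of {cell}"
    # Other sensor types reveal content not shown on the base itself.
    return f"hidden clue near {cell}"

base = [["grass", "path"], ["flowers", "pond"]]
print(sensor_segment(base, "magnifying_glass", 1, 0))  # magnified view of path
```

The key point the sketch captures is that the revealed content depends on both the sensor object's type and its detected position over the base.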

CA 02360940 2001-07-23
WO 00/45250 PCT/US00/02342
In the embodiment shown in Figure 12, the game system 1200 includes a
character object 1202a in the form of a little girl that moves over the base
1201 of the
game, which is in the form of the little girl's backyard. The character object
can take
any suitable form such as a person, an animal, a mythical creature or any kind
of
animated object. The game system 1200 also includes a sensor object 1204a in
the
form of a "magic crystal" that is used to reveal objects (i.e., via the
audiovisual
program) that are "buried" in the backyard 1201. The game system 1200 also
includes several environmental objects in the form of a gazebo 1210, a tree
1206a, a
hiding place 1208a, a tree stump 1212, and a fountain 1214.
The movement of some environmental objects may be constrained. For example,
the tree stump 1212 is constrained in the x, y, and z positions, but may be
rotated to
select various game options (e.g., difficulty level). These game options may
affect
portions of one or more audiovisual segments or globally affect all
audiovisual
segments. Some environmental objects may be movable and positioned anywhere on
the base (e.g., the tree 1206a). In one embodiment, the audiovisual program
disregards the environmental object's presence (e.g., the object is not
represented
within the audiovisual segment) unless it is positioned close to the character
object
1202a.
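These two behaviors, a rotation-constrained object selecting a game option and a movable object that is displayed only near the character, may be sketched as follows. The function names, thresholds, and difficulty levels are assumptions; the description above specifies none of these values:

```python
# Minimal sketch under assumed names and arbitrary thresholds; the patent
# does not specify these values.
import math

def difficulty_from_rotation(angle_deg, levels=("easy", "medium", "hard")):
    """Map a rotation-constrained object (e.g. the tree stump) to a game option."""
    return levels[int(angle_deg % 360) * len(levels) // 360]

def is_displayed(env_pos, char_pos, threshold=2.0):
    """Represent an environmental object only when near the character object."""
    return math.dist(env_pos, char_pos) <= threshold

print(difficulty_from_rotation(200))  # medium
print(is_displayed((1, 1), (2, 2)))   # True
```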
Figure 13A is a screen shot illustrating a display output 1300 from an
audiovisual program based on the character object 1202a, the environmental
objects
1206a (the tree) and 1208a (the hiding place), and the sensor object 1204a
(magic
crystal) of Figure 12 in accordance with one embodiment of the present
invention. As
shown in the audiovisual segment 1300, a character 1202b that corresponds to
the
physical character object 1202a is displayed. The environmental objects that
are
positioned within a predetermined distance from the physical character 1202a
are also
displayed. As shown, a tree 1206b and a hiding place 1208b are displayed
relative to
the character 1202b.
Additionally, an enhanced view 1204b is displayed based on the position of
the sensor object 1204a. In the illustrated embodiment, the sensor object is
in the
form of a "magic crystal" that is deemed to detect underground objects (as
shown, a
mole 1204b) that are located underneath the base 1201. In other words, it
appears as
if the user is using the "magic crystal" 1204a to detect objects beneath the
base 1201
by moving the "magic crystal" 1204a across an area of the base. As the sensor
object
1204a is moved across the base 1201, an underground view, which includes any
underground objects, of the base area is simulated and displayed within the
audiovisual segment. In sum, object detection within the physical game base
1201
via the sensor object 1204a is simulated by the audiovisual program.
The present invention may also include a gesture recognizer for identifying
gestures that are made using any of the physical objects. In other words, a
particular
movement sequence by a physical object may be detected and used to generate an
appropriate audiovisual segment. By way of example, the user may initiate a
"jump
rope" gesture by lifting the character object 1202a off of the base 1201. The
character is then represented within an audiovisual segment that includes a
turning
jump rope. The displayed character's jumping movements correspond to the
physical
character's jumping gesture. In other words, the displayed character's
movements are
synchronized with the physical character's movements.
The audiovisual segment may also include other features that are based on the
character object's movement. For example, if the physical character (and
displayed
character) fails to synchronize their jumps with the displayed jump rope's
turning, the
displayed character is shown as tripping over the rope or getting hit in the
head with
the rope.
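A gesture recognizer of this kind may be sketched as follows; the sampling model, the lift threshold, and the function name are illustrative assumptions, not the patent's actual recognizer:

```python
# Sketch of a lift-off gesture recognizer; the threshold, the sampling
# model, and the names are illustrative assumptions.

LIFT_THRESHOLD = 5  # z height (arbitrary units) treated as "off the base"

def detect_gesture(z_positions):
    """Classify a sampled sequence of the character object's z positions.

    Repeated rises above the threshold are read as a "jumping rope" gesture.
    """
    lifts = sum(1 for z in z_positions if z > LIFT_THRESHOLD)
    return "jump_rope" if lifts >= 2 else None

print(detect_gesture([0, 8, 0, 9, 0]))  # jump_rope
print(detect_gesture([0, 1, 0]))        # None
```

Once a gesture is recognized, the corresponding audiovisual segment (here, the jump-rope activity) would be activated.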
Figure 13B is a screen shot illustrating a display output 1301 from an
audiovisual program based on a "jumping rope" gesture performed by character
object 1202a of Figure 12 in accordance with one embodiment of the present
invention. In the illustrated embodiment, a "jumping rope" gesture may be
initiated
by lifting the physical character 1202a off the base 1201. As shown in Figure
13B, a
displayed character 1202c that corresponds to the physical character 1202a is
shown
as engaging in a jumping rope activity. Two additional displayed characters
1302a
and 1302b hold and turn a jump rope 1304, and the main character 1202c jumps
or
trips over the rope or the rope strikes the upper body of the character 1202c,
depending on whether or not the character's jumps are synchronized with the
rope's
turning movements.
Figure 14 is a flowchart illustrating the process 1400 of analyzing a "jumping
rope" gesture of the character object of Figure 12 in accordance with one
embodiment of the present invention. Initially, the physical character
object's current
position (e.g., 1202a of Figure 12) is obtained in operation 1402. An
audiovisual
program then displays the character (e.g., 1202b of Figure 13A) based on the
obtained
position in operation 1404. It is then determined whether the physical
character's
(e.g., 1202a) position has changed in operation 1406. This determination step
is
repeated until the character changes position.
When it is determined that the character has changed position, it is then
determined whether the character's movement indicates a "jumping rope" gesture
in
operation 1410. Any suitable type of movement may indicate a gesture. For
example, repeatedly moving the character up and down in the z direction may
indicate a "jumping rope" gesture. A new audiovisual segment may then be
activated based
on
the detected gesture. In the illustrated example, if a "jumping rope" gesture
is
initiated, the character (e.g., 1202c of Figure 13B) is then displayed as
jumping rope
(e.g., see Figure 13B) based on the position of the character.
The physical character's position is again repeatedly checked until a
movement is detected in operation 1414. After the physical character (e.g.,
1202a)
moves to a new position (e.g., changes its z position), it is then determined
whether
the physical character's movement is synchronized with the displayed jump
rope's
turning movements. That is, it is determined whether the physical object is
correctly
jumping over the displayed jump rope (e.g., 1304). In one embodiment, the
physical
character's z position is compared to the displayed jump rope's z position to
determine whether the displayed character has cleared the rope (or the rope
has
cleared the displayed character's head). This determination operation is
repeated until
the physical character becomes unsynchronized with the displayed jump rope's
timing. An appropriate audiovisual segment is then displayed for the physical
character's mistiming in operation 1418. For example, the displayed character
is
shown tripping over the displayed rope.
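The synchronization check of process 1400 may be sketched as a comparison of the character's z position with the displayed rope's z position. The rope model, the time units, and the segment names below are assumptions introduced for illustration:

```python
# Sketch of the synchronization check of process 1400; the rope model,
# time units, and names are assumptions introduced for illustration.

def rope_height(t, period=4):
    """Toy model of the turning rope: low for half of each period, high otherwise."""
    return 0 if (t % period) < period // 2 else 3

def jump_outcome(t, char_z):
    """Compare the character's z position with the rope's to pick a segment."""
    if char_z > rope_height(t):
        return "clears rope"    # jump synchronized with the rope's turning
    return "trips over rope"    # mistimed jump: display the tripping segment

print(jump_outcome(0, 2))  # clears rope (rope is low)
print(jump_outcome(3, 2))  # trips over rope (rope is high)
```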
Although the foregoing invention has been described in some detail for
purposes of clarity of understanding, it will be apparent that certain changes
and
modifications may be practiced within the scope of the appended claims. For
example, although each page and object is described as having a corresponding
marker (e.g., a resonator), some of the pages or tools may not include a
marker if they are not relevant to generating the audiovisual program.
Additionally, although only a specific set of tool types is described, the
entertainment system may include any other tool types. For
example, the
system may include an x-ray tool that may be moved over any other physical
object,
such as a person or animal figure. As the user moves the x-ray tool over a
portion of
the figure, the audiovisual program will display an x-ray or inside view of
the figure
portion. As described above in reference to the game system of Figures 12
through
14, the x-ray tool may also be used to reveal objects that are deemed to be
hidden
under the game surface (e.g., buried objects). Similarly, the tool may have a
stethoscope function, wherein a heartbeat, or other sound, may be heard as
the
stethoscope moves over the figure. By way of another example, the
entertainment
system may also include a flashlight tool that may also be used to reveal
hidden
objects within the physical game system. In one embodiment, as the flashlight
tool
moves over a surface, an audiovisual segment will include a lighted portion
that may
include other previously hidden objects of the surface or game base. As a
final example, a tool may function as a metal detector.
Also, the described interstitial feature may be applied to other types of
physical object movements, besides page turning. For example, when a physical
object in the form of a helicopter leaves the base at a first position and
returns to the
base at a second position, an appropriate interstitial audiovisual segment may
be
generated that shows a helicopter taking off from a first area, flying through
the air,
and landing at a second area. The interstitial segment may also be based on
the
amount of time that the helicopter object is in the air. For example, if the
helicopter
leaves the base for a relatively long amount of time, the interstitial segment
may show
the helicopter travelling a great distance (e.g., from one country to
another).
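The duration-based selection of an interstitial segment may be sketched as follows; the thresholds and segment descriptions are illustrative assumptions, as none of these values appear in the description above:

```python
# Illustrative duration thresholds and segment descriptions; none of these
# values appear in the patent's description.

def interstitial_segment(seconds_airborne):
    """Pick an interstitial flight segment from how long the object was off the base."""
    if seconds_airborne < 5:
        return "short hop across the yard"
    if seconds_airborne < 30:
        return "flight across town"
    return "long journey to another country"

print(interstitial_segment(3))   # short hop across the yard
print(interstitial_segment(60))  # long journey to another country
```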
The gesture features may also be implemented within the described book
system, as well as the embodiment illustrated in Figures 12 through 13. For
example,
the book system may be configured to recognize gestures of a particular tool.
By way
of specific example, a digging tool may be used to make a " scratching"
movement
across a selected page portion. A corresponding audiovisual segment is then
generated to reveal previously hidden objects. That is, hidden objects are
uncovered
via the audiovisual program in response to making a "scratching" gesture over
the
hidden object.
Additionally, the entertainment systems of the present invention may include
mechanisms for providing differently scaled audiovisual segments based on
movements of a physical object. That is, as a physical object moves a distance
within
different regions of the game, different audiovisual segments may be displayed
such
that each segment represents this distance in different scales. In a specific
example,
when the physical object moves a distance within a first region of the game,
the
audiovisual segment represents that distance with a first scale (e.g., a
zoomed out
view). In contrast, when the physical object moves the same distance within a
second
region, the audiovisual segment represents that distance with a second scale
(e.g., a
zoomed in view).
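Such region-dependent scaling may be sketched as a simple mapping from region to scale factor; the region names and numeric scales below are assumptions, as the description specifies neither:

```python
# Sketch with assumed region names and scale factors; the description
# specifies neither the regions nor the numeric scales.

REGION_SCALES = {
    "region_1": 0.5,  # zoomed-out view: physical distance shrinks on screen
    "region_2": 4.0,  # zoomed-in view: the same distance appears much larger
}

def screen_distance(region, physical_distance):
    """Convert a physical move into the displayed distance for that region."""
    return REGION_SCALES[region] * physical_distance

print(screen_distance("region_1", 10))  # 5.0
print(screen_distance("region_2", 10))  # 40.0
```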
Accordingly, the present embodiments are to be considered as illustrative and
not restrictive, and the invention is not to be limited to the details given
herein, but
may be modified within the scope and equivalents of the appended claims.

Administrative Status


Event History

Description Date
Inactive: IPC expired 2019-01-01
Inactive: IPC expired 2014-01-01
Inactive: IPC expired 2014-01-01
Inactive: IPC from MCD 2006-03-12
Inactive: IPC from MCD 2006-03-12
Inactive: IPC from MCD 2006-03-12
Inactive: IPC from MCD 2006-03-12
Inactive: IPRP received 2004-03-10
Application Not Reinstated by Deadline 2003-10-24
Inactive: Dead - No reply to Office letter 2003-10-24
Deemed Abandoned - Failure to Respond to Maintenance Fee Notice 2003-01-31
Inactive: Status info is complete as of Log entry date 2002-12-06
Inactive: Abandoned - No reply to Office letter 2002-10-24
Inactive: Applicant deleted 2002-05-15
Inactive: Cover page published 2001-12-11
Inactive: Courtesy letter - Evidence 2001-12-04
Inactive: First IPC assigned 2001-11-29
Inactive: Notice - National entry - No RFE 2001-11-29
Application Received - PCT 2001-11-19
Inactive: Correspondence - Formalities 2001-11-16
Inactive: Correspondence - Transfer 2001-11-16
Application Published (Open to Public Inspection) 2000-08-03

Abandonment History

Abandonment Date Reason Reinstatement Date
2003-01-31

Maintenance Fee

The last payment was received on 2001-12-05


Fee History

Fee Type Anniversary Year Due Date Paid Date
Basic national fee - standard 2001-07-23
MF (application, 2nd anniv.) - standard 02 2002-01-31 2001-12-05
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
INTERLEGO AG
Past Owners on Record
ADAM C. JORDON
CHRISTOPHER H. SCHMIDT
DAVID W. LAITURI
DESPINA PAPADOPOULOS
JENNY DANA WIRTSCHAFTER
PHILIPPE P. PIERNOT
ROBIN G. PETRAVIC
WILLIAM A. CESAROTTI
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents



Document Description / Date (yyyy-mm-dd) / Number of pages / Size of Image (KB)
Representative drawing 2001-12-02 1 5
Drawings 2001-07-22 23 2,655
Description 2001-07-22 28 1,679
Claims 2001-07-22 13 714
Abstract 2001-07-22 1 58
Reminder of maintenance fee due 2001-11-28 1 112
Notice of National Entry 2001-11-28 1 195
Request for evidence or missing transfer 2002-07-23 1 109
Courtesy - Abandonment Letter (Office letter) 2002-11-27 1 167
Courtesy - Abandonment Letter (Maintenance Fee) 2003-03-02 1 179
PCT 2001-07-22 17 649
Correspondence 2001-11-28 1 24
Correspondence 2001-11-15 3 109
PCT 2001-10-04 2 105
PCT 2001-07-23 11 440