Patent 2915024 Summary

(12) Patent Application: (11) CA 2915024
(54) English Title: ENHANCED ELECTRONIC GAMING MACHINE WITH X-RAY VISION DISPLAY
(54) French Title: MACHINE DE JEU ELECTRONIQUE EQUIPEE D'UN AFFICHEUR A VISON DE RAYON X
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • G07F 17/32 (2006.01)
  • A63F 13/21 (2014.01)
  • A63F 13/22 (2014.01)
(72) Inventors :
  • COREY, AARON (Canada)
  • FROY, DAVID (Canada)
(73) Owners :
  • IGT CANADA SOLUTIONS ULC (Canada)
(71) Applicants :
  • IGT CANADA SOLUTIONS ULC (Canada)
(74) Agent: SMART & BIGGAR LP
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2015-12-11
(41) Open to Public Inspection: 2017-06-11
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract


An electronic gaming machine to play an interactive game where a player's eye gaze acts as x-ray vision. A graphics processor generates an interactive game environment and defines a viewing area as its subset. A display device displays the viewing area, where a visible game component masks an invisible game component. A display controller controls rendering of the viewing area on the display device using the graphics processor. At least one data capture camera device continuously monitors player eye gaze to collect player eye gaze data. A game controller calculates a player eye gaze location relative to the viewing area, the location corresponding to the invisible game component, and triggers a control command to the display controller. In response, the display controller controls the display device in real-time to provide a graphical animation effect representative of a visual update to the visible game component to reveal the invisible game component.


Claims

Note: Claims are shown in the official language in which they were submitted.


WHAT IS CLAIMED IS:
1. An electronic gaming machine comprising:
a card reader to identify a monetary amount conveyed by a token to the electronic gaming machine;
at least one data storage device to store game data for an interactive game;
a graphics processor to generate an interactive game environment in accordance with the game data and define a viewing area as a subset of the interactive game environment, the viewing area having a visible game component masking or blocking an invisible game component;
a display device to display via a user interface the viewing area;
a display controller to control rendering of the viewing area on the display device using the graphics processor;
at least one data capture camera device to continuously monitor eye gaze of a player to collect player eye gaze data relative to the display device;
a game controller for calculating a location of the eye gaze of the player relative to the viewing area using the player eye gaze data, the location of the eye gaze corresponding to the invisible game component, and triggering a control command to the display controller to dynamically update the rendering of the viewing area based on the player eye gaze data and the location of the eye gaze;
in response to the control command, the display controller controls the display device in real-time or near real-time using the graphics processor to dynamically update the rendering of the viewing area to provide a real-time or near real-time graphical animation effect displayed on the display device representative of a visual update to the visible game component to reveal the invisible game component in the viewing area; and
in response to an outcome of the interactive game, the card reader updates the monetary amount using the token.
2. The electronic gaming machine of claim 1, wherein the display controller controls the display device to display a plurality of calibration symbols, wherein the at least one data capture camera device monitors the eye gaze of the player in relation to the calibration symbols to collect calibration data, and wherein the game controller calibrates the at least one data capture camera device and the display device based on the calibration data.
3. The electronic gaming machine of claim 1 or claim 2, wherein the player eye gaze data comprises a position and a focus, the position defined as coordinates of the player's eyes relative to the display device, the focus defined as a line of sight relative to the display device.
4. The electronic gaming machine of any one of claims 1 to 3, wherein the game controller determines the location of the eye gaze of the player relative to the viewing area by identifying coordinates on the display device corresponding to the player eye gaze data and mapping the coordinates to the viewing area.
5. The electronic gaming machine of any one of claims 1 to 4, wherein the game controller defines a filter movement threshold, wherein the game controller, prior to determining the location of the eye gaze of the player relative to the viewing area and triggering the control command to the display controller to dynamically update the rendering of the viewing area, determines that the player eye gaze meets the filter movement threshold.
6. The electronic gaming machine of any one of claims 1 to 5, wherein the game controller predicts the location of the eye gaze of the player relative to the viewing area at a future time using the player eye gaze data to facilitate dynamic update of the rendering of the viewing area.
7. The electronic gaming machine of any one of claims 1 to 6, wherein the at least one data capture camera device continuously monitors an area proximate to the electronic gaming machine to collect proximity data, wherein the game controller detects a location of the player relative to the electronic gaming machine based on the proximity data, and triggers the display controller to display an advertisement on the display device.
8. The electronic gaming machine of any one of claims 1 to 7, wherein the display controller renders a gaze-sensitive user interface on the display device, wherein the game controller detects the location of the eye gaze of the player relative to the gaze-sensitive user interface using the player eye gaze data, and triggers the display controller to dynamically update the rendering of the gaze-sensitive user interface to provide a real-time or near real-time graphical animation effect displayed on the display device representative of a visual update to the gaze-sensitive user interface.
9. The electronic gaming machine of any one of claims 1 to 8, wherein the graphics processor generates left and right eye images based on a selected three-dimensional intensity level, wherein the display device is a stereoscopic display device, and wherein the game controller triggers the control command to the display controller to dynamically update the rendering of the left and right eye images based on the player eye gaze data.
10. The electronic gaming machine of any one of claims 1 to 9, wherein the graphical animation effect represents looking behind the visible game component masking or blocking the invisible game component to reveal the invisible game component.
11. The electronic gaming machine of any one of claims 1 to 10, wherein the graphical animation effect represents selecting the revealed invisible game component.
12. The electronic gaming machine of any one of claims 1 to 11, wherein the graphical animation effect represents seeing through or rendering transparent the visible game component masking or blocking the invisible game component to reveal the invisible game component.
13. The electronic gaming device of any one of claims 1 to 12, wherein the game controller detects movement of the eye gaze to another location, the location corresponding to an additional invisible game component that is masked or blocked by the visible game component or another visible game component, and wherein the graphical animation effect represents updating the visible game component or the other visible game component to reveal the additional invisible game component.
14. The electronic gaming machine of any one of claims 1 to 13, wherein the at least one data capture camera device monitors an eye gesture of the player to collect player eye gesture data, and wherein the game controller triggers the control command to the display controller to dynamically update the rendering of the viewing area based on the player eye gesture data using the graphical animation effect to reveal the invisible game component in the viewing area based on the player eye gesture data.
15. The electronic gaming device of any one of claims 1 to 14, wherein the at least one data capture camera device is configured to collect player movement data associated with movement of the player's head and wherein the graphical animation effect reveals the invisible game component based on the movement of the player's head.
16. The electronic gaming device of any one of claims 1 to 15, wherein the at least one data capture camera device is configured to collect player movement data associated with movement of a part of the player's body and wherein the graphical animation effect reveals the invisible game component based on the movement of a part of the player's body.
17. The electronic gaming device of any one of claims 1 to 16, wherein the at least one data capture camera device is configured to collect player movement data associated with a gesture by the player and wherein the graphical animation effect reveals the invisible game component based on the gesture by the player.
18. The electronic gaming device of any one of claims 14 to 17, wherein the game controller detects the eye gesture of the player and the player movement relative to an additional location in the viewing area corresponding to another invisible game component using the player eye gesture data and player movement data, and triggers the display controller to dynamically update the rendering of the viewing area based on the player eye gesture data and player movement data using the graphical animation effect to reveal the other invisible game component in the viewing area.
19. The electronic gaming device of any one of claims 1 to 18, wherein the invisible game component is a graphical element with levers that is masked or blocked by the visible game component, wherein the location of the eye gaze data corresponds to the visible game component, wherein the graphical animation effect represents seeing through or rendering transparent the visible game component to reveal the graphical element with levers and manipulating the levers to move or rotate the graphical element based on the eye gaze data.
20. The electronic gaming device of any one of claims 1 to 19, wherein the invisible game component is a graphical element of a series of switches and circuits, wherein the graphical animation effect represents revealing a portion of the switches and circuits, and wherein the game controller detects selection of a switch or circuit in the portion of the switches and circuits using the eye gaze data, the selection triggering a prize award.
21. The electronic gaming device of any one of claims 1 to 20, wherein the graphics processor generates a fog effect within the viewing area masking or blocking the invisible game component, and wherein the graphical animation effect represents a transparent circle within the fog effect to reveal the invisible game component.
22. The electronic gaming device of claim 21, wherein the game controller detects the eye gaze at the location for a predetermined time period and wherein the graphical animation effect and the visual update represents expanding the transparent circle to reveal an additional invisible game component.
23. The electronic gaming device of claim 21 or claim 22, wherein the game controller detects movement of the eye gaze to another location, the another location corresponding to an additional invisible game component, and wherein the graphical animation effect and the visual update represents moving the transparent circle to reveal the additional invisible game component.
24. The electronic gaming device of any one of claims 1 to 23, wherein the invisible game component is a graphical element of one or more avatars carrying a hidden document, wherein the graphical animation effect represents revealing a portion of the graphical element of one or more avatars to reveal the hidden document, and wherein the game controller detects selection of the one or more avatars carrying the hidden document using the eye gaze data, the selection triggering a prize award.
25. The electronic gaming device of any one of claims 1 to 24, wherein the electronic gaming device is in communication with one or more other electronic gaming devices, and wherein the at least one data storage device stores game data for a primary multi-player interactive game and a bonus multi-player interactive game.
26. The electronic gaming device of claim 25, wherein the invisible game component is a bonus game component of a set of bonus game components, wherein the graphical animation effect represents revealing and selecting the first bonus game component, and wherein the game controller detects selection of a subset of bonus game components using the eye gaze data, the selection triggering a bonus prize award.
27. The electronic gaming device of claim 25 or claim 26, wherein the invisible game component is a first bonus game component of a set of bonus game components, wherein the graphical animation effect represents revealing and rejecting the bonus game component, and wherein the game controller detects rejection of the first bonus game component using the eye gaze data, the rejecting of the bonus game component triggering the display controller to display on the display device a second bonus game component and the display controller of the other electronic gaming device to display on the display device of the other electronic gaming device the first bonus game component.
28. The electronic gaming device of any one of claims 25 to 27, wherein the invisible game component is at least a portion of the viewing area of the other electronic gaming devices, the viewing area of the other electronic gaming devices having another visible game component, wherein the graphical animation effect represents seeing through or rendering transparent the visible game component to reveal the portion of the viewing area of the other electronic gaming devices, and wherein the game controller detects a bonus activation based on the visible game component and the another visible game component, the bonus activation triggering a bonus prize award.
29. The electronic gaming machine of any one of claims 1 to 28, wherein the viewing area has a plurality of invisible game components, and wherein the graphical animation effect and the visual update renders visible at least a portion of the invisible game components.
30. An electronic gaming machine comprising:
a card reader to identify a monetary amount conveyed by a token to the electronic gaming machine;
at least one data storage device to store game data for one or more primary interactive games and one or more bonus interactive games;
a graphics processor to generate an interactive game environment in accordance with the game data and define a viewing area as a subset of the interactive game environment, the viewing area having a visible game component masking or blocking an invisible game selector symbol;
a display device to display via a user interface the viewing area;
a display controller to control rendering of the viewing area on the display device using the graphics processor;
at least one data capture camera device to continuously monitor eye gaze of a player to collect player eye gaze data;
a game controller for calculating a location of the eye gaze of the player relative to the viewing area using the player eye gaze data, the location corresponding to the invisible game selector symbol, and triggering a control command to the display controller to dynamically update the rendering of the viewing area based on the player eye gaze data and the location;
in response to the control command, the display controller controls the display device in real-time or near real-time using the graphics processor to dynamically update the rendering of the viewing area to provide a real-time or near real-time graphical animation effect displayed on the display device representative of a visual update to the visible game component to reveal and select the invisible game selector symbol in the viewing area and displaying a selected interactive game for the selected invisible game selector symbol; and
in response to an outcome of the selected interactive game, the card reader updates the monetary amount.
31. An electronic gaming machine comprising:
a card reader to identify a monetary amount conveyed by a token to the electronic gaming machine;
at least one data storage device to store game data for an interactive game;
a graphics processor to generate an interactive game environment in accordance with a set of game rules using the game data and define a viewing area as a first subset of the interactive game environment, the first subset of the interactive game environment having a first visible game component masking or blocking a first invisible game component;
a display device to display via a user interface the viewing area;
a display controller to control rendering of the viewing area on the display device using the graphics processor;
at least one data capture camera device to continuously monitor eye gaze of a player to collect player eye gaze data;
a game controller for calculating a location of the eye gaze of the player relative to the viewing area using the player eye gaze data, the location corresponding to the invisible game component, and triggering a control command to the display controller to dynamically update the rendering of the viewing area based on the player eye gaze data and the location;
in response to the control command, the display controller controls the display device in real-time or near real-time using the graphics processor to dynamically update the rendering of the viewing area in real-time or near real-time to navigate to a second subset of the interactive game environment, the second subset of the interactive game environment having a second visible game component masking or blocking a second invisible game component, wherein the update comprises a graphical animation effect displayed on the display device representative of navigating to the second subset of the interactive game environment; and
in response to an outcome of the interactive game, the card reader updates the monetary amount.

Description

Note: Descriptions are shown in the official language in which they were submitted.


ENHANCED ELECTRONIC GAMING MACHINE WITH X-RAY VISION
DISPLAY
FIELD
[0001] Embodiments described herein relate to the field of electronic gaming
machines. The
embodiments described herein particularly relate to the field of providing an
enhanced electronic
gaming machine with an interactive display to provide x-ray vision effects
based on a player's
eye gaze.
INTRODUCTION
[0002] Casinos and other establishments may have video gaming terminals that
may include
game machines, online gaming systems (that enable users to play games using
computer
devices, whether desktop computers, laptops, mobile devices, tablet computers
or smart
phones), computer programs for use on a computer device (including desktop
computer,
laptops, mobile devices, tablet computers or smart phones), or gaming consoles
that are
connectable to a display such as a television or computer screen.
[0003] Video gaming terminals may be configured to enable users to play games
with a touch
interface. An example game may be a slot machine game, which may involve a reel
of symbols
that may move by pulling a lever to activate the reel of symbols. A user may
win a prize based
on the symbols displayed on the reel. In addition to slot machine games, video
gaming
machines may be configured to enable users to play a variety of different
types of games, which
may involve displaying one or more game components on a display screen. To
interact with a
game component of the game, the user may have to press a button that is part
of the machine
hardware, or the user may have to touch a button displayed on a display
screen.
[0004] A casino or other establishment has finite space, so it can operate
only a certain
number of video gaming terminals. The size of a video gaming terminal may be
limited by its
hardware, which may limit the number of game components, buttons, or
interfaces that can be
displayed on a display screen. This hardware limitation may also limit the
amount of and types
of physical interactions that a user may engage in with the machine to play
the game. For
convenience to the player, a casino or another establishment may want the
player to have
different experiences while playing at the same video gaming terminal.
However, since a video gaming terminal and its associated hardware have finite size, there may be a
limit on the
number of buttons or physical elements present on the gaming terminal.
[0005] There is a need to immerse the user in their gaming experience while at
the same
video gaming terminal, and there is a further need to make more efficient use of the
physically limited hardware of the video gaming terminal. Therefore, it is necessary to innovate
by launching new and engaging electronic gaming machines with improved hardware where the
player can interact with the interactive game using their eye gaze.
SUMMARY
[0006] In one aspect, there is provided an electronic gaming machine that
comprises a card
reader to identify a monetary amount conveyed by a player to the electronic
gaming machine, at
least one data storage device to store game data for an interactive game, a
graphics processor
to generate an interactive game environment in accordance with a set of game
rules using the
game data and define a viewing area as a subset of the interactive game
environment, the
viewing area having a visible game component masking or blocking an invisible
game
component, a display device to display via a user interface the viewing area,
a display controller
to control rendering of the viewing area on the display device using the
graphics processor, at
least one data capture camera device to continuously monitor eye gaze of a
player to collect
player eye gaze data, and a game controller for calculating a location of the
eye gaze of the
player relative to the viewing area using the player eye gaze data, the
location corresponding to
the invisible game component, and triggering a control command to the display
controller to
dynamically update the rendering of the viewing area based on the player eye
gaze data and
the location. In response to the control command, the display controller
controls the display
device in real-time or near real-time using the graphics processor to
dynamically update the
rendering of the viewing area to provide a real-time or near real-time
graphical animation effect
displayed on the display device representative of a visual update to the
visible game component
to reveal the invisible game component in the viewing area. In response to an
outcome of the
interactive game, the card reader updates the monetary amount.
[0007] In some embodiments, the display controller controls the display device
to display a
plurality of calibration symbols, wherein the at least one data capture camera
device monitors
the eye gaze of the player in relation to the calibration symbols to collect
calibration data, and wherein the game controller calibrates the at least one data capture camera
device and the
display device based on the calibration data.
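As a rough illustration of such a calibration (not from the patent), the following Python sketch fits a per-axis least-squares correction from raw gaze readings to the known screen positions of calibration symbols; the function names, pixel values, and the simple linear model are all assumptions:

def fit_axis(raw, truth):
    # Least-squares fit: truth ~ a * raw + b, for a single screen axis.
    n = len(raw)
    mean_raw = sum(raw) / n
    mean_truth = sum(truth) / n
    var = sum((x - mean_raw) ** 2 for x in raw)
    cov = sum((x - mean_raw) * (y - mean_truth) for x, y in zip(raw, truth))
    a = cov / var
    b = mean_truth - a * mean_raw
    return a, b

# Calibration symbols at known x positions, and the raw gaze x readings
# recorded while the player looked at each symbol:
symbol_x = [100, 960, 1820]
raw_gaze_x = [140, 930, 1710]
a, b = fit_axis(raw_gaze_x, symbol_x)
corrected = a * 1300 + b  # corrected estimate for a raw reading of 1300
print(round(corrected))

A production calibration would fit both axes jointly and reject outliers; this sketch only shows the shape of the computation.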
[0008] In some embodiments, the player eye gaze data comprises a position and
a focus, the
position defined as coordinates of the player's eyes relative to the display
device, the focus
defined as a line of sight relative to the display device.
[0009] In some embodiments, the game controller determines the location of the
eye gaze of
the player relative to the viewing area by identifying coordinates on the
display device
corresponding to the player eye gaze data and mapping the coordinates to the
viewing area.
[0010] In some embodiments, the game controller defines a filter movement
threshold,
wherein the game controller, prior to determining the location of the eye gaze
of the player
relative to the viewing area and triggering the control command to the display
controller to
dynamically update the rendering of the viewing area, determines that the
player eye gaze
meets the filter movement threshold.
[0011] In some embodiments, the game controller predicts the location of the
eye gaze of the
player relative to the viewing area at a future time using the player eye gaze
data to facilitate
dynamic update to the rendering of the viewing area.
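To make the filter movement threshold of paragraph [0010] and the gaze prediction of this paragraph concrete, here is a minimal Python sketch; the pixel threshold, the constant-velocity extrapolation, and all names are illustrative assumptions rather than the patent's method:

import math

FILTER_MOVEMENT_THRESHOLD_PX = 40.0  # hypothetical jitter threshold in pixels

def meets_movement_threshold(prev_xy, new_xy, threshold=FILTER_MOVEMENT_THRESHOLD_PX):
    # Act on the gaze only when it has moved far enough to matter.
    dx = new_xy[0] - prev_xy[0]
    dy = new_xy[1] - prev_xy[1]
    return math.hypot(dx, dy) >= threshold

def predict_gaze(samples, horizon_s=0.05):
    # Extrapolate the gaze 'horizon_s' seconds ahead using the velocity
    # between the two most recent (timestamp_s, x, y) samples.
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    if dt <= 0:
        return (x1, y1)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * horizon_s, y1 + vy * horizon_s)

# A small jitter is filtered out; a larger movement passes the threshold.
print(meets_movement_threshold((100, 100), (110, 105)))  # False
print(meets_movement_threshold((100, 100), (300, 250)))  # True
print(predict_gaze([(0.00, 100, 100), (0.02, 120, 104)]))  # (170.0, 114.0)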
[0012] In some embodiments, the at least one data capture camera device
continuously
monitors an area proximate to the electronic gaming machine to collect
proximity data, wherein
the game controller detects a location of the player relative to the
electronic gaming machine
based on the proximity data, and triggers the display controller to display an
advertisement on
the display device.
[0013] In some embodiments, the display controller renders a gaze-sensitive
user interface
on the display device, wherein the game controller detects the location of the
eye gaze of the
player relative to the gaze-sensitive user interface using the player eye gaze
data, and triggers
the display controller to dynamically update the rendering of the gaze-
sensitive user interface to
provide a real-time or near real-time graphical animation effect displayed on
the display device
representative of a visual update to the gaze-sensitive user interface.
[0014] In some embodiments, the graphics processor generates left and right
eye images
based on a selected three-dimensional intensity level, wherein the display
device is a stereoscopic display device, and wherein the game controller triggers the
control command to
the display controller to dynamically update the rendering of the left
and right eye images
based on the player eye gaze data.
[0015] In some embodiments, the graphical animation effect represents looking
behind the
visible game component masking or blocking the invisible game component to
reveal the
invisible game component.
[0016] In some embodiments, the graphical animation effect represents
selecting the
revealed invisible game component.
[0017] In some embodiments, the graphical animation effect represents seeing
through or
rendering transparent the visible game component masking or blocking the
invisible game
component to reveal the invisible game component.
[0018] In some embodiments, the game controller detects movement of the eye
gaze to
another location, the location corresponding to an additional invisible game
component that is
masked or blocked by the visible game component or another visible game
component, and
wherein the graphical animation effect represents updating the visible game
component or the
other visible game component to reveal the additional invisible game
component.
[0019] In some embodiments, the at least one data capture camera device
monitors an eye
gesture of the player to collect player eye gesture data, and wherein the game
controller triggers
the control command to the display controller to dynamically update the
rendering of the viewing
area based on the player eye gesture data using the graphical animation effect
to reveal the
invisible game component in the viewing area based on the player eye gesture
data.
[0020] In some embodiments, the at least one data capture camera device is
configured to
collect player movement data associated with movement of the player's head and
wherein the
graphical animation effect reveals the invisible game component based on the
movement of the
player's head.
[0021] In some embodiments, the at least one data capture camera device is
configured to
collect player movement data associated with movement of a part of the
player's body and
wherein the graphical animation effect reveals the invisible game component
based on the
movement of a part of the player's body.

[0022] In some embodiments, the at least one data capture camera device is
configured to
collect player movement data associated with a gesture by the player and
wherein the graphical
animation effect reveals the invisible game component based on the gesture by
the player.
[0023] In some embodiments, the game controller detects the eye gesture of the
player and
the player movement relative to an additional location in the viewing area
corresponding to
another invisible game component using the player eye gesture data and player
movement
data, and triggers the display controller to dynamically update the rendering
of the viewing area
based on the player eye gesture data and player movement data using the
graphical animation
effect to reveal the other invisible game component in the viewing area.
[0024] In some embodiments, the invisible game component is a graphical
element with
levers that is masked or blocked by the visible game component, wherein the
location of the eye
gaze data corresponds to the visible game component, wherein the graphical
animation effect
represents seeing through or rendering transparent the visible game component
to reveal the
graphical element with levers and manipulating the levers to move or rotate
the graphical
element based on the eye gaze data.
[0025] In some embodiments, the invisible game component is a graphical
element of a
series of switches and circuits, wherein the graphical animation effect
represents revealing a
portion of the switches and circuits, and wherein the game controller detects
selection of a
switch or circuit in the portion of the switches and circuits using the eye
gaze data, the selection
triggering a prize award.
[0026] In some embodiments, the graphics processor generates a fog effect
within the
viewing area masking or blocking the invisible game component, and wherein the
graphical
animation effect represents a transparent circle within the fog effect to
reveal the invisible game
component.
[0027] In some embodiments, the game controller detects the eye gaze at the
location for a
predetermined time period and wherein the graphical animation effect and the
visual update
represents expanding the transparent circle to reveal an additional invisible
game component.
[0028] In some embodiments, the game controller detects movement of the eye
gaze to
another location, the location corresponding to an additional invisible game
component, and
wherein the graphical animation effect and the visual update represents moving
the transparent
circle to reveal the additional invisible game component.
[0029] In some embodiments, the invisible game component is a graphical
element of one or
more avatars carrying a hidden document, wherein the graphical animation
effect represents
revealing a portion of the avatars to reveal the hidden document, and wherein
the game
controller detects selection of the avatar carrying the hidden document using
the eye gaze data,
the selection triggering a prize award.
[0030] In some embodiments, the electronic gaming device is in communication
with one or
more other electronic gaming devices, and wherein the at least one data
storage device stores
game data for a primary multi-player interactive game and a bonus multi-player
interactive
game.
[0031] In some embodiments, the invisible game component is a bonus game
component of a
set of bonus game components, wherein the graphical animation effect
represents revealing
and selecting the first bonus game component, and wherein the game controller
detects
selection of a subset of bonus game components using the eye gaze data, the
selection
triggering a bonus prize award.
[0032] In some embodiments, the invisible game component is a first bonus game
component
of a set of bonus game components, wherein the graphical animation effect
represents
revealing and rejecting the bonus game component, and wherein the game
controller detects
rejection of the first bonus game component using the eye gaze data, the
rejection triggering the
display controller to display on the display device a second bonus game
component and the
display controller of the other electronic gaming device to display on the
display device of the
other electronic gaming device the first bonus game component.
[0033] In some embodiments, the invisible game component is at least a portion
of the
viewing area of the other electronic gaming devices, the viewing area of the
other electronic
gaming devices having another visible game component, wherein the graphical
animation effect
represents seeing through or rendering transparent the visible game component
to reveal the
portion of the viewing area of the other electronic gaming devices, and
wherein the game
controller detects a bonus activation based on the visible game component and
the another
visible game component, the bonus activation triggering a bonus prize award.

[0034] In some embodiments, the viewing area has a plurality of invisible game
components,
and wherein the graphical animation effect and the visual update renders
visible at least a
portion of the invisible game components.
[0035] In another aspect, there is provided an electronic gaming machine that
comprises a
card reader to identify a monetary amount conveyed by a player to the
electronic gaming
machine, at least one data storage device to store game data for one or more
primary
interactive games and one or more bonus interactive games, a graphics
processor to generate
an interactive game environment in accordance with a set of game rules using
the game data
and define a viewing area as a subset of the interactive game environment, the
viewing area
having a visible game component masking or blocking an invisible game selector
symbol, a
display device to display via a user interface the viewing area, a display
controller to control
rendering of the viewing area on the display device using the graphics
processor, at least one
data capture camera device to continuously monitor eye gaze of a player to
collect player eye
gaze data, and a game controller for calculating a location of the eye gaze of
the player relative
to the viewing area using the player eye gaze data, the location corresponding
to the invisible
game selector symbol, and triggering a control command to the display
controller to dynamically
update the rendering of the viewing area based on the player eye gaze data and
the location. In
response to the control command, the display controller controls the display
device in real-time
or near real-time using the graphics processor to dynamically update the
rendering of the
viewing area to provide a real-time or near real-time graphical animation
effect displayed on the
display device representative of a visual update to the visible game component
to reveal and
select the invisible game selector symbol in the viewing area and displaying a
selected
interactive game for the selected invisible game selector symbol. In response
to an outcome of
the selected interactive game, the card reader updates the monetary amount.
[0036] In another aspect, there is provided an electronic gaming machine that
comprises a
card reader to identify a monetary amount conveyed by a player to the
electronic gaming
machine, at least one data storage device to store game data for an
interactive game, a
graphics processor to generate an interactive game environment in accordance
with a set of
game rules using the game data and define a viewing area as a first subset of
the interactive
game environment, the first subset of the interactive game environment having
a first visible
game component masking or blocking a first invisible game component, a display
device to
display via a user interface the viewing area, a display controller to control
rendering of the
viewing area on the display device using the graphics processor, at least one
data capture
camera device to continuously monitor eye gaze of a player to collect player
eye gaze data, and
a game controller for calculating a location of the eye gaze of the player
relative to the viewing
area using the player eye gaze data, the location corresponding to the
invisible game
component, and triggering a control command to the display controller to
dynamically update
the rendering of the viewing area based on the player eye gaze data and the
location. In
response to the control command, the display controller controls the display
device in real-time
or near real-time using the graphics processor to dynamically update the
rendering of the
viewing area in real-time or near real-time to navigate to a second subset of
the interactive
game environment, the second subset of the interactive game environment having
a second
visible game component masking or blocking a second invisible game component,
wherein the
update comprises a graphical animation effect displayed on the display device
representative of
navigating to the second subset of the interactive game environment. In
response to an
outcome of the interactive game, the card reader updates the monetary amount.
[0037] Further features and combinations thereof concerning embodiments are
described.
DESCRIPTION OF THE FIGURES
[0038] Fig. 1 is a perspective view of an electronic gaming machine for
implementing the
gaming enhancements according to some embodiments;
[0039] Fig. 2A is a schematic diagram of an electronic gaming machine linked
to a casino
host system according to some embodiments;
[0040] Fig. 2B is a schematic diagram of an exemplary online implementation of
a computer
system and online gaming system according to some embodiments;
[0041] Fig. 3 is a schematic diagram illustrating a calibration process
for the electronic
gaming machine according to some embodiments;
[0042] Fig. 4 is a schematic diagram illustrating the mapping of a player's
eye gaze to the
viewing area according to some embodiments;
[0043] Fig. 5 is a flowchart diagram of a method implemented by an electronic
gaming
machine according to some embodiments;

[0044] Fig. 6 is a schematic diagram illustrating an electronic gaming machine
displaying an
advertisement based on collected proximity data according to some embodiments;
[0045] Figs. 7A and 7B are schematic diagrams illustrating a gaze-sensitive
user interface
according to some embodiments;
[0046] Fig. 8 is a schematic illustrating an electronic gaming machine with a
stereoscopic 3D
screen where the player can interact with objects displayed on the
stereoscopic 3D screen with
the player's eye gaze according to some embodiments;
[0047] Figs. 9 to 13 are schematic diagrams illustrating how a player may
reveal a hidden
prize and select the prize using the player's eye gaze, according to some
embodiments;
[0048] Figs. 14 to 16 are schematic diagrams illustrating how a player may
reveal a hidden
prize, according to some embodiments; and
[0049] Figs. 17 and 18 are schematic diagrams that illustrate navigating from
one subset of
the interactive game environment to a second subset of the interactive game
environment
according to some embodiments.
DETAILED DESCRIPTION
[0050] Embodiments described herein relate to an enhanced electronic gaming
machine
(EGM) where the player can play an interactive game using their eye gaze, and
where their eye
gaze acts as x-ray vision. The EGM may have a card reader to identify the
amount of money
that a player conveys to the EGM. The EGM may have at least one data storage
device to
store game data for an interactive game. The graphics processor of the EGM may
be
configured to generate an interactive game environment using the game data of
an interactive
game. The display device of the EGM may display a viewing area, which may be a
portion of
the interactive game environment that has a visible game component masking or
blocking an
invisible game component. The EGM may include at least one data capture camera
device to
continuously monitor the eye gaze of the player to collect player eye gaze
data. The EGM may
have a game controller that can determine the location of the eye gaze of the
player relative to
the viewing area by mapping the location of the player eye gaze on the display
device to the
viewing area. The game controller may trigger a control command to the display
controller of
the EGM to dynamically update the rendering of the viewing area based on the
player eye gaze
data. In response to the control command, the display controller may control
the display device
in real-time or near real-time using the graphics processor to dynamically
update the rendering
of the viewing area to provide a real-time or near real-time graphical
animation effect displayed
on the display device representative of a visual update to the visible game
component to reveal
the invisible game component. Depending on the outcome of the interactive
game, the card
reader may update the monetary amount.
[0051] The EGM may include one or more data capture camera devices that may be
configured with algorithms to process recorded image data to detect in real-time the position
of the player's eyes in three-dimensional (3D) space and the focus of the player's gaze in
two-dimensional (2D) or 3D space. The position of the player's eyes may be
the physical
location of the player's eyes in 3D space. The focus of the player's gaze may
be the focus of
the gaze on a display device of the EGM. A player may maintain the position of
the player's
eyes while focusing on different areas of a display device of the EGM. A
player may maintain
the focus of the player's eye gaze on the same portion of a display device of
the EGM while
changing the position of their eyes.
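One way to picture the distinction this paragraph draws between eye position and gaze focus is a small data structure holding both. The Python sketch below is purely illustrative; the field names and units are assumptions, not the EGM's actual data format:

from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp_s: float      # capture time in seconds
    eye_position_mm: tuple  # (x, y, z) of the eyes relative to the camera
    focus_px: tuple         # (x, y) focus point on the display, in pixels

# The same focus with different eye positions is valid, and vice versa:
a = GazeSample(0.00, (10.0, -5.0, 620.0), (960, 540))
b = GazeSample(0.02, (60.0, -5.0, 640.0), (960, 540))  # head moved, focus held
print(a.focus_px == b.focus_px, a.eye_position_mm != b.eye_position_mm)  # True True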
[0052] The EGM may monitor the player eye gaze on the viewing area by mapping
the player
eye gaze on the display device to the viewing area. The player's eye gaze may
correspond to
an invisible game component in the viewing area. The EGM may dynamically
update and
render the viewing area in 2D or 3D. The player may play an interactive game
using only the
eye gaze of the player. In some embodiments, the player may play an
interactive game using
their eye gaze, eye gesture, movement, or any combination thereof.
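As a rough sketch of the mapping this paragraph describes, the following Python code converts a gaze point in display pixels into viewing-area coordinates and tests it against an invisible game component; the coordinate conventions and names are assumptions, not the patent's implementation:

def display_to_viewing_area(gaze_px, display_size, view_origin, view_size):
    # Scale a display-space gaze point into the viewing area's coordinate
    # system; view_origin/view_size describe the subset of the interactive
    # game environment currently shown on the full display.
    nx = gaze_px[0] / display_size[0]  # normalize to 0..1
    ny = gaze_px[1] / display_size[1]
    return (view_origin[0] + nx * view_size[0],
            view_origin[1] + ny * view_size[1])

def hits(point, rect):
    # Axis-aligned bounding-box test; rect is (x, y, width, height).
    x, y = point
    rx, ry, rw, rh = rect
    return rx <= x <= rx + rw and ry <= y <= ry + rh

invisible_component = (400.0, 300.0, 80.0, 60.0)  # hypothetical game-world box
p = display_to_viewing_area((960, 540), (1920, 1080), (0.0, 0.0), (800.0, 600.0))
print(p, hits(p, invisible_component))  # (400.0, 300.0) True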
[0053] The gaming enhancements described herein may be carried out using a
physical
EGM. An EGM may be embodied in a variety of forms, machines and devices
including, for
example, portable devices, such as tablets and smart phones, that can access a
gaming site or
a portal (which may access a plurality of gaming sites) via the Internet or
other communication
path (e.g., a LAN or WAN), and so on. The EGM may be located in various
venues, such as a
casino, airport, restaurant, or an arcade. One example type of EGM is
described with respect to
Fig. 1.
[0054] Fig. 1 is a perspective view of an EGM 10 configured to continuously
monitor eye gaze
of a player to collect player eye gaze data. A game controller may determine a
location of the
eye gaze of the player relative to a viewing area of the interactive game
environment using the
player eye gaze data and triggering a control command to a display controller
to dynamically
update the rendering of the viewing area based on the player eye gaze data to
reveal an
invisible game component. EGM 10 has at least one data storage device to store
game data for
an interactive game. The data storage device may store game data for one or
more primary
interactive games and one or more bonus interactive games. EGM 10 may have the
display
controller for detecting the control command from the game controller. In
response to the
control command, the display controller may dynamically update the rendering
of the viewing
area to provide a real-time or near real-time graphical animation effect
displayed on the display
device representative of a visual update to one or more visible game
components to reveal the
one or more invisible game components that may be in the viewing area.
[0055] An example embodiment of EGM 10 includes a display device 12 that may
be a thin
film transistor (TFT) display, a liquid crystal display (LCD), a cathode ray
tube (CRT), auto
stereoscopic 3D display and LED display, an OLED display, or any other type of
display. An
optional second display device 14 provides game data or other information in
addition to display
device 12. Display device 12, 14 may have 2D display capabilities or 3D
display capabilities, or
both. Gaming display device 14 may provide static information, such as an
advertisement for
the game, the rules of the game, pay tables, pay lines, or other information,
or may even display
the main game or a bonus game along with display device 12. Alternatively, the
area for display
device 14 may be a display glass for conveying information about the game.
Display device 12,
14 may also include a camera, sensor, and other hardware input devices.
Display device 12, 14
may display at least a portion of the visible game components of an
interactive game. Display
device 12, 14 may display the viewing area, which may have one or more visible
game
components masking or blocking one or more invisible game components.
[0056] In some embodiments, the display device 12, 14 may be a touch sensitive
display
device. The player may interact with the display device 12, 14 using touch
control such as, but
not limited to, touch, hold, swipe, and multi-touch controls. The player may
use these
interactions to manipulate the interactive game environment for easier viewing
or preference, to
manipulate game elements such as visible game components, or to select at
least a portion of
the visible game components depending on the design of the game. For example,
the player
may select one or more visible game components displayed by the display device
12, 14. As
another example, the player may not have to touch the display device 12, 14 to
play the
interactive game. The player may instead interact with the interactive game
using their eye
gaze, eye gestures, and/or body movements. As yet another example, the player
may interact
with the interactive game using their touch, eye gaze, eye gestures, body
movements, or a
combination thereof.
[0057] EGM 10 may include a player input device or a data capture camera
device to
continuously detect and monitor player interaction commands (e.g. eye gaze,
eye gestures,
player movement, touch, gestures) to interact with the viewing area and the
visible and invisible
game components displayed on the display device 12, 14. EGM 10 has a game
controller for
determining a location of the eye gaze of the player relative to the viewing
area using the player
eye gaze data collected by the at least one data capture camera device, which
may
continuously monitor eye gaze of a player. The location of the player's eye
gaze may
correspond to one or more invisible game components in the viewing area. The
game controller
may trigger a control command to the display controller to dynamically update
the rendering of
the viewing area based on the player eye gaze data and the location
corresponding to the
invisible game component. In response to the control command, the display
controller may
control the display device in real-time or near real-time using the graphics
processor to
dynamically update the rendering of the viewing area to provide a real-time or
near real-time
graphical animation effect displayed on the display device that may represent
a visual update to
the visible game component to reveal the invisible game component in the
viewing area, the
visual update based on the player eye gaze data.
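The control flow described above can be pictured with a small sketch in which the game controller turns a gaze hit on an invisible component into a control command for the display controller. The class and method names below are invented for illustration only:

class DisplayController:
    def apply(self, command):
        # In a real EGM this would drive the graphics processor; here we
        # just record the reveal so the flow is observable.
        if command["type"] == "reveal":
            print(f"animating reveal of {command['component']}")

class GameController:
    def __init__(self, display_controller, invisible_components):
        self.display = display_controller
        self.invisible = invisible_components  # name -> bounding box

    def on_gaze(self, point):
        for name, rect in self.invisible.items():
            rx, ry, rw, rh = rect
            if rx <= point[0] <= rx + rw and ry <= point[1] <= ry + rh:
                # Trigger the control command to the display controller.
                self.display.apply({"type": "reveal", "component": name})

gc = GameController(DisplayController(), {"hidden_prize": (400, 300, 80, 60)})
gc.on_gaze((420, 320))  # animating reveal of hidden_prize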
[0058] In some embodiments, the control command may be based on the eye gaze,
eye
gesture, or the movement of the player, or any combination thereof. The eye
gaze of the player
may be the location on the display device where the player is looking. The eye
gesture of the
player may be the gesture made by the player using one or more eyes, such as
widening the
eyes, narrowing the eyes, blinking, and opening one eye and closing the other.
The movement
of the player may be the movement of the player's body, which may include head
movement,
hand movement, chest movement, leg movement, foot movement, or any combination
thereof.
A winning outcome of the game for provision of an award may be triggered based
on the eye
gaze, eye gesture, or the movement of the player, and may be based on the
revealed invisible
game component. For example, by looking at a visible game component displayed
by the
display controller on the display device 12, 14 for a pre-determined period of
time, the player
may reveal an invisible game component and trigger a winning outcome. The
award may
include credits, free games, mega pot, small pot, progressive pot, and so on.
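For the dwell-based trigger used as an example above (looking at a component for a pre-determined period of time), a minimal sketch might look like the following; the dwell time and data shapes are assumptions:

DWELL_TIME_S = 1.5  # assumed pre-determined period

def dwell_target(samples, dwell_s=DWELL_TIME_S):
    # samples is a list of (timestamp_s, component_or_None) observations.
    # Returns the component the player has looked at continuously for at
    # least dwell_s seconds, else None.
    start = None
    current = None
    for t, comp in samples:
        if comp != current:
            current, start = comp, t
        if comp is not None and t - start >= dwell_s:
            return comp
    return None

samples = [(0.0, "reel"), (0.4, "reel"), (0.6, "chest"),
           (1.0, "chest"), (2.2, "chest")]
print(dwell_target(samples))  # chest (looked at for >= 1.5 s)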

[0059] Display device 12, 14 may have a touch screen lamination that includes
a transparent
grid of conductors. Touching the screen may change the capacitance between the
conductors,
and thereby the X-Y location of the touch may be determined. The X-Y location
of the touch
may be mapped to positions of interest to detect selection thereof, for
example, the game
components of the interactive game. A processor of EGM 10 associates this X-Y
location with a
function to be performed. Such touch screens may be used for slot machines,
for example, or
other types of gaming machines. There may be an upper and lower multi-touch
screen in
accordance with some embodiments. One or both of display device 12, 14 may be
configured to
have auto stereoscopic 3D functionality to provide 3D enhancements to the
interactive game
environment. The touch location positions may be 3D, for example, and mapped
to at least one
visible game component of the plurality of visible game components.
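The association of a touch's X-Y location with a function to be performed can be sketched as a simple region-to-handler lookup; the regions and handlers below are hypothetical:

def make_touch_dispatcher(regions):
    # regions maps an (x, y, w, h) screen rectangle to a callback.
    def on_touch(x, y):
        for (rx, ry, rw, rh), handler in regions.items():
            if rx <= x <= rx + rw and ry <= y <= ry + rh:
                handler()
                return
    return on_touch

dispatch = make_touch_dispatcher({
    (100, 800, 200, 80): lambda: print("spin reels"),
    (400, 800, 200, 80): lambda: print("cash out"),
})
dispatch(150, 820)  # spin reels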
[0060] A coin slot 22 may accept coins or tokens in one or more denominations
to generate
credits within EGM 10 for playing games. An input slot 24 for an optical
reader and printer
receives machine readable printed tickets and outputs printed tickets for use
in cashless
gaming. An output slot 26 may be provided for outputting various physical
indicia, such as
physical tokens, receipts, bar codes, etc.
[0061] A coin tray 32 may receive coins or tokens from a hopper upon a win or
upon the
player cashing out. However, the EGM 10 may be a gaming terminal that does not
pay in cash
but only issues a printed ticket for cashing in elsewhere. Alternatively, a
stored value card may
be loaded with credits based on a win, or may enable the assignment of credits
to an account
associated with a computer system, which may be a computer network connected
computer.
[0062] A card reader slot 34 may read from various types of cards, such as
smart cards,
magnetic strip cards, or other types of cards conveying machine readable
information. The card
reader reads the inserted card for player and credit information for cashless
gaming. Card
reader slot 34 may read a magnetic code on a conventional player tracking
card, where the
code uniquely identifies the player to a host system at the venue. The code is
cross-referenced
by the host system to any data related to the player, and such data may affect
the games
offered to the player by the gaming terminal. Card reader slot 34 may also
include an optical
reader and printer for reading and printing coded barcodes and other
information on a paper
ticket. A card may also include credentials that enable the host system to
access one or more
accounts associated with a user. The account may be debited based on wagers by
a user and
credited based on a win.

[0063] The card reader slot 34 may be implemented in different ways for
various
embodiments. The card reader slot 34 may be an electronic reading device such
as a player
tracking card reader, a ticket reader, a banknote detector, a coin detector,
and any other input
device that can read an instrument supplied by the player for conveying a
monetary amount. In
the case of a tracking card, the card reader slot 34 detects the player's
stored bank and applies
that to the gaming machine being played. The card reader slot 34 or reading
device may be an
optical reader, a magnetic reader, or other type of reader. The card reader
slot 34 may have a
slot provided in the gaming machine for receiving the instrument. The card
reader slot 34 may
also have a communication interface (or control or connect to a communication
interface) to
digitally transfer tokens or indicia of credits or money via various methods
such as RFID, tap,
smart card, credit card, loyalty card, near field communication (NFC) and so
on.
[0064] An electronic device may couple (by way of a wired or wireless
connection) to the
EGM 10 to transfer electronic data signals for player credits and the like.
For example, NFC
may be used to couple to EGM 10 which may be configured with NFC enabled
hardware. This
is a non-limiting example of a communication technique.
[0065] A keypad 36 may accept player input, such as a personal identification
number (PIN)
or any other player information. A display 38 above keypad 36 displays a menu
for instructions
and other information and provides visual feedback of the keys pressed. Keypad
36 may be an
input device such as a touchscreen, or dynamic digital button panel, in
accordance with some
embodiments.
[0066] Player control buttons 39 may include any buttons or other controllers
needed to play
the particular game or games offered by EGM 10 including, for example, a bet
button, a repeat
bet button, a spin reels (or play) button, a maximum bet button, a cash-out
button, a display pay
lines button, a display payout tables button, select icon buttons, and any
other suitable button.
Buttons 39 may be replaced by a touch screen with virtual buttons.
[0067] EGM 10 may also include a digital button panel. The digital button
panel may include
various elements such as for example, a touch display, animated buttons, frame
lights, and so
on. The digital button panel may have different states, such as for example,
standard play
containing bet steps, bonus with feature layouts, point of sale, and so on.
The digital button
panel may include a slider bar for adjusting the three-dimensional panel. The
digital button
panel may include buttons for adjusting sounds and effects. The digital button
panel may
include buttons for betting and selecting bonus games. The digital button
panel may include a
game status display. The digital button panel may include animation. The
buttons of the digital
button panel may include a number of different states, such as pressable but
not activated,
pressed and active, inactive (not pressable), certain response or information
animation, and so
on. The digital button panel may receive player interaction commands, in some
example
embodiments.
[0068] EGM 10 may also include hardware configured to provide eye, motion or
gesture
tracking. For example, the EGM 10 may include at least one data capture camera
device,
which may be one or more cameras that detect one or more spectra of light, one
or more
sensors (e.g. optical sensor), or a combination thereof. The at least one data
capture camera
device may be used for eye, gesture or motion tracking of player, such as
detecting eye
movement, eye gestures, player positions and movements, and generating signals
defining x, y
and z coordinates. For example, the at least one data capture camera device
may be used to
implement tracking recognition techniques to collect player eye gaze data,
player eye gesture
data, and player movement data. An example type of motion tracking is optical
motion tracking.
The motion tracking may include a body and head controller. The motion
tracking may also
include an eye controller. EGM 10 may implement eye-tracking recognition
technology using
cameras, sensors (e.g. optical sensor), data receivers and other electronic
hardware to capture
various forms of player input. The eye gaze, eye gesture, or motion by a
player may interact
with the interactive game environment or may impact the type of graphical
animation effect.
Accordingly, EGM 10 may be configured to capture player eye gaze input, eye
gesture input,
and movement input as player interaction commands.
[0069] For example, the player eye gaze data, player eye gesture data, and
player movement
data defining eye movement, eye gestures, player positions and movements may
be used to
reveal, select, manipulate, and/or move visible and/or invisible game
components. As another
example, the player eye gaze data, player eye gesture data, and player
movement data defining
eye movement, eye gestures, player positions and movements may be used to
change a view
of the gaming surface or gaming component. A visible game component of the
game may be
illustrated as a three-dimensional enhancement coming towards the player.
Another visible
game component of the game may be illustrated as a three-dimensional
enhancement moving
away from the player. The player's head position may be used as a view guide
for the at least
one data capture camera device during a three-dimensional enhancement. A
player sitting
directly in front of display 12, 14 may see a different view than a player
moving aside. The at
least one data capture camera device may also be used to detect occupancy of
the machine or
detect movement proximate to the machine.
[0070] Embodiments described herein are implemented by physical computer
hardware
embodiments. The embodiments described herein provide useful physical machines
and
particularly configured computer hardware arrangements of computing devices,
servers,
electronic gaming terminals, processors, memory, networks, for example. The
embodiments
described herein, for example, are directed to computer apparatuses and
methods implemented
by computers through the processing of electronic data signals.
[0071] Accordingly, EGM 10 is particularly configured to provide an
interactive game
environment. The display device 12, 14 may display, via a user interface, the
interactive game
environment and the viewing area having one or more visible game components
and one or
more invisible game components in accordance with a set of game data stored in
a data store.
The interactive game environment may be a 2D interactive game environment or a
3D
interactive game environment, or a combination thereof.
[0072] A data capture camera device may capture player data, such as button
input, gesture
input and so on. The data capture camera device may include a camera, a sensor
or other data
capture electronic hardware. In some embodiments, EGM 10 may include at least
one data
capture camera device to continuously monitor the eye gaze of a player to
collect player eye
gaze data. The player may provide input to the EGM 10 using the eye gaze of
the player. For
example, using the eye gaze of the player, which may be collected as player
eye gaze data, the
player may select an interactive game to play, interact with a visible or
invisible game
component, or trigger a bonus interactive game.
[0073] Embodiments described herein involve computing devices, servers,
electronic gaming
terminals, receivers, transmitters, processors, memory, display, and networks
particularly
configured to implement various acts. The embodiments described herein are
directed to
electronic machines adapted for processing and transforming electromagnetic
signals which
represent various types of information. The embodiments described herein
pervasively and
integrally relate to machines, and their uses; and the embodiments described
herein have no
meaning or practical applicability outside their use with computer hardware,
machines, and various hardware components.
[0074] As described herein, EGM 10 may be configured to provide an interactive
game
environment. The interactive game environment may be a 2D or 3D interactive
game
environment. The interactive game environment may include a plurality of
visible and/or
invisible game components or game symbols based on the game data. The
invisible game
components may be masked or blocked by the visible game components. The game
data may
relate to a primary interactive game or a bonus interactive game, or both. For
example, the
interactive game environment may comprise a 3D reel space that may have an
active primary
game matrix of a primary subset of game components. A bonus subset of game
components
may be different from the primary subset of game components. The player may
view a viewing
area of the interactive game environment, which may be a subset of the
interactive game
environment, on the display device 12, 14. The interactive game environment or
the viewing
area may be dynamically updated based on the eye gaze, eye gesture, or
movement of the
player in real-time or near real-time. The update to the interactive game
environment or the
viewing area may be a graphical animation effect displayed on the display
device 12, 14. In
some embodiments, the graphical animation effect may represent a visual update
to the visible
game component to reveal an invisible game component. The update to the
interactive game
environment or the viewing area may be triggered based on the eye gaze, eye
gesture, or
movement of the player. For example, the update may be triggered by looking at
a particular
part of the viewing area for a pre-determined period of time, or looking at
different parts of the
viewing area in a pre-determined sequence, or widening or narrowing the eyes.
The interactive
game environment may be updated dynamically and revealed by dynamic triggers
from game
content of the primary interactive game in response to electronic data signals
collected and
processed by EGM 10.
[0075] For an interactive game environment, the EGM 10 may include a display
device 12, 14
with auto stereoscopic 3D functionality. The EGM 10 may include a touch screen
display for
receiving touch input data to define player interaction commands. The EGM 10
may also
include at least one data capture camera device, for example, to further
receive player input to
define player interaction commands. The EGM 10 may also include several
effects and frame
lights. The 3D enhancements may be an interactive game environment for
additional game
symbols.
[0076] EGM 10 may include an output device such as one or more speakers. The
speakers
may be located in various locations on the EGM 10 such as in a lower portion
or upper portion.
The EGM 10 may have a chair or seat portion and the speakers may be included
in the seat
portion to create a surround sound effect for the player. The seat portion may
allow for easy
upper body and head movement during play. Functions may be controllable via an
on screen
game menu. The EGM 10 is configurable to provide full control over all built-
in functionality
(lights, frame lights, sounds, and so on).
[0077] EGM 10 may also include a plurality of effects lights and frame lights.
The lights may
be synchronized with enhancements of the game. The EGM 10 may be configured to
control
color and brightness of lights. Additional custom animations (color cycle,
blinking, etc.) may
also be configured by EGM 10. The custom animations may be triggered by
certain gaming
events.
[0078] Fig. 2A is a block diagram of hardware components of EGM 10 according
to some
embodiments. EGM 10 is shown linked to the casino's host system 41 via network
infrastructure. These hardware components are particularly configured to
provide at least one
interactive game. These hardware components may be configured to provide at
least one
primary interactive game, at least one bonus interactive game, or both.
[0079] A communications board 42 may contain circuitry for coupling the EGM 10
to a network.
Communications board 42 may include a network interface allowing EGM 10 to
communicate
with other components, to access and connect to network resources, to serve an
application, to
access other applications, and to perform other computing applications by
connecting to a
network (or multiple networks) capable of carrying data including the
Internet, Ethernet, plain old
telephone service (POTS) line, public switched telephone network (PSTN),
integrated services
digital network (ISDN), digital subscriber line (DSL), coaxial cable, fiber
optics, satellite, mobile,
wireless (e.g. Wi-Fi, WiMAX), SS7 signaling network, fixed line, local area
network, wide area
network, and others, including any combination of these. EGM 10 may
communicate over a
network using a suitable protocol, such as the G2S protocol.
[0080] Communications board 42 communicates, transmits and receives data using
a
wireless transmitter, or it may be wired to a network, such as a local area
network running
throughout the casino floor, for example.
Communications board 42 may set up a
communication link with a master controller and may buffer data between the
network and game
controller board 44. Communications board 42 may also communicate with a
network server,
such as in accordance with the G2S standard, for exchanging information to
carry out
embodiments described herein.
[0081] Game controller board 44 includes memory and a processor for carrying
out program
instructions stored in the memory and for providing the information requested
by the network.
Game controller board 44 executes game routines using game data stored in a
data store
accessible to the game controller board 44, and cooperates with graphics
processor 54 and
display controller 52 to provide games with enhanced interactive game
components.
[0082] EGM 10 may include at least one data capture camera device for
implementing the
gaming enhancements, in accordance with some embodiments. The EGM 10 may
include the
at least one data capture camera device, one or more sensors (e.g. optical
sensor), or other
hardware device configured to capture and collect in real-time or near real-
time data relating to
the eye gaze, eye gesture, or movement of the player, or any combination
thereof.
[0083] In some embodiments, the at least one data capture camera device may be
used for
eye gaze tracking, eye gesture tracking, movement tracking, and movement
recognition. The at
least one data capture camera device may collect data defining x, y and z
coordinates
representing eye gaze, eye gestures, and movement of the player.
[0084] In some examples, a game component may be illustrated as a 3D
enhancement
coming towards the player. Another game component may be illustrated as a 3D
enhancement
moving away from the player. The player's head position may be used as a
reference for the at
least one data capture camera device during a 3D enhancement. A player sitting
directly in
front of display 12, 14 may see a different view than a player moving aside.
The at least one
data capture camera device may also be used to detect occupancy of the EGM 10
or detect
movement proximate to the EGM 10. The at least one data capture camera device
and/or a
sensor (e.g. an optical sensor) may also be configured to detect and track the
position(s) of a
player's eyes or, more precisely, pupils, relative to the screen of the EGM 10.
[0085] The at least one data capture camera device may also be used to collect
data defining
player eye movement, eye gestures, body gestures, head movement, or other body
movement.
Players may move their eyes, their bodies or portions of their bodies to
interact with the
interactive game. The
game controller 44 may process and transform the data into data defining game
interactions
(e.g. selecting game components, focusing game components, magnifying game
components,
movement for game components), and update the rendering of the viewing area to
provide a
real-time or near real-time graphical animation effect representative of the
game interactions
using the player eye gaze data, player eye gesture data, player movement data,
or any
combination thereof. For example, the player's eyes may be tracked by the at
least one data
capture camera device (or another hardware component of EGM 10), so when the
player's eyes
move left, right, up or down, one or more game components on display device
12, 14, may
move in response to the player's eye movements. The player may have to avoid
obstacles, or
possibly catch or contact items to collect depending on the type of game. The
player may focus
on a particular location on the display device 12, 14 to cause a graphical
animation effect to be
displayed on display device 12, 14 representative of a visual update to a
visible game
component to reveal an invisible game component. These graphical animation
effects within
the game may be implemented based on the data derived from collected player
eye gaze data,
player eye gesture data, player movement data, or any combination thereof.
[0086] In some embodiments, the at least one data capture camera device may
track a
position of each eye of a player relative to display device 12, 14, as well as
a direction of focus
of the eyes and a point of focus on the display device 12, 14, in real-time or
near real-time. The
focus direction may be the direction at which the player's line of sight
travels or extends from his
or her eyes to display device 12, 14. The focus point may be referred to as a
gaze point and the
focus direction may sometimes be referred to as a gaze direction. In one
example, the focus
direction and focus point can be determined based on various eye tracking data
such as
position(s) of a player's eyes, a position of his or her head, position(s) and
size(s) of the pupils,
corneal reflection data, and/or size(s) of the irises. All of the above
mentioned eye tracking or
movement data, as well as the focus direction and focus point, may be examples
of, and
referred to as, player's eye movements or player movement data.
[0087] In some embodiments, the at least one data capture camera device may
monitor the
eye gaze, eye gesture, and/or movement of two or more people, who may be two
or more
players of the interactive game, to collect the player eye gaze data, player
eye gesture data,
and/or player movement data. The player eye gaze data, player eye gesture
data, and/or player
movement data may be used such that both players may be able to play the
interactive game
simultaneously. The player eye gaze data, player eye gesture data, and/or
player movement
data from one or more players of the interactive game may cause game
controller 44 to trigger
the control command to display controller 52 to display on display device 12,
14 a graphical
animation effect to reveal one or more invisible game components. The
interactive game may
include aspects of both cooperative and competitive play.
[0088] A visible or invisible game component may be selected to move or
manipulate with the
player's eye movements. The gaming component may be selected by the player or
by the
game. For example, the game outcome or state may determine which symbol to
select for
enhancement.
[0089] As previously described, the at least one data capture camera device
may track a
position of a player's eyes relative to display device 12, 14, as well as a
focus direction and a
focus point on the display device 12, 14 of the player's eyes in real-time or
near real-time. The
focus direction can be the direction at which the player's line of sight
travels or extends from his
or her eyes to the display device 12, 14. The focus point may sometimes be
referred to as a
gaze point and the focus direction may sometimes be referred to as a gaze
direction. In one
example, the focus direction and focus point can be determined based on
various eye tracking
data such as position(s) of a player's eyes, a position of his or her head,
position(s) and size(s)
of the pupils, corneal reflection data, and/or size(s) of the irises. All of
the above mentioned eye
tracking or movement data, as well as the focus direction and focus point, may
be instances of
player movement data.
[0090] In addition, a focus point may extend to or encompass different visual
fields visible to
the player. For example, a foveal area may be a small area surrounding a
fixation point on the
display device 12, 14 directly connected by a (virtual) line of sight
extending from the eyes of a
player to the display screen. This foveal area in the player's vision may
generally appear to be
in sharp focus and may include one or more game components and the surrounding
area. A
focus point may include the foveal area immediately adjacent to the fixation
point directly
connected by the (virtual) line of sight extending from the player's eyes to
the display screen.
[0091] The player eye gaze data and player eye gesture data may relate to the
movement of
the player's eyes. For example, the player's eyes may move or look to the
left, which may
trigger a corresponding movement of a game component within the game. The
movement of
the player's eyes may also trigger an updated view of the entire interactive
game environment
on the display device 12, 14 to reflect the orientation of the player in
relation to the display
device 12, 14. The player movement data may be associated with movement of the
body of the
player, such as the player's head, arms, legs, or other part of the player's
body. As a further
example, the player movement data may be associated with a gesture made by the
player, such
as a gesture by a hand or a finger. The EGM 10 may convert the focus data
relative to display
device 12, 14 to eye gaze data relative to the viewing area of the interactive
game which may
dynamically update.
[0092] In one embodiment of the invention, the EGM 10 may be configured to
reveal, target,
select, deselect, move, and/or rotate one or more visible and/or invisible
game components
based on player eye gaze data, player eye gesture data, and player movement
data. For
example, if the EGM 10 determines that a player has gazed at (e.g. the focus
point has
remained more or less constant) a previously unselected game component for
three or more
seconds, then the EGM 10 may select or highlight the game component, so the
player may
know that he or she may proceed to move or rotate the selected or highlighted
game
component. In another example, if the EGM 10 determines that after a player
has selected a
game component, the same player has moved his or her eyes to the right on a
horizontal level
for a predetermined length or period of time, then the EGM 10 may cause the
selected game
component to move to the right as well on a horizontal level. Similarly, the
EGM 10 may
determine that the player has moved his or her eyes down on a vertical level
for a
predetermined length or period of time, and then the EGM 10 may cause the
selected game
component to move to the bottom vertically.
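By way of a non-limiting sketch, the dwell-based selection described above might be implemented as follows; the class name, the three-second dwell, and the jitter radius are illustrative assumptions rather than part of this disclosure.

    import math
    import time

    DWELL_RADIUS_PX = 40   # assumed jitter tolerance around a component
    DWELL_TIME_S = 3.0     # assumed dwell time before selection

    class DwellSelector:
        """Selects a game component once the gaze rests on it long enough."""

        def __init__(self):
            self.candidate = None     # component currently under the gaze
            self.dwell_start = None   # when the gaze first landed on it

        def update(self, gaze_x, gaze_y, components, now=None):
            """components: objects with x, y centre coordinates.
            Returns the selected component, or None while still dwelling."""
            now = time.monotonic() if now is None else now
            hit = None
            for c in components:
                if math.hypot(gaze_x - c.x, gaze_y - c.y) <= DWELL_RADIUS_PX:
                    hit = c
                    break
            if hit is not self.candidate:
                # Gaze moved to a new component (or away): restart the timer.
                self.candidate, self.dwell_start = hit, now
                return None
            if hit is not None and now - self.dwell_start >= DWELL_TIME_S:
                return hit    # dwell satisfied: select or highlight it
            return None

A component selected in this way could then be translated horizontally or vertically in proportion to subsequent eye movement, as described above.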
[0093] Display controller 52 may control one or more of display device 12, 14
using graphics
processor 54 to display a viewing area that may include one or more visible
game components
masking or blocking one or more invisible game components based on the game
data of an
interactive game.
[0094] Display controller 52 may, in response to detection of the control
command from the
game controller 44 based on the player eye gaze data, player eye gesture data,
or player
movement data, control display device 12, 14 using graphics processor 54.
Display controller
52 may update the viewing area to trigger a graphical animation effect
displayed on one or both
of display device 12, 14 representative of a visual update to the visible game
components to
reveal the invisible game component in the viewing area, the visual update
based on the player
eye gaze data, player eye gesture data, or player movement data.
[0095] In some embodiments, the at least one data capture camera device and
the display
device 12, 14 may be calibrated. Calibration of the at least one data capture
camera device and
the display device may be desirable because the eyes of each player using the
EGM 10 may be
physically different, such as in the shape and location of the player's eyes, and each player's ability to see may differ. Each player may also stand at a different position
relative to the EGM 10.
[0096] The at least one data capture camera device may be calibrated by the
game controller
44 by detecting the movement of the player's eyes. In some embodiments, the
display
controller 52 may control the display device 12, 14 to display one or more
calibration symbols.
There may be one calibration symbol that appears on the display device 12, 14
at one time, or
more than one calibration symbol may appear on the display device 12, 14 at
one time. The
player may be prompted by text, noise, graphical animation effect, or any
combination thereof,
to direct their eye gaze to one or more of the calibration symbols. The at
least one data capture
camera device may monitor the eye gaze of the player looking at the one or
more calibration
symbols and a distance of the player's eyes relative to the EGM 10 to collect
calibration data.
Based on the eye gaze corresponding to the player looking at different
calibration symbols, the
at least one data capture camera device may record data associated with how
the player's eyes
rotate to look from one position on the display device 12, 14 to a second
position on the display
device 12, 14. The game controller 44 may calibrate the at least one data
capture camera
device based on the calibration data.
[0097] For example, as shown in Fig. 3, before the player 310 plays the
interactive game, the
EGM 10 may notify the player 310 that the at least one data capture camera
device (not shown)
and the display device 12, 14 may be calibrated. The display controller 52 may
cause the
display device 12, 14 to display one or more calibration symbols 330. In Fig.
3, nine calibration
symbols 330 "A" through "I" are displayed, but the calibration symbols 330 may
be any other
symbols. For example, the calibration symbols 330 may be one or more game
components
related to the interactive game to be played. The calibration symbols 330 may
be displayed on
any portion of the display device 12, 14. The player 310 may be prompted to
look at the
calibration symbols in a certain order. The at least one data capture camera
device may
monitor the eye gaze 320 of the player 310 looking at the calibration symbols
330 and the
distance of the player's eyes relative to the EGM 10 to collect the
calibration data. When the at
least one data capture camera device collects player eye gaze data in real-
time, the game
controller 44 may compare the player eye gaze data with the calibration data
in real-time to
determine the angle at which the player's eyes are looking.
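One non-limiting way to reduce the collected calibration data to a usable correction is a least-squares fit from raw gaze readings to the known symbol positions. The affine model and the function names below are illustrative assumptions, not part of this disclosure.

    import numpy as np

    def fit_calibration(raw_points, screen_points):
        """Affine least-squares map from raw gaze readings to screen points.

        raw_points:    (n, 2) raw gaze estimates captured while the player
                       looked at each calibration symbol ("A" through "I")
        screen_points: (n, 2) known positions of those symbols on the display
        Returns a function mapping a raw reading to a calibrated point.
        """
        raw = np.asarray(raw_points, dtype=float)
        tgt = np.asarray(screen_points, dtype=float)
        A = np.hstack([raw, np.ones((raw.shape[0], 1))])  # rows of [x, y, 1]
        coeffs, *_ = np.linalg.lstsq(A, tgt, rcond=None)  # (3, 2) matrix

        def apply(reading):
            x, y = reading
            return np.array([x, y, 1.0]) @ coeffs

        return apply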
[0098] The display controller 52 may calibrate the display device 12, 14 using
the graphics
processor 54 based on the calibration data collected by the at least one data
capture camera
device. The at least one data capture camera device may monitor the eye gaze
of the player to
collect calibration data as described herein. The display controller 52 may
calibrate the display
device 12, 14 using the graphics processor 54 to display a certain resolution
on the display
device 12, 14.
[0099] In some embodiments, the game controller 44 may determine the location
of the eye
gaze relative to the viewing area based on the position of the player's eyes
relative to the EGM
10 and an angle of the player's eyes. As shown in Fig. 4, the at least one
data capture camera
device 420 may monitor the position of the player's eyes 430 relative to EGM
10, and may also
monitor the angle of the player's eyes 430 to collect display mapping data.
The angle of the
player's eyes may be determined based on the calibration of the at least one
data capture
camera device 420 described herein. The angle of the player's eyes may define
the focus of
the eye gaze, which may be a line of sight relative to the display device 12,
14. Based on the
display mapping data, which may comprise the position of the player's eyes
relative to the EGM
10 and an angle of the player's eyes or the line of sight relative, the game
controller 44 may be
configured to determine the direction and length of a virtual array 440
projecting from the
player's eyes 430. Virtual array 440 may represent the eye gaze of the player
410. The game
controller 44 may determine where the virtual array 440 intersects with the
display device 12,
14. The intersection of virtual array 440 and display device 12, 14 may
represent where the eye
gaze of the player 410 is focused on the display device 12, 14. The display
device 12, 14 may
be controlled by display controller 52 to display the viewing area. The game
controller 44 may
identify coordinates on the display device 12, 14 corresponding to the player
eye gaze data and
may map the coordinates to the viewing area to determine the eye gaze of the
player relative to
the viewing area. EGM 10 may determine the location of the viewing area that
the player 410 is
looking at, which may be useful for EGM 10 to determine how the player 410 is
interacting with
the interactive game. The location of the viewing area that the player 410 is
looking at may
correspond to one or more invisible game components. In some embodiments, the
eye gaze of
the player may be expressed in 2D or 3D and may be mapped to a 2D or 3D
viewing area,
depending on whether the interactive game is a 2D interactive game or a 3D
interactive game.
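A minimal sketch of the virtual array computation follows, assuming the display is modeled as a plane and the eye gaze as a ray; the function names and the simple linear mapping to the viewing area are illustrative assumptions, not part of this disclosure.

    import numpy as np

    def gaze_point_on_display(eye_pos, gaze_dir, plane_point, plane_normal):
        """Intersect the virtual array (gaze ray) with the display plane.

        eye_pos, gaze_dir: 3D eye position and unit gaze direction
        plane_point, plane_normal: a point on the display plane and its normal
        Returns the 3D intersection point, or None if the player looks away.
        """
        eye = np.asarray(eye_pos, float)
        d = np.asarray(gaze_dir, float)
        n = np.asarray(plane_normal, float)
        denom = np.dot(d, n)
        if abs(denom) < 1e-9:
            return None        # gaze is parallel to the display plane
        t = np.dot(np.asarray(plane_point, float) - eye, n) / denom
        if t < 0:
            return None        # display plane is behind the player
        return eye + t * d

    def display_to_viewing_area(px, py, disp_w, disp_h, view_w, view_h):
        """Map display coordinates to viewing-area coordinates by scaling."""
        return px * view_w / disp_w, py * view_h / disp_h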
[00100] Peripheral devices/boards communicate with the game controller board
44 via a bus
46 using, for example, an RS-232 interface. Such peripherals may include a
bill validator 47, a
coin detector 48, a smart card reader or other type of credit card reader 49,
and player control
inputs 50 (such as buttons or a touch screen).
[00101] Player input or control device 50 may include the keypad, the buttons,
touchscreen
display, gesture tracking hardware, and data capture camera device as
described herein. Other
peripherals may be one or more cameras used for collecting player input data,
or other player
movement or gesture data that may be used to trigger player interaction
commands. Display
device 12, 14 may be a touch sensitive display device. Player control input
device 50 may be
integrated with display device 12, 14 to detect player interaction input at
the display device 12,
14.
[00102] Game controller board 44 may also control one or more devices that
produce the
game output including audio and video output associated with a particular game
that is
presented to the user. For example, audio board 51 may convert coded signals
into analog
signals for driving speakers.
[00103] Game controller board 44 may be coupled to an electronic data store
storing game
data for one or more interactive games. The game data may be for a primary
interactive game
and/or a bonus interactive game. The game data may, for example, include a set
of game
instructions for each of the one or more interactive games. The electronic
data store may reside
in a data storage device, e.g., a hard disk drive, a solid state drive, or the
like. Such a data
storage device may be included in EGM 10, or may reside at host system 41. In
some
embodiments, the electronic data store storing game data may reside in the
cloud.
[00104] Card reader 49 reads cards for player and credit information for
cashless gaming.
Card reader 49 may read a magnetic code on a conventional player tracking
card, where the
code uniquely identifies the player to a host system at the venue. The code is
cross-referenced
by host system 41 to any data related to the player, and such data may affect
the games offered
to the player by the gaming terminal. Card reader 49 may also include an
optical reader and
printer for reading and printing coded barcodes and other information on a
paper ticket. A card
may also include credentials that enable host system 41 to access one or more
accounts
associated with a user. The account may be debited based on wagers by a user
and credited
based on a win.
[00105] Graphics processor 54 may be configured to generate and render
animation game
enhancements based on game data as directed by game controller board 44. The
game
enhancements may involve an interactive game environment that may provide one
or more
visible and invisible game components and graphical animation effects.
Graphics processor 54
may be a specialized electronic circuit designed for image processing
(including 2D and 3D
image processing in some examples) in order to manipulate and transform data
stored in
memory to accelerate the creation of images in a frame buffer for output to
the display by way of
display controller 52. Graphics processor 54 may redraw various game
enhancements as they
dynamically update. Graphics processor 54 may cooperate with game controller
board 44 and
display controller 52 to generate and render enhancements as described herein.
Graphics
processor 54 may generate an interactive game environment that may provide one
or more
visible and invisible game components, for example, a 3D reel space of a
plurality of game
components. The graphics processor 54 may generate graphical animation effects
to represent
a visual update to the visible game components to reveal the invisible game
components in the
viewing area, the visual update based on the player eye gaze data, player eye
gesture data,
player movement data, or any combination thereof.
[00106] Display controller 52 may require a high data transfer rate and may
convert coded
signals to pixel signals for the display. Display controller 52 and audio
board 51 may be directly
connected to parallel ports on the game controller board 44. The electronics
on the various
boards may be combined onto a single board. Display controller 52 may control
output to one
or more display device 12, 14 (e.g. an electronic touch sensitive display
device). Display
controller 52 may cooperate with graphics processor 54 to render animation
enhancements on
display device 12, 14.
[00107] Display controller 52 may be configured to interact with graphics
processor 54 to
control the display device 12, 14 to display a viewing area defining the
interactive game
environment including navigation to different views of the interactive game
environment. Player
control inputs 50 and the at least one data capture camera device may
continuously detect
player interaction commands to interact with the interactive game environment. For
example, the
player may move a visible game component to a preferred position, select a
visible game
component, reveal an invisible game component, or manipulate the display of
the visible and
invisible game components.
[00108] In some embodiments, display controller 52 may control the display
device 12, 14
using the graphics processor 54 to display the viewing area that may have one
or more visible
and/or invisible game components. In response to the detection of the control
command based
on the player eye gaze data, player eye gesture data, player movement data, or
any
combination thereof, display controller 52 may trigger a graphical animation
effect to represent a
visual update to the visible game components in the viewing area to reveal the
invisible game
components.
[00109] While playing an interactive game on the EGM 10, the eyes of a player
may move
suddenly without the player being conscious of the movement. The eyes of the
player may
demonstrate subconscious, quick, and short movements, even if the player is
not actively
controlling their eyes to move in this manner. These subconscious, quick, and
short eye
movements may affect the game controller's determination of the eye gaze of
the player based
on the player eye gaze data. Accurate processing of the player eye gaze data
related to these
subconscious, quick, and short eye movements may result in detecting the
location of the eye
gaze of the player representative of eye twitching or erratic eye movements
not reflective of the
player's intended eye gaze, and may be distracting to the player. It may be
useful for the player
eye gaze data to be filtered to not reflect these quick and short eye
movements, for example, so
the determination of the eye gaze of the player relative to the viewing area
by the game
controller reflects the intended eye gaze of the player. It may also be useful
for the portion of
the player eye gaze data representative of the subconscious, quick, and short
eye movements
to have less determinative effect on the determined location of the eye gaze
of the player. In
some embodiments, the game controller 44 may define a filter movement
threshold, wherein the
game controller, prior to determining a location of the eye gaze of the player
relative to the
viewing area using the player eye gaze data collected by the at least one data
capture camera
device and updating the rendering of the viewing area, determines that the
player eye gaze
meets the filter movement threshold. The at least one data capture camera
device may collect
player eye gaze data.
[00110] The game controller 44 may process the player eye gaze data to
correspond with a
location on the viewing area. The game controller 44 may determine where the
player is looking
at on the viewing area based on a certain number of previously recorded player
eye gaze data,
for example, by tracking the last ten eye gaze positions to average out where
on the viewing
area the player is looking. The game controller 44 may limit the amount of
previously recorded
player eye gaze data that is used to determine where on the viewing area the
player is looking.
The game controller 44 may filter out, or "smooth out", player eye gaze data
outside of the pre-
determined filter movement threshold, which may represent sudden and
subconscious eye
movement. The game controller 44 may map the eye gaze of the player to the
viewing area
using at least a portion of the filtered player eye gaze data to determine the
location of the
viewing area at which the player is looking.
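As a non-limiting sketch, the averaging of recent positions and the filter movement threshold described above might be combined as follows; the window size and threshold value are assumed for illustration only.

    from collections import deque

    class GazeSmoother:
        """Averages recent gaze samples and drops sudden jumps."""

        def __init__(self, window=10, filter_movement_threshold=150.0):
            self.samples = deque(maxlen=window)          # last N positions
            self.threshold = filter_movement_threshold   # assumed px limit

        def update(self, x, y):
            if self.samples:
                last_x, last_y = self.samples[-1]
                if abs(x - last_x) + abs(y - last_y) > self.threshold:
                    # Sudden, likely subconscious movement: "smooth out"
                    # this sample instead of letting it move the gaze.
                    return self.location()
            self.samples.append((x, y))
            return self.location()

        def location(self):
            n = len(self.samples)
            if n == 0:
                return None
            return (sum(p[0] for p in self.samples) / n,
                    sum(p[1] for p in self.samples) / n)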
[00111] As another example, the game controller 44 may delay processing of the
player eye
gaze data associated with subconscious, quick, and short eye movements, so the
detected
location of the eye gaze of the player does not represent twitching or sudden
unconscious eye
movements which may trigger animation effects causing an unpleasant user
experience. Large
eye motions may also be associated with more delay in processing and more
smoothing. In
some embodiments, the game controller may partition the player eye gaze data
associated with
large eye motions into data representative of shorter eye motions. The game
controller 44 may
analyze the player eye gaze data to determine which data is associated with
subconscious eye
movement or with conscious eye movement based on a filter movement threshold,
a time
threshold, movement threshold, or any combination thereof. Player eye gaze
data associated
with quick eye movements over a certain period of time may be determined by
the game
controller 44 to be subconscious eye movement. The game controller 44 may
delay in
processing this portion of data so the detected location of the eye gaze of
the player may be
stable and may not distract the player, or the game controller may filter out
this data and not
process it. Player eye gaze data associated with large eye movements over a
certain period of
time may be determined by the game controller to be the player losing focus or
being distracted.
The game controller 44 may similarly delay processing of this portion of data
or not process this
portion of data. In some embodiments, game controller 44 may filter out, or
"smooth out" player
eye gaze data, player eye gesture data, player movement data, or a combination
thereof, that
may exceed the filter movement threshold, in the manner described herein.
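The threshold-based classification described in this paragraph might be sketched roughly as follows; the threshold values and the three-way labeling are assumptions for illustration.

    def classify_eye_movement(displacement_px, duration_s,
                              filter_movement_threshold=150.0,
                              time_threshold=0.1):
        """Coarsely classify one eye movement (threshold values assumed).

        Quick, short movements are treated as subconscious twitches; large
        movements over a period of time suggest the player lost focus.
        Either kind may be filtered out or processed with a delay.
        """
        quick = duration_s < time_threshold
        large = displacement_px > filter_movement_threshold
        if quick and not large:
            return "subconscious"   # filter out or delay processing
        if large:
            return "distracted"     # delay processing or do not process
        return "intentional"        # process normally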
[00112] The locations where EGM 10 may be used may have a variety of lighting
conditions.
For example, EGM 10 may be used in a restaurant, a hotel lobby, an airport,
and a casino. It
may be brighter in some locations and darker in other locations, or the light
quality may fluctuate
from brightness to darkness. In some embodiments, EGM 10 may include an
infrared light
source that illuminates the player. The infrared light sources may not
interfere with the eyes of
the player. In some embodiments, the at least one data capture camera device
may be an
infrared data capture camera device. The infrared data capture camera device
may collect
player eye gaze data, player eye gesture data, and player movement data
without being
affected by the lighting conditions of the locations where EGM 10 may be used.
In some
embodiments, EGM 10 may have a plurality of light sources providing a
plurality of spectra of
light, and the at least one data capture camera device may be a plurality of
data capture camera
devices configured to detect a plurality of spectra of light, so the at least
one data capture
camera device may collect player eye gaze data, player eye gesture data, and
player movement
data without being affected by the lighting conditions of the locations where
EGM 10 may be
used.
[00113] A player that plays an interactive game using EGM 10 may be wearing
glasses. The
glasses of the player may cause refractions of the light that illuminates the
player. This may
affect the at least one data capture camera device while it monitors the eye
gaze, eye gesture,
and/or movement of the player. Glasses that comprise an infrared filter may
also interfere with
or affect the at least one data capture camera device while it monitors the
eye gaze, eye
gesture, and/or movement of the player. EGM 10 may recognize that the player
may be
wearing glasses. For example, as the interactive game commences, display
controller 52 may
display on display device 12, 14 using graphics processor 54 a question asking
the player if he
or she is wearing glasses. The player may provide input indicating whether he
or she is wearing
glasses, such as, but not limited to, with an audio command, touch command, or
with the
player's eye gaze. As another example, the game controller 44 may recognize,
based on
processing the player eye gaze data from the at least one data capture camera
device, that the
light illuminating the player may be refracted, and may determine that the
player is wearing
glasses. When EGM 10 recognizes that the player may be wearing glasses, the
game
controller 44 may perform additional and/or more stringent filtering functions
as described herein
to compensate for the player's use of glasses and to accommodate the
refractions of the light
that illuminates the player. For example, the filter movement threshold may be
set to be higher
for players who wear glasses.
[00114] In some embodiments, the game controller 44 may be configured to
predict the
location of the eye gaze of the player relative to the viewing area at a
future time using the
player eye gaze data to facilitate dynamic update to the rendering of the
viewing area. For
example, if the game controller 44 determines that a player is changing their
gaze on a
horizontal plane from the left to the right, the game controller 44 may
predict that the player may
look at a game component displayed on the right side of display device 12, 14.
The ability for
game controller 44 to predict the location of the eye gaze of the player at a
future time may be
useful to rule out inaccurate readings. For example, while a player plays a
game, the at least
one data capture camera device may incorrectly detect a button on the clothing
of a player to be
the player's eyes, and may collect incorrect player eye gaze data based on the
button. Based
on the location of the eye gaze predicted by game controller 44, the incorrect
player eye gaze
data may be ruled out by game controller 44, and may not be processed by game
controller 44
to trigger a control command to update the viewing area with a graphical
animation effect. As
another example, by predicting the location of the eye gaze, the display
controller 52 may adjust
the resolution of the display device 12, 14 where the player is not expected
to be looking. This
may be useful because the EGM 10 may have limited processing power. Not all
visible game
components may require high resolution. Only the game components that the
player is looking
at may require high resolution. The ability for game controller 44 to predict
the location of the
eye gaze of the player may allow display controller 52 to reduce the
resolution of visible game
components that the player may not be looking at, which may increase the
efficiency of the
processing power of the EGM 10.
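A non-limiting sketch of such prediction, outlier rejection, and resolution control follows, using simple linear extrapolation; all function names and gate values are assumptions rather than part of this disclosure.

    def predict_gaze(p_prev, p_curr, dt, horizon):
        """Linearly extrapolate the gaze point `horizon` seconds ahead."""
        vx = (p_curr[0] - p_prev[0]) / dt
        vy = (p_curr[1] - p_prev[1]) / dt
        return p_curr[0] + vx * horizon, p_curr[1] + vy * horizon

    def is_outlier(sample, predicted, gate_px=200.0):
        """Rule out readings far from the prediction (e.g. a button on the
        player's clothing mistaken for the player's eyes)."""
        dx, dy = sample[0] - predicted[0], sample[1] - predicted[1]
        return (dx * dx + dy * dy) ** 0.5 > gate_px

    def level_of_detail(component_pos, predicted, foveal_px=300.0):
        """Render at full resolution only near the predicted gaze point."""
        dx = component_pos[0] - predicted[0]
        dy = component_pos[1] - predicted[1]
        return "high" if (dx * dx + dy * dy) ** 0.5 <= foveal_px else "low"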
[00115] In some embodiments, EGM 10 may apply one or more predictive
techniques to
develop a plurality of predicted points of eye gaze, which, for example, may
approximate and/or
estimate where a player's gaze will travel next. These predictions may also be
provided for use
by graphics processor 54 and/or game controller board 44 in relation to
smoothing out and/or
accounting for removal of transient readings, undesirable artefacts and/or
inadvertent gaze
positions. In some embodiments, the predictions may also be used to improve
the performance
of EGM 10 in relation to gaze capture and/or processing thereof, by, for
example, applying
heuristic techniques to reduce the number of computations and/or capture
frequency by relying
on predictions to interpolate and/or extrapolate between gaze positions
captured.
[00116] For example, when a player looks at a location of a viewing area in an
interactive
game, the EGM 10 may record where they were looking and what events are being
displayed to
the player (e.g., as first movements and/or gaze positions). When an event is
triggered a
second time, the player's gaze movements are recorded into a data storage
system, but then
compared to the first movements. A comparison may include, for example,
comparing
positions, velocities, start and end positions, accelerations, etc. as between
various gaze
movements.
[00117] For example, for each duration, a path and end location may be
calculated, and a
predicted pathway may be developed based on these locations and stored in a
data storage.
[00118] As the event is triggered more times (e.g., more iterations occur),
the data may be
accumulated and a predictive pathing model can be built. Once the predictive
pathing model is
developed, when the event is triggered, the EGM 10 could reduce the frequency
of the gaze
system updates and use the recorded pathing and final location to reduce the overall computing resources required (e.g., performing various steps of
interpolation,
extrapolation using the predictive pathing model).
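One non-limiting way to accumulate recorded gaze paths into a predictive pathing model is to resample each recorded path to a fixed length and average across iterations; the class and parameter names below are assumptions for illustration.

    import numpy as np

    class PredictivePathingModel:
        """Averages gaze paths recorded across iterations of a game event."""

        def __init__(self, samples_per_path=50):
            self.n = samples_per_path
            self.paths = {}               # event id -> list of recorded paths

        def record(self, event_id, path):
            """path: (m, 2) gaze positions captured while the event ran."""
            path = np.asarray(path, float)
            t_old = np.linspace(0.0, 1.0, len(path))
            t_new = np.linspace(0.0, 1.0, self.n)
            resampled = np.column_stack(
                [np.interp(t_new, t_old, path[:, 0]),
                 np.interp(t_new, t_old, path[:, 1])])
            self.paths.setdefault(event_id, []).append(resampled)

        def predict(self, event_id):
            """Mean pathway for the event, or None before any iterations."""
            paths = self.paths.get(event_id)
            return np.mean(paths, axis=0) if paths else None

Once enough iterations have been recorded, the gaze capture frequency could be reduced and intermediate positions interpolated along the predicted pathway, consistent with the approach described above.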
[00119] Accordingly, predictive pathing can also be used to reduce errors
being produced by
the gaze system. Gaze systems may utilize cameras and edge detection to
determine where
the player is looking, and many use infra-red light to see the
player's eye. If there are
other infra-red light sources, for example, such sources may impact the gaze camera and may reduce the accuracy of the gaze detection. Accordingly,
predictive pathing may
be useful to reduce error in similar situations where there may otherwise be
recorded errors
and/or aberrations.
[00120] Further, predictions may not be limited only to a current player. For
example,
aggregate information from a large population of players may be aggregated
together to refine
the model for predictive pathing. The model may, for example, take into
consideration the type
of player, the type of interaction the player is having with the EGM 10, the
characteristics of the
player (e.g., height, gender, angle of incidence), among others.
[00121] In some embodiments, the predictive pathing model may also be utilized
in the context
of a game. For example, if the game includes aspects which may be selectively
triggered based
on various inputs, an input for triggering may include predicted pathways.
In some
embodiments, objects and/or layers may be modified and/or altered.
[00122] In some embodiments, the player may play an interactive game with EGM
10 in
communication with a mobile device. Depending on the game data of the
interactive game, the
player may play the interactive game on EGM 10, on the mobile device, or on
both. The player
may play the interactive game using their eye gaze, eye gestures, movement,
the interface of
the mobile device, or any combination thereof. The player may play the
interactive game using
only the eye gaze of the player while the player holds on to the mobile device
with one or more
hands. The mobile device may, for example, be a computer, personal digital
assistant, laptop,
tablet, smart phone, media player, electronic reading device, data
communication device, or a
wearable device, such as Google™ Glass, a virtual reality device, or any
combination thereof.
The mobile device may be a custom mobile device that may be in communication
with EGM 10.
The mobile device may be operable by a user and may be any portable, networked
(wired or
wireless) computing device including a processor and memory and suitable for
facilitating
communication between one or more computing applications of the mobile device
(e.g. a computing
application installed on or running on the mobile device). A mobile device may
be a two-way
communication device with advanced data communication capabilities having the
capability to
communicate with other computer systems and devices. The mobile device may
include the
capability for data communications and may also include the capability for
voice
communications, in some example embodiments. The mobile device may have at
least one
data capture camera device to continuously monitor the eye gaze, eye gesture,
or movement of
the player and collect player eye gaze data, player eye gesture data, or
player movement data.
[00123] EGM 10 may include a wireless transceiver that may communicate with
the mobile
device, for example using standard WiFi or Bluetooth, or another protocol based
on the wireless
communication capabilities of the mobile device. The player may be able to
play the interactive
game while the mobile device is in communication with EGM 10. When connected
to the EGM
10, the viewing area may be displayed on display device 12, 14 or on the
screen of the mobile
device, or both. The viewing area may have one or more visible game components
and/or
invisible game components. The at least one data capture camera device on the
mobile device
and/or the EGM 10 may collect player eye gaze data, player eye gesture data,
or player
movement data, which may be processed by a game controller 44 of EGM 10 to
determine a
location of the eye gaze of the player relative to the viewing area displayed
on the mobile
device. The game controller 44 may trigger a control command to the display
controller 52 to
dynamically update the rendering of the viewing area based on the player eye
gaze data, player
eye gesture data, or player movement data, and location of the invisible game
component. In
response to the control command from the game controller 44, the display
controller 52 may
control the display device 12, 14, the mobile device, or both, in real-time or
near real-time using
the graphics processor 54 to dynamically update the rendering of the viewing
area to provide a
real-time or near real-time graphical animation effect displayed on the
display device 12, 14, the
mobile device, or both, representative of a visual update to the visible game
components to
reveal the invisible game component in the viewing area.
[00124] In some embodiments, the mobile device in communication with EGM 10
may be
configured to be a display device that complements display device 12, 14 when
playing the
interactive game. The player may interact with the interactive game through
the interface of the
mobile device, through the EGM 10, or any combination thereof. The interactive
game
environment, viewing area, and game components of the interactive game may be
displayed on
the mobile device, display device 12, 14, or any combination thereof.
[00125] In some embodiments, a terminal may be connected to one or more EGM 10
over a
network. The terminal may serve as a registration terminal for setting up the
communication
between the mobile device and any EGM 10 connected to the network. Therefore,
the player
does not have to physically go to EGM 10 to set up the link and play the
interactive game
associated with EGM 10.
[00126] Host system 41 may store account data for players. EGM 10 may
communicate with
host system 41 to update such account data, for example, based on wins and
losses. In an
embodiment, host system 41 stores the aforementioned game data, and EGM 10 may
retrieve
such game data from host system 41 during operation.
[00127] In some embodiments, the electronics on the various boards described
herein may be
combined onto a single board. Similarly, in some embodiments, the electronics
on the various
controllers and processors described herein may be integrated. For example,
the processor of
game controller board 44 and graphics processor 54 may be a single integrated
chip.
[00128] EGM 10 may be configured to provide one or more player eye gaze, eye
gesture, or
movement interactions to one or more games playable at EGM 10. The
enhancements may be
to a primary interactive game, secondary interactive game, bonus interactive
game, or
combination thereof.
[00129] Fig. 2B illustrates an online implementation of a gaming system that
may continuously
monitor the eye gaze of a player as described herein. The eye gaze of the
player may be
monitored and/or predicted such that data relating to tracked positions,
trajectories, etc. may be
obtained. Data may be processed to obtain further information, such as various
derivatives of
eye gaze data, including, for example, velocity, acceleration, jerk, and snap.
The eye gaze data
may be processed (e.g., smoothed out) to remove undesirable characteristics,
such as
artefacts, transient movements, vibrations, and inconsistencies caused by head
movements,
blinking, eye irregularities, eyelid obstruction, etc.
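A minimal sketch of deriving such quantities from a smoothed gaze trace by finite differences follows; the fixed sampling interval dt is an assumption for illustration.

    import numpy as np

    def gaze_derivatives(positions, dt):
        """Finite-difference derivatives of a gaze trace sampled every dt s.

        positions: (n, 2) smoothed gaze points
        Returns velocity, acceleration, jerk, and snap arrays.
        """
        pos = np.asarray(positions, float)
        velocity = np.gradient(pos, dt, axis=0)           # 1st derivative
        acceleration = np.gradient(velocity, dt, axis=0)  # 2nd derivative
        jerk = np.gradient(acceleration, dt, axis=0)      # 3rd derivative
        snap = np.gradient(jerk, dt, axis=0)              # 4th derivative
        return velocity, acceleration, jerk, snap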
[00130] The gaming system may be an online gaming device (which may be an
example
implementation of an EGM). As depicted, the gaming system includes a gaming
server 40 and
a gaming device 35 connected via network 37.
[00131] In some embodiments, gaming server 40 and gaming device 35 cooperate
to
implement the functionality of EGM 10, described above. So, aspects and
technical features of
EGM 10 may be implemented in part at gaming device 35, and in part at gaming
server 40.
[00132] Gaming server 40 may be configured to enable online gaming, and may
include game
data and game logic to implement the games and enhancements disclosed herein.
For
example, gaming server 40 may include a player input engine configured to
process player input
and respond according to game rules. Gaming server 40 may include a graphics
engine
configured to generate the interactive game environment as disclosed herein.
In some
embodiments, gaming server 40 may provide rendering instructions and graphics
data to
gaming device 35 so that graphics may be rendered at gaming device 35.
[00133] Gaming server 40 may also include a movement recognition engine that
may be used
to process and interpret collected player eye gaze data, player eye gesture
data, and player
movement data, to transform the data into data defining manipulations and
player interaction
commands.
[00134] Network 37 may be any network (or multiple networks) capable of
carrying data
including the Internet, Ethernet, POTS line, PSTN, ISDN, DSL, coaxial cable,
fiber optics,
satellite, mobile, wireless (e.g. Wi-Fi, WiMAX), SS7 signaling network, fixed
line, local area
network, wide area network, and others, including any combination of these.
[00135] Gaming device 35 may be particularly configured with hardware and
software to
interact with gaming server 40 via network 37 to implement gaming
functionality and render 2D
or 3D enhancements, as described herein. For simplicity, only one gaming
device 35 is shown
but an electronic gaming system may include one or more gaming devices 35
operable by
different players. Gaming device 35 may be implemented using one or more
processors and
one or more data stores configured with database(s) or file system(s), or
using multiple devices
or groups of storage devices distributed over a wide geographic area and
connected via a
network (which may be referred to as "cloud computing"). Aspects and technical
features of
EGM 10 may be implemented using gaming device 35.
[00136] Gaming device 35 may reside on any networked computing device, such as
a personal
computer, workstation, server, portable computer, mobile device, personal
digital assistant,
laptop, tablet, smart phone, an interactive television, video display
terminals, gaming consoles,
electronic reading device, and portable electronic devices or a combination of
these.
[00137] Gaming device 35 may include any type of processor, such as, for
example, any type
of general-purpose microprocessor or microcontroller, a digital signal
processing (DSP)
processor, an integrated circuit, a field programmable gate array (FPGA), a
reconfigurable
processor, a programmable read-only memory (PROM), or any combination thereof.
Gaming
device 35 may include any type of computer memory that is located either
internally or
externally such as, for example, random-access memory (RAM), read-only memory
(ROM),
compact disc read-only memory (CDROM), electro-optical memory, magneto-optical
memory,
erasable programmable read-only memory (EPROM), and electrically-erasable
programmable
read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like.
[00138] Gaming device 35 is operable to register and authenticate users (using
a login, unique
identifier, and password for example) prior to providing access to
applications, a local network,
network resources, other networks and network security devices. The computing
device may
serve one user or multiple users.
[00139] Gaming device 35 may include one or more input devices (e.g. player
control inputs
50), such as a keyboard, mouse, camera, touch screen and a microphone, and may
also
include one or more output devices such as a display screen (with 3D
capabilities) and a
speaker. Gaming device 35 has a network interface in order to communicate with
other
components, to access and connect to network resources, to serve an
application and other
applications, and perform other computing applications.
[00140] Gaming device 35 connects to gaming server 40 by way of network 37 to
access
technical 2D and 3D enhancements to games as described herein. Multiple gaming
devices 35
may connect to gaming server 40, each gaming device 35 operated by a
respective player.
[00141] Gaming device 35 may be configured to connect to one or more other
gaming devices
through, for example, network 37. In some embodiments, the gaming server 40
may be utilized
to coordinate the gaming devices 35. Where gaming devices 35 may be utilized
to facilitate the
playing of a same game, such as an interactive game, wherein the interactive
game includes
interaction between activities performed by the players on the gaming devices
35, various
elements of information may be communicated across network 37 and/or server
40. For
example, the elements of information may include player eye gaze data, player
eye gesture
data, player movement data, and/or the viewing area displayed on the gaming
device 35. This
information may be used by each of the gaming devices 35 to provide and/or
display interfaces
that take into consideration the received data from another gaming device 35.
The gaming
devices 35 may be configured for cooperative and/or competitive play (or a
combination thereof)
between the players in relation to various game objectives, events, and/or
triggers.
[00142] Fig. 5 is a flowchart of a method 500 implemented by EGM 10 using
various
components of EGM 10. For simplicity of illustration, method 500 will be
described with
reference to Fig. 2A and EGM 10, but it may be implemented using gaming device 35, gaming server 40, or a combination thereof.
[00143] As shown, EGM 10 may include a card reader 34 to identify a monetary
amount
conveyed by a player to the electronic gaming machine.
[00144] EGM 10 may include at least one data storage device storing game data
for at least
one interactive game or at least one bonus interactive game, or both.
[00145] EGM 10 may include graphics processor 54 to generate an interactive
game
environment and define a viewing area as a subset of the interactive game
environment. The
viewing area may have one or more visible game components masking or blocking
one or more
invisible game components based on the game data.
[00146] EGM 10 may include display device 12, 14 to display via a user
interface the viewing
area.
[00147] EGM 10 may include display controller 52 to control rendering of the
viewing area on
the display device 12, 14 using the graphics processor 54.
[00148] EGM 10 may include at least one data capture camera device to
continuously monitor
eye gaze of a player to collect player eye gaze data.
[00149] EGM 10 may include a game controller 44 for determining a location of
the eye gaze of
the player relative to the viewing area using the player eye gaze data, the
location
corresponding to the invisible game component, and triggering a control
command to the
display controller 52 to dynamically update the rendering of the viewing area
based on the
player eye gaze data and the location.
[00150] In response to detection of the control command, the display
controller 52 controls the
display device 12, 14 in real-time or near real-time using the graphics
processor 54 to
dynamically update the rendering of the viewing area to provide a real-time or
near real-time
graphical animation effect displayed on the display device 12, 14
representative of a visual
update to the visible game components to reveal the invisible game components
in the viewing
area.
[00151] In response to an outcome of the interactive game, the card reader 34
updates the
monetary amount.
[00152] At 502 (Fig. 5), the at least one data capture camera device and the
display device 12,
14 may be calibrated by game controller 44 and display controller 52 as
described herein.
[00153] At 504, the graphics processor 54 may generate the interactive game
environment in
accordance with the set of game rules using the game data and define a viewing
area as a
subset of the interactive game environment. The viewing area may have one or
more visible
game components masking or blocking one or more invisible game components.
[00154] At 506, display controller 52 may control the display device 12, 14 to display the viewing area via a user interface.
[00155] At 508, the at least one data capture camera device may continuously
monitor the eye
gaze, eye gesture, and/or movement to collect player eye gaze data, player eye
gesture data,
and/or player movement data.
[00156] At 510, the game controller 44 may determine a location of the eye
gaze of the player
relative to the viewing area as described herein using the player eye gaze
data, player eye
gesture data, and/or player movement data, the location corresponding to the
invisible game
component, and trigger a control command to the display controller 52 to
dynamically update
the rendering of the viewing area based on the player eye gaze data, player
eye gesture data,
and/or player movement data, and the location of the invisible game component.
[00157] At 512, display controller 52 may, in response to detection of the
control command,
control the display device 12, 14 using the graphics processor 54 to
dynamically update the
rendering of the viewing area to provide a real-time or near real-time
graphical animation effect
displayed on the display device 12, 14 representative of a visual update to
the visible game
components to reveal the invisible game components in the viewing area.
[00158] At 514, display controller 52 may trigger a winning outcome of the
game for provision
of an award based on the interactions of the player and the game, which may be
associated
with the player eye gaze data, the player eye gesture data, the player
movement data, and/or
the revealed invisible game components. The card reader 34 may update the
monetary
amount.
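By way of illustration only, steps 502 through 514 can be summarized as a single control loop. The following Python sketch shows one possible structure for that loop; the camera, game controller, display controller, and card reader objects and all of their method names are hypothetical placeholders for this illustration, not interfaces defined by this disclosure.

    # Hypothetical sketch of method 500 as a control loop.
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        x: float   # horizontal gaze position on the display, in pixels
        y: float   # vertical gaze position on the display, in pixels

    def run_interactive_game(camera, game_controller, display_controller,
                             card_reader):
        # Step 502: calibrate the data capture camera device and the display.
        camera.calibrate()
        display_controller.calibrate()

        # Step 504: generate the interactive game environment and define the
        # viewing area as a subset of it.
        viewing_area = game_controller.generate_viewing_area()

        # Step 506: render the viewing area via the user interface.
        display_controller.render(viewing_area)

        while not game_controller.game_over():
            # Step 508: continuously collect player eye gaze data.
            gaze = camera.read_gaze()   # assumed to return a GazeSample

            # Step 510: map the gaze to a location in the viewing area and
            # test whether it corresponds to an invisible game component.
            location = viewing_area.map_to_game_coordinates(gaze)
            component = viewing_area.invisible_component_at(location)

            if component is not None:
                # Step 512: trigger a control command so the rendering is
                # dynamically updated with a reveal animation in real time.
                display_controller.reveal(component, location)

        # Step 514: on a winning outcome, update the monetary amount.
        if game_controller.winning_outcome():
            card_reader.update_monetary_amount(game_controller.award())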
[00159] In some embodiments, the EGM 10 may recognize potential players
proximate to the
EGM 10. As shown in Fig. 6, the at least one data capture camera device may
continuously
monitor an area proximate to the EGM 10 to collect proximity data. The game
controller 44 may
process the proximity data to detect if a person is proximate to the EGM 10.
If a person is
detected proximate to the EGM 10 then the display controller 52 controls the
display device 12,
14 to display an advertisement. The ability for EGM 10 to recognize potential
players proximate
to the EGM 10 and commence active self-promotion is useful to gain a
competitive advantage
over other gaming machines. It may also be useful for welcoming and
encouraging players to
play the game and provide the player with a sense of astonishment. In contrast
to a gaming
machine that may interact with a player after the player has inserted a
ticket, pressed a button,
or touched a screen, EGM 10 actively starts the player's decision-making
process to interact
with EGM 10 sooner.
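A minimal sketch of this proximity-triggered self-promotion is shown below, assuming a hypothetical per-sample distance reading derived from the proximity data and an assumed detection threshold; neither value is specified by this disclosure.

    # Hypothetical sketch of proximity-triggered advertisement display.
    PROXIMITY_THRESHOLD_METRES = 2.0   # assumed detection radius

    def on_proximity_sample(distance_metres, display_controller):
        """Show an advertisement when a person is detected near the EGM."""
        if distance_metres is not None and \
                distance_metres < PROXIMITY_THRESHOLD_METRES:
            display_controller.show_advertisement()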
[00160] In some embodiments, the display controller 52 may render a gaze-
sensitive user
interface on the display device 12, 14, wherein the game controller 44 detects
the location of the
eye gaze of the player relative to the viewing area using the player eye gaze
data, and triggers
the control command to display controller 52 to dynamically update the
rendering of the viewing
area to provide a real-time or near real-time graphical animation effect
displayed on the
display device 12, 14 representative of a visual update to the gaze-sensitive
user interface. For
example, display controller 52 may control display device 12, 14 to display a
gaze-sensitive user
interface as shown in Fig. 7A and Fig. 7B. The player may gaze at the one or
more visible
game components 710 at the top of the display device 12, 14, and the display
controller 52 may
cause a graphical animation effect to be displayed representative of reducing
the size of or
hiding an options menu 720 at the bottom of the display device 12, 14.
[00161] As shown in Fig. 7A, the options menu 720 may be small and out of the
way. As the
options menu 720 is being hidden, display controller 52 may cause another
graphical animation
effect to be displayed representative of enlarging the one or more visible
game components 710
to use the portion of the display device 12, 14 vacated by the options menu
720. As another
example, as illustrated in Fig. 7B, the player may gaze at the bottom of the
display device 12,
14, which may cause the options menu 720 to be revealed and additional options
may appear
on screen. When the options menu 720 is revealed, the one or more visible game
components
710 may reduce or shrink in size to accommodate the options menu 720. The
player may gaze
at a specific area of display device 12, 14, and additional information may be
displayed on
display device 12, 14. Even though the EGM 10 may have only one or two display devices 12, 14, a
gaze-sensitive user interface may effectively increase the size of the display
devices available
to EGM 10. For example, as illustrated in Figs. 7A and 7B, display device 12,
14 may display
one or more visible game components 710 and an options menu 720 without
requiring an
increase in size of the display device 12, 14. The gaze-sensitive user
interface may optimize
the use of the limited space available on display device 12, 14. By monitoring
the eye gaze of
the player, EGM 10 may demonstrate context awareness of what the player is
looking at. For
example, the EGM 10 may detect when the player is distracted by detecting
whether the eye
gaze of the player is on the display device 12, 14.
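One possible gaze test behind such a gaze-sensitive menu is sketched below; the band boundary is an assumed example value rather than a figure taken from this disclosure.

    # Hypothetical sketch of the gaze-sensitive options menu test.
    MENU_REGION_TOP_PX = 900   # assumed: bottom band of the screen holds the menu

    def menu_should_be_visible(gaze_y_px):
        """Reveal the options menu only while the player looks at the bottom band.

        While the menu is hidden, the visible game components may enlarge into
        the vacated space; once it is revealed, they shrink to accommodate it.
        """
        return gaze_y_px >= MENU_REGION_TOP_PX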
[00162] EGM 10 may reward a player for maintaining their eye gaze on positive
game aspects.
For example, the at least one data capture camera device may collect player
eye gaze data that
may indicate that the player is looking at a particular positive game
component, such as, but not
limited to, a positive game component representative of the rewarding of
points, credits, prizes,
or a winning line on a reel game. The player eye gaze data may also indicate
that the player is
looking at a particular positive game component that may be a revealed
invisible game
component. The display controller 52 may control the display device 12, 14 to
display a
graphical animation effect to enhance the positive game component with
additional fanfare, for
example, a special particle effect, fireworks, additional resolution and/or
size of the positive
game component, greater colour contrast and brightness, or lights and noises.
In some
embodiments, the graphical animation effect may correlate with the amount of
time the player
has maintained their eye gaze on the positive game component. The longer the
player focuses
their eye gaze on the positive game component, the more graphical animation
effects may be
displayed by display controller 52 on display device 12, 14 and/or the
duration of the graphical
animation effects may be extended.
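One simple way to correlate the fanfare with dwell time is a staged mapping from dwell duration to an effect level, as in the following sketch; the timing thresholds and effect tiers are assumptions for illustration only.

    # Hypothetical sketch: fanfare level grows with gaze dwell time.
    def fanfare_level(dwell_seconds):
        """Map dwell time on a positive game component to an effect level."""
        if dwell_seconds < 0.5:
            return 0   # no extra fanfare yet
        if dwell_seconds < 2.0:
            return 1   # e.g. a special particle effect
        if dwell_seconds < 4.0:
            return 2   # e.g. particles plus fireworks
        return 3       # full fanfare: lights, noises, greater colour contrast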
[00163] The EGM 10 may include a display device 12, 14 with auto stereoscopic
3D
functionality. In some embodiments, the player may interact with a game
component presented
on a display device 12, 14 with auto stereoscopic 3D functionality. The game
component may
appear to be hovering. The player may interact with the game component with
the eye gaze of
the player. For example, the focus of the eye gaze may cause the display
controller 52 to
control display device 12, 14 with auto stereoscopic 3D functionality to
provide a graphical
animation effect representative of rotating the game component. As another
example, the focus
of the eye gaze may cause the display controller 52 to control display device
12, 14 with auto
stereoscopic 3D functionality to provide a graphical animation effect
representative of revealing
an invisible game component. An EGM 10 that has a display device 12, 14
with auto
stereoscopic 3D functionality may allow a player to interact with the
interactive game without
their hands. This may be useful to not distract from or spoil the 3D effect
provided by the
display device 12, 14 with auto stereoscopic 3D functionality. Where the
display device is a
stereoscopic display device, the graphics processor 54 may generate left and
right eye images
based on a selected three-dimensional intensity level, and the game controller
44 may trigger
the control command to the display controller 52 to dynamically update the
rendering of the left
and right eye images based on the player eye gaze data.
[00164] Tracking the eye gaze, eye gesture, and movement of a player to reveal
masked or
blocked invisible game components may be implemented for a variety of
interactive games and
graphical animation effects. For example, the game may be a game with a reel
space and
game symbols. As another example, the game may be a game to focus eye gaze on
a game
component. The eye gaze of the player on display device 12, 14 may be
implemented as a
graphical animation effect to find and reveal a hidden or obscured game
component. As yet
another example, the game component manipulated by the player's eye gaze, eye
gesture, and
movement may be a virtual avatar. The virtual avatar may be navigated in the
game using the
player eye gaze data, player eye gesture data, player movement data, or any
combination
thereof, to avoid obstacles and collect rewards.
[00165] In some embodiments, the display controller 52 may cause display
device 12, 14 to
display one or more visible game components in front of one or more invisible
game
components. Based on the player eye gaze data, such as player eye gaze data
that may
represent maintaining the player's eye gaze on the one or more visible game
components, the
graphical animation effect displayed on display device 12, 14 may represent
looking behind the
visible game component masking or blocking the invisible game component to
reveal the
invisible game component. For example, the graphical animation effect may be
such that the
visible game component slides away from its location or pivots inwardly or
outwardly to reveal
the invisible game component behind the visible game component.
[00166] In some embodiments, the game controller 44 may process player eye
gaze data to
determine the location of the eye gaze of the player relative to the viewing
area and may trigger
a control command for the display controller 52 to reveal an invisible game
component and
select the revealed invisible game component for a primary interactive game or
a bonus
interactive game. The display controller 52 may cause display device 12, 14 to
render an
invisible game component behind a visible game component. The player may focus
their eye
gaze at the visible game component for a certain period of time. After a
certain pre-defined
period of time, the invisible game component may be revealed. The player may
focus their eye
gaze at the revealed invisible game component for a certain period of time.
After a certain pre-
defined period of time, the display controller 52 may display on display
device 12, 14 using
graphics processor 54 a graphical animation effect representative of the
player selecting the
revealed invisible game component. The selection may trigger an event related
to the
interactive game, such as, but not limited to, a prize award, a bonus game,
advancement or
progression in the interactive game, ending the game, or any combination
thereof. In some
embodiments, after the invisible game component has been revealed, the player
may select the
revealed invisible game component through display device 12, 14, which may be
a touch-
sensitive display device.
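The two pre-defined periods described above suggest a small dwell-time state machine, sketched below; the threshold values and state names are assumptions for illustration only.

    # Hypothetical sketch: dwell-based reveal followed by dwell-based selection.
    import time

    REVEAL_DWELL_S = 1.5   # assumed pre-defined reveal period
    SELECT_DWELL_S = 1.5   # assumed pre-defined selection period

    class DwellSelector:
        def __init__(self):
            self.state = "hidden"        # hidden -> revealed -> selected
            self.dwell_started = None

        def update(self, gaze_on_component, now=None):
            """Feed one gaze sample; return the component's current state."""
            now = time.monotonic() if now is None else now
            if not gaze_on_component:
                self.dwell_started = None   # gaze left: reset the dwell timer
                return self.state
            if self.dwell_started is None:
                self.dwell_started = now
            dwell = now - self.dwell_started
            if self.state == "hidden" and dwell >= REVEAL_DWELL_S:
                self.state, self.dwell_started = "revealed", now
            elif self.state == "revealed" and dwell >= SELECT_DWELL_S:
                self.state = "selected"     # triggers the related game event
            return self.state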
[00167] An embodiment of the player's ability to reveal invisible game
components, where the
player's eye gaze may be represented by EGM 10 to act as x-ray vision, and
select the revealed
invisible game components is illustrated in Figs. 9 through 13. Player 900 may
be playing an
interactive game on EGM 10. The interactive game shown in Figs. 9 through 13
is a reel game,
but the interactive game may be any type of game. Display controller 52 may
control display
device 12, 14 to display viewing window 910 that may contain visible game
components 940a,
940b, and 940c. Depending on the game data of the interactive game, there may
be more or
fewer visible game components. The visible game components 940a, 940b, and/or
940c may be
masking or blocking one or more invisible game components. For example, in
Fig. 9, the player
900 may be presented with a screen on display device 12, 14 to pick a prize,
which may be
represented by visible game components 940a, 940b, and/or 940c. The prize may
be a
multiplier prize, which may multiply the bonus bet made by the player 900. The
player may not
know the details of the multiplier prize, as the details may be represented by
an invisible game
component. The at least one data capture camera device may monitor the eye
gaze of player
900 to collect player eye gaze data. Game controller 44 may calculate the
location of the
player's eye gaze on the display device 12, 14 and map the player's eye gaze
930 to the
viewing area. The player eye gaze data may correspond to the player 900
focusing their eye
gaze on one or more of the visible game components 940a, 940b, or 940c.
[00168] The eye gaze 930 may correspond to the location of the invisible game
component.
Game controller 44 may trigger a control command to display controller 52 to
display on display
device 12, 14 using graphics processor 54 a graphical animation effect
representative of a
visual update to the visible game component to reveal the invisible game
component. As shown
in Fig. 10, player 900 may focus their eye gaze 930 on visible game component
940a to reveal
the details of the multiplier prize. The at least one data capture camera
device may collect
player eye gaze data that indicates that the player 900 is focusing their eye
gaze 930 on visible
game component 940a for a certain period of time. The focus of the player's
eye gaze 930 may
correspond to the location of the invisible game component. As shown in Fig.
11, the game
controller 44 may send a control command to display controller 52 to display a
graphical
animation effect on display device 12, 14 to remove visible game component
940a and reveal
the invisible game component 950a. In Fig. 11, the revealed invisible game
component 950a
may be a "2X" multiplier bonus. The multiplier prize corresponding to the
revealed invisible
game component 950a may be revealed but may not be selected. The player 900
may make a
decision based on the revealed invisible game component 950a.
[00169] For example, the player 900 may not believe that a "2X" multiplier
prize is a desirable
bonus prize. As shown in Fig. 12, the player 900 may focus their eye gaze 930
on another
visible game component, such as visible game component 940b. The game
controller 44 may
determine that player 900 is looking at visible game component 940b for a
certain period of
time. This may cause game controller 44 to send a control command to display
controller 52 to
display a graphical animation effect on display device 12, 14 using graphics
processor 54
representative of a visual update to a visible game component to reveal an
invisible game
component, such as invisible game component 950b, a "3X" multiplier prize, as
shown in Fig.
13. The multiplier prize corresponding to the revealed invisible game
component 950b may be
revealed but may not be selected. After the player 900 has revealed the
invisible game
components, where EGM 10 represented the player's eye gaze 930 as x-ray
vision, the player
900 may focus their eye gaze on a revealed invisible game component to select
the revealed
invisible game component. The selection of the "3X" multiplier prize may
trigger an event
related to the interactive game, such as multiplying the bonus bet of player
900 by three.
[00170] In some embodiments, the display controller 52 may cause display
device 12, 14 to
display one or more visible game components in front of one or more invisible
game
components. The graphical animation effect may represent seeing through or
rendering
transparent the visible game component masking or blocking the invisible game component to
reveal the invisible game component. For example, based on the eye gaze of the
player that
may be focused on a portion of the visible game component, the display
controller 52 may
cause a graphical animation effect to be displayed on display device 12, 14
such that the
portion of the visible game component that the player may be looking at may
become
translucent to a certain degree or transparent. This may reveal the invisible
game component
hidden behind the visible game component.
[00171] In some embodiments, one or more invisible game components may be
located in one
or more portions of a viewing area, according to the game data of the
interactive game stored in
the at least one data storage device. A player may look at one or more
portions of display
device 12, 14 to reveal the one or more invisible game components. The at
least one data
capture camera device may collect player eye gaze data based on the player's
eye gaze. The
game controller 44, processing the player eye gaze data, may determine that
there may be
movement of the eye gaze of the player from one location of the display device
12, 14 to
another. The game controller 44 may map the eye gaze of the player to the
viewing area. The
location of the eye gaze of the player may correspond to one or more invisible
game
components that may be masked or blocked by one or more visible game
components. The
game controller 44 may send a control command to the display controller 52. In
response to the
control command, the display controller 52 may control display device 12, 14
in real-time or near
real-time using graphics processor 54 to update the rendering of the viewing
area with a
graphical animation effect that may represent a visual update of the one or
more visible game
components to reveal the one or more invisible game components.
[00172] In some embodiments, the at least one data capture camera device of
EGM 10 may
continuously monitor an eye gesture of the player to collect player eye
gesture data. Moreover,
the at least one data capture camera device of EGM 10 may continuously monitor
the player's
movement, such as movement of the player's head, movement of the player's
body, or gestures
made by the player to collect player eye gaze data or player movement data.
The game
controller 44 may trigger the control command to the display controller 52 to
dynamically update
the rendering of the viewing area based on the player eye gesture data and/or
player movement
data using the graphical animation effect to update the one or more visible
game components to
reveal the one or more invisible game components in the viewing area. For
example, the at
least one data capture camera device may collect player eye gesture data
representative of the
player squinting or widening their eyes at a visible or invisible game
component displayed on
display device 12, 14. The game controller 44 may trigger a control command to
display
controller 52 to update the rendering of the viewing area in real-time or in near
real-time by
displaying a graphical animation effect representative of revealing and
magnifying the invisible
game component. As another example, the at least one data capture camera
device may
collect player movement data representative of the player moving their hand in
a certain
direction towards a visible or invisible game component displayed on display
device 12, 14. The
game controller 44 may trigger a control command to display controller 52 to
update the
rendering the viewing area in real-time or in near real-time by displaying a
graphical animation
effect representative of revealing the invisible game component, and/or
interacting with the
visible or invisible game component.
[00173] In some embodiments, one or more invisible game components may be
located in one
or more portions of a viewing area, according to the game data of the
interactive game stored in
the at least one data storage device. A player may look at or gesture at one
or more portions of
display device 12, 14 to reveal the one or more invisible game components. The
at least one
data capture camera device may collect player eye gaze data, player eye
gesture data, and/or
player movement data. The game controller 44, processing the player eye gaze
data, player
eye gesture data, and/or player movement data, may detect the eye gaze, eye
gestures, and/or
movement of the player from one location of the display device 12, 14 to
another. The game
controller 44 may map the eye gaze, eye gestures, and/or movement of the
player to the
viewing area. The location of the eye gaze, the eye gesture, and/or movement
of the player
may correspond to one or more invisible game components that may be masked or
blocked by
one or more visible game components. The game controller 44 may send a control
command
to the display controller 52. In response to the control command, the display
controller 52 may
control display device 12, 14 in real-time or near real-time using graphics
processor 54 to
update the rendering of the viewing area with a graphical animation effect
that may represent a
visual update of the one or more visible game components to reveal the one or
more invisible
game components, based on the player eye gaze data, player eye gesture data,
and/or player
movement data.
[00174] In some embodiments, the interactive game may require skill from the
player to
complete. For example, the interactive game may require a player to complete a
task within a
finite amount of time. The amount of time remaining for the player to complete
the task may be
displayed on display device 12, 14 to increase pressure on the player. For
example, the
interactive game may be a skill-based maze bonus game. The player may control
an avatar
using the player's eye gaze to travel through a series of mazes. The player
may cause the
avatar to collect prizes. There may be a timer to indicate the amount of time
the player has to
navigate the maze. The maze bonus game may include visible game components
and/or
invisible game components. The maze may include traps that may be visible or
invisible. A
visible game component may mask or block an invisible trap. The player may
look at a portion
of the display device 12, 14, which may be mapped to a location of the viewing
area that may
correspond to an invisible trap, to cause display controller 52 to display on
display device 12, 14
a graphical animation effect representative of a visual update to the visible
game component to
reveal the invisible trap. The player may look at the traps with their gaze to
deactivate the traps
and allow the avatar to continue through the maze. Once the player has guided
the avatar to
the exit, the player may play a new stage of the maze based upon the amount of
prizes collected,
or the maze game may finish. The threshold for the amount of prizes needed to
be collected
may progressively increase based upon which bonus stage the player is at. The
maze bonus
game may be configured to have one or more levels of difficulty. The higher
the difficulty, the
less time the player may have to complete the maze challenge and the player
may have to
navigate through more traps in the maze.
[00175] In some embodiments, for another skill-based maze game, while the
player leads an
avatar through a maze using the eye gaze of the player, there may be special
tiles that the
display controller 52 may be configured to cause to appear on the display
device 12, 14. The
player may have a specified number of breakable tile actions. While moving
the avatar
through the maze, the player may break any wall by locking their gaze on the
wall. In some
embodiments, a breakable tile action may be an invisible game component that
may be
revealed by the eye gaze of the player. This may be used to help the player to
find the exit.
[00176] In some embodiments, based on the game data of the interactive game,
display
controller 52 may display using graphics processor 54 one or more opaque
objects on display
device 12, 14. To interact with the interactive game, the player may use their
eye gaze as x-ray
vision to see through the one or more visible game components, such as the one
or more
opaque objects, to see the invisible game components, such as hidden
information. For
example, the one or more opaque objects may be a safe. The invisible game
component may
be a graphical element with levers. Inside the safe may be one or more levers
and/or a tumbler.
Based on the player eye gaze data, the display controller 52 may display a
graphical animation
effect that may represent seeing through the safe or revealing the graphical
element with levers
and/or a tumbler. The player may interact with and manipulate the tumbler,
such as turning the
tumbler to the correct position, moving the lever, and rotating in the opposite direction when the
correct positions of the lever and/or tumbler have been reached. The game
controller may
recognize that the correct positions of the lever and/or tumbler have been
reached, and display
controller 52 may display a graphical animation effect representative of
opening the safe, which
may reveal a prize.
[00177] In some embodiments, based on the game data of the interactive game,
display
controller 52 may display using graphics processor 54 one or more series of
switches on display
device 12, 14. To interact with the interactive game, the player may use their
eye gaze as x-ray
vision to see through the visible game component, such as the series of
switches, to reveal the
invisible game component, which may be a graphical element of a series of
circuits and
switches. The hidden circuits may connect the switches. The switches may be
associated with
a prize or a series of listed prizes. The player may focus their gaze on a
switch to select the
switch corresponding to the prize that the player wants to win.
[00178] In some embodiments, as shown in Figs. 14 to 16, based on the game
data of the
interactive game, the graphics processor 54 may generate a fog effect within
the viewing area
1410 that may mask or block the invisible game component. The display
controller 52 may
display on display device 12, 14 the viewing area 1410 that may have a fog
effect, as shown in
Fig. 14. Player 1400 may focus their eye gaze 1430 to one or more portions of
the viewing area
1410. The at least one data capture camera device may monitor the eye gaze
1430 of the
player 1400 and may collect player eye gaze data. The game controller 44 may
calculate the
location of the eye gaze of the player relative to the viewing area 1410 using
the player eye
gaze data. The location of the eye gaze may correspond to the invisible game
component. In
this example, as shown in Fig. 14, the invisible game component may be the
tropical island
background and prizes obscured by the fog effect generated by graphics processor
54. Game
controller 44 may trigger a control command to the display controller 52 to
dynamically update
the rendering of the viewing area 1410 based on the player eye gaze data and
the location of
the eye gaze. In response to the control command, the display controller 52
may control display
device 12, 14 using graphics processor 54 to display a graphical animation
effect to reveal the
invisible game component. In this example, as shown in Fig. 14, the scene is
covered in fog.
The player may not be able to see through the fog. To see through the fog, the
graphical
animation effect may be a transparent circle 1420 displayed at a location
corresponding to the
eye gaze 1430 of the player 1400 that removes the effects of the generated fog
effect and
reveals the tropical island background.
[00179] As shown in Fig. 15, the player 1400 may interact with the interactive
game based on
their eye gaze. For example, the player 1400 may focus their eye gaze 1430.
The at least one
data capture camera device may collect this player eye gaze data. Game
controller 44 may
recognize that the player is focusing their eye gaze 1430. Display controller
52, in response to
a control command from game controller 44, may display a graphical animation
effect on display
device 12, 14 representative of expanding the transparent circle 1420, so the
player may reveal
more invisible game components.
[00180] As shown in Fig. 15, the player 1400 may move their eye gaze 1430. The
game
controller 44 may process the player eye gaze data collected by the at least
one data capture
camera device to determine that the eye gaze 1430 of the player 1400 has moved
to another
location. This new location may correspond to another invisible game
component. The
graphical animation effect displayed by display controller 52 on display
device 12, 14 may
reveal this other invisible game component. In Fig. 15, the player 1400 has
moved their eye
gaze 1430. The corresponding graphical animation effect may be moving the
transparent circle
to reveal a prize that was obscured by the fog effect. As shown in Fig. 16,
the prize 1440 may
be revealed after the player 1400 focuses their eye gaze 1430 at the location
of the prize 1440
for a specified amount of time.
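One plausible realization of this effect is a per-pixel opacity mask for the fog layer that is fully transparent inside a circle centred on the player's gaze, with a radius that grows while the focus is held, as in the following sketch; all numeric values are assumed for illustration.

    # Hypothetical sketch of the fog-reveal mask.
    import math

    def fog_alpha(px, py, gaze_x, gaze_y, radius_px, feather_px=40.0):
        """Return fog opacity (0.0 clear .. 1.0 opaque) for one pixel."""
        distance = math.hypot(px - gaze_x, py - gaze_y)
        if distance <= radius_px:
            return 0.0                  # inside the circle: fog removed
        if distance >= radius_px + feather_px:
            return 1.0                  # far from the gaze: full fog
        # Soft edge between the clear circle and the surrounding fog.
        return (distance - radius_px) / feather_px

    def circle_radius(dwell_seconds, base_px=80.0, growth_px_per_s=60.0,
                      max_px=300.0):
        """Expand the transparent circle while the player holds their focus."""
        return min(base_px + growth_px_per_s * dwell_seconds, max_px)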
[00181] In some embodiments, based on the game data of the interactive game,
display
controller 52 may display using graphics processor 54 one or more avatars on
display device
12, 14. For example, one or more avatars may be a spy. One or more invisible
game
components may be a graphical element of one or more avatars carrying a hidden
document.
The hidden documents may be obscured by the spy avatar. The spy may be
attempting to
sneak out hidden documents. To interact with the interactive game, the player
may use their
eye gaze as x-ray vision to see through the visible game component, such as
the avatars, to
determine which avatar is the spy, and reveal the invisible game component,
such as the hidden
documents. The player may focus their gaze on the spy holding the hidden
documents to select
the spy, which may trigger an event related to the interactive game, such as
winning a prize.
[00182] In some embodiments, one or more players may play in a shared game or
a multi-
player game. The shared game or multi-player game may be a primary interactive game
or a bonus
interactive game. The EGM 10 of one player may be in communication with one or
more other
EGMs, for example, via wireless communication. The at least one data storage
device may store
game data for a shared game or multi-player game that may be a primary multi-
player
interactive game and/or a bonus multi-player interactive game. For example,
during the shared
bonus game, each player may search the viewing area displayed on display
device 12, 14 for
one or more invisible game components masked or blocked by one or more visible
game
components. The at least one data capture camera device on each EGM may
monitor the eye
gaze of the players and collect eye gaze data for each player. The invisible
game component
may be a bonus game component of a set of bonus game components. The bonus
game
components may be displayed as symbols, which may be related to the shared
game or multi-
player game. The player may use their eye gaze, such as focusing their eye
gaze on a portion
of the display device 12, 14, corresponding to the location of an invisible
game component, to
reveal the invisible game component. The player may further use their eye
gaze, such as
focusing their eye gaze on the revealed invisible game component, to select
the revealed
invisible game component. Game controller 44 may detect when a player has
selected a pre-
defined subset of bonus game components, using the player eye gaze data, such
as a player
selecting a certain combination of bonus game components or the same bonus
game
components. The selection of the pre-defined subset of bonus game components
may trigger a
bonus prize reward. For example, the first player to select a matching set of
bonus game
components may win the bonus prize.
[00183] The player may not want to select a revealed invisible game component.
The player
may use their eye gaze or eye gesture, such as looking towards the edge of the
display device
12, 14, or blinking, to reject a revealed first bonus game component. The game
controller 44
may detect the eye gaze and/or eye gesture of the player representative of
rejecting the
revealed invisible game component, and may trigger the display controller 52
to display on the
display device 12, 14 a second bonus game component. The game controller 44 of
EGM 10
may also cause the game controller 44 of another EGM to cause the display
controller of the
other EGM to display on the display device of the other EGM the revealed first
bonus game
component. This may give the effect of a player "passing along" a rejected
bonus game
component to another player.
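A minimal sketch of this "passing along" behaviour is shown below, assuming a hypothetical JSON message format and hypothetical display and socket helpers; the actual inter-EGM message format and transport are not specified by this disclosure.

    # Hypothetical sketch: forwarding a rejected bonus game component.
    import json

    def reject_component(component_id, local_display, peer_socket):
        """Handle an eye-gesture rejection of a revealed bonus component."""
        # Locally, replace the rejected component with a second bonus component.
        local_display.show_next_bonus_component()
        # Forward the rejected component so another EGM can display it.
        message = json.dumps({"type": "passed_component",
                              "component_id": component_id})
        peer_socket.sendall(message.encode("utf-8"))

    def on_peer_message(raw_bytes, local_display):
        """Display a bonus game component passed along by another player."""
        message = json.loads(raw_bytes.decode("utf-8"))
        if message.get("type") == "passed_component":
            local_display.show_bonus_component(message["component_id"])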
[00184] In some embodiments, while one or more players are playing a shared
game or a
multi-player game, the display controller 52 of an EGM for a player may
control the display
device of the EGM to display different layers, each layer corresponding to the
viewing area of
the other players. While a player looks at their display device, the player
may see the other
player's viewing areas hidden behind their viewing area. In some embodiments,
the shared
game may be a reel game. When the players spin the reels, the rendering of
each player's
viewing area may be updated and may be viewed by each player. A player may
view their own
reels on their display device, except where the player's eye gaze is focused
on the display
device. Instead, at the portion of the display device that a player is looking
at, the viewing area
of one or more other players may appear and any win with a combination of the
one or more
viewing areas may give a prize to the one or more players. This may encourage
cooperative
play between multiple players. In some embodiments, a player may press a
button to view the
viewing area of another player.
[00185] In some embodiments, the graphics processor 54 may generate an
interactive game
environment with a set of game rules using game data, such that there may be
one or more
invisible game components. The graphics processor 54 may define a viewing area
as a subset
of the interactive game environment, which may contain one or more invisible
game
components. The display controller 52 may control the display device 12, 14 in
real-time or
near real-time using the graphics processor 54 to update the rendering of the
viewing area to
provide a real-time or near real-time graphical animation effect
representative of rendering
visible at least a portion of the invisible game components in the viewing
area. This may allow
more game components to be displayed on the display device 12, 14, which may
have finite
size. For example, EGM 10 may provide a privacy mode for the player. There may
be a menu
at the bottom of display device 12, 14 that may display the credits conveyed
to EGM 10 by the
player or the amount of credits won by the player. By default, the credits may
be invisible,
blurred out, or masked or blocked by a visible game component. When the player
focuses their
eye gaze on the user interface, the display controller 52 may control display
device 12, 14 to
reveal or display the amount of credits. The graphical animation effect to
reveal or display the
amount of credits may be, for example, to display the invisible credit amount,
to put in focus the
blurred out credit amount, or to remove the visible game component and reveal
the invisible
credit amount. This may allow the player to hide the amount of credits
associated with the
player or the amount of credits won by the player from nearby observers.
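The privacy mode reduces to a containment test between the gaze location and the credit-meter region, as in the following sketch; the meter rectangle is an assumed example layout.

    # Hypothetical sketch of the privacy-mode gaze test.
    METER_RECT = (0, 980, 400, 1080)   # assumed (left, top, right, bottom) px

    def credits_visible(gaze_x, gaze_y):
        """Reveal the credit amount only while the player looks at the meter."""
        left, top, right, bottom = METER_RECT
        return left <= gaze_x <= right and top <= gaze_y <= bottom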
[00186] A player may play one or more games at EGM 10. The player may have the
option of
selecting an interactive game from a plurality of interactive games to be
played at EGM 10 when
the player initially conveys credits to EGM 10. However, not all game selector
symbols may be
displayed on display device 12, 14 because the display device 12, 14 may lack
space. Another
reason may be that one or more game selector symbols may be intentionally
masked or blocked
so the player may find and reveal them to play a hidden or bonus interactive
game. The player may
use their eye gaze to display a plurality of game selector symbols and to
select and play a game
from the plurality of games. The player may also use their eye gaze to find
and reveal one or
more invisible game selector symbols masked or blocked by one or more visible
game
components and to select and play the corresponding game. In some embodiments,
EGM 10
may have a card reader to identify the monetary amount conveyed by the player
to the EGM 10.
The EGM 10 may have at least one data storage device that may store game data
for one or
more primary interactive games and/or bonus interactive games. The graphics
processor 54
may generate an interactive game environment in accordance with a set of game
rules using
the game data and define a viewing area as a subset of the interactive game
environment, the
viewing area having one or more game selector symbols. The viewing area may
also have one
or more visible game components masking or blocking an invisible game selector
symbol. EGM
10 may have display device 12, 14 to display via a user interface the viewing
area. EGM 10
may have a display controller 52 to control rendering of the viewing area on
the display device
12, 14 using the graphics processor 54. At least one data capture camera
device may
continuously monitor the eye gaze of a player to collect player eye gaze data.
[00187] A game controller 44 may determine a location of the eye gaze of the
player relative to
the viewing area using the player eye gaze data, the location corresponding to
the invisible
game selector symbol, and trigger a control command to the display
controller 52 to
dynamically update the rendering of the viewing area based on the player eye
gaze data and
the location of the player eye gaze. In response to the control command, the
display controller
52 may control the display device 12, 14 in real-time or near real-time using
the graphics
processor 54 to dynamically update the rendering of the viewing area to
provide a real-time or
near real-time graphical animation effect displayed on the display device 12,
14 representative
of a visual update corresponding to selecting one of the game selector
symbols, or revealing
and selecting the invisible game selector symbol, in the viewing area and
displaying a selected
interactive game for the selected game selector symbol, the visual update
based on the player
eye gaze data. For example, display controller 52 may control display device
12, 14 to display a
plurality of game selector symbols configured in the shape of a carousel.
Based on the eye
gaze of the player, such as up, down, left, or right, the display controller
52 may control display
device 12, 14 to display a rotating carousel of game selector symbols, which
may reveal
additional and hidden game selector symbols. The player may focus on a portion
of the display
device 12, 14 to reveal one or more invisible game selector symbols. Based on
the eye gaze of
the player, such as looking at or near the center of display device 12, 14,
the rotating carousel
of game selector symbols may slow down or stop at a game selector symbol
corresponding to
the player's preferred game. The player may also focus on one or more visible
game
components to reveal the invisible game selector symbol. In response to an
outcome of the
interactive game, the card reader may update the monetary amount. The player
may focus on
the game selector symbol to select and play the game. In some embodiments, the
player may
scroll through the plurality of game selector symbols or reveal invisible game
selector symbols
using their eye gaze, eye gestures, the movement of their head, the movement
of their body, or
a combination thereof.
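The carousel behaviour may be modelled as a mapping from horizontal gaze position to a rotation speed, as sketched below; the maximum speed and the central dead zone are assumed example values.

    # Hypothetical sketch of the gaze-driven game selector carousel.
    def carousel_velocity(gaze_x, display_width_px,
                          max_speed=2.0, dead_zone_frac=0.2):
        """Map horizontal gaze position to rotation speed (symbols/second).

        Looking left spins the carousel one way and looking right the other;
        gaze at or near the centre returns 0.0 so the carousel slows and stops
        on the symbol for the player's preferred game.
        """
        centre = display_width_px / 2.0
        offset = (gaze_x - centre) / centre   # -1.0 (left) .. 1.0 (right)
        if abs(offset) < dead_zone_frac:
            return 0.0                        # centre gaze: stop the carousel
        return max_speed * offset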
[00188] A player may use their eye gaze to navigate through the interactive
game
environment, change the camera angle on a visible game component or a revealed
invisible
game component, and discover and reveal invisible game components in the
interactive game
environment that may not be in the viewing area. The EGM 10 may have a card
reader to
identify a monetary amount conveyed by a player to the EGM 10. The EGM 10 may
have at
least one data storage device to store game data for an interactive game. The
graphics
processor 54 may generate an interactive game environment in accordance with a
set of game
rules using the game data and define a viewing area as a first subset of the
interactive game
environment, the first subset of the interactive game environment having a
first visible game
component masking or blocking a first invisible game component. The display
device 12, 14
may display via a user interface the viewing area. Display controller 52 may
control rendering of
the viewing area on the display device 12, 14 using the graphics processor 54.
At least one
data capture camera device may continuously monitor eye gaze of a player to
collect player eye
gaze data. The game controller 44 may determine a location of the eye gaze of
the player
relative to the viewing area using the player eye gaze data and triggering a
control command to
the display controller 52 to dynamically update the rendering of the viewing
area based on the
player eye gaze data.
[00189] In response to the control command, the display controller 52 controls
the display
device 12, 14 in real-time or near real-time using the graphics processor to
dynamically update
the rendering of the viewing area in real-time or near real-time to navigate
to the second subset
of the interactive game environment, the second subset of the interactive game
environment
having a second visible game component masking or blocking a second invisible
game
component, wherein the update comprises a graphical animation effect displayed
on the display
device representative of navigating to the second subset of the interactive
game environment,
the update based on the player eye gaze data. In response to an outcome of the
interactive
game, the card reader updates the monetary amount. A player may use their eye
gaze, eye
gestures, head movement, body movement, or any combination thereof to navigate
through the
interactive game environment, change the camera angle on a visible game
component, and
discover and reveal invisible game components in the interactive game
environment that may
not be in the viewing area.
[00190] For example, as illustrated in Fig. 17 and Fig. 18, the graphical
animation effect
displayed on the display device 12, 14 may represent a smooth sliding
transition from the first
subset of the interactive game environment to the second subset of the
interactive game
environment. Graphics processor 54 may generate interactive game environment
1710 in
accordance with a set of game rules using the game data for one or more
interactive games
stored in at least one data storage device. Interactive game environment 1710
may include one
or more game components, some of which may be visible, while others may be
invisible. In Fig.
17 and Fig. 18, two game components 1750a and 1750c are visible, and game
component
1750b is invisible as it may be masked or blocked by a visible game component,
but there may
be more or fewer game components based on the game data of the one or more
interactive
games. Graphics processor 54 may define a viewing area 1740 as a first subset
of the
interactive game environment. In Fig. 17, the viewing area 1740 includes
visible game
component 1750a and excludes invisible game component 1750b. Display device
12, 14 may
display viewing area 1740. Display controller 52 may control rendering of the
viewing area on
the display device 12, 14 using the graphics processor 54. Game controller 44
may process
player eye gaze data collected from the at least one data capture camera
device to determine
that the eye gaze 1730 of the player 1720 may be focused on visible game
component 1750a.
[00191] As illustrated in Fig. 17, player 1720 may view visible game component
1750a on
display device 12, 14. Player 1720 may wish to navigate to another area of the
interactive
game environment 1710. For example, player 1720 may wish to discover visible
game
components that may mask or block invisible game components. Game controller
44 may
determine that the location of the eye gaze of the player relative to the
viewing area has
changed. For example, the player 1720 may be looking at the top of display
device 12, 14.
Based on this change of location of the eye gaze 1730, game controller 44 may
trigger a control
command to the display controller 52 to dynamically update the rendering of
the viewing area
1740. Display controller 52 may update the rendering of the viewing area 1740
in real-time or
near real-time to navigate to the second subset of the interactive game
environment 1710. A
graphical animation effect, such as a sliding animation effect, may be used to
transition from the
viewing area 1740 comprising a first subset of the interactive game
environment 1710 to the
viewing area 1740 comprising a second subset of the interactive game
environment. As shown
in Fig. 18, the viewing area 1740 is a second subset of the interactive game
environment 1710
that is different from the first subset of the interactive game environment
1710. The viewing
area 1740 comprising the second subset of the interactive game environment
1710 contains
invisible game component 1750b. The invisible game component 1750b may be
masked or
blocked by one or more visible game components. Since the viewing area 1740 is
displayed on
display device 12, 14, from the perspective of player 1720, the player's eye
gaze has caused a
transition from a first subset of the interactive game environment 1710 to a
second subset of the
interactive game environment 1710. The effect of the eye gaze of the player
may be to navigate
the interactive game environment 1710. For example, the player 1720 may be
looking at visible
game component 1750a, and through navigation of the interactive game
environment 1710, the
player 1720 discovered a second subset of the interactive game environment.
This may create
the effect that the display device 12, 14 is an infinitely large screen, or a
larger screen than it
actually is. The player 1720 may focus their eye gaze 1730 on the visible game
component
masking or blocking invisible game component 1750b, which may cause display
controller 52 to
display a graphical animation effect on display device 12, 14 representative
of a visual update to
the visible game component to reveal invisible game component 1750b in the
second subset of
the interactive game environment. This may give the player a sense of
discovery and
satisfaction as part of an engaging gaming experience.
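One way to realize this navigation is to pan the viewing area whenever the gaze enters an edge band of the display, producing the smooth sliding transition between subsets of the interactive game environment; in the following sketch, the band width and panning speed are assumed example values.

    # Hypothetical sketch of gaze-driven panning of the viewing area.
    EDGE_BAND_FRAC = 0.15        # assumed: outer 15% of the screen starts a pan
    PAN_SPEED_PX_PER_S = 400.0   # assumed sliding speed of the viewing area

    def pan_direction(gaze_x, gaze_y, width_px, height_px):
        """Return a (dx, dy) pan direction for the sliding transition."""
        dx = dy = 0.0
        if gaze_x < width_px * EDGE_BAND_FRAC:
            dx = -1.0
        elif gaze_x > width_px * (1.0 - EDGE_BAND_FRAC):
            dx = 1.0
        if gaze_y < height_px * EDGE_BAND_FRAC:
            dy = -1.0              # looking at the top: pan the viewing area up
        elif gaze_y > height_px * (1.0 - EDGE_BAND_FRAC):
            dy = 1.0
        return dx, dy

    def step_viewing_area(origin_x, origin_y, direction, dt_seconds):
        """Advance the viewing area origin for one animation frame."""
        dx, dy = direction
        return (origin_x + dx * PAN_SPEED_PX_PER_S * dt_seconds,
                origin_y + dy * PAN_SPEED_PX_PER_S * dt_seconds)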
[00192] As another example, one or more game components 1750 may be within a
viewing
area 1740 and displayed on display device 12, 14 with a certain camera angle
or view angle.
The game controller 44 may process collected player eye gaze data and trigger
a control
command to display controller 52 to update the rendering of the viewing area
1740 in real-time
or near real-time to display a graphical animation effect representative of
changing the camera
angle or view angle. From the perspective of player 1720, the graphical
animation effect may
appear to be a rotation of the one or more game components 1750 on display
device 12, 14. As
yet another example, the player, using their eye gaze, may reveal an invisible
game component
and may rotate the revealed invisible game component.
[00193] The embodiments of the devices, systems and methods described herein
may be
implemented in a combination of both hardware and software. These embodiments
may be
implemented on programmable computers, each computer including at least one
processor, a
data storage system (including volatile memory or non-volatile memory or other
data storage
elements or a combination thereof), and at least one communication interface.
[00194] Program code is applied to input data to perform the functions
described herein and to
generate output information. The output information is applied to one or more
output devices. In
some embodiments, the communication interface may be a network communication
interface. In
embodiments in which elements may be combined, the communication interface may
be a
software communication interface, such as those for inter-process
communication. In still other
embodiments, there may be a combination of communication interfaces
implemented as
- 55 -
[00195] Throughout the following discussion, numerous references will be made
regarding
servers, services, interfaces, portals, platforms, or other systems formed
from computing
devices. It should be appreciated that the use of such terms is deemed to
represent one or
more computing devices having at least one processor configured to execute
software
instructions stored on a computer readable tangible, non-transitory medium.
For example, a
server can include one or more computers operating as a web server, database
server, or other
type of computer server in a manner to fulfill described roles,
responsibilities, or functions. The
devices provide improved computer solutions to hardware limitations such as display screen, display device, and so on.
[00196] The following discussion provides many example embodiments. Although
each
embodiment represents a single combination of inventive elements, other
examples may
include all possible combinations of the disclosed elements. Thus if one
embodiment comprises
elements A, B, and C, and a second embodiment comprises elements B and D,
other remaining
combinations of A, B, C, or D, may also be used.
[00197] The term "connected" or "coupled to" may include both direct coupling
(in which two
elements that are coupled to each other contact each other) and indirect
coupling (in which at
least one additional element is located between the two elements).
[00198] Embodiments described herein may be implemented by using hardware only
or by
using software and a necessary universal hardware platform. Based on such
understandings,
the technical solution of embodiments may be in the form of a software
product. The software
product may be stored in a non-volatile or non-transitory storage medium,
which can be a
compact disk read-only memory (CD-ROM), USB flash disk, or a removable hard
disk. The
software product includes a number of instructions that enable a computer
device (personal
computer, server, or network device) to execute the methods provided by the
embodiments.
[00199] The embodiments described herein are implemented by physical computer
hardware.
The embodiments described herein provide useful physical machines and
particularly
configured computer hardware arrangements. The embodiments described herein
are directed
to electronic machines and methods implemented by electronic machines adapted for
processing
and transforming electromagnetic signals which represent various types of
information. The
embodiments described herein pervasively and integrally relate to machines,
and their uses;
and the embodiments described herein have no meaning or practical
applicability outside their
use with computer hardware, machines, and various hardware components. Substituting the
Substituting the
computing devices, servers, receivers, transmitters, processors, memory,
display, networks
particularly configured to implement various acts for non-physical hardware,
using mental steps
for example, may substantially affect the way the embodiments work. Such
computer hardware
limitations are clearly essential elements of the embodiments described
herein, and they cannot
be omitted or substituted for mental means without having a material effect on
the operation and
structure of the embodiments described herein. The computer hardware is
essential to the
embodiments described herein and is not merely used to perform steps
expeditiously and in an
efficient manner.
[00200] For example, and without limitation, the computing device may be a
server, network
appliance, set-top box, embedded device, computer expansion module, personal
computer,
laptop, personal data assistant, cellular telephone, smartphone device, UMPC
tablets, video
display terminal, gaming console, electronic reading device, and wireless
hypermedia device, or
any other computing device capable of being configured to carry out the
methods described
herein.
[00201] Although the embodiments have been described in detail, it should be
understood that
various changes, substitutions and alterations can be made herein without
departing from the
scope as defined by the appended claims.
[00202] Moreover, the scope of the present application is not intended to be
limited to the
particular embodiments of the process, machine, manufacture, composition of
matter, means,
methods and steps described in the specification. As one of ordinary skill in
the art will readily
appreciate from the disclosure of the present invention, processes, machines,
manufacture,
compositions of matter, means, methods, or steps, presently existing or later
to be developed,
that perform substantially the same function or achieve substantially the same
result as the
corresponding embodiments described herein may be utilized. Accordingly, the
appended
claims are intended to include within their scope such processes, machines,
manufacture,
compositions of matter, means, methods, or steps.
[00203] As can be understood, the examples described above and illustrated are
intended to
be exemplary only.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

Title                            Date
Forecasted Issue Date            Unavailable
(22) Filed                       2015-12-11
(41) Open to Public Inspection   2017-06-11
Dead Application                 2022-03-04

Abandonment History

Abandonment Date   Reason                                       Reinstatement Date
2021-03-04         FAILURE TO REQUEST EXAMINATION
2021-06-11         FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type                                  Anniversary Year   Due Date     Amount Paid   Paid Date
Application Fee                                                           $400.00       2015-12-11
Maintenance Fee - Application - New Act   2                  2017-12-11   $100.00       2017-10-20
Maintenance Fee - Application - New Act   3                  2018-12-11   $100.00       2018-11-23
Maintenance Fee - Application - New Act   4                  2019-12-11   $100.00       2019-11-20
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
IGT CANADA SOLUTIONS ULC
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

Document Description     Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Abstract                 2015-12-11           1                    23
Description              2015-12-11          56                 3,285
Claims                   2015-12-11           8                   373
Drawings                 2015-12-11          20                 1,880
Representative Drawing   2017-05-17           1                    44
Cover Page               2017-05-17           2                    91
New Application          2015-12-11           4                   148
Correspondence           2016-07-26           7                   459
Office Letter            2016-08-29           1                    30
Office Letter            2016-08-30           1                    38