Patent 2880668 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies between the text and image of the Claims and Abstract are due to differing posting times. The text of the Claims and Abstract is posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent Application: (11) CA 2880668
(54) English Title: UNDERWATER PROJECTION WITH BOUNDARY SETTING AND IMAGE CORRECTION
(54) French Title: PROJECTION SOUS L'EAU AVEC DETERMINATION DES LIMITES ET CORRECTION D'IMAGE
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • G03B 43/00 (2021.01)
  • E04H 3/16 (2006.01)
(72) Inventors :
  • REDDY, RAKESH (United States of America)
  • JOHNSON, BRUCE (United States of America)
  • DOYLE, KEVIN (United States of America)
(73) Owners :
  • PENTAIR WATER POOL AND SPA, INC. (United States of America)
(71) Applicants :
  • PENTAIR WATER POOL AND SPA, INC. (United States of America)
(74) Agent: FINLAYSON & SINGLEHURST
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 2013-07-31
(87) Open to Public Inspection: 2014-02-06
Examination requested: 2018-05-09
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2013/053084
(87) International Publication Number: WO2014/022591
(85) National Entry: 2015-01-30

(30) Application Priority Data:
Application No. Country/Territory Date
61/678,606 United States of America 2012-08-01
13/626,867 United States of America 2012-09-25

Abstracts

English Abstract

An underwater projection system, a controller, and a method of controlling are described herein. The controller provides, at least in part, a boundary setting module or methodology and/or an image correction module or methodology through a user interface for the underwater projection system. The user interface enables user control, input, and adjustment of the image controller from an observation point outside of the media of the underwater projection system, while the adjustments are made in situ. An optional automated edge, or edge and surface, detection system is also contemplated to assist in boundary detection within the water feature for the underwater image projection system.


French Abstract

La présente invention se rapporte à un système de projection sous l'eau, à un dispositif de commande et à un procédé de commande. Le dispositif de commande comporte, au moins en partie, un module, ou une méthodologie, de détermination des limites et/ou un module, une méthodologie, de correction d'image à travers une interface utilisateur pour le système de projection sous l'eau. On utilise une interface utilisateur pour permettre un contrôle d'utilisateur et une entrée d'utilisateur et le réglage du dispositif de commande d'image à partir d'un point d'observation situé à l'extérieur du milieu du système de projection sous l'eau tandis que les réglages sont effectués in situ. Un système automatisé de détection de bord et de surface ou de bord facultatif est également considéré pour aider à la détection des limites dans un élément à base d'eau pour le système de projection d'image sous l'eau.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
What is claimed is:
1. A computer enabled device controlling image projection calibration of an underwater projection system, comprising:
a system controller communicating with said underwater projection system;
a water feature in which said underwater projection system is located under water and projects an image therefrom under the water,
wherein the system controller has an at least one boundary mapping module and an at least one image correction module where the image is provided to the controller and projected through said underwater projection system after the system controller executes the at least one boundary mapping module to establish a boundary for the display of the image within the water feature and the system controller executes the at least one image correction module to establish image correction data for correcting the image displayed by said underwater projection system and controls projection of the image based on the correction data within the boundary for display of the image within the water feature by the underwater projection system.

2. The computer enabled device of claim 1, further comprising a user interface communicating with the system controller, wherein data representing the mapping of the at least one boundary mapping module is input by a user through the user interface.

3. The computer enabled device of claim 1, further comprising a user interface communicating with the system controller, wherein the correction data of the image is established through observations from a user and the observations are input through the user interface.

4. The computer enabled device of claim 2, wherein the user interface is a graphical user interface.

5. The computer enabled device of claim 4, wherein the graphical user interface includes a touch screen.

6. The computer enabled device of claim 5, wherein the user input is drawn onto the touch screen.

7. The computer enabled device of claim 3, wherein the user interface is a graphical user interface.

8. The computer enabled device of claim 7, wherein the graphical user interface includes a touch screen.

9. The computer enabled device of claim 8, wherein the user input is drawn onto the touch screen.

10. The computer enabled device of claim 2, wherein the data for the mapping of the at least one boundary mapping module input by a user is provided through placement of projected boundary image within the water feature through the user interface in communication with the system controller controlling projection of the boundary image in the underwater projection system to establish a projection boundary line within an underwater surface within the water feature.

11. The computer enabled device of claim 10, wherein the placement of projected boundary patterns with the water feature through the user interface includes placement around an obstruction within the water feature.

12. The computer enabled device of claim 3, wherein the data for the correction data of the image is established through observations from the user of a test image of known shape on the user interface and correction of the same test image projected through the underwater image projection system to establish the correction data.

13. The computer enabled device of claim 12, wherein the test image is projected sequentially or simultaneously at multiple points throughout the water feature.

14. A method of operating an underwater image projection system, comprising:
executing a boundary mapping module for the underwater image projection system to establish a projection area boundary line or edge;
executing an image correction module for the underwater image projection system to establish a correction map or calibration table for the area within the projection area boundary line or edge;
retrieving image data for an image as input;
adjusting the image data and manipulating the image based on the correction map or calibration table through a dynamic content manipulation system; and
projecting the image underwater in the water feature through the underwater projection system.

15. The method of claim 14, wherein the execution of the boundary system module projects through the underwater image projection system an at least one boundary image.

16. The method of claim 15, wherein the at least one boundary image is displayed on a user interface and a user positions the boundary image at a point of projection.

17. The method of claim 16, wherein the at least one boundary image is further manipulated to change its shape to correspond to a contours of the water feature at the point of projection within the water feature.

18. The method of claim 17, wherein the placement of the at least one boundary image further comprises the placement of multiple boundary images sequentially.

19. The method of claim 17, wherein the placement of the at least one boundary image further comprises the placement of multiple boundary images simultaneously.

20. The method of claim 17, wherein the placement of the at least one boundary image further comprises the placement of multiple boundary images around an obstruction.

21. The method of claim 14, further comprising applying additional corrective variables for the water feature analyzed as input by a user or sensed through sensors in communication with the underwater projection system.

22. The method of claim 21, wherein the additional corrective variables include at least one of a variable relating to placement, timing, display of the at least one boundary image or at least one variable relating to data collected, stored, programmed or utilized for calculations relating to the water feature.

23. The method of claim 14, wherein the execution of the image correction module projects a test image of a known shape into the water feature to produce an uncorrected projected image.

24. The method of claim 23, wherein the uncorrected projected image is analyzed and a correction determined by an observation made from outside the water feature.

25. The method of claim 24, wherein the correction is applied to the test image of a known shape and the underwater projector projects the test image of a known shape with the correction applied to produce an intermediary projected image.

26. The method of claim 25, wherein the intermediary projected image is analyzed and the corrections are repeated until the projected intermediary image is substantially similar to the known image as projected.

27. The method of claim 26, wherein the corrections are observed by a user and a user interface prompts input of the observed uncorrected and intermediary images by the user.

28. The method of claim 27, wherein the user interface is a graphical user interface with a touch screen and the user draws the observed shape and corrections are calculated from the variance of the drawn shape by a controller.

29. A method of establishing a projection area boundary in a water feature defining a projection area for an underwater image projection system, comprising:
projecting an at least one boundary image in the water feature;
adjusting a position of the at least one boundary image;
adjusting or selecting the shape of the at least one boundary image to comport with a surface within the water feature at the position; and
drawing and storing a projection area boundary based on the at least one boundary image placement and shape.

30. The method of claim 29, further comprising adjusting the position of the at least one boundary image using a user interface.

31. The method of claim 30, wherein the user interface is a touch screen enabled device.

32. The method of claim 31, wherein the boundary image is configured through a user changing the shape of the boundary image with the touch screen user interface.

33. The method of claim 32, wherein the boundary image shape is adjusted through selection of shapes in a menu or palette of shapes.
34. A method of correcting an image in a water feature projected by an underwater image projection system from an observation point outside of the water feature, comprising:
observing an uncorrected projected image of a known or expected shape at a specified target area of said water feature alone or across a defined projection area as set by a defined boundary;
inputting observations or sensed variables related to the variance of the uncorrected projected image from the known shape at the specified target area of said water feature alone or across the defined projection area as set by a defined boundary;
projecting the uncorrected projected image with an at least one correction data point to produce an intermediary projected image and observing and inputting further corrections until the intermediary projected image is substantially similar to the known or expected shape; and
storing the corrections applied to the image at the specified target area of said water feature alone or across the defined projection area as set by a defined boundary.

35. The method of claim 34, further comprising input of observations using a user interface.

36. The method of claim 35, wherein the user interface is a touch screen enabled device.

37. The method of claim 34, wherein the observing of an uncorrected projected image and the intermediary projected image is done by a user from a vantage point outside of the water feature.

38. The method of claim 14, wherein the underwater projection system includes an at least two underwater image projection devices and coupling the at least two underwater image projection devices to share data regarding the stored projection area boundary line or edge and correction map or calibration table for each of the at least two underwater projection devices.

39. The method of claim 14, wherein the method is executed by a controller in a computer enabled device.

40. The method of claim 29, wherein the method is executed by a controller in a computer enabled device.

41. The method of claim 34, wherein the method is executed by a controller in a computer enabled device.

42. An underwater light projector with an underwater light projector controller, comprising: a computing device or a controller communicating with a computer enabled device, wherein the computing device or controller communicating with a computer enabled device executes the method of claims 14, 29, or 34 while controlling said underwater light projector and an image issuing therefrom.

Description

Note: Descriptions are shown in the official language in which they were submitted.


UNDERWATER PROJECTION WITH BOUNDARY SETTING AND IMAGE CORRECTION
METHOD OF USING SAME
COPYRIGHT NOTICE
[001] Portions of the disclosure of this patent document contain material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the U.S. Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
CROSS-REFERENCE TO RELATED APPLICATIONS
[002] This application claims the benefit of priority of U.S. Patent Application No. 13/626,867, filed September 25, 2012, which claims the benefit of priority of U.S. provisional patent 61/678,606, filed August 1, 2012, which are incorporated herein by reference.
BACKGROUND OF THE INVENTION
[001] Field of the Invention
[002] The invention relates to, at least in part, a computer enabled device to control a projection system, set a projection boundary, and provide image correction, a controller for controlling in part an underwater projection system, a method of controlling underwater projection systems, a method of boundary setting in an underwater projection system or controller, a method of projection image correction in an underwater projection system or controller, and a method of using a controller in an underwater projection system.
[003] Background of the Invention
[004] In the field of image projection, a number of obstacles impede the accurate projection of images, especially images on a non-uniform surface. This is further compounded when the image is projected in one composition or fluid environment and the observer is located outside of this composition or fluid environment, or, similarly, when the projection of the image and subsequent observation of the image are made in one fluid environment and the projected image target is in another fluid environment, resulting in unacceptable distortion. An example of this distortion can be seen when one observes fish in a body of water: from a vantage outside the water, the position of the fish is distorted, as are its size and sometimes its shape. Additionally, in projecting images in the interior of a body of water like those in water features or in targeted natural water settings, such as but certainly not limited to pools, fountains, spas, sand bars, and the like, surface irregularities make correction in combination with the distortion effects difficult.
[005] Some of the technical difficulties and issues in projecting images in these locations include accommodating the variables of transmissivity within the body of water, the variations between the transmissivity of the water and the interface with air and the location of an observer outside of the body of water, undulating contours and angles within the body of water and in the surfaces being projected upon within the body of water, observational variations based on the position of the observer, and similar variables which must be accommodated to provide ideal image viewing.
[006] As such, a need exists for a projection system, a projection system controller, an underwater projection system, an underwater projection system controller, a computer enabled apparatus for controlling a projection system and/or an underwater projection system, a method of controlling a projection system and/or an underwater projection system, and a method of controlling and adjusting a projected image that overcomes these challenges and provides a robust image quality in a projected image with control of the projected light and, potentially, additional lighting. An in-situ projection system with user observed image control and correction is needed to address the myriad of complexities in projecting such quality images.
[007] SUMMARY OF THE INVENTION
[008] An aspect of the invention is to provide a projection system with a robust method of correcting an image based on sensed or input variables to accommodate variables in projection surface, fluid changes, and other environmental variables as well as spatial variables.
[009] A further aspect of the invention is to provide a controller in an underwater projection system that can be programmed to sense or receive variable input for the dimension of a pool or water feature and define a target projection boundary, and then receive data on variables of known images to provide a correction for the area defined by the target projection boundary and thereby correct images projected from the underwater projection system during operation of same.
[010] Another aspect of the invention is to provide a controller which utilizes an at least one variable regarding the conditions of the water in the water feature to adjust the projection variables for the projection of an image within the water feature.

[011] A further aspect of the invention is to provide an automated data collection system coupled to a controller to provide for automated data collection for one or both of a boundary setting module and an image correction mapping module.

[012] Yet another aspect is to provide a controller for an underwater projector system which is in situ with the fluid environment of a projection boundary area and projects within a water feature, however the observation vantage is outside of the water feature.
[013] The apparatus of the invention includes a computer enabled device controlling image projection calibration of an underwater projection system. The computer enabled device has a system controller communicating with said underwater projection system and a water feature in which said underwater projection system is located under water and projects an image therefrom under the water. The system controller has at least one of an at least one boundary mapping module and/or an at least one image correction module. The image is provided to the controller and projected through said underwater projection system after the system controller executes the at least one boundary mapping module to establish a boundary for the display of the image within the water feature. The system controller executes the at least one image correction module to establish image correction data for correcting the image displayed by said underwater projection system and controls projection of the image based on the correction data within the boundary for display of the image within the water feature by the underwater projection system. Reference is made throughout to the terms pool or water feature in projecting images in a body of water; these terms specifically include natural water sources like those in water features or in targeted natural water settings, such as but certainly not limited to pools, water features, fountains, spas, sand bars, and the like.
[014] The computer enabled device can also include a user interface communicating with the system controller, wherein data representing the mapping of the at least one boundary mapping module is input by a user through the user interface. The user interface can also communicate with the system controller, wherein the correction data of the image is established through observations from a user and the observations are input through the user interface. The user interface can be a graphical user interface. The graphical user interface can include a touch screen. The user input can be drawn onto the touch screen.
[015] The data for the mapping of the at least one boundary mapping module input by a user can be provided through placement of projected boundary image within the water feature through the user interface in communication with the system controller controlling projection of the boundary image in the underwater projection system to establish a projection boundary line within an underwater surface within the water feature. The placement of projected boundary patterns with the water feature through the user interface can include placement around an obstruction within the water. The data for the correction data of the image can be established through observations from the user of a test image of known shape on the user interface and correction of the same test image projected through the underwater image projection system to establish the correction data. The test image can be projected sequentially or simultaneously at multiple points throughout the water feature.
[016] The method of the invention includes a method of operating an underwater image projection system including executing a boundary mapping module for the underwater image projection system to establish a projection area boundary line or edge, executing an image correction module for the underwater image projection system to establish a correction map or calibration table for the area within the projection area boundary line or edge, retrieving image data for an image as input, adjusting the image data and manipulating the image based on the correction map or calibration table through a dynamic content manipulation system, and projecting the image underwater in the water feature through the underwater projection system. A minimal sketch of this control flow follows.
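Purely as an illustration of the flow just described, the following Python sketch walks the same steps: establish a boundary, build a correction map, then adjust and project image data. Every name and data structure here is a hypothetical stand-in; the patent does not prescribe an implementation.

    # Hypothetical stand-ins for the boundary mapping and image correction
    # modules; real modules would derive these from user input or sensors.
    def establish_boundary():
        # Boundary as a closed list of (x, y) vertices.
        return [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)]

    def build_correction_map(boundary):
        # Per-axis scale factors standing in for a correction map or
        # calibration table covering the bounded area.
        return {"sx": 0.9, "sy": 1.1}

    def operate(image_points):
        boundary = establish_boundary()            # boundary mapping module
        cmap = build_correction_map(boundary)      # image correction module
        corrected = [(x * cmap["sx"], y * cmap["sy"]) for x, y in image_points]
        print("project", corrected, "within", boundary)  # stand-in for projection

    operate([(1.0, 2.0), (3.0, 4.0)])   # retrieve image data and project it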
[017] The execution of the boundary system module can project through the underwater image projection system an at least one boundary image. The at least one boundary image can be displayed on a user interface and a user positions the boundary image at a point of projection. The at least one boundary image can be further manipulated to change its shape to correspond to the contours of the water feature at the point of projection within the water feature. The placement of the at least one boundary image can further comprise the placement of multiple boundary images sequentially. The placement of the at least one boundary image can further comprise the placement of multiple boundary images simultaneously. The placement of the at least one boundary image can include the placement of multiple boundary images around an obstruction.
[018] The method can further include applying additional corrective variables for the water feature analyzed as input by a user or sensed through sensors in communication with the underwater projection system. The additional corrective variables can include at least one of a variable relating to placement, timing, display of the at least one boundary image or at least one variable relating to data collected, stored, programmed or utilized for calculations relating to the water feature.
[019] The execution of the image correction module can project a test image of a known shape into the water feature to produce an uncorrected projected image. The uncorrected projected image can be analyzed and a correction determined by an observation made from outside the water feature. The correction can be applied to the test image of a known shape and the underwater projection system projects the test image of a known shape with the correction to produce an intermediary projected image. The intermediary projected image can be analyzed and the corrections are repeated until the projected intermediary image is substantially similar to the known image as projected. The corrections can be observed by a user and a user interface prompts input of the observed uncorrected and intermediary images by the user. The user interface can be a graphical user interface with a touch screen and the user draws the observed shape and corrections are calculated from the variance of the drawn shape by a controller.
[020] The method of the instant invention also includes a method of establishing a projection area boundary in a water feature defining a projection area for an underwater image projection system, including projecting an at least one boundary image in the water feature; adjusting a position of the at least one boundary image; adjusting or selecting the shape of the at least one boundary image to comport with a surface within the water feature at the position; and drawing and storing a projection area boundary based on the at least one boundary image placement and shape.

[021] The method can also include adjusting the position of the at least one boundary image using a user interface. The user interface can be a touch screen enabled device. The boundary image can be configured through a user changing the shape of the boundary image with the touch screen user interface. The boundary image shape can be adjusted through selection of shapes in a menu or palette of shapes.
[022] The method of the instant invention also includes a method of correcting an image in a water feature projected by an underwater image projection system from an observation point outside of the water feature, including observing an uncorrected projected image of a known or expected shape at a specified target area of said water feature alone or across a defined projection area as set by a defined boundary; inputting observations or sensed variables related to the variance of the uncorrected projected image from the known shape at the specified target area of said water feature alone or across the defined projection area as set by a defined boundary; projecting the uncorrected projected image with an at least one correction data point to produce an intermediary projected image and observing and inputting further corrections until the intermediary projected image is substantially similar to the known or expected shape; and storing the corrections applied to the image at the specified target area of said water feature alone or across the defined projection area as set by a defined boundary.
[023] The method can include input of observations using a user interface. The user interface can be a touch screen enabled device. The observing of an uncorrected projected image and the intermediary projected image can be done by a user from a vantage point outside of the water feature. The underwater projection system can include an at least two underwater image projection devices and the method can include coupling the at least two underwater image projection devices to share data regarding the stored projection area boundary line or edge and correction map or calibration table for each of the at least two underwater projection devices. The methods can be executed by a controller in a computer enabled device.

[024] The methods and software modules noted herein may be programmed on a computer and executed by same as executable code; thus the methods may be executed by a machine in a machine language and execute the specific operations and methods indicated.
[025] Moreover, the above objects and advantages of the invention are illustrative, and not exhaustive, of those which can be achieved by the invention. Thus, these and other objects and advantages of the invention will be apparent from the description herein, both as embodied herein and as modified in view of any variations which will be apparent to those skilled in the art.
BRIEF DESCRIPTION OF THE DRAWINGS
[026] Embodiments of the invention are explained in greater detail by way of the drawings, where the same reference numerals refer to the same features.

[027] Figure 1 shows a plan view of a water feature with an underwater projection system.

[028] Figures 2A-2C show the application of a boundary mapping system for the underwater image projection system.

[029] Figure 2D shows a plan view of a further embodiment of the instant invention incorporating an automatic data collection module and hardware.

[030] Figure 2E shows a plan view of a further embodiment of the instant invention incorporating a further automatic data collection module and hardware.

[031] Figure 3 shows an exemplary embodiment of the input selection process for the edge boundary operation into the boundary mapping system through a GUI by a user.

[032] Figure 4 shows the operation of the underwater projection system and resulting deformation due to the water feature variables and correction taken by the underwater projection system.

[033] Figure 5 shows an exemplary embodiment of the GUI of the instant invention operating on a projected test image.

[034] Figure 6A shows a system component or module view of the controller of the instant invention.

[035] Figure 6B shows a further system component or module view of the controller of the instant invention with an automated data collection module.

[036] Figure 7 shows a view of a first and second of an at least two underwater projection systems simultaneously operating.
DETAILED DESCRIPTION OF THE INVENTION
[037] The instant invention is directed to an underwater projection system and controller for configuring and controlling image projection in situ. Water has a different refractive index from air, which adds to the optical deformation of the light as it exits a lens surface if it is out of the water; thus in-situ projection simplifies the necessary corrections. In the exemplary embodiment, both the boundary setting and the image correction modules of the controller are executed underwater so that the perspective distortion correction from projector to target is avoided. Thus, in-situ projection reduces the complexity of the corrections applied by the controller. Additionally, the boundary measurements and corrections of the exemplary embodiment of the instant invention being applied in situ provide consistency and ease of operation. The user views the image at the target as projected in situ and applies the corrections as outlined below from outside the water. Thus, the focus of the projection system is set so that the water is in contact with the exterior lens and participates in the final optical manipulation. In the absence of the water in contact with the lens, the size, position and focus of the objects are different and require further adjustment calculations, which can be added in additional software in further embodiments of the instant invention. However, the exemplary embodiment of the invention described herein is directed to an underwater projection system. As water features often have complex geometric shapes that cannot be easily modeled in software, this approach of in-situ projection, with a user observing and guiding corrections from outside the water feature, allows the user to simply move markers to specific points, thus defining the boundary point by point and applying corrections across the defined boundaries as observed, without the variation from the projector being out of the water. Thus, the exemplary embodiment provides a novel control function for an underwater projector. A short worked example of the underlying refraction effect follows.
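The refraction effect motivating the in-situ design can be made concrete with Snell's law. The sketch below uses nominal refractive indices (air about 1.00, water about 1.33); the specific angles are illustrative and not taken from the patent.

    # Snell's law: n_water * sin(theta_water) = n_air * sin(theta_air).
    # A ray leaving the water at 30 degrees from the normal bends sharply,
    # which is why an out-of-water projector needs extra correction.
    import math

    n_air, n_water = 1.00, 1.33
    theta_water = math.radians(30.0)
    theta_air = math.degrees(math.asin(n_water * math.sin(theta_water) / n_air))
    print(f"30.0 deg in water exits at {theta_air:.1f} deg in air")  # ~41.7 deg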
[038] Figure 1 shows a plan view of a water feature with an underwater projection system. The system includes a user interface, in this case a graphical user interface (GUI) 400, that communicates, either through a wired coupling or wireless coupling or through a network, with a system controller 120. The system controller 120 controls an underwater image projection system 100. The underwater projection system has a lens (not shown) that is in contact with the water 220 of the water feature 210. This system projects an image 520 into the confines of the water feature 210 under the water 220. In this instance the water feature is a pool; however, the system may be used in fountains, pools, Jacuzzis, ponds, from the hull of a boat into a natural lake or ocean with a sand bar, or similar underwater uses. The control elements and software, described herein below, may also be utilized in above water applications such as theatrical or live show presentations or similar displays. Similarly, reference is made to a GUI 400, here shown as a tablet type computing/communications device. However, GUI 400 may simply be an analog controller with buttons and an LCD interface, for instance, or LED lights and a series of input buttons. The specific design of the user interface may be varied to provide the necessary steps of boundary setting and graphical variable corrections specified herein without departing from the spirit of the invention.

[039] In addition to the image projection elements of the underwater image projection system 100, the exemplary embodiment shown also includes optional ambient lights, similar to conventional pool floodlights, that operate in conjunction with the image projection elements of the underwater image projection system 100 to provide optional ambient lighting effects 510. Although shown as a single device, the combination of the underwater projection system with an at least one ambient light (not shown) operating in a further installation is embraced in the instant invention or combined in similar manner. The underwater projection system 100 may also be operated independently without ambient lighting. Similarly, multiple underwater projection systems 100 working in the same water feature 210 are contemplated, as better seen in Figure 7.

[040] In addressing the unique nature of projecting a two dimensional image on the non-uniform surface of most water features 210, it is necessary for the system controller 120 to operate a visual correction system in projecting images in motion in the water feature, along the water feature, and images appearing around the surface of the pool or water feature 210. To this end, the software or modules residing on the controller 120 include an at least one boundary mapping routine or module for the image projection surfaces within the pool, in conjunction with or operating independently from an at least one correction routine or module for correcting the projected images.
[041] Figures 2A-2C show the application of a boundary mapping system for the underwater image projection system. The system projects through the underwater image projection system 100 a series of boundary patterns. In the embodiment shown, these are represented by the projected boundary shapes 351-358, labeled A-G in Figure 2A. The system, in this exemplary embodiment, utilizes the underwater projection system in step by step fashion to project each boundary pattern or projected boundary shape 351-358 sequentially.
[042] This is exemplified in Figure 2B, where a single boundary image is being displayed in conjunction with the operation of the GUI 400 in a manner as shown and described in Figure 3. In this instance boundary image "H" 352 is being shown in its placement within the water feature 210 as shown. The underwater projection system 100 is shown projecting the boundary image "H" 352 in the water feature. A frame 350 is shown representing the maximum extent or size of projection that is possible directly from the primary modulation or steering device (not shown); in this instance the total projection size is directly limited from the projection source aperture to a discrete portion of the water feature 210. The secondary modulation or steering device (not shown) allows for the frame 350 to be moved, allowing for subsequent projection of other shapes in other parts of the pool such as the various locations shown in Figure 2A of boundary shapes 351-358.
[043] Figure 2C shows another exemplary embodiment of the instant invention. As seen in Figure 2C, in a further exemplary embodiment utilizing another form of projector with sufficient projector aperture to be able to project within a larger section of the boundary of the projection area at once, the controller 120 may project multiple selected boundary images at once, here shown as boundary images "A" 353, "B" 354, and "C" 355. In this instance the aperture size of the projector in projection system 100 is sufficient that there is no secondary modulation necessary to project into the water feature 210. In this case multiple boundary images 353-355 can be projected. Similarly, if the image aperture of the projector is sufficiently large, up to all of the boundary images may be projected at once.
[044] Though a shape is shown, here a right angle corner, the shape of the test images shown in the figures is not exhaustive of what may be used. The test image simply needs to be discernible by the user and/or the system and implemented in the drawing of the boundary. Additional intermediary steps may also be utilized and additional corrective variables may be analyzed as input by a user or sensed through a series of sensors. The placement, timing and display as well as the data collected, stored, programmed or utilized for calculations and/or compensations may be varied without departing from the spirit of the invention, and the order shown is provided as a non-limiting example.
[045] The process of placing each projected boundary shape 351-358 in the exemplary embodiment shown is similar in each instance. Again, reference is made to the stepwise exemplary embodiment displaying each projected boundary shape 351-358 in sequence and adjusting placement individually. This is further illustrated where an obstruction in the display area of the water feature, shown here as 230 (a Jacuzzi or similar obstruction, for example), is dealt with by implementing an at least one additional marker 360, here shown as multiple markers "E" 357 and "F" 358, to accommodate the changes in the boundary necessary due to the obstruction.
[046] Further additional markers can be added as necessary to outline the display boundary line or edge 370. In other non-limiting examples, this type of variation could also simply be caused by the shape of the pool or water feature, or could encompass additional water features or other similar obstructions. Additional markers 360 could be used to compensate for these obstructions as well, or further additional markers could be added to adjust the boundary. These types of obstructions can typically be found in a water feature installation such as a pool. Similar obstructions may also occur in other types of water features and bodies of water; for instance, water fountains from nozzles, waterfalls, level changes, and similar design aspects of most water features need to be accommodated. Additional obstructions may also be present in naturally occurring environments, such as, but certainly not limited to, those found on sand bars and jetties from rock outcrops and the like. The instant invention allows for adjustment of the underwater projection system 100 projection boundary 370 to compensate for these obstructions, like 230, as shown, through the boundary setting component with flexible input of any number of additional marker elements 360.
[047] Each boundary pattern or projected boundary shape 351-358 is selected by a user 450 through the GUI 400, as further described in relation to Figure 3. The boundary shapes may be presented to the user as a drop down menu, a graphical menu, or any other appropriate display for a variety of morphable or user configurable shapes. These shapes are mutable and morphable or configurable by the user into the desired boundary of the projection area.
[048] In this instance, a series of corners and straights are provided and placed by the user 450 with the aid of the GUI 400 by morphing or configuring a selected boundary shape, for instance the corner shown, to define the projection field or area edge or boundary 370. When selected, the boundary pattern or projected boundary shape 351-358 is moved to a selected edge or boundary or perimeter 370 and fit to define various contour points. In this instance, the projected test shape "A" 353 is projected to a corner and the shape is configured as a right angle. Additional methods of inputting shapes and outlines can be provided and will function to provide the necessary guidance for boundary setting in the underwater projection system. For instance, using a pen or stylus to draw a boundary and then interpolating that boundary on a picture of the desired area is just one further example of boundary definition processes that might be utilized. The result, as shown in Figure 2A and in the sketch below, is that the image boundary 370 is defined by user input to select and place limitations defining the boundary.
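As an illustration of how placed markers might be joined into a stored boundary, the following sketch closes a polyline through hypothetical marker positions, including detour markers around an obstruction. The coordinates and labels are invented for the example; they do not come from the patent.

    # Each placed marker contributes an anchor point; markers "E" and "F"
    # detour the boundary around an obstruction (cf. 230 in the figures).
    markers = [
        ("A", (0.0, 0.0)),    # corner configured as a right angle
        ("B", (10.0, 0.0)),   # straight edge
        ("C", (10.0, 5.0)),
        ("E", (6.0, 5.0)),    # detour around the obstruction
        ("F", (4.0, 4.0)),
        ("G", (0.0, 5.0)),
    ]

    def close_boundary(markers):
        points = [pos for _, pos in markers]
        return points + [points[0]]   # join the last marker back to the first

    print(close_boundary(markers))    # stored projection area boundary line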
[049] Figure 2D shows a system view of a further embodiment of the instant invention incorporating an automatic data collection module and hardware. As seen in Figure 2D, an automated data collection module 730 can, as best seen in Figure 7B, also be incorporated in a controller, for instance system controller 120, and hardware including an at least one sensor 512. This can include an edge detection system which can, but is certainly not limited to, determine the shape of the pool. A non-limiting example of such a system would be one that utilizes a passive sensor to determine variations in the received amount of an energy, such as light or sound, and to produce a gradient which can then be utilized to define an edge, or through color variations from a video image. One non-limiting example can be, but is certainly not limited to, an infra-red sensor or camera which can detect the absorption gradient around the water feature to define the water feature's actual edges for the display, for instance. Similar cameras or sensors in different spectrums could be utilized in conjunction with or alternatively to such a system to aid in automatically detecting the initial shape of the water feature for the user. These would generally be classified as passive type sensors, capturing reflected light or sound, for instance, to provide data. A minimal gradient-threshold sketch of this idea follows.
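A gradient-threshold pass is one plausible reading of the passive edge detection described here. The sketch below flags edge pixels in a synthetic intensity map; the image, threshold, and NumPy-based approach are illustrative assumptions, not the patent's method.

    import numpy as np

    # Synthetic 8x8 "absorption" image: darker water inside a brighter deck.
    img = np.ones((8, 8))
    img[2:6, 2:6] = 0.2

    gy, gx = np.gradient(img)        # brightness gradient along each axis
    magnitude = np.hypot(gx, gy)
    edges = magnitude > 0.2          # simple threshold picks out the rim
    print(np.argwhere(edges))        # candidate edge pixels around the pool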
[050] Figure 2E shows a further embodiment of the instant invention incorporating another embodiment of an automatic data collection module. Another type of system that might be utilized is an active mapping system using infrared or other active emissions to determine depth and dimensions. In these instances, the controller, here again the system controller 120 for example, or the user interface 400, utilizes further software modules and/or hardware to facilitate image capture or transmission of data pertaining to the water feature. An at least one active emitter 517 is paired with an at least one sensor 519, for instance but certainly not limited to an ultrasonic or LIDAR type system, using the at least one emitter 517 above or in the pool to detect distance and shape in conjunction with one or more sensors 519 acting as receivers. In this type of system, the data can be interpreted based on the location of the at least one sensor 519 relative to the at least one emitter 517 to determine both edge dimensions and changes in the underwater surfaces of the water feature 210. In both Figures 2D and 2E, these systems can be coupled to the instant invention to provide pool shape and size data using an automated data collection module on a controller. A minimal time-of-flight ranging sketch follows.
[051] In this way, the additional modules, including the automated data acquisition module 730, can capture the information when instructed and create a boundary line 370 setting for the underwater projection system 100. This purpose can be served with other sensors such as ultrasonic or LIDAR or other active or passive sensors as noted. Generally, if the data acquired is to be for the projection border component of the projection border and image correction routine module 610, a passive system to detect the edge of the pool, or a passive system alone or in conjunction with an active system, can be utilized. This data could then be coupled to the projection border and image correction routine module 610 to set the edge of the water feature and then define a projection area in the boundary as noted above in relation to Figures 2A-2C, with adjustment by the user 450.
[052] An active sensor can also be used for more detailed analysis of the surfaces and an improved initial boundary and projection area determination for the projection border and image correction routine module 610. This can be, for example but is certainly not limited to, providing data in the image correction portion of the module 610 to provide the initial data points similar to the user driven setup, or providing for a more automated system for initially mapping the water feature and the projection points of test images. In addition, in a still further automated embodiment, a further camera or other image sensing device (not shown) can be used to visualize the boundary images and automatically calculate the correction inputs in a manner similar to the input from the user as explained above, by visual comparison from an observation point which is then compared to a stored known target image by image comparison software.
[053] The data from the automated data collection module 730 can be used in conjunction with the user 450, wherein steps similar to those of the previously disclosed embodiments correct the initial setup from the automated detection modules. Alternatively, the existing modules may function directly with the automated data collection modules 730, providing a user 450 with a more simplified interface, allowing for simple deformation on screen to "adjust" a final image projection determined by the map generated by the boundary setting module and image correction module using data from the automated detection module, for instance.
[054] In any case, the result of the process used is delineation, in the pool or water feature or target, of a boundary 370 within a body of water 220 of a projection area 375 with a projection area boundary edge 370, for the underwater image projection system 100 to project within and for which a correction map 620 is then defined for the water feature or target 210.
[055] In the exemplary embodiment disclosed herein, the GUI 400 is used to interactively shape the boundary such that the corner is opened to the dotted boundary line edge or boundary perimeter 370, as shown. The angle of the boundary line defined by this interaction maps the corner at the location of projected test shape "A" 353 in the pool. A further edge indicator projected shape is selected by the user 450 and projected as a straight line "B" 354. This is placed as shown, and the boundary 370 begins to be constructed by controller 120 joining, without displaying, the edges defined by the projected test shapes A and B 353, 354. The system proceeds with the user 450 selecting and placing components C-G in place 355-351 to define the edge boundary or boundary line 370 of the projection system. The obstruction 230 is avoided by placing the projected test shapes E and F in such a way as to modulate, morph or mutate the boundary around the obstruction 230. Each projected test image 351-358 is selected and placed by a user interacting with the GUI 400. This can include the use of a guided or step by step wizard program, as further discussed herein. This placement is stored along with the modulation, morphing or configuration of the selected images, and a projection image display boundary 370 is set.
[056] Figure 3 shows an exemplary embodiment of the input selection process for the edge boundary operation into the boundary mapping system through a GUI by a user. The GUI 400 displays a series of input selectors 430. The GUI 400 is shown being utilized by a user 450. Touch selection and control is shown in this exemplary embodiment; however, the instant invention fully embraces both analog and digital input from other input devices, including but not limited to sliders, switches, buttons, external interface devices, human interface devices and the like, for both input and selection as well as execution of and interaction with the GUI 400. Again, although a touch screen GUI is shown as the best mode exemplary embodiment herein, a non-touch screen controller, for example but certainly not limited to a controller having a small LED representative screen and button inputs, is fully contemplated herein. Additionally, as noted above, parts of the input may be automated through the use of certain edge boundary schema, including thermal or standard spectrum video edge detection processes or similar machine enabled edge detection software.
[057] As noted above, a "wizard" or prompting program is used in the exemplary embodiment of the instant invention. The wizard guides the user 450 through a series of onscreen prompts and proceeds through the process of selecting boundary images and then adjusting these shapes to fit the water feature. As shown, an outline of the pool 470 is projected on the screen. This can be, for instance but is certainly not limited to, an image derived from direct input of the user, for instance by providing a subroutine to draw the pool shape and any obstacles; or it can be from a stored library of pool shapes; or a further embodiment may utilize an actual picture of the pool using an edge detection algorithm to define the shape, or similar mechanisms for inputting the image of the shape of the pool 900 and displaying it on the GUI 400. This may be input by the user or may be derived from operation of the GUI 400, such as operation of the GUI through a cellular phone with a camera or similar image capture device.
[058] The user selects projected boundary shapes 351-358, as noted above in relation to Figures 2A-2C, which are also displayed on the GUI 400 as GUI screen test images 411-414 through touch sensitive screen 440. Again, reference is made here to an exemplary embodiment utilizing a touch screen interface; the invention embraces input from all types of human interface devices including keyboards, buttons, pointer devices, and similar human interface devices. The user selects, shown as an active selection element or cursor 460, the projected boundary shapes on the screen 440 and moves, as indicated by arrow 465, the shape, here shape 414, to appropriate locations to outline changes in the boundary for the projection area boundary 370 as shown in Figure 3. Similarly, as discussed in reference to Figures 2D and 2E, the system may use an automated system to generate a boundary, or to generate an initial boundary followed by user input to further correct the boundary.
[059] Once placed, the GUI onscreen test image 411 is greyed out or unhighlighted, here shown as being in this state with a dotted line. In the exemplary embodiment shown, three of four of the GUI onscreen test images 411, 412, and 413 have been placed and are unselected. User 450 is placing a final GUI onscreen test shape 414, as shown, by dragging it into position on the screen 440. A progress bar 420 provides prompts and shows steps or percentage to completion. An additional prompt (not shown) would prompt user 450 to the next software module. This may include the deformation correction module or a further module for operating the pool or the underwater projection display system 100 or a related system. However, to operate properly in an underwater environment or in projecting on a non-uniform surface, the underwater projection system of the instant invention must be able to compensate for variations in the projected image surface, variations in fluid variables and variations in positional variables within the water feature or target projection area.
[060] Figure 4 shows the operation of the underwater projection system, the resulting deformation due to the water feature variables, and the correction taken by the underwater projection system. In this instance the underwater projection system 100 is shown in a water feature, in this instance in the standard light niche of a pool, projecting a test pattern or image 310 into the pool and onto the bottom of the pool. A body of water 220 is provided, and the refractive nature of the water in combination with the irregular shape of most pool bottoms leads to a deformation of the projected image as viewed by the user. The projection onto the irregular shape of the bottom surface of the water feature contributes in large part to the distortion as well, because the projection source is not directly above the image plane. As a result, one end of the image is projected onto a surface that is closer to the source than the other. The side that is closer has a smaller image because the divergence is smaller. Again, a test pattern or test image target shape 411 of known dimensions, shown here as a square, is used to determine the amount of deformation in a particular location in the pool based on user 450 input on the screen to correct the projected uncorrected image 320. The deformation correction is applied until the projected uncorrected image 320 is transformed into an image substantially similar to the corrected test image 330. A standard keystone-correction sketch follows.
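The foreshortening just described (the nearer side of the image smaller than the farther side) is the classic keystone distortion, and a standard way to undo it is a planar homography fitted from the four observed corners of the known test square. The sketch below is one such fit; the corner coordinates are invented, and nothing here implies the patent uses this exact math.

    import numpy as np

    def homography(src, dst):
        # Solve the 8 unknowns of H (with h33 = 1) from 4 correspondences.
        rows = []
        for (x, y), (u, v) in zip(src, dst):
            rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
            rows.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        h = np.linalg.solve(np.array(rows, float), np.array(dst, float).ravel())
        return np.append(h, 1.0).reshape(3, 3)

    observed = [(0, 0), (11, -1), (12, 6), (-1, 5)]   # distorted square corners
    known = [(0, 0), (10, 0), (10, 5), (0, 5)]        # intended test square
    H = homography(observed, known)

    def correct(pt):
        x, y, w = H @ np.array([pt[0], pt[1], 1.0])
        return (x / w, y / w)

    print(correct((11, -1)))   # maps back to approximately (10.0, 0.0)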
[061] Reference is made below to an exemplary method of inputting these deformations by the user 450 through the GUI controller 400. Additional methods of input, interaction, and correction are also contemplated. A comparison-to-target implementation is made here to compensate together for variables in the shape of the projection surface and other variables, some non-limiting examples being those involving fluid or transmission properties of the fluid or specific fluid flow characteristics. Other schema for determining variance in a projected image could include specific measurement of, and compensation for, the variables in the projection surface, the projected media, and spatial variables. These could include triangulation and image distortion calculations compensating for density, transmissivity, time of day, water temperature, surface composition, surface color, background color, and the like. In either case, the basic operation of the module results in an image correction or distortion adjustment/control map. This map allows a corrected image, projected from an underwater source, to remain consistent as observed by a user of the projection system, based on corrections for observed and/or measured variables caused principally by the projection within the water media, the non-uniform target surface, and the spatial dynamics of a body of moving fluid that interfaces with a fluid of a different density, within which the point of observation is located.
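One way to picture such an adjustment/control map, under the assumption (not stated in the patent) that corrections are stored as per-cell coordinate offsets, is sketched below in Python; the grid size and offsets are illustrative only.

import numpy as np

# correction_map[row, col] holds the (dx, dy) shift measured for that cell
# of the bounded projection area.
correction_map = np.zeros((4, 4, 2))
correction_map[0, 3] = (-12.0, 5.0)   # e.g. pull the far corner back in

def correct_point(x, y, cell_size=100.0):
    # Shift a source coordinate by the offset of the cell containing it.
    row = min(int(y // cell_size), correction_map.shape[0] - 1)
    col = min(int(x // cell_size), correction_map.shape[1] - 1)
    dx, dy = correction_map[row, col]
    return x + dx, y + dy

print(correct_point(350.0, 20.0))  # (338.0, 25.0): top-right cell shifted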
[062] In broadest terms, the user interface, here GUI 400, can be used to input observations or sensed variables of the variance of the uncorrected projected image 320 of a known shape, and these variances can be used as corrections in the projection for that targeted area of the water feature 210 across a defined projection area as set by a boundary. The process of entering the observed or sensed variances, for example from a point outside the water feature, can be repeated through several intermediary projected results until the projected image 310 is modified to compensate for projection variables and is thus transformed from the uncorrected image 320 to the corrected image 330. These corrections, once entered, can be stored and used during the operation of the underwater projection system 100 to correct these variances in the projected images as they are displayed in real time. The projected image may be a still image, a series of images shown as an animation, a video, or the like, in any format readable by the controller, as desired by the user. More specifically, an animation or series of animations is specifically embraced by the instant invention, which may include the use of one or more projection elements or projection systems to provide fully animated "shows" or features shown on the surfaces of the underwater feature as desired by a user. Some specific examples would include, but certainly not be limited to, wildlife scenes, advertisements, animated caricatures, visually appealing geometric shapes and displays, and the like.
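The repeat-until-corrected process can be summarized by the following loop, a sketch under the assumption that each pass of user input reduces to a residual offset; project and observe_variance stand in for the projector output and the GUI input and are not names from the patent.

def calibrate_location(project, observe_variance, tolerance=1.0):
    correction = [0.0, 0.0]              # accumulated (dx, dy) correction
    while True:
        project(correction)              # show the intermediary result
        vx, vy = observe_variance()      # user-entered offset from target
        if abs(vx) < tolerance and abs(vy) < tolerance:
            return correction            # stored for real-time playback
        correction[0] -= vx              # counter the observed variance
        correction[1] -= vy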
[063] In the exemplary embodiment shown, as best seen in Figure 5 below, the GUI 400 touch screen 440 provides a display for each of the boundary locations corresponding to the perimeter 370 of the underwater projection system 100 projection as set by the user 450 through the GUI as disclosed above, where the boundary area perimeter or line or edge 370 is generated and can be stored when the boundary module is executed. The stored locations each have a target image 411 associated with the location, and the user observes the distortion of the uncorrected image 320, as best seen in Figure 4, against the target image 411 on the GUI. The user 450 inputs the resulting projected uncorrected shape 320. Corrections are calculated based on the user input. Corrections are applied, an intermediary image is shown to the user, and the process of observation and correction repeats until a final corrected image 330 corresponding to the known shape is shown.
[064] Figure 5 shows an exemplary embodiment of the GUI of the instant invention operating on a projected test image. In the exemplary embodiment shown, the GUI 400 is operating in the image correction mode. The GUI has a touch screen 440. Again, reference is made to a touch screen and touch input; however, one of ordinary skill in the art will understand that these inputs may be varied, and any acceptable human interface device may be used, including but not limited to touch enabled devices, button inputs, keyboards, mice, track balls, joysticks, touch pads, and other human interface devices and the like. Each area of the projection area or perimeter 510, as defined by the boundary setting operations like those described above in relation to Figures 1-4, is provided a test image selection marker, and the user 450 selects a test image 411 on the screen 440 as input. Each area is selected or woken by the user 450 from a hibernation state, as evidenced by the dashed lines 481, 483, 484, and as each is set it goes dormant again. The active location 482 is shown on the screen 440 with the selected test image or target shape. The known target shape 411 can be projected on the screen and start as a regular shape, such as but not limited to a square, rectangle, octagon, and the like. The display on the screen of the user interface 400 differs from the image shown, as seen in Figure 4. The user, through the user interface, takes the displayed image and distorts it to match the distorted or uncorrected image 320 shape seen in the targeted body of water 220, for instance a pool, pond, or water feature. Alternative methods of input, such as drawing the image as seen or using a stylus or pen input or similar means of conveying the observed image, are fully embraced by the instant disclosure.

[065] The objective is to convey what is seen by the user 450 so that a degree of correction can be stored through the series of adjustments made to the output image. These adjustments are then analyzed and stored, and the projected test pattern in the pool is updated to match the adjusted figure on the screen. This continues until the projection within the pool satisfactorily matches the target shape 411 shown on the screen, appearing, as seen in Figure 4, as the corrected image 330.
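If the on-screen distortion is modeled as the user dragging the four corners of the known square onto the observed shape, a perspective warp can be recovered and inverted before projection. The sketch below uses a standard direct linear transform; this specific model is an assumption made for illustration, not the patent's stated method.

import numpy as np

def homography(src, dst):
    # Solve for the 3x3 matrix H mapping each src corner onto its dst
    # corner (direct linear transform, solved via the SVD null space).
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.array(rows))
    return vt[-1].reshape(3, 3)

square  = [(0, 0), (1, 0), (1, 1), (0, 1)]             # known target shape
dragged = [(0, 0), (1.2, 0.1), (1.3, 1.1), (-0.1, 1)]  # user-matched shape
H = homography(square, dragged)
H_inv = np.linalg.inv(H)   # pre-warping content with H_inv counters the
                           # distortion so the pool shows the true square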
[066] In this exemplary embodiment, the operating parameters are adjusted for multiple locations represented by the displayed test figures or images 481-484. Again, these are used as the target for the user input uncorrected shape 490. Once all the corrected shapes are achieved and the data is stored, a calibration table or correction map is compiled by the controller. In cases where the input is sequential, the correction map 620, as shown in Figure 6, can also interpolate between the points used for the input to establish the correction needed to maintain the aspect ratio of a projected image moving within the water feature. This information is shared with and used by the Dynamic Content Manipulator (DCM) system.
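Interpolating between the calibrated points might look like the following bilinear blend, assuming, purely for illustration, that the map is sampled on a regular grid of offset vectors.

import numpy as np

def interpolate_correction(cmap, x, y):
    # Blend the four calibrated cells surrounding (x, y); cmap is a
    # (rows, cols, 2) array of offsets at unit grid spacing.
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, cmap.shape[1] - 1)
    y1 = min(y0 + 1, cmap.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top    = (1 - fx) * cmap[y0, x0] + fx * cmap[y0, x1]
    bottom = (1 - fx) * cmap[y1, x0] + fx * cmap[y1, x1]
    return (1 - fy) * top + fy * bottom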
[067] The calibration table or correction map 620 is stored by the controller 120, as best seen in Figure 6. The data may be kept in volatile or non-volatile storage. In the exemplary embodiment, the correction map 620 is stored in non-volatile memory, indicated as storage 600, on the controller. This allows recall of the map without the need to run the correction routine or module at every startup of the underwater projection system 100. However, if the correction routine or module is necessary, for instance if it is used in a portable device that is moved from water feature to water feature, the information may be stored in volatile memory, or alternatively an option to reset may be provided at startup. Thus a complete correction map or calibration table 620 of the projection area within the boundary 370 is produced and stored for adjusting the projected image from the underwater projection system 100 during operation of the underwater projector.
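A minimal persistence scheme along these lines, with the file name and JSON format assumed rather than taken from the patent, could be:

import json, os

MAP_FILE = "correction_map.json"     # hypothetical non-volatile location

def load_or_recalibrate(recalibrate, force_reset=False):
    # Reuse the stored map unless a reset is requested (e.g. a portable
    # unit moved to a new water feature) or no map has been saved yet.
    if force_reset or not os.path.exists(MAP_FILE):
        correction_map = recalibrate()        # run the correction module
        with open(MAP_FILE, "w") as f:
            json.dump(correction_map, f)
        return correction_map
    with open(MAP_FILE) as f:
        return json.load(f)                   # skip recalibration at startup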
[068] Figure 6A shows a further system component or module view of the controller of the instant invention. In the exemplary embodiment, the invention utilizes a touch screen display 440 in the GUI 400. Other forms of controller may be utilized, including wired, wireless, computers, tablets, network enabled devices, smart phones, Wi-Fi enabled devices, and the like. The instant exemplary embodiment utilizes a web addressable server with the control software downloaded onto a tablet device GUI 400. The controller 120 interfaces with the wireless server, which in turn is coupled through a wireless connection, via the projection system interface 650, to the controller on the underwater image projection system 100. The underwater projection system 100 in the exemplary embodiment shown also has an optional at least one addressable ambient light, for instance a high intensity/brightness LED (HBLED), for illuminating the pool with background or ambient lighting, which is controlled through the controller 120. In this way each individual light and the underwater projection system would be addressable by the controller and singularly controlled as an element of the system. Currently, to accomplish this control, each light is on an analog switched system; that is, individual switches control power to, and switch the state of, each light. Using this system, a single addressable controller switches the light through a soft switch on the controller, and this is interfaceable with the handheld controller through a wireless network or wireless connection.
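The addressable soft-switch arrangement might be modeled as below; the element names and the print stand-in for the wireless transmission are assumptions for illustration.

class SoftSwitch:
    # One soft switch per addressable element (projector or HBLED).
    def __init__(self, address):
        self.address = address
        self.on = False

    def set(self, on):
        self.on = on
        # a real controller would transmit this state over the wireless link
        print(f"element {self.address} -> {'ON' if on else 'OFF'}")

elements = {a: SoftSwitch(a) for a in ("projector", "led1", "led2")}
elements["led1"].set(True)        # each light is singularly controllable
elements["projector"].set(True)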
[069] In Figure 6A, components or modules of the control software of the exemplary embodiment are shown. Although this is representative of the modules for control as outlined above, additional modules and functions may be added or removed without departing from the spirit of the invention. Likewise, although reference is made herein above to a network addressable control system, current technologies using switched controls may also be used in conjunction with the software to control one or more lights and the underwater projection system 100.
[070] As seen in the figure, the instant invention comprises a GUI controller 400 in communication with a system controller 600. The software in the exemplary embodiment depicted is located on the system controller 120; however, it may also be located on the GUI 400 and its controller. The exemplary embodiment shown utilizes a projection border and image correction routine module 610 to calibrate and establish the boundary 370 of the projection area and produce a calibration table or correction map 620 as noted above. The data from execution of this module in conjunction with the GUI controller 400 is stored as a calibration table or matrix or correction map 620. The stored information can be contained in a non-volatile memory and retained after powering down. In this way the adjustments remain until the user or the system prompts recalibration, as noted above. Alternatively, recalibration may be forced at each startup, for instance if calibration utilizes sensors and is fully automated, or may require an active election by a user to save, the data then being saved into non-volatile memory. Content, such as image 520, is sent into the system through a content module 630. This content may be uploaded to, streamed to, or otherwise provided through the content module 630. The data from the content module together with the calibration map 620 are then provided to the Dynamic Content Manipulator (DCM) module 640.
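The flow of content through these modules can be sketched as a simple pipeline; every name here is a hypothetical stand-in for the numbered modules, and apply_map is a placeholder for the geometric correction itself.

def content_module(frames):
    # Content uploaded to, streamed to, or otherwise provided to the system.
    yield from frames

def apply_map(frame, correction_map):
    return frame                     # placeholder for per-frame correction

def dcm(frames, correction_map):
    # Dynamic Content Manipulator: reshape content using the stored map.
    for frame in frames:
        yield apply_map(frame, correction_map)

def run(frames, correction_map, interface):
    # content module 630 -> DCM 640 -> projection system interface 650
    for frame in dcm(content_module(frames), correction_map):
        interface.send(frame)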
[071] The DCM 640 is responsible for manipulating the data received from the content module 630 with the measurements made in the calibration table or map 620. The data is essentially corrected for the calculated variances in the image as projected across the entirety of the bounded projection area, as described herein above. The result of the data of the content module 630 being manipulated by the data of the calibration table or map 620 is a control output for the projection system 100, which is communicated through the projection system controller interface 650 to the projection system controller. Additionally, the DCM 640 may also incorporate additional variables representing environmental variances. These can include, for instance, additional corrective variables including at least one of a variable relating to placement, timing, or display of the at least one boundary image, or at least one variable relating to data collected, stored, programmed, or utilized for calculations relating to the water feature, or environmental information on the water feature such as, but certainly not limited to, water temperature, salinity, measured visibility, depth, material data, and the like.
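Folding an environmental variance into the correction could be as simple as the gain below; the visibility-based model and its coefficients are invented for illustration and are not taken from the patent.

def dcm_adjust(dx, dy, env):
    # Reduced visibility (in metres) calls for a slightly stronger
    # geometric correction in this toy model.
    gain = 1.0 + 0.02 * max(0.0, 20.0 - env.get("visibility_m", 20.0))
    return dx * gain, dy * gain

print(dcm_adjust(4.0, -2.0, {"visibility_m": 10.0}))   # (4.8, -2.4)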
[072] Figure 6B shows a further system component or module view of the controller of the instant invention with an automated data collection module. The embodiment shown is substantially similar to the embodiment of Figure 6A. However, the embodiment of Figure 6B provides a further automated data collection module 730, as discussed above in relation to Figures 2D and 2E. The added automated data collection module 730 communicates data collected from the sensors shown in Figures 2D and 2E to the boundary and image correction routine 610. As noted above, the data collected by the automated data collection module 730 is thus communicated back to the system.
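The relay of sensor data back into the correction routine might reduce to something like the following; the sensor names and the callback are hypothetical.

def collect_and_forward(sensors, correction_routine):
    # Poll each sensor, then hand the readings to the correction routine.
    readings = {name: read() for name, read in sensors.items()}
    correction_routine(readings)
    return readings

sensors = {"depth_m": lambda: 2.4, "water_temp_c": lambda: 27.5}
collect_and_forward(sensors, lambda r: print("forwarded:", r))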
[073] Figure 7 shows a view of a first and a second of an at least two underwater projection systems simultaneously operating. As shown, a first of the at least two underwater projection systems 102A is shown projecting images A and B. A boundary marking function has been performed to establish, through a boundary marking module of the control software and an image correction module 610 like that above, the projection of images A and B, collectively the first underwater projection system's images 380. The second of the at least two underwater projection systems 102B is shown similarly having completed boundary marking and image correction module 610 functions. The result is the display of images C and D, collectively referred to as the corrected images of the second underwater projection system 390. A single GUI controller interface 400 is provided; however, multiple such GUI controller interfaces may be utilized as well. In addition, the GUI controller 400 is in communication with an auxiliary, non-light amenity system, in this case a sound system. Additionally, the controllers may control an at least one ambient light source, as noted in other embodiments described herein. In instances where more than one underwater image projector is provided, a single controller 120 may also be utilized to control both underwater image projectors 102A, 102B.
[074] When used in conjunction with one another, the at least two underwater image projectors present some unique control feature issues. First, depending on the overall size of the pool and the boundaries circumscribed and stored on each controller, an additional step of handling overlapping projections must be included in any routines conducted by either of the at least two underwater image projection systems 100, 101. For example, as image A is moved to locations outside of the boundary of the first underwater image projection system, it may pass into the boundary of the other system. Thus, in addition to the DCM 640, an additional component must be provided to handle "handoffs" between the projection areas. This is provided in a separate module (not shown) that interrogates projection position data from each of the at least two underwater projection systems 102A, 102B. The result is a seamless projection of a continuous image across boundary lines or zones. It may also be that each of the at least two underwater projection systems 102A, 102B has defined projection boundaries that overlap. In this instance the at least two underwater projection systems 102A, 102B must be able to communicate relative position and share markers. This avoids overwash or sudden and erratic fluctuations from projecting the same or conflicting images into a single location. Finally, in the configuration having at least two underwater projection systems 102A, 102B, each system may simply operate as an independent and autonomous control. This would potentially result in distortion or interlacing of images if the projectors project the same or different content in the same spot.
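A handoff module of this kind could poll image positions against each system's stored boundary, as in the sketch below; the boundary predicates and identifiers are illustrative assumptions.

def owners_of(point, boundaries):
    # Every projector whose stored boundary contains the point.
    return [pid for pid, contains in boundaries.items() if contains(point)]

def handoff(image_pos, current, boundaries):
    owners = owners_of(image_pos, boundaries)
    if current in owners:
        return current                      # still inside its own boundary
    return owners[0] if owners else current  # hand off across the line

boundaries = {
    "102A": lambda p: p[0] < 5.0,           # left portion of the pool
    "102B": lambda p: p[0] >= 4.0,          # right portion, overlap 4-5 m
}
print(handoff((5.5, 2.0), "102A", boundaries))   # -> 102B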
[075] In each case, the at least two underwater projection systems 102A, 102B may communicate with the single GUI controller in order to perform the manipulations on their respective images 380, 390. Additional information may be shared in the way of edge distortion values and the calibration table or map, such that smooth transitions can be made of the images between each of the at least two underwater projection systems. Additionally, both lights may also be controlled by a single controller 120, as shown above, controlling multiple underwater image projection systems.
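Sharing edge distortion values allows the two corrections to be cross-faded near the common boundary, for example as below; the signed-distance convention and blend width are assumptions made for the sketch.

def blended_correction(corr_a, corr_b, dist_to_edge, blend_width=0.5):
    # dist_to_edge is signed: negative on system A's side of the shared
    # edge, positive on B's; t ramps from 0 to 1 across the blend band.
    t = (dist_to_edge + blend_width) / (2 * blend_width)
    t = min(max(t, 0.0), 1.0)
    return tuple((1 - t) * a + t * b for a, b in zip(corr_a, corr_b))

print(blended_correction((4.0, 0.0), (2.0, 1.0), 0.0))   # (3.0, 0.5)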
[076] The embodiments and examples discussed herein are non-limiting examples. The invention is described in detail with respect to exemplary embodiments, and it will now be apparent from the foregoing to those skilled in the art that changes and modifications may be made without departing from the invention in its broader aspects, and the invention, therefore, as defined in the claims, is intended to cover all such changes and modifications as fall within the true spirit of the invention.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Forecasted Issue Date: Unavailable
(86) PCT Filing Date: 2013-07-31
(87) PCT Publication Date: 2014-02-06
(85) National Entry: 2015-01-30
Examination Requested: 2018-05-09
Dead Application: 2021-08-31

Abandonment History

2020-08-31: R86(2) - Failure to Respond
2021-03-01: Failure to Pay Application Maintenance Fee

Payment History

Fee Type | Due Date | Amount Paid | Paid Date
Application Fee | - | $400.00 | 2015-01-30
Maintenance Fee - Application - New Act 2 | 2015-07-31 | $100.00 | 2015-06-30
Maintenance Fee - Application - New Act 3 | 2016-08-01 | $100.00 | 2016-07-19
Maintenance Fee - Application - New Act 4 | 2017-07-31 | $100.00 | 2017-07-05
Request for Examination | - | $800.00 | 2018-05-09
Maintenance Fee - Application - New Act 5 | 2018-07-31 | $200.00 | 2018-07-04
Maintenance Fee - Application - New Act 6 | 2019-07-31 | $200.00 | 2019-07-18
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
PENTAIR WATER POOL AND SPA, INC.
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

List of published and non-published patent-specific documents on the CPD.


Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Examiner Requisition | 2020-01-23 | 4 | 245
Abstract | 2015-01-30 | 2 | 84
Claims | 2015-01-30 | 8 | 223
Drawings | 2015-01-30 | 12 | 288
Description | 2015-01-30 | 31 | 1,217
Representative Drawing | 2015-01-30 | 1 | 34
Cover Page | 2015-03-09 | 2 | 57
Request for Examination | 2018-05-09 | 1 | 37
Examiner Requisition | 2018-12-17 | 4 | 230
Amendment | 2019-06-06 | 10 | 286
Description | 2019-06-06 | 31 | 1,237
Claims | 2019-06-06 | 6 | 163
PCT | 2015-01-30 | 14 | 953
Assignment | 2015-01-30 | 4 | 120