Summary of Patent 2834217

Third-Party Information Liability Disclaimer

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Availability of the Abstract and Claims

Any differences in the text and image of the Claims and Abstract depend on the time at which the document is published. The texts of the Claims and Abstract are displayed:

  • at the time the application is open to public inspection;
  • at the time of issue of the patent (grant).
(12) Patent: (11) CA 2834217
(54) French Title: DETECTION ET REGLAGE DES CARACTERISTIQUES D'UN ENVIRONNEMENT
(54) English Title: SENSING AND ADJUSTING FEATURES OF AN ENVIRONMENT
Status: Granted and Issued
Bibliographic Data
(51) International Patent Classification (IPC):
  • G05B 15/02 (2006.01)
  • G05D 23/19 (2006.01)
  • H04R 3/00 (2006.01)
(72) Inventors:
  • MCGUIRE, KENNETH STEPHEN (United States of America)
  • HASENOEHRL, ERIK JOHN (United States of America)
  • MAHONEY, WILLIAM PAUL, III (United States of America)
  • BISCHOFF, COREY MICHAEL (United States of America)
  • STANLEY, HUIQING Y. (United States of America)
  • STEINHARDT, MARK JOHN (United States of America)
  • GRUENBACHER, DANA PAUL (United States of America)
(73) Owners:
  • THE PROCTER & GAMBLE COMPANY
(71) Applicants:
  • THE PROCTER & GAMBLE COMPANY (United States of America)
(74) Agent: WILSON LUE LLP
(74) Associate agent:
(45) Issued: 2018-06-19
(86) PCT Filing Date: 2011-04-26
(87) Open to Public Inspection: 2012-11-01
Examination requested: 2013-10-24
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2011/033924
(87) International Publication Number: US2011033924
(85) National Entry: 2013-10-24

(30) Application Priority Data: N/A

Abstracts

French Abstract (translated)

Embodiments of the invention relate to sensing and adjusting the features of an environment. Some embodiments relate to a system and/or a method that allow receiving an ambiance feature of a source environment by determining, from the ambiance feature, a source output produced by a source device in the source environment, and by determining an ambiance capability for a target environment. Some embodiments include determining, based on the ambiance capability, a target output for a target device in the target environment, and communicating with the target device to model the ambiance feature from the source environment onto the target environment by altering the target output produced by the target device.


English Abstract

Included are embodiments for sensing and adjusting features of an environment. Some embodiments include a system and/or method that for receiving an ambiance feature of a source environment, determining from the ambiance feature, a source output provided by a source device in the source environment, and determining an ambiance capability for a target environment. Some embodiments include determining, based on the ambiance capability, a target output for a target device in the target environment and communicating with the target device to model the ambiance feature from the source environment into the target environment by altering the target output provided by the target device.

Claims

Note: Claims are shown in the official language in which they were submitted.


Claims:

1. A method for sensing and adjusting features of an environment comprising:
receiving, by a sensor device that is coupled to a user computing device, an ambiance feature of a source environment;
determining, by the user computing device and from the ambiance feature, a source output provided by one or more source devices in the source environment, wherein determining the source output provided by the one or more source devices comprises determining the number of source devices in the source environment, the location of source devices in the source environment and the type of source devices in the source environment;
determining an ambiance capability for a target environment using the determined source output;
determining, based on the ambiance capability, a target output for one or more target devices in the target environment; and
communicating with the target device to model the ambiance feature from the source environment into the target environment by altering the target output provided by the one or more target devices, wherein modeling the ambiance feature from the source environment into the target environment comprises determining the number of target devices in the target environment, the location of the target devices in the target environment, and type of target devices in the target environment.
2. The method as in claim 1, wherein the ambiance feature comprises at least one of the following: an illumination signal, an audio signal, a scent signal, a temperature signal, a humidity signal, an air quality signal, and a wind signal.

3. The method as in claim 1 or 2, wherein the type of source device comprises at least one of the following: a light source, an audio source, a scent source, a temperature source, a humidity source, an air quality source, and a wind source.

4. The method as in any one of claims 1 to 3, in which communicating with the target device comprises sending a command to at least one of the following: a light source in the environment, an audio source in the environment, a scent source in the environment, a climate source in the environment, and a network device in the environment.

5. The method as in any one of claims 1 to 4, further comprising making a recommendation to alter the target environment to more accurately model the ambiance feature from the source environment.
6. A system for sensing and adjusting features of an environment comprising:
an image capture device for receiving an illumination signal for a source environment; and
a memory component that stores logic that causes the system to perform at least the following:
receive the illumination signal from the image capture device;
determine, from the illumination signal, an illumination ambiance in the source environment;
determine, from the illumination ambiance, a source output provided by one or more source devices in the source environment, wherein determining the source output provided by the one or more source devices comprises determining the number of source devices in the source environment, the location of source devices in the source environment and the type of source devices in the source environment;
determine an illumination capability for a target environment using the determined source output;
determine, based on the illumination capability, a target output for one or more light sources in the target environment; and
communicate with the light source to model the illumination ambiance from the source environment into the target environment by altering the target output provided by the one or more light sources, wherein modeling the illumination ambiance from the source environment into the target environment comprises determining the number of light sources in the target environment, the location of the light sources in the target environment, and type of light sources in the target environment.
7. The system as in claim 6, wherein the logic further causes the system to determine whether the illumination capability in the target environment is substantially accurate and, in response to determining that the illumination ambiance in the target environment is not substantially accurate, dynamically adjust the light source in the target environment.

8. The system as in claim 6 or 7, in which determining the illumination ambiance comprises determining a type of light source, wherein the type of light source comprises at least one of the following: a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, and a candle.

9. The system as in any one of claims 6 to 8, in which communicating with the light source comprises sending a command directly to at least one of the following: the light source and a network device that controls the light source.

10. The system as in any one of claims 6 to 9, in which determining data related to the illumination ambiance comprises sending data to a remote computing device and receiving the target output from the remote computing device.

11. The system as in any one of claims 6 to 10, in which the logic further causes the system to send the illumination ambiance to a remote computing device for utilization by other users.
12. A non-transitory computer-readable medium for sensing and adjusting features of an environment that stores a program that, when executed by a computing device, causes the computing device to perform at least the following:
receive an illumination signal;
determine, from the illumination signal, an illumination ambiance in a source environment;
determine, from the illumination ambiance, a source output provided by a source device in the source environment;
determine an illumination capability for a target environment using the determined source output;
determine, based on the illumination capability, a target output for a light source in the target environment;
communicate with the light source to model the illumination ambiance from the source environment into the target environment by altering the target output provided by the light source;
receive an updated lighting characteristic of the target environment;
determine whether the updated lighting characteristic substantially models the illumination ambiance from the source environment; and
in response to determining that the updated lighting characteristic does not substantially model the illumination ambiance from the source environment, alter the target output provided by the light source.

13. The non-transitory computer-readable medium as in claim 12, in which the logic further causes the computing device to store the updated lighting characteristic in response to determining that the updated lighting characteristic substantially models the illumination ambiance from the source environment.

14. The non-transitory computer-readable medium as in claim 12 or 13, in which determining the illumination ambiance comprises determining at least one of the following: a number of light sources in the source environment, a location of the light source in the source environment, and a size of the environment.

15. The non-transitory computer-readable medium as in any one of claims 12 to 14, in which determining the illumination ambiance comprises determining a type of illumination device, wherein the type of illumination device comprises at least one of the following: a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, and a candle.

16. The non-transitory computer-readable medium as in any one of claims 12 to 15, in which communicating with the light source comprises sending a command directly to at least one of the following: the light source and a network device that controls the light source.

17. The non-transitory computer-readable medium as in any one of claims 12 to 16, in which determining data related to the illumination ambiance comprises sending data to a remote computing device and receiving the target output from the remote computing device.
18. A method for dynamically adjusting a target environment, comprising:
receiving an ambiance characteristic of a source environment, the ambiance characteristic comprising source output and environment characteristic data;
determining an ambiance capability of a target environment according to the source output;
determining, based on the ambiance capability, a target output for a target device in the target environment;
communicating with the target device to model the ambiance feature from the source environment into the target environment by altering the target output provided by the target device; and
performing an iterative process of receiving the target output to determine whether the target output in the target environment is substantially accurate and, in response to determining that the target output in the target environment is not substantially accurate, dynamically adjusting the light source in the target environment.

19. The method as in claim 18, wherein the ambiance characteristic is received from at least one of the following: a source environment via a wireless signal, a source environment via a wired signal, a source environment via a 1-dimensional bar code, a source environment via a 2-dimensional bar code, a theme store, a website, and a sensor device.

Description

Note: Descriptions are shown in the official language in which they were submitted.


SENSING AND ADJUSTING FEATURES OF AN ENVIRONMENT

FIELD OF THE INVENTION

The present application relates generally to sensing and adjusting features of an environment and specifically to utilizing a computing device to determine features of a first environment for utilization in a second environment.

BACKGROUND OF THE INVENTION

Often a user will enter a first environment, such as a house, room, restaurant, hotel, office, etc. and an ambiance of that environment is found to be desirable. The features of the ambiance may include the lighting, sound, temperature, humidity, air quality, scent, etc. The user may then enter a second environment and desire to replicate ambiance from the first environment in that second environment. However, in order to replicate the ambiance of the first environment, the user may be forced to manually adjust one or more different settings in the second environment. Additionally, when the user is adjusting the settings he/she may be forced to refer only to his or her memory to implement the setting from the first environment. Further, as the second environment may include different light sources, heating systems, air conditioning systems, audio systems, etc., a user's attempt to manually replicate the ambiance from the first environment is often difficult if not futile.
SUMMARY OF THE INVENTION

Included are embodiments of a method for sensing and adjusting features of an environment. Some embodiments of the method are configured for receiving an ambiance feature of a source environment, determining from the ambiance feature, a source output provided by a source device in the source environment, and determining an ambiance capability for a target environment. Some embodiments include determining, based on the ambiance capability, a target output for a target device in the target environment and communicating with the target device to model the ambiance feature from the source environment into the target environment by altering the target output provided by the target device.

Also included are embodiments of a system. Some embodiments of the system include an image capture device for receiving an illumination signal for a source environment and a memory component that stores logic that causes the system to receive the illumination signal from the image capture device and determine, from the illumination signal, an illumination ambiance in the source environment. In some embodiments, the logic further causes the system to determine a characteristic of the source environment, and determine an illumination capability for a target environment. In still some embodiments, the logic causes the system to determine, based on the illumination capability, a target output for a light source in the target environment and communicate with the light source to model the illumination ambiance from the source environment into the target environment by altering the target output provided by the light source.

Also included are embodiments of a non-transitory computer-readable medium. Some embodiments of the non-transitory computer-readable medium include logic that causes a computing device to receive an illumination signal, determine, from the illumination signal, an illumination ambiance in a source environment, and determine a characteristic of the source environment. In some embodiments, the logic further causes the computing device to determine an illumination capability for a target environment, determine, based on the illumination capability, a target output for a light source in the target environment, and communicate with the light source to model the illumination ambiance from the source environment into the target environment by altering the target output provided by the light source. In still some embodiments, the logic causes the computing device to receive an updated lighting characteristic of the target environment, determine whether the updated lighting characteristic substantially models the illumination ambiance from the source environment, and in response to determining that the updated lighting characteristic does not substantially model the illumination ambiance from the source environment, alter the target output provided by the light source.
BRIEF DESCRIPTION OF THE DRAWINGS

It is to be understood that both the foregoing general description and the following detailed description describe various embodiments and are intended to provide an overview or framework for understanding the nature and character of the claimed subject matter. The accompanying drawings are included to provide a further understanding of the various embodiments, and are incorporated into and constitute a part of this specification. The drawings illustrate various embodiments described herein, and together with the description serve to explain the principles and operations of the claimed subject matter.

FIG. 1 depicts a plurality of environments from which an ambiance may be sensed and adjusted, according to embodiments disclosed herein;
FIG. 2 depicts a user computing device that may be utilized for sensing and adjusting features in an environment, according to embodiments disclosed herein;
FIG. 3 depicts a user interface that provides options to model an environment ambiance and apply a stored model, according to embodiments disclosed herein;
FIG. 4 depicts a user interface for determining a type of ambiance feature to capture in an environment, according to embodiments disclosed herein;
FIG. 5 depicts a user interface for receiving data from a source environment, according to embodiments disclosed herein;
FIG. 6 depicts a user interface for modeling the source environment, according to embodiments disclosed herein;
FIG. 7 depicts a user interface for storing a received ambiance, according to embodiments disclosed herein;
FIG. 8 depicts a user interface for receiving a theme from an environment, according to embodiments disclosed herein;
FIG. 9 depicts a user interface for applying a stored ambiance to a target environment, according to embodiments disclosed herein;
FIG. 10 depicts a user interface for receiving an ambiance capability for a target environment, according to embodiments disclosed herein;
FIG. 11 depicts a user interface for providing a suggestion to more accurately model the target environment according to the source environment, according to embodiments disclosed herein;
FIG. 12 depicts a user interface for providing options to apply additional ambiance features to the target environment, according to embodiments disclosed herein;
FIG. 13 depicts a flowchart for modeling an ambiance feature in a target environment, according to embodiments disclosed herein;
FIG. 14 depicts a flowchart for determining whether an ambiance feature has previously been stored, according to embodiments disclosed herein; and
FIG. 15 depicts a flowchart for determining whether an applied ambiance feature substantially matches a theme, according to embodiments disclosed herein.
DETAILED DESCRIPTION OF THE INVENTION

Embodiments disclosed herein include systems and methods for sensing and adjusting features in an environment. More specifically, in some embodiments, a user may enter a source environment, such as a house, room, office, hotel, restaurant, etc. and realize that the ambiance is pleasing. The ambiance may include the lighting, the sound, the scent, the climate, and/or other features of the source environment. Accordingly, the user may utilize a user computing device, such as a mobile phone, personal digital assistant (PDA), laptop computer, tablet computer, etc. to capture an ambiance feature of the source environment. More specifically, the user computing device may include (or be coupled to a device that includes) an image capture device, a microphone, a gyroscope, an accelerometer, a positioning system, a thermometer, a humidity sensor, an air quality sensor, and/or other sensors for determining the ambiance features of the source environment. As an example, if the user determines that the lighting in the source environment is appealing, the user may select an option on the user computing device that activates the image capture device. The image capture device may capture lighting characteristics of the source environment. The lighting characteristics may include a light intensity, a light frequency, a light distribution, etc., as well as dynamic changes thereof over time. With this information, the user computing device can determine a source output, which (for lighting) may include a number of light sources, a light output of the sources, whether the light is diffuse light, columnar light, direct light, or reflected light, a color temperature of the light, overall brightness, etc. The user computing device may also determine a characteristic of the source environment, such as size, coloring, acoustics, and/or other characteristics. Once the user computing device has determined the source output, this data may be stored locally and/or sent to a remote computing device for storage.
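
As a rough illustration of how such lighting characteristics might be derived from a captured frame, consider the following sketch. It is a minimal example, assuming the frame is available as an image file; the function name, the Rec. 709 luminance weights, and the red/blue "warmth" heuristic are illustrative choices, not anything prescribed by this application.

    # Hypothetical sketch: coarse lighting characteristics from one captured frame.
    import numpy as np
    from PIL import Image

    def lighting_characteristics(path):
        """Estimate overall brightness, warm/cool balance, and dynamic range."""
        rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
        # Relative luminance per pixel (Rec. 709 weights), scaled to 0..1.
        luminance = (rgb @ np.array([0.2126, 0.7152, 0.0722])) / 255.0
        red, blue = rgb[..., 0].mean(), rgb[..., 2].mean()
        return {
            "overall_brightness": float(luminance.mean()),
            "warmth": float(red / (blue + 1e-9)),  # > 1 suggests warmer light
            "dynamic_range": float(luminance.max() - luminance.min()),
        }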
Once a source output is determined, the user device may implement the ambiance from the source environment into a target environment. In the lighting context, the user may utilize the image capture device (and/or other components, such as the positioning system, gyroscope, accelerometer, etc.) to determine an ambiance capability (such as an illumination capability in the lighting context, or an audio capability, a scent capability, a climate capability, etc. in other contexts) of the target environment. Again, in the lighting context, the ambiance capability may be determined from a number and position of target devices (such as light sources or other output devices), windows, furniture, and/or other components. Other features of the target environment may also be determined, such as size, global position, coloring, etc.

Additionally, the user computing device can determine alterations to make to the light sources in the target environment to substantially model the ambiance feature from the source environment. This determination may be made by comparing the location and position of the output sources in the source environment, as well as the light actually realized from those output sources, with the determined ambiance capability of the target environment. As an example, if the source environment is substantially similar to the target environment, the user computing device can determine that the output (such as lighting effects) provided by the light sources should be approximately the same. If there are differences between the source environment and the target environment, those differences may be factored into the analysis. More specifically, when the source environment and target environment are different, the combination of light output and room dynamics adds up to the visual feeling of the environment. For example, because the source environment and the target environment are different, the light outputs could be substantially different. However, due to the room size, reflective characteristics, wall color, etc. of the source environment and the target environment, embodiments disclosed herein may shape the light output such that the ambiance "felt" by the image capture device would be similar. As such, some embodiments may utilize a feedback loop configuration to dynamically assess the source environment and/or target environment and dynamically adjust the settings and ensure accuracy.
Once the alterations are determined, the user computing device can communicate with the output sources directly and/or with a network component that controls the output sources. The user computing device may additionally reexamine the target environment to determine whether the adjustments made substantially model the ambiance feature from the source environment. If not, further alterations may be made. If the alterations are acceptable, the settings for this ambiance may be stored.

It should be understood that in some embodiments where the source output data (which includes data about the ambiance characteristics in the source environment) is sent to a remote computing device, the remote computing device may receive the source output data and create an application to send to the user computing device for implementing the ambiance into a target environment. This may be accomplished such that the ambiance may be implemented in any environment (with user input on parameters of the target environment). Similarly, in some embodiments, the user computing device may additionally send environmental characteristics data (such as size, shape, position, etc. of an environment), such that the remote computing device can create an application to implement the ambiance in the particular target environment.

Additionally, some embodiments may be configured with a feedback loop for continuous and/or repeated monitoring and adjustment of settings in the target environment. More specifically, the user computing device may be configured to take a plurality of measurements of the source environment to determine a current ambiance. Similarly, when modeling the current ambiance into the target environment, the user computing device can send data related to the current ambiance to a target device. Additionally, once the adjustments to the target environment are implemented, the user computing device can monitor the ambiance, calculate adjustments, and send those adjustments to achieve a desired target ambiance. This may continue for a predetermined number of iterations or until accuracy is achieved within a predetermined threshold.
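
A minimal sketch of such a feedback loop follows, assuming a measure() callable that re-samples the target environment and an apply_adjustments() callable that pushes deltas to the target devices; the iteration budget and tolerance are stand-ins for the "predetermined" values mentioned above.

    # Hypothetical feedback loop: adjust until within tolerance or budget exhausted.
    def converge(desired, measure, apply_adjustments, max_iters=5, tol=0.10):
        for _ in range(max_iters):
            current = measure()  # re-sample the target environment's ambiance
            error = max(abs(desired[k] - current.get(k, 0.0)) for k in desired)
            if error <= tol:
                return True      # modeled within the accuracy threshold
            # Send the remaining delta for each feature toward the desired value.
            apply_adjustments({k: desired[k] - current.get(k, 0.0) for k in desired})
        return False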
It should also be understood that, as described herein, embodiments of a light source may include any component that provides a visible form of light, including a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, a candle, etc. Thus, a light source may take many shapes, sizes, and forms and, since the inception of electric lighting, light sources have matured to include many types of emission sources. Incandescence, electroluminescence, and gas discharge have each been used in various lighting apparatus and, for each, the primary emitting element (e.g., incandescent filaments, light-emitting diodes, gas, plasma, etc.) may be configured in any number of ways according to the intended application. Many embodiments of light sources described herein are susceptible to use with almost any type of emission source, as will be understood by a person of ordinary skill in the art upon reading the following described embodiments.

For example, certain embodiments may include light-emitting diodes (LEDs), LED light sources, lighted sheets, and the like. In these embodiments, a person of ordinary skill in the art will readily appreciate the nature of the limitation (e.g., that the embodiment contemplates a planar illuminating element) and the scope of the described embodiment (e.g., that any type of planar illuminating element may be employed). LED lighting arrays come in many forms including, for instance, arrays of individually packaged LEDs arranged to form generally planar shapes (i.e., shapes having a thickness small relative to their width and length). Arrays of LEDs may also be formed on a single substrate or on multiple substrates, and may include one or more circuits (i.e., to illuminate different LEDs), various colors of LEDs, etc. Additionally, LED arrays may be formed by any suitable semiconductor technology including, by way of example and not limitation, metallic semiconductor material and organic semiconductor material. In any event, for embodiments utilizing an LED material or a planar illuminated sheet, any suitable technology known presently or later invented may be employed in cooperation with other elements without departing from the invention described herein.
Referring now to the drawings, FIG. 1 depicts a plurality of environments from which an ambiance may be sensed and adjusted, according to embodiments disclosed herein. As illustrated in FIG. 1, a network 100 may include a wide area network, such as the Internet, a local area network (LAN), a mobile communications network, a public service telephone network (PSTN) and/or other network and may be coupled to a user computing device 102, remote computing device 104, and a target environment 110b. Also included is a source environment 110a. The source environment 110a may include one or more output devices 112a - 112d, which in FIG. 1 are depicted as light sources. As discussed above, a light source may include any component that provides a visible form of light, including a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, a candle, etc.

Similarly, the target environment 110b may also include one or more output devices 114a - 114c. While the output devices 112 and 114 are illustrated as light sources in FIG. 1 that provide an illumination ambiance, other sources may also be considered within the scope of this disclosure, including an audio source, a scent source, a climate source (such as a temperature source, a humidity source, an air quality source, a wind source, etc.) and/or other sources. As illustrated, in some embodiments, the source environment 110a and target environment 110b may each be coupled to the network 100, such as via a network device. The network device may include any local area and/or wide area device for controlling an output device in an environment. Such network devices may be part of a "smart home" and/or other intelligent system. From the source environment 110a, the network connection may provide the user computing device 102 with a mechanism for receiving an ambiance theme and/or other data related to the source environment 110a. Similarly, by coupling to the network 100, the target environment 110b may provide the user computing device 102 with a mechanism for controlling one or more of the output devices 114. Regardless, it should be understood that these connections are merely examples, as either or both may or may not be coupled to the network 100.

Additionally, the user computing device 102 may include a memory component 140 that stores source environment logic 144a for functionality related to determining characteristics of the source environment 110a. The memory component 140 also stores target environment logic 144b for modeling the ambiance features from the source environment 110a and applying those ambiance features into the target environment 110b.

It should be understood that while the user computing device 102 and the remote computing device 104 are depicted as a mobile computing device and server respectively, these are merely examples. More specifically, in some embodiments any type of computing device (e.g. mobile computing device, personal computer, server, etc.) may be utilized for either of these components. Additionally, while each of these computing devices 102, 104 is illustrated in FIG. 1 as a single piece of hardware, this is also an example. More specifically, each of the computing devices 102, 104 depicted in FIG. 1 may represent a plurality of computers, servers, databases, etc.

It should also be understood that while the source environment logic 144a and the target environment logic 144b are depicted in the user computing device 102, this is also just an example. In some embodiments, the user computing device 102 and/or the remote computing device 104 may include this and/or similar logical components.

Further, while FIG. 1 depicts embodiments in the lighting context, other contexts are included within the scope of this disclosure. As an example, while the user computing device 102 may include a scent sensor, in some embodiments a scent sensor may be included in an air freshener (or other external device) that is located in the source environment 110a and is in communication with the user computing device 102. The air freshener may determine an aroma in the source environment 110a and may communicate data related to that aroma to the user computing device 102. Similarly, in some embodiments, the air freshener may be set to produce an aroma and may send data related to the settings for producing that aroma. In the target environment 110b, another air freshener may be in communication with the user computing device 102 for providing the aroma data received from the source environment 110a. With this information, the air freshener may implement the aroma to model the ambiance from the source environment 110a.
FIG. 2 depicts a user computing device 102 that may be utilized for sensing and adjusting features in an environment, according to embodiments disclosed herein. In the illustrated embodiment, the user computing device 102 includes at least one processor 230, input/output hardware 232, network interface hardware 234, a data storage component 236 (which includes product data 238a, user data 238b, and/or other data), and the memory component 140. The memory component 140 may be configured as volatile and/or nonvolatile memory and, as such, may include random access memory (including SRAM, DRAM, and/or other types of RAM), flash memory, secure digital (SD) memory, registers, compact discs (CD), digital video discs (DVD), and/or other types of non-transitory computer-readable mediums. Depending on the particular embodiment, these non-transitory computer-readable mediums may reside within the user computing device 102 and/or external to the user computing device 102.

Additionally, the memory component 140 may be configured to store operating logic 242, the source environment logic 144a, and the target environment logic 144b. The operating logic 242 may include an operating system, basic input output system (BIOS), and/or other hardware, software, and/or firmware for operating the user computing device 102. The source environment logic 144a and the target environment logic 144b may each include a plurality of different pieces of logic, each of which may be embodied as a computer program, firmware, and/or hardware, as an example. A local interface 246 is also included in FIG. 2 and may be implemented as a bus or other interface to facilitate communication among the components of the user computing device 102.

The processor 230 may include any processing component operable to receive and execute instructions (such as from the data storage component 236 and/or memory component 140). The input/output hardware 232 may include and/or be configured to interface with a monitor, positioning system, keyboard, mouse, printer, image capture device, microphone, speaker, gyroscope, accelerometer, compass, thermometer, humidity sensor, air quality sensor and/or other device for receiving, sending, and/or presenting data. The network interface hardware 234 may include and/or be configured for communicating with any wired or wireless networking hardware, including an antenna, a modem, LAN port, wireless fidelity (Wi-Fi) card, WiMax card, mobile communications hardware, and/or other hardware for communicating with other networks and/or devices. From this connection, communication may be facilitated between the user computing device 102 and other computing devices. The processor 230 may also include and/or be coupled to a graphical processing unit (GPU).

It should be understood that the components illustrated in FIG. 2 are merely exemplary and are not intended to limit the scope of this disclosure. As an example, while the components in FIG. 2 are illustrated as residing within the user computing device 102, this is merely an example. In some embodiments, one or more of the components may reside external to the user computing device 102. It should also be understood that, while the user computing device 102 in FIG. 2 is illustrated as a single device, this is also merely an example. In some embodiments, the source environment logic 144a and the target environment logic 144b may reside on different devices. Additionally, while the user computing device 102 is illustrated with the source environment logic 144a and the target environment logic 144b as separate logical components, this is also an example. In some embodiments, a single piece of logic may perform the described functionality.
FIG. 3 depicts a user interface 300 that provides options to model an environment ambiance and apply a stored model, according to embodiments disclosed herein. As illustrated, the user computing device 102 may include a sensor device 318 and an application that provides the user interface 300. The sensor device 318 depicted in FIG. 3 represents any sensor device that may be integral to and/or coupled with the user computing device 102. More specifically, the sensor device 318 may be configured as an image capture device, a microphone, a scent sensor, a humidity sensor, a temperature sensor, an air quality sensor, a wind sensor, etc.

Similarly, the user interface 300 may include a model environment option 320 and an apply stored model option 322. As described in more detail below, the model environment option 320 may be selected to facilitate capture of ambiance data from a source environment 110a. The apply stored model option 322 may be selected to retrieve stored ambiance data from the source environment 110a and apply that data to the target environment 110b.

FIG. 4 depicts a user interface 400 for determining a type of ambiance feature to capture in an environment, according to embodiments disclosed herein. As illustrated, in response to selection of the model environment option 320, the user interface 400 may be provided with a lighting option 420, a sound option 422, a scent option 424, and a climate option 426. More specifically, the user may select one or more of the options 420 - 426 to capture the corresponding data from the source environment 110a. As an example, by selecting the lighting option 420, the user computing device 102 may acquire lighting data via the sensor device 318, which may be embodied as an image capture device. By selecting the sound option 422, audio signals may be captured by the sensor device 318, which may be embodied as a microphone. By selecting the scent option 424, the user computing device 102 may capture scents via the sensor device 318, which may be embodied as a scent sensor. By selecting the climate option 426, the user computing device 102 may capture a temperature signal, a humidity signal, an air quality signal, a wind signal, etc. via the sensor device 318, which may be embodied as a thermometer, humidity sensor, air quality sensor, etc.
FIG. 5 depicts a user interface 500 for receiving data from the source environment 110a, according to embodiments disclosed herein. As illustrated, in response to selection of the lighting option 420, the image capture device may be utilized to capture lighting data from the source environment 110a and display at least a portion of that data in the user interface 500. By selecting the capture option 520, the image capture device may capture an image of the source environment 110a. While FIG. 5 depicts that the image data is a photographic image of the environment and source devices, this is merely an example. In some embodiments, the user interface 500 may simply provide a graphical representation of light intensity (such as a color representation). Regardless of the display provided in the user interface 500, the user computing device 102 may utilize the received ambiance feature (which in this case is lighting data) to determine source output data, such as the location, number, and intensity of light sources in the source environment 110a. Other determinations may also be made, such as the size and color of the environment, and whether the light sources are internal light sources (such as lamps, overhead lights, televisions, electronic components, etc.) or external light sources (such as the sun, moon, stars, street lamps, automobiles, etc.).
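
One plausible way to recover the number, location, and intensity of light sources from a captured frame is to threshold near-saturated pixels and group them into connected regions; the sketch below assumes SciPy is available and that bright blobs correspond to sources, which is only a heuristic, not a method prescribed by this application.

    # Hypothetical sketch: count and locate bright regions in a captured frame.
    import numpy as np
    from PIL import Image
    from scipy import ndimage

    def find_light_sources(path, thresh=0.9):
        gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64) / 255.0
        mask = gray > thresh                  # pixels near saturation
        labels, count = ndimage.label(mask)   # connected bright regions
        index = range(1, count + 1)
        centers = ndimage.center_of_mass(mask, labels, index)
        intensities = ndimage.mean(gray, labels, index)
        return count, list(zip(centers, intensities))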
It should be understood that while the user interface 500 of FIG. 5 depicts the source environment 110a in the context of determining the lighting ambiance, this is merely an example. More specifically, if the sound option 422 (from FIG. 4) is selected, a microphone may be utilized to capture audio data from the source environment 110a. The user may direct the user computing device 102 across the environment. From the received audio data, the user computing device 102 can determine the source, intensity, frequency, etc. of the audio from the environment.

In response to selection of the scent option 424 (FIG. 4), the user computing device 102 may receive scent data from a scent sensor. As with the other sensors disclosed herein, the scent sensor may be integral with or coupled to the user computing device 102. Similarly, in response to selection of the climate option 426 (FIG. 4), the user computing device 102 may receive climate related data from the source environment 110a, such as via a temperature sensor, a humidity sensor, an air quality sensor, etc. With this data, the user computing device 102 can determine a climate ambiance for the source environment 110a.

FIG. 6 depicts a user interface 600 for modeling the source environment 110a, according to embodiments disclosed herein. As illustrated, the user interface 600 includes an indication of the number of output sources that were located in the source environment 110a, as well as features of the source environment 110a itself. This determination may be made based on an intensity analysis of the output from the output sources. Additionally, a graphical representation 620 of the source environment 110a may also be provided. If the user computing device is incorrect regarding the environment and/or output sources, the user may alter the graphical representation 620 to add, move, delete, or otherwise change the graphical representation 620. Additionally, a correct option 622 is also included for indicating when the ambiance features of the source environment 110a are accurately determined.

FIG. 7 depicts a user interface 700 for storing a received ambiance, according to embodiments disclosed herein. As illustrated, the user interface 700 includes a keyboard for entering a name for the output source data and source environment data from FIG. 6.
FIG. 8 depicts a user interface 800 for receiving a theme from an environment, according to embodiments disclosed herein. As illustrated, the user interface 800 may be provided in response to a determination by the user computing device 102 that a source environment 110a is broadcasting a theme or other ambiance data. More specifically, the embodiments discussed with reference to FIGS. 3 - 7 address the situation where the user computing device 102 actively determines the ambiance characteristics of the source environment 110a. However, in FIG. 8, the user computing device 102 need not make this determination because the source environment 110a is broadcasting the ambiance characteristics (e.g., the source output data, the environment characteristics data and/or other data), such as via a wireless local area network. Accordingly, in response to receiving the ambiance characteristics, the user interface 800 may be provided with options for storing the received data.

It should also be understood that other mechanisms may be utilized for receiving the ambiance characteristics of the source environment 110a. In some embodiments, the user may scan a 1-dimensional or 2-dimensional bar code to receive information pertaining to the source environment 110a. In some embodiments, the information may be sent to the user computing device 102 via a text message, email message, and/or other messaging. Similarly, in some embodiments, a theme store may be accessible over a wide area network and/or local area network for receiving any number of different themes. In the theme store, users may be provided with options to purchase, upload, and/or download themes for use in a target environment.
Additionally, some embodiments may be configured to upload and/or download ambiance characteristics to and/or from a website, such as a social media website, a mapping website, etc. As an example in the social media context, a restaurant or other source environment controller may provide the ambiance characteristics on a page dedicated to that restaurant. Thus, when users visit that page, they may download the ambiance. Additionally, when a user mentions the restaurant in a public or private posting, the social media website may provide a link to that restaurant that may also include a link to download the ambiance characteristics. Similarly, in the mapping website context, a user can upload ambiance characteristics to the mapping website, such that when a map, satellite image, or other image of that environment is provided, a link to download the ambiance may also be provided.
FIG. 9 depicts a user interface 900 for applying a stored ambiance to the target environment 110b, according to embodiments disclosed herein. As illustrated, the user interface 900 may be provided in response to selection of the apply stored model option 322 from FIG. 3. Accordingly, the user interface 900 may provide a "dad's house" option 920, a "sis' kitchen" option 922, a "fav eatery" option 924, and a "beach" option 926. As discussed in more detail below, by selecting one or more of the options 920 - 926, the user computing device 102 can apply the stored ambiance to the target environment 110b.

FIG. 10 depicts a user interface 1000 for receiving an ambiance capability for the target environment 110b, according to embodiments disclosed herein. As illustrated, the user interface 1000 may be configured to capture imagery and/or other data from the target environment 110b and utilize that data to determine an ambiance capability of the target environment 110b. The ambiance capability may be portrayed in a graphical representation 1002, which may be provided as a photographic image, video image, altered image, etc. Also included are an apply option 1022 and an amend option 1024. More specifically, by selecting the amend option 1024, the user may add, edit, move, and/or otherwise change the output sources that are provided in the user interface 1000.
FIG. 11 depicts a user interface 1100 for providing a suggestion to more accurately model the target environment 110b according to the source environment 110a, according to embodiments disclosed herein. As illustrated, the user interface 1100 is similar to the user interface 1000 from FIG. 10, except that the user computing device 102 has determined that changes to the target environment 110b would allow a greater accuracy in modeling the ambiance from the source environment 110a. As such, the user interface 1100 may provide a graphical representation 1120, which illustrates a change and a location of that change. An option 1122 may be provided to navigate away from the user interface 1100.
FIG. 12 depicts a user interface 1200 for providing options to apply additional ambiance features to the target environment 110b, according to embodiments disclosed herein. As illustrated, the user interface 1200 may be provided in response to selection of the apply option 1022 from FIG. 10. Once the apply option 1022 is selected, the selected ambiance may be applied to the target environment 110b. More specifically, with regard to FIGS. 9 - 11, determinations regarding the target environment 110b have been made for more accurately customizing the desired ambiance to that target environment 110b. Once the determinations are made, the user computing device 102 may communicate with one or more of the output devices to implement the desired changes. The communication may be directly with the output devices, if the output devices are so configured. Additionally, in some embodiments, the user computing device 102 may simply communicate with a networking device that controls the output of the output devices. Upon receiving the instructions from the user computing device 102, the networking device may alter the output of the output devices.
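
The two communication paths might look like the following sketch: address the output device directly when it exposes its own endpoint, and otherwise route the command through the networking device that controls it. The URLs, field names, and HTTP verb are invented for illustration, not part of this application.

    # Hypothetical command dispatch: direct to the device, or via its controller.
    import json
    import urllib.request

    def send_command(device, settings):
        if device.get("direct_url"):          # device is network-addressable
            url = device["direct_url"]
        else:                                 # fall back to the network controller
            url = device["controller_url"] + "/devices/" + device["id"]
        req = urllib.request.Request(
            url,
            data=json.dumps(settings).encode(),
            headers={"Content-Type": "application/json"},
            method="PUT",
        )
        urllib.request.urlopen(req)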

FIG. 13 depicts a flowchart for modeling an ambiance feature in a target environment, according to embodiments disclosed herein. As illustrated in block 1330, an ambiance feature of a source environment may be received. As discussed above, the ambiance feature may include those features of the source environment that may be detected by the sensor device 318, such as light (e.g., an illumination signal), an audio signal, a scent signal, and a climate signal (such as temperature, humidity, air quality, etc.) and/or other features. At block 1332, a determination may be made from the ambiance feature regarding a source output provided by a source device in the source environment. More specifically, the determination may include determining a type of source device (such as a type of illumination device or other output device), where the type of illumination device includes a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, a candle, etc. At block 1334, a determination may be made regarding an ambiance capability for a target environment. At block 1336, a determination may be made, based on the ambiance capability of the target environment, regarding a target output for the target device in the target environment. The target device may include an output device, such as a light source, audio source, climate source, etc. that is located in the target environment and/or a networking device that controls the output devices. At block 1338, a communication may be facilitated with the target device to model the ambiance feature from the source environment into the target environment by altering the target output provided by the target device. In some embodiments, modeling the ambiance feature from the source environment into the target environment includes determining a number of target devices in the target environment, a location of the target device in the target environment, a type of target device in the target environment (such as a type of light source), etc. Similarly, in some embodiments the communication may include sending a command to the target device.
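
Read as code, the FIG. 13 flow reduces to a five-step pipeline. In this sketch each block is passed in as a callable, since the application leaves the individual determinations unspecified; the function and parameter names are placeholders.

    # Hypothetical pipeline mirroring blocks 1330-1338 of FIG. 13.
    def model_ambiance(capture, determine_source_output, determine_capability,
                       plan_target_output, send_to_target):
        feature = capture()                                            # block 1330
        source_output = determine_source_output(feature)               # block 1332
        capability = determine_capability()                            # block 1334
        target_output = plan_target_output(capability, source_output)  # block 1336
        send_to_target(target_output)                                  # block 1338
        return target_output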
FIG. 14 depicts a flowchart for determining whether an ambiance feature has previously been stored, according to embodiments disclosed herein. As illustrated in block 1430, the user computing device 102 may enter a target environment. At block 1432, a determination may be made regarding whether an ambiance setting is currently stored. If an ambiance setting is not currently stored, the user computing device 102 may be taken to a source environment and the process may proceed to block 1330 in FIG. 13. If an ambiance setting is currently stored, at block 1436 the stored settings may be retrieved. At block 1438, the user computing device 102 can communicate with the target environment to alter target devices to match the stored settings.
FIG. 15 depicts a flowchart for determining whether an applied ambiance feature substantially matches a theme, according to embodiments disclosed herein. As illustrated in block 1530, a theme ambiance may be received. At block 1532, a request to apply the theme to the target environment may be received. At block 1534, the user computing device 102 may communicate with the target environment to alter the target devices to match the theme. At block 1536, an ambiance feature may be received from the target environment. At block 1538, a determination may be made regarding whether the ambiance feature substantially matches the theme. This determination may be based on a predetermined threshold for accuracy. If the ambiance feature does substantially match, at block 1542, the settings of the target devices may be stored. If the ambiance feature does not substantially match, the user computing device 102 can alter the target devices to provide an updated ambiance feature (such as an updated lighting characteristic) to more accurately model the theme.
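
The "substantially matches" test of block 1538 could be any comparison against the predetermined threshold; one simple possibility, sketched here under that assumption, treats the measured ambiance and the theme as feature vectors and compares their normalized distance.

    # Hypothetical match test for block 1538 of FIG. 15.
    import math

    def matches_theme(measured, theme, tol=0.1):
        keys = sorted(theme)
        dist = math.sqrt(sum((measured.get(k, 0.0) - theme[k]) ** 2 for k in keys))
        return dist / max(1, len(keys)) <= tol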
The dimensions and values disclosed herein are not to be understood as being strictly limited to the exact numerical values recited. Instead, unless otherwise specified, each such dimension is intended to mean both the recited value and a functionally equivalent range surrounding that value. For example, a dimension disclosed as "40 mm" is intended to mean "about 40 mm."

The citation of any document, including any cross referenced or related patent or application, is not an admission that it is prior art with respect to any invention disclosed or claimed herein or that it alone, or in any combination with any other reference or references, teaches, suggests or discloses any such invention. Further, to the extent that any meaning or definition of a term in this document conflicts with any meaning or definition of the same term in a document cited herein, the meaning or definition assigned to that term in this document shall govern.

While particular embodiments of the present invention have been illustrated and described, it would be understood by those skilled in the art that various other changes and modifications can be made without departing from the invention described herein.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01: As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new in-house solution.

Please note that events beginning with "Inactive:" refer to events that are no longer used in our new in-house solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History should be consulted.

Event History

Description Date
Inactive: COVID 19 - Deadline extended 2020-03-29
Inactive: IPC expired 2020-01-01
Common Representative Appointed 2019-10-30
Common Representative Appointed 2019-10-30
Grant by Issuance 2018-06-19
Inactive: Cover page published 2018-06-18
Pre-grant 2018-05-07
Inactive: Final fee received 2018-05-07
Inactive: IPC expired 2018-01-01
Inactive: IPC expired 2018-01-01
Notice of Allowance is Issued 2017-11-07
Letter Sent 2017-11-07
Notice of Allowance is Issued 2017-11-07
Inactive: Q2 passed 2017-10-31
Inactive: Approved for allowance (AFA) 2017-10-31
Amendment Received - Voluntary Amendment 2017-06-14
Appointment of Agent Requirements Determined Compliant 2017-01-04
Inactive: Office letter 2017-01-04
Inactive: Office letter 2017-01-04
Revocation of Agent Requirements Determined Compliant 2017-01-04
Inactive: S.30(2) Rules - Examiner requisition 2016-12-14
Inactive: Report - No QC 2016-12-08
Revocation of Agent Request 2016-12-01
Change of Address or Method of Correspondence Request Received 2016-12-01
Appointment of Agent Request 2016-12-01
Inactive: Adhoc Request Documented 2016-11-28
Inactive: Office letter 2016-11-28
Revocation of Agent Request 2016-11-03
Appointment of Agent Request 2016-11-03
Amendment Received - Voluntary Amendment 2016-06-07
Inactive: S.30(2) Rules - Examiner requisition 2015-12-07
Inactive: Report - No QC 2015-08-07
Amendment Received - Voluntary Amendment 2015-04-09
Inactive: S.30(2) Rules - Examiner requisition 2014-10-09
Inactive: Report - QC passed 2014-10-01
Inactive: IPC assigned 2014-04-25
Inactive: IPC assigned 2014-02-12
Inactive: First IPC assigned 2014-02-12
Inactive: IPC assigned 2014-02-12
Inactive: IPC assigned 2014-02-12
Inactive: IPC assigned 2014-02-12
Inactive: Cover page published 2013-12-10
Inactive: First IPC assigned 2013-12-03
Inactive: IPC removed 2013-12-03
Inactive: IPC assigned 2013-12-03
Inactive: First IPC assigned 2013-12-02
Letter Sent 2013-12-02
Letter Sent 2013-12-02
Inactive: Acknowledgment of national entry - RFE 2013-12-02
Inactive: IPC assigned 2013-12-02
Application Received - PCT 2013-12-02
National Entry Requirements Determined Compliant 2013-10-24
Request for Examination Requirements Determined Compliant 2013-10-24
Amendment Received - Voluntary Amendment 2013-10-24
All Requirements for Examination Determined Compliant 2013-10-24
Application Published (Open to Public Inspection) 2012-11-01

Abandonment History

There is no abandonment history.

Maintenance Fees

The last payment was received on 2018-04-25.

Note: If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following:

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse a deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Owners on Record

The current owners on record and past owners are shown in alphabetical order.

Current Owners on Record
THE PROCTER & GAMBLE COMPANY
Past Owners on Record
COREY MICHAEL BISCHOFF
DANA PAUL GRUENBACHER
ERIK JOHN HASENOEHRL
HUIQING Y. STANLEY
KENNETH STEPHEN MCGUIRE
MARK JOHN STEINHARDT
WILLIAM PAUL, III MAHONEY
Past owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.

If you have any difficulty accessing content, you can call the Client Service Centre at 1-866-997-1936, or send an e-mail to the CIPO Client Service Centre.


Document Description | Date (yyyy-mm-dd) | Number of pages | Image size (KB)
Claims 2017-06-13 6 185
Description 2013-10-23 15 886
Representative drawing 2013-10-23 1 11
Abstract 2013-10-23 1 70
Drawings 2013-10-23 10 123
Claims 2013-10-23 5 191
Description 2013-10-24 15 880
Claims 2013-10-24 5 190
Cover Page 2013-12-09 2 44
Description 2015-04-08 15 877
Claims 2015-04-08 6 194
Claims 2016-06-06 6 197
Cover Page 2018-05-22 1 43
Representative drawing 2018-05-22 1 9
Acknowledgement of Request for Examination 2013-12-01 1 176
Notice of National Entry 2013-12-01 1 202
Courtesy - Certificate of registration (related document(s)) 2013-12-01 1 102
Commissioner's Notice - Application Found Allowable 2017-11-06 1 162
PCT 2013-10-23 11 602
Examiner Requisition 2015-12-06 8 475
Amendment / response to report 2016-06-06 9 325
Correspondence 2016-11-02 3 129
Examiner Requisition 2016-12-13 6 361
Correspondence 2016-11-30 3 131
Courtesy - Office Letter 2017-01-03 1 22
Courtesy - Office Letter 2017-01-03 1 29
Courtesy - Office Letter 2016-11-27 138 5 840
Amendment / response to report 2017-06-13 12 505
Final fee 2018-05-06 2 43