Patent 2917383 Summary

(12) Patent: (11) CA 2917383
(54) English Title: SYSTEM FOR GENERATING PROCEDURAL TEXTURES USING PARTICLES
(54) French Title: SYSTEME DE GENERATION DE TEXTURES PROCEDURALES A PARTIR DE PARTICULES
Status: Granted
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06T 15/04 (2011.01)
(72) Inventors :
  • DEGUY, SEBASTIEN (France)
  • SOUM, CHRISTOPHE (France)
  • DAMEZ, CYRILLE (France)
  • BATUT, ERIC (France)
(73) Owners :
  • ALLEGORITHMIC (France)
(71) Applicants :
  • ALLEGORITHMIC (France)
(74) Agent: GOWLING WLG (CANADA) LLP
(74) Associate agent:
(45) Issued: 2023-08-01
(86) PCT Filing Date: 2014-07-15
(87) Open to Public Inspection: 2015-01-22
Examination requested: 2018-09-13
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/IB2014/001327
(87) International Publication Number: WO2015/008135
(85) National Entry: 2016-01-05

(30) Application Priority Data:
Application No. Country/Territory Date
13/01709 France 2013-07-18

Abstracts

English Abstract


System for generating textures on an object based on the particles emitted by a particle engine, comprising: - an access to data (11) of a particles emitter, of particles (12) emitted, of target object (13), of traces (14), and of graphical effects (15); - an animation simulation module (4) provided so as to perform a simulation of emission and of displacement for each of the particles provided; - a tracer module (5) provided for generating a trace on the surface of a target object corresponding to the displacement of a particle along said surface after an impact of the particle against the target object with the aid of the traces data and of the target object data; - a physical parameters integrator module (6) provided for generating a new set of textures for said object taking into account the data of the object, the data of each new or modified trace, and the data of the corresponding graphical effects. Corresponding method for generating textures.


French Abstract

Système de génération de textures sur un objet à partir des particules émises par un moteur de particules, comprenant: - un accès à des données (11 ) d'émetteur de particules, de particules (12) émises, d'objet cible (13), de traces (14), et d'effets graphiques (15); - un module (4) de simulation d'animation, prévu pour effectuer une simulation d'émission et de déplacement pour chacune des particules prévues; - un module (5) traceur, prévu pour générer une trace sur la surface d'un objet cible correspondant au déplacement d'une particule de long de ladite surface après un impact de la particule contre l'objet cible à l'aide des données de traces et des données d'objet cible; - un module (6) intégrateur de paramètres physiques, prévu pour générer un nouvel ensemble de textures pour ledit objet prenant en compte les données de l'objet, les données de chaque trace nouvelle ou modifiée, et les données des effets graphiques correspondants. Procédé de génération de textures correspondant.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
1. A system comprising:
at least one processor configured to cause the system to:
emit, by a virtual particle emitter, virtual particles onto a target object;
identify object data comprising one or more first procedural textures for the target object;
generate, for each virtual particle impact with the target object, a
parameterized
trace along a surface of the target object;
perform graphical effects along the surface of the target object based on the
object
data and one or more parameterized traces, wherein performing graphical
effects
comprises one or more of:
combining superimposed crack traces comprising a point of a common
path such that a first crack trace stops upon merging into a second crack
trace that
continues along the surface of the target object;
combining superimposed liquid traces by merging together two or more
liquid traces when they are within a threshold distance of each other; or
avoiding a combination of impact traces;
generate one or more second procedural textures for the target object based on
the
object data and the graphical effects; and
integrate the one or more second procedural textures with the target object.
2. The system of claim 1, wherein the at least one processor is further
configured to
cause the system to determine:
emitter data from the virtual particle emitter; and
particle data relevant to the virtual particles emitted by the virtual
particle emitter.
3. The system of claim 2, wherein the at least one processor is further
configured to
cause the system to generate parameterized traces for each virtual particle
impact with the target
Date Recue/Date Received 2021-08-25

object by determining a trajectory for each of the virtual particles as a
function of the emitter data
and the particle data.
4. The system of claim 2, wherein the at least one processor is further
configured to
cause the system to determine the emitter data by determining spatial
coordinates and an
orientation of the virtual particle emitter as a function of time.
5. The system of claim 2, wherein the at least one processor is further
configured to
cause the system to determine the particle data by determining physical
characteristics of the
virtual particles, the physical characteristics comprising one or more of
shape, dimensions,
weight, adhesion, or elasticity.
6. The system of claim 1, wherein the at least one processor is further
configured to
cause the system to generate parameterized traces by:
selecting, based on a type of a virtual particle, one or more rules that
determine a physical
effect that the virtual particle will impart to the target object.
7. The system of claim 1, wherein the at least one processor is further
configured to
cause the system to generate, utilizing a procedural texture generation
engine, the one or more
first procedural textures for the target object and the one or more second
procedural textures for
the target object based on one or more texture generation algorithms.
8. The system of claim 1, wherein the at least one processor is further
configured to
cause the system to perform the graphical effects along the surface of the
target object according
to physical-defined or chemical-defined processes in relation to the one or
more first procedural
textures for the target object.


9. The system of claim 1, wherein the at least one processor is
further configured to
cause the system to emit the virtual particles to the target object comprising
a multi-material
body.
10. The system of claim 1, wherein the at least one processor is further
configured to
cause the system to perform the graphical effects along the surface of the
target object in
accordance with a temperature mapping associated with the one or more
parameterized traces of
the virtual particles emitted onto the target object.
11. A method comprising:
emitting, by a virtual particle emitter, virtual particles onto a target
object;
identifying object data comprising one or more first procedural textures for the target object;
generating, for each virtual particle impact with the target object, a
parameterized trace
along a surface of the target object;
performing graphical effects along the surface of the target object based on
the object
data and one or more parameterized traces, wherein performing graphical
effects comprises one
or more of:
combining superimposed crack traces comprising a point of a common path such
that a first crack trace stops upon merging into a second crack trace that
continues along
the surface of the target object;
combining superimposed liquid traces by merging together two or more liquid
traces when they are within a threshold distance of each other; or
avoiding a combination of impact traces;
generating one or more second procedural textures for the target object based
on the
object data and the graphical effects; and
integrating the one or more second procedural textures with the target object.

12. The method of claim 11, further comprising determining:
emitter data from the virtual particle emitter; and
particle data relevant to the virtual particles emitted by the virtual
particle emitter.
13. The method of claim 12, wherein generating parameterized traces for
each virtual
particle impact with the target object comprises determining a trajectory for
each of the virtual
particles as a function of the emitter data and the particle data.
14. The method of claim 12, wherein determining the emitter data comprises
determining spatial coordinates and an orientation of the virtual particle
emitter as a function of
time.
15. The method of claim 12, wherein determining the particle data comprises

determining physical characteristics of the virtual particles, the physical
characteristics
comprising one or more of shape, dimensions, weight, adhesion, or elasticity.
16. The method of claim 11, wherein generating parameterized traces
comprises
selecting, based on a type of a virtual particle, one or more rules that
determine a physical effect
that the virtual particle will impart to the target object.
17. The method of claim 11, further comprising generating, utilizing a
procedural
texture generation engine, the one or more first procedural textures for the
target object and the
one or more second procedural textures for the target object based on one or
more texture
generation algorithms.
18. The method of claim 11, wherein performing the graphical effects along
the
surface of the target object is in accordance with physical-defined or
chemical-defined processes
in relation to the one or more first procedural textures for the target
object.

19. The method of claim 11, wherein emitting the virtual particles to the
target object
comprises emitting the virtual particles to the target object comprising a
multi-material body.
20. The method of claim 11, wherein the emitter data comprises thermal
energy data,
and the method further comprises:
determining a temperature mapping associated with the one or more
parameterized traces
of the virtual particles emitted onto the target object; and
performing the graphical effects along the surface of the target object in
accordance with
the temperature mapping.


Description

Note: Descriptions are shown in the official language in which they were submitted.


SYSTEM FOR GENERATING PROCEDURAL TEXTURES USING PARTICLES
TECHNICAL FIELD OF THE INVENTION
[0001] The present invention relates to a system and a method for generating textures on an object using particles projected onto the object.
PRIOR ART
[0002] In the field of computer graphics a myriad of tools have been used for
many
years to apply colors to objects. Conventionally, a color is applied as a
layer, in the
manner of a layer of paint applied to an actual physical substrate.
[0003] The application of a layer of color typically produces a uniform
result. To
obtain variations in color, intensity or opacity, a user must manually adjust
the color
settings at each point, thereby creating an accurate and detailed color
mapping. Various
graphical tools such as virtual brushes and applicators are made available to
the user
who performs such a "mapping".
[0004] To change a previously established "mapping", the user employs the same
types
of tools in order to apply the changed parameters point by point, and thus to
generate a
modified colorimetric result. Even though the user can use a frame to select
multiple
points to be changed in a similar fashion, the process must be carried out
manually for
each image, and therefore requires a considerable amount of time.
[0005] Different filters are also known, which may be applied to one or more
colors of
an image. Conventionally, such filters act to change the colors based on
parameters that
are intrinsic to the colors themselves. These filters therefore allow effects
to be created
based on either the chosen environment or style imposed by a user or according
to the
original parameters of the colors to be processed.
Date Recue/Date Received 2020-11-10

[0006] Thus, the process of creating or modifying object colors does not
allow the
parameters or characteristics of the objects on which the color is applied,
nor the
environment in which the objects are arranged in the scene, to be taken into
account.
Thus, to create realistic effects, a user must proceed manually in order to
determine the
target points or areas, the parameters to be modified, and the level of
modification of
the selected parameters. If one or more objects from a scene or scenes are to
be
processed, considerable time may be needed to carry out the required
operations.
[0007] For example, for the coloring of an area of wooden material in order to
impart it
with a realistic wooden appearance, a user must perform the parametric
adjustments in
a meticulous and accurate manner. As the coloring tools do not take material
properties
or interactions between objects and their environment into account, a user
wishing to
create a visual effect based on a material's reaction or behavior must firstly
envision or
imagine the desired effect in a realistic manner, and then apply the color
changes in
accordance with the settings of the existing colors. Thus, if a color is
applied to an
object, its coloring impact will be the same on all areas of the object. For
example, if
the object has a metallic portion, a different wooden portion, and a plastic
area, the
applied color has the same effect on all of these areas, whereas on a real
object, the
effects produced on each of the materials will vary, or even be very
different,
depending on the circumstances.
[0008] FR2681967 discloses a method for modifying the colors of an image
displayed
on a display device based on the determination of colorimetric values. The
method
includes selecting at least one color indicative of at least one pixel in the
image
comprised of a plurality of pixels, determining the colorimetric values of
said at least
one color, selecting a second color and determining the colorimetric values of
the
second color, and modifying the colorimetric values of a plurality of pixels
in the
image so that for any given pixel of said plurality having colorimetric values
which
correspond to the colorimetric values of said at least one color, the
colorimetric values
of the given pixel are modified so that they correspond to the colorimetric
values of the
second color. The applied color is identical, whatever the nature of the
object (plastic,
wood, etc.) and does not take textures into account, but only color variations
in an area
selected by the user.
[0009] EP0884694 discloses a method for adjusting colors in digital images,
including
correcting "red eyes" in photographs. The pixel color data are adjusted by
identifying
the pixels in a digital image comprising original color data corresponding to
the
predetermined color. However, the color is applied automatically, based on
colorimetric data only, in particular the colors of the iris.
[0010] WO2008066880 discloses a method for obtaining an original set of two or
more
original colors associated with an item of artwork. For that purpose, an input
set of one
or more user-selected colors is received. For each original color, the
original color is
mapped onto the derived colors. The plurality of derived colors is obtained
based on
one or more user-selected colors.
[0011] WO2012154258 discloses a three-dimensional colorimetric coloring tool.
Each
pixel in the image comprises a set of pixel values in a three-dimensional
color space.
Even though the applied color allows a wide range of colors to be used, it
does not vary
depending on the material on which it is applied.
[0012] US7557807 discloses a computer-implemented method which comprises
generating an object having certain characteristics and emitting a particle.
The path of
the particle is checked to determine whether it will interact with the object.
In the case
of a collision between the particle and the object, the characteristics of the
object are
modified, in particular to simulate the ageing and erosion behaviors of the
object. The
described method involves implementing pointwise mapping of the object. A γ-ton map is then applied to each point.
[0013] There is therefore a need to overcome these various disadvantages.
DISCLOSURE OF THE INVENTION

[0014] An object of the invention is to provide a system and method for
improving the
efficiency and productivity of graphic design tools.
[0015] Another object is to provide a system and graphical method for
increasing the
flexibility and graphics capabilities when generating colors or renditions.
[0016] Another object of the invention is to provide a system and graphical
method for
increasing the realism of the represented items.
[0017] Yet another object of the invention is to provide a system and method
for
improving the interactivity between the rendition of a represented object and
its
environment.
[0018] Yet another object of the invention is to provide a system and method
for
creating a contextual editing mode which takes environmental parameters into
account.
[0019] To achieve this object, the invention provides various technical means.
For
example, the invention first provides a system for generating procedural
textures on an
object using the particles emitted by a particle engine, comprising:
- access to particle emitter data;
- access to emitted particle data;
- access to data relevant to target objects defined by architectural
parameters and
procedural textures;
- access to trace data;
- access to graphical effects data;
- a microprocessor and control instructions;
- an animation simulator module, adapted to perform emission and
displacement
simulation for each of the provided particles using the particle emitter data
and emitted
particle data;
- a tracer module for generating a parameterized trace which produces one or
more
physical and/or chemical changes in properties of at least the surface of said
object, so
as to modify at least one of its parameters, in particular a visible
characteristic;
- a physical parameter integrator module for:
i) performing graphical effects based on the obtained object data and trace
data;
ii) generating a new texture set for said object, taking into account the
object data and
the graphical effects previously obtained.
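By way of illustration only, the components listed above might be wired together as in the following sketch; the class and method names are invented here, and only the module and data reference numerals (4, 5, 6; 11-15) come from the text:

```python
class TextureSystem:
    """Toy wiring of the listed components (module 4: animation simulator,
    5: tracer, 6: physical parameter integrator; data accesses 11-15)."""

    def __init__(self, emitter, particles, obj, traces, effects):
        self.emitter, self.particles = emitter, particles           # data 11, 12
        self.obj, self.traces, self.effects = obj, traces, effects  # data 13-15

    def simulate(self):
        # Module 4: emission and displacement (toy: one straight path each).
        return [(p, self.emitter["direction"]) for p in self.particles]

    def trace(self, motions):
        # Module 5: one parameterized trace per simulated particle.
        return [{"particle": p, "on": self.obj["name"]} for p, _ in motions]

    def integrate(self, new_traces):
        # Module 6: graphical effects, then a new texture set for the object.
        all_traces = self.traces + new_traces
        return {"object": self.obj["name"], "textures": len(all_traces)}

    def run(self):
        return self.integrate(self.trace(self.simulate()))
```

For example, `TextureSystem({"direction": (0, 0, -1)}, ["p1", "p2"], {"name": "cube"}, [], []).run()` yields a result describing two traces integrated into the object's new texture set.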
[0020] With such an arrangement, a system can take into account a parametric
architecture in order to determine the influence of particles projected onto
objects. This
parametric architecture makes use of the physical and/or chemical elements
inherent to
the components and properties of the particles and objects. In particular, due
to the fact
that parameterized objects and their textures can be modified according to
parameterized traces based on physical and/or chemical phenomena, a scene can
be
implemented and evolve in accordance with a much greater number of parameters
than
just the colorimetric parameters conventionally accounted for, thus
dramatically
increasing the realism of the visual effects produced.
[0021] According to an advantageous embodiment, the tracer module comprises a
rule
selection module and an implementation module for applying the rule in order
to
generate the resulting trace data.
[0022] According to another advantageous embodiment, the tracer module
comprises a
trace mixer submodule for modifying a trace on which a new active particle is
brought
into interaction.
[0023] Advantageously, the system further comprises a temporal storage module
for
keeping data allowing a texture set to be generated again for an object for
which one or
more parameters are modified, or to again obtain a texture set which had
previously
been generated.
[0024] Also advantageously, the system further comprises an input module for
user
data that may affect the data from the simulation module.

[0025] According to yet another embodiment, the system further comprises
access to
data relevant to global parameters that may influence a plurality of emitted
particles
and/or at least one object in the area of influence of these global
parameters.
[0026] The invention also provides a method for generating procedural textures
on an
object using particles emitted by a particle emitter, comprising the following
steps:
- an animation simulator module receives data from at least one particle
emitter, data
relevant to particles to be emitted by the emitter, data relevant to at least
one target
object, which is defined by architectural parameters and procedural textures,
liable to
be impacted by said emitted particles, and determines a trajectory for each of
the
particles to be emitted as a function of the emitter data and particle data;
- for each particle colliding with a target object, a tracer module
generates data relevant
to at least one trace on the surface of said object based on the object data
and particle
data;
- a physical parameter integrator module performs the graphical effects based
on the
object data and trace data;
- for each object having undergone at least one particle impact, the
physical parameter
integrator module generates a new set of textures, taking into account the
object data
and the previously obtained graphical effects.
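The four method steps above can be sketched end to end as follows. The ballistic model, the plane-collision test, and the texel-darkening effect are placeholder assumptions made here to keep the sketch runnable; they are not the patent's actual algorithms:

```python
from dataclasses import dataclass

@dataclass
class Particle:
    position: tuple   # (x, y, z) at emission
    velocity: tuple   # (vx, vy, vz)

@dataclass
class Trace:
    point: tuple      # (x, y) impact point on the surface plane
    depth: float      # toy trace parameter derived from impact speed

def simulate(particles, surface_z=0.0, dt=0.1, steps=100):
    """Steps 1-2: a trajectory is determined per particle; a collision
    with the plane z = surface_z produces a parameterized trace."""
    traces = []
    for p in particles:
        (x, y, z), (vx, vy, vz) = p.position, p.velocity
        for _ in range(steps):
            x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
            vz -= 9.81 * dt                       # gravity
            if z <= surface_z:
                traces.append(Trace((x, y), abs(vz) * 0.01))
                break
    return traces

def integrate(texture, traces):
    """Steps 3-4: perform a graphical effect (darkening) per trace and
    return a new texture set, leaving the original untouched."""
    new = [row[:] for row in texture]
    for t in traces:
        i = int(t.point[0]) % len(new)
        j = int(t.point[1]) % len(new[0])
        new[i][j] = max(0.0, new[i][j] - t.depth)
    return new
```

With a 4x4 all-white grid and one particle aimed downward, `integrate` returns a new grid darkened at the impact texel while the original grid is unchanged, mirroring the generation of a new texture set rather than an in-place edit.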
[0027] In an alternative embodiment, the integrator module generates the
textures of
the new set by performing the graphical effects based on the object data and
trace data.
[0028] According to another embodiment, for each active particle, a rule
selection
module selects a rule to be applied, and a rule implementation module
evaluates said
rule according to the target object parameters in order to generate the
resulting trace
data.
[0029] According to yet another alternative embodiment, for each trace
modification
rule a particle selection module selects particles affected by the rule to be
applied, and
a rule implementation module evaluates said rule according to the particle
parameters
and the target object in order to generate the resulting trace data.

DESCRIPTION OF THE DRAWINGS
[0030] Fully detailed embodiments are given in the following description, in
conjunction with Figures 1 to 5, which are presented only for the purposes of
non-
limiting examples and in which:
- Figure 1 schematically shows an example of a texture generator system
according to
the invention;
- Figure 2 is a block diagram showing the main steps of the texture
generating method
according to the invention;
- Figure 3 is a block diagram showing in detail step 150 of Figure 2;
- Figure 4 is a block diagram showing in detail a first trace generation
mode;
- Figure 5 is a block diagram showing in detail a second trace generation
mode.
[0031] In the following description, substantially identical or similar items
will be
designated by the same reference numerals.
DETAILED DESCRIPTION OF THE INVENTION
DEFINITIONS
[0032] By "physical parameter" is meant any physical and/or chemical item,
property
or characteristic capable of being measured or observed or detected or
quantified,
which characterizes an object, a particle, an environment, an emitter, etc.
[0033] By "parametric architecture" is meant the set of parameters that define
the
physical, chemical (components, properties, visual appearance of an object,
texture,
etc.) and behavioral characteristics of an item (ink, texture, object, etc.).
[0034] By physical "particle" (or parameterized particle) is meant the
physical and/or
chemical elementary unit in its state when the projection is performed (solid,
liquid,
gaseous or a mixture of these phases) which, when projected onto an object,
generates
a parameterized trace that produces one or more physical and/or chemical
changes in
properties at least on the surface of this object, in particular textures of
that object, so
as to modify at least one of its physical parameters or characteristics, in
particular a
visible characteristic.
[0035] By "particle emitter" is meant an item, in particular a virtual item,
whether
visible or not in a scene, for projecting one or more physically parameterized
particles
onto an object, which has also been physically parameterized, such as a gun,
spray gun,
spray, nozzle, emitter, projector (for photons, or light or heating particles,
etc.) etc. A
scene may comprise one or more emitters. An emitter's parameters preferably
comprise
its position in the scene, and the orientation and angle of emission or
projection of the
particles.
[0036] By "trace" (or parameterized trace) is meant a point or path (a set of
points) on
the surface of a target object generated by the movement of one or more
parameterized
particles on the object due to an impact or collision therewith.
[0037] By "graphical effect" is meant a description of the physical and/or
chemical
process, which determines how one or more traces generated on a target object
affect
this object's texture. By way of illustration, some examples of graphical
effects are as
follows:
- a trace of liquid applied to bare wood is absorbed by the wood.
Alternatively, this
causes the wood's hue to darken;
- a liquid applied to a varnish or plastic is not absorbed at all and
produces a "bead of
liquid" effect on the surface of the material;
- heat applied to a painted material causes the paint to peel and then burn
depending on
the temperature set by the user, and optionally cause combustion of the
material on
which the paint is applied if the latter is combustible;
- application of an acid to, or sandblasting of a glossy plastic material
will gradually
roughen it, thus making it less glossy and increasingly rougher.
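These four examples could be encoded as a simple lookup over (trace type, material) pairs. The sketch below is illustrative only: the string labels and the 300 °C threshold are invented here (the text says the temperature is set by the user):

```python
def graphical_effect(trace_type, material, temperature=20.0):
    """Toy rule table paraphrasing the four example graphical effects."""
    if trace_type == "liquid" and material == "bare_wood":
        return "absorbed; hue darkens"
    if trace_type == "liquid" and material in ("varnish", "plastic"):
        return "not absorbed; bead of liquid on the surface"
    if trace_type == "heat" and material == "painted":
        # Threshold is an assumption; the user sets the temperature.
        return "paint peels, then burns" if temperature > 300.0 else "paint peels"
    if trace_type in ("acid", "sandblast") and material == "glossy_plastic":
        return "surface roughens; gloss decreases"
    return "no effect"
```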

[0038] By "procedural texture" is meant a texture defined using algorithms
and/or
mathematically, and displayed by a rendering engine which transforms the
mathematical data into a conventional image format such as bitmap, for
example.
[0039] Through the method and system described in the following, the various
stages
of an evolutionary process can be determined and presented.
[0040] Figure 1 illustrates an exemplary system for generating procedural
textures
according to the invention. The system comprises at least one microprocessor 2
suitable for the implementation of instructions in an instruction memory 3. A
plurality
of modules are advantageously provided through the implementation of the
instructions
by the microprocessor. An animation simulator module 4 allows the data related
to the
movement of various items in the scene to be obtained. This animation data
further
includes the spatial coordinates over time, events such as collisions,
deactivations, etc.,
for each of the items.
[0041] A tracer module 5 can determine the particle motion data on a target
object after
the particle has collided with the object. A physical parameter integrator 6
makes it
possible, using the physical parameters of interest, to generate a new set of
textures for
the object subjected to the various items and parameters.
[0042] A mixer module 7, which is optional, allows the data of several
superimposed
traces to be taken into account, when several traces include points or areas
on common
paths. The mixer can define, depending on the influence of each trace, a mixed
or
global trace portion. The latter will be used by the integrator in the
relevant area. To
illustrate the function of the trace mixer, the following non-limiting
examples are
provided:
- liquid traces which flow "down" and merge when they become too close;
- "crack" traces, which stop with only the widest remaining when they merge
(cracks
do not generally "intersect");
- impact traces that are completely independent of one another and do not
"mix
together".
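A pairwise mixer implementing these three rules might look like the following sketch; the dictionary keys and the distance threshold are assumptions made here, not details from the text:

```python
def _distance(a, b):
    (ax, ay), (bx, by) = a["end"], b["end"]
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def mix_traces(a, b, threshold=0.5):
    """Combine two traces per the examples: close liquid traces merge,
    merging cracks keep only the widest, impact traces never mix."""
    if a["kind"] != b["kind"]:
        return [a, b]
    if a["kind"] == "liquid" and _distance(a, b) < threshold:
        merged = dict(a)
        merged["width"] = a["width"] + b["width"]   # two flows become one
        return [merged]
    if a["kind"] == "crack" and _distance(a, b) < threshold:
        # Cracks do not intersect: only the widest continues.
        return [max(a, b, key=lambda t: t["width"])]
    return [a, b]                                   # impacts stay independent
```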
[0043] A user input section 8 can receive data from an external source, in
particular from a user wishing to interact with the physical phenomena in
progress or to
come.
[0044] An optional backup module 9 allows temporal data to be kept, which are
related
to a time scale. For example, this module allows an animation simulation to be
rerun
after changing one or more parameters, by performing only those operations
required
by the changed data. It is thus possible to simply and quickly carry out
several
consecutive simulations based on a previous simulation, or to find a
previously
performed simulation.
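One way to realize "performing only those operations required by the changed data" is to memoize each simulation stage on its input parameters. The sketch below uses `functools.lru_cache` as a crude stand-in for the backup module 9; the stage names and parameter tuples are invented for illustration:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def simulate_stage(stage, params):
    """Pretend-expensive stage keyed by a hashable parameter tuple."""
    return (stage, sum(params))

def rerun(params_by_stage):
    """Rerunning recomputes only stages whose parameters changed;
    unchanged stages are served from the cache."""
    return [simulate_stage(s, p) for s, p in sorted(params_by_stage.items())]
```

After an initial run, changing one stage's parameters and calling `rerun` again recomputes only that stage, which is the incremental behavior the paragraph describes.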
[0045] A bus 10 is provided for data transfers among the various modules and
memory
elements described below.
[0046] An emitter memory element 11 contains data relevant to at least one
particle
emitter or engine. This data comprises, for example, the spatial coordinates
and
orientation of the emitter as a function of time, particle emission cone,
emission rate,
velocity and/or strength of emission, etc.
[0047] The emitted particle data is contained in a particle memory element 12.
This
data includes, for example, the physical characteristics of the particles such
as shape,
dimensions, weight, adhesion, elasticity, etc.
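The two memory elements just described could be typed as plain records; the field names below are illustrative, while the attributes themselves are the ones listed in the text:

```python
from dataclasses import dataclass

@dataclass
class EmitterData:
    """Memory element 11: emitter parameters."""
    position: tuple        # spatial coordinates (sampled over time)
    orientation: tuple
    cone_angle: float      # particle emission cone, in degrees
    rate: float            # emission rate, particles per second
    velocity: float        # emission speed / strength

@dataclass
class ParticleData:
    """Memory element 12: physical characteristics of emitted particles."""
    shape: str
    dimensions: tuple
    weight: float
    adhesion: float
    elasticity: float
```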
[0048] A target object data item 13 stores the data relevant to target objects
that may be
subject to impact during an animation simulation. This data includes, for
example, the
physical characteristics of the target objects such as shapes, dimensions,
weights, and
various characteristics related to the surface and textures of the objects.
[0049] A trace data element 14 stores the trace data generated by the
particles on a
given target object. The data may include a plurality of parameters such as
width, depth
and profile as a function of position along the trace, roughness, porosity,
etc. Generally,
any parameter that may influence the texture characteristics of the object in
question
can be taken into account. Indices can be assigned to each of the parameters
in order to
weight their relative significance levels.
[0050] A graphical effects data item 15 stores the data relevant to the
graphical effects
implemented in animation simulations. These graphical effects may include
parameters
for coloring, intensity, brightness, particle size, etc.
[0051] The above-described memory elements and/or the various modules can be
combined into one or more components and one or more modules without
significantly
affecting the system's operation.
[0052] An optional item of global parameters 16 includes parameters that may
affect
several items in the scene, such as data related to temperature, pressure,
humidity,
physical force (magnetic, gravitational or the like), etc.
[0053] A target object texture data item 17 stores the data relevant to the
new textures
of target objects that may be subject to impact during an animation
simulation. Any
available original texture data can also be contained in this memory element
17.
[0054] Figure 2 shows a flowchart of the main steps of the method for generating textures according to the invention. At step 110, the system and useful data are initialized. At step 120, which is optional, any available user data can be received to adjust or correct the data or parameters to be processed by the animation simulator module 4, depending on a particular user need or desire.
[0055] Step 130, in the animation simulator module 4, includes receiving data related to the particles or emitters, the object or objects, and any environmental data. The animation simulator module 4 therefore receives all of the parameters which allow it to perform an animation of the scene. This animation simulation comprises a step of calculating the trajectories of the items liable to move in the scene, such as the particles and any objects and/or emitters. Steps 141 to 148 show the different steps in this trajectory calculation phase in greater detail. After phase 140, phase 150 provides for the physical parameter integration and the generation and/or adaptation of a new set of textures for the object or objects affected by the events occurring in the scene. This phase is shown in Figure 3 in more detail.
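The overall sequence of steps 110 to 150 can be sketched as a toy driver function. This is a heavily simplified stand-in, assuming dictionary-based scene data; the phase bodies are placeholders, not the actual trajectory or texturing computations:

```python
def simulate(scene, user_data=None):
    """Toy skeleton of the Figure 2 flow; step numbers follow the description."""
    state = dict(scene)                         # step 110: initialize working data
    if user_data:                               # step 120: optional user adjustments
        state.update(user_data)
    particles = state.get("particles", [])      # step 130: gather simulation inputs
    # phase 140 (stand-in): trajectory calculation reduced to a collision flag
    traces = [{"particle": p["id"]} for p in particles if p.get("collides")]
    # phase 150 (stand-in): integration and generation of a new set of textures
    state["textures"] = [f"texture-for-{t['particle']}" for t in traces]
    return state

result = simulate({"particles": [{"id": 1, "collides": True},
                                 {"id": 2, "collides": False}]})
```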
[0056] Trajectory calculation preferably starts with a test performed in step 141, where it is checked whether the relevant particle is active or deactivated. If it is deactivated, step 145 is performed in order to update the relevant particle data. In such cases, the data related to the particle comprises a parameter related to the deactivation of said particle. If the particle is active, a second test, at step 142, checks whether the particle collides with an object. If the test produces a negative result, step 145 is performed in order to update the relevant particle data. In case of collision, a trace generation phase 143 is performed. A trace modification phase 144 may then possibly be carried out in the case where one or more traces are affected by a new collision or trace. These phases are shown in detail in Figures 4 and 5.

Step 145 next ensures that the data affected by the preceding steps or phases are updated. This is, in particular, the case for the particle and/or trace data. The calculation phase ends at step 146.
[0057] Alternatively, the test at step 141 is followed by a test 147, which checks whether the particle being processed generates a new particle or a new emitter. If this test is positive, step 148 is then performed to update the emitter data as a function of the particle generation mode. Otherwise, step 145 is performed in order to update the relevant particle data. Test 147 is also performed in the case where the collision test with an object from step 142 yields a positive result.
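The per-particle branching of steps 141 to 148 can be condensed into a single function. This is an illustrative sketch under simplifying assumptions (dictionary particles, boolean collision and spawn flags); the real phases 143 and 144 are far richer:

```python
def update(p):
    # step 145: update the particle's data (stand-in: mark it as updated)
    p["updated"] = True

def process_particle(p, traces, emitters):
    """One iteration of the trajectory phase (steps 141-148), simplified."""
    if not p.get("active", True):            # step 141: active or deactivated?
        update(p)                            # deactivated particles are only updated
        return
    if p.get("collides"):                    # step 142: collision with an object?
        traces.append({"particle": p["id"]})       # phase 143: trace generation
        # phase 144 (trace modification) would run here when traces overlap
        if p.get("spawns"):                  # test 147 after a positive collision
            emitters.append({"parent": p["id"]})   # step 148: update emitter data
    update(p)                                # step 145; the phase ends at step 146

traces, emitters = [], []
particles = [{"id": 1, "active": False},
             {"id": 2, "active": True, "collides": True, "spawns": True}]
for particle in particles:
    process_particle(particle, traces, emitters)
```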
[0058] Figure 3 shows in greater detail the most important steps of phase 150, which consists in integrating the physical parameters and generating and/or adapting the textures resulting from events taking place in the scene. At step 151, the physical parameter integrator 6 receives the applicable trace data, object data and graphical effects data.

[0059] In step 152, the integrator implements the graphical effects which correspond to the received data, based on the object data and the relevant trace data, for example flaking paint, corrosion (for metal), burning or combustion, trace interruption for porous materials, and sagging on non-absorbent materials. Step 153 checks whether one or more other traces are to be taken into account. If this is the case, the trace mixer module 7 pools the trace parameters for those areas that are shared by several traces.

[0060] Any user data is taken into account in step 155. Finally, once all of the iterations have been performed, the physical parameters are integrated by integrator 6 in order to generate and/or modify the object's textures.
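Phase 150 can be illustrated with a toy integrator that applies a per-material effect and pools overlapping traces. This is a sketch only; the pooling rule (a plain average) and all names are assumptions, not the patented trace mixer:

```python
def integrate(traces, effects, user_data=None):
    """Sketch of phase 150: effects (step 152), pooling (153-154), user data (155)."""
    applied = [{"area": t["area"],
                "effect": effects.get(t["material"], "none")}   # step 152
               for t in traces]
    by_area = {}
    for t in traces:                        # step 153: find traces sharing an area
        by_area.setdefault(t["area"], []).append(t)
    mixed = {area: sum(t["depth"] for t in ts) / len(ts)  # step 154: toy pooling rule
             for area, ts in by_area.items()}
    if user_data:                           # step 155: user adjustments
        mixed.update(user_data)
    return applied, mixed                   # results feed the texture generation

applied, mixed = integrate(
    [{"material": "wood", "area": "A", "depth": 2.0},
     {"material": "pvc", "area": "A", "depth": 4.0}],
    {"wood": "wetting", "pvc": "wetting"})
```

Here the two traces share area "A", so their depths are pooled into a single value, which is the role the text assigns to trace mixer module 7.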
[0061] As previously mentioned, phase 143 is detailed in Figures 4 and 5, in steps 200 to 250 for the case of Figure 4, and in steps 300 to 350 for the case of Figure 5.

[0062] In Figure 4, for each active particle (step 200), tracer module 5 selects a rule to be applied depending on the type of particle. The rules enable the determination of the type of influence or physical effect the relevant particle will have on the generated trace, and ultimately on the textures obtained for the object interacting with said particle. At step 230, the rule is evaluated according to the object parameters. At step 240, a trace is generated or changed according to the appropriate rule. A test at step 220 checks whether or not another rule applies.

[0063] In Figure 5, for each trace modification rule (step 300), tracer module 5 selects the particles affected by the rule. In step 330, the rule is evaluated based on the particle parameters and the object. In step 340, a trace is generated or modified according to the appropriate rule. A test in step 320 checks whether or not another particle applies.
ALTERNATIVES AND OTHER EMBODIMENTS
[0064] In the above, the system and method of the invention have been disclosed in a working environment suitable for an editing tool, for a user who intends to create or modify the rendition of one or more objects.

[0065] Alternatively, the system and method according to the invention are used in a standalone mode, for generating renditions of objects using physical parameters that are predetermined or may be calculated by the system itself, for example based on intermediate results. Such exemplary embodiments are advantageously used for video games or movies, in particular games or movies in which textures are rendered or generated by a procedural texture generation engine. Document WO2012014057 describes an example of such a rendering system and process.
[0066] The system and method according to the invention allow renditions of objects to be generated and/or modified, taking into account the technical (physical, chemical, thermodynamic, etc.) factors inherent to the objects themselves as well as to the scene's environment.

[0067] For example, to create a corrosion effect on an object, an emitter may project particles parameterized with parameters that are related to corrosion. Among these physical parameters (other than color data), the behavior of the objects with respect to the projected particles, that is to say, the interactions between the different physical items, can for example be such that materials such as plastic do not react to corrosion effects, steel develops corroded areas, copper oxidizes, etc.
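The material-dependent behavior described above amounts to a lookup from (material, particle kind) to an effect. A minimal illustrative sketch, with a hypothetical reaction table:

```python
# Hypothetical reaction table: how each material responds to corrosion particles,
# following the plastic / steel / copper example in the text.
CORROSION_RESPONSE = {
    "plastic": "no reaction",
    "steel": "corroded areas",
    "copper": "oxidation",
}

def react(material, particle_kind):
    """Return the effect a projected particle produces on a given material."""
    if particle_kind != "corrosion":
        return "no reaction"     # this toy table only covers corrosion particles
    return CORROSION_RESPONSE.get(material, "no reaction")
```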
[0068] According to the embodiments, certain parameters can be assigned either to the parameterized particles, or to the objects, or to the environment, or else to the graphical effects. The parametric distribution or architecture can also vary in order to produce comparable renditions.

[0069] In another exemplary use of the method and system according to the invention, the particles projected against the objects only have non-colorimetric parameters, such as thermal energy or heat data, pressure data, etc.
[0070] In one example where an emitter projects water, a target object with a plurality of different materials may have different reaction modes according to the materials on which the traces evolve. The traces may be different, and the graphical effects may also be different, so that the final textures take the parameters of the various materials of the target into account.

[0071] In another example, hot particles are emitted onto a multi-material body. The traces allow a kind of temperature "mapping" to be created on the surface of the object. This "mapping", to which the graphical effects are applied, makes it possible to produce final textures which take the different materials into account.
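The temperature "mapping" idea can be sketched as a two-pass computation: accumulate the hottest temperature seen at each surface point, then derive a per-material effect from that map. All thresholds and effect names below are assumptions for illustration:

```python
def temperature_map(impacts, thresholds):
    """Build a per-point temperature mapping, then derive material effects.

    impacts: list of (point, material, temperature) tuples.
    thresholds: hypothetical critical temperature per material; materials
    absent from the table (e.g. metal here) show no effect.
    """
    mapping = {}
    for point, material, temp in impacts:
        mapping[point] = max(mapping.get(point, 0.0), temp)  # hottest value wins
    effects = {}
    for point, material, temp in impacts:
        limit = thresholds.get(material)
        if limit is not None and mapping[point] >= limit:
            effects[point] = {"wood": "burnt area", "pvc": "melted"}.get(material, "none")
    return mapping, effects

mapping, effects = temperature_map(
    [("p1", "wood", 600.0), ("p2", "metal", 600.0)],
    {"wood": 300.0, "pvc": 180.0})
```

The graphical effects would then be applied to the resulting map, as the text describes, so that each material of the body reacts in its own way.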
[0072] Table 1 below illustrates examples of parameters and rules allowing the discussed examples to be implemented.

TABLE 1: Example of physical parameters

| Particles | Emitter | Object | Environment | Particle trajectory calculation | Calculation of trace on object | Graphical effect to be applied | Adapted texture |
|---|---|---|---|---|---|---|---|
| Water | Water gun | Metal + wood + PVC body | Neutral | Trajectory data in space | Metal: sagging; Wood: absorbs; PVC: sagging | Metal: corrosion; Wood and PVC: wetting effect | Object texture with applied effect |
| Burning gas | Torch | Metal + wood + PVC body | Neutral | Trajectory data in space | Mapping of temperature onto object | Metal: no effect; PVC: melts; Wood: burnt areas | Object texture with applied effect |
| Projectile | Gun | Spaceship | Neutral | Trajectory data in space | Points of impact on surface | Crater form on surface + ripples | Object texture with applied effect |
[0073] Temporal backup advantageously makes it possible to go back into a process in order to recover one of the multiple previous states. This also allows a process to be rerun by modifying only one or a few parameters, while advantageously keeping the other parameters unchanged, thus avoiding having to parameterize all of the data again. It is thus possible, for example, to easily and quickly compare results obtained by only modifying some of the parameters.
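One straightforward way to realize such a temporal backup is to keep deep snapshots of the process state, so that any earlier state can be restored and the process rerun with only a few parameters changed. A minimal sketch (class and method names are hypothetical):

```python
import copy

class TemporalBackup:
    """Keep deep snapshots of the process state so that any earlier state
    can be recovered without re-parameterizing all of the data."""
    def __init__(self):
        self._history = []

    def save(self, state):
        # deepcopy so later mutations do not alter the stored snapshot
        self._history.append(copy.deepcopy(state))

    def restore(self, index):
        # return an independent copy of the requested earlier state
        return copy.deepcopy(self._history[index])

backup = TemporalBackup()
state = {"particles": [{"id": 1, "color": "red"}]}
backup.save(state)                        # snapshot before the change
state["particles"][0]["color"] = "blue"   # modify one particle characteristic
previous = backup.restore(0)              # recover the earlier state
```

Rerunning the simulation from `previous` with a single parameter changed then allows the two results to be compared side by side, as the paragraph describes.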
[0074] For example, it is possible to change a particle characteristic (e.g. color, size, hardness, temperature, etc.) for one or more previously projected particles during the process.

[0075] The above-described figures are given by way of non-limiting example of the present invention.
[0076] The reference numerals in the claims have no limiting character. The words "comprise" and "include" do not exclude the presence of items other than those listed in the claims. The word "a" preceding an item does not exclude the presence of a plurality of such items. In addition, the above-described system and method advantageously operate in a multi-channel configuration, that is to say, by processing several textures (diffuse, normal, etc.) at each step.
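The multi-channel configuration mentioned above simply means that each processing step runs over every texture channel together. A toy sketch (channel contents reduced to integer texel lists for illustration):

```python
def process_step(channels, operation):
    """Apply one processing step to every texture channel (diffuse, normal, ...)."""
    return {name: operation(name, texels) for name, texels in channels.items()}

# Toy channels: each channel is a flat list of texel values.
channels = {"diffuse": [10, 20], "normal": [128, 128]}
# Toy per-channel operation: increment every texel by one.
brightened = process_step(channels, lambda name, texels: [t + 1 for t in texels])
```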

Administrative Status

Title Date
Forecasted Issue Date 2023-08-01
(86) PCT Filing Date 2014-07-15
(87) PCT Publication Date 2015-01-22
(85) National Entry 2016-01-05
Examination Requested 2018-09-13
(45) Issued 2023-08-01

Abandonment History

There is no abandonment history.

Maintenance Fee

Last Payment of $210.51 was received on 2023-07-07


 Upcoming maintenance fee amounts

Description Date Amount
Next Payment if small entity fee 2024-07-15 $125.00
Next Payment if standard fee 2024-07-15 $347.00


Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $400.00 2016-01-05
Maintenance Fee - Application - New Act 2 2016-07-15 $100.00 2016-04-25
Maintenance Fee - Application - New Act 3 2017-07-17 $100.00 2017-05-09
Maintenance Fee - Application - New Act 4 2018-07-16 $100.00 2018-04-24
Request for Examination $800.00 2018-09-13
Maintenance Fee - Application - New Act 5 2019-07-15 $200.00 2019-05-06
Maintenance Fee - Application - New Act 6 2020-07-15 $200.00 2020-04-27
Maintenance Fee - Application - New Act 7 2021-07-15 $204.00 2021-07-09
Maintenance Fee - Application - New Act 8 2022-07-15 $203.59 2022-07-11
Final Fee $306.00 2023-05-19
Maintenance Fee - Application - New Act 9 2023-07-17 $210.51 2023-07-07
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
ALLEGORITHMIC
Past Owners on Record
None
Documents



Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Amendment 2020-01-31 8 329
Description 2020-01-31 18 782
Claims 2020-01-31 3 108
Examiner Requisition 2020-07-17 4 191
Amendment 2020-11-10 32 1,173
Abstract 2020-11-10 1 21
Description 2020-11-10 16 633
Claims 2020-11-10 4 130
Drawings 2020-11-10 4 142
Examiner Requisition 2021-05-07 4 189
Amendment 2021-08-25 18 615
Claims 2021-08-25 5 158
Examiner Requisition 2022-02-23 3 184
Amendment 2022-05-02 11 497
Final Fee 2023-05-19 3 83
Abstract 2016-01-05 2 98
Claims 2016-01-05 3 119
Drawings 2016-01-05 4 72
Description 2016-01-05 19 781
Representative Drawing 2016-01-05 1 8
Cover Page 2016-02-24 2 46
Request for Examination 2018-09-13 2 45
Examiner Requisition 2019-08-01 5 225
International Search Report 2016-01-05 3 71
National Entry Request 2016-01-05 3 81
Representative Drawing 2023-06-29 1 11
Cover Page 2023-06-29 1 49
Electronic Grant Certificate 2023-08-01 1 2,527