Patent 3069582 Summary

(12) Patent Application: (11) CA 3069582
(54) English Title: SYSTEM AND METHOD FOR ANONYMOUS LOCATION VERIFICATION
(54) French Title: SYSTEME ET METHODE POUR VERIFIER LES EMPLACEMENTS DE FACON ANONYME
Status: Examination Requested
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04W 4/021 (2018.01)
  • H04W 12/06 (2021.01)
  • H04W 64/00 (2009.01)
(72) Inventors :
  • POURTABATABAIE, ARYA (Canada)
  • ORTIZ, EDISON U. (Canada)
  • SALTER, MARGARET INEZ (Canada)
(73) Owners :
  • ROYAL BANK OF CANADA (Canada)
(71) Applicants :
  • ROYAL BANK OF CANADA (Canada)
(74) Agent: NORTON ROSE FULBRIGHT CANADA LLP/S.E.N.C.R.L., S.R.L.
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2020-01-23
(41) Open to Public Inspection: 2020-07-23
Examination requested: 2024-01-23
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data:
Application No. Country/Territory Date
62/801,322 United States of America 2019-02-05
62/839,407 United States of America 2019-04-26
62/795,979 United States of America 2019-01-23
62/839,408 United States of America 2019-04-26
16/503,154 United States of America 2019-07-03

Abstracts

English Abstract


A computer implemented system for anonymous electronic verification of location credentials, including at least one processor and data storage, is described in various embodiments. The system includes cryptographic mechanisms and electronic communication between one or more computing systems that, in concert, provide verification of a prover's location credentials in accordance with logical conditions of a verifier's policy without providing additional information to a verifier entity.


Claims

Note: Claims are shown in the official language in which they were submitted.


WHAT IS CLAIMED IS:
1. A computer implemented system for communicating data messages between a verifier computing device and a portable client computing device, the data messages establishing authentication of at least one client location characteristic of the portable client computing device, the system comprising:
    the portable client computing device including at least a client computing device processor and data storage, the data storage storing one or more token data objects received from or computed jointly in a multiparty protocol with an issuer computing device, the one or more token data objects generated using at least an issuer computing device private issuance key, the one or more token data objects each including one or more signed data elements representing at least one of the at least one client location characteristic of the portable client computing device; and
    the client computing device processor configured to:
        receive a verification request data message from the verifier computing device, the verification request data message comprising a request for confirmation that the portable client device is associated with a particular geographic location; and
        using a combination of the one or more token data objects and the verification request data message, generate one or more proof data messages without sending any data messages or requests to the issuer computing device.
2. The computer implemented system as claimed in claim 1, wherein to determine if the portable client device is associated with the particular geographic location, the client computer device processor is configured to:
    determine if at least one client location characteristic represents a geographic location that is within at least one of:
        a radius of the particular geographic location;
        a polygon shape defining an area representing the particular geographic location; or
        a union of convex polygon shapes defining an area representing the particular geographic location.

3. The computer implemented system as claimed in claim 1, wherein to determine if the portable client device is associated with the particular geographic location, the client computer device processor is configured to:
    convert a geographic point associated with the portable client device into a planar approximation;
    determine a commitment value of a geometric function of the planar approximation; and
    determine and generate a proof that a result of the geometric function applied to the planar approximation is within a parameter.
4. The computer implemented system as claimed in claim 3, wherein the client computer processor is further configured to:
    generate a proof that a commitment is well-formed;
    send the commitment values and proofs to a verifier device for verification of the location; and
    receive confirmation from the verification device that the location has been verified.
5. The computer implemented system as claimed in claim 1, wherein the particular geographic location is a circle having an origin point with a radius defining an area, and wherein to determine if the portable client device is associated with the particular geographic location, the client computer device processor is configured to:
    convert a geographic point associated with the origin into a planar approximation;
    determine a commitment value of a distance squared between the planar approximation of the geometric location of the portable client device and the planar approximation of the origin; and
    generate a proof that the distance squared is less than the radius squared.
6. The computer implemented system as claimed in claim 1, wherein the particular geographic location is a polygon shape having vertices defining an area, and wherein to determine if the portable client device is associated with the particular geographic location, the client computer device processor is configured to:
    for each endpoint of a vertex of the polygon, convert a geographic point associated with that endpoint into a planar approximation;
    for each triangle defined by A_i P A_(i+1), determine a commitment value of a signed area of that triangle, where A_i and A_(i+1) are the planar approximations of endpoints of a vertex of the polygon shape, P is the planar approximation of the particular geographic location associated with the portable client device, and i is an index value representing a vertex of the polygon; and
    generate a proof that the signed area of each triangle defined by A_i P A_(i+1) is a positive value.
7. The computer implemented system as claimed in claim 1, wherein the particular geographic location is a union of shapes defining an area, and wherein to determine if the portable client device is associated with the particular geographic location, the client computer device processor is configured to:
    convert the particular geographic location into planar approximations;
    generate an honest proof for at least one shape that contains the particular geographic location; and
    generate a simulated proof for all other shapes in the union of shapes.
8. The computer implemented system of claim 7, wherein to generate a simulated proof of a shape comprising an origin point and a radius defining an area, the client computer device processor is configured to:
    generate commitments to planar approximations dx and dy of the particular geographic location;
    generate a first simulated proof portion that dx = r·cos(φ_0)·dθ and dy = r·dφ;
    generate commitments to squared planar approximations dx² and dy² of the particular geographic location;
    generate a second simulated proof portion that dx² and dy² were correctly computed;
    generate a commitment to δ = r² − dx² − dy²; and
    generate a third simulated proof that a decomposition of δ is a positive value.

9. The computer implemented system of claim 7, wherein to generate a simulated proof of a polygon shape having vertices defining an area, the client computer device processor is configured to:
    generate commitments to planar approximations dx and dy of the particular geographic location;
    generate a first simulated proof portion that dx = r·cos(φ_0)·dθ and dy = r·dφ;
    generate commitments to signed values S(A_i P A_(i+1));
    generate a second simulated proof portion that the signed values were formed correctly; and
    generate simulated proof portions that all the signed values S(A_i P A_(i+1)) are positive values.
10. The computer implemented system of claim 1, wherein the one or more token data objects are pre-loaded into the data storage such that the generation of the proof can be conducted at a time temporally separated from when the one or more token data objects were generated or preloaded.
11. The computer implemented system of claim 1, wherein the one or more proof data messages are generated such that the one or more proof data messages can be validated using an issuer computing device public issuance key corresponding to the issuer computing device private issuance key.
12. A computer implemented method for communicating data messages between a verifier computing device and a portable client computing device, the data messages establishing authentication of at least one client location characteristic of the portable client computing device, the method comprising:
    storing one or more token data objects received from an issuer computing device, the one or more token data objects generated using at least an issuer computing device private issuance key, the one or more token data objects each including one or more signed data elements representing at least one of the at least one client location characteristic of the portable client computing device;
    receiving a verification request data message from the verifier computing device, the verification request data message comprising a request for confirmation that the portable client device is associated with a particular geographic location; and
    using a combination of the one or more token data objects and the verification request data message, generating one or more proof data messages without sending any data messages or requests to the issuer computing device.
13. The computer implemented method as claimed in claim 12, wherein determining if the portable client device is associated with the particular geographic location comprises:
    determining if at least one client location characteristic represents a geographic location that is within at least one of:
        a radius of the particular geographic location;
        a polygon shape defining an area representing the particular geographic location; or
        a union of convex polygon shapes defining an area representing the particular geographic location.
14. The computer implemented method as claimed in claim 12, wherein determining if the portable client device is associated with the particular geographic location comprises:
    converting a geographic point associated with the portable client device into a planar approximation;
    determining a commitment value of a geometric function of the planar approximation; and
    determining that a result of the geometric function applied to the planar approximation is within a parameter.
15. The computer implemented method as claimed in claim 12, wherein the particular geographic location is a circle having an origin point with a radius defining an area, and wherein determining if the portable client device is associated with the particular geographic location comprises:
    converting a geographic point associated with the origin into a planar approximation;
    determining a commitment value of a distance between squares of the planar approximation of the geometric location of the portable client device and the planar approximation of the origin; and
    confirming that the distance is less than the radius.
16. The computer implemented method as claimed in claim 12, wherein the particular geographic location is a polygon shape having vertices defining an area, and wherein determining if the portable client device is associated with the particular geographic location comprises:
    for each endpoint of a vertex of the polygon, converting a geographic point associated with that endpoint into a planar approximation;
    for each triangle defined by A_i P A_(i+1), determining a commitment value of a signed area of that triangle, where A_i and A_(i+1) are the planar approximations of endpoints of a vertex of the polygon shape, P is the planar approximation of the particular geographic location associated with the portable client device, and i is an index value representing a vertex of the polygon; and
    confirming that the signed area of each triangle defined by A_i P A_(i+1) is a positive value.
17. The computer implemented method as claimed in claim 12, wherein the particular geographic location is a union of convex polygon shapes defining an area, and wherein determining if the portable client device is associated with the particular geographic location comprises:
    converting the particular geographic location into planar approximations;
    generating an honest proof for at least one shape that contains the particular geographic location; and
    generating a simulated proof for all other shapes in the union of shapes.
18. The computer implemented method of claim 17, wherein generating a simulated proof of a shape comprising an origin point and a radius defining an area comprises:
    generating commitments to planar approximations dx and dy of the particular geographic location;
    generating a first simulated proof portion that dx = r·cos(φ_0)·dθ and dy = r·dφ;
    generating commitments to squared planar approximations dx² and dy² of the particular geographic location;
    generating a second simulated proof portion that dx² and dy² were correctly computed;
    generating a commitment to δ = r² − dx² − dy²; and
    generating a third simulated proof that a decomposition of δ is a positive value.
19. The computer implemented method of claim 17, wherein generating a simulated proof of a polygon shape having vertices defining an area comprises:
    generating commitments to planar approximations dx and dy of the particular geographic location;
    generating a first simulated proof portion that dx = r·cos(φ_0)·dθ and dy = r·dφ;
    generating commitments to signed values S(A_i P A_(i+1));
    generating a second simulated proof portion that the signed values were formed correctly; and
    generating simulated proof portions that all the signed values S(A_i P A_(i+1)) are positive values.
20. The computer implemented method of claim 12, wherein the one or more token data objects are pre-loaded into the data storage such that the generation of the proof can be conducted at a time temporally separated from when the one or more token data objects were generated or preloaded.
21. The computer implemented method of claim 12, wherein the one or more proof data messages are generated such that the one or more proof data messages can be validated using an issuer computing device public issuance key corresponding to the issuer computing device private issuance key.


Description

Note: Descriptions are shown in the official language in which they were submitted.


SYSTEM AND METHOD FOR ANONYMOUS LOCATION VERIFICATION
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation-in-part of, and claims all benefit including priority to, US Application No. 16/503154 dated 03-Jul-2019, which is a non-provisional of, and claims all benefit including priority to, US Application No. 62/693680 dated 03-Jul-2018, US Application No. 62/702684 dated 24-Jul-2018, and US Application No. 62/839408 dated 26-Apr-2019, all entitled SYSTEM AND METHOD FOR AN ELECTRONIC IDENTITY BROKERAGE, all incorporated herein in their entirety by reference.
[0002] This application is a non-provisional of, and claims all benefit including priority to, US Application No. 62/795979, dated 23-Jan-2019, US Application No. 62/801322, dated 5-Feb-2019, and US Application No. 62/839407, dated 26-Apr-2019, all entitled SYSTEM AND METHOD FOR ANONYMOUS LOCATION VERIFICATION; and to US Application No. 62/839408, dated 26-Apr-2019, entitled SYSTEM AND METHOD FOR AN ELECTRONIC IDENTITY BROKERAGE, all incorporated herein in their entirety by reference.
FIELD
[0003] Embodiments of the present disclosure generally relate to the field of
electronic
verification, and more specifically, embodiments relate to devices, systems
and methods for
electronic verification of anonymous location credentials.
INTRODUCTION
[0004] The verification of characteristics of an entity is a useful tool in
the context of decision
making, for example, in relation to access provisioning, goods and service
provisioning, among
others. The anonymous verification of an entity may also be useful and
desirable.
[0005] However, an individual providing credentials for verification may wish to restrict the amount of information being provided to the counterparty, particularly in regard to the location of the individual. The credentials being provided may also, to increase trust, benefit from verification through association with a third-party verifier (e.g., indicating that the individual is who they purport to be, and located where they purport to be).

[0006] Credential assertion with restricted information has been difficult to
implement in practice
as it is technically challenging to generate and provide sufficient trusted
credentials, especially
on a high-volume, scalable system adapted to serve a large number of users on
an on-demand
basis.
SUMMARY
[0007] Embodiments described herein are directed to computer systems and devices that provide a cryptographic platform for generating and transmitting messages that are adapted to assert attributes about various objects (e.g., user profiles, including user locations) without indicating any more than is actually required (e.g., anonymously), and to corresponding methods and computer readable media storing machine-interpretable instruction sets for performing the methods.
[0008] The computer systems and devices, in accordance with some embodiments,
are
adapted to a high-volume, scalable system, which dynamically responds to data
credential
requests of one or more users or one or more computer systems requesting
identity / credential
proofs (including anonymous location of users).
[0009] In some embodiments, the assertions are conducted using mobile
endpoints (e.g., user
devices) which may have limited computational performance and resources, and
accordingly,
an improved cryptographic approach and system is proposed that enables the
assertion
functionality through the passing of cryptographically generated messages
between devices. An
improvement associated with the proposed cryptographic approach of some
embodiments is
that it is able to operate in a secure and scalable way, even on limited
computational resources
(e.g., those available on an unenhanced smartphone).
[0010] In some embodiments, there is provided a system for anonymous location
verification.
The system comprises at least one processor and a memory comprising
instructions which
when executed by the processor cause the processor to determine if an address
is within at
least one of a radius of a location, a polygon shape defining an area, or a
union of convex
polygon shapes.
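By way of illustration, the sketch below shows the plain geometric tests that underlie such a determination: an equirectangular planar approximation of latitude and longitude, a distance-squared test against a radius, and a signed-area test against the vertices of a convex polygon. This is a minimal Python sketch under stated assumptions; the helper names, the Earth-radius constant and the orientation convention are illustrative choices made here, and in the described embodiments the corresponding quantities are committed to and proven about in zero knowledge rather than computed on plaintext coordinates.

import math

EARTH_RADIUS_M = 6371000.0  # assumed mean Earth radius in metres (illustrative constant)

def planar_approximation(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Equirectangular approximation: dx = r*cos(phi_0)*dtheta, dy = r*dphi."""
    phi_0 = math.radians(origin_lat_deg)
    dtheta = math.radians(lon_deg - origin_lon_deg)
    dphi = math.radians(lat_deg - origin_lat_deg)
    return EARTH_RADIUS_M * math.cos(phi_0) * dtheta, EARTH_RADIUS_M * dphi

def within_radius(point, origin, radius_m):
    """Distance-squared test against radius-squared, avoiding a square root."""
    dx, dy = planar_approximation(point[0], point[1], origin[0], origin[1])
    return dx * dx + dy * dy <= radius_m * radius_m

def signed_area(a, p, b):
    """Twice the signed area of the triangle (a, p, b) in the plane."""
    return (p[0] - a[0]) * (b[1] - a[1]) - (p[1] - a[1]) * (b[0] - a[0])

def within_convex_polygon(point, vertices_latlon):
    """The point lies inside a convex polygon when the signed areas of the triangles
    A_i P A_(i+1) all share one sign (a fixed vertex orientation makes that sign positive)."""
    p = (0.0, 0.0)  # the point itself is used as the planar origin
    verts = [planar_approximation(lat, lon, point[0], point[1]) for lat, lon in vertices_latlon]
    areas = [signed_area(verts[i], p, verts[(i + 1) % len(verts)]) for i in range(len(verts))]
    return all(a > 0 for a in areas) or all(a < 0 for a in areas)

def within_union(point, circles, polygons):
    """Union of shapes: membership in any one circle or convex polygon suffices."""
    return any(within_radius(point, origin, r) for origin, r in circles) or \
           any(within_convex_polygon(point, poly) for poly in polygons)

The squared distance dx² + dy² and the signed triangle areas computed here are the same quantities that the claims commit to and prove statements about in zero knowledge.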
[0011] In some embodiments, there is provided a computer implemented system for communicating data messages between a verifier computing device and a portable client computing device. The data messages establish authentication of at least one client location characteristic of the portable client computing device. The system comprises the portable client computing device including at least a client computing device processor and data storage. The data storage stores one or more token data objects received from or computed jointly in a multiparty protocol with an issuer computing device. The one or more token data objects are generated using at least an issuer computing device private issuance key, and each includes one or more signed data elements representing at least one of the at least one client location characteristic of the portable client computing device. The client computing device processor is configured to receive a verification request data message from the verifier computing device and, using a combination of the one or more token data objects and the verification request data message, generate one or more proof data messages without sending any data messages or requests to the issuer computing device. The verification request data message comprises a request for confirmation that the portable client device is associated with a particular geographic location.
[0012] In some embodiments, there is provided a computer implemented method for communicating data messages between a verifier computing device and a portable client computing device. The data messages establish authentication of at least one client location characteristic of the portable client computing device. The method comprises storing one or more token data objects received from an issuer computing device, receiving a verification request data message from the verifier computing device, and generating one or more proof data messages using a combination of the one or more token data objects and the verification request data message, without sending any data messages or requests to the issuer computing device. The one or more token data objects are generated using at least an issuer computing device private issuance key. The one or more token data objects each include one or more signed data elements representing at least one of the at least one client location characteristic of the portable client computing device. The verification request data message comprises a request for confirmation that the portable client device is associated with a particular geographic location.
[0013] Corresponding computer implemented methods and computer readable media
are
contemplated.
DESCRIPTION OF THE FIGURES
[0014] In the figures, embodiments are illustrated by way of example. It is to
be expressly
understood that the description and figures are only for the purpose of
illustration and as an aid
to understanding.

[0015] Embodiments will now be described, by way of example only, with
reference to the
attached figures, wherein in the figures:
[0016] FIG. 1 is a pictorial rendering of an example scenario, according to
some embodiments.
[0017] FIG. 2 is a graphical representation of parties to a verification
event, according to some
embodiments.
[0018] FIG. 3 is an example system for conducting credential verification,
according to some
embodiments. The system aspects may include logical components, physical
components, or a
combination of logical and physical components, in accordance with various
embodiments.
[0019] FIG. 4 is an example OAuth based method, according to some
embodiments.
[0020] FIG. 5A is an example method diagram where a secure enclave master
verifier is
utilized, according to some embodiments.
[0021] FIG. 5B is a state diagram of a verify oracle, according to some
embodiments.
[0022] FIG. 6A is a system diagram providing additional detail in the context
of a verifier hosted
enclave, according to some embodiments.
[0023] FIG. 6B is a system diagram providing a simplified variation of the
system shown in FIG.
6A, according to some embodiments.
[0024] FIG. 7 is a method diagram providing an example issuer sequence where
the prover
computing system has a corresponding key pair, according to some embodiments.
As described
in later figures, the prover key is optional, but in some cases, the prover
key pair helps prevent
sharing or can be utilized to reduce an amount of data required to be held
secret. The use of a
key pair for the prover may be instrumental in preventing credential
subletting, an abuse of the
system whereby the prover shares some of their credentials with another for
attribute
impersonation.
[0025] FIG. 8 is a method diagram providing an example verification sequence,
where the
prover computing system has a corresponding key pair, according to some
embodiments.
[0026] FIG. 9 is a method diagram providing an example issuer sequence where
the prover
computing system does not have a corresponding key pair, according to some
embodiments.
[0027] FIG. 10 is a method diagram providing an example verification sequence,
where the
prover computing system does not have a corresponding key pair, according to
some
embodiments.

[0028] FIG. 11 is a system diagram providing an example verification system
having a third
party hosted enclave including a transcript, according to some embodiments.
[0029] FIG. 12 is an example C-based proof request description language,
according to some
embodiments.
[0030] FIG. 13 is an example method diagram for an anonymous credentials
approach,
according to some embodiments.
[0031] FIG. 14 illustrates, in a component diagram, an example of a module
breakdown of an
identity brokerage, in accordance with some embodiments.
[0032] FIG. 15A illustrates, in a flowchart, an example of a method of
determining that a
geographic point is within a geographic area, in accordance with some
embodiments.
[0033] FIG. 15B illustrates, in a flowchart, an example of a method of proving
that a first
geographic point is within a radius of another geographic point, in accordance
with some
embodiments.
[0034] FIG. 15C illustrates, in a flowchart, an example of a method of proving
that a first
geographic point is within a polygon shape having vertices defining an area,
in accordance with
some embodiments.
[0035] FIG. 15D illustrates, in a flowchart, an example of a method of
determining that a
location point is within a union of convex polygon shapes defining an area, in
accordance with
some embodiments.
[0036] FIG. 15E illustrates, in a flowchart, a method of simulating a proof
that a particular
geographic location is within an area defined by an origin and a radius, in
accordance with some
embodiments.
[0037] FIG. 15F illustrates, in a flowchart, a method of simulating a proof
that a particular
geographic location is within an area defined by a polygon shape having
vertices, in
accordance with some embodiments.
[0038] FIG. 16 is a screenshot of an example credential manager, according to
some
embodiments. Depending on the computational power of the mobile device, an
application
residing on a mobile device, such as the credential manager, is configured to
perform one or
more actions directed to the prover role in the above embodiments.
[0039] FIG. 17 is a screenshot of an example credential manager having an
expanded portion
to view additional details of the credential, according to some embodiments.

[0040] FIG. 18 is a screenshot of an example credential manager showing an
expanded
credential view of a single credential, according to some embodiments.
[0041] FIG. 19 is a screenshot of an example page requiring verification,
according to some
embodiments.
[0042] FIG. 20 is a screenshot of an example proof request interface page,
according to some
embodiments.
[0043] FIG. 21 is a screenshot of an example proof input interface page,
according to some
embodiments.
[0044] FIG. 22 is a topology of an example computing device, according to some
embodiments.
[0045] FIG. 23 is a server for inclusion in a data center, according to some
embodiments.
[0046] FIG. 24 is a system diagram for an example verification system,
according to some
embodiments.
[0047] FIG. 25 is a system diagram depicting a method for registration,
according to some
embodiments.
[0048] FIG. 26 is a system diagram depicting a method for verification,
according to some
embodiments.
[0049] FIG. 27 is a system diagram depicting an example age verifier device,
according to
some embodiments.
DETAILED DESCRIPTION
[0050] Embodiments described herein are directed to computer systems and devices that provide a cryptographic platform for generating and transmitting messages that are adapted to assert attributes about various objects (e.g., user profiles, including user addresses) without indicating any more than is actually required (e.g., anonymously), and to corresponding methods and computer readable media storing machine-interpretable instruction sets for performing the methods.
[0051] Credential verification, when conducted manually, is a tedious process
prone to
falsification and also over-provisioning of information. In an example, Alice
is a law-abiding 26
year old, and she would like to attend a community event that is free to
members of the public
who live in a certain area. Before allowing access to the community event to
Alice, Bob wants to
make sure that Alice lives in the prescribed area.

[0052] Alice thinks the conditions are fair, and they both know presenting her
ID card would
prove that she does satisfy them. She could provide her driver's license,
which shows her
name, address and date of birth. She would like to not disclose anything to
him other than the
fact that she satisfies the conditions. However, by providing her driver's
license, Bob ends up
knowing more than he needs to know (e.g., age, address and specific date of
birth as opposed
to the fact that she lives in the community). Further, aside from visual inspection
inspect of the license, Bob
has practical difficulties in verifying that the driver's license is not a
fake driver's license.
[0053] Accordingly, a challenge involves providing a robust credential
verification whereby Alice
is able to prove to Bob that she does live in the community, while revealing
nothing other than
the fact to him. As an example, consider a policy of living within a radius of
a location, such as a
community centre. That is all Bob needs to know. He does not and should not
know that Alice is
in fact 26, nor her exact address.
[0054] The system is configured to adduce stripped down credentials to meet
Bob's customer
policy without exposing additional information. In particular, cryptographic
techniques are
utilized that undertake specific steps and computational approaches to provide
a secure, yet
computationally efficient mechanism for proof generation.
[0055] Accordingly, an issuer device issues one or more signed token data
objects, which are
stored on a client's device for later usage. Upon encountering a situation
where verification is
required, the client's device is configured to dynamically generate proof data
messages which
are then provided to the verifier's computing device (e.g., the verifier's
smart phone, a point of
sale device, an access control system, a mantrap gate). The verifier is able
to conduct a
verification check using the proof data message to see only that the
conditions required in the
original verification check message without providing the actual underlying
characteristics. As
the proof data messages are generated using the token data objects, the
verifier is able to
validate that such proof data message is associated with a trusted verifier.
[0056] There are two different types of proofs that are proposed in some
embodiments, these
being exact match proofs (non-zeroness protocol; e.g., this person either
matches someone on
a whitelist or doesn't match anyone on a blacklist), and conditional proofs
(e.g., based on an
inequality condition being matched, such as "over 18 years old?" or "live
within a geographic
area?").
[0057] As described in various embodiments herein, improved cryptographic
protocols are
proposed that, relative to prior approaches, reduce an overall cryptographic
complexity without

a significant reduction in security. Accordingly, the proofs can be generated
more quickly, which
improves convenience, especially where a system is being established for mass
adoption and
client device characteristics are highly variable across the users (e.g., some
users may be using
devices with extremely limited capabilities).
[0058] An enhanced solution is described herein that is adapted for protecting
a client's
personal information and only providing what is needed by leveraging a
client's special space
using a secure enclave and a blockchain solution, in accordance with some
embodiments.
[0059] A blockchain infrastructure and the secure enclave each store data sets
representing
aspects of signed attributes and, in some embodiments, a proof response logic.
The blockchain
infrastructure can include distributed ledger technologies in combination with
cascading
encryption to provide an immutable ledger. In some embodiments, the proof
requests and
responses can be conducted using intelligent connected devices such as a
mobile device, or
wearable devices (e.g., a smartwatch that is connected to a mobile device
across Bluetooth low
energy).
[0060] In an example embodiment, there are multiple authoritative issuers who
are able to
provide signed attributes (e.g., for storage in secure enclaves or on a
distributed ledger
blockchain data structure). Secure enclaves can be utilized, or other types of
hardware
protected spaces are usable.
[0061] A registration mechanism and method is utilized to initialize and
populate the attributes
using public and secret (private) encryption keys. Issuer devices create
attribute data records
that are generated using a combination of a client's public key and an
issuer's secret key (e.g.,
using digital signatures or encryption / decryption). The attributes can be
made publicly
available, for example, on a blockchain, whereby the attributes can be signed
by an issuer's
secret key but encrypted using the client's public key.
[0062] A verification mechanism and method is provided whereby a
communications channel
can be established with an authenticated verifier device, which initiates a
proof request, which
triggers a process to establish a proof response that is transmitted to the
verifier.
[0063] An example use case includes a specially configured residency verifier
terminal, which
for example, can include a graphical user interface rendering visual and coded
objects such as
a quick response code that can be scanned by a mobile device. Upon scanning
the quick
response code, the verification mechanism is invoked, and the mobile device
may share data
sets on a backend communications network such as the Internet. The proof
response can be

transferred to the verifier device based off of identifiers or information
stored on the residency verifier terminal, or encoded within the quick response code, with the residency verifier terminal returning true or false such that both a verifier, such as a cashier,
and the customer are
able to visually confirm. The proof response rendering, for example, may be
restricted to a
true/false determination (e.g., additional private information is not
disclosed or rendered).
[0064] There are computing devices that interoperate with one another in
concert with the
cryptographic platform, including devices associated with issuers, verifiers,
and clients. The
issuers are trusted entities which provide cryptographically validated
credential messages that
are issued to the client devices for storage thereon.
[0065] The cryptographically validated credential messages are then
presentable to a verifier
(e.g., a third party organization) that seeks to validate that identity or
aspects of the identity
(e.g., location such as address or residency, etc.) of the user associated
with the client device.
The cryptographically validated credential messages are configured such that
the user is able to
validate such identity or aspects without providing additional information
associated with the
user that is not requested (e.g., as opposed to presenting all the information
on a driver's
license).
[0066] The credential assertion platform is a high-volume, scalable system
which dynamically
responds to data credential requests of one or more users or one or more
computer systems
requesting identity / credential proofs.
[0067] In some embodiments, the assertions are conducted using mobile
endpoints (e.g., user
devices) which may have limited computational performance and resources, and
accordingly,
an improved cryptographic approach and system is proposed that enables the
assertion
functionality through the passing of cryptographically generated messages
between devices.
[0068] An improvement associated with the proposed cryptographic approach of
some
embodiments is that it is able to operate in a secure and scalable way, even
on limited
computational resources (e.g., those available on an unenhanced smartphone).
[0069] For example, a device with limited computational resources can include
basic
smartphones, which may be one or more generations out of date, and also have
limited
amounts of on-board memory (e.g., 1-4 GB of memory) and storage (e.g., 8-64 GB
of solid state
memory). The transfer protocols as between the client devices and the verifier
devices may also
have limited bandwidth (e.g., through near-field communications (NFC),
Bluetooth, limiting
communications to only several Mbit/s).

[0070] Prior approaches required large numbers of large messages being sent,
which made the
approaches impractical where resources were limited. The approach proposed
herein requires
fewer messages and streamlines the number of cryptographic computations
required to make
these assertions.
[0071] As described herein, an improved cryptographic mechanism and protocol
is proposed
that reduces an overall number of data messages and/or cryptographic steps
required to be
taken to generate the proof data messages. For example, the method of Belenkiy
requires 4
randomizations, 3 group multiplications and 7 group exponentiations, which
includes elliptic
curve exponentiations that are computationally expensive (e.g., involves more
than 256
operations on 512 long integers). In a proposed non-zeroness approach of some
embodiments,
a field inversion is provided, which itself is an expensive operation, but
reduces a considerable
number of group exponentiations.
[0072] The proof data messages are designed to have a "soundness" attribute
whereby a
malicious verifier is unable to find out from the proof data message more
information than what
is being provided in the proof data message (e.g., cannot find out the
underlying characteristic
values).
[0073] A computer implemented identity brokerage solution is described in
accordance with
various embodiments. The identity brokerage solution is adapted to address
problems with
identity and attribute verification, using computer implemented cryptographic
approaches to
provide a robust mechanism for conducting verifications while reducing the
provisioning of
extraneous information (e.g., information not required for the verification).
[0074] FIG. 1 is a pictorial rendering of an example scenario, according to
some embodiments.
Alice is a law-abiding 26 year old, and she is thirsting for beer. She decides
to go to Bob's
German-style beer hall to quench her thirst for beer. Bob is also currently
offering a special
discount for local residents within a geographic area close to Bob's
establishment.
[0075] Bob is not looking for a headache so before selling beer to Alice,
he wants to make sure of
two things: She is legally allowed to drink, meaning 21 years of age or more,
and that she is not
Mallory McFelon, a problem customer. Bob will also want proof that Alice lives
within the
geographic area if she is to receive the special discount offer. Alice thinks
the conditions are
fair, and they both know presenting her ID card would prove that she does
satisfy them. She
could provide her driver's license, which shows her name, address and date of
birth.

[0076] Alice also knows that Bob tends to be nosy, so she would like to not
disclose anything to
him other than the fact that she satisfies the conditions. However, by
providing her driver's
license, Bob ends up knowing more than he needs to know (e.g., age, specific
address and
specific date of birth as opposed to the fact that she is above 21 years of
age, is a resident of
the geographic area, and is not Mallory). Further, aside from visual inspection
of the license, Bob
has practical difficulties in verifying that the driver's license is not a
fake driver's license.
[0077] Accordingly, a challenge involves providing a robust credential
verification whereby Alice
is able to prove to Bob that she does satisfy Bob's customer policy, while
revealing nothing
other than the fact to him. As an example, consider a policy of "being older
than 21" or "being a
resident in a community or geographic area". That is all Bob needs to know. He
does not and
should not know that Alice is in fact 26, or her exact address.
[0078] It should be understood that there are other uses for anonymous
location verification.
For example, when at a voting booth, a voter must prove their residency. The
teachings herein
allow voters to prove they live within a geographic area without needing to
reveal their exact
address.
[0079] FIG. 2 is a graphical representation of parties to a verification
event, according to some
embodiments. The parties to a verification event include a prover (e.g., the
entity seeking to
prove the entity's characteristics and/or identity, for example, through
programmatic mobile
application client 202 having a token 204 stored thereon), a verifier 206
(e.g., the entity seeking
to verify the prover's characteristics and/or identity in accordance with a
policy), and an issuer
208 (e.g., the entity, such as a financial institution, which has a
relationship with the prover and
can attest to the prover's characteristics and/or identity, whose
representations are trusted by
the verifier 206).
[0080] In accordance with various embodiments, the prover should be able to hide as many attributes as the prover wishes while proving statements that follow from those attributes, with the verifier gaining zero knowledge of the underlying attributes: "I've lived in the same city over the last 5 years", or "I live at an address within a geographic area."
[0081] The prover's client 202 holds credentials ("tokens") that are digitally signed by the issuer 208. Example tokens are those provided by the U-Prove specifications. A U-Prove
token can
include a credential similar to a PKI certificate with cryptographic wrapping
of attributes to aid in
reducing unwanted tracking of users.

[0082] For example, a token may have various artifacts wrapped therein and may
include
information, such as issuer parameters, including issuer public key
information (e.g., coupled to an
issuer's private key) that can be used for signing or encrypting elements of
information stored
thereon to prove the veracity of such signature or to protect sensitive
information. The issuer
signature can be used by the prover or verifier to verify issuer parameters
being relied upon,
and the token itself, in some embodiments, may have one or more data fields
storing
information such as token usage restrictions, validity period, token metadata.
[0083] In some embodiments, the token is jointly created using a combination
of issuer
information and prover information. For example, there may be information
stored thereon that
is established in conjunction and hidden from the issuer, such as contact
information, encryption
key, or verifier supplied nonces, etc.
[0084] During issuance of a token, an issuer may authenticate the existence
and access /
control that the prover has over the prover's device.
[0085] Tokens include attributes that can be converted from a natural form to
a sequence of
large numbers (field elements) suitable for public key operations. These
public key operations
include anonymous credentials protocols.
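The sketch below illustrates one way such a conversion could look. The split between numeric attributes (kept as integers so that comparison proofs over them stay meaningful) and free-form strings (hashed down to a field element), as well as the choice of modulus, are assumptions made for illustration and are not prescribed by this application or by the U-Prove specifications.

import hashlib

# Illustrative modulus only (here the Curve25519 base-field prime); in practice the
# field is fixed by whatever discrete log group the parties standardize on.
FIELD_MODULUS = 2**255 - 19

def attribute_to_field_element(value) -> int:
    """Map an attribute's natural form to a field element suitable for public key operations."""
    if isinstance(value, int):
        return value % FIELD_MODULUS   # numeric values stay usable in range/comparison proofs
    digest = hashlib.sha256(str(value).encode("utf-8")).digest()
    return int.from_bytes(digest, "big") % FIELD_MODULUS

# Example: the raw attribute values of the token serialized in paragraph [0094] below.
field_elements = [attribute_to_field_element(v) for v in (897, 624770, 188)]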
[0086] Attributes are organized in a tree. An attribute can either come with a
value, in which
case it is called a leaf attribute, or bundle a number of sub-attributes, in
which case it is called a
parent attribute.
[0087] For example, consider a geographic location attribute. That would be
most naturally
divided up into a latitude sub-attribute and a longitude sub-attribute. Thus,
a credential token
can be considered as consisting of a single root attribute containing all others
as descendants.
[0088] Regardless of whether an attribute is disclosed, committed to, or
hidden, the prover may
wish to communicate metadata about it to the verifier. The most important such
property is an
attribute's name. The number "170" in an attribute would mean nothing without
the name
"height" attached. Additionally, such numeric attributes require units as
context. The number
"170" is absurd if considered in inches but plausible when in centimeters.
[0089] It is important to disclose this metadata even when attributes are
being committed to.
Consider the non-trivial example of heights and units. Consider an attraction
park that refuses to
admit people taller than 180 cm on a rollercoaster. Without the proper context
provided, a 188
cm tall person can abuse a height attribute of 74 inches and successfully prove 74 < 180, thereby putting himself and others in danger.

[0090] In some embodiments, the token can include fields that additionally
give the users an
ability to decide if they want to hide an attribute's metadata. For example,
even if hidden, an
attribute attesting to a negative syphilis test can carry social stigma.
[0091] An attribute will be serialized into one "raw attribute" (a number or
string) if the user
chooses its metadata to depend on its parent's. If not, it will be serialized
into two, the first
representing its metadata and the second representing the value.
[0092] Every attribute's metadata contain an array called "subAttributes". If
the array is empty,
the attribute is considered to be a leaf attribute. Each sub attribute has a
corresponding entry in
the array. If the sub attribute is encoded independently, the entry will be an
integer, denoting
how many raw attributes the sub attribute and all of its descendants (subtree)
together will take.
If it is encoded dependently, the subAttributes entry will be all of its
metadata.
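A minimal sketch of this encoding is shown below. The Attribute class, its field names and the use of Python dictionaries for metadata are assumptions made here for illustration; the application does not prescribe a concrete data structure or wire format. The example token at the end reproduces the raw attributes of the token described in the following paragraphs.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Attribute:
    metadata: dict                          # e.g. {"name": "height", "type": "length", "unit": "cm"}
    value: Optional[object] = None          # set for leaf attributes only
    children: List["Attribute"] = field(default_factory=list)
    dependent: bool = True                  # True: metadata is folded into the parent's metadata

def serialize(attr: Attribute) -> list:
    """Flatten an attribute tree into its ordered list of raw attributes."""
    raws = []
    if not attr.dependent:
        raws.append(build_metadata(attr))   # independently encoded: first raw attribute is the metadata
    if attr.children:                       # parent attribute: the subtree supplies the remaining raws
        for child in attr.children:
            raws.extend(serialize(child))
    else:                                   # leaf attribute: its value is one raw attribute
        raws.append(attr.value)
    return raws

def build_metadata(attr: Attribute) -> dict:
    """The subAttributes array holds a nested metadata object for dependently encoded children,
    or an integer counting how many raw attributes an independently encoded subtree takes."""
    subs = []
    for child in attr.children:
        subs.append(build_metadata(child) if child.dependent else len(serialize(child)))
    return {**attr.metadata, "subAttributes": subs}

# Token of the example below: a home address (longitude dependent, latitude independent) and a height.
token = Attribute(metadata={}, dependent=False, children=[
    Attribute({"name": "homeAddress", "type": "polarCoordinates"}, children=[
        Attribute({"name": "longitude", "type": "polarCoordinate", "unit": "mRad"}, value=897),
        Attribute({"name": "latitude", "type": "polarCoordinate", "unit": "µRad"}, value=624770,
                  dependent=False),
    ]),
    Attribute({"name": "height", "type": "length", "unit": "cm"}, value=188),
])
# serialize(token) yields [<root metadata>, 897, <latitude metadata>, 624770, 188]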
[0093] This example describes a token for an individual residing at
35.796682 N,
51.416549 E, and 188 cm tall. In radians, the coordinates are 0.624769962188 N
and
0.897388070061 E.
[0094] The token described above will serialize into the following, each bullet point representing one raw attribute:
• {subAttributes: [
      {name: "homeAddress", type: "polarCoordinates", subAttributes: [
          {name: "longitude", type: "polarCoordinate", unit: "mRad", subAttributes: []},
          2
      ]},
      {name: "height", type: "length", unit: "cm", subAttributes: []}
  ]}
• 897
• {name: "latitude", type: "polarCoordinate", unit: "µRad", subAttributes: []}
• 624770
• 188
[0095] A proof request is issued from the verifier 206 to the prover's client
202, asking the
prover to give the verifier 206 cryptographic assurance that according to some
issuer trusted by

the verifier, the prover's attributes satisfy a certain (arbitrary) policy
(e.g., older than 21, as far
as provisioning alcohol is concerned), and these proof requests typically
contain one or more
challenge messages. A proof request can include a nonce, types of conditions,
etc., and these
conditions may be encapsulated as inequalities (e.g., intUserAge > 18), or
logical statements
(e.g., intUserID not equal to 22412). One or more lookup reference data
structures may also be
passed, which can include blacklists, whitelists, values for various constants
(e.g.,
MINIMUMDRINKINGAGE).
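As a concrete illustration only, a proof request of this kind might be represented as the following structured message; the field names, nonce format and policy encoding are hypothetical and are not a format defined by this application.

proof_request = {
    "nonce": "c0-9f2a6d41",                        # verifier-supplied challenge nonce
    "trusted_issuers": ["issuer-public-key-001"],  # issuers whose signed tokens the verifier accepts
    "conditions": [                                # logical conditions to prove, not to disclose
        {"attribute": "intUserAge", "op": ">", "value": 18},
        {"attribute": "intUserID", "op": "not_in", "lookup": "blacklist"},
        {"attribute": "homeAddress", "op": "within_radius",
         "origin": {"lat_urad": 624770, "lon_mrad": 897}, "radius_m": 500},
    ],
    "constants": {"MINIMUM_DRINKING_AGE": 18},
    "lookups": {"blacklist": [22412]},
}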
[0096] A proof is provided by the prover through client 202 as a response to
the verifier 206's
request, which includes cryptographic assurance that the prover's credentials
satisfy the verifier
206's proof request, the cryptographic assurance being held as good as the issuer 208's
word. The proof is a data message that encapsulates various information (e.g.,
proof responses
directed to a sigma protocol). The data message includes sufficient
information such that the
verifier is able to receive the data message, and conduct steps to validate
and verify that such
proof responses are indeed acceptable. In processing proof responses, the
proof data message
can include aspects indicative of the identity of an issuer, and a potential
step is the validation
by the verifier that such issuer is indeed trustworthy as a source of
credential authentication.
[0097] The proof responses can be processed to generate gatekeeping control
signals, which,
for example, in an example embodiment, may be as simple as a device that
operates a lightbulb
whenever someone is validated as being of age (e.g., so that a bouncer at a
bar is able to know
that this person should be allowed in), or as complex as control mechanisms
that automatically
open doors, enable access to online accounts (e.g., on a web portal), etc.
Accordingly, the
verifier systems can include physical and electronic mechanisms which can
generate alerts,
notifications, actuation / control signals, digital or electronic signals,
among others.
[0098] Factors for assessing identity brokerage solutions include how light
the required
infrastructure is (e.g., it may be important to reduce the need for
specialized hardware,
centralized devices, or complex distributed systems that make deployment and
entry difficult), a
level of computational efficiency, a simplicity of cryptography, a level of un-
linkability between
various parties (e.g., the issuer should not be able to aggregate additional
data about the client,
even in collusion with verifiers), and a flexibility and level of minimalism
of disclosed information.
[0099] Any solution requiring the issuer to be online at verification time
risks exposing additional
information about the client to the issuer. This is especially concerning in
cases where the
issuer and the verifier collude to track client activities.

[0100] Reduced complexity is desirable as a solution may be less likely to
suffer
implementation flaws, be more easily understood, and less likely to
theoretically break due to
reliance on unusual hardness assumptions. If the computational operations used
have optimized /
low-level implementations, the solution may be able to operate using less
computing resources
and/or time.
[0101] The identity protocols, ideally, should require little time, take
little power, have few
rounds of message transmission, and pass messages having small sizes and/or
overhead. This
is especially important where the parties implement portions of the identity
brokerage solution
on mobile devices to handle one or more verification events. The mobile
devices have limited
computational, storage, and interface capabilities.
[0102] The parties hold corresponding public / secret (e.g., private) key
pairs. The public keys
can be utilized to determine the veracity of information signed using the
private keys, and to
encrypt information that can be decrypted using the corresponding private key.
[0103] The private keys can be utilized to sign information and to decrypt
information that has
been encrypted using the corresponding public key, and in some cases, produce
Zero-
Knowledge Proofs of Knowledge. Each secret key is maintained by the
corresponding
computing device associated with the corresponding entity.
[0104] The parties each have corresponding computing systems, which are used
to
electronically communicate amongst one another (e.g., through a network) and
to perform
various cryptographic activities, including signing, verifying signatures,
encrypting information,
decrypting information and various anonymous credential issuance, proof and
verification
protocol implementations. Each verification event is associated with
validating whether all
logical conditions of the proof request are satisfied. A positive
determination may lead to access
/ service / goods being provisioned to the prover. A negative determination
may lead to access /
service / goods not being provisioned to the prover.
[0105] A specific technological implementation of providing identity
assertions with minimal
disclosure is described in various embodiments. Three separate approaches are
described,
along with variations thereof. These approaches include (1) an OAuth token
based design, (2)
a secure enclave based design, and (3) an anonymous credentials based design.
[0106] In some embodiments, a proposed approach is provided in an anonymous
credentials
based design whereby a client receives token data structure(s) that are stored
on data storage,
and asynchronously, the client gets a verifier request from a verifier. The
verifier may, for

example, have a list of issuers that the verifier trusts.
Certain organizations may
be trusted for certain information, such as a bank for employment or financial
status, a university
for educational attainment characteristics, among others. The client generates
a proof (e.g.,
encapsulated as a proof data message) based on the token and the verifier
request, and the
proof can be established as either a non-zeroness proof or a conditional
proof. Token objects
can be received from or computed jointly in a multiparty protocol with an
issuer computing
device.
[0107] For a non-zeroness proof, the proof approach generation can include a
first modular
inverse, two randomization steps, two group exponentiations, and a group
multiplication. In
particular, the steps in an example non-limiting embodiment can be established
as:
(1) Receive a verification request data message from the verifier computing device, the verification request data message including at least a nonce c_0.
(2) Compute t = x^(-1) mod p, where x is the attribute value from the token, and p is the order (e.g., size, number of elements) of the discrete log group (e.g., elliptic curve, Diffie-Hellman group) according to the cryptographic standards the parties choose to use; t is the modular inverse of x mod p.
(3) Sample a first random number r_1 and a second random number r_2, such that r_1, r_2 ∈ Z_p.
(4) Compute R = C_x^(r_1) · h^(r_2), where R is effectively a commitment to the random values r_1 and r_2, C_x is a commitment to the attribute x (e.g., a Pedersen commitment C_x = g^x · h^y with blinding factor y), and h is a group generator taken from the cryptographic specifications (e.g., elliptic curve, Diffie-Hellman group). A commitment is a representation of a value that is both hiding and binding: hiding in the sense that the recipient of the commitment cannot find out anything about what the value of the commitment is, and binding in the sense that the sender cannot later pretend that it was a commitment to a value other than the one it originally was.
(5) Compute c = H(C_x, R, c_0), where c is the proof challenge, following the Fiat-Shamir heuristic.
(6) Compute z_1 = c·t + r_1 and z_2 = −c·t·y + r_2, where y is the blinding factor of the commitment C_x, and z_1 and z_2 are proof responses in a sigma protocol.
(7) Encapsulate and transmit one or more proof data messages including R, z_1 and z_2 as data objects to the verifier computing device, such that the verifier computing device is able to compute c = H(C_x, R, c_0) and confirm that g^c · R = C_x^(z_1) · h^(z_2), the verifier computing device controlling provisioning of access to a secured resource responsive to the confirmation that g^c · R = C_x^(z_1) · h^(z_2).
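The following sketch walks through steps (2) to (7) end to end, for both the prover and the verifier. To stay self-contained it runs over a deliberately tiny Schnorr group; the group parameters, the two generators and the hash-to-challenge encoding are illustrative assumptions, not the elliptic-curve or Diffie-Hellman parameters a deployment would standardize on.

import hashlib
import secrets

# Toy Schnorr group for illustration only: far too small to be secure.
P = 2039                     # safe prime, P = 2*p + 1
p = 1019                     # order of the quadratic-residue subgroup (the "p" of step (2))
g, h = 4, 9                  # generators of that subgroup; in practice log_g(h) must be unknown

def commit(x, y):
    """Pedersen commitment C_x = g^x * h^y (y is the blinding factor)."""
    return (pow(g, x, P) * pow(h, y, P)) % P

def challenge(Cx, R, c0):
    """Step (5): Fiat-Shamir challenge c = H(C_x, R, c_0), reduced into Z_p."""
    data = b"|".join(str(v).encode() for v in (Cx, R, c0))
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % p

def prove_nonzero(x, y, c0):
    """Prover side, steps (2)-(7): show the committed attribute x is non-zero."""
    Cx = commit(x, y)
    t = pow(x, -1, p)                                    # step (2): t = x^-1 mod p
    r1, r2 = secrets.randbelow(p), secrets.randbelow(p)  # step (3)
    R = (pow(Cx, r1, P) * pow(h, r2, P)) % P             # step (4): R = Cx^r1 * h^r2
    c = challenge(Cx, R, c0)                             # step (5)
    z1 = (c * t + r1) % p                                # step (6)
    z2 = (-c * t * y + r2) % p
    return Cx, (R, z1, z2)                               # step (7): proof data message

def verify_nonzero(Cx, proof, c0):
    """Verifier side: recompute c and confirm g^c * R == Cx^z1 * h^z2."""
    R, z1, z2 = proof
    c = challenge(Cx, R, c0)
    return (pow(g, c, P) * R) % P == (pow(Cx, z1, P) * pow(h, z2, P)) % P

if __name__ == "__main__":
    x = 26                                   # hidden attribute value (e.g., an age), x != 0
    y = secrets.randbelow(p)                 # blinding factor of the commitment
    c0 = secrets.randbelow(p)                # verifier-supplied nonce from step (1)
    Cx, proof = prove_nonzero(x, y, c0)
    assert verify_nonzero(Cx, proof, c0)

Completeness follows because C_x^(z_1) · h^(z_2) expands to g^(c·x·t) · C_x^(r_1) · h^(r_2), which equals g^c · R when x·t ≡ 1 (mod p).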
[0108] The verifier independently validates the received proof and the
verifier makes a
determination of access grant or not grant.
[0109] In some embodiments, the verifier is a verifier computing system that
automatically
grants access to one or more secured resources, such as a physical access
entry (e.g.,
mantrap, revolving doors, locked gateway, locked cabinet), and in other
embodiments, the
system grants access to one or more virtual resources (e.g., administrator
access on a
computer system, logging into accounts, access to secured sections of
webpages), among
others.
[0110] In another example, a comparison protocol may be established (e.g., to prove some
condition whereby a ≤ b). This can be utilized to establish proof messages whereby it is
necessary to indicate that a person is of a particular age, has a particular minimum
creditworthiness, or has a minimum educational attainment, among others.
[0111] Consider G to be a discrete log group of prime order p, and let g and h be generators
whose relative discrete logs are unknown.
[0112] Let numbers q and l be such that q − l = 2^N < p, and let a and b be two whole numbers
such that l ≤ a < b < q. Consider commitments A = g^a h^{m_a} and B = g^b h^{m_b} to a and b,
respectively. To prove that a < b, the following steps can be taken:
(1) Prover computes C = BA^{-1} = g^{b−a} h^{m_b−m_a} = g^c h^{m_c}.
(2) Prover produces bit commitments A_i = g^{a_i} h^{m_{a_i}}, B_i = g^{b_i} h^{m_{b_i}}, C_i = g^{c_i} h^{m_{c_i}} for
i ∈ {1, ..., N−1}, where a_i, b_i and c_i are the i'th bits of a − l, b − l and c, respectively;
m_{a_i}, m_{b_i} and m_{c_i} are sampled randomly.
(3) Prover computes A_0 = g^{a_0} h^{m_{a_0}} = A ∏_{i=1}^{N−1} A_i^{−2^i}, and likewise
B_0 = g^{b_0} h^{m_{b_0}} = B ∏_{i=1}^{N−1} B_i^{−2^i} and C_0 = g^{c_0} h^{m_{c_0}} = C ∏_{i=1}^{N−1} C_i^{−2^i}.
(4) For each i ∈ {0, ..., N−1}, the prover does the following:
(4.1) Randomly sample r_{a_i}, d'_{a_i} and z'_{a_i}.
(4.2) Compute R_{a_i, a_i} = h^{r_{a_i}} and R_{a_i, (1−a_i)} = h^{z'_{a_i}} (A_i g^{−(1−a_i)})^{−d'_{a_i}}.
(4.3) Compute d_{a_i} = H(A_i, R_{a_i,0}, R_{a_i,1}).
(4.4) Compute z_{a_i} = (d_{a_i} − d'_{a_i}) m_{a_i} + r_{a_i}.
(4.5) Assign z_{a_i, a_i} = z_{a_i}, z_{a_i, (1−a_i)} = z'_{a_i}, d_{a_i, a_i} = d_{a_i} − d'_{a_i} and
d_{a_i, (1−a_i)} = d'_{a_i}.
(4.6) Repeat steps 4.1 through 4.5 for B and C.
(5) Prover sends all A_i, R_{a_i,0}, R_{a_i,1}, d_{a_i,0}, z_{a_i,0}, z_{a_i,1}, B_i, R_{b_i,0}, R_{b_i,1}, d_{b_i,0},
z_{b_i,0}, z_{b_i,1}, C_i, R_{c_i,0}, R_{c_i,1}, d_{c_i,0}, z_{c_i,0}, z_{c_i,1}.
(6) Verifier checks that A = ∏_{i=0}^{N−1} A_i^{2^i}, B = ∏_{i=0}^{N−1} B_i^{2^i} and
BA^{-1} = ∏_{i=0}^{N−1} C_i^{2^i}.
(7) For each i ∈ {0, 1, ..., N−1} the verifier checks that:
(7.1) A_i^{d_{a_i,0}} R_{a_i,0} = h^{z_{a_i,0}}.
(7.2) (A_i g^{-1})^{H(A_i, R_{a_i,0}, R_{a_i,1}) − d_{a_i,0}} R_{a_i,1} = h^{z_{a_i,1}}.
(7.3) Check the same conditions for B and C.
[0113] Note: It may be that either a or b is known to the verifier. In such a case there is no
need to decompose the known number, and the commitment C will have the same mask exponent
as that of the unknown parameter.
[0114] In some embodiments, the client computing device (e.g., the prover) does not send
A_0, B_0 and C_0, to reduce the size of its messages. In that case, in step 6, instead of verifying a
relation between the bit commitments, the verifier derives A_0, B_0 and C_0 independently. This
aspect may be particularly useful in low data throughput situations or where storage space is
very limited.
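As a small illustrative sketch (not part of the claimed method), the verifier-side derivation of A_0 from A and the transmitted bit commitments could look as follows in Python, with a toy modulus standing in for the group:

P = 2039                      # toy prime modulus; the group of quadratic residues mod P is assumed

def derive_low_commitment(full_commitment, bit_commitments, modulus=P):
    """Given A and A_1..A_{N-1}, return A_0 = A * prod(A_i^(-2^i)) mod P."""
    acc = full_commitment
    for i, Ai in enumerate(bit_commitments, start=1):
        acc = (acc * pow(Ai, -(1 << i), modulus)) % modulus   # needs Python 3.8+ for modular inverse
    return acc
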
[0115] The comparison method of some embodiments reduces the problem of
comparison to
three bit decompositions. As such, the computational burden on the prover
consists of about
12N-3 group exponentiations.
[0116] In contrast, the method of Belenkiy involves two bit decompositions and
N-1 equality
maps each consisting of four 2-variable equations and a total of six distinct
variables.
[0117] As such, it is estimated that each equality map requires at least 8
group exponentiations.
[0118] Using the efficient Bit Decomposition implementations of some proposed
embodiments,
the two decompositions will require a total of 8N-2 group exponentiations.
Accordingly, it is
estimated that Belenkiy's method requires 16N-10 group exponentiations. This
demonstrates
- 18 -
CA 3069582 2020-01-23

that for NW, the proposed method for the comparison protocol is more
efficient, and this
superiority becomes increasing important as the numbers to be compared scale
up.
[0119] In particular, the scale up may occur if the credential verification
system is utilized across
a large number of users.
[0120] FIG. 3 is an example system for conducting credential verification,
according to some
embodiments. The components are shown as an example of one embodiment, and
alternatives
are also contemplated. The components of the system do not necessarily reside
on the same
computing device and may be distributed across one or more computing devices.
[0121] A computer implemented system 300 for electronic verification of
credentials is
illustrated. The system includes at least one processor and data storage, and
includes a proof
request parsing engine 302 configured to receive one or more proof request
data structures
304, which in combination represent one or more logical conditions.
[0122] A credential parsing engine 306 is provided to receive one or more
credentials 308
which in combination, validate one or more characteristics of an identity
profile 310 of a prover
entity.
[0123] A proof generation engine 312 is provided that receives, from a
verifier computing
system 314, the one or more proof request data structures 304 and the one or
more credentials
308; and for each logical condition provided in the one or more proof request
data structures,
parses the one or more characteristics of the identity profile 310 to determine
whether the logical
condition has been satisfied.
[0124] One or more proof output data structures 316 storing signatures or zero
knowledge
proofs of satisfaction of a subset or all of the one or more logical
conditions are returned by the
system (e.g., in the form of data fields). A secure encryption engine 318 and
a secure
processing enclave 320 may be included, in accordance with various
embodiments.
[0125] A proof generation engine 312, in some embodiments, resides at or is coupled to a data
center of a financial institution, in which case parsing the one or more characteristics of the
identity profile includes invoking an electronic comparison against a stored
user profile of the
financial institution corresponding to the prover entity. The example
implementations are not
restricted to such a topology, and other topologies are contemplated,
including a cloud /
distributed resources based proof generation engine 312.
[0126] In other embodiments, the proof generation engine 312 is coupled to the
secure
processing enclave 320, which may also be coupled to a verifier computing
device 314.
[0127] In another embodiment, the proof generation engine 312 lies within the
prover's user
device, thus user data will never be provided to the verifier and the issuer
will not be informed of
the transaction taking place.
[0128] In another aspect, the electronic comparison against the stored user
profile of the
financial institution corresponding to the prover entity includes querying one
or more attributes
of the stored user profile and comparing the queried one or more attributes
against the one or
more logical conditions to determine whether individual logical conditions of
the one or more
logical conditions have been satisfied. The characteristics and attributes of the user profile can
be established and stored on the portable client computing device as
one or more
token data objects that can be received from or computed jointly in a
multiparty protocol with an
issuer computing device.
[0129] The one or more token data objects are generated (e.g., as signed
objects or encrypted
objects) using at least an issuer computing device private issuance key. The
one or more token
data objects each including one or more signed data elements representing at
least one of the
one or more characteristics of the client associated with the portable client
computing device.
[0130] In another aspect, the verifier computing system is configured to
encapsulate the one or
more credentials along with the one or more proof request data structures in a
single data
container transmitted to the proof generation engine.
OAuth-Based Design
[0131] FIG. 4 is an example OAuth token based method, according to some embodiments.
The OAuth-based method portrayed does not address the issue of unlinkability, and in this
example, the prover computing device electronically communicates to the verifier OAuth tokens
containing the verifier's proof request, which the verifier computing system can use to formulate
a query message to be transmitted to the issuer computing system, and receive a yes/no
answer in response. A benefit of this approach is that relatively light infrastructure is required,
it is very efficient, and the cryptography is simple and flexible.
[0132] However, the issuer computing system needs to be available (e.g., online) to be able to
process the request. In response to a proof request, the prover confers an OAuth token (not to
be confused with credentials) that the verifier can use to query the issuer and be assured that
the prover does indeed satisfy their policy.
[0133] The verifier is provided tokens containing the verifier's proof request
which can be used
to query a computing system associated with an issuer, receiving an answer,
such as a yes or
no response (e.g., or a Boolean variable, such as TRUE / FALSE, 0, 1).
Secure Enclave
[0134] A challenging technical problem occurs in implementing a system where
the verifier is
able to ensure the prover has the correct credentials, while preserving their
privacy. In some
embodiments, a secure enclave based approach is described. In order to
implement a secure
enclave, Intel Software Guard Extensions™ (SGX) can be utilized, among
others.
[0135] There are different mechanisms for public key cryptography. For example, the Intel
SGX SDK natively supports ECDH for key encapsulation and ECDSA for digital signatures over
the PRIME256V1 (also known as SECP256R1) curve. Other
approaches
are possible, such as Schnorr's, which would serve just as well for a digital
signature scheme.
256-bit base fields may potentially provide sufficient security.
[0136] For symmetric cryptography, the Intel SGX SDK™ natively supports 128-bit
AESGCM. This
primitive can be used for authenticated encryption. It remains to be seen if
larger AES key sizes
are necessary. In that case, Galois-Counter Mode cannot be used.
[0137] Hashing can be performed using SHA-2, as this is natively supported by
the Intel SGX™
SDK. As it supports 256-bit blocks, it would also be useful in case of a
migration to larger AES
blocks, both as a means of key derivation and as a MAC building block.
[0138] The secure enclave approach improves computational efficiency and
minimizes a
trusted computing base, rendering it more amenable to formal analysis. The
verifier may include
a verify oracle, which is a trusted software/hardware component hosted by an
untrusted third
party. It is allowed to view a prover's attributes in the clear and attest
that they satisfy a certain
predicate queried by the verifier.
[0139] An example registration protocol is provided as follows. First, a prover generates their
public key. The issuer hands the prover a random string r, the prover generates sk'_P, and
generates pk_P = f(sk_P) for sk_P = (sk'_P || r) and a common-knowledge collision-resistant
function f. In order for the registration to be accepted, the prover should prove to the issuer in
zero knowledge that it knows a corresponding sk_P. The (semi-honest) issuer's contribution to key
generation is to keep a malicious prover from stealing credentials belonging to a previously
revealed private key.
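A minimal Python sketch of this registration step is given below for illustration; the concrete instantiation sk_P = H(sk'_P || r), pk_P = g^{sk_P} over a toy group is an assumption of this sketch, since the text only requires f to be a common-knowledge collision-resistant function.

import hashlib
import secrets

P, Q, g = 2039, 1019, 4                      # toy Schnorr group (insecure sizes, illustrative only)

def derive_sk(sk_prime: int, r: bytes) -> int:
    """sk_P = H(sk'_P || r): fold the issuer's random string into the private key."""
    digest = hashlib.sha256(str(sk_prime).encode() + r).digest()
    return int.from_bytes(digest, "big") % Q

def f(sk_p: int) -> int:
    """pk_P = f(sk_P) = g^sk_P; collision resistance here rests on the hash and the group."""
    return pow(g, sk_p, P)

r = secrets.token_bytes(16)                  # random string handed out by the issuer
sk_prime = secrets.randbelow(Q)              # prover's own secret contribution
sk_p = derive_sk(sk_prime, r)
pk_p = f(sk_p)
# The prover would then prove knowledge of sk_P (hence of sk'_P) in zero knowledge,
# e.g. with a Schnorr proof as sketched later in this document.
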
[0140] In regard to credential subletting, it may be beneficial for the issuer to demand that the
prover incorporate some important secret about their account (not even known by the issuer)
into the private key, such that the secret can be inferred from the private key. This will
discourage provers from sharing credentials with one another. Alice may be
willing to let Bob
use some credential issued to her by her bank, but not be comfortable giving
him complete
control over her bank account.
[0141] Another possible technique to approach this is to issue credentials to
specific devices,
using private keys that the device can create for an application and sign
using them on the
application's behalf, without ever sharing the key with the application.
[0142] An example issuer protocol is described:
[0143] The issuer generates a signature on the prover's attributes using an
arbitrary signature
scheme that is secure against existential forgery. For the construction to be
secure, the
signature should also involve a description of the token's data structure.
[0144] More formally, the prover and the issuer agree on a string a_P representing the prover's
attributes. The issuer knows the prover as the owner of pk_P, satisfying a_P. The issuer sends
back to the prover a signature σ_P = sig(sk_I; pk_P || a_P).
[0145] It is not strictly speaking necessary for the prover to have a public
key at all. However, if
the issuer enforces limits on how often it would register a new public key
for a client, provers will
be discouraged from subletting their credentials to one another. This stands
in opposition to
keyless credentials, where disclosing the secrets to a credential doesn't cost
the original owner
anything.
[0146] An example protocol for showing verification by the oracle is provided.
[0147] Let the prover and the verifier both trust a verification oracle known by the key pair
(sk_O, pk_O).
[0148] The verifier chooses a random challenge c and sends it to the prover. A proof request
predicate P is agreed upon. The prover composes the string d = (pk_I || sk_P || a_P || σ_P || c || P)
and sends enc(pk_O; d) to the oracle.
[0149] The oracle decrypts d and checks that the following propositions are satisfied:
sigver(pk_I; σ_P; f(sk_P) || a_P)
P(pk_I, a_P)
[0150] In case of successful verification, the oracle outputs σ_O = sig(sk_O; c || P), or it outputs ⊥
otherwise.
[0151] The verifier only needs to check that sigver(pk_O; σ_O; c || P) holds.
[0152] Note that as regards propositions P that do not depend on anything
outside ap (e.g.,
time) there is no freshness requirement; therefore the challenge c can simply
be regarded to be
the empty string in such cases.
[0153] For examining the approach security, the following collusion scenarios
are considered:
[0154] Malicious issuer and prover to break soundness: This attack can be trivially mounted,
and in some embodiments there is no attempt to prevent it. The issuer can
always issue bogus
adaptively chosen untruthful credentials for an accomplice prover. In
practice, such a problem is
best thwarted by strong and transparent authentication and KYC practices by
issuers, as well as
careful choice of trusted issuers by verifier consortiums based on thorough
vetting.
[0155] Malicious issuer and verifier to break zero-knowledge: Zero-knowledge
in this context
means that an adversary controlling all issuers and verifiers cannot pinpoint
which of the trusted
issuers implied by the query and which of the credentials obtained from the
issuer the credential
in use is. For this, the analysis makes use of the CCA2 property of the
encryption scheme used
in Acquire Proof queries.
[0156] More formally, consider the following game, where the adversary is allowed to make
polynomially many of the following queries, interleaved with polynomial computations:
• Create Honest Oracle: Generate (sk_O, pk_O) and add pk_O to the set O_honest known to the
adversary.
• Confer Credential: Send (σ_P = sig(sk_I; pk_P || a_P), pk_I) for arbitrary a_P and arbitrary key pairs
(sk_I, pk_I) and (sk_P, pk_P).
• Request Challenge: Send arbitrary pk_O ∈ O_honest, P and c to the challenger. The
challenger picks a random element d from the set
D = {(pk_I || sk_P || a_P || σ_P || c || P) | P(pk_I, a_P)}
and sends enc(pk_O; d) back to the adversary.
[0157] The adversary wins if D is non-empty and he can guess the value of d
with non-
negligible advantage over a random choice.
[0158] A simulation argument naturally follows from this intuition and is therefore omitted.
[0159] Malicious prover to break soundness: The adversary is allowed polynomially many
queries from the following list, arbitrarily interleaved with one another and with polynomial-time
computations.
• Create Honest Issuer: Create a new key pair (sk_I, pk_I) and add pk_I to the set I_honest
available to the adversary.
• Create Honest Oracle: Create a new key pair (sk_O, pk_O) and add pk_O to the set O_honest
available to the adversary.
• Initiate Registration: Receive a random string r from an honest issuer.
• Finish Registration: Send (r, pk_P, π) to an honest issuer that has sent r in a past Initiate
Registration query. If π non-interactively proves knowledge of sk'_P such that pk_P = f(sk'_P || r),
the issuer will later accept Acquire Credentials queries from the adversary.
• Finish Honest Registration: Create an honest prover to respond to an already initiated
registration process. sk'_P will be hidden from the adversary, but pk_P will be known and
added to the set P_honest.
• Acquire Credentials: Acquire σ_P = sig(sk_I; pk_P || a_P) for arbitrary a_P, with the additional
requirement that pk_P has been already registered with the owner of sk_I. Add (pk_I, a_P) to the
set A.
• Acquire Proof: Submit enc(pk_O; d), for an arbitrary but well-formed d = (pk_I || sk_P || a_P || σ_P || c || P), to
an honest oracle with key pk_O and acquire the output σ_O.
• Acquire Honest Proof Request: Send arbitrary (P, c, pk_O) to an honest prover and receive
enc(pk_O; d) if the prover has a credential attesting to P, and ⊥ otherwise. Add c to the set
C_outsourced.
• Forge: The adversary outputs some σ_O, and the game ends. She wins if:
1. sigver(pk_O; σ_O; c || P) for some c and P.
2. c ∉ C_outsourced.
3. pk_O ∈ O_honest.
4. There is no (pk_I, a_P) ∈ A with pk_I ∈ I_honest and P(pk_I, a_P).
5. For every pk_I ∉ I_honest and every a_P, ¬P(pk_I, a_P).
[0160] There are no queries regarding corrupted or corruptible Issuers and
Oracles since such
parties can be simulated by the adversary herself.
[0161] FIG. 5A is an example method diagram where a secure enclave master
verifier is
utilized, according to some embodiments.
[0162] In FIG. 5A, the issuer computing system signs attributes with a cryptographic technique,
and the verifier computing system sends the prover computing device a challenge and proof request.
[0163] In response, the prover computing device sends the encrypted credential, challenge and
proof request to a master verifier computing device. The master verifier signs the challenge and
proof request.
[0164] This approach, while requiring additional infrastructure relative to
the approach of FIG. 4,
satisfies many of the conditions for an ideal verification. The issuer
computing system does not
obtain further information (e.g., the identity of the verifier) from the
verification event.
[0165] FIG. 5B is a state diagram of a verify oracle, according to some
embodiments. The verify
oracle can be implemented using software, hardware, or a combination thereof.
For example,
the states may be transitioned through the use of a processor configured to
transition between
one or more states, and to perform steps described below in conjunction with
machine-
interpretable instruction sets.
[0166] A Verify Oracle supports three states:
1. Blank: At this state, only the initRemoteAttestation call would be
accepted. Then, the first remote attestation message will be generated and the enclave goes
to the Remote Attestation state.
2. Remote Attestation: At this state, the enclave accepts either a reset call
or a
finishRemoteAttestation call. Upon a reset call, the enclave clears all of its
state data, as
if it were killed and reloaded. Upon a finishRemoteAttestation call, the
enclave
consumes a Remote Attestation challenge message. The enclave produces a Remote
Attestation message 3, generates the necessary key pairs and outputs the
Remote
Attestation message and the public keys. If any of this fails, it performs a
reset operation.
3. Ready: This is the state wherein the enclave can actually evaluate policies
over
attributes. It can receive a checkProofRequest call, or a reset call.
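The following Python sketch illustrates the three-state machine described above; the method names mirror the calls named in the text, while the bodies are placeholders, since the actual attestation and policy evaluation logic is platform specific.

from enum import Enum, auto

class State(Enum):
    BLANK = auto()
    REMOTE_ATTESTATION = auto()
    READY = auto()

class VerifyOracle:
    def __init__(self):
        self.state = State.BLANK

    def init_remote_attestation(self):
        assert self.state is State.BLANK, "only accepted in the Blank state"
        self.state = State.REMOTE_ATTESTATION
        return "remote attestation message 1"          # placeholder payload

    def finish_remote_attestation(self, challenge_msg):
        assert self.state is State.REMOTE_ATTESTATION
        # Consume the challenge, generate key pairs, emit message 3 and the public
        # keys; on any failure the enclave would instead call reset().
        self.state = State.READY
        return "remote attestation message 3", "public keys"

    def check_proof_request(self, encrypted_request):
        assert self.state is State.READY, "policies are evaluated only when Ready"
        return "signed yes/no answer"                   # placeholder payload

    def reset(self):
        self.state = State.BLANK                        # clear all state data
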
[0167] Trust by Provers and Verifiers is assumed in all the previously
described models as a
common reference. Also, for obvious performance concerns, it is vital to be
able to perform
Remote Attestation on an enclave non-interactively. As such, the enclave's
host can perform a
publicly verifiable remote attestation with its enclave and publish the transcript. In order to
do so, she may employ the Fiat-Shamir heuristic using the collision-resistant function H(.)
modeled as a Random Oracle. If the Remote Attestation Verifier would normally use a
probabilistic polynomial-time algorithm m_2 ← A(m_1; r) to generate the second message, in this
scenario the second message would be derived through m_2 ← A(m_1; H(m_1)).
[0168] A proof request can be defined in accordance with variations of the
following examples.
[0169] The language describing policies should be simple to interpret so as
not to expose the
system to security risks.
[0170] In order to prevent the execution from leaking information about the
attributes, the
language should preclude programs with data-dependent access patterns, runtime
and
execution paths. Here, a C-like language called StraightTalk is described as
an example, and it
is only capable of describing straight-line programs:
(policy) ::= (token-definition) (statement)* (expression)
(token-definition) ::= 'token' '{' (variable-definition)* '}'
(variable-definition) ::= (type) (identifier-list) ';'
(identifier-list) ::= (identifier)
                    | (identifier-list) ',' (identifier)
(type) ::= (basic-type)
         | (basic-type) '[' (integer) ']'
(basic-type) ::= 'unsigned' 'int' '[' (integer) ']'
               | 'int' '[' (integer) ']'
               | 'float'
(statement) ::= (expression) ';'
(argument-list) ::=
                  | (nonempty-argument-list)
(nonempty-argument-list) ::= ((expression) ',')* (expression)
(expression) ::= (expression) '?' (expression) ':' (expression)
               | (expression) (binary-operator) (expression)
               | (unary-operator) (expression)
               | (function-like-operator) '(' (argument-list) ')'
               | '(' (expression) ')'
               | (string)
               | (base64)
               | (identifier) '[' (integer) ']'
               | (identifier) '[' (integer) ']' '[' (integer) ']'
               | (identifier)
               | (number)
(unary-operator) ::= '-'
                   | '!'
(binary-operator) ::= '=='
                    | '+'
                    | '-'
[0171] FIG. 6A is a system diagram providing additional detail in the context
of a verifier hosted
enclave, according to some embodiments. As shown in FIG. 6A, the verifier
enclave stores a
secret key which is utilized in a flow of signed messages. The key
encapsulation process, in
various embodiments, includes 2-way or 1-way authenticated public key
encryption.
[0172] FIG. 6B is a simplified diagram providing additional detail in the context of a verifier
hosted enclave, according to some embodiments. In FIG. 6B, the verifier-hosted enclave receives
the proof request and the proofs directly from the prover or prover device, and transmits a proof
verification message to the verifier.
[0173] In this example, the secure enclave is adapted for processing encrypted
credentials,
challenges, and proof requests. The secure enclave can be a processor or a
secured memory
location that is configured for maintaining a verifier secret key utilized to
generate a first signed
message.
[0174] The verifier computing device receives, from a prover computing device,
a second
signed message including at least an enclosed issuer signed message
representing one or
more encrypted containers storing at least one or more characteristics of an
identity profile of a
prover entity along with a message authentication code based at least on the
proof request data
structure.
[0175] The verifier computing device then transmits the second signed message,
the proof
request data structure, and the one or more encrypted containers to the secure
enclave.
[0176] The verifier computing device then receives a response data message
from the secure
enclave indicative of whether all of the one or more logical conditions were
satisfied by the at
least one or more characteristics of the identity profile of the prover
entity. In some
embodiments, the secure enclave is configured to provide publicly verifiable
remote attestation
with a verifiable threat model and a verifiable proof of security.
[0177] A remote attestation protocol involves a zero knowledge proof with a
prover and a
verifier, the enclave being the prover. A direct run of this protocol by both
Identity Brokerage
parties (prover and verifier) may compromise efficiency. Therefore, a
mechanism is
implemented using the Fiat-Shamir heuristic, and the enclave's maintainer is
configured to run
an instance of remote attestation in a publicly verifiable manner.
[0178] Instead of using actual random inputs, the remote attestation verifier
(the enclave's
maintainer) replaces every randomness with the output of a pseudorandom
function applied to
the entire protocol transcript up until that moment, and an arbitrary initial
nonce. Thus, by
presenting the transcript of this protocol instance, the enclave's maintainer
can efficiently
convince the identity brokerage parties that the enclave is a trustworthy one.
[0179] In some embodiments, the verifier enclave or a third party hosted system records
transcripts of an exchange, which are exposed to the public. For example, it may be the
responsibility of a verifier computing system to run a remote attestation protocol with its enclave
once, whereby the enclave communicates its public key, which is then stored on a transcript
exposure module, which may be hosted by any one of the computing systems associated with
any one of the parties or by a third party hosted system. In order to establish the honesty of the
transcript, all the randomness used in the verifier's cryptography is to be created using a
pseudorandom function (hash, block cipher, etc.) involving all or some of the information
available to the verifier's computing device at the time of a credential validation transaction.
[0180] The secure enclave processor maintains a verification transcript in relation to its own
credentials; as the enclave is wholly trusted by both the prover and the verifier, it should itself
be strongly vetted.
[0181] Chip manufacturers provide mechanisms to verify an enclave involving
multi-round
interactive protocols. Remote attestation is a protocol based on bilinear
group signatures,
whereby an enclave proves to a third party that it is running on a legitimate
Intel SGX platform,
and that it is running a particular program.
[0182] FIG. 7 is a method diagram providing an example issuer sequence where
the prover
computing system has a corresponding key pair, according to some embodiments.
[0183] FIG. 8 is a method diagram providing an example verification sequence,
where the
prover computing system has a corresponding key pair, according to some
embodiments.
[0184] FIG. 9 is a method diagram providing an example issuer sequence where
the prover
computing system does not have a corresponding key pair, according to some
embodiments.
[0185] FIG. 10 is a method diagram providing an example verification sequence,
where the
prover computing system does not have a corresponding key pair, according to
some
embodiments.
[0186] FIG. 11 is a system diagram providing an example verification system
having a third
party hosted enclave, according to some embodiments.
[0187] FIG. 12 is an example C-based proof request description language,
according to some
embodiments. An example proof request is shown in FIG. 12, and other policies
are possible. In
some embodiments, the policies can be compiled from a simple C-like language
only capable of
writing straight-line non-branching programs.
ANONYMOUS CREDENTIALS
[0188] In some circumstances, it is desirable to be able to provide
credentials in an anonymous
manner. For example, one may wish to be able to prove that they reside in a
particular area
without disclosing their address (e.g., that they reside within a radius of a
particular place or
location). For example, a local coffee shop may wish to provide loyalty
pricing to local
customers. Another example is only allowing people to vote on a local issue if they reside in a
particular neighborhood. That is, access to a service or benefit is provided only to people who
reside in a particular area, without those residents having to disclose their address.
[0189] In some embodiments, the location may be verified to be within a radius of a given
location. In other embodiments, other shapes, such as triangles or rectangles, may be used.
[0190] In some embodiments, polar coordinates (longitude and latitude) are
used as a point for
a particular location or a user's address. The polar coordinates of the user's
address may be
certified by an authority (e.g., department of motor vehicles) and inputted
into a user-centric
protocol (e.g., U-Prove) to determine a Commitment to the polar coordinate.
The Commitment
may then be inputted into a system to anonymously confirm that the Commitment
is within a
distance from a particular location (another e.g., polar coordinate). In some
embodiments, the
address coordinates and/or Commitments may be presented in the form of a
token.
[0191] In some embodiments, a business may present a QR code to a customer to
provide an
extra service or benefit (e.g., cost reduction) based on the address of the
customer. The
customer's device that scanned the QR code could use a U-Prove presentation
protocol to
provide a commitment to their address provided by an authority. A verifier
would then prove that
the customer is within the geographic area for the business by confirming that
the commitment
is within a distance from the business.
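For illustration, the plain-text predicate behind this example ("the certified coordinate lies within a given distance of the business") can be sketched as follows; in the described system this predicate would be established against a commitment to the coordinate rather than the coordinate itself, and the haversine formula and the 2 km radius are assumptions of the sketch.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (latitude, longitude) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def within_radius(customer, business, radius_km=2.0):
    return haversine_km(*customer, *business) <= radius_km

print(within_radius((43.6487, -79.3817), (43.6532, -79.3832)))  # nearby point -> True
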
[0192] FIG. 13 is an example method diagram for an anonymous credentials
approach,
according to some embodiments. The approach utilizes advanced cryptography
for issuance
and presentation protocols, and has relatively fewer infrastructure
requirements and is
unlinkable and flexible. Anonymous Credentials is aimed at solving the problem
of identity
brokerage through designs depending on cryptographic hardness assumptions.
[0193] In such an approach, a custom-built signature mechanism is used for the
issuance
protocol that is amenable to the type of zero-knowledge proofs of knowledge
approaches
used in the proof protocol. Notable examples are U-Prove (relying on Diffie-
Hellman type
hardness assumptions), Idemix (relying on RSA-type hardness assumptions) and
later
approaches based on bilinear maps.
[0194] FIG. 14 illustrates, in a component diagram, an example of a module
breakdown of an
identity brokerage 1400, in accordance with some embodiments. The identity
brokerage 1400
comprises a dlog-utils unit 1410, a zk-dlog unit 1420, a sigma-protocols unit
1430, a user-centric
identity unit 1440 (e.g., uprove) and an identity-brokerage unit 1450. The
dlog-utils unit 1410
may comprise prime-order groups, their associated Z_p fields, standard
generators, hash
functions, etc. In some embodiments, the dlog-utils unit 1410 may include the
following classes:
Group, Group Element, Field, Field Element, Vectors, Hash Functions,
Randomizers and Test
Utilities. The zk-dlog unit 1420 may comprise basic data structures used in
zero-knowledge
protocols using discrete log (e.g., Pedersen Commitments), etc. In some
embodiments, the zk-
dlog unit 1420 may include the Pedersen Commitment class. The sigma-protocols
unit 1430
may include the following classes: Provers, Verifiers, Problems and Proofs.
The sigma-protocols
unit 1430 will be described in further detail below. An example of a user-
centric identity module
1440 may comprise a bare bone implementation of the U-Prove Issuance and
Presentation
protocols. The identity-brokerage unit 1450 may put all the units together,
hiding the
cryptography from the user.
[0195] In some embodiments, a token is a list of integers with structure.
Table 1 illustrates an
example of a token:
Name #1    | Value #1  | Name #2   | Value #2 | Name #3    | Value #3          | Name #4      | Value #4
First Name | Frederick | Last Name | Krueger  | Occupation | Nightmare Monster | Work Address | 1428 Elm Street
Table 1: Example 1 of a Token
[0196] Table 2 illustrates another example of a token where the first attribute is dedicated to
describing the token's structure:
Token Structure                                            | Value #1  | Value #2 | Value #3          | Value #4
["First Name", "Last Name", "Occupation", "Work Address"]  | Frederick | Krueger  | Nightmare Monster | 1428 Elm Street
Table 2: Example 2 of a Token
[0197] With respect to security, the first example of Table 1 only reveals the
number of
attributes in a token, while the second example of Table 2 also reveals the
entire structure of
the token.
[0198] With respect to performance, the first example of Table 1 requires
double the number of
actual attributes, while the second example of Table 2 only demands one extra
attribute.
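A small Python sketch contrasting the two layouts is given below; the further mapping of these strings into the integer attribute space is left abstract, as it depends on the issuance protocol in use.

attributes = {
    "First Name": "Frederick",
    "Last Name": "Krueger",
    "Occupation": "Nightmare Monster",
    "Work Address": "1428 Elm Street",
}

# Layout of Table 1: names and values interleaved (2*n slots; reveals only the
# number of attributes in the token).
token_v1 = [item for pair in attributes.items() for item in pair]

# Layout of Table 2: the first attribute describes the structure (n+1 slots;
# reveals the entire structure of the token).
token_v2 = [list(attributes.keys())] + list(attributes.values())

print(token_v1)
print(token_v2)
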
Σ-Protocols
[0199] The sigma-protocol unit 1430 will now be described in more detail.
[0200] A sigma-protocol (Σ-protocol) is a schema for proving knowledge of a witness W to an
NP-statement x ∈ L. It comprises the following three moves:
1. Commitment: The prover sends commitments to random values
(referred to as
"masks" herein) to the verifier.
2. The verifier sends a random challenge back to the prover.
3. The prover provides a response incorporating the mask values, the
witness and
the challenge.
[0201] The verifier accepts if and only if the components of their view
satisfy a certain
polynomially-verifiable relation. The view includes the problem statement P(x) and the three
messages exchanged.
[0202] A Σ-protocol should also satisfy the following properties:
1. Special Soundness: An extractor can extract a witness from two accepting
transcripts of the protocol with equal commitments and different challenges.
2. Special Honest Verifier Zero Knowledge: An honest verifier can generate
transcripts identically distributed as an actual execution of the protocol,
given a
challenge.
[0203] It should be noted that the concept of a Σ-protocol is much broader than
the algebraic
cases used in this solution. Namely there is very little limitation on the
mathematical nature of
the NP-Statement to be proven, its witness, and the commitment scheme in use.
[0204] It should also be noted that it is not sufficient for the challenge
to be unique. It should be
unpredictable. From the Special Zero Knowledge property it follows that a
malicious prover can
fool a verifier if able to guess the challenge prior to commitment generation.
[0205] It should also be noted that even though it is a proof of knowledge, a Σ-protocol is not
necessarily zero-knowledge, nor composable in parallel.
Composing Σ-Protocols
Disjunction
[0206] In some embodiments, in order to prove that at least one of the two
statements A and B
is true, the prover proves one (one that is actually true) and simulates the
other (which the zero
knowledge property guarantees to be possible, regardless of its truth). In
order to allow the
prover to simulate one and only one of the two, for each verifier challenge c an efficient and
efficiently invertible bijective function f_c: C → C, where C is the set of possible challenges, is
devised. The prover may use arbitrary challenges c_1 and c_2 for its subproofs, as long as
c_2 = f_c(c_1). Intuitively, the prover can fix one of the challenges and generate
fc(ci). Intuitively, the prover can fix one of the challenges and generate
bogus commitments.
After receiving the uniformly random verifier challenge, the prover will be
bound to a uniformly
distributed challenge for the other statement, which forces it to try and
prove it honestly.
Special Soundness:
[0207] Imagine two accepting executions of the proof with the same mask commitments and
different verifier challenges. Represent the two transcripts as ((R_1, c_1, z_1), (R_2, c_2, z_2)) and
((R_1, c_1', z_1'), (R_2, c_2', z_2')). We have the following:
R_1 = R_1'
R_2 = R_2'
c_1 + c_2 ≠ c_1' + c_2' ⟹ c_1 ≠ c_1' or c_2 ≠ c_2'.
[0208] Without loss of generality, let us assume that c_1 ≠ c_1'. Because (R_1, c_1, z_1) and
(R_1, c_1', z_1') are both accepting transcripts, a witness w_1 can be extracted for the first statement,
demonstrating that the prover knew it was true, as well as a witness to its truth.
Special Zero-Knowledge:
[0209] The simulator can sample c and c_1 in advance and derive c_2 from the
two. Because the
protocols for the sub-statements are zero-knowledge, the simulator can output
(R1, c1, z1) and
(R2, c2, z2) that are indistinguishable from honest transcripts, given the
challenges. But c1 and c2
are uniformly distributed, therefore the simulated transcripts are
indistinguishable from honest
transcripts.
Cost:
[0210] The computational overhead is negligible compared to that of providing
the honest and
simulated proofs, both in providing honest or simulated proofs and verifying
them.
[0211] Communication has an overhead of the prover sending one of the sub-
challenges to the
verifier.
[0212] It should be noted that the necessity of actually implementing proof
simulators to build
disjunction provers makes their efficiency relevant. Therefore, most
complexity analyses in this
document will include an estimation of the work required to simulate a proof.
[0213] It should also be noted that such differences do not enable timing
attacks when
disjunctions of two statements of the same form are proven. However, whenever
P1 v P2 is to be
proven, care should be taken to ensure that the timing difference between
honest P1 and
simulated P1 is the same as the timing difference between honest P2 and
simulated P2. In other
words, in order to maintain zero knowledge one needs to guarantee that
t_{P1} + t̃_{P2} = t̃_{P1} + t_{P2}, i.e., t_{P1} − t̃_{P1} = t_{P2} − t̃_{P2}, where t_{Pi} and t̃_{Pi} denote the time
to honestly prove and to simulate P_i, respectively.
Conjunction
[0214] One may need to prove knowledge of all statements P_1, ..., P_N, each with their own
Σ-protocol. That is simply achieved through synchronizing the execution of their proof protocols,
meaning that all commitments R_1, ..., R_N are sent first and the same challenge c is used to
provide all responses z_1, ..., z_N.
Special Soundness:
[0215] Take two accepting transcripts (R_1, ..., R_N, c, z_1, ..., z_N) and (R_1, ..., R_N, c', z_1', ..., z_N').
From the soundness of each sub-protocol we deduce that all w_n's can be extracted.
Special Zero-Knowledge:
[0216] Given the challenge c beforehand, the sub-simulators can each create honest-looking
transcripts (R_n, c, z_n). Clearly a sequence of such transcripts will also be honest-looking.
From Σ to ZKPoK
[0217] A Σ-protocol is not guaranteed to be Zero Knowledge. However, under certain black-box
transformations, a Σ-protocol can be converted into a Zero Knowledge Proof of Knowledge
(ZKPoK).
[0218] One such transformation uses the Fiat-Shamir heuristic. Prior to the
execution of the
protocol, the verifier hands the prover a challenge string, later used in
generating the Σ-protocol
challenge. The main body of the protocol is made non-interactive, as the
prover self-generates
the challenge by hashing together the verifier-provided challenge string and
its own
commitments.
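A minimal sketch of this challenge derivation, with SHA-256 standing in for the hash and a toy challenge space, might look as follows (the serialization and the modulus are assumptions of the sketch).

import hashlib

Q = 1019  # toy challenge space Z_Q (illustrative only)

def fiat_shamir_challenge(verifier_string: bytes, *commitments: int) -> int:
    """Hash the verifier-provided string together with the prover's commitments."""
    transcript = verifier_string + b"|" + b"|".join(str(c).encode() for c in commitments)
    return int.from_bytes(hashlib.sha256(transcript).digest(), "big") % Q

# Example: a challenge bound to the verifier's string and the prover's commitment R.
c = fiat_shamir_challenge(b"verifier-session-42", 1234)
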
[0219] Roughly speaking, by having both parties contribute freshness to the
hash input, the
challenge is guaranteed to be unpredictable, in the sense that neither can
guarantee any
meaningful relation between the prover commitments and the challenge with non-
negligible
probability, essentially incapacitating them both from cooking up any
mischief. With both parties
limited to honest behavior, the special soundness and special zero knowledge
properties suffice
to guarantee the zero knowledge and proof of knowledge conditions.
[0220] It should be noted that the challenge string provided by the verifier
need not be random;
uniqueness suffices.
Zero-Knowledge:
[0221] The transcript of an honest prover's interaction with any verifier is
identically distributed
as with an honest verifier. Since good randomness from the prover alone
suffices to guarantee
randomness of the challenge, regardless of what the verifier contributes into
the hash input, the
challenge will be honestly produced.
[0222] An honest prover's view under the Σ-protocol is identically distributed
as that under the
transformed protocol.
Proof of Knowledge:
[0223] In a Random Oracle model, we can rewind the challenge creation step and
have two
distinct challenges that the prover can successfully respond to.
[0224] By making the protocol non-interactive, the transformation also
addresses the issue of
composability. Reducing the number of messages from 3 to 2 somewhat trivially
guarantees
parallel composability. Moreover, all zero-knowledge protocols are
sequentially composable.
[0225] This makes Σ-protocols very valuable tools, mainly because proving
Special Soundness
and Special Honest Verifier Zero Knowledge is much easier than proving
Arbitrary-Verifier Zero
Knowledge and Proof of Knowledge.
Algebraic Implementations
[0226] Operations are carried out in a prime order group G and the field Z_|G| of integers modulo
its size. This group has at least two publicly known elements g and h, along
with a hash function
H. The security of our constructions depends on the following properties.
.. Security Assumptions
[0227] In some embodiments, the following assumptions are made:
1. Generator Independence: No efficient algorithm can output a nonzero vector
[w_1, w_2] such that g^{w_1} h^{w_2} = e.
2. Discrete Log Hardness: No efficient algorithm can output r given h^r for
exponentially many uniformly sampled r ∈ Z_|G|.
3. There is a publicly known hash function H: (VerifierChallenge, G*) → Z_|G| that can
be used to transform a Σ-protocol on G into a ZKPoK under the Fiat-Shamir heuristic. A
Σ-protocol operates on G if R ∈ G and {c, z} ⊂ Z_|G| for all its executions.
Schnorr Signature
[0228] This protocol allows an entity to prove knowledge of the discrete log x (private key) of a
group element X = g^x (public key) by providing a valid response (signature) to a challenge
(message). The non-interactive version of this protocol can serve as a signature scheme with
the verifier challenge serving as the message to be signed.
[0229] As a Σ-protocol, the following steps are to be taken:
1. Prover randomizes mask r and provides R = g^r.
2. Prover receives challenge c ∈ Z_|G|.
3. Prover sends z = cx + r.
4. Verifier accepts if g^z = R X^c.
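For illustration, the four steps can be sketched in Python over a toy Schnorr group; the parameter sizes below are assumptions of this sketch and are far too small to be secure.

import secrets

P = 2039          # toy safe prime, P = 2*Q + 1
Q = 1019          # prime order of the subgroup generated by g
g = 4             # generator of that subgroup

x = secrets.randbelow(Q)        # prover's private key
X = pow(g, x, P)                # public key

# 1. Prover randomizes the mask r and sends R = g^r.
r = secrets.randbelow(Q)
R = pow(g, r, P)
# 2. Verifier sends a random challenge c in Z_Q.
c = secrets.randbelow(Q)
# 3. Prover responds with z = c*x + r (mod Q).
z = (c * x + r) % Q
# 4. Verifier accepts iff g^z == R * X^c.
assert pow(g, z, P) == (R * pow(X, c, P)) % P
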
[0230] Completeness is evident. The following proves security:
Special Soundness:
[0231] Imagine two distinct transcripts with identical commitment and different challenges.
X^{-c_1} g^{z_1} = R = X^{-c_2} g^{z_2} ⟹ X^{c_1−c_2} = g^{z_1−z_2} ⟹ x = (z_1−z_2)(c_1−c_2)^{-1}.
[0232] Thus, a witness can be extracted.
Special Zero-Knowledge:
[0233] To counterfeit an accepting transcript, the simulator can first randomize the challenge
and the response c, z ∈ Z_|G|. It then generates the fake mask commitment:
R = g^z X^{-c}.
[0234] It should be noted that, in some cases, the simulator may be given the challenge c.
Therefore, it is best to prove the slightly stronger result that a simulator can generate transcripts
that are indistinguishable from honest ones given the challenge.
[0235] The following proves that real and fake transcripts are identically
distributed:
[0236] Random variables are represented with boldface capital letters. C,Z and
R represent the
challenge, response and the mask respectively.
[0237] In both cases, the challenge is uniformly distributed. In other words, C ~ U(Z_|G|).
[0238] In the counterfeit case, the response is generated independently of the challenge,
therefore Z|C ~ U(Z_|G|).
[0239] In the honest case, the response is obtained by adding a uniformly randomized r to a
fixed multiple of c. Therefore, in that case also we have Z|C ~ U(Z_|G|).
[0240] In both cases, R is uniquely determined from c, z and X. Therefore,
the random variable
(Z,C,R) is identically distributed across the two cases.
[0241] Thus the protocol is Honest-Verifier Zero Knowledge in the strongest
sense, namely,
information theoretically.
Computational Complexity:
          | Group exponentiation | Group composition | Field multiplication | Field randomization
Prover    | 1 | 0 | 1 | 1
Simulator | 2 | 1 | 0 | 1
Verifier  | 2 | 1 | 0 | 0
Table 3
Note on Complexity
[0242] All complexity estimates ignore the process of creating and
communicating the
challenge, because it can be amortized. Moreover, sometimes it is handed down
from another
protocol.
Note on Computational Complexity
[0243] Every group exponentiation involves O(log|G|) group doublings and compositions. This
number is considerable, at least about 256.
[0244] Every group doubling or composition, in turn, involves a constant number of field
multiplications and inversions.
[0245] And finally, every field multiplication involves O(log|G|) field additions, which is again at
least about 256. This means that, compared to a group exponentiation, a field addition,
subtraction or negation takes a negligible 2^{-16} ≈ 0.000015 of the time and effort. Therefore, to
save ourselves and the readers a fair bit of misery, we shall hereon dismiss basic field
operations as inexpensive, reasonably certain that our analysis will not suffer much.
Communication Complexity
[0246] 1 field and 1 group element.
Note on Communication Complexity
[0247] In many contexts, we may assume that transmitting a group element
requires more or
less the same number of bits as a field element does. Even so, more work may
be required on
the receiver's side to reconstruct a representation amenable to manipulation.
Generalized Schnorr's
[0248] Schnorr's signature can be naturally extended to efficiently prove knowledge of a group
element X's discrete logarithm representation over a multi-generator set (g_n). In addition, the
prover may want to prove that some of these exponents are equal.
[0249] Prover knows X = ∏_{n=1}^{N} g_n^{x_{i(n)}}, where i(n): N_N → N_I surjectively maps an
exponent's index to that of its equality class. That is to say, there are I ≤ N equality classes,
indexed from 1 to I.
[0250] The prover does the following:
1. For each equality class i, sample r_i ← Z_|G|.
2. Send the commitment R = ∏_{n=1}^{N} g_n^{r_{i(n)}}.
3. Receive challenge c.
4. For each equality class i, send the response z_i = c x_i + r_i.
[0251] The verifier accepts if and only if R X^c = ∏_{n=1}^{N} g_n^{z_{i(n)}}.
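A small runnable sketch of Generalized Schnorr's with three generators and two equality classes (so that x_1 = x_2) is given below; the toy group parameters are assumptions of the sketch.

import secrets

P, Q = 2039, 1019                 # toy Schnorr group (insecure sizes)
gens = [4, 9, 16]                 # g_1, g_2, g_3 in the order-Q subgroup
i_map = [1, 1, 2]                 # equality classes: exponents of g_1 and g_2 are equal
num_classes = 2

w = [None, secrets.randbelow(Q), secrets.randbelow(Q)]    # class witnesses w_1, w_2
X = 1
for g_n, cls in zip(gens, i_map):
    X = (X * pow(g_n, w[cls], P)) % P

# Prover: one mask per equality class, commitment R, responses z_i.
r = [None] + [secrets.randbelow(Q) for _ in range(num_classes)]
R = 1
for g_n, cls in zip(gens, i_map):
    R = (R * pow(g_n, r[cls], P)) % P
c = secrets.randbelow(Q)                                   # verifier challenge
z = [None] + [(c * w[i] + r[i]) % Q for i in range(1, num_classes + 1)]

# Verifier: accept iff R * X^c == prod g_n^{z_{i(n)}}.
rhs = 1
for g_n, cls in zip(gens, i_map):
    rhs = (rhs * pow(g_n, z[cls], P)) % P
assert (R * pow(X, c, P)) % P == rhs
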
Special Soundness:
[0252] Consider two transcripts (R, c, z_1, ..., z_I) and (R, c', z_1', ..., z_I'). We have:
R = X^{-c} ∏_{n=1}^{N} g_n^{z_{i(n)}} = X^{-c'} ∏_{n=1}^{N} g_n^{z'_{i(n)}} ⟹ X^{c−c'} = ∏_{n=1}^{N} g_n^{z_{i(n)} − z'_{i(n)}}
⟹ X = ∏_{n=1}^{N} g_n^{(z_{i(n)} − z'_{i(n)})(c−c')^{-1}}.
[0253] This gives us witnesses x_n = (z_{i(n)} − z'_{i(n)})(c − c')^{-1}, such that for every n, n' with
i(n) = i(n') we have x_n = x_{n'}.
[0254] Notice the one-sidedness of the above implication. It is not at all
asserted that variables
belonging to distinct equality classes hold distinct values.
Special Zero-Knowledge:
[0255] For a given challenge c ∈ Z_|G|, the simulator uniformly samples z_1, ..., z_I and computes
R = X^{-c} ∏_{n=1}^{N} g_n^{z_{i(n)}}.
[0256] This is identically distributed as an honest transcript. Given a challenge c, the z_i's are
completely masked by the r_i's and therefore uniformly distributed. Finally, as in the simulated
case, the commitment R is uniquely determined by the challenge and the responses.
Computational Complexity:
          | Group exponentiation | Group composition | Field multiplication | Field randomization
Prover    | N | N-1 | I | I
Simulator | N+1 | N | 0 | I
Verifier  | N+1 | N | 0 | 0
Table 4
Communication Complexity:
[0257] I field elements and 1 group element.
Proving Knowledge of a Pedersen Commitment
[0258] Proving, verifying and simulating knowledge of a commitment X = g^x h^r is reduced to
Generalized Schnorr's with the following parameters:
Parameter | N | I | g_1 | g_2 | x_1 | x_2
Value     | 2 | 2 | g   | h   | x   | r
Table 5
-40 -
CA 3069582 2020-01-23

Computational Complexity:
          | Group exponentiation | Group composition | Field multiplication | Field randomization
Prover    | 2 | 1 | 2 | 2
Simulator | 3 | 2 | 0 | 2
Verifier  | 3 | 2 | 0 | 0
Table 6
Communication Complexity:
[0259] 2 field elements and 1 group element.
Soundness of the Pedersen Commitment Scheme:
[0260] If a prover can open a commitment in two ways g^x h^r = C = g^{x'} h^{r'}, we have
g^{x−x'} = h^{r'−r}. From the assumption that the prover does not know the relative discrete log of
g and h, we infer that x − x' = r' − r = 0.
Very Generalized Schnorr's
[0261] This is called an "Equality Map" in the Microsoft Specifications. The problem is to prove
knowledge of the discrete logarithm representations of a set of group elements (X_m) as
X_m = ∏_{n=1}^{N} g_n^{x_{mn}}.
[0262] Moreover, there is an "equality map" as common input, taking the index of an exponent
to that of a witness:
I: N_M × N_N → N_I ∪ {0}
x_{mn} = w_{I(m,n)}
w_0 = 0
[0263] The symbol I stands for the equality map as well as for the number of non-trivial
witnesses. It should be clear from context which is meant.
[0264] The prover performs the following:
1. For i ∈ N_I, sample r_i ← Z_|G|; r_0 = 0 by convention.
2. For m ∈ N_M, send R_m = ∏_{n=1}^{N} g_n^{r_{I(m,n)}}.
3. Receive challenge c ∈ Z_|G|.
4. For i ∈ N_I, send back z_i = c w_i + r_i; z_0 = 0 by convention.
The verifier accepts if and only if X_m^c R_m = ∏_{n=1}^{N} g_n^{z_{I(m,n)}} for all m ∈ N_M.
Special Soundness:
[0265] This follows naturally from the special soundness of Generalized
Schnorr's. Intuitively,
the relevant witnesses for each discrete log relation can be extracted
independently of the
others. Consistency is still guaranteed.
Special Zero-Knowledge:
[0266] Given group elements (X_m), challenge c and equality map I, the simulator samples
z_i ← Z_|G|, sets z_0 = 0, and for m ∈ N_M computes R_m = X_m^{-c} ∏_{n=1}^{N} g_n^{z_{I(m,n)}}.
[0267] Similarly to previous arguments, in both simulated and genuine cases, the set {z_i} is
uniform and independent of the challenge c, and the set {R_m} is uniquely determined given the
two. Therefore, those cases are identically distributed.
Computational Complexity:
[0268] Consider the set E = {(m, n) : I(m, n) ≠ 0}. One may say |E| ≤ MN.
[0269] Here we make an additional assumption that was not essential to the security analysis
but makes complexity analysis easier:
∀m ∈ N_M: ∃n: I(m, n) ≠ 0.
That is, none of the discrete log witness vectors are trivially zero.
-42 -
CA 3069582 2020-01-23

          | Group exponentiation | Group composition | Field multiplication | Field randomization
Prover    | |E| | |E|-M | I | I
Simulator | |E|+M | |E| | 0 | I
Verifier  | |E|+M | |E| | 0 | 0
Table 7
Communication Complexity:
[0270] I field elements and M group elements.
Commitment Equality Proof
[0271] In order to prove equality of two commitments X_1 = g^{x_1} h^{r_1} and X_2 = g^{x_2} h^{r_2}, it is
more efficient to devise a specially tailored proof scheme rather than using the generic equality map.
[0272] Both parties simply compute X_Δ = X_1 X_2^{-1} = g^{x_1−x_2} h^{r_1−r_2} = g^{x_Δ} h^{r_Δ}. The prover
proves knowledge of log_h X_Δ.
Special Soundness:
[0273] From the special soundness of Schnorr's it follows that the extractor can compute r'_Δ
such that X_Δ = h^{r'_Δ}. Assuming that the prover has knowledge of X_1 and X_2 as valid
commitments, it is able to compute g^{x_Δ} h^{r_Δ − r'_Δ} = 1. Because the prover
doesn't know log_g h, it must be that x_1 − x_2 = x_Δ = 0.
[0274] Note that this soundness guarantee is only valid if it is already
ascertained that the input
commitments are valid, which is not the case with the generic equality map.
Special Zero-Knowledge:
[0275] This follows directly from the zero-knowledge property of Schnorr's.
-43 -
CA 3069582 2020-01-23

Complexity:
[0276] The simulator computes X_Δ = X_1 X_2^{-1} and simulates a Schnorr's. This results in 2
exponentiations, 2 compositions and 1 randomization; the extra composition relative to plain
Schnorr's comes from computing X_Δ.
          | Group exponentiation | Group composition | Field multiplication | Field randomization
Prover    | 1 | 1 | 1 | 1
Simulator | 2 | 2 | 0 | 1
Verifier  | 2 | 2 | 0 | 0
Table 8
Commitment Non-zeroness Proof
[0277] In order to prove for a commitment X = g^x h^r that x ≠ 0, we only need to prove the
existence of a multiplicative inverse for x, namely g = X^{x^{-1}} h^{−r x^{-1}}. In other words, it suffices
to prove knowledge of a discrete log representation of g over the bases X and h.
Special Soundness:
g = X^a h^b = (g^x h^r)^a h^b = g^{xa} h^{ra+b} ⟹ g^{xa−1} = h^{−ra−b}
[0278] Since the prover doesn't know a nontrivial discrete log relation between the generators,
this guarantees that xa − 1 = 0. Therefore, a is the multiplicative inverse of x, guaranteeing that
x is nonzero.
Special Zero Knowledge:
[0279] This follows directly from the zero knowledge property of the knowledge
of commitment
protocol.
-44 -
CA 3069582 2020-01-23

Complexity:
[0280] To set up the discrete log proof, the prover needs to compute x^{-1} and −r x^{-1}, requiring
one field inversion and one multiplication only. The verifier and simulator need no additional
setup.
          | Group exponentiation | Group composition | Field inversion | Field multiplication | Field randomization
Prover    | 2 | 1 | 1 | 3 | 2
Simulator | 3 | 2 | 0 | 0 | 2
Verifier  | 3 | 2 | 0 | 0 | 0
Table 9
Inequality Proof - Known Value
[0281] Given X = g^x h^r and Y = g^y, with y common to both parties and x and r private to the
prover, one can prove x ≠ y by carrying out a non-zeroness proof on XY^{-1} = g^{x−y} h^r.
[0282] As a side effect, this also proves knowledge of x and r.
Special Soundness:
[0283] An extractor can obtain x − y and r, with the guarantee that x − y ≠ 0. From that, x can
also be determined.
Special Zero-Knowledge:
[0284] XY^{-1} is uniquely determined from the common inputs X and Y, and the subprotocol it is
handed to is special zero-knowledge; thus so is this protocol.
-45 -
CA 3069582 2020-01-23

Complexity:
          | Group exponentiation | Group composition | Field inversion | Field multiplication | Field randomization
Prover    | 2 (3) | 2 | 1 | 3 | 2
Simulator | 3 (4) | 3 | 0 | 0 | 2
Verifier  | 3 (4) | 3 | 0 | 0 | 0
Table 10
[0285] There may be an additional exponentiation required (the parenthesized counts) if a party
doesn't have Y = g^y cached.
Inequality Proof - Unknown Values
[0286] This works very similarly as with known values. Group elements X_1 and X_2 are common
input, and private to the prover are Z_|G| elements x_1, x_2, r_1, r_2 such that X_1 = g^{x_1} h^{r_1} and
X_2 = g^{x_2} h^{r_2}.
[0287] To prove that x_1 ≠ x_2, the parties both compute X_Δ = X_1 X_2^{-1}. The prover also computes
x_Δ = x_1 − x_2 and r_Δ = r_1 − r_2 and proves that x_Δ is not zero using the commitment non-zeroness
proof.
[0288] Note that this protocol doesn't prove knowledge of the witnesses x_1, x_2, r_1 and r_2.
Complexity:
Group Group Field Field Field
exponentiation composition Inversion Multiplication randomization
Prover 2 2 1 3 2
Simulator 3 3 0 0 2
Verifier 3 3 0 0 0
Table 11
-46 -
CA 3069582 2020-01-23

Bit Commitment Proof
[0289] Pedersen commitments to bits will play a role in future sections, when comparison of two
numbers is of concern. The commitment is of the form C = g^b h^r, where b ∈ {0,1} and r ∈ Z_|G|.
In order to prove that such a commitment is well-formed (that b is indeed a bit) we may have the
following:
b ∈ {0,1} ⟺ (b = 0 ∧ C = h^r) ∨ (b = 1 ∧ Cg^{-1} = h^r).
Therefore a disjunction of knowing the discrete log of either C or Cg^{-1} base h does the trick.
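A sketch of this disjunction, using the standard OR-proof technique (prove the branch matching the bit honestly and simulate the other) with a Fiat-Shamir challenge split, is given below; the toy group and the hash-based challenge are assumptions of the sketch.

import hashlib
import secrets

P, Q, g, h = 2039, 1019, 4, 9       # toy group parameters (insecure sizes)

def H(*parts):
    return int.from_bytes(hashlib.sha256("|".join(map(str, parts)).encode()).digest(), "big") % Q

b, r = 1, secrets.randbelow(Q)                 # the committed bit and its mask
C = (pow(g, b, P) * pow(h, r, P)) % P          # C = g^b h^r

# Statements: X0 = C is h^r (case b = 0), X1 = C*g^{-1} is h^r (case b = 1).
X = [C, (C * pow(g, -1, P)) % P]
true_br, fake_br = b, 1 - b

# Simulate the false branch: pick its challenge and response, back-compute R.
c_fake, z_fake = secrets.randbelow(Q), secrets.randbelow(Q)
R = [0, 0]
R[fake_br] = (pow(h, z_fake, P) * pow(X[fake_br], -c_fake, P)) % P
# Honest mask commitment for the true branch.
w = secrets.randbelow(Q)
R[true_br] = pow(h, w, P)

c = H(C, R[0], R[1])                           # overall Fiat-Shamir challenge
c_true = (c - c_fake) % Q
z_true = (w + c_true * r) % Q

chal, resp = [0, 0], [0, 0]
chal[true_br], resp[true_br] = c_true, z_true
chal[fake_br], resp[fake_br] = c_fake, z_fake

# Verifier: challenges must add up to the hash, and both branch equations must hold.
assert (chal[0] + chal[1]) % Q == H(C, R[0], R[1])
for i in range(2):
    assert pow(h, resp[i], P) == (R[i] * pow(X[i], chal[i], P)) % P
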
Complexity:
[0290] The prover and the verifier both do the additional work of computing Cg^{-1}; namely, an
additional group composition.
[0291] This results in 3 field randomizations, 3 group exponentiations and 1
composition for the
prover, as well as 4 group exponentiations and 3 compositions for the
verifier.
[0292] There is no additional communication cost.
          | Group exponentiation | Group composition | Field multiplication | Field randomization
Prover    | 3 | 2 | 1 | 3
Simulator | 4 | 3 | 0 | 3
Verifier  | 4 | 3 | 0 | 0
Table 12
[0293] It should be noted that the computational burden on the verifier is
less important,
because it is less likely to be implemented on a mobile device.
Bit Decomposition Proof
[0294] Certain properties of committed attributes cannot be naturally
expressed using algebraic
techniques so far covered in this document. Comparing two numbers is one such
example, as Z_|G| doesn't come with a notion of order. It can be done, however, by a circuit
operating on individual bits making up the operands. This means that we need to produce
commitments to
bits and to simulate binary circuits. However, both of these can be done
algebraically. This
section describes a method to do the former.
[0295] The common input is a commitment C = g^x h^r, with the prover having x and r as private
input.
[0296] To decompose x into an N-bit number, the prover generates the following:
(b_{N−1}, ..., b_0)_2 = x
r_1, ..., r_{N−1} ← Z_|G|
r_0 = r − Σ_{n=1}^{N−1} 2^n r_n
C_n = g^{b_n} h^{r_n}
[0297] This yields:
∏_{n=0}^{N−1} C_n^{2^n} = C
[0298] The prover sends C_1, ..., C_{N−1} to the verifier. Apart from that, the prover will need to
provide bit commitment proofs for C_0, ..., C_{N−1}.
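A small sketch of this decomposition and of the recombination relation of paragraph [0297] follows; the toy group parameters are assumptions of the sketch.

import secrets

P, Q, g, h = 2039, 1019, 4, 9
N = 8

x, r = 173, secrets.randbelow(Q)                       # committed value and its mask
C = (pow(g, x, P) * pow(h, r, P)) % P

bits = [(x >> n) & 1 for n in range(N)]                # b_0 .. b_{N-1}
masks = [0] + [secrets.randbelow(Q) for _ in range(1, N)]
masks[0] = (r - sum((1 << n) * masks[n] for n in range(1, N))) % Q   # r_0 absorbs the difference

Cn = [(pow(g, bits[n], P) * pow(h, masks[n], P)) % P for n in range(N)]

# Check the recombination relation: prod C_n^(2^n) == C.
acc = 1
for n in range(N):
    acc = (acc * pow(Cn[n], 1 << n, P)) % P
assert acc == C
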
Special Soundness:
[0299] Take two accepting transcripts (C_1, ..., C_{N−1}, R_0, ..., R_{N−1}, c, z_0, ..., z_{N−1}) and
(C_1, ..., C_{N−1}, R_0, ..., R_{N−1}, c', z_0', ..., z_{N−1}'). From the soundness of the conjunction proof
we deduce that all witnesses w_n can be extracted. It should be noted that the fact that the C_n's
are created by the prover does not affect the subprotocols' soundness.
Special Zero-Knowledge:
[0300] A simulator can randomize group elements (C_1, ..., C_{N−1}) and have sub-simulators
generate honest-looking sub-proofs for their validity as bit commitments.
-48 -
CA 3069582 2020-01-23

Complexity:
[0301] The prover randomizes N − 1 masks and infers r_0 from them. The computation is as
follows:
r_0 = r − Σ_{n=1}^{N−1} 2^n r_n = r − 2(r_1 + 2(r_2 + ···)).
[0302] This takes N − 1 field additions and N − 1 field doublings. In other words, it takes 2N − 2
field additions, about as expensive as a couple of multiplications, at most. In most cases though,
2^N is much smaller than the group size and therefore the cost of these additions can be
disregarded.
[0303] Similarly, computing C_1 to C_{N−1} takes N − 1 group exponentiations and N − 1 group
compositions.
[0304] In the same way as computing r_0, computing C_0 takes N − 1 group doublings and N − 1
group compositions. Alternatively, using the availability of r_0, C_0 can be computed directly as
C_0 = g^{b_0} h^{r_0}. This takes one group composition and one exponentiation. An exponentiation to
the power of a large number takes at least about log|G| − 1 compositions and log|G| − 1
doublings alone. Therefore direct computation of C_0 would always be less efficient.
[0305] This is followed by N independent bit commitment proofs.
[0306] The verifier takes C_1 to C_{N-1} and computes C_0, knowing C. That takes N − 1 group doublings and N − 1 group compositions. The rest of its work is to verify the individual bit commitment proofs.
             | Group exponentiation | Group composition | Field Multiplication | Field randomization
Prover       | 4N-1                 | 5N-3              | N                    | 4N-1
Simulator    | 5N-1                 | 5N-2              | 0                    | 4N-1
Verifier     | 4N                   | 5N-2              | 0                    | 0

Table 13
Bit Comparison Proof
[0307] Imagine an N-bit comparator design cascaded starting from the most
significant bit.
[0308] The output of the 1-bit comparator unit comparing the nth digits is a commitment of the form C_n = g^{c_n} h^{r_n}, where

c_n = 1 if (a_{N-1} ... a_n)_2 > (b_{N-1} ... b_n)_2
c_n = 0 if (a_{N-1} ... a_n)_2 = (b_{N-1} ... b_n)_2
c_n = −1 if (a_{N-1} ... a_n)_2 < (b_{N-1} ... b_n)_2

[0309] The carry in c_N is considered 0 and the carry out c_0 is the result.

[0310] C_{N-1} can be simply computed as A_{N-1}·B_{N-1}^{-1}.

[0311] All other values are computed as c_n = a_n − b_n if c_{n+1} = 0, and c_n = c_{n+1} otherwise. This is equivalent to saying

(c_{n+1} ≠ 0 ∧ c_n = c_{n+1}) ∨ (c_{n+1} = 0 ∧ c_n = a_n − b_n).
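Before committing to the c_n values, it may help to see the comparator recurrence on plaintext bits. The following sketch is illustrative only; it evaluates the cascade of paragraphs [0308] to [0311] and returns the carry out c_0.

# Plaintext sketch of the comparator cascade: c_N = 0; at each position n
# (from the most significant bit down) c_n = a_n - b_n if c_{n+1} = 0,
# otherwise c_n = c_{n+1}. The carry out c_0 is 1, 0 or -1.
def compare_bits(a, b, N):
    a_bits = [(a >> n) & 1 for n in range(N)]
    b_bits = [(b >> n) & 1 for n in range(N)]
    carry = 0                                  # c_N = 0
    for n in reversed(range(N)):               # cascade from bit N-1 down to 0
        carry = carry if carry != 0 else a_bits[n] - b_bits[n]
    return carry                               # c_0: the comparison result

assert compare_bits(13, 9, 4) == 1    # 13 > 9
assert compare_bits(9, 9, 4) == 0     # equal
assert compare_bits(5, 9, 4) == -1    # 5 < 9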
Proving and simulating c_{n+1} ≠ 0 ∧ c_n = c_{n+1}:
[0312] An honest proof would involve a proof of non-zeroness on C_{n+1} and a Schnorr proof on C_{n+1}·C_n^{-1}, showing it is of the form h^r (i.e., a commitment to zero).
[0313] A simulation would involve a simulation of both subcomponents. Each
contributes an
extra exponentiation.
             | Group exponentiation | Group composition | Field Inversion | Field Multiplication | Field randomization
Prover       | 3                    | 2                 | 1               | 4                    | 3
Simulator    | 5                    | 4                 | 0               | 0                    | 3
Verifier     | 5                    | 4                 | 0               | 0                    | 0

Table 14
Proving and simulating c_{n+1} = 0 ∧ c_n = a_n − b_n:
[0314] An honest proof involves a proof of zeroness on C_{n+1} and a proof of equality between A_n·B_n^{-1} and C_n.
[0315] A simulation likewise involves simulating the two.
             | Group exponentiation | Group composition | Field Multiplication | Field randomization
Prover       | 2                    | 2                 | 2                    | 2
Simulator    | 4                    | 4                 | 0                    | 2
Verifier     | 4                    | 4                 | 0                    | 0

Table 15
[0316] The most significant work difference between the c_{n+1} ≠ 0 and c_{n+1} = 0 cases is an extra field inversion and two field multiplications when c_{n+1} ≠ 0. A number of dummy operations may be necessary to close the timing gap.
Overall complexity:
[0317] When performing an honest proof, one of the predicates is proven and
the other is
simulated. When doing a simulation, they both are simulated. Disjunction
provers and
simulators both sample an additional challenge from the field.
             | Group exponentiation | Group composition | Field Multiplication | Field randomization
Prover       | 7                    | 6                 | 4                    | 6
Simulator    | 9                    | 8                 | 0                    | 6
Verifier     | 9                    | 8                 | 0                    | 0

Table 16
Whole Number Comparison – Using Comparator Circuits
[0318] Assuming decompositions are already available to both parties, the prover generates the bit comparison commitments C_{N-1}, ..., C_0 and proves their correctness. This takes N field randomizations, N exponentiations and N group compositions.
[0319] What follows is N − 1 comparators with carry-ins and a single group composition (providing C_{N-1}).
             | Group exponentiation | Group composition | Field Multiplication | Field randomization
Prover       | 8N-7                 | 7N-6              | 4(N-1)               | 7N-6
Simulator    | 10N-9                | 9N-8              | 0                    | 7N-6
Verifier     | 9(N-1)               | 8(N-1)            | 0                    | 0

Table 17
Whole Number Comparison – Using Signed Arithmetic (p's Complement)
[0320] Imagine a number q < p/2. For any two whole numbers 0 ≤ a, b < q, we have the following:

b ≤ a  ⟺  0 ≤ a − b < q
b > a  ⟺  −q < a − b < 0  ⟺  (a − b) mod p > p − q.
[0321] q can be as small as the prover feels comfortable, as it puts an upper
limit on the values
a and b.
[0322] This construction essentially reduces comparison between two arbitrary
numbers to a
comparison between an arbitrary one and one the prover gets to choose. Powers
of 2 are very
easy to compare against, when equipped with decomposition proofs, which
essentially enable
us to make statements such as "a is at most N bits long."
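The following small Python sketch (the modulus p = 2^255 − 19 is an assumed illustrative prime, not the patent's group order) shows the underlying plaintext arithmetic: with q < p/2, the value (a − b) mod p lands in [0, q) exactly when b ≤ a, and in (p − q, p) otherwise.

p = 2**255 - 19          # illustrative large prime (assumption for the example)
q = 2**64                # prover-chosen bound on a and b, with q < p/2

def leq(b, a):
    """Return True iff b <= a, given 0 <= a, b < q."""
    d = (a - b) % p
    return d < q

assert leq(5, 12) is True
assert leq(12, 5) is False
assert ((5 - 12) % p) > p - q   # the b > a case lands in (p - q, p)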
Complexity:
[0323] Assuming decompositions of both operands as common input, in order to imply they are at most N bits long, the prover only needs to decompose C = A·B^{-1}, again to prove it is at most N bits long. Thus, all that is needed is a group composition and a bit decomposition.
             | Group exponentiation | Group composition | Field Multiplication | Field randomization
Prover       | 4N-1                 | 5N-2              | N                    | 4N-1
Simulator    | 5N-1                 | 5N-1              | 0                    | 4N-1
Verifier     | 4N                   | 5N-1              | 0                    | 0

Table 18
[0324] As compared with the circuit-based method, this is considerably more efficient, as it requires about half as many exponentiations. However, it is more limited in that it cannot provide a commitment to the comparison result; it only proves the result directly.
Application to Identity Brokerage
[0325] Most of these problems start with Pedersen commitments to finitely many attributes, in the form A_i = g^{a_i} h^{r_i}. These may be provided by U-Prove or another Anonymous Credentials protocol. The prover and the verifier have {A_i} as common input, and the prover additionally has {a_i, r_i} as private input.
[0326] FIG. 15A illustrates, in a flowchart, an example of a method of
determining that a
geographic point is within a geographic area 1500, in accordance with some
embodiments. The
method 1500 comprises converting 1502 the geographic point into a planar coordinate (dx, dy),
determining 1504 a commitment of a geometric function of the planar
approximation,
determining and generating 1506 a proof that a result of the geometric
function of the planar
approximation is within parameters, sending 1508 commitment values and proofs
to a verifier,
and optionally receiving 1510 a confirmation that the result of the formula is
correct. The
geographic point may comprise a latitudinal coordinate x and a longitudinal coordinate y (x_lat, y_long). In some embodiments, after the geographic points are converted 1502
into respective
planar approximations, a proof portion is generated to prove that dx = r·cos(φ_0)·dθ and dy = r·dφ. The geometric function may comprise determining that the geographic
point is within at
least one of: a radius of another geographic point, a polygonal shape, a union
of convex
polygonal shapes, a union of shapes (including a combination of circles and
polygons). The
commitment values may comprise encrypted points and auxiliary values (e.g.,
areas, radii
squared, and other information used by sigma protocols). The commitment values
allow the
verifier to confirm that the geometric function was applied honestly to the
planar coordinate (as
further described below). In some embodiments, the proofs may include a proof
of well-
formedness of a commitment, and a proof of a condition on the result of the
geometric function
(e.g., distance squared less than radius squared, signed area a positive
value, etc.). The result
of the geometric function may be compared with desired parameters or
conditions such as, a
point is within a radius of an origin. Other desired parameters and/or
conditions may be used.
Other steps may be added to the method 1500.
[0327] FIG. 15B illustrates, in a flowchart, an example of a method of proving
that a first
geographic point is within a radius of another geographic point 1520, in
accordance with some
embodiments. In some embodiments, the first geographic point may be associated
with a
credential location of a user device, and the other geometric point may be
associated with the
origin of an area of interest defined by the origin and the radius. The method
1520 comprises
converting 1522 the geographic points into respective planar approximations
(dx, dy),
determining 1524 a commitment value of a distance squared between the planar
approximation
of the first geometric location and the planar approximation of the other
geometric location (e.g.,
origin point of region of interest), and generating 1526 a proof that the
distance squared
between the two points is less than the radius squared (i.e., the distance
squared between the
planar approximations of the two points is less than the radius squared). In
some embodiments,
after the geographic points are converted 1522 into respective planar
approximations, a proof
portion is generated to prove that dx = r·cos(φ_0)·dθ and dy = r·dφ. Determining a
commitment
value of a squared value (e.g., squaring a commitment) is described further
below. Other steps
may be added to the method 1520, including sending 1528 the commitment values
and proof to
a verifier.
Short Approximate Distance from a Known Point
[0328] Let the known point P_0 be φ_0 radians North and θ_0 radians East. For a close-by point P(φ, θ) = P_0 + (dφ, dθ), we may use the following planar approximation:

dy = r·dφ
dx = r·cos(φ_0)·dθ

[0329] The earth is not a perfect sphere, so the distance r from the center at P_0 should be looked up from a database trusted by the verifier and agreed upon by both parties. With r and φ_0 known, computing a commitment to (dx)^2 + (dy)^2 can be carried out as follows:
[0330] Notation: For an entity a, C_a denotes a commitment to it.

[0331] C_φ = g^φ h^{r_φ}, C_θ = g^θ h^{r_θ} is our starting point.

[0332] We can compute C_dφ = C_φ·g^{-φ_0} = g^{dφ} h^{r_φ} and C_dθ = C_θ·g^{-θ_0} = g^{dθ} h^{r_θ}.

[0333] Then we compute C_{cos(φ_0)·dθ} = (C_dθ)^{cos(φ_0)}·h^r for an arbitrary r. For non-integral values, as the cosine is expected to be, this operation is non-trivial and will be covered later in this report.

[0334] Then we compute the commitments C_{(cos(φ_0)·dθ)^2} and C_{(dφ)^2}. A commitment to (dl/r)^2 = (cos(φ_0)·dθ)^2 + (dφ)^2 is then computed as C_{(cos(φ_0)·dθ)^2}·C_{(dφ)^2}, and a commitment to the distance squared follows by raising this to the power r^2. This can be done easily since r is expected to be an integer.
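A plaintext Python sketch of this planar approximation follows. The function name, the earth-radius value and the example coordinates are assumptions for illustration; in the protocol, the corresponding quantities are committed rather than computed in the clear.

import math

def within_radius(phi, theta, phi0, theta0, r_earth_km, radius_km):
    """Map a nearby point to (dx, dy) around (phi0, theta0) and test a radius."""
    dphi, dtheta = phi - phi0, theta - theta0        # radians
    dy = r_earth_km * dphi                           # dy = r * dphi
    dx = r_earth_km * math.cos(phi0) * dtheta        # dx = r * cos(phi0) * dtheta
    return dx * dx + dy * dy <= radius_km ** 2       # compare squared distance

# Example: a point roughly 11 km north-east of a reference point, 32 km radius.
phi0, theta0 = math.radians(43.65), math.radians(-79.38)
phi, theta = math.radians(43.72), math.radians(-79.28)
print(within_radius(phi, theta, phi0, theta0, 6371.0, 32.0))   # True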
Representing Fractions
[0335] A natural idea for representing the fraction m/n in the field of exponents would be to simply use m·n^{-1} in the field. Unfortunately this enables provers to pull off all manner of cheating, as for every m, n and m′, there is an n′ such that m/n = m′/n′ in the field; m/n and m′/n′ need not be equal or even close in Q.

[0336] As an example of how dramatically this can go wrong, consider the integer 1000. It can be represented as 1000_F in the field. On the other hand, consider the field inverse of 1000, (1000_F)^{-1} = n_F. 1000_F would also be a representation of 1/n, even though 1000 and 1/n are orders of magnitude apart.
[0337] Another natural idea is to choose a fixed point representation.
Commitments can be
added as long as their point position matches (even if it does not, one may be
extended or
truncated to match the other).
[0338] Any fixed-point commitment can be multiplied by a commitment to another. The result will have its point at the sum of the others'. When this happens, we may need to truncate the result to normalize it.
Normalizing a fixed-point number
[0339] Let x be an N-digit number with its point at position n, and suppose we want to normalize it into x′ with an n′ < n bit fractional part. x can be decomposed into N bit commitments, and the rightmost n − n′ of them discarded. x′ can be built back up from the remaining bit commitments.
Squaring a commitment
[0340] In order to prove the validity of a commitment to x^2 while having agreed upon a commitment to x, C_x = g^x h^m, we first rewrite C_{x^2} = g^{x^2} h^{m''} = C_x^x h^{m'}, knowing m'' = m' + x·m, and then run the following Σ-protocol:

1. Randomize field elements r_1, r_2 and r_3 and ship commitments R_1 = g^{r_1} h^{r_2} and R_2 = C_x^{r_1} h^{r_3}.
2. Receive challenge c.
3. Send z_1 = c·x + r_1, z_2 = c·m + r_2, z_3 = c·m' + r_3.
4. Verifier checks that g^{z_1} h^{z_2} = C_x^c R_1 and C_x^{z_1} h^{z_3} = C_{x^2}^c R_2.
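A runnable sketch of this Σ-protocol is given below, with a Fiat-Shamir challenge standing in for the verifier's message. The toy group parameters (P = 167, g = 4, h = 9) and the use of SHA-256 are assumptions for illustration only, not the patent's instantiation.

import random, hashlib

P, Q, g, h = 167, 83, 4, 9     # toy safe-prime subgroup: P = 2*Q + 1

def commit(v, m):
    """Pedersen commitment g^v * h^m mod P."""
    return (pow(g, v % Q, P) * pow(h, m % Q, P)) % P

x, m = 17, random.randrange(Q)
m_prime = random.randrange(Q)
Cx = commit(x, m)
Cx2 = (pow(Cx, x, P) * pow(h, m_prime, P)) % P      # C_{x^2} = C_x^x * h^{m'}

# Step 1: randomize r1, r2, r3 and ship R1, R2.
r1, r2, r3 = (random.randrange(Q) for _ in range(3))
R1 = commit(r1, r2)                                 # g^{r1} h^{r2}
R2 = (pow(Cx, r1, P) * pow(h, r3, P)) % P           # C_x^{r1} h^{r3}

# Step 2: challenge derived as a hash of the transcript so far.
c = int.from_bytes(hashlib.sha256(str((Cx, Cx2, R1, R2)).encode()).digest(), "big") % Q

# Step 3: responses.
z1 = (c * x + r1) % Q
z2 = (c * m + r2) % Q
z3 = (c * m_prime + r3) % Q

# Step 4: verifier's checks.
assert commit(z1, z2) == (pow(Cx, c, P) * R1) % P
assert (pow(Cx, z1, P) * pow(h, z3, P)) % P == (pow(Cx2, c, P) * R2) % P
print("squaring proof verified")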
Multiplying commitments
[0341] The case of proving the validity of a commitment to x·y from those to x and y is only slightly more complex.

C_x = g^x h^{m_x}, C_y = g^y h^{m_y}, C_{xy} = g^{xy} h^{m_{xy}}

[0342] The goal here is to prove C_{xy} = C_x^y h^{m'_{xy}} for m'_{xy} = m_{xy} − y·m_x.

1. Randomize field elements r_1, r_2 and r_3. Ship commitments R_y = g^{r_1} h^{r_3} and R_{xy} = C_x^{r_1} h^{r_2}.
2. Receive challenge c. In some embodiments, the challenge c can be received from the verifier or determined via a hash computed by the prover using the randomized field element values and randomized input from the verifier.
3. Send z_1 = c·y + r_1, z_2 = c·m'_{xy} + r_2 and z_3 = c·m_y + r_3.
4. Verifier checks that g^{z_1} h^{z_3} = C_y^c R_y and C_x^{z_1} h^{z_2} = C_{xy}^c R_{xy}.
The proofs of security are trivial and are therefore omitted.
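The same toy setting as the squaring sketch above illustrates the multiplication proof; again, the parameters and the hash-based challenge are assumptions, not the patent's instantiation.

import random, hashlib

P, Q, g, h = 167, 83, 4, 9     # toy safe-prime subgroup, as in the squaring sketch

def commit(v, m):
    return (pow(g, v % Q, P) * pow(h, m % Q, P)) % P

x, y = 7, 11
mx, my = random.randrange(Q), random.randrange(Q)
Cx, Cy = commit(x, mx), commit(y, my)
m_prime = random.randrange(Q)                       # m'_{xy}
Cxy = (pow(Cx, y, P) * pow(h, m_prime, P)) % P      # C_xy = C_x^y * h^{m'}

r1, r2, r3 = (random.randrange(Q) for _ in range(3))
Ry = commit(r1, r3)                                 # R_y  = g^{r1} h^{r3}
Rxy = (pow(Cx, r1, P) * pow(h, r2, P)) % P          # R_xy = C_x^{r1} h^{r2}
c = int.from_bytes(hashlib.sha256(str((Cx, Cy, Cxy, Ry, Rxy)).encode()).digest(), "big") % Q

z1 = (c * y + r1) % Q
z2 = (c * m_prime + r2) % Q
z3 = (c * my + r3) % Q

assert commit(z1, z3) == (pow(Cy, c, P) * Ry) % P                          # g^{z1} h^{z3} = C_y^c R_y
assert (pow(Cx, z1, P) * pow(h, z2, P)) % P == (pow(Cxy, c, P) * Rxy) % P  # C_x^{z1} h^{z2} = C_xy^c R_xy
print("multiplication proof verified")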
Back to Coordinates
From Requirements to Parameters
[0343] Relations will now be derived for estimating the number of binary
points required to
represent each quantity. To this end, one must derive the maximum amount that
is to be
represented as well as the largest admissible rounding error.
[0344] We assume the maximum earth radius to be given. In fact, it's less than 6400 km. We denote it as r_max. Representation rounding error will be denoted as ε_r.
[0345] The maximum distance in the application and its acceptable estimation error are dl_max and ε_l, respectively.
[0346] For convenience, we mandate ε_x = ε_y and dx_max = dy_max.
[0347] dl^2 = dx^2 + dy^2. Thus dl is increasing with respect to both dx and dy. Therefore, to obtain boundaries that guarantee dl is within bound, we must plug in dx_max and dy_max. From that we clearly obtain dx_max = dy_max = dl_max / √2.

[0348] On the other hand, approximating the errors as first order differentials we have:

2·dl·d²l = 2·dx·d²x + 2·dy·d²y  ⟹  d²l = d²x·cos α + d²y·sin α for some α.

By the same argument as above, with d²l being increasing with respect to d²x and d²y, we can plug in all the maximum values in the equation:

ε_l = max_α (ε_x·cos α + ε_y·sin α) = ε_x·√2 if ε_x = ε_y  ⟹  ε_x = ε_y = ε_l / √2

[0349] We also get dφ_max = dy_max / r_max and dθ_max = dx_max / r_max.

d²y = dr·dφ + r·d²φ  ⟹  ε_y = ε_r·dφ_max + r_max·ε_φ  ⟹  ε_φ = (ε_y − ε_r·dφ_max) / r_max

d²x = dr·cos(φ_0)·dθ + r·d(cos φ_0)·dθ + r·cos(φ_0)·d²θ  ⟹  ε_x = ε_r·dθ_max + r_max·ε_{cos φ_0}·dθ_max + r_max·ε_θ

ε_{cos φ_0} = (ε_x − ε_r·dθ_max − r_max·ε_θ) / (r_max·dθ_max)   and   ε_θ = (ε_x − ε_r·dθ_max − r_max·ε_{cos φ_0}·dθ_max) / r_max

[0350] Having been given either ε_{cos φ_0} or ε_θ we can compute the other.
An Instantiation
[0351] Plugging r_max = 6400 km, ε_r = 1 km, dl_max = 32 km, ε_l = 125 m and ε_{cos φ_0} = 2^{-13} into the formulas above yields the following:

          | Maximum          | Point position | Bit size
dx        | 22.63 km         | 4              | 9
dφ        | 0.00353554 rad   | 17             | 8
dθ        | 0.00353554 rad   | 17             | 8
cos φ_0   | 1                | 10             | 11

Table 19
[0352] The system described above may be used to enable a prover to convince a
verifier that
it has a commitment to a point inside a verifier-determined convex polygon.
Its applications
include proofs of location in cities and gerrymandered areas. In the proof of
location applications
one may approximate many shapes, even concave ones, as a union of a number of
convex
shapes.
[0353] A point P is inside or on a convex shape A_0...A_{n-1}, ordered counter-clockwise, if and only if all the angles A_i P A_{(i+1) mod n} are also counter-clockwise. This is evidenced by the fact that the areas of the triangles made by these line sections always add up to the original shape's area. However, the absolute values of these areas would add up to more than the shape's area if P is outside it. In that case, therefore, some of the areas must have conflicting signs.
[0354] The signed area of the triangle A_i P A_{i+1} is the following:

S(A_i P A_{i+1}) = ½ (x_i·y_{i+1} − x_{i+1}·y_i + x_P·(y_i − y_{i+1}) + y_P·(x_{i+1} − x_i))
[0355] The prover only needs to demonstrate that all such areas are positive.
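Before the committed version, the plaintext test is straightforward. The sketch below (illustrative only; the helper names are assumptions) applies the signed-area formula of paragraph [0354] to every edge of a counter-clockwise convex polygon.

def signed_area(Ai, Aj, P):
    (xi, yi), (xj, yj), (xp, yp) = Ai, Aj, P
    # S(A_i P A_{i+1}) per paragraph [0354]
    return 0.5 * (xi * yj - xj * yi + xp * (yi - yj) + yp * (xj - xi))

def inside_convex(polygon_ccw, P):
    """True iff every signed area is non-negative (P inside or on the shape)."""
    n = len(polygon_ccw)
    return all(
        signed_area(polygon_ccw[i], polygon_ccw[(i + 1) % n], P) >= 0
        for i in range(n)
    )

square = [(0, 0), (2, 0), (2, 2), (0, 2)]      # counter-clockwise vertices
assert inside_convex(square, (1, 1)) is True
assert inside_convex(square, (3, 1)) is False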
[0356] FIG. 15C illustrates, in a flowchart, an example of a method of proving
that a first
geographic point is within a polygon shape having vertices defining an area
1530, in accordance
with some embodiments. In some embodiments, the first geographic point may be
associated
with a credential location of a user device, and the other geometric point may
be associated with
the origin of an area of interest defined by the origin and the radius. The
method 1530
comprises converting 1532 the first geographic point and geographic points
associated with
each endpoint of each vertex of the polygon into respective planar
approximations, determining
1534 a commitment value of a signed area of each triangle defined by A_i P A_{i+1} (where A_i and A_{i+1} are the planar approximations of the endpoints of an edge of the polygon shape, P
is the planar
approximation of the particular geographic location associated with the
portable client device,
and i is an index value representing a vertex of the polygon), and generating
1536 a proof that
the signed area of each triangle defined by A_i P A_{i+1} is a positive value. In
some embodiments,
after the geographic points are converted 1532 into respective planar
approximations, a proof
portion is generated to prove that dx = r·cos(φ_0)·dθ and dy = r·dφ. Other steps
may be added to
the method 1530, including sending 1538 the commitment values and proof to a
verifier. The
commitment values and proof allow the verifier to verify that all areas were
constructed correctly
and proofs determined correctly.
Using Cryptographic Commitments
[0357] Before the protocol is executed, the prover and the verifier have
agreed on a clockwise
ordering of the polygon vertices. The point P, to be proven to be inside the
polygon, is given by
commitments to its coordinates C_Px = g^{P_x} h^{m_Px} and C_Py = g^{P_y} h^{m_Py}. Because there are no products of two unknown entities, given C_Px and C_Py the verifier can compute commitments to all 2^{f+1}·S(A_i P A_{i+1}), where f is the size of the fractional component used, with no input from the prover.
[0358] The prover then can show that S(A_i P A_{i+1}) can be expressed using f+1+n_s bits, where n_s is the number of bits used to express the area of the entire shape. The size of the group is to be no less than f+3+n_s. It should be understood that any other way of proving positivity may be used.
Precision and Parameter Tuning
[0359] Error can be defined as the error in estimating the area of a triangle
when it is close to
zero. This definition is justified by the fact that the only predicate of
interest on an area is
whether it's positive. This translates to both a completeness and soundness
error.
[0360] In some embodiments, the error of representing each known coordinate may be denoted as ε_A, and that of each of P's coordinates denoted as ε_P. The maximum absolute value of any of A's coordinates may be called |x|_max and |y|_max. The prover initially shows that |x_P| < |x|_max and |y_P| < |y|_max, since there is no guarantee that the following bounds will continue to hold as P's coordinates grow.
[0361] Without loss of generality, it may be assumed that |x|_max ≥ |y|_max. Applying these into the triangle area equation provides 2·ε_A + ε_P = ε_S / (2·|x|_max). This is a nice result because it shows that more is to be gained from making the shape coordinates more precise than there is to be gained from the unknown point. Relaxing precision requirements on the unknown points will provide efficiency gains.
[0362] FIG. 15D illustrates, in a flowchart, an example of a method of proving
that a first
geographic point is within a union of shapes defining an area 1540, in
accordance with some
embodiments. In some embodiments, the first geographic point may be associated
with a
credential location of a user device. Each shape comprises a circle having a radius, or a polygon having vertices, defining an area. The method 1540 comprises
converting 1542 the first
geographic point into planar approximations, generating 1544 an honest proof
for at least one
shape that contains the geographic point, and generating 1546 a simulated
proof for all other
shapes in the union of shapes. Other steps may be added to the method 1540,
including
sending 1548 the commitment values, honest proof(s) and simulated proof(s) to
a verifier. The
commitment values and proofs allow the verifier to verify that all areas were
constructed
correctly and proofs determined correctly.
Choosing the Origin
Unknown point P
[0363] An advantage of this choice is that the terms x_P·(y_i − y_{i+1}) and y_P·(x_{i+1} − x_i) are always zero and do not need to be computed. However, x_i·y_{i+1} and x_{i+1}·y_i are to be computed as products of two unknown values. This is computationally intensive and is therefore not recommended.
Known point on or inside the shape
[0364] Possible choices include one of the vertices, min and max averages of
the coordinates
and the mean of the vertices. In one embodiment the min and max averages are
selected, since
in that case, the term |x|_max would be minimized, allowing for larger errors ε_A and ε_P.
[0365] It is understood that the above concepts of proving that a commitment
is within a
distance from a point (i.e., within a circle), or within a union of convex
polygons (i.e., within a
gerrymandered shape) may be applied to other shapes, including a union of one
or more circles
and one or more polygons.
Simulation of Location-Based Zero-Knowledge Proofs
[0366] A simulation of a cryptographic protocol is the process of generating a
transcript that is
indistinguishable from an honestly generated one, but is generated by the same
party.
Theoretically, the concept is useful in proofs of security. In practice, it is
used to prove
disjunctions of statements by allowing the prover to "cheat" on all statements
but one of their
choosing.
[0367] All steps of a simulated statement are simulated with the same
challenge. Location-
related protocols described herein comprise the following steps which would be
simulated as
follows:
Multiplication by a known factor
[0368] To simulate a "multiplication by a known" proof, recall that an honest proof of b = c·a involves proving that C_b = g^b h^{r_b} encodes the same entity as C_a^c. Then the proof that establishes that equality must be simulated. Simulating equality proofs is described infra.
Squaring a commitment
[0369] The following simulator can make it look like a^2 = 0 regardless of a's value.

[0370] Given the challenge c and a commitment C_a = g^a h^m, the simulator generates a commitment to a^2 for an arbitrary value of a^2 by performing the following:

1. Randomize z_1, z_2, z_3, m'.
2. Compute the commitment C_{a^2} = g^{a^2} h^{m'}.
3. Compute R_1 = g^{z_1} h^{z_2} C_a^{-c} and R_2 = C_a^{z_1} C_{a^2}^{-c} h^{z_3}.
4. Ship z_1, z_2, z_3, C_{a^2}, R_1, R_2 to the verifier.

[0371] Step 3 can be optimized by taking a^2 = 0 so that R_2 = C_a^{z_1} h^{-c·m' + z_3}.
Decomposing a commitment
[0372] The following simulation process can generate false bit decompositions:
[0373] When decomposing C_a = g^a h^m into n digits, all the commitments C_i = g^{a_i} h^{m_i} for i ≠ 0 are to arbitrary values a_i, and the least significant bit commitment is C_0 = g^{a_0} h^{m_0} where m_0 = m − Σ_{i=1}^{n-1} 2^i m_i, a_0 = a − Σ_{i=1}^{n-1} 2^i a_i, the a_i are arbitrary for i ≠ 0 and the m_i are random for i ≠ 0. The proofs of well-formedness for all bits are simulated. Recall that a proof of well-formedness for a bit commitment C_i = g^{a_i} h^{m_i} was a disjunction of the statements a_i = 0 and a_i = 1. In this case, given the challenge c, the simulator randomizes the challenge c′ and feeds the sub-simulators the sub-challenges c′ and c − c′.

[0374] The process can be optimized by taking a_i = 0 for i ≠ 0 and a_0 = a.
Application to location-based proofs
[0375] In order to simulate position in a circle of radius r, the simulator can:

1. Generate commitments to dx, dy and simulate proofs that dx = r·cos(φ_0)·dθ and dy = r·dφ.
2. Generate commitments to dx^2, dy^2 and simulate proofs that they were correctly computed.
3. Generate a commitment to δ = r^2 − dx^2 − dy^2 (simply through group arithmetic).
4. Simulate a decomposition of δ to show it is a positive signed number.
[0376] FIG. 15E illustrates, in a flowchart, a method of simulating a proof
that a particular
geographic location is within an area defined by an origin and a radius 1550,
in accordance with
some embodiments. The method 1550 comprises generating 1552 commitments to
planar
approximations dx and dy of the particular geographic location, generating
1554 a first
simulated proof portion that dx = rcos(pode and dy rdv, generating 1556
commitments to
squared planar approximations dx2 and dy2 of the particular geographic
location, generating
1558 a second simulated proof portion that dx2 and dy2 were correctly
computed, and
generating 1660 a commitment to 5 = r2 ¨ dx2 ¨ dy2; and generate 1662 a third
simulated
proof that a decomposition of 5 is a positive value. Other steps may be added
to the method
1550.
[0377] The following can simulate a proof of being contained in a polygon:
1. Generate commitments to dx, dy and simulate proofs that dx = r·cos(φ_0)·dθ and dy = r·dφ.
2. Generate commitments to values S(A_i P A_{i+1}) through multiplication proof simulations and group arithmetic and simulate proofs that they were formed correctly.
3. Generate simulated proofs that all the S(A_i P A_{i+1}) are positive.
[0378] FIG. 15F illustrates, in a flowchart, a method of simulating a proof
that a particular
geographic location is within an area defined by a polygon shape having
vertices 1570, in
accordance with some embodiments. The method 1570 comprises generating 1572
commitments to planar approximations dx and dy of the particular geographic
location,
generating 1574 a first simulated proof portion that dx = r·cos(φ_0)·dθ and dy = r·dφ, generating 1576 commitments to signed values S(A_i P A_{i+1}), generating 1578 a second
simulated proof
portion that the signed values were formed correctly, and generating 1580
simulated proof
portions that all the signed values S(A_i P A_{i+1}) are positive values. Other
steps may be added to
the method 1570.
Sharing of values
[0379] In some embodiments, where the shapes in question are geographically
close, the same
planar approximation may be valid for all shapes defining areas that are
geographically close. In
this case, step 1 can be performed once for all of the shapes with an honest
proof and the same
committed values for dx and dy can be used for each shape.
SafeShare Application
[0380] FIGS. 16-21 are screenshots of an example web application illustrating
a sample prover
computing device user experience, according to some embodiments. In this
example, a token is
a signed token from a semi-honest issuer attesting to its carrier's having a
certain attribute, and
an attribute is any logically simple claim one can make about oneself.
[0381] FIG. 16 is a screenshot of an example credential manager, according to
some
embodiments. FIG. 17 is a screenshot of an example credential manager having
an expanded
portion to view additional details of the credential, according to some
embodiments.
[0382] FIG. 18 is a screenshot of an example credential manager showing an
expanded
credential view of a single credential, according to some embodiments.
[0383] FIG. 19 is a screenshot of an example page requiring verification,
according to some
embodiments.
[0384] In this example, the prover has navigated to a page for a skydiving
program whereby the
user must prove that they are at least 18 years of age, and that they have
been recently
declared healthy. There is also a student deal, where if the prover wishes to
access the student
deal, they must also prove that they are currently enrolled at a university.
It is understood that
other criteria may be included, such as location or residency within a certain
geographic area.
[0385] FIG. 20 is a screenshot of an example proof request interface page,
according to some
embodiments. The verifier (e.g., a vendor) sends a proof request, and the user
is redirected to
the user interface. The proof request is written in a formal language;
therefore, the natural
language description also comes from the vendor. The user has the option to
view the proof
request code and vote on whether it matched the natural language description.
It is understood
that other criteria may be included, such as location or residency within a
certain geographic
area.
[0386] FIG. 21 is a screenshot of an example proof input interface page,
according to some
embodiments. After agreeing to the policy, the interface includes visual
elements that the prover
may select to choose which of their credentials to use as input to the proof
generation algorithm.
These credentials will not be shared with the verifier, but it will be proven
in zero knowledge that
they satisfy the policy.
Glossary
[0387] Issuer / Identity Provider is a party trusted by the prover and the
verifier, that attests to
the prover's attributes.
[0388] Prover / Client the party in contact with the issuer as well as the
verifier, attempting to
prove properties in zero knowledge.
[0389] Verifier the party demanding knowledge of certain properties about the
prover.
[0390] Attribute a proposition about the prover to which the issuer attests.
[0391] Property a proposition about the prover whose truth the verifier wants
to ascertain. It
may be identical to one of the prover's attributes or logically implied by one
or many of them.
[0392] Proof Request a formalized message from the verifier specifying the
property of which it
demands knowledge, containing additional information about the protocols
supported, etc.
[0393] Proof a message from the client to the verifier, providing
cryptographic assurance as to
the issuer's attestation to the client's satisfaction of the property
specified in the proof request.
Templates for Messages
[0394] The protocol messages can be communicated as JSON-encoded objects.
Proof Request
[0395] What follows is a sample proof request.
{
  "proof_request":{
    "lang":"straight-talk-1.0",
    "url":"https://bank.com/straight-talk/sample-scripts/alcohol-consumption.stk",
    "verifier_args":{
      "expiration_threshold":{
        "genesis":"01/01/2018",
        "apocalypse":"01/01/2100",
        "granularity":"day",
        "since_genesis":1000
      },
      "birth_threshold":{
        "genesis":"01/1900",
        "apocalypse":"01/1998",
        "granularity":"day",
        "till_apocalypse":195
      }
    },
    "permitted_issuers":["Bank","Florida DMV"],
    "supported_schemes":[
      {"scheme":"u-prove","dlog_params":{"group":"prime256v1","generators":"base64 hash of generators, or list of generators"}},
      {"scheme":"delegated-verification","remote_attestation":{"offline_accepted":true,"later_than":"01/01/2017"}}
    ],
    "ca_certs":[{"name":"bank-identity-brokerage-ca","digest":"base64 sha2"}],
    "challenge_string":"base64 byte string"
  }
}
The script "alcohol-consumption.stk" can contain the following:
verifier_params
{
    date_t expiration_threshold;
    date_t birth_threshold;
}
credential
{
    date_t expiration_date;
    date_t birth_date;
    return expiration_date > expiration_threshold && birth_date < birth_threshold;
}
[0396] Instead of providing a URL, the verifier can provide the source code or
bytecode to a
straight talk script.
[0397] A standard bytecode language for straight talk shall also be provided.
Proof
[0398] The following can be a response to the request above:

{
  "proof":{
    "scheme":"u-prove-1.0",
    "dlog_params":{"group":"prime256v1","generators":"base64 hash"},
    "credential_metadata":{
      "issuer":{"name":"BANK","cert":"base64 cert"},
      "attribute_indexes":{
        "expiration_date":8,
        "birth_date":1
      },
      "attribute_count":5
    },
    "presentation_proof":{
      "configuration":"cchhh",
      "uprove stuff":"things"
    },
    "standalone_comparison_proof":{
      "lhs":"expiration_date",
      "relation":"greater",
      "rhs":"06/20/1997",
      "relation_commitment_mask":"base64 bignum",
      "decomposition":{
        "commitment_in":"lhs",
        "bits_little_endian":[
          {"commitment":"base64 group element","well_formedness":"..."},
          {"commitment":"base64 group element","well_formedness":"..."}
        ]
      },
      "comparison":{
        "in":"lhs",
        "bit_comparisons":[
          {}
        ]
      }
    },
    "request_digest":"base64 hash"
  }
}
[0399] Another example proof, based on secure enclaves, is provided below:

{
  "proof":{
    "scheme":"delegated-verification",
    "remote_attestation_transcript":{
      "base_randomness":"base64 byte array",
      "msg1":{},
      "msg2":{},
      "msg3":"",
      "ias_signature":""
    },
    "request_digest":"base64 hash",
    "verifier_public_key":"pem encoding",
    "validation_signature":"base64 signature"
  }
}
[0400] FIG. 22 is a schematic diagram of a computing device 2000 such as a
server. As
depicted, the computing device includes at least one processor 2002, memory
2004, at least
one I/O interface 2006, and at least one network interface 2008.
[0401] Processor 2002 may be an Intel or AMD x86 or x64, PowerPC, ARM
processor, or the
like. Memory 2004 may include a combination of computer memory that is located
either
internally or externally such as, for example, random-access memory (RAM),
read-only memory
(ROM), compact disc read-only memory (CDROM).
[0402] Each I/O interface 2006 enables computing device 2000 to interconnect
with one or
more input devices, such as a keyboard, mouse, camera, touch screen and a
microphone, or
with one or more output devices such as a display screen and a speaker.
[0403] Each network interface 2008 enables computing device 2000 to
communicate with other
components, to exchange data with other components, to access and connect to
network
resources, to serve applications, and perform other computing applications by
connecting to a
network (or multiple networks) capable of carrying data including the
Internet, Ethernet, plain old
telephone service (POTS) line, public switch telephone network (PSTN),
integrated services
digital network (ISDN), digital subscriber line (DSL), coaxial cable, fiber
optics, satellite, mobile,
wireless (e.g. Wi-Fi, WiMAX), SS7 signaling network, fixed line, local area
network, wide area
network, and others.
[0404] Computing device 2000 is operable to register and authenticate users
(using a login,
unique identifier, and password for example) prior to providing access to
applications, a local
network, network resources, other networks and network security devices.
Computing devices
2000 may serve one user or multiple users.
[0405] FIG. 23 is a server for inclusion in a data center, according to some
embodiments. The
server is an illustration of a special purpose machine 2102, according to some
embodiments
that may reside at data center. The special purpose machine 2102, for example,
incorporates
the features of the verification system and is provided in a portable
computing mechanism that,
for example, may be placed into a data center as a rack server or rack server
component that
interoperates and interconnects with other devices, for example, across a
network or a message
bus. For example, the server may be configured to provide one or more secure
data storage
spaces, or may include one or more secure enclave processors, among others.
[0406] FIG. 24 is a system diagram for an example verification system,
according to some
embodiments.
[0407] There may be multiple issuer entities, each having their own set of
associated computing
devices (e.g., computers in data centers). The issuer entities and their
computing devices may
have heterogeneous data storage mechanisms, which can include local storage
and/or specially
allocated memory storages, including secured enclave memory and/or processors.
[0408] As illustrated in the example of FIG. 24, issuer devices 2202 are
coupled to a client
secured data storage. The issuer device 2204 may be coupled to a blockchain
data structure
backend or a distributed ledger mechanism that can be accessed and/or
otherwise interacted
with (e.g., pushing new information to be stored on the blockchain, querying
the blockchain
using a blockchain explorer), among others.
[0409] The blockchain data structure, in some embodiments, is a public
blockchain (e.g.,
publicly accessible, such as an Ethereum blockchain), or a private,
permissioned blockchain
that operates through a propagated distributed ledger shared across a set of
trusted nodes. The
blockchain data structure may store immutable attributes (and/or encrypted or
signed versions
of the attributes).
[0410] Multiple authoritative issuer devices 2202, 2204 are configured to
provide signed
attributes which may either be delivered to the client special space 2206
(e.g., a secured
enclave having secured memory and/or secured processors) or onto the
blockchain data
structure 2208. These attributes represent aspects of client personal
information. The client
special space 2206 can store some signed attributes and the proof response
logic, and can be
stored in a data storage remote from the on-board memory of devices associated
with the client.
A benefit of having a client special space 2206 is that, for example, multiple
client devices 2210
are able to connect to the client special space 2206 (e.g., a tablet, a mobile
device, a desktop
computer), and if a client loses a device, the signed attributes remain
accessible.
[0411] The blockchain data structure 2208 is adapted to store on one or more
distributed
ledgers, data blocks representing signed attributes and the proof response
logic, according to
some embodiments. Similarly, multiple client devices 2210 are able to connect
to the blockchain
data structure 2208 (e.g., a tablet, a mobile device, a desktop computer), and
if a client loses a
device 2210, the signed attributes remain accessible.
[0412] Proof requests and responses can be conducted using, for example,
connected
electronic client devices 2210 (e.g., a mobile device, such as a smartphone or
a tablet) or other
devices that are connected to the mobile device using Bluetooth low energy
(e.g., a wearable
device). The client devices 2210 may store locally a public key which can be
utilized to encrypt
data messages for decryption using the corresponding secret key or validate
signed attributes
that are signed based on the corresponding secret key.
[0413] FIG. 25 is a system diagram depicting a method for registration,
according to some
embodiments.
[0414] At step 1 (shown by the encircled numbers in FIG. 25), a data
communication occurs
between the client device to the client special space data storage, whereby
the client device
generates control messages that authenticates and establishes the special
space data storage,
which, in an example, is a SGX Secure Enclave or other hardware protected
space.
[0415] A public/private key pair is generated, and the client device retains
the public key (Pk) in
data storage, while the private/secret key (Sk) is stored on the client
special space data storage.
[0416] At step 2, a data communication occurs between the client device to the
issuer device.
The client authenticates (e.g., 2 way authentication) in relation to the
issuer device, selecting
which attributes he/she wants to provide (e.g., Age, zip, over 21, etc.) from
a set of offered
selections. In some embodiments, the issuer device may provide one or more
interface options
indicative of potential token destinations (e.g., storage destinations),
including the client special
space data storage or a blockchain-based data storage, such as a permissioned
blockchain or a
public blockchain.
[0417] The issuer is configured to then deliver attributes, for example, in a
client secret space
data structure, the client's device can provide a pointer (e.g., a uniform
resource locator address
(URL)) to a special space data structure, and can transmit the public key (Pk)
to the issuer
device. Attributes are then created by the issuer device using a combination
of the client's public
key and the issuer's secret key (Sk), which can be delivered for storage on
the client secret
space data structure (3A) or on a data block of a blockchain. Where the
attribute is stored on
the blockchain, the attribute may be made public (if the issuer is blockchain
enabled), and thus
publicly viewable. The public blockchain may be configured to store a pre-
defined set of
attribute types that are thus publicly viewable (e.g., using a blockchain
explorer).
[0418] In another embodiment, where the issuer device is capable of
interacting with the
blockchain and the attribute can be made public (e.g., over 21), the issuer
device delivers
attributes signed by the issuer using the client's public key (Pk) to the
blockchain (3B).
[0419] In another embodiment, where the issuer device is capable of
interacting with the
blockchain and the attribute requires the client's permission (e.g., date of
birth), the issuer can
store on the blockchain signed attributes that are encrypted with the client's
public key (3C).
[0420] In an alternate embodiment, a sidechain is utilized to keep attestation
private between
the involved parties.
[0421] FIG. 26 is a system diagram depicting a method for verification,
according to some
embodiments.
[0422] Sample steps for verification are described herein, whereby the client
device forms a
communication channel with the authenticated verifier, and the verifier makes
a "Proof
Request".
[0423] The proof request can be provided, for example, to the client secret
space data storage.
At step 3A, a "Proof Request" is sent to the Client's special space data
storage, where a
bounded device has both the URL and public key to access the special space. At
step 4A, a
"Proof Response" is sent back to the client device.
[0424] In an alternate embodiment where the issuer device is blockchain-
enabled and the
attribute can be made public (e.g., that the user is over 21), at step 3B,
data messages
representing the "Proof Request" are transmitted to the blockchain (or a
blockchain explorer tool
configured to interact with the blockchain), and at step 4B, a data message
representing the
"Proof Response" is sent back to the device as the attribute is public. In an
alternate
embodiment, the client device can direct the verifier to the blockchain, for
example by providing
a link or a block number indicative of where the attribute is stored.
[0425] In another embodiment, the issuer is blockchain-enabled but the attribute needs the client's permission (e.g., date of birth, address). In this example, at step 3C, a "Proof Request" is sent to the blockchain (or to a blockchain explorer tool configured to interact with the blockchain), and at step 4C, an "Encrypted Proof Response" is sent back to the device.
[0426] The client device may then be required to grant permission to share the
attribute at step
5C, and responsive to permission being granted, at step 6C, the "Encrypted
Proof Request" is
then transmitted to the client special space data storage for decryption. At
step 7C, a Decrypted
Proof Response is sent back to the client device that could then be provided
to the verifier
device.
[0427] FIG. 27 is a system diagram depicting an example age verifier device
2502, according to
some embodiments.
[0428] A device is positioned at a point of sale or other location where
verification is required,
such as in front of a bar serving alcohol. The verification can be used as a
gatekeeper
mechanism in some cases, or in other cases, as a tool to determine whether a
particular
individual is entitled to various statuses or discounts (e.g., student
discount, or located within a
geographic location, etc.). The device 2502 is adapted, in a specific,
illustrative example, in
relation to a requirement to check the age of all customers.
[0429] Without such a system, a cashier would have to request an
identification card (which
could be falsified), and perform mental math to determine if the individual is
over 19 years old.
This is time-consuming and requires mental work effort. Furthermore, the
customer may find
the check to be invasive, as the relationship between cashier and customer is
unknown.
[0430] The device 2502 can be a terminal set up at the point of sale, for
example, which could
be designated a verifier computing device. The device 2502, in an example
embodiment, may
be configured to render a visual representation 2504 of a resource locator,
such as a quick
response code. The quick response code can be related to an underlying data
element, such as
a URL, which the client device can interact with, for example, by scanning the
code to access
the URL. On the backend, verification processes as described in various
embodiments herein
are utilized to transmit or otherwise make available signed attribute
information of the client
device, which are then provided to the verification device 2502 such that
verification device
2502 is able to verify specific attributes of the client (e.g., age > 25, or
address is within a
geographic location, etc.). The verification device 2502 can be configured to
modify a rendering
2506 to visually or audibly indicate that the client has successfully passed
or failed the attribute
test.
[0431] In some embodiments, one or more aspects of the blockchain,
tokenization and/or
verification/validation/proof processes described herein can involve one or
more secure
execution environments and/or secure storage elements. For example, in some
embodiments,
the storage of private keys and tokens, in addition to computations required
for issuance and
proofs, could be performed on Trusted Execution Environments, Smart Cards,
Secure Elements
or Trusted Platform Modules on devices such as mobile and personal computers
using
corresponding APIs.
[0432] In some embodiments, a computing system includes or is configured to
provide a
plurality of distinct execution environments. The isolation of these
environments can be
enforced using software or hardware. In some embodiments, a distinct execution
environment
can include one or more secure storage elements (for example, a Secure Element
or one or
more aspects of a Smart Card).
[0433] The distinct execution environments are, in some embodiments,
configured to provide
access to different storage and processing resources.
[0434] In some embodiments, one of the environments may be referred to as a
trusted
execution environment (TEE) and may have access to isolated and secure storage
and
processing resources.
[0435] In some embodiments, a secure environment may support a distinct
operating system,
or it may be a set of secure resources accessible to applications that are
assigned to use it by
the underlying operating system of the overall system. In some embodiments, a
computing
system includes a dedicated secure storage resource, such as a separate secure
storage or a
secure storage area within a general storage resource. In some embodiments,
the computing
system includes a dedicated secure memory device such as a separate secure
memory, or a
secure area within a general memory resource (e.g., secure memory may be
accessible in a
different address space or in a different address range).
[0436] These resources may be physically and/or logically distinct from the
general resources
of the same type. In a computing system that includes or is configured to
provide two distinct
execution environments, the first execution environment is a secure execution
environment and
the second execution environment is a potentially unsecure environment.
[0437] The secure execution environment is sometimes referred to as a trusted
execution
environment (TEE) and the potentially unsecure environment is sometimes
referred to as a rich
execution environment (REE).
[0438] The second execution environment (e.g., the potentially unsecure
execution
environment) is configured to communicate with the secure execution
environment (e.g., the
first execution environment) to request one or more aspects of the
tokenization and/or
verification/validation process to be performed.
[0439] The second execution environment includes an unsecure portion of a
processor,
memory, and storage. Software code of the second execution environment can
include an
unsecure OS which is stored in storage, loaded into memory at run time, and
executed by
processor to perform OS operations. In some embodiments, software executable
by the second
execution environment can include one or more APIs or other software
components for
providing function calls or otherwise interfacing with one or more components
of the first
execution environment.
[0440] For example, in some embodiments, the first (e.g., secure) execution
environment can
include (e.g., store) one or more keys such as root keys, private keys, and
the like for
generating signs tokens, validating one or more signed data elements, and/or
the like. Some
environment, first execution environment can include (e.g., store) one or more
tokens against
which one or more credentials or other data elements can be validated.
[0441] In some embodiments, first execution environment can include one or
more software
components including computer executable code for generating/issuing and/or
validating one or
more tokens, credentials and/or other data elements.
[0442] For example, in one example embodiment, a digitally signed token
representing a
verified identity or account can be stored in a secure storage element in a
secure execution
environment. A secure execution environment can include computer executable
instructions
which receive from an unsecure execution environment one or more data sets
representing one
or more biometric verification credentials.
[0443] The computer executable instructions and the secure execution
environment can be
configured to perform one or more calculations or data transformations to
validate that the data
sets representing the biometric verification credentials match or otherwise
correspond to the
digitally signed token as described herein or otherwise. In some embodiments,
the data sets
representing the one or more biometric verification credentials can be
received at the device on
which the secure execution environment resides and/or an external device in
communication
with the device in which the secure execution environment resides.
[0444] In some embodiments, secure execution environment can return one or
more signals
indicating whether the biometric verification credentials are valid or
otherwise match the digitally
signed token. In some embodiments, the signals can include one or more signed
data elements to
confirm the veracity of the signals.
[0445] In some embodiments, the secure execution environment can be configured
to respond
to proof requests from unsecure execution environment(s).
[0446] In some embodiments, a secure execution environment can be used to
generate a
signed token. In some embodiments, a secure execution environment can receive
from an
unsecure execution environment one or more tokens and/or credentials. One or
more software
elements within the secure execution environment can generate a signed token
and/or
credential using one or more private keys stored within the secure execution
environment. The
signed token and/or credential can then be returned to the unsecure execution
environment.
[0447] In some embodiments, one or more aspects of the blockchain
verification, transaction
and/or other modification processes can be performed within a secure execution
environment to
ensure that private keys, addresses, credentials and/or the like are only
accessible by
authorized users and/or processes within the secured environment.
[0448] Any other aspect of the tokenization and/or validation process can be similarly applied using these secure and unsecure execution environments to ensure that sensitive
information such as keys, credentials, tokens, tokenization algorithms,
biometric data, biometric
processing algorithms, blockchain transactions/activities, neural networks,
and/or the like are
only accessible by authorized users and/or processes.
[0449] In some embodiments, sensitive operations using a private key may be
performed only
in a secure area. In some embodiments, all or additional operations may be performed in a Java Card space of a smart card.
[0450] Applicant notes that the described embodiments and examples are
illustrative and non-
limiting. Practical implementation of the features may incorporate a
combination of some or all
of the aspects, and features described herein should not be taken as
indications of future or
existing product plans. Applicant partakes in both foundational and applied
research, and in
some cases, the features described are developed on an exploratory basis.
Example Commitment Non-zeroness protocol
[0451] Consider a dummy example where the group is of order 23. The prover has a commitment C_x = g^13 h^12 (x = 13, y = 12) and wants to prove that it is nonzero.

[0452] The prover first privately computes t_1 = x^{-1} = 16 and t_2 = −y·x^{-1} = 15 (both mod 23). Note that g = C_x^{t_1} h^{t_2}.

[0453] The prover then samples two random numbers, say r_1 = 6 and r_2 = 21. It computes R = C_x^6 h^21 = g^9 h and hashes C_x and R together to obtain c; say its value is c = 7. The prover computes z_1 = c·t_1 + r_1 = 3 and z_2 = c·t_2 + r_2 = 11 (both mod 23) and sends over C_x, R, z_1 and z_2.

[0454] The verifier independently computes c and checks that g^c R = C_x^{z_1} h^{z_2} = g^16 h, which is indeed true.
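The arithmetic of this dummy run can be checked mechanically. The following sketch (an illustration, not part of the patent) tracks every group element as a pair of exponents of g and h modulo 23.

n = 23                      # group order in the dummy example
x, y = 13, 12               # C_x = g^13 h^12
t1 = pow(x, -1, n)          # 16
t2 = (-y * t1) % n          # 15
r1, r2, c = 6, 21, 7

Cx = (x, y)
def combine(*terms):
    """Multiply group elements given as (g-exponent, h-exponent, scalar) triples."""
    return tuple(sum(t[i] * t[2] for t in terms) % n for i in (0, 1))

R = combine((Cx[0], Cx[1], r1), (0, 1, r2))          # C_x^r1 * h^r2
assert R == (9, 1)                                    # R = g^9 h
z1, z2 = (c * t1 + r1) % n, (c * t2 + r2) % n
assert (z1, z2) == (3, 11)
# Verifier: g^c * R == C_x^z1 * h^z2
lhs = combine((1, 0, c), (R[0], R[1], 1))
rhs = combine((Cx[0], Cx[1], z1), (0, 1, z2))
assert lhs == rhs == (16, 1)                          # g^16 h
print("dummy non-zeroness transcript verifies")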
[0455] It can be seen that the prover was required to perform one modular
inverse, two
randomizations, two group exponentiations and one group multiplication. The
method of
Belenkiy requires 4 randomizations, 3 group multiplications and 7 group
exponentiations.
Data Example, Real World Setting:
[0456] Here is an example run of the protocol in a real-world setting. The
NIST-recommended
SECP256r1 curve has been used as the discrete log group with g being the
default generator.
[0457] Integers:
[0458] x = 98928670389352510396523151697355904354792004957973774412595633303273732497743

[0459] y = 95008053600361093431923046575424751433609377135357996941110581993938165236367

[0460] x^{-1} = 46269840326567351854099451051133827535525340183367300921621852037000846966515

[0461] −y·x^{-1} = 81188361761744559520267578766075689915069017610794183025696816908727773241705

[0462] r_1 = 64174188764800913824267254049454250383880323837574216579480449677278287845053

[0463] r_2 = 78072047693972697562257276623750159877182993380025344884890619716596569262161

[0464] c = 96224990109810121210565620440984233367486068200169166402592191291511631801053

[0465] z_1 = 77558756204012200080475684884310387291370943910107311585466358661683524082068

[0466] z_2 = 109679245887298285177440244742184263110785597063430583136924213368004095632307
[0467] Note: The specific value of c depends on how hashing is carried out.
Many possible
ways of doing so can be valid. The process involves deterministically
serializing the objects to
be hashed (however seen fit) and passing the resulting byte array through a
secure hash (in this
example, SHA256).
[0468] Note: These integers are displayed in base 10.
[0469] Group elements:
[0470] Cx = (e6ab3db4c6691dda4a8b05d79a15559c18181cda6c6dfc7fc77f41dff392e41, f0d8a7a52e882ff9da9f64cf082db98bbe1db6fa6f965dc96a4150b95270e073, 1da350a2e431d51de9217a218313fb2cc39f8f1dda48ea33ad7b1e561ef00e89)

[0471] h = (e6ab3db4c6691dda4a8b05d79a15559c18181cda6c6dfc7fc77f41dff392e41, f0d8a7a52e882ff9da9f64cf082db98bbe1db6fa6f965dc96a4150b95270e073, 1da350a2e431d51de9217a218313fb2cc39f8f1dda48ea33ad7b1e561ef00e89)

[0472] R = (b5e8e60e25842deb89cdb80047e49681b566b8bcf6b6fd6298fdc7dab5857300, 54cacd179ab2e3fbc892b1001c47408dc1d8559c8a2dce519094ab874b640e87, 11b5ec881901a0901d73d0892402c3f70d96f6d23ca851cd7fe9402f886f6bb4)
[0473] Note: These group elements are displayed in projective coordinates base
16.
Example Comparison Protocol
[0474] Consider G to be a discrete log group of prime order p and g and h be
generators with
unknown discrete logs. (Setting identical to that of all the rest of the
protocols).
[0475] Let numbers q and l be such that q − l = 2^N ≤ p/2, and two whole numbers a and b such that l ≤ a < b < q.

[0476] Consider commitments A = g^a h^{m_a} and B = g^b h^{m_b} to a and b, respectively.
[0477] To prove that a < b, the following steps can be taken:

1. Prover computes C = B·A^{-1} = g^{b-a} h^{m_b - m_a} = g^c h^{m_c}.
2. Prover produces bit commitments A_i = g^{a_i} h^{m_{a_i}}, B_i = g^{b_i} h^{m_{b_i}}, C_i = g^{c_i} h^{m_{c_i}} for i ∈ [1, N − 1], where a_i, b_i and c_i are the i'th bits of a − l, b − l and c, respectively. The masks m_{a_i}, m_{b_i} and m_{c_i} are sampled randomly.
3. Prover computes A_0 = g^{a_0} h^{m_{a_0}} = A·g^{-l}·Π_{i=1}^{N-1} A_i^{-2^i} and likewise B_0 = g^{b_0} h^{m_{b_0}} = B·g^{-l}·Π_{i=1}^{N-1} B_i^{-2^i} and C_0 = g^{c_0} h^{m_{c_0}} = C·Π_{i=1}^{N-1} C_i^{-2^i}.
4. For each i ∈ {0, ..., N − 1}, the prover does the following:
   a. Randomly sample r_{a_i}, d′_{a_i} and z′_{a_i}.
   b. Compute R_{a_i,a_i} = h^{r_{a_i}} and R_{a_i,(1-a_i)} = h^{z′_{a_i}}·(A_i·g^{-(1-a_i)})^{-d′_{a_i}}.
   c. Compute d_{a_i} = H(A_i, R_{a_i,0}, R_{a_i,1}).
   d. Compute z_{a_i} = (d_{a_i} − d′_{a_i})·m_{a_i} + r_{a_i}.
   e. Assign z_{a_i,a_i} = z_{a_i}, z_{a_i,(1-a_i)} = z′_{a_i}, d″_{a_i,a_i} = d_{a_i} − d′_{a_i} and d″_{a_i,(1-a_i)} = d′_{a_i}.
   f. Repeat steps a through e for B and C.
5. Prover sends all A_i, R_{a_i,0}, R_{a_i,1}, d″_{a_i,0}, z_{a_i,0}, z_{a_i,1}, B_i, R_{b_i,0}, R_{b_i,1}, d″_{b_i,0}, z_{b_i,0}, z_{b_i,1}, C_i, R_{c_i,0}, R_{c_i,1}, d″_{c_i,0}, z_{c_i,0}, z_{c_i,1}.
6. Verifier checks that A = g^l·Π_{i=0}^{N-1} A_i^{2^i}, B = g^l·Π_{i=0}^{N-1} B_i^{2^i} and B·A^{-1} = Π_{i=0}^{N-1} C_i^{2^i}.
7. For each i ∈ {0, ..., N − 1} the verifier checks that:
   a. A_i^{d″_{a_i,0}}·R_{a_i,0} = h^{z_{a_i,0}}
   b. (A_i·g^{-1})^{d″_{a_i,1}}·R_{a_i,1} = h^{z_{a_i,1}}, where d″_{a_i,1} = H(A_i, R_{a_i,0}, R_{a_i,1}) − d″_{a_i,0}
   c. Check the same conditions for B and C
[0478] Note: It may be that either a or b is known to the verifier. In such a
case there is no
need to decompose the known number and commitment C will have the same mask
exponent
as that of the unknown parameter.
[0479] Note: In an embodiment, the prover avoids sending Ao, Bo and Co to
reduce the size of
its messages. In that case, in step 6, instead of verifying a relation between
the bit commitments
the verifier derives Ao, Bo and Co independently.
Dummy Example
[0480] Let p = 23, l = 0, q = 8, a = 3, b = 5, A = g^3 h^6, B = g^5 h^1.

1. Prover computes C = g^2 h^18.
2. Prover generates commitments A_1 = g h^17, A_2 = h^22, B_1 = h^3, B_2 = g h^4, C_1 = g h^2 and C_2 = h^17.
3. Prover computes commitments A_0 = g h^22, B_0 = g h^2 and C_0 = h^15.
4. Prover does the following:
   a. Randomly sample r_{a_0} = 15, d′_{a_0} = 10, z′_{a_0} = 18, r_{a_1} = 3, d′_{a_1} = 5, z′_{a_1} = 7, r_{a_2} = 3, d′_{a_2} = 18, z′_{a_2} = 4, r_{b_0} = 17, d′_{b_0} = 19, z′_{b_0} = 7, r_{b_1} = 8, d′_{b_1} = 12, z′_{b_1} = 6, r_{b_2} = 7, d′_{b_2} = 5, z′_{b_2} = 3, r_{c_0} = 11, d′_{c_0} = 21, z′_{c_0} = 19, r_{c_1} = 5, d′_{c_1} = 16, z′_{c_1} = 12, r_{c_2} = 9, d′_{c_2} = 5, z′_{c_2} = 10.
   b. Compute R_{a_0,0} = h^18·A_0^{-10} = g^13 h^5, R_{a_0,1} = h^15, R_{a_1,0} = h^7·A_1^{-5} = g^18 h^14, R_{a_1,1} = h^3, R_{a_2,0} = h^3, R_{a_2,1} = h^4·(A_2·g^{-1})^{-18} = g^18 h^22, R_{b_0,0} = h^7·B_0^{-19} = g^4 h^15, R_{b_0,1} = h^17, R_{b_1,0} = h^8, R_{b_1,1} = h^6·(B_1·g^{-1})^{-12} = g^12 h^16, R_{b_2,0} = h^3·B_2^{-5} = g^18 h^6, R_{b_2,1} = h^7, R_{c_0,0} = h^11, R_{c_0,1} = h^19·(C_0·g^{-1})^{-21} = g^21 h^3, R_{c_1,0} = h^12·C_1^{-16} = g^7 h^3, R_{c_1,1} = h^5, R_{c_2,0} = h^9, R_{c_2,1} = h^10·(C_2·g^{-1})^{-5} = g^5 h^17.
   c. Compute hashes, say d_{a_0} = 17, d_{a_1} = 4, d_{a_2} = 17, d_{b_0} = 12, d_{b_1} = 20, d_{b_2} = 3, d_{c_0} = 7, d_{c_1} = 1, d_{c_2} = 0.
   d. Compute responses z_{a_0} = (17 − 10)·22 + 15 = 8, z_{a_1} = (4 − 5)·17 + 3 = 9, z_{a_2} = (17 − 18)·22 + 3 = 4, z_{b_0} = (12 − 19)·2 + 17 = 3, z_{b_1} = (20 − 12)·3 + 8 = 9, z_{b_2} = (3 − 5)·4 + 7 = 22, z_{c_0} = (7 − 21)·15 + 11 = 8, z_{c_1} = (1 − 16)·2 + 5 = 21, z_{c_2} = (0 − 5)·17 + 9 = 16 (all mod 23).
   e. Assign values z_{a_0,0} = 18, z_{a_0,1} = 8, z_{a_1,0} = 7, z_{a_1,1} = 9, z_{a_2,0} = 4, z_{a_2,1} = 4, z_{b_0,0} = 7, z_{b_0,1} = 3, z_{b_1,0} = 9, z_{b_1,1} = 6, z_{b_2,0} = 3, z_{b_2,1} = 22, z_{c_0,0} = 8, z_{c_0,1} = 19, z_{c_1,0} = 12, z_{c_1,1} = 21, z_{c_2,0} = 16, z_{c_2,1} = 10.
5. Prover sends A_0 = g h^22, R_{a_0,0} = g^13 h^5, R_{a_0,1} = h^15, d″_{a_0,0} = 10, z_{a_0,0} = 18, z_{a_0,1} = 8, A_1 = g h^17, R_{a_1,0} = g^18 h^14, R_{a_1,1} = h^3, d″_{a_1,0} = 5, z_{a_1,0} = 7, z_{a_1,1} = 9, A_2 = h^22, R_{a_2,0} = h^3, R_{a_2,1} = g^18 h^22, d″_{a_2,0} = 22, z_{a_2,0} = 4, z_{a_2,1} = 4, B_0 = g h^2, R_{b_0,0} = g^4 h^15, R_{b_0,1} = h^17, d″_{b_0,0} = 19, z_{b_0,0} = 7, z_{b_0,1} = 3, B_1 = h^3, R_{b_1,0} = h^8, R_{b_1,1} = g^12 h^16, d″_{b_1,0} = 8, z_{b_1,0} = 9, z_{b_1,1} = 6, B_2 = g h^4, R_{b_2,0} = g^18 h^6, R_{b_2,1} = h^7, d″_{b_2,0} = 5, z_{b_2,0} = 3, z_{b_2,1} = 22, C_0 = h^15, R_{c_0,0} = h^11, R_{c_0,1} = g^21 h^3, d″_{c_0,0} = 9, z_{c_0,0} = 8, z_{c_0,1} = 19, C_1 = g h^2, R_{c_1,0} = g^7 h^3, R_{c_1,1} = h^5, d″_{c_1,0} = 16, z_{c_1,0} = 12, z_{c_1,1} = 21, C_2 = h^17, R_{c_2,0} = h^9, R_{c_2,1} = g^5 h^17, d″_{c_2,0} = 18, z_{c_2,0} = 16, z_{c_2,1} = 10 to the verifier.
6. The verification steps are omitted in this description of this example.
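The omitted verification can be spot-checked. The following self-contained sketch uses the same exponent-pair toy model (g^x h^y represented as the pair (x, y) modulo p = 23; all names are illustrative) to verify steps 7a and 7b for the commitment A_0, using the values listed in step 5 above.

P = 23


def mul(u, v):
    # Group multiplication: exponents add mod P.
    return ((u[0] + v[0]) % P, (u[1] + v[1]) % P)


def power(u, e):
    # Exponentiation: exponents scale mod P.
    return ((u[0] * e) % P, (u[1] * e) % P)


A0 = (1, 22)          # A_0 = g h^22
R_a0_0 = (13, 5)      # R_{a0,0} = g^13 h^5
R_a0_1 = (0, 15)      # R_{a0,1} = h^15
dd0, d = 10, 17       # d''_{a0,0} and the hash d_{a0} from step 4c
z0, z1 = 18, 8        # z_{a0,0} and z_{a0,1}

# Step 7a: A_0^{d''_{a0,0}} R_{a0,0} should equal h^{z_{a0,0}}.
assert mul(power(A0, dd0), R_a0_0) == (0, z0)

# Step 7b: (A_0 g^{-1})^{d''_{a0,1}} R_{a0,1} should equal h^{z_{a0,1}},
# where d''_{a0,1} = d_{a0} - d''_{a0,0}.
dd1 = (d - dd0) % P
assert mul(power(mul(A0, (-1 % P, 0)), dd1), R_a0_1) == (0, z1)
print("dummy-example checks for A_0 pass")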
[0481] The discussion provides example embodiments of the inventive subject
matter. Although
each embodiment represents a single combination of inventive elements, the
inventive subject
matter is considered to include all possible combinations of the disclosed
elements. Thus, if one
embodiment comprises elements A, B, and C, and a second embodiment comprises
elements B
and D, then the inventive subject matter is also considered to include other
remaining
combinations of A, B, C, or D, even if not explicitly disclosed.
[0482] Throughout the foregoing discussion, references were made regarding
servers, services,
interfaces, portals, platforms, or other systems formed from computing
devices. It should be
appreciated that the use of such terms is deemed to represent one or more
computing devices
having at least one processor configured to execute software instructions
stored on a computer
readable tangible, non-transitory medium. For example, a server can include
one or more
computers operating as a web server, database server, or other type of
computer server in a
manner to fulfill described roles, responsibilities, or functions.
[0483] The terms "connected" or "coupled to" may include both direct coupling
(in which two
elements that are coupled to each other contact each other) and indirect
coupling (in which at
least one additional element is located between the two elements).
[0484] The technical solution of embodiments may be in the form of a software
product. The
software product may be stored in a non-volatile or non-transitory storage
medium, which can
be a compact disk read-only memory (CD-ROM), a USB flash disk, or a removable
hard disk.
The software product includes a number of instructions that enable a computer
device (personal
computer, server, or network device) to execute the methods provided by the
embodiments.
[0485] The embodiments described herein may be implemented by physical
computer
hardware, including computing devices, servers, receivers, transmitters,
processors, memory,
displays, and networks. Such embodiments provide useful physical machines and
particularly
configured computer hardware arrangements.
[0486] Although the embodiments have been described in detail, it should be
understood that
various changes, substitutions and alterations can be made herein without
departing from the
scope. Moreover, the scope of the present application is not intended to be
limited to the
particular embodiments of the process, machine, manufacture, composition of
matter, means,
methods and steps described in the specification.
[0487] As one of ordinary skill in the art will readily appreciate from the
disclosure, processes,
machines, manufacture, compositions of matter, means, methods, or steps,
presently existing or
later to be developed, that perform substantially the same function or achieve
substantially the
same result as the corresponding embodiments described herein may be utilized.
Accordingly,
the appended claims are intended to include within their scope such processes,
machines,
manufacture, compositions of matter, means, methods, or steps.
[0488] As can be understood, the examples described above and illustrated are
intended to be
exemplary only.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.


Title Date
Forecasted Issue Date Unavailable
(22) Filed 2020-01-23
(41) Open to Public Inspection 2020-07-23
Examination Requested 2024-01-23

Abandonment History

There is no abandonment history.

Maintenance Fee

Last Payment of $100.00 was received on 2023-12-21


 Upcoming maintenance fee amounts

Description Date Amount
Next Payment if small entity fee 2025-01-23 $100.00
Next Payment if standard fee 2025-01-23 $277.00

Note: If the full payment has not been received on or before the date indicated, a further fee may be required, which may be one of the following:

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee 2020-01-23 $400.00 2020-01-23
Maintenance Fee - Application - New Act 2 2022-01-24 $100.00 2021-12-21
Registration of a document - section 124 $100.00 2022-04-25
Maintenance Fee - Application - New Act 3 2023-01-23 $100.00 2022-11-29
Maintenance Fee - Application - New Act 4 2024-01-23 $100.00 2023-12-21
Request for Examination 2024-01-23 $1,110.00 2024-01-23
Excess Claims Fee at RE 2024-01-23 $110.00 2024-01-23
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
ROYAL BANK OF CANADA
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents



Document Description    Date (yyyy-mm-dd)    Number of pages    Size of Image (KB)
New Application 2020-01-23 11 243
Abstract 2020-01-23 1 12
Description 2020-01-23 81 3,412
Claims 2020-01-23 7 278
Drawings 2020-01-23 34 764
Representative Drawing 2020-06-23 1 10
Cover Page 2020-06-23 2 43
Request for Examination 2024-01-23 5 185