Patent 2316743 Summary

(12) Patent Application: (11) CA 2316743
(54) English Title: MANAGED DATABASE PRIVACY SYSTEM AND METHOD
(54) French Title: SYSTEME ET METHODE DE PROTECTION DES RENSEIGNEMENTS PRIVES D'UNE BASE DE DONNEES GEREE
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06F 21/60 (2013.01)
  • G06Q 30/02 (2012.01)
(72) Inventors :
  • HILL, AUSTIN (Canada)
  • KATIRAI, HOOMAN (Canada)
  • FAVVAS, GEORGE (Canada)
(73) Owners :
  • ZERO-KNOWLEDGE SYSTEMS INC. (Canada)
(71) Applicants :
  • ZERO-KNOWLEDGE SYSTEMS INC. (Canada)
(74) Agent: FASKEN MARTINEAU DUMOULIN LLP
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2000-08-28
(41) Open to Public Inspection: 2002-02-28
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

Sorry, the abstracts for patent document number 2316743 were not found.

Claims

Note: Claims are shown in the official language in which they were submitted.

Sorry, the claims for patent document number 2316743 were not found.
Text is not available for all patent documents. The current dates of coverage are on the Currency of Information page.

Description

Note: Descriptions are shown in the official language in which they were submitted.



CA 02316743 2000-08-28
MANAGED DATABASE PRIVACY SYSTEM AND
METHOD
New Canadian Patent Application
Applicant: Zero Knowledge Systems Inc.
Filed: August 28, 2000
Our File: 2600726/0003
FASKEN MARTINEAU DUMOULIN LLP
PATENT AND TRADEMARK AGENTS



Managed Database Privacy System and
Method


Contents

INTRODUCTION
OVERVIEW OF INVENTION
DE-IDENTIFICATION LAYER
  DB ANALYSIS TOOL
  PRIVACY RESOLUTION TOOL
  DE-LINKING ENGINE
  FULL TEXT ANALYSIS TOOL
PRIVACY ENHANCING LAYER
  DATA MINIMIZATION ENGINE
  ENCRYPTION ENGINE
DATA ACCESS LAYER
  IDENTITY VERIFICATION ENGINE
  BLINDED TWO-WAY COMMUNICATION ENGINE
  SECURE PROFILE ACCESS ENGINE
  THIRD PARTY DATA ACCESS ENGINE
ATTACHMENTS


Introduction
As business processes and an increasing number of customer and partner interactions have been computerized, there has been a large increase in the amount of data about customers, partners and potential customers available to businesses. With the decreasing cost of capturing, storing and archiving this data, businesses now find themselves awash in data about every aspect of their markets, customers and business transaction history.
In conjunction with the growing availability of large amounts of data that are easy to capture, analyze and process, entire new technologies and industries have been created to help businesses derive value and intelligence from this data (OLAP, ROLAP, eCRM, personalization engines, targeting engines, collaborative filtering engines, data mining), driven by businesses' economic need to extract that value.
The fair information practices that define and lay the groundwork for building privacy into such systems have existed since the mid-1970s, but in the absence of any legal requirement, and given the added cost of building and implementing these systems, investment in such privacy systems was largely ignored. Privacy was certainly not a primary consideration when these systems were being developed, and the data mining practices that predominated were not only completely legal, but were also considered good business.
This situation has changed dramatically over the last five years. Business processes and data acquisition and mining techniques that were completely legitimate and legal are now despised by consumers and in some cases have been made illegal.
Companies are finding that investments in personalization engines, eCRM, data mining and business intelligence are suddenly at risk because of growing and increasingly serious privacy issues. Companies face huge potential costs in trying to retrofit their business processes to meet new and constantly changing legislation from many different state, federal and international jurisdictions. Previous systems and investments now risk being starved of the previously copious data, or being abandoned altogether.
While some companies have adopted parts of the fair information practices in an attempt to address their customers' privacy concerns, the growing focus on privacy is creating a changing landscape that forces companies to constantly revisit and readdress their privacy stances.
We believe the days when privacy policies and lobbyists could answer a company's privacy issues are coming to an end. Privacy is now becoming a critical strategic issue facing
every company's computerized business automation initiatives. It can be either a huge strategic advantage or a company's Achilles' heel.
By investing in a managed privacy service, companies build a business foundation that allows them to react, change and evolve with their customers' needs with regard to privacy.
We employ new privacy engines that automate and accelerate the implementation of the core fair information practices. These engines can be customized to meet specific business and customer interactions, self-regulatory industry privacy practices, regulatory privacy requirements, and international privacy laws and standards, as well as new and privacy-friendly ways of delivering increased customer-focused services.
Companies that focus on their customers' demands and needs will find themselves able to enhance those relationships, adjust and react to new data handling practices, and turn privacy into a powerful strategic advantage.
Step 1. Data De-identification Stage
The first step in our managed privacy layer is to separate and segment personally identifiable information (PII) out of the incoming data streams. Locating, separating and cleansing a data stream of any PII can be a very challenging process. Traditional search-and-replace techniques, including field- and record-level searches, have been found to locate only 30% to 40% of PII in most data streams. The proprietary algorithms employed in our De-Identification Engine have been found to locate 98%-99% of all PII. At the de-identification stage the following steps are taken to segment PII from the rest of the data, and to specifically exclude the collection and storage of any data that policy kits define as required exclusionary data:
  • Locating and segmenting PII from the data stream
  • Applying exclusionary data filters for any data required not to be stored
  • Organizing, labeling and verifying privacy conditions associated with PII
  • Opt-in vs. opt-out
  • Geographic and special policy-based handling requirements
  • Reviewing and improving data accuracy and data quality
  • Assigning a cryptographically private and secured identifier controlling the linkage of PII and de-identified data
Step 2. PII Encryption and Security Engine
The PII separated in Step 1 of our privacy layer is passed to the Encryption & Security Engine. A key element of all privacy systems is the protection and security of customers' private information, so at this stage we apply encryption and security expertise and toolkits to help secure the storage of PII inside a company's database. These security mechanisms secure PII against the following threats:
  • Accidental disclosure and leaking of data to external sources (e.g. websites)
  • Hackers, attacks against websites and theft of data
  • Internal misuse or theft of data by errant employees or contractors
The PII Encryption and Security Engine is designed to automate some of the following technologies and techniques to ensure that only the most secure and trusted mechanisms are used to protect customers' data:
  • Hashes, secure one-way hashes, seeded passphrase security
  • 3DES-, DESX-, Blowfish-, Diffie-Hellman- and RSA-based encryption algorithms
  • Key splitting, shared secrets and computationally controlled disclosure
  • Blinded credentials; minimal disclosure
  • Secure or anonymous transport layers (SSL, anonymous IP)
We support and interoperate with the security features in Oracle8, Sybase and NCR, as well as most standard database security mechanisms. (Maybe mention integration and support for CA software and directory servers/role-based security systems.)
Step 3. Data Sieve Engine
Processing de-identified data in a privacy-enhanced manner allows businesses to leverage incoming and legacy data while still respecting privacy standards. We link with eCRM, personalization engines, statistical analysis, business intelligence, fraud detection, OLAP and data mining applications, and are able to provide those applications with the data they require to operate. We apply a series of data processing techniques to ensure that relevant data is obtained while still providing customers with strong privacy protections. These techniques include:
  • Data minimization and data reduction
  • Interest vector mapping
  • Preference matching
  • Data labeling


Overview of Invention
The key features of our invention are:
  • Privacy infrastructure solutions for enterprise customers
  • A professional-services-based sales and delivery model
  • Striking a balance between individual customer needs and the need to create reusable components
  • Industry-specific toolkits which customize those components
  • Service components built in wherever possible, so that ongoing revenue can be generated from a given customer
  • Auditability built into each component
More specifically, our invention includes:
  • Privacy risk analysis methodologies and tools:
    • A methodology for assigning a value to the privacy risk associated with a given set of data fields. This value is based on the probability that a given set of data elements can be correlated with other existing databases in order to uniquely identify an individual.
    • Tools for automating the process of analyzing a given database, computing the above risk value, and performing "what if" scenarios to determine the effect of various data manipulation actions on the risk value.
  • Technological methods for reducing the privacy risks in business practices by:
    • Aggregating, minimizing and vectoring data so that less granular information is stored.
    • Encrypting private data such that one or more parties must agree in order to decrypt it.
    • Hashing data so that it can still be used for matching and profiling without revealing identities.
  • Controlled data access technologies:
    • The concept of a data sanitization facility, where data is chained through multiple parties, each of which plays a role in decrypting and otherwise manipulating the data without seeing the whole picture.
    • Methods by which a party can communicate with another, by physical or electronic means, without knowing the person's identity or contact information.
    • Methods by which a user's identifying information is separated from profile or other information, and a digital credential is issued which allows them to re-link the two together.
    • Enabling a user to control, view and manage their personal information profile across multiple sites, without those sites being able to link their data records to an identifiable individual.
  • Other:
    • An automated means of creating, based on industry-, legislative- and customer-specific needs, business rules which feed the above software components.
The software components of the solution fall into three broad layers:
  • De-identification layer
  • Privacy enhancing layer
  • Data access layer
Below, we describe the following for each software component where applicable:
  • Business need
  • Technical solution
  • Business model
  • Further links
De-identification layer
The de-identification layer exposes data, or groupings of data, which can be used to identify an individual, and assigns it a risk factor. If the risk factor exceeds the threshold for a given situation, various scenarios can be modeled with the goal of obtaining a satisfactory resolution.
DB analysis tool
Need
While the presence of some types of fields can definitively allow linkage to an individual's identity, the ability to link a given data set to a unique individual is not necessarily binary. For example, a 9-digit zip code and a date of birth together have a high probability of yielding someone's identity, whereas a 9-digit zip code and only a year of birth yield a lower probability.
A tool is needed which, for a given database structure, will assign a risk factor to fields or field combinations.
Solution
The DB analysis tool is the first step in our managed privacy layer. Using proprietary mathematical algorithms and industry-specific knowledge, it examines an existing legacy database and identifies fields or groups of fields that constitute PII.
Based on toolkits developed in cooperation with industry-specific experts, the privacy engineer can map fields in the customer's database to known datatypes.
Then, the DB analysis tool associates a quantitative number, called a privacy risk factor (PRF), with each individual field or group of fields. The PRF is a number between zero and one that indicates the probability that a given field or combination of fields can uniquely identify an individual. A PRF of zero indicates no privacy risk, while a PRF of one indicates a high privacy risk. Depending on the particular customer or industry, different risk thresholds may be set.
Example
Suppose we have a pizza delivery database with the following three fields:
name, postal
code and telephone number. The output from the DB analysis tool might look
like this:
Fields                                    Privacy Risk Factor (PRF)
Name                                      0.91
Postal Code                               0.03
Telephone Number                          0.50
Name, Postal Code                         0.98
Name, Telephone Number                    0.97
Postal Code, Telephone Number             1.00
Name, Postal Code, Telephone Number       1.00

The PRF is the inverse of the Expected Bin Size, which refers to the average number of people that can be found using these fields as search criteria.
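The PRF computation described above can be sketched in a few lines. This is a minimal illustration, not the tool's proprietary algorithm: the toy records and field names are hypothetical, and the PRF is computed simply as the inverse of the Expected Bin Size over a sample of records.

```python
import itertools
from collections import Counter

def privacy_risk_factor(records, fields):
    """PRF for a field combination, sketched as the inverse of the
    Expected Bin Size: the average number of records sharing the same
    values for those fields (1.0 means uniquely identifying)."""
    bins = Counter(tuple(r[f] for f in fields) for r in records)
    expected_bin_size = sum(bins.values()) / len(bins)
    return 1.0 / expected_bin_size

# Hypothetical sample of the pizza delivery database.
records = [
    {"name": "A", "postal": "L4B", "phone": "555-0001"},
    {"name": "B", "postal": "L4B", "phone": "555-0002"},
    {"name": "C", "postal": "M5V", "phone": "555-0003"},
    {"name": "C", "postal": "M5V", "phone": "555-0004"},
]

# Score every field combination, as in the table above.
for combo in itertools.chain.from_iterable(
        itertools.combinations(["name", "postal", "phone"], n) for n in (1, 2, 3)):
    print(combo, round(privacy_risk_factor(records, combo), 2))
```

In this toy sample the phone number is unique per record (PRF 1.0), while the postal code is shared by two people (PRF 0.5), illustrating why combinations, not just single fields, must be scored.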
Further Links
  • Link to people/organizations with domain-specific expertise to help build the rule sets we need for various industries.
  • Link to database vendors to facilitate better integration so that we can automatically import database structures into the tool.
  • Link to third-party application vendors/systems by partnering with companies that have created standardized databases for specific verticals, so that we can automatically map their fields to the ones in our DB analysis tool.
Privacy Resolution Tool
Need
Used to create a privacy policy to mitigate the PII identified by the DB
analysis tool.
Solution
This tool is used by a privacy engineer to create a data de-identification policy to address problems found in the previous step. Creating this policy is a complex process where each decision affects subsequent decisions. Using proprietary algorithms, the tool helps the privacy engineer leverage the maximum amount of business information from the database while also satisfying privacy concerns.
The database is said to be "free" of PII when the Correlation Risk Factor (CRF) for every field or combination of fields is below some given threshold. In our example we have defined the CRF as the inverse of the Expected Bin Size (EBS), a factor which is defined in the glossary. Supposing our minimum satisfactory threshold is 0.2, we would continue for several iterations until we create a privacy policy that would certify our database to be "free" of PII. We illustrate each iteration for the pizza delivery example introduced in the DB analysis tool section.
Iteration 1:

Fields presented to user                  PRF before   User action   PRF after
Name                                      0.91         1-Way Hash    -
Postal Code                               0.03                       0.03
Telephone Number                          0.50                       0.50
Name, Postal Code                         0.98                       0.03
Name, Telephone Number                    0.97                       0.50
Postal Code, Telephone Number             0.99                       0.99
Name, Postal Code, Telephone Number       1.00                       0.99

Note: H[X] denotes the hash of field X.

Iteration 2:

Fields presented to user                  PRF before   User action   PRF after
H[Name]                                   -                          -
Postal Code                               0.03                       0.03
Telephone Number                          0.50         1-Way Hash    -
H[Name], Postal Code                      0.03                       0.03
H[Name], Telephone Number                 0.50                       -
Postal Code, Telephone Number             0.99                       0.03
H[Name], Postal Code, Telephone Number    0.99                       0.03

Note: H[X] denotes the hash of field X.
Since all fields have a PRF below 0.2 we do not proceed with further
iterations.
The final privacy policy for the database is:
Name → H[Name]
Postal Code → (leave intact)
Telephone Number → H[Telephone Number]
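The iterative resolution loop can be sketched as follows. This is a hypothetical reconstruction, assuming a simple rule of hashing the riskiest remaining field until every combination of untouched fields falls below the threshold; the tool's actual proprietary algorithms are not described in this document.

```python
import itertools
from collections import Counter

THRESHOLD = 0.2

def prf(records, fields):
    # Inverse of the Expected Bin Size for this field combination.
    bins = Counter(tuple(r[f] for f in fields) for r in records)
    return len(bins) / sum(bins.values())

def resolve_policy(records, fields):
    """Iteratively assign a 1-way hash to the riskiest field until every
    combination of untouched fields falls below the PRF threshold."""
    policy = {f: "leave intact" for f in fields}
    while True:
        clear = [f for f in fields if policy[f] == "leave intact"]
        combos = [c for n in range(1, len(clear) + 1)
                  for c in itertools.combinations(clear, n)]
        if not any(prf(records, c) >= THRESHOLD for c in combos):
            return policy
        # Hash the clear field with the highest individual PRF.
        worst = max(clear, key=lambda f: prf(records, (f,)))
        policy[worst] = "1-way hash"

# Hypothetical pizza-delivery sample: unique names and phones, shared postal code.
records = [
    {"name": f"Customer {i}", "postal": "L4B 3H7", "phone": f"555-000{i}"}
    for i in range(6)
]
print(resolve_policy(records, ["name", "postal", "phone"]))
```

On this sample the loop hashes Name, then Telephone Number, and leaves Postal Code intact, matching the final policy above.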


De-Linking Engine
Need
Implements the privacy policy created by the privacy resolution tool
Solution
This tool implements the privacy policy defined by the privacy engineer using the privacy resolution tool. Unlike the privacy resolution tool, which only creates a policy, this tool actually makes changes to the database. It calls upon the Encryption, Minimization, Aggregation and Interest Vectoring engines as required by the privacy policy. For example, it may triply encrypt an e-mail address for use with the blinded communication system described later in this document.
Example
The following illustrates the effect of the de-linking engine on a record in the pizza delivery database, before and after processing. For demonstration purposes we have added a "Date of Birth" field to the database.

Field Name         Contents Before    Operation    Contents After
Name               "John Smith"       1-Way Hash   12sh1#d'ASD;
Telephone Number   "505-555-1244"     1-Way Hash   72dsfi32233
Postal Code        "L4B 3H7"          Do Nothing   "L4B 3H7"
Date of Birth      "12/21/1971"       Minimize     1971
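Applying a policy to a record can be sketched as a table of operations. This is an illustrative sketch only: the hash truncation, field names and minimization rule are assumptions, not the engine's actual implementation.

```python
import hashlib

def one_way_hash(value):
    # Illustrative only: a production system would use a keyed/seeded hash.
    return hashlib.sha256(value.encode()).hexdigest()[:12]

OPERATIONS = {
    "1-way hash": one_way_hash,
    "do nothing": lambda v: v,
    "minimize":   lambda v: v.rsplit("/", 1)[-1],  # e.g. "12/21/1971" -> "1971"
}

policy = {"name": "1-way hash", "phone": "1-way hash",
          "postal": "do nothing", "dob": "minimize"}

record = {"name": "John Smith", "phone": "505-555-1244",
          "postal": "L4B 3H7", "dob": "12/21/1971"}

# Apply the policy's operation to each field of the record.
delinked = {f: OPERATIONS[policy[f]](v) for f, v in record.items()}
print(delinked)
```

The postal code survives unchanged while the name and phone number become stable codes, so the de-linked record can still be joined to other de-linked records without revealing who it describes.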


Full text analysis tool
Need
To protect companies from privacy concerns when sharing unrestricted text that is not stored as a record in a database.
Solution
This system removes PII by locating and replacing personally identifying information in unrestricted text documents, using techniques that extend beyond simple search-and-replace procedures. This minimizes risk and maintains confidentiality when files such as doctors' notes need to be shared with third parties who do not require the subject's identity.
The system employs pattern recognition techniques, including detection algorithms that use templates and specialized knowledge of what constitutes a name, address, phone number and so forth, to automatically detect and remove PII. It must also be noted that the success of this system is domain-dependent, and that a preliminary investigation must be conducted before its successful delivery can be promised to a client.
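A crude version of such scrubbing can be sketched with regular expressions. The patterns below are hypothetical stand-ins; as the document itself notes, real detection relies on templates and domain knowledge, and simple pattern matching catches far less PII.

```python
import re

# Illustrative patterns only; real detection goes well beyond these.
PATTERNS = [
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
    (re.compile(r"\b(?:Mr|Mrs|Ms|Dr)\.\s+[A-Z][a-z]+\b"), "[NAME]"),
]

def scrub(text):
    """Replace each matched PII span with a placeholder token."""
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text

note = "Dr. Smith saw the patient; call 505-555-1244 or j.doe@example.com."
print(scrub(note))  # -> "[NAME] saw the patient; call [PHONE] or [EMAIL]."
```

A note scrubbed this way can be shared with a third party who needs the clinical content but not the subject's identity.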
Privacy enhancing layer
The various engines which make up this layer transform data into a form which
represents a lower privacy risk. They can be run either in batch mode, or on
the fly as
new datasets are being created.
Data minimization engine
Need
To maintain important information in a DB field while keeping the user
anonymous.
Solution
This engine is used to remove unneeded information from the fields of a record
by
converting the fields to a more general or less specific form. For example, a
market
researcher may employ minimization to convert the date of birth into a year of
birth.
Industry specific minimization routines could allow a full blood analysis to
be reduced to
a simple blood type.
Sample data - before minimization

Date of birth   Zip code   Income    Car
3/15/1973       90210      $90,000   Lexus
7/2/1968        84070      $40,000   none
11/12/1975      10115      $65,000   Pathfinder

Sample data - after minimization

Year of birth   State   Income               Car category
1973            CA      $75,000 - $100,000   Luxury
1968            UT      $35,000 - $60,000    none
1975            NY      $60,000 - $75,000    SUV
Interest vectoring engine
Need
Allows transactional records to be mined for one specific individual without knowledge of the actual transactions.
Solution
This engine uses industry-specific modules to convert a series of items into a set of perceived user interests. For example, a clickstream could be converted into a vector representing a user's perceived interest in sports, entertainment and news, based on the frequency with which the user visits web sites in those categories.
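The clickstream example can be sketched as follows. The site-to-category map is a hypothetical stand-in for the industry-specific modules the engine would actually ship with.

```python
from collections import Counter

# Assumed site-to-category module; real systems would use industry packs.
SITE_CATEGORY = {"espn.com": "sports", "cnn.com": "news",
                 "imdb.com": "entertainment", "bbc.com": "news"}

def interest_vector(clickstream):
    """Convert visited sites into normalized interest weights by visit frequency."""
    counts = Counter(SITE_CATEGORY[site] for site in clickstream
                     if site in SITE_CATEGORY)
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

clicks = ["espn.com", "cnn.com", "espn.com", "bbc.com"]
print(interest_vector(clicks))  # interest weights only; no raw URLs are kept
```

Only the derived weights leave the engine, so downstream applications can target interests without ever seeing the underlying transactions.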
Data aggregation engine
Need
Allows aggregate data to be gleaned from records when the raw data from the records isn't needed.
Solution
This engine converts a set of records into aggregate statistics and measures based on those records. Industry-specific modules allow specialized aggregation functions to be computed relevant to a given industry.
Sample data - before data aggregation

Patient name     Age   Sex   Disease
John Doe         65    M     Parkinson's
Peter Smith      41    M     Cancer
Erica Peterson   19    F     Depression
Jane Doe         27    F     Cancer
Mark Rogers      32    M     Depression

Sample data - after data aggregation

Age       18-30 = 2         31-50 = 2    51+ = 1
Sex       M = 3             F = 2
Disease   Parkinson's = 1   Cancer = 2   Depression = 2
Further Links
  • Link to application vendors to make sure that our interest vectors are compatible with those in use by market-leading applications. (I.e., if most online marketers use the same application to categorize/target individuals, we want to make sure that our output conforms to that format.)
Encryption engine
Need
Used when an identifier is suitable in place of one linked to personal identity, or when access to information needs to be restricted and only released with the consent of several parties.
Solution
The encryption engine can perform a number of actions:
i) One-way hash functions
Allow information to be converted, via an irreversible transformation, from a human-readable form to a unique identifier. For example, the names of users in a marketing database can be encoded using a 1-way hash function, thereby transforming each name into a unique code. This would allow marketers to profile databases without knowing the names of the people in the database.
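The name-encoding example can be sketched with a seeded hash. This is illustrative only: the seed value and normalization are assumptions, echoing the "seeded passphrase security" mentioned in Step 2 rather than specifying the engine's actual construction.

```python
import hashlib

SEED = b"per-deployment secret"  # assumed: a seed resists dictionary attacks

def hashed_identifier(name):
    """Irreversibly map a name to a stable unique code for profiling.
    Names are lower-cased first so case variants map to the same code."""
    return hashlib.sha256(SEED + name.lower().encode()).hexdigest()

a = hashed_identifier("John Smith")
b = hashed_identifier("john smith")
c = hashed_identifier("Jane Doe")
assert a == b and a != c  # stable per person, distinct across people
```

Because the same name always yields the same code, records can still be joined and profiled across tables even though the clear name is gone.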
ii) Two-way encryption
This process is used whenever sensitive information needs to be converted to a different form. Encryption is a reversible process, so it is only used if the actual information may be needed at a later time. For example, a marketer could encrypt the e-mail addresses of people in their database before sharing user profiles with third parties, to ensure that those third parties do not e-mail the marketer's customers without their consent.
Further Links
  • Link to the systems of PKI solution providers


Data access layer
This is the layer where previously encrypted data is decrypted in a controlled
fashion, in
order to unlock its value. This is where we believe the greatest opportunity
exists for our
invention.
Identity verification engine
Need
A means of verifying the identity of an individual so that they can
subsequently
authenticate their identity to the holder of their data. Such verification is
needed in cases
where the data collection occurred either offline or without user consent.
Solution
The identity verification engine uses known contact or personal data to verify
the identity
of the user.
Depending on the particular customer requirements and level of verification
certainty
required, various scenarios are possible:
  • E-mail: A validation token is sent to a known e-mail address belonging to the user.
  • Telephone: The user is called at a phone number on file and is given a validation token.
  • Snail mail: A validation token is physically mailed to a known address on file.
  • Third-party database checks: The user is challenged by being asked to supply personal information which is contained in offline databases such as credit reports. The queries should be such that it would be difficult for anyone other than the person associated with the data to possess the information. This process can occur either online or offline.
Regardless of the verification method used, this one-time process results in a unique credential being issued to the user. This credential is what they subsequently use to authenticate in order to view their own data.
This credential may take one or more of the following forms:
  • A PIN
  • A username/password combination
  • An X.509 digital certificate downloaded to the user's browser
  • A Brands or other type of credential stored locally within a Freedom client
Business model
We could charge:
  • a one-time verification fee commensurate with the verification method used and degree of security desired;
  • ongoing service fees based upon the number of validated users.
Blinded two-way communication engine
Need
The company needs a means of communicating with a consumer whose data record
they
hold, without knowing that consumer's identity.
Solution
Any PII that could be used to contact an individual (name, e-mail, address,
phone) is
multiply encrypted using keys belonging to different entities.
When a company wishes to contact a specific individual, they forward the text
of the
message to be sent, along with the encrypted contact info, to a sanitization
facility. There,
the blinded data passes through a chain of servers belonging to the customer,
us and the
audit partner. The chain of servers serves to distribute trust such that no
one entity can
link the user's identity with other information in the database.
For example, suppose a marketing company wishes to send targeted ads to individuals. The individual's e-mail address is encrypted first with a key belonging to an audit partner, then with our key, then with the customer's key.
When the customer wishes to send e-mail to that individual, they send the
message and
the encrypted e-mail address to the sanitization facility, where:
1. The company:
a. encrypts the message with the audit partner's public key
b. decrypts one layer of encryption on the individual's e-mail address
c. forwards both the above to our server
2. Our company:
a. does not touch the message contents
b. decrypts one layer of encryption on the individual's e-mail address
c. forwards both to the audit partner
3. The audit partner:
a. Decrypts the final layer of encryption to reveal the individual's e-mail
address
b. Decrypts the contents of the message
c. Forwards the message to the individual in question, on behalf of the
customer
The following table describes which of the three parties has access to which data:

                Sees personal data   Sees profile data   Sees message contents
Customer        No                   Yes                 Yes
Us              No                   No                  No
Audit partner   Yes                  No                  Yes

Variations on the above scenario include:
  • using the Freedom network to route e-mail where a greater degree of privacy, and perhaps a lesser degree of auditability, is required;
  • using a similar process for the blinded addressing of physical mail.
By reversing the process, the user can privately reply to the company. This is accomplished by making the From address in the e-mail {encrypted e-mail address}@private-mail-gateway.pwc.com. If each party reverses the process, the user can securely send messages back. Without this, the company could easily discover real e-mail addresses by sending users a note and asking them to reply. Further, this solves our problem of how to opt out: we no longer need to know the user's real address to opt them in or out.
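The layered encryption can be sketched with a toy cipher. This is purely illustrative: the XOR stream construction, key values and address are hypothetical stand-ins for the real public-key layers, and a real deployment would use proper asymmetric encryption for each party's layer.

```python
import hashlib
from itertools import count

def keystream(key, length):
    """Derive a keystream from SHA-256 blocks; a toy cipher for illustration."""
    out = b""
    for i in count():
        if len(out) >= length:
            return out[:length]
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()

def apply_layer(data, key):
    # XOR with the party's keystream; the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# One key per party in the chain (hypothetical values).
KEYS = {"audit": b"audit-partner-key", "us": b"our-key", "customer": b"customer-key"}

email = b"alice@example.com"

# Encrypt inner-first: the audit partner's layer is innermost, the customer's outermost.
blinded = email
for party in ("audit", "us", "customer"):
    blinded = apply_layer(blinded, KEYS[party])
assert blinded != email  # no party's database stores the clear address

# At send time each party peels exactly one layer in reverse order; only the
# audit partner, who never sees profile data, recovers the clear address.
for party in ("customer", "us", "audit"):
    blinded = apply_layer(blinded, KEYS[party])
assert blinded == email
```

Because each party holds only its own key, trust is distributed exactly as the table above shows: no single entity can link the clear address to the profile data.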
Further Links
  • Link to SMTP servers to allow for outbound delivery of e-mail.
  • Link to snail mail fulfillment organizations to allow for the sending of mass amounts of physical mail.
Business model
We could charge:
  • hosting and bandwidth fees for the servers that are part of this process;
  • transaction fees based on the volume of e-mails sent through the system;
  • licensing fees based on the number of users in the database.
Secure profile access engine
Need
A means is required by which individuals can access their own profile data.
Solution
A "personal portal," powered by our secure profile access engine, lives in the sanitization facility. When a user authenticates, his or her personal data is decrypted by a chain of several servers and presented to the user. The personal portal could be co-branded (customer + us).
Business model
This is perhaps the most interesting business opportunity over the long term. With a critical mass of users updating their profile data at a facility that is essentially controlled and operated by us, we are:
  • reinforcing our brand as the protector of personal information;
  • placing ourselves in a position where we are the intermediary between the user and all their privacy-sensitive interactions online, and can therefore try to connect these users with other corporate customers.
We could make money:
  • by charging an ongoing per-user fee to manage this process;
  • via customer acquisition fees and/or revenue sharing arrangements when we refer a user to a new partner within our network.
Third party data access engine
Similar to the above, but we provide a means whereby authorized entities can
decrypt
predetermined sets of data under specific circumstances.
The data access engine, which also resides within the sanitization facility,
brings together
the multiple keys required in order to decrypt data.
Business model
Since we will hold at least one of the encryption keys, we are in a position
to force
companies to come to us any time they want to unlock data.
The business potential here is limited only by the value that the company
places on the
particular data they need to access or manipulate.


Attachments
1. Case Study
2. Technical Solution Overview
[The remaining pages of the application are scanned images (diagrams and figures); the OCR text is not recoverable. Legible fragments include the headings "Ongoing privacy and technology audits" and "Value to Business".]
Representative Drawing

Sorry, the representative drawing for patent document number 2316743 was not found.

Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, consult the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History.

Title Date
Forecasted Issue Date Unavailable
(22) Filed 2000-08-28
(41) Open to Public Inspection 2002-02-28
Dead Application 2003-06-12

Abandonment History

Abandonment Date Reason Reinstatement Date
2002-06-12 FAILURE TO COMPLETE
2002-08-28 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $300.00 2000-08-28
Registration of a document - section 124 $100.00 2000-10-18
Registration of a document - section 124 $100.00 2002-08-08
Owners on Record

Note: Records show the ownership history in alphabetical order.

Current Owners on Record
ZERO-KNOWLEDGE SYSTEMS INC.
Past Owners on Record
FAVVAS, GEORGE
HILL, AUSTIN
KATIRAI, HOOMAN
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

List of published and non-published patent-specific documents on the CPD.


Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Cover Page 2002-02-01 1 19
Description 2000-08-28 61 3,226
Correspondence 2000-09-15 1 2
Assignment 2000-08-28 3 102
Assignment 2000-10-18 3 106
Correspondence 2002-03-08 1 20
Assignment 2002-08-08 31 1,425