Patent 3104100 Summary

(12) Patent Application: (11) CA 3104100
(54) English Title: PATIENT-FACING DIGITAL PLATFORM FOR HEALTH LITERACY AND NUMERACY
(54) French Title: PLATE-FORME NUMERIQUE ORIENTEE PATIENT DE LETTRISME ET DE NUMERATIE DE SANTE
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06F 16/20 (2019.01)
  • G06Q 50/22 (2018.01)
  • G16H 10/60 (2018.01)
  • G16H 20/10 (2018.01)
(72) Inventors :
  • POWELL, ROBERTA D. (United States of America)
(73) Owners :
  • POWELL, ROBERTA D. (United States of America)
(71) Applicants :
  • POWELL, ROBERTA D. (United States of America)
(74) Agent: NORTON ROSE FULBRIGHT CANADA LLP/S.E.N.C.R.L., S.R.L.
(74) Associate agent:
(45) Issued:
(86) PCT Filing Date: 2019-01-27
(87) Open to Public Inspection: 2020-01-02
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2019/015318
(87) International Publication Number: WO2020/005321
(85) National Entry: 2020-12-16

(30) Application Priority Data:
Application No. Country/Territory Date
62/691,310 United States of America 2018-06-28
16/142,911 United States of America 2018-09-26
16/173,282 United States of America 2018-10-29

Abstracts

English Abstract

A patient-facing digital platform designed to promote health literacy and numeracy that helps patients access, interpret, process and contextualize personal health data so that they may manage their health conditions. The platform interprets and simplifies personal health data into simple illustrations; provides substance-interaction and food-interaction information; offers health-education videos and dietary recommendations; and translates medication information into various languages.


French Abstract

L'invention concerne une plate-forme numérique orientée patient conçue pour favoriser le lettrisme et la numératie de santé qui aide des patients à accéder, à interpréter, à traiter et à contextualiser des données de santé personnelles pour qu'ils puissent gérer leurs états de santé. La plateforme interprète et simplifie des données de santé personnelles en simples illustrations ; fournit des informations d'interaction de substance et d'interaction alimentaire ; présente des vidéos d'éducation de santé et des recommandations alimentaires ; et traduit des informations de médication en diverses langues.

Claims

Note: Claims are shown in the official language in which they were submitted.


CA 03104100 2020-12-16
WO 2020/005321 PCT/US2019/015318
CLAIMS
1. A system for interpreting and managing drug interactions comprising:
a user interface for choosing an operative language; and
subsequent operations conducted in said operative language; and
a user interface for manual data entry of medications used by a patient; and
a user interface for manual data entry of substances used by a patient; and
information derived from manual data entry is converted to non-transitory computer-readable medium storing instructions; and
said instructions look up and display medications in comparison to all medications and substances entered as used by said patient; and
said instructions look up and display medication-to-medication interactions; and
said instructions look up and display medication-to-substance interactions; and
said instructions convert medication-to-medication and medication-to-substance interactions information into a color-coded graphic representation; and
said instructions look up and display text-format explanation of said medication-to-medication interactions and medication-to-substance interactions as depicted in said graphic representation; wherein
the medications entered by manual data entry are interpreted and compared for medical interactions, substance interactions, the results displayed as a graphic image in combination with explanation of the graphic in text format in the operative language.
2. The system for interpreting and managing health information of claim 1 further comprising:
a user interface for data entry by way of scanned image.
3. The system for interpreting and managing health information of claim 1 further comprising:
a user interface for data entry by way of barcode scanning.
4. A system for interpreting and managing diet and medication interactions comprising:
a user interface for choosing an operative language; and
subsequent operations conducted in said operative language; and
a user interface for manual data entry of food items; and
information derived from manual data entry is converted to non-transitory computer-readable medium storing instructions; and
said instructions look up and display food items compared with all medications and substances used by said patient; and
said instructions look up and display interactions between food items and medications and substances entered as used by said patient; and
said instructions look up and display food and medication and substance interactions in a color-coded graphic representation; and
said instructions look up and display text-format explanation of said food and medication and substance interactions as depicted in said graphic representation; wherein
the food items entered by manual data entry are interpreted and compared for medical interactions with medications and substances, the results displayed as a graphic image in combination with explanation of the graphic in text format in the operative language.
5. The system for interpreting and managing diet and medication interactions of claim 4 further comprising:
a user interface for scanned image entry of food items and medications and substances.
6. The system for interpreting and managing diet and medication interactions of claim 4 further comprising:
a user interface for data entry by way of barcode scanning of food items and medications and substances.
7. A non-transitory computer-readable medium storing instructions that when executed by a computer cause the computer to perform operations of a method comprising:
providing the selection of medication assistance; and
displaying medication to be evaluated; and
displaying substance to be evaluated; and
displaying manually entered data; and
displaying camera-scanned data entry; and
displaying barcode-scanned data entry; and
presenting a medication-information page relating to the entered data; and
displaying explanation of the indications and contraindications in text format in the operative language; wherein
a medication is described and a mode of data entry is chosen, and medication information and substance information is displayed, and medication and substance indications and contraindications are displayed.
8. The non-transitory computer-readable medium storing instructions of claim 7 that when executed by a computer cause the computer to perform operations of a method further comprising:
providing the selection of health values; and
displaying vital sign to be evaluated; and
displaying choice of voice-data entry; and
generating a color-coded graphic representation from the entered data; and
displaying action steps in text format in the operative language; wherein
a health value is denoted by a chosen vital sign and mode of data entry that is converted to a graphic analog representation with textual explanations of the vital sign data and action steps recommended.
9. The non-transitory computer-readable medium storing instructions of claim 7 that when executed by a computer cause the computer to perform operations of a method further comprising:
displaying substance interactions in text format in the operative language.
10. The non-transitory computer-readable medium storing instructions of claim 7 that when executed by a computer cause the computer to perform operations of a method further comprising:
displaying dosage and administration information in text format in the operative language.
11. The non-transitory computer-readable medium storing instructions of claim 7 that when executed by a computer cause the computer to perform operations of a method further comprising:
displaying warnings and precautions in text format in the operative language.
12. The non-transitory computer-readable medium storing instructions of claim 7 that when executed by a computer cause the computer to perform operations of a method further comprising:
providing the selection of diet assistant; and
displaying a food item to be evaluated; and
displaying manually entered data; and
displaying barcode-scanned data entry; and
presenting a dietary information page relating to the entered data; and
displaying explanation of the potential allergens, substance interactions and dietary information relating to said food item, in text format in the operative language; wherein
a food item is described and a mode of data entry is chosen, and potential allergens, substance interactions and dietary information is displayed.

Description

Note: Descriptions are shown in the official language in which they were submitted.


Patient-Facing Digital Platform for Health Literacy and Numeracy
[0001] This application is a continuation-in-part application of U.S. Patent Application No. 16/142,911, filed 09-26-2018. This application claims priority to provisional application No. 62/691,310, filed 06-28-2018.
TECHNICAL FIELD
[0002] The invention relates to systems and methods of clarifying health information and more particularly to clarifying, translating or simplifying medical information for laypersons. CPC schemes may include: Patient record management; Office automation, e.g. computer aided management of electronic mail or groupware; Time management, e.g. calendars, reminders, meetings or time accounting; Social work; ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records; and Computer-assisted prescription or delivery of medication, e.g. prescription filling or compliance checking.
BACKGROUND
[0003] Most patients in the United States lack sufficient understanding of the health information around their diagnoses and conditions. Health information includes prescription dosages and instructions; lab reports; patient literature; prescribed medications; over-the-counter medications; street drugs; possible interactions between drugs and with certain foods or alcohol; and warnings and side effects.
[0004] According to Communicate Health (communicatehealth.com), "Only 10% of adults have the skills needed to use health information." The remaining 90% lack the knowledge to understand and contextualize health information. The Healthcare Information and Management Systems Society states that "The ability to contextualize health information is a learned behavior, acquired through formal instruction or in medical/nursing school. Poor health numeracy and literacy skills are exacerbated by the lack of patient health education, customarily provided by registered nurses. However, due to financial constraints and a general nursing shortage, often nurses have no time to provide health education to patients. As a result, the interpretation and contextualization of health data is predominately performed by the clinicians (MD, NP, PA), who spend less than ten minutes face-to-face with their patients, leaving little time for education and dialog." As a result, patients find themselves unable to make informed decisions about dosage, or to adhere to a prescription regimen.
[0005] Americans of various educational levels face difficulty understanding written instructions or warning labels. For example, a patient may not know that the written prescription instructions "Take 3 times per day" actually means "Take every 8 hours." Patients may also be unable to fully understand the implications of their diagnoses or health conditions, and may consequently fail to make appropriate lifestyle and behavioral decisions. The result is worsening conditions and, in the language of hospital administrators, poor patient outcomes.
[0006] Healthcare consumers depend on clinicians or pharmacists to identify medication interactions and/or contraindications. When that fails, there is no easily accessible and reliable tool that interprets and/or clarifies medication instructions and interactions.
[0007] According to a recent study published in the journal Clinical Toxicology, "There is room for improvement in product packaging and labeling. Dosing instructions could be made clearer, especially for patients and caregivers with limited literacy or numeracy. One-third of medication errors resulted in hospital admission." Studies have shown that patients with poor literacy have difficulty understanding medication labels.
[0008] The problem is more acute among low-literacy patients and patients for whom English is a second language. This sector struggles to interpret health data much more than those versed in healthcare or those fluent in English.
[0009] According to Univision, the Hispanic population alone accounts for over $23 billion in prescription drug sales in the United States annually, yet few, if any, pharmacy chains translate the medication labels or instructions to Spanish. The U.S. Federal government does not require pharmacies to translate prescription medication labels for non-English speakers. There is no easily accessible and reliable tool that translates, interprets and/or clarifies medication instructions and interactions for Limited-English-Speaking Patients (LEP) or those who do not speak English.
[0010] Non-prescription or "street" drugs and/or alcohol are sometimes taken simultaneously with prescription drugs. Most patients are unaware that any two of these drugs may interact, sometimes dangerously. Nor are patients aware that street drugs and alcohol may interact with each other, or be contraindicated with an existing health condition. Increased cannabis use in states where it has been legalized warrants assessment of contraindications and interactions with other drugs. The use of opioids presents an additional example of the use of non-prescription drugs.
[0011] Polypharmacy is the concurrent use of multiple medications by a patient. In a 2014 report the National Institutes of Health (NIH) stated that "polypharmacy, defined as the use of multiple drugs or more than are medically necessary, is a growing concern for older adults." Older adults with cognitive decline are particularly vulnerable to incorrect medication self-administration. According to the NIH, "Specifically, the burden of taking multiple medications has been associated with greater health care costs and an increased risk of adverse drug events (ADEs), drug-interactions, medication non-adherence, reduced functional capacity and multiple geriatric syndromes."
[0012] Health literacy is the ability to grasp and interpret health information and data to make health decisions. Health literacy includes the elements of aural literacy, print literacy, numeracy and eHealth literacy. Aural literacy is the ability to understand what is heard. Print literacy is the ability to understand the written word or to write. Numeracy is the ability to understand numerals, calculations, logic and interpretation of numerical content. eHealth literacy refers to the ability to navigate web-based and computer-based content.
[0013] Numeracy, in general, refers to the ability to use mathematical concepts and methods. Innumeracy, in general, refers to the inability to use mathematical concepts and methods.
[0014] Health numeracy is the capacity to access, understand, process and interpret data in order to manage one's health or to make health-related decisions.
[0015] The self-management of chronic disease requires adequate health-numeracy skills. Health innumeracy may result in a patient's inability to interpret and contextualize data about their health, and in difficulty making informed decisions, which can lead to a worsening of symptoms or health conditions.
[0016] In the context of this disclosure, "medication" refers to vitamin supplements, over-the-counter (OTC) medications and prescription medications. "Substance" refers to any non-prescription medication; alcohol; street drug; legal or illegal drug.
[0017] A "machine-readable medium storing a program for execution by processor unit of a device" is commonly referred to as an application or app. Hundreds of apps offer health information and maintenance, but each app is specialized and limited by health condition. For example, blood-pressure monitoring, glucose-level monitoring, calorie counting or exercise regimentation apps are abundant in the field, but none provide qualitative or quantitative interpretation of health values or medications, nor do they warn against potential interactions.

SUMMARY
[0018] Q2Q is a patient-facing digital platform accessible via a smartphone, tablet and computer that helps people access, interpret, process and contextualize personal health data so that they may manage their health conditions. The platform interprets and simplifies personal health data such as vital signs and lab results; converts health data into simple, color-coded illustrations; and explains particular health information through animated videos. It checks for medication interactions between prescribed drugs and "street" drugs such as cannabis and opioids, as well as food, alcohol and other substance interactions; interprets nutrition labels as they relate to chronic conditions; suggests comparable medication alternatives to contraindicated medicines or substances; and translates medication information into various languages.
[0019] The platform returns information about drugs, interactions, side effects and prescription dosages, as well as information about chronic or acute health conditions. It offers relevant health education "explainer" videos about lab results and vital signs as well as evidence-based, relevant health education with behavioral, lifestyle and dietary suggestions. In instances of dangerous interactions or dosages the app emits audio warnings.
[0020] Q2Q integrates with electronic medical records (EMRs) through their APIs and via secure login.
[0021] Q2Q's "dashboard" window includes health numbers such as past lab values as well as current medications that may be downloaded from the patient's electronic health record.
[0022] In some embodiments the platform includes a program for receiving input in various ways, including:

[0023] Manual entry, via keypad, keyboard or similar text-entry means;
[0024] Scanned barcode entry, via camera;
[0025] Voice entry, via microphone;
[0026] Automatic download, via electronic medical record or patient portal.
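The four input modes listed in [0023]-[0026] can be pictured as a dispatch table. The following is a hypothetical sketch with stub handlers; the function names and the shape of the EHR record are assumptions, and real OCR, barcode and speech-recognition services are omitted:

```python
# Hypothetical dispatch of the four disclosed input modes to handlers.
# Handler bodies are stubs; a real app would call OCR, barcode-lookup
# and speech-recognition services here.
def from_keyboard(text):
    return text.strip()

def from_barcode(code):
    # A real implementation would look the code up in a drug database.
    return f"barcode:{code}"

def from_voice(transcript):
    return transcript.strip()

def from_ehr(record):
    # e.g. a dict downloaded from a patient portal (assumed shape)
    return record.get("medication", "")

HANDLERS = {
    "manual": from_keyboard,
    "barcode": from_barcode,
    "voice": from_voice,
    "ehr": from_ehr,
}

def capture(mode, payload):
    """Route a user entry to the handler for the chosen input mode."""
    if mode not in HANDLERS:
        raise ValueError(f"unknown input mode: {mode}")
    return HANDLERS[mode](payload)
```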
[0027] The Q2Q platform is multilingual; it accepts and delivers information in multiple languages via text or voice input. It also translates information into various languages using the Google Translate API. In some embodiments, the language used for information entry is specified by the user; in other embodiments the language is recognized by the program in the app. One skilled in the art understands that information typed, scanned, spoken, or downloaded may be interpreted by a program to determine the language of the information. Once the language of the information is determined, data is output in the same language. Alternatively a user may choose to have that data translated to another language.
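Paragraph [0027] leaves the recognition mechanism to the skilled reader. One crude, purely illustrative way to guess an entry's language is stop-word overlap; the word lists here are tiny samples, and a production app would instead use a detection library or a translation service's detection endpoint:

```python
# Crude illustrative language guess from common stop words; the word
# lists are toy samples, not a real detection method.
STOPWORDS = {
    "en": {"the", "and", "take", "with"},
    "es": {"el", "la", "tomar", "con", "y"},
    "fr": {"le", "la", "et", "avec", "prendre"},
}

def guess_language(text, default="en"):
    """Return the language whose stop-word list overlaps the entry most;
    fall back to the default when nothing matches."""
    words = set(text.lower().split())
    scores = {lang: len(words & sw) for lang, sw in STOPWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else default
```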
[0028] Q2Q uses artificial intelligence (AI) to analyze entered data, such as patient history, to interpret and extract values. Entered data is captured in a database. It uses character and voice recognition to extract relevant values from photographs of patient lab reports and verbal inquiries; analyzes extracted data; and presents information in user-friendly graphical elements.
[0029] One skilled in the art understands the ability of AI to recognize spoken words, scanned images, or text and convert the information to a machine-readable medium.
[0030] In another iteration of the embodiment, data entered into the app by the aforementioned methods is analyzed and converted into a simplified visual display of a patient's medical history, represented by text, graphic elements and audio prompts.
[0031] Through a text-messaging component, the app communicates alerts to a user's specified responsible parties (such as family members or friends). An example of data communicated would be warnings of high blood pressure or low blood-glucose levels or of a dangerous drug interaction. This feature may be activated at the user's discretion.
[0032] Other objects and features will become apparent from the following detailed description considered in conjunction with the accompanying drawings. Drawings are intended to illustrate rather than define the limits of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0033] To assist those of skill in the art in making and using the disclosed invention and associated methods, FIGS. 1-5 show the user interfaces of an example embodiment of the present disclosure, as shown displayed on a provided smartphone.
[0034] FIG. 1 is a plan view of a user interface screen as shown displayed on a provided smartphone.
[0035] FIG. 2 is a plan view of three related user interface screens.
[0036] FIG. 3 is a plan view of three related user interface screens.
[0037] FIG. 4 is a plan view of three related user interface screens with a graphic display interpreting results of entered data.
[0038] FIG. 5 is a plan view of three related user interface screens showing interpreted results of entered data.
[0039] FIGS. 6-13 show the user interfaces of a second example embodiment of the present disclosure, as shown displayed on a provided smartphone.
[0040] FIG. 6 is a plan view of a user interface screen of a second embodiment of the disclosure.
[0041] FIG. 7 is a plan view of three related user interface screens of the embodiment of FIG. 6.
[0042] FIG. 8 is a plan view showing results of user-entered information of FIG. 6.
[0043] FIG. 9 is a plan view of a user interface screen of the embodiment of FIG. 6 in which an example of a search result appears.
[0044] FIG. 10 is a plan view showing translation options (at the top of the screen) of a user-interface screen of the embodiment of FIG. 6.
[0045] FIG. 11 is a plan view of a text-magnify option (at the top of the screen) of the user-interface screen of the embodiment of FIG. 6.
[0046] FIG. 12 is a plan view of an interpretation feature of the user-interface screens of the embodiment of FIG. 6.
[0047] FIG. 13 is a plan view of an interaction checker of the user-interface screen of the embodiment of FIG. 6.
[0048] FIGS. 14-16 are flowchart views of user interaction with an iteration of the embodiment.
[0049] FIG. 14 is a flowchart of user interaction with the embodiment.
[0050] FIG. 15 is a flowchart of user interaction with the embodiment.
[0051] FIG. 16 is a flowchart of user interaction with the embodiment.
[0052] FIGS. 17-20 are flowchart views of user interaction with an iteration of the embodiment.
[0053] FIG. 17 is a flowchart of user interaction with an iteration of the embodiment.
[0054] FIG. 18 is a flowchart of user interaction with an iteration of the embodiment.
[0055] FIG. 19 is a flowchart of user interaction with an iteration of the embodiment.
[0056] FIG. 20 is a flowchart of user interaction with an iteration of the embodiment.
DESCRIPTION
[0057] In an embodiment 100, FIG. 1 shows the app's initial screen 110 for choosing a primary language, in this case English 136.
[0058] FIG. 2 shows the start 112 of a program. A program-feature choice 138 has been selected. A specific health value 140 is selected from the health-value selector 114, giving the further option of selecting a mode selector for entering data 116. Options for entering data are manual entry 142; scanned entry 144; and spoken entry 146.
[0059] FIG. 3 shows the app's manual-entry option 118 and a specific manual-entry example 148, 150. A scan-entry option 120 is shown on another app screen. In this case, for example, the user has scanned their lab results 152. A voice-entry option 122 is shown in a third app screen. In this case the user has spoken an entry 154.
[0060] FIG. 4 shows a graphic display 124 interpreting results of entered data. A graphic design shows a high blood pressure 160 and a button for more information 156 and another button 158 with suggested action steps. Selecting a button for more information 156 brings up information about the chosen topic 126. Selecting the "Action Steps" button 158 returns suggested action steps 128.
[0061] FIG. 5 shows a graphic display 130 interpreting results of entered data, in which a graphic design interprets results of entered data for, for example, LDL cholesterol 160 and triglycerides 132. The third illustration shows a navigation screen 134 for viewing historical data 164.
[0062] In a second iteration 200, FIG. 6, a user may select from various medication-entry methods 210 including manual medication entry 234, photograph entry 236 or barcode-scan entry 238.
[0063] FIG. 7, 200, shows screens that are the result of each choice. In the manual medication entry screen 212 the user types a medication or drug using an on-screen keyboard. In the camera-entry screen 214 the user has taken a picture of a medication label. In the scanned-barcode entry screen 216 the user scans a medication via their (provided) smartphone's camera.
[0064] FIG. 8, 200, shows a manual-entry result screen 218.
[0065] FIG. 9, 200, shows an example of a search result 220.
[0066] FIG. 10, 200, shows the translation screen 222, where one may choose to translate indications and usage 240; dosage and administration 242; dosage forms and strengths 244; or warnings and precautions 246.
[0067] FIG. 11, 200, shows the text-magnify option 224 in which options 248, 250, 252 are shown magnified.
[0068] FIG. 12, 200, shows the option to request entered information to be explained in simple terms 226. The information that was entered 256 may be simplified by tapping an explanation button 254. Once that button is tapped, the entered information is re-interpreted in simplified terms 228.
[0069] FIG. 13, 200, shows an interaction checker 230 with example medications 260, 262 entered and interactions 258 determined.
[0070] In FIG. 14, 200, a flowchart illustrates the progression of steps from a user's perspective. Upon opening the app 264 on their device, a user selects a language 266. From there they choose from three branches 268 to obtain medical information. Branch 1, "Health Values" 270, leads to an input screen (FIG. 15) for entering values such as blood pressure, cholesterol, etc.; Branch 2 (FIG. 14), "Medication Assistant" 274, leads to an input screen (FIG. 16) for entering medications; and a third branch (FIG. 14) 278 leads to a patient-portal connection which connects to a patient's EHR portal, where users enter their credentials to access their medical record 280. (Branch 3 is not illustrated further.)
[0071] The flowchart in FIG. 15, 200, illustrates results of choosing the first branch, "Health Values" numeral 1, 211. The dashboard 213 loads, showing the user's previous entries. In some embodiments, the app checks for connection to a patient portal and if found, allows the patient to log in to retrieve electronic health record information such as recent visits. The user enters a value to be interpreted 215, for example blood pressure, cholesterol, or other data 217. Options for input include manual input 219, in which the user types a value 225; camera-scan 221, in which the user employs the (provided) camera app on their smartphone to photograph or import 227 a photograph of, for example, lab values; and voice entry 223, wherein the user speaks information 229 into their smartphone using the smartphone's provided voice app. Once the health data is entered, the app generates information about each entry 231, interpreting results via a graphic design such as a dial 231, or as text; or in the form of an educational video 235 or a video of action steps 237. In instances of potentially dangerous interactions, the app emits a warning sound 236. Subsequent options include re-entering a corrected value 233 to start the process again.
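The dial-style interpretation described above can be sketched as a banded lookup. The cut-offs below follow common clinical guidance for blood pressure but are assumptions for illustration, not values taken from the disclosure:

```python
# Illustrative mapping of a blood-pressure entry to a color-coded dial
# zone with an action-step hint. The bands are assumed, not disclosed.
def bp_zone(systolic, diastolic):
    """Return (color, action-step text) for a blood-pressure reading."""
    if systolic >= 180 or diastolic >= 120:
        return ("red", "Hypertensive crisis: seek care immediately.")
    if systolic >= 140 or diastolic >= 90:
        return ("orange", "High blood pressure: contact your clinician.")
    if systolic >= 120:
        return ("yellow", "Elevated: review diet and activity.")
    return ("green", "Normal range.")
```

A reading of 118/76 would land in the green zone under these assumed bands, while 185/110 would land in red and could also trigger the warning sound described above.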
[0072] The flowchart in FIG. 16 200 shows events after the user chooses the
"Medication
Assistant" branch 2, 241. The dashboard 243 loads, showing the user's current
medications and
other drugs. In some embodiments, the app checks for connection to a patient
portal and if found,
allows the patient to log in to retrieve electronic health record information
such as recent visits.
The user enters a medication or substance to be interpreted 245. Options for
input include
manual input 247, in which the user types 253 the name of the medication, drug
or substance;
13

CA 03104100 2020-12-16
WO 2020/005321 PCT/US2019/015318
camera-scan 249, in which the user employs the (provided) camera app on their
smartphone to
photograph or import 255 a photograph of a medication label; and barcode-scan
251, in which a
user scans the barcode 257 on their over-the-counter medication using the
app's barcode-
scanning feature into their smartphone using the smartphone's provided voice
app. Once a
medication or substance is entered, the app seeks confirmation 259. If
incorrect, the app re-routes
261 to the medication-entry step 245. If the entered medication or substance
is confirmed by the
user as correct, the app generates information about that medication or
substance 263 including
indications; values and types of dosage; administration; contraindications;
precautions and
warnings; and comparable medication alternatives. If the entered medication or
substance has
contraindications or possible interactions with their current medications, a
pop-up box 265 will
appear with this information. If an interaction or contraindication is
dangerous, the app will emit
a warning sound. Users may adjust the text size 267 of the generated results
by using a graphical
slider. They may add 269 this medication to a list of current medications.
[0073] FIG. 17 illustrates a third example iteration, 300. Upon opening the
app on their
device a user selects a language 312 and a text size 314, bringing them to a
"Get Started"
window 316. From there the user may choose from three branches to obtain
medical information.
Branch 1, "Health Values" 318, leads to an input screen (FIG. 18) for entering
values such as
blood pressure, cholesterol, etc. Branch 2, "Medication Assistant," FIG. 17,
320 leads to an
input screen that starts a process (FIG. 19) for obtaining medication
information. Branch 3, "Diet
Assistant" FIG 17, 322 leads to an input screen that starts a process by which
the user can check
food and drug interactions.
[0074] In FIG. 18, 300 a "Health Values" branch 1, 330 illustrates the
interpretation of user-
entered health data. The dashboard 332 loads, showing the user's previous
entries. In some
embodiments, the app checks for connection to a patient portal and if found,
allows the patient to
log in to retrieve electronic health record information such as recent visits.
The user enters a
value to be interpreted 334, for example blood pressure, cholesterol, or other
data 336. Options
for input include manual input 338, in which the user types a value 344;
camera-scan 340, in
which the user employs the (provided) camera app on their smartphone to
photograph or import
346 a photograph of, for example, lab values; and voice entry 342, wherein the
user speaks
information 348 into their smartphone using the smartphone's provided voice
app. Once the
health data is entered, the app generates information about each entry,
interpreting results via a
graphic design such as a dial 350, or as text; or in the form of an
educational video 354 or a
video of action steps 356. If a health-data level is dangerous, the app will
emit a warning sound
355. Subsequent options include re-entering a corrected value 352 to restart
the process.
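The interpretation step above (a value mapped to a dial position, with a warning sound for dangerous levels) can be sketched with blood pressure as the example value. The cutoffs below follow common clinical guidance and are assumptions for illustration, not values taken from the patent text.

```python
# Illustrative sketch of the "Health Values" interpretation in FIG. 18:
# an entered blood-pressure reading is mapped to a dial category (350),
# and a dangerous level triggers the warning sound (355).

def interpret_blood_pressure(systolic, diastolic):
    """Return (dial_category, emit_warning_sound) for one BP reading."""
    if systolic >= 180 or diastolic >= 120:
        return "hypertensive crisis", True   # dangerous level: warning sound 355
    if systolic >= 140 or diastolic >= 90:
        return "high (stage 2)", False
    if systolic >= 130 or diastolic >= 80:
        return "high (stage 1)", False
    if systolic >= 120:
        return "elevated", False
    return "normal", False
```

The returned category would drive the dial graphic or text result, while the boolean drives the audible warning.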
[0075] FIG. 19, 300 shows events after the user chooses the "Medication
Assistant" branch
2, 360. The dashboard 364 loads, showing the user's current medications. In
some embodiments,
the app checks for connection to a patient portal and if found, allows the
patient to log in to
retrieve electronic health record information such as recent visits. The user
enters a medication
or substance to be interpreted 366. Options for input include manual input
368, in which the user
types 374 the name of the medication; camera-scan 370, in which the user
employs the
(provided) camera app on their smartphone to photograph or import 376 a
photograph of a
medication label; and barcode-scan 372, in which a user scans the barcode 378
on their over-the-
counter medication using the app's barcode-scanning feature. Once a medication
is entered, the
app seeks confirmation 380. If incorrect, the app re-routes 382 to the
medication-entry step 366.
If the entered medication/substance is confirmed by the user as correct 380,
the app generates
information about that medication/substance 384 including indications; values
and types of
dosage; administration; contraindications; precautions and warnings; and
comparable medication
alternatives (for example, if there is an interaction with acetaminophen, the
app suggests
ibuprofen). If the entered medication/substance has contraindications or
possible interactions
with their current medications, a pop-up box 386 will appear with this
information. Users may
adjust the text size 388 of the generated results by using a graphical slider.
They may add 390
this medication to a list of current medications.
[0076] FIG. 20, 300 illustrates events after the user chooses the "Diet
Assistant" branch 3,
311. The dashboard 313 loads, showing the user's caloric intake for a defined
duration, as well
as relevant data on fat, cholesterol, sodium and other intake. The user enters
a food 315 in one of
two ways: manual input 317, in which the user types 321 the name of the food;
or barcode-scan
319, in which the user scans the barcode 323 of their food product using the
app's barcode-
scanning feature. Once a food is entered, the app presents an image of the
entered food 325 for
confirmation. If incorrect, the app re-routes 327 to the food-entry step 315.
If the entered food is
confirmed by the user as correct 325, the user chooses the generated image and
a dietary
information page 329 opens, which verifies serving size and other dietary
information. An option
appears 335 to add dietary information to a daily sum for values relevant to
medical history. If
the entered food contains allergens or is commonly processed with known
allergens, the app
generates an allergy warning 331. If the entered food contains ingredients
that may interact with
the user's current medications a drug interaction warning 333 appears.
Concurrent with these
options is an option to add a new food 337 to begin the Diet Assistant process
on that entry.
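The Diet Assistant checks described above (allergen warning, drug-interaction warning, and the running daily sum) can be sketched as follows. The food database, its values, and the function names are illustrative assumptions, not data from the patent.

```python
# Hypothetical sketch of the Diet Assistant flow in FIG. 20: a confirmed food
# triggers the allergen (331) and drug-interaction (333) checks, and its values
# can be added to the running daily sum (335). All data below is made up.

FOODS = {
    # name -> (calories, sodium_mg, allergens, interacting_drugs)
    "grapefruit": (52, 0, set(), {"atorvastatin"}),
    "peanut butter": (188, 152, {"peanut"}, set()),
}

def diet_assistant(food, user_allergens, current_medications, daily_sum):
    """Return (warnings, updated daily sum) for a confirmed food entry."""
    calories, sodium, allergens, interacting = FOODS[food]
    warnings = []
    if allergens & set(user_allergens):
        warnings.append(f"allergy warning: {food}")            # box 331
    if interacting & set(current_medications):
        warnings.append(f"drug interaction warning: {food}")   # box 333
    updated = {"calories": daily_sum["calories"] + calories,
               "sodium_mg": daily_sum["sodium_mg"] + sodium}
    return warnings, updated
```

Each confirmed entry both updates the dashboard totals and, where the illustrative tables match, raises the corresponding warning box.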
[0077] In all the above iterations (100–300), a menu icon appears at all
times at a corner of
each screen, allowing any of these options at any point: "Home" to return to
the Get Started
page; "Profile" to return to the dashboard; "Add Medication" to return to the
Medication
Assistant; or "Back" to return to the previously visited page.
[0078] These embodiments are understood to be exemplary and not limiting.
Additions and
modifications to what is expressly described here are understood to be
included within the scope
of the invention. The features of the various embodiments described here are
not mutually
exclusive and can exist in various combinations and permutations, even if such
combinations or
permutations are not made express here, without departing from the spirit and
scope of the
invention.
Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.

Title Date
Forecasted Issue Date Unavailable
(86) PCT Filing Date 2019-01-27
(87) PCT Publication Date 2020-01-02
(85) National Entry 2020-12-16
Dead Application 2023-07-27

Abandonment History

Abandonment Date Reason Reinstatement Date
2022-07-27 FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee 2020-12-16 $400.00 2020-12-16
Maintenance Fee - Application - New Act 2 2021-01-27 $100.00 2020-12-16
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
POWELL, ROBERTA D.
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.

Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Abstract 2020-12-16 2 64
Claims 2020-12-16 6 161
Drawings 2020-12-16 20 623
Description 2020-12-16 17 650
Representative Drawing 2020-12-16 1 16
Patent Cooperation Treaty (PCT) 2020-12-16 2 69
International Search Report 2020-12-16 3 151
Declaration 2020-12-16 4 167
National Entry Request 2020-12-16 7 291
Cover Page 2021-01-27 2 41