CA 02486309 2004-11-09
WO 03/096262 PCT/IL03/00386
SYSTEM AND METHOD FOR ANALYSIS OF DATA
RELATED INVENTIONS
PCT Application PCT/IL01/01074, for METHOD AND SYSTEM FOR CREATING MEANINGFUL SUMMARIES FROM INTERRELATED SETS OF INFORMATION UNITS, filed on 8th of January 2002, is related to the present invention and is incorporated herein by reference. The present application is also related to US Patent Application serial no. 10/152,367 filed 21 May 2002 titled SYSTEM AND METHOD FOR ANALYSIS OF IMAGERY DATA, US Patent Application serial no. 10/143,508 filed 10 May 2003 titled A SYSTEM AND METHOD FOR ANALYZING AND EVALUATING OF ELECTRIC SIGNAL RECORD, US Patent Application serial no. 10/145,574 filed 13 May 2002 titled SYSTEM AND METHOD FOR ANALYZING AND EVALUATION OF AUDIO SIGNALS, and US Patent Application serial no. 10/145,575 filed 13 May 2003 titled A SYSTEM AND METHOD FOR ANALYZING AND EVALUATING HUMAN BEHAVIOUR STIGMATA, the contents of all of which are incorporated herein and the priority from which is claimed.
FIELD OF THE INVENTION
The present invention generally relates to a system and method for summarizing information units and evaluating the summarized information. More specifically, the present invention relates to the summarizing of an image data record by selecting a particular aspect of the record and manipulating the record such that a meaningful evaluation thereof is accomplished. The present invention also relates to electric potential data selection, analysis and summary for the facilitation of relevant data extraction, more specifically the analysis of medically related electric data for the facilitation of medical purposes, such as evaluation,
SUBSTITUTE SHEET (RULE 26)
diagnosis, treatment and prognosis. The present invention also relates to audio data selection, analysis and summary for the facilitation of relevant data extraction, more specifically the analysis of medically related audio data for the facilitation of medical purposes. The present invention further relates to analysis and summary for the facilitation of relevant data extraction, more specifically to a system and method for the analysis and evaluation of human behavior stigmata obtained from various instruments measuring audio and visual output relating to a patient.
DISCUSSION OF THE RELATED ART
Medical diagnosis, or the identification of a disease or a specific medical condition from its symptoms, has long exceeded its dictionary definition. Currently, medical diagnosis relies substantially on a multitude of laboratory tests and high-technology diagnostic tools. These procedures include a plurality of laboratory examinations such as biochemical tests, blood tests, microscopy, diverse imaging procedures, and the like.
Today, the average patient entering any given Emergency Room (ER) ward will most likely be given, after a brief physical examination, a wealth of diagnostic investigations using a multitude of investigative tools. The diagnostic tools will typically include traditional laboratory tests, such as Electrocardiogram (ECG) and X-ray photography. If needed, further diagnostic tools will be applied, such as Ultrasound (US), Computed Tomography (CT) and the like. The diagnostic tools provide the physician with a wealth of information regarding the anatomy, physiology and pathology of the patient. Diagnostic tools have been evolving rapidly during the last few decades, due mainly to advances in the understanding of human biology as well as to accelerated progress in computing and imaging techniques. The rapid development of diagnostic evaluation tools has brought
about a plethora of information to the attending physician. In order to apply the appropriate therapy or to initiate the performance of further tests, physicians must read and interpret the data provided by the diagnostic tools within minutes. Proper and precise assessment, evaluation and interpretation of the data provided by most diagnostic tools require ample time and proficiency, exactly two of the things that most physicians do not possess in a realistic medical environment. Thus, the problem of assessing, evaluating and interpreting the information provided by the diagnostic tools, particularly those operative in the field of imaging, has led to the creation of a special subspecialty of physicians capable of assessing the data produced by diagnostic tools during a relatively short period of time with relatively high accuracy. Still, even radiologists are incapable of observing the entire range of details associated with the data provided, since the capacity of the ordinary human senses is naturally limited. The human brain can handle only a limited amount of information in a given period of time when deciding which of the information is relevant and which is unimportant. In addition to these natural limitations, a human observer is never free of external interruptions and internal interferences that may obscure and distort the perceived pattern of the data received. Furthermore, the human capability of observing minute, subtle changes in high-volume data and evaluating the importance of the perceived changes is typically inadequate. The above-mentioned interruptions, interferences and natural limitations apply not only to radiologists but also to all humans operating in the field of medical diagnostic evaluation, such as pathologists, laboratory technicians, physicians of all specialties, and the like.
There is a relatively small number of working solutions to overcome the limitations in the evaluation, assessment and interpretation of diagnostic tools in the medical field. Most of these solutions involve the application of mathematical models, such as, for example, Fourier analysis and other similar techniques.
Due to the considerable amount of erroneous interpretations (false positive, false negative), most health practitioners avoid using such interpretations of the data provided by the diagnostic tools. Although some diagnostic tools, such as special Ultrasound (US) machines, summarize parts of the information and display the summary to the user in a form that is easier to understand, such a summary does not include any meaningful evaluation and still requires expertise in the evaluation thereof. The above-mentioned diagnostic tools are typically very expensive and difficult to operate, such that a specifically trained medical operator is required for the correct interpretation thereof. There is therefore a need in the art for a rapid, accurate, integrated and cost-effective system and method for summarizing and evaluating image data records obtained from medical diagnostic investigation tools.
Presently available computer-facilitated methods for evaluating medically related data are insufficient. An important aspect of the present invention is the selection of complex, important and significant sections of data, which allows the viewing of meaningful extracts. The cost of evaluating medically related data is substantially high for a number of reasons. Most techniques for the assessment, evaluation and summarizing of medical data require training of personnel in the technique procedure and application. In addition, personnel performing such tasks require substantial knowledge and training in the actual subject matter. Furthermore, there is an enormous amount of data being produced by examination devices, such as imaging examination devices, as well as by other devices such as communication and systems devices. It is virtually impossible, if only from a time availability point of view, for medical personnel needing to access information to perform a required task without having access to summary information presentations, unless additional time is spent. Another advantage of the present invention involves human limitations in observing data
amount and detail change. The present invention enhances and stresses the relevant parts of the data, thus allowing a human observer to concentrate on relevant data and details otherwise blended within vast amounts of data, as well as presenting them to the user in a simplified manner. The present invention also provides for a meaningful summary and evaluation of three-dimensional images, which typically contain a large quantity of information.
All aspects mentioned above are significantly costly and demonstrate the need in the art for an improved method for producing meaningful, accurate and effective summary data. The present invention represents a significant advance in summarizing techniques.
In addition, the field of medical diagnosis is as ancient as medicine itself. During the last decade, great discoveries led to significant advances in medical diagnosis, treatment and prognosis. In ancient times, clinical examination and physician intuition were the only tools at the disposal of the physician for the achievement of a diagnosis. The twentieth century has brought about great scientific discoveries and with them an explosion of new diagnostic tools.
The discovery of X-rays in the early 1900s led to the development of the X-ray machine and, later on, Computerized Tomography (CT) machines. Ultrasound (US) imaging emerged as a diagnostic tool following the Second World War. The field of US imaging continued to develop and reached new heights with the use of the Doppler effect to demonstrate flow and motion of body parts. Imaging diagnosis continued to expand to include other forms of imaging such as nuclear scans, contrast-based imaging and Magnetic Resonance Imaging (MRI).
The field of electric-based diagnosis has produced diagnostic tools such as the Electrocardiogram (ECG) and Electroencephalogram (EEG). These tools measure
the electric potential of different organs in the human body, such as the heart and the brain, among others, which is represented as visual data called electric graphs. The electric potential differences, measured between electrodes placed on the human body, are used to produce an electric signal summed into an Electric Data Record (EDR), generally referred to by names correlating with the organs or organ parts being examined. This data can also be saved or transferred for the purpose of further evaluation and consultation by other physicians.
Such tools present to the physician a great wealth of information regarding human body anatomy, physiology, biochemistry, pathology, etc. Interpretation of this data requires a great deal of expertise. Still, many physicians continue to interpret diagnostic information on a day-to-day basis.
The human observation capability, though elaborate and complex, is insufficient to fully analyze the enormous wealth of information contained within diagnostic images obtained today. Most physicians observe changes occurring from what is believed to be a normal examination.
When analyzing an EDR, also referred to as an electric graph, such as an ECG strip, the trained physician will look for gross pattern differences from a predetermined, so-called average normal ECG, as well as compare the ECG strip of a particular patient to a previously performed ECG. The changes sought can include deviation of isoelectric lines from baseline, alteration in the length of intervals between wave complexes, changes in the size and shape of wave complexes, and the like. A thorough examination of an ECG strip may require special rulers and may take valuable time even for a trained physician. Even the most skilled physician may miss small alterations not readily visible to the human observer. Such alterations may be of importance to the medical diagnosis.
There is therefore a need in the art for a faster and more accurate diagnosis of the information contained within electrical graphs.
Moreover, the twentieth century has brought about great scientific discoveries and with them an explosion of new diagnostic tools, such as the X-ray machine, Ultrasound and Doppler machines, nuclear imaging machines, Magnetic Resonance Imaging machines and the like. Auscultatory Medical Diagnostic Tools (AMDT), such as the stethoscope, are among the most archaic yet basic tools of diagnosis. AMDT have not changed significantly since their invention, and their use by physicians is declining due to their subjective nature. Typically, a physician auscultates the body surface above or in substantial proximity to the organ under examination, listening for any abnormal sounds such as abnormal lung sounds, abnormal heart sounds, abnormal blood flow sounds, abnormal fetal heart sounds, abnormal bowel sounds and the like. When used by an experienced physician, an AMDT can be a valuable, simple-to-use and simple-to-operate diagnostic tool. The AMDT's flaws are due to the highly subjective nature of human sound interpretation capabilities; thus, no standardization, recording, or accepted forms of analysis are practiced to date. AMDT are basically audio amplifiers of sound emanating from the human body. Said sounds are in fact an Audio Data Stream (ADS). The audio data stream contains a wealth of information regarding the observed organ and body elements. The human ear is a relatively insensitive audio receiving tool; it perceives sounds in a very limited range, and thus the human brain receives a substantially small part of the audio data emitted from the human body. In addition, the human brain's interpretation is not impartial; it is affected by many factors of the internal environment, such as psychic state, physical well-being and the like, as well as many other environmental stimuli,
such as visual, smell, vibratory, as well as secondary audio stimuli, and the like. As a consequence, the human interpretation of an ADS is substantially incomplete.
An audio data stream can also be acquired by Ultrasound (US) machines using the Doppler mode. Said machines are typically used to assess the flow of bodily fluids, typically within veins, arteries and other bodily fluid conduits and reservoirs. Typically, the examiner will place the instrument in substantially close proximity to the area examined and listen to the audio interpretation of the ultrasound waves returning from the tissue. The US-Doppler machine described hereinabove uses high-frequency sound to measure flow and converts it into a low-frequency sound adapted to the human ear's range. The human hearing and interpretation capability, though elaborate and complex, is insufficient to fully analyze the enormous wealth of information contained within Audio Data Streams obtained by various medical diagnostic tools. There is therefore a need in the art for an accurate diagnosis of the information contained within such audio data streams.
In addition, human behavior is a complex form of data output composed of a very large number of audio and visual signs, referred to here as Human Behavior Stigmata (HBS). The HBS form part of the human communication tools. A great deal of information regarding the human condition is expressed in verbal as well as non-verbal (visual) form. Much of this information is subconscious and very subtle, such that much information is unused in day-to-day interactions. Sickness can affect both verbal and visual information emanating from a sick subject. In the field of psychiatry, the verbal and visual information extracted from a patient are the only clues for the elucidation of the underlying cause. In psychiatry today, the psychiatric interview is the only tool at the disposal of the physician for the elucidation of a diagnosis.
Up until the twentieth century, psychiatric disease was considered outside the medical field, since no organic brain pathology was found. During the twentieth century, advances in cell and molecular biology have led to a greater understanding of the microstructure and workings of the brain. The understanding that the core problem of many psychiatric diseases lies with the abnormal function of the brain has "certified" the field. The Diagnostic and Statistical Manual of Mental Disorders (DSM) was developed in order to allow physicians to standardize the assessment of psychiatric patients and to define their individual illnesses. Still, even today, the diagnosis of a psychiatric condition and the separation of such a disease from other entities is not simple. The reasons for this can include the statistical nature of the DSM, the great disparity in the interpretation of symptoms by psychiatrists, and the complexity of the human language and behavior, which is the core of the diagnostic signs and symptoms of psychiatric illnesses.
A typical psychiatric evaluation is done in an office setting where the patient and the psychiatrist face each other seated on chairs, or in another similar setting. The psychiatrist observes the patient's behavior stigmata while asking specially directed questions designed to elucidate the psychiatric disturbance. The psychiatrist must note the visual as well as the audio output emanating from the patient. The visual output can include facial expressions and gestures, body movements, habitus and the like. Audio output of importance can include content, fluency, order, vocabulary, and pitch, to mention a few. Once the interview is over, the physician summarizes the findings and matches them to the minimum requirements suggested by the DSM. In some cases, additional exams are required in order to define a psychiatric illness. In some institutes, the psychiatric interview is videotaped for the purpose of further analysis.
The human observation capability, though elaborate and complex, is
insufficient
to fully analyze the enormous wealth of information emanating from the patient
and delivered to the physician as both non-verbal and verbal outputs in a relatively short time span. In order to diagnose a psychiatric illness, a great deal of experience is required. In many cases, a psychiatric diagnosis will be missed for a relatively long period of time due to the complexity of the task.
Once a psychiatric disease is diagnosed, therapy is initiated. Therapy may include therapeutic chemicals, psychoanalysis, group therapy, as well as a myriad of other forms of therapy. The efficacy of therapy is evaluated in repeated psychiatric interviews. Even the most skilled physician may miss small alterations in behavior and appearance that are not readily visible to the human observer, thus misinterpreting the reaction to therapy. Such alterations may be of importance to the medical diagnosis, treatment and prognosis.
There is therefore a need in the art for a fast and accurate diagnostic and therapeutic evaluation tool for the information contained within human behavior and appearance.
SUMMARY OF THE PRESENT INVENTION
One aspect of the present invention regards a medical diagnostics system enhanced by the application of an imagery data analysis instrument operative in the processing of imagery data for the purpose of generating meaningful imagery data summaries. The system comprises an imagery input device to acquire imagery data of subjects and to transmit a stream of the acquired imagery data to an associated data processing apparatus. The system further comprises a data processing device associated with the imagery input device. The data processing device comprises a data storage device to store the stream of acquired imagery data, to store imagery data processing control parameters and to store the programs constituting the imagery data analysis application. It further includes a processor device to process the stream of imagery data through the
application of the imagery data analysis application, an imagery data analysis application for the generation of meaningful summaries of the imagery data received from the at least one imagery input device, and a user interface device to enable the users of the system to suitably control the operation of the system.
Another aspect of the present invention regards a medical diagnostics method utilizing an imagery analysis mechanism for the purpose of creating meaningful summaries of imagery data, the method comprising the steps of: acquiring a stream of imagery data of a subject via an imagery input device; processing the imagery data for the purpose of generating meaningful imagery data summaries; and presenting the meaningful imagery summaries to a user.
Yet another aspect of the present invention regards a method for analysis and evaluation of electric data, the method comprising: receiving information from an input device; calculating the complexity of the information received; calculating an indicative parameter from the complexities; and analyzing and converting the indicative parameter into final results.
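The four steps recited in this method can be illustrated with a short sketch applied to a digitized electric data record such as an ECG trace. The windowed Shannon-entropy complexity measure, the baseline value and the decision threshold below are illustrative assumptions chosen for the sketch; the invention does not fix any particular measure or threshold.

```python
# Illustrative sketch of the four-step method: receive samples, calculate a
# per-window complexity, derive an indicative parameter, convert to a result.
# Entropy as the complexity measure and the thresholds are assumptions.
from collections import Counter
from math import log2

def window_complexity(samples, bins=16):
    """Complexity of one window: Shannon entropy of its quantized amplitudes."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1  # guard against a flat window
    counts = Counter(int((s - lo) / span * (bins - 1)) for s in samples)
    total = len(samples)
    return -sum(c / total * log2(c / total) for c in counts.values())

def analyze_edr(samples, window=50, baseline=2.0):
    # Step 1: receive information (samples) from an input device.
    # Step 2: calculate the complexity of each fixed-size window.
    windows = [samples[i:i + window]
               for i in range(0, len(samples) - window + 1, window)]
    complexities = [window_complexity(w) for w in windows]
    # Step 3: calculate an indicative parameter from the complexities
    # (here: mean deviation from an assumed "normal" baseline value).
    indicative = sum(abs(c - baseline) for c in complexities) / len(complexities)
    # Step 4: convert the indicative parameter into a final result.
    result = "within expected range" if indicative < 1.0 else "flag for review"
    return complexities, indicative, result
```

A caller would pass the raw sample list acquired from the input device, e.g. `analyze_edr(samples)`, and receive the per-window complexities, the indicative parameter and a human-readable result.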
Yet another aspect of the present invention regards a system for analysis and evaluation of electric data, the system comprising: an input device for capturing information; a computing device for calculating the complexities of the captured information, analyzing and converting the complexities into indicative parameters, and interacting with the storage device, user interface devices and input devices; a storage device for providing the computing device, user interface devices and input devices with storage space, and for storing captured, analyzed and converted information; and a user interface device for displaying information to the user and for the interaction of user and system.
Yet another aspect of the present invention regards a method for analysis and evaluation of audio signals, the method comprising: receiving information from an input device; calculating the complexity of the received information; calculating an indicative parameter from the complexities; and converting the indicative
parameter into results for playback and display, whereby better usage of audio information is achieved.
Yet another aspect of the present invention regards a system for analysis and evaluation of audio signals, the system comprising: an input device for capturing audio signals; a processing device for manipulation of the captured audio signals; a computing device for calculating audio signal parameters; a storage device for storing audio-signal-related information; and a user interface device for user interaction with the input device, processing device, computing device and storage device, whereby a user can receive immediate and accurate analysis and evaluation of captured audio signals.
Yet another aspect of the present invention regards a method for analysis and evaluation of audio and video data, the method comprising: receiving information from an input device; calculating the complexity of the information received; calculating an indicative parameter from the complexities; and analyzing and converting the indicative parameter into final results.
Yet another aspect of the present invention regards a system for analysis and evaluation of audio and video data, the system comprising: an input device for capturing information; a computing device for calculating the complexities of the captured information, analyzing and converting the complexities into indicative parameters, and interacting with the storage device, user interface devices and input devices; a storage device for providing the computing device, user interface devices and input devices with storage space, and for storing captured, analyzed and converted information; and a user interface device for displaying information to the user and for the interaction of user and system.
Other objects and advantages of the present invention will be evident
to the person skilled in the art from a reading of the following brief
description of
the drawings, the detailed description of the present invention, and the
appended
claims.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings, in which:
Fig. 1 is a schematic block diagram of an exemplary system infrastructure of the present invention;
Fig. 2 is a simplified flow chart illustrating the operation of the system and method of the present invention;
Fig. 3 is a simplified block diagram of an exemplary complexity calculation for a three-dimensional image;
Fig. 4 is a simplified block diagram of an exemplary production of a complexity file for a three-dimensional image;
Fig. 5 is a simplified flow chart illustrating the operation of the present invention according to the first preferred embodiment;
Fig. 6 is a simplified flow chart illustrating the operation of the present invention according to the second preferred embodiment;
Fig. 7 shows the parts of an alternative system of the present invention in accordance with a preferred embodiment of the present invention;
Fig. 8 shows the operation of the alternative system of the present invention;
Fig. 9 depicts a second alternative system of the present invention and the exemplary operation thereof in accordance with another alternative embodiment of the present invention;
Fig. 10 shows the operation of the second alternative system of the present invention, in accordance with another alternative embodiment of the present invention;
Fig. 11 is a third alternative system of the present invention and the exemplary operation thereof in accordance with another alternative embodiment of the present invention; and
Fig. 12 shows the operation of the third alternative system of the present invention, in accordance with another alternative embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
PCT Application PCT/IL01/01074, for METHOD AND SYSTEM FOR CREATING MEANINGFUL SUMMARIES FROM INTERRELATED SETS OF INFORMATION UNITS, filed on 8th of January 2002, is related to the present invention and is incorporated herein by reference. The present application is also related to US Patent Application serial no. 10/152,367 filed 21 May 2002 titled SYSTEM AND METHOD FOR ANALYSIS OF IMAGERY DATA, US Patent Application serial no. 10/143,508 filed 10 May 2003 titled A SYSTEM AND METHOD FOR ANALYZING AND EVALUATING OF ELECTRIC SIGNAL RECORD, US Patent Application serial no. 10/145,574 filed 13 May 2002 titled SYSTEM AND METHOD FOR ANALYZING AND EVALUATION OF AUDIO SIGNALS, and US Patent Application serial no. 10/145,575 filed 13 May 2003 titled A SYSTEM AND METHOD FOR ANALYZING AND EVALUATING HUMAN BEHAVIOUR STIGMATA, the contents of all of which are incorporated herein and the priority from which is claimed.
A novel method and system for the meaningful summarizing of information units is disclosed. The system and method are operative in introducing an aggregate of information units into a computing environment in order to derive the most interesting and significant aspects of the information. The data aggregate consists of a plurality of logically inter-related information units having diverse formats, such as images, video, audio, graphics, text, database records and the
like. In the preferred embodiments of the present invention, the information units represent medical diagnostic images. The information is typically obtained via a medical diagnostic imaging tool, such as a CT machine observing a human subject. The information units, such as images and video data records, are processed by a set of specifically developed computer programs, which effect the division of the data records into fragments or blocks having substantially identical dimensionality. The division of the data records by the programs is performed in accordance with predetermined parameters associated with the format and the content of the data record collection. Each of the dimensionally substantially identical record fragments is assigned an arbitrarily pre-determined complexity value by a set of specifically developed computer programs that calculate the complexity value of the fragments in association with predetermined processing parameters. The division of the related data records into multiple fragments having identical dimensionality, the assignment of the complexity value to the fragments, and the subsequent organization of the data fragments provide the option of creating a specific summary view and a specific perspective of the original information. The complexity summary is transformed into an indicative parameter, which is a calculated value with a practical meaning. The transformation is achieved by using predefined data, including baseline imagery data, medical data and other relevant data. The indicative parameter is then transformed into a human-readable final result that could be an indication for medical action, a diagnosis, a prognostic element, a response to therapy and the like.
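The fragment-division and complexity-assignment stages described above can be sketched as follows. The 8x8 block size, the gray-level-entropy complexity measure and the "keep the most complex blocks" summary rule are illustrative assumptions standing in for the predetermined processing parameters; the actual measure is left open by the description.

```python
# Illustrative sketch of the fragment/complexity stage: a 2-D image (a list of
# pixel rows) is divided into blocks of substantially identical dimensionality,
# and each block is assigned a complexity value (here: histogram entropy).
from collections import Counter
from math import log2

def split_into_blocks(image, size=8):
    """Divide an image (rows of pixel values) into size x size fragments."""
    h, w = len(image), len(image[0])
    return [
        ((y, x), [row[x:x + size] for row in image[y:y + size]])
        for y in range(0, h - size + 1, size)
        for x in range(0, w - size + 1, size)
    ]

def block_complexity(block):
    """Assign a complexity value: entropy of the block's pixel histogram."""
    pixels = [p for row in block for p in row]
    counts = Counter(pixels)
    n = len(pixels)
    return -sum(c / n * log2(c / n) for c in counts.values())

def complexity_summary(image, size=8, top=4):
    """Organize fragments by complexity; the most complex ones form one
    possible summary view of the original image."""
    scored = [(block_complexity(b), pos) for pos, b in split_into_blocks(image, size)]
    scored.sort(reverse=True)
    return scored[:top]
```

Given a pixel grid, `complexity_summary(image)` returns the positions of the highest-complexity fragments, which is one simple way to derive a summary view from the organized fragments.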
The complexity value calculation requires no a-priori knowledge of the diverse media and the associated application. The extrapolation of the indicative parameter, done with specially tailored functions, and the final result are dependent on many aspects of the field and the questions at hand; thus, they are dependent on a-priori knowledge. For example, when examining the response to radiotherapy in a patient with brain cancer, a-priori knowledge of prior brain
images of the patient is required. The prediction accuracy of the system increases as more information is fed to the system beforehand. For example, if the pattern of response to radiotherapy of brain cancer from many patients is calculated with the indicative parameter formula, a greater accuracy as to the "normal" or "acceptable" response can be easily calculated when compared to essentially the same data involving only a few patients.
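The point that accuracy improves with more reference patients can be illustrated by comparing one patient's indicative parameter against a cohort of previously computed values: the larger the cohort, the more stable the estimated "normal" range. The z-score criterion and the cutoff below are illustrative assumptions standing in for the specially tailored functions mentioned above.

```python
# Illustrative sketch: classifying a patient's indicative parameter against a
# reference cohort of values computed from previously examined patients. The
# z-score test and its cutoff are assumptions for the purpose of the sketch.
from math import sqrt

def response_assessment(patient_value, reference_values, z_cutoff=2.0):
    """Classify a patient's indicative parameter against a reference cohort."""
    n = len(reference_values)
    mean = sum(reference_values) / n
    var = sum((v - mean) ** 2 for v in reference_values) / n
    std = sqrt(var) or 1e-9  # guard against a degenerate (constant) cohort
    z = (patient_value - mean) / std
    return "acceptable response" if abs(z) <= z_cutoff else "atypical response"
```

With more reference values, `mean` and `std` converge toward the population's true statistics, which is the sense in which prediction accuracy grows with the amount of data fed to the system beforehand.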
The preferred embodiments of the present invention relate to a method for facilitating the selection of the most interesting or significant aspects from a large group, set or collection of information, more specifically medical imagery data, video data files and other related data, and the extraction of meaningful information. The following description of the method and system proposed by the present invention includes several preferred embodiments of the present invention. Throughout the description of the embodiments, specific useful applications are disclosed wherein the elements constituting the method suitably process the collection of data records associated with the application. It can be easily appreciated by the person skilled in the art that the preferred embodiments described hereunder in the following description and the associated drawings are exemplary only.
Fig. 1 illustrates an exemplary system 100 within which the method suggested by the present invention could be implemented. The system 100 is operated by a user 114 and includes a user interface device 112, a computing device 106, a storage device 108, an input device 104, and a communications network 110. The various elements of the system 100 constitute a suitable infrastructure operative in the application of the suggested method. Input device 104 is an image-acquiring device capable of acquiring and handling a set of static images and a sequence of dynamic images from a specified target. Device 104 could be a CT, a US, a 3D US, an MRI, an endoscope, an X-ray machine, a Mammography device, a PET scanner device, and the like. Exemplary input devices could include the Philips Elite-1200 CT picker, the Hewlett Packard Sonos 2500 US-
16
SUBSTITUTE SHEET (RULE 26)
CA 02486309 2004-11-09
WO 03/096262 PCT/IL03/00386
ECHO, the GE Medical Signa Horizon LX 1.ST, the Pentax Sci-Lab endoscope,
the GE Medical Revolution XQ/I X-ray machine, the Philips Mammography
device, the Siemens PET scan device and the like. Input device 104 acquires
image information from subject 102 in a fashion related to input device 104
mode
s of operation. For example, an X-ray input device and a CT input device
obtain
electromagnetic wave information in the X spectrum and translate such
information so as to produce a digital imagery representative of data
obtained.
The images obtained could include one or more separate static images, such as in the case of X-ray photography; a set of multiple inter-related, possibly overlapping static images, such as in the case of a CT device; and dynamic images, such as in the case of a US device. Static and dynamic images can be obtained by any one input device, such as dynamic video images obtained by an endoscope device as well as a static image obtained by freezing and capturing one frame of the dynamic video images. After acquiring the relevant information, input device 104 transmits the captured and transformed information as a data record to computing device 106. Transmission of the data information record from input device 104 to computing device 106 can be either in a wireless or in a wired fashion in the standard manner known in the art. For purposes of clarity only one input device 104 is shown in Fig. 1. It will be evident to the person skilled in the art that a plurality of input devices 104 and different types of input devices 104 can be connected to the computing device 106. User 114 can manipulate input device 104 such that the input device 104 views specific regions of interest. Certain operational modes are executed during the operation of input device 104. Input device 104 can also be manipulated in a pre-programmed manner by computing device 106 as to all aspects of operation, such as subject 102 scanning locations, operational mode and the like. Subject 102 is typically, but not necessarily, a human subject undergoing a medical imaging examination. However, subject 102 can be any organism or a part of an organism under investigation, such as a pathological specimen under a video microscope and the like. Computing device
106 is operative in the reception and manipulation of the image data information records. Device 106 could be a computing platform, such as a dedicated microprocessor, a multi-functional Personal Computer (PC), or a central mainframe computer. Device 106 incorporates suitable program routines, including a set of specifically designed and developed program instructions responsible for the manipulation of the image information data record. Computing device 106 can obtain the image information data record from the local input device 104, the local storage device 108 and from diverse remote data sources implemented across the communications network 110. Device 106 is capable of converting analog imagery data obtained from input device 104 to a digital format. Within device 106 the converted data then undergoes complexity calculation, indicative parameter calculation and final result calculation. Computing device 106 is also operative in transferring information to storage device 108, to user interface device 112, and to remote locations via network 110. Computing device 106 can also be functional in manipulating input device 104 as previously described above. Network 110 is a computer network such as the Internet, an Intranet, a local area network (LAN), a wireless local area network (WLAN) and the like. Network 110 is operational in receiving and delivering input and output information to and from computing device 106. The received and delivered information could be image data information records, complexity calculations, indicative parameters, final results, subject-related information, and other relevant material. Storage device 108 can be a hard disk, a magnetic disk, an optical disk, a mass storage device, or an SDRAM device operative in storing data such as image data information records, complexity calculation results, indicative parameters, final results, subject-related information, disease-related information, operation-related information and other relevant information needed for the operation of system 100. User interface device 112 is a computer monitor, a television screen, a personal digital assistant (PDA), and the like. User interface device 112 is operative in providing suitable interaction between user
114 and the other components of the system 100. User interface device 112 can be functional in displaying input device 104 image data in real time, displaying complexity summaries, indicative parameters, final results and other relevant information such as identification data, personal data, health data, medical history data and the like. User interface device 112 is also functional in providing user 114 with the option of manipulating computing device 106 parameters, allowing user 114 to interact with storage device 108 and input device 104, and providing the reception and transmission of information through network 110. User 114 is typically a physician desiring to obtain medical information from subject 102 via the examination performed by input device 104. User 114 could also be a medical technical expert operating system 100 concerned with evaluating certain aspects of information obtained by input device 104.
Turning now to Fig. 2, where a simplified flow chart illustrating the operation of the system and method of the present invention is demonstrated and where the system is referenced 200. Data stream 202, acquired by input device 104 of Fig. 1 by appropriately sensing an examined target, is transmitted to computing device 106 of Fig. 1. Data stream 202 can be a static image data record, a static three-dimensional image data record, a dynamic image data record, a dynamic three-dimensional image record, or ancillary data records typically accompanying imagery data records, such as an audio data record, a text data record and the like. Data stream 202 is transmitted to computing device 106 in wired or wireless fashion in a manner known in the art. Data stream 202 received by computing device 106 is optionally manipulated in step 203. The manipulation may include analog-to-digital (ATD) conversion, color correction, brightness correction, contrast correction and any type of processing typically performed on imagery data records in order to enhance input purity. The data stream then undergoes a complexity calculation 204 in order to create a complexity file (not shown) containing complexity parameters of at least a substantially small part of the data stream defined by user 114 of Fig. 1 or predefined within computing
device 106. The complexity file created through the complexity calculation 204 can also include a cumulated complexity parameter of substantially all parts of the data stream that underwent complexity calculation. The complexity file can incorporate therewithin complexity calculations of a substantially small part of the image data record, substantially all of the image data record, ancillary data records, as well as any other data record capable of undergoing complexity calculation. Different image types and auxiliary data records can be processed by complexity calculation 204. Complexity calculation 204 is a calculation internal to computing device 106, and is independent of any external information. Several internal manipulations of the parameters used for complexity calculation 204 can be executed in order to optimize computing device 106 operation, but these are independent of the data stream 202 as well as of the examined object (not shown). The parameters are optionally stored and fed to the computing device 106 from database 210 stored at the storage device 108 of Fig. 1. Next, an indicative parameter calculation 206 is performed. Indicative parameter calculation 206 is an evaluation of the complexity calculations stored within the complexity files. For example, complexity calculations within a complexity file, created by complexity calculation 204, can be compared to previous or predefined complexities fed to computing device 106 from database 210 or from other sources such as network 110 and the like. Indicative parameter calculation 206 is then performed by feeding complexity calculations into predefined mathematical formulas having predefined parameters as stated above. Next, final result 208 calculation is performed using the indicative parameter calculation results. The final result is a user-friendly representation of the indicative parameter result, such as a probable diagnosis, a suggested course of action, a suggested therapeutic indication, an assessment of the response to a prior therapy or any other format pertinent to the specific examination. Such format may include an image or a stream of images. Such image or stream of images may provide guided assistance, including through the projection of graphic means to assist in the treatment or course of action
suggested. The final result is then transmitted to database 210 implemented in storage device 108 of Fig. 1. Complexity calculation results and indicative parameter calculation results obtained at steps 204 and 206 respectively are also stored in database 210. Database 210 also comprises calculating parameters for the above-mentioned calculations, and previously obtained medical, personal and other data relating to the operation of system 200. Database 210 could also store all data streams obtained by input devices and from other data sources such as network 110 and the like. Information stored on database 210 can be retransmitted to computing device 106 and to the user interface device 112 of Fig. 1 by demands introduced by user 114 and by predefined parameters within computing device 106. In the typical operation of system 200, user 114 interacts with the system via user interface device 112 (UID). User 114 can perform the interaction by utilizing suitable devices, such as a pointing device (mouse), a touch screen device (stylus), an audio device (microphone) or any other method for user-system interaction. UID 112 is operative in allowing user 114 to select practically any region of interest 212 of data stream 202 fed to computing device 106. The selection could optionally be performed at any step of the operation, such as after manipulation step 203, complexity calculation 204 and the like. UID 112 can display essentially any part of data stream 214, and can display indicative parameter results 216 as a numeric value, a graphical illustration, a pictorial representation and the like. UID 112 can also display final result 218 in diverse user-friendly formats. Using UID 112, user 114 can also manipulate database 220, such as for the manipulation of parameter definitions and manipulation parameters, and for saving, displaying and transmission of information and the like.
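The chain of steps just described (manipulation 203, complexity calculation 204, indicative parameter calculation 206, final result 208) can be sketched in code. The following is a minimal illustrative sketch only: the function bodies, the distinct-value complexity measure, the mean-based comparison against stored reference complexities, and all names are assumptions introduced here, not part of the disclosure.

```python
# Illustrative sketch of the Fig. 2 pipeline. Every concrete choice below
# (clamping as "manipulation", distinct-value fraction as "complexity",
# mean-reference comparison as "indicative parameter") is hypothetical.

def manipulate(stream):
    """Step 203 stand-in: clamp raw sensor values into the 0-255 range."""
    return [max(0, min(255, v)) for v in stream]

def complexity(part):
    """Step 204 stand-in: fraction of distinct values in the examined part."""
    return len(set(part)) / len(part)

def indicative_parameter(c, reference_complexities):
    """Step 206 stand-in: compare against reference complexities (database 210)."""
    mean_ref = sum(reference_complexities) / len(reference_complexities)
    return c - mean_ref  # positive means more complex than the reference set

def final_result(ip):
    """Step 208 stand-in: user-friendly rendering of the indicative parameter."""
    return "above reference complexity" if ip > 0 else "within reference complexity"

stream = [12, 200, 45, 45, 180, 12, 90, 300]
c = complexity(manipulate(stream))
print(final_result(indicative_parameter(c, [0.5, 0.6, 0.4])))
```

The point of the sketch is only the staging: each step consumes the previous step's output, and only step 206 consults stored reference data, matching the description of calculation 204 as internal to the data stream.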
Turning now to Fig. 3, where an overview of the complexity calculation of a three-dimensional image is discussed. Currently, images obtained via input devices such as a CT, an MRI, a US device and the like are reconstructed into a three-dimensional image. In addition, some newer devices are capable of producing three-dimensional images directly. The interpretation of three-dimensional (3D) data is a complex process due to the vast amount of data contained therewithin. Using complexity calculation of three-dimensional images can facilitate the analysis and summary of 3D images. In a three-dimensional (3D) image the image data stream comprises three axes. Complexity calculation of such data involves calculation along the three axes x, y and z. The x and y axes comprise the longitudinal and horizontal location within the image, while the z axis contains the depth location. A three-dimensional image (3D-image) is divided into multiple blocks having multiple reading frame possibilities. The reading frames are typically along the x, y and z axes and each preferably contains more than one pixel value, but typically fewer pixel values than comprise the whole block. Complexity calculation is performed for each block as described herein below. The resulting complexity parameter is stored in a file containing a complexity block matrix.
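The division of a 3D-image into blocks, each read along its x, y and z reading frames, could be sketched as follows. This is an assumed concrete realization: the cubic block size, the z-y-x nesting order and the flattening order of each reading frame are choices made here for illustration and are not dictated by the disclosure.

```python
# Hypothetical sketch: divide a 3D image (nested lists, indexed volume[z][y][x])
# into cubic blocks and extract the three axis-aligned reading frames per block.

def blocks(volume, size):
    """Yield (z0, y0, x0, block) for every size*size*size block of the volume."""
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    for z0 in range(0, nz - size + 1, size):
        for y0 in range(0, ny - size + 1, size):
            for x0 in range(0, nx - size + 1, size):
                block = [[[volume[z0 + z][y0 + y][x0 + x]
                           for x in range(size)]
                          for y in range(size)]
                         for z in range(size)]
                yield z0, y0, x0, block

def reading_frames(block):
    """Return the block's pixel values read along the x, y and z axes (RFx, RFy, RFz)."""
    size = len(block)
    rfx = [block[z][y][x] for z in range(size) for y in range(size) for x in range(size)]
    rfy = [block[z][y][x] for z in range(size) for x in range(size) for y in range(size)]
    rfz = [block[z][y][x] for y in range(size) for x in range(size) for z in range(size)]
    return rfx, rfy, rfz

# A 4x4x4 volume split into 2x2x2 blocks yields 8 blocks.
volume = [[[x + 4 * y + 16 * z for x in range(4)] for y in range(4)] for z in range(4)]
all_blocks = list(blocks(volume, 2))
print(len(all_blocks))
```

Each block's three reading frames would then be handed to the per-block complexity calculation, and the per-block results collected into the complexity block matrix.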
Still referring to Fig. 3, where an exemplary 3D-image is referenced 302, block 1 and block 2 are referenced 304 and 306 respectively, where reading frames x, y and z of block 1 are numbered 301, 303 and 305 respectively, and where the complexity calculation is referenced 308 and the complexity file is referenced 310. The exemplary block diagram of Fig. 3 illustrates an exemplary 3D-image 302, such as a 3D US video image or a 3D CT image reconstruction of multiple images obtained via a CT scan. Image 302 is acquired by an input device 104 of Fig. 1 and then transferred to computing device 106 of Fig. 1, wherein image 302 is divided into multiple blocks such as block 1 (304) and block 2 (306). Although only two blocks are illustrated in Fig. 3 for simplicity, it should be clear to one with ordinary skill in the art that any 3D-image can be divided into any number of blocks. Block 1 (304) comprises reading frames RFx 301, RFy 303 and RFz 305. A multitude of different reading frames 301, 303 and 305 is possible for block 1 (304). Block 1 (304) and block 2 (306) undergo complexity calculation 308. A complexity calculation result for each
block is then transferred to a complexity file 310 such that complexity file
310
contains a complexity metric of all reading frames of all blocks obtained from
3D-image 302.
Turning now to Fig. 4, where a simplified block diagram of the complexity calculation of each block is described and where each reading frame comprises a multitude of possible sequences referred to herein as Wordsizes. The use of the terms "Word" and "Letter" and similar terms in the context of the present invention is associated with the use of pixels, or of a sequence of pixel values, within an image or a series of images. It will be appreciated that such use is made for the sole purpose of better describing the workings of the present invention, and that such use is in no way limiting to the term Word. Each Wordsize contains a sequence of pixel values referred to herein as letters. Any number of Wordsize combinations is possible, and each Wordsize combination is referred to herein as a Word. In order to calculate complexity, the maximum number of Words and the number of different Words must be calculated. The calculation of the maximum number of different Words or Wordsize combinations is derived either by calculating the number of different Wordsizes for each letter or from the maximal number of different Words in a given block. Calculating the number of different Words is performed by counting the different Words using the Words and a shift parameter. A ratio of the maximum number of different Words to the counted number of different Words is then used for the final calculation of the complexity of that block. Fig. 4 illustrates block 1, referenced 302 of Fig. 3, and the three reading frames RFx, RFy and RFz, referred to as 301, 303 and 305 respectively of Fig. 3. Also illustrated are Wordsize lists 312, 314 and 316 respectively, Wordsizes 318 through 334, and Words 1, 2 and N, referenced 336, 338 and 340 respectively. In addition, the Shift parameter is referenced 342 and a letter is referenced 344; WordNumber and MaxWord are referenced 346 and 348 respectively. Also depicted are the U ratio, referenced 350, and complexity calculation 352. Complexity calculation of block 1 (302) is performed such that for each
reading frame RFx 301, RFy 303 and RFz 305 a Wordsize list is created, such that WSx list 312 is the Wordsize list of RFx 301, WSy list 314 is the Wordsize list of RFy 303, and WSz list 316 is the Wordsize list of RFz 305. For each reading frame 301, 303 and 305, Wordsizes are created such that for RFx 301, Wordsize (WS)(x)1 318, WS(x)2 320 and WS(x)n 322 are created. For RFy 303, WS(y)1 324, WS(y)2 326 and WS(y)n 328 are created. For RFz 305, WS(z)1 330, WS(z)2 332 and WS(z)n 334 are created. Although only three Wordsizes (WS) are depicted for each reading frame, it would be clear to one with ordinary skill in the art that an N number of Wordsizes can be created for each reading frame. Words 336, 338 and 340 are calculated by multiplying Wordsizes, such that Word 1 (336) is calculated by multiplying Wordsize (WS)(x)1 by WS(y)1 by WS(z)1. Calculation of Word 2 (338) is executed by multiplying WS(x)2 by WS(y)2 by WS(z)2. The calculation of Word N (340) is accomplished by multiplying WS(x)n by WS(y)n by WS(z)n. Although only three Words are depicted in Fig. 4, it would be clear that any number of Words, depending on the number of Wordsizes, is possible. Using Wordsize lists 312, 314 and 316 together with the appropriate letter 344 values, computing device 106 of Fig. 1 calculates the maximum number of different Words possible, also referred to herein as MaxWord 348. MaxWord 348 can also be calculated from the maximum possible Words in a given block. The calculation of WordNumber 346 is accomplished by counting the different Words, in the illustration brought forward by addition of Word 1 (336), Word 2 (338) and Word N (340). Shift parameters 342 are predefined constants allowing a simplified WordNumber 346 calculation by using a predefined shift calculation of all Words, such that Word 1 (336) is accomplished by multiplying WS(x)1 318, WS(y)1 324 and WS(z)1 330. Then, using a shift parameter, the computing device can easily calculate all other Words, thus enabling the calculation of WordNumber 346 by adding all Words. The U ratio calculation 350 is then accomplished using WordNumber 346 and MaxWord 348. The resulting U ratio is then used for complexity calculation 352.
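The Word-counting scheme of Fig. 4 can be sketched along the following lines. Since the passage does not give the exact MaxWord formula or the final complexity function, this sketch substitutes simple assumed choices: sliding windows of fixed Wordsize taken at steps of the shift parameter, WordNumber as the count of distinct Words observed, MaxWord as the smaller of the alphabet-combination bound and the number of Words the frame can hold, and the U ratio itself used as the complexity value.

```python
# Hedged sketch of Word-based complexity for one reading frame.
# A "Word" is a fixed-length sequence of pixel values ("letters").
# WordNumber / MaxWord gives the U ratio, used here as the complexity.
# The MaxWord bound and the choice of U as the final value are assumptions.

def words(frame, wordsize, shift):
    """All Words of length `wordsize` read from `frame` at steps of `shift`."""
    return [tuple(frame[i:i + wordsize])
            for i in range(0, len(frame) - wordsize + 1, shift)]

def complexity(frame, wordsize, shift, alphabet):
    observed = words(frame, wordsize, shift)
    word_number = len(set(observed))  # count of different Words (WordNumber)
    # MaxWord: smaller of (a) all letter combinations, (b) Words in the frame.
    max_word = min(len(alphabet) ** wordsize, len(observed))
    return word_number / max_word     # U ratio

# A repetitive frame scores low; a varied frame scores high.
flat = [0, 1, 0, 1, 0, 1, 0, 1]
busy = [0, 3, 1, 2, 3, 0, 2, 1]
print(complexity(flat, 2, 1, alphabet=[0, 1]))
print(complexity(busy, 2, 1, alphabet=[0, 1, 2, 3]))
```

Run per block and per reading frame, such a ratio yields the per-block complexity parameters that Fig. 3's complexity file 310 collects.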
Turning now to Fig. 5, where a flow chart illustration of a typical operation of the system in accordance with the first embodiment of the present invention is illustrated, and where the typical operation involves an imagery early prediction of response to therapy. In such an operation, a response to therapy using imagery data, such as a two-dimensional image as well as a three-dimensional image, and complexity calculation is illustrated, where system 400 is illustrated at two subsequent time frames T1 and T2 and where the time interval T2 - T1 (ΔT) is sufficient for therapy evaluation using system 400. System 400 comprises subject 402, input device 404, image data record 406, computing device 408, text final result 410, image final result 412, database 414, user 416 and therapy regimen 418. In the typical operation of system 400, subject 402 (for example a patient with a brain cancer) is submitted to an imagery evaluation examination, such as a dwMRI, dwMRT, MRI, CT, 3D CT image reconstruction and the like, at time T1. Input device 404, such as an MRI machine, obtains imagery information of a selected region of interest 420 of subject 402, such as the brain. Image data record 406, preferably in digital format, for example a digital MRI image of a brain section with a tumor, is sent to computing device 408, where complexity calculation, indicative parameter calculation and final result calculation (all three not shown) are performed on the imagery data. Complexity calculation can be performed on a small part of the image as well as on the entire image, as previously described. The parameters for this calculation can be defined or predefined within computing device 408. Output of computing device 408 is typically, but not exclusively, in the form of a final result (not shown) and is displayed typically but not exclusively as a Text Final Result (TFR) 410, such as: "Tumor of type A", "Tumor of complexity A", "Tumor of consistency A" etc. Output of computing device 408 can also be displayed as an Image Final Result (IFR) 412, such as an MRI image of the brain with the tumor delineated, both in a user-friendly format. IFR 412 and TFR 410 are sent for storage in database 414 and are displayed to user 416, preferably via user interface device 112 of Fig. 2. During time interval ΔT
(T2 - T1), subject 402 receives therapy such as radiotherapy and/or chemotherapy for the purpose of reducing or eliminating the brain tumor. At time T2, after the time interval, an image data record 406 of selected region of interest 420 of subject 402 is obtained by input device 404. For example, a patient whose brain tumor was imaged at time T1 has an MRI image of the same region taken at time T2. Image data record 406 of the same region of interest 420 at time T2 is now transferred to computing device 408, where complexity calculation on the obtained data is then performed. Computing device 408 obtains the images and the respective calculated data processed at time T1 and stored in database 414. Computing device 408 then uses this data and other parameters and calculates a new indicative parameter and a new final result (not shown). Output of computing device 408 is typically but not exclusively in the form of a final result (not shown) and is displayed typically but not exclusively as a Text Final Result (TFR) 410, such as: "Tumor of type B", "Tumor of complexity B", "Tumor of consistency B" etc. Output of computing device 408 can also be displayed as an Image Final Result (IFR) 412, such as an MRI image of the brain with the tumor delineated; both are easily understandable to user 416. In the example at hand, the tumor of complexity A at time T1 is of complexity B at time T2; thus the tumor complexity has changed during time interval ΔT. The tissue examined is therefore characterized by the complexity value it is assigned. Complexity values are assigned prior to and after treatment, or at predefined intervals. Treatment is provided based on the complexity values of the tumor type at any specific time. The various complexity results enable a non-invasive tissue characteristic analysis. The same can be easily perceived by observing the complexity result of circle 412 at times T1 and T2 respectively. Output of system 400 can be in any other format known in the art. IFR 412 and TFR 410 are then sent for storage in database 414 for later evaluation, transfer etc., and displayed to user 416, preferably via user interface device 112 of Fig. 2. During the operation of system 400, at both times T1 and T2, the user 416 can manipulate all the elements of the system, such as selecting a new selected region of interest 420, observing any
imagery data obtained by input device 404 or stored on database 414, printing a hard copy of image data record 406, IFR 412, TFR 410 and the like. System 400 can also be used for other purposes, such as reaching a diagnosis, indicating the next step of investigation, therapeutic indications, prognosis evaluation and other purposes suited to medical and other fields. The system 400 configuration can be used with other static image input devices, such as an endoscope device such as an Endoscope from Olympus, and for evaluating skin lesions obtained by an image input device, such as a camera from Canon. The present invention can also be operational for evaluating images obtained from a pathological specimen, such as by a camera, a microscope camera, a miniature camera, an electron microscope camera and the like.
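The T1-versus-T2 comparison performed by system 400 can be sketched as follows. The stored-record layout, the 10% change threshold, and the wording of the text final result are all illustrative assumptions made here; the disclosure itself does not fix them.

```python
# Hypothetical sketch of the Fig. 5 comparison: complexity of the same region
# of interest at T1 (retrieved from database 414) versus T2, rendered as a
# text final result (TFR 410). The 10% threshold is an invented example value.

def text_final_result(complexity_t1, complexity_t2, threshold=0.10):
    """Classify therapy response from the relative change in tumor complexity."""
    change = (complexity_t2 - complexity_t1) / complexity_t1
    if change <= -threshold:
        return "complexity decreased: response to therapy"
    if change >= threshold:
        return "complexity increased: progression suspected"
    return "complexity stable: no significant change"

database = {"patient-1": {"T1": 0.82}}  # complexity stored at time T1
t2_value = 0.61                         # complexity computed at time T2
print(text_final_result(database["patient-1"]["T1"], t2_value))
```

The same comparison generalizes to complexity values assigned at any predefined intervals, as the passage describes.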
It should be evident to one with ordinary skill in the art that the illustration brought here and above is but an example, and many other uses and configurations of the system are possible within the scope of the present invention.
Turning now to Fig. 6, which is a flow chart of the second preferred embodiment of the present invention. System 500 is functional in the evaluation of medical video data such as data obtained by a US machine, a US/Doppler machine, an ECHO machine, an endoscope device, a three-dimensional ECHO and comparable instruments known in the art. System 500 is illustrated here and below for the purpose of identifying heart defects via the utilization of an ECHO machine. It should be clear to one skilled in the art that this representation is an example only and that other uses and other working configurations are possible. System 500 is a preferably, but not exclusively, medical system comprising subject 502, selected region of interest (SRI) 504, input device 506, video data stream 508, computing device 510, user interface device (UID) 512, database 514 and user 516. Subject 502 is an image source, such as an infant with a clinically evident heart murmur. Selected region of interest (SRI) 504 is a part of subject 502, such as the heart, from which video data stream 508 is obtained by input
device 506, and is substantially large enough to be visualized by the input device. SRI 504 typically is a body organ or cavity, such as the brain, heart, kidney, a blood vessel, the abdomen, the pelvis etc. Input device 506 is operative in acquiring physical data from selected region of interest 504 of subject 502 by physical means such as light capturing techniques, high frequency ultrasound wave resonance interpretation, Doppler shift interpretation, radiation particle signature interpretation and so forth. Input device 506 is internally capable of converting this physical information into image data stream 508 records, such as a two-dimensional image, a three-dimensional image, streaming two- and three-dimensional video and the like. Input device 506 can be an ECHO machine, such as a Sonos 2500 from HP, or a System V from GE Medical and the like. Input device 506 in the example brought forth is an ECHO machine functional in extracting ultrasonic wave information regarding the heart of subject 502. In Fig. 6, video data stream 508 is illustrated as an output of input device 506, such as video images of the heart of subject 502. It should be evident to the person skilled in the art that input device 506 could have different output formats, such as mathematical, textual and the like, used to represent measurements performed by physical measuring devices such as input device 506. Video data stream 508 is a serial image data record that, when displayed to user 516 at a rate of at least 24 images per second, is perceived by a human user 516 as a real-time motion event of the image sequence. In the current example the data stream comprises a time-related sequential collection of images of SRI 504 (the heart) of subject 502. The data stream 508 is transmitted to computing device 510 by means known in the art such as a coaxial cable, infrared, wireless RF transmission and the like. Computing device 510 is operative in performing a complexity calculation (not shown) on at least a substantially small part of video data stream 508. The extent of this calculation could be predefined within device 510, could be dependent on the type of information received by device 510, or could cover user-selected parts of video streams 508 displayed to user 516 in real time. Next, computing device 510
calculates the indicative parameter (not shown) and final result (not shown) as stated here and above. In the present example, the indicative parameter is calculated based on complexity calculations of video data record 508 (streaming images of the heart), a predefined database concerning selected region of interest 504 and subject 502, as well as input device 506 and so forth. At least a substantially small part of the predefined data is used for the calculation of an indicative parameter. The indicative parameter results, as well as additional predefined data, are used for the conversion of the indicative parameter result into the final result, having a user-friendly format relating to SRI 504, to be displayed to user 516. Calculated results are then transferred to database 514, user interface device (UID) 512, or other computing devices (not shown) through a computer network and the like, printed as hard copy (not shown) and the like. User 516 interacts with system 500 typically through UID 512 but also through input device 506. In the current example user 516 is a medical professional, such as a radiologist, handling input device 506, such as an ECHO machine, in a manner resulting in the acquisition of correct images from selected region of interest 504. User 516 can view the obtained data stream on the input device screen (not shown) as well as on UID 512. User 516 can also observe in real time the final results of obtained data on UID 512 and reselect a new region of interest, correct the attitude or location of examining input device 506 or subject 502 and the like. User 516 can also communicate with computing device 510 as well as database 514 through UID 512 for the purpose of changing predefined parameters, uploading saved data, comparing saved data to newly acquired data, etc. UID 512 can be software, such as a graphical user interface working on input device 506 or on computing device 510, as well as an independent computing and displaying device such as a PC computer, a handheld device and the like. User 516 interacts with input device 506 and UID 512 by means known in the art such as a pointing device (mouse), a touch screen device (stylus), an audio device (microphone) and the like. In the illustrative example presented herein, user 516 is a radiologist handling input device 506,
such as an ECHO device, over the surface of subject 502, a child with a heart murmur, in a specific location such that a selected region of interest 504, such as the heart, is clearly viewed by input device 506. The user observes the input device screen (not shown) while collecting streaming image data records. The data is transferred to computing device 510, such as a computer of the input device, where a series of calculations and database analyses is carried out and where the final result is the final outcome. The final result (not shown) is then displayed to the user substantially immediately, and therefore the user can decide whether the collected information is adequate. The final result can be a diagnosis such as a Ventricular Septal Defect (VSD) displayed as image and text, an indication for therapy such as textual content describing the size of the VSD and the correct therapy indication, an evaluation of the VSD size over time such as the degree of enlargement since the last examination, etc.
The present invention also provides for a system and method for analysis
Is and evaluation of Electric Data Records (EDR) obtained from various
instruments
measuring electrical signals, more specifically electric potential. The system
and
method can be used for non-invasive diagnosis. The invention discloses a
system
and method according to which electrograph complexity calculation can be
implemented on electric data recording of medical examination tools as well as
20 other tools measuring and displaying electric data records. The input data
is
recorded in real-time via an electric potential sensitive machine previously
described. A streaming data of different electric potentials referred to as
Electrical
Data Recording (EDR) is recorded digitally. The digital recording is received
by
the application. A complexity calculation of the at least a part of the data
is
2s performed. An indicative parameter is calculated using the complexity
calculation
according to predefined information obtained beforehand. The indicative
parameter is used for calculation and transformation such that a final result
can be
displayed to the user. The final result can point to areas of interest in the
EDR,
facilitate diagnosis, and suggest treatment, used as a prognostic marker as
well as
other forms of medically relevant data. Thus, the output of the system is
useful in
the evaluation and quantification of Electric Data Records (EDR).
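The complexity calculation itself is defined in the related applications incorporated hereinabove, not in this passage. For orientation only, the following Python sketch assumes Lempel-Ziv (LZ76) phrase counting over a median-binarized signal as the complexity measure, and a simple ratio against a stored baseline as the indicative parameter; the function names and the choice of measure are illustrative assumptions, not claims of the present text.

```python
# Illustrative only: LZ76 phrase counting and median binarization are
# assumed stand-ins for the complexity calculation of the related
# applications; they are not taken from the present text.

def binarize(samples):
    """Threshold a numeric EDR-like stream at its median value."""
    med = sorted(samples)[len(samples) // 2]
    return ''.join('1' if x > med else '0' for x in samples)

def lz76_complexity(bits):
    """Number of phrases in a left-to-right LZ76 parse of a bit string."""
    i, count, n = 0, 0, len(bits)
    while i < n:
        l = 1
        # grow the phrase while it already occurred earlier in the string
        while i + l <= n and bits[i:i + l] in bits[:i + l - 1]:
            l += 1
        count += 1
        i += l
    return count

def indicative_parameter(samples, baseline):
    """Ratio of measured complexity to a predefined baseline value,
    e.g. one stored for a comparable normal recording."""
    return lz76_complexity(binarize(samples)) / baseline

print(lz76_complexity('01' * 8))   # a periodic stream parses into few phrases
```

A strongly periodic recording parses into very few phrases while an irregular one yields a higher count, which is what makes a measure of this family usable as a screening parameter.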
Turning now to Fig. 7, wherein parts of the system of the present invention are disclosed and referenced 100. Input devices 101, 102, whereby data is obtained, can be an electrocardiogram machine such as an ECG machine by Nihon-Kodan, manufactured in Japan, an electroencephalogram machine such as a Nicolet Viking 4 EEG, an oscilloscope, as well as any other instrument using electric signals, more specifically potential difference, to produce data or an EDR, such as an ECG, EEG, Electric Brain Stem Evoked Potential (EBSEP) and the like. In Fig. 7 only two input devices are depicted for the sake of clarity. It will be evident to the person skilled in the art that any number of input devices, as well as different types of input devices, can be connected to the computing device 103. Data obtained by input devices 101 and 102 is transferred via cable, modem, Infra Red (IR) or any other form known in the art to a computing device 103. Computing device 103 is a software program or a hardware device such as a PC computer, a hand held computer such as a Pocket PC, and the like. Within the computing device 103 the input received from input devices 101 and 102 is processed and output data is transferred to the interface devices 104 and 105. Interface devices may be a computer screen such as an LG Studioworks 57I, a hand held device such as the Palm Pilot manufactured by the Palm Corporation, a monitor screen, a television screen, an interactive LCD screen, a paper record, as well as other interface devices. The output data can be stored on a storage device 107 such as a computer hard disk, as well as on any other storage device. The output data can also be sent for storage, viewing and manipulation to other parties by hard wire (not shown), IR device (not shown) or any other transfer modality, including via a data network. Interface devices 104 and 105 may be used to alter the operation of input devices 101 and 102, of computing device 103, or of both. Such activity can be done by the user 106 via direct human interaction with an interface device, such as by touch, speech, manipulation of an attached mouse device and the like. Output
information can be viewed as graphs, pictures, summary analyses and the like, as well as manipulated by the user 106 for other purposes such as transferring the data, saving the output and the like.
Turning now to Fig. 8, where the operation of the system 200 of the present invention is disclosed. A streaming digital electric data recording (DEDR) 201, such as an electrocardiogram recording, a Holter recording, an encephalogram recording, a brain stem evoked potential recording and the like, is obtained by input device 101 of Fig. 7. The DEDR 201 is transferred to computing device 103, as described also in Fig. 7, and then undergoes a complexity calculation 203 and an indicative parameter calculation 204. The complexity calculation 203 performed on the DEDR stream 201 is preferably done on at least one substantially small part of the data. Complexity calculation 203 can be performed automatically, as predefined in parameters within computing device 103, also of Fig. 7. Said calculation can be done on at least one substantially small selected part of said data, as predefined in predefined database 205. The calculation can also be performed on at least one substantially small selected region of interest 206 of said data by the user (not shown) using the user interface device 104, also of Fig. 7. Indicative calculation 204, previously described in the related applications hereinabove, is a quantitative data element calculated with respect to predefined parameters such as previously inputted DEDR streams (i.e. a normal ECG result for an 18 year old African American male, etc.), predefined formulas describing known and predicted DEDR stream behavior and patterns (i.e. LAD deviation in an otherwise healthy 18 year old African American obese male, VT in trauma patients, etc.), as well as other parameters such as age, social circumstances, body stature, racial origin, occupation, previous illnesses, current illnesses and the like. Said data can be stored beforehand as well as stored continuously during operation. Said data can be stored on the predefined database 205 as well as on any database device (not shown) connected to computing device 103 of Fig. 7, as well as on any remote database devices (also not shown).
Calculated indicative parameter 204 can be displayed to the user (not shown) on the user interface device 104, also of Fig. 7. Said parameter can also be saved on the computing device 103 of Fig. 7, as well as sent to other computer devices (not shown) by methods known in the art. Calculated indicative parameter 204 can be converted to an easy to understand final result 207, such as the certainty of an electric data record (EDR) diagnosis; an image representation of the EDR, such as an electrograph, such as an ECG strip and the like; a summary of the selected streaming EDR input; a region of interest of the streaming EDR selected by predefined parameters located within the predefined database 205; a suggested immediate therapy indication; and the like. Final result 207 is then transferred to the user interface 104, also of Fig. 7, and displayed 208 to the user (not shown).

The DEDR stream 201 received from input devices 101 and 102 of Fig. 7 can be displayed to the user as an image display 209 during the system's operation, thus allowing the user to observe and, if needed, to manipulate the system operation in real time using the user interface device 104, also of Fig. 7, as previously discussed.
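The Fig. 8 flow (select a small window, compute its complexity, compare against the predefined database 205, convert to a final result 207) can be sketched as follows. The sign-change "complexity", the database key and the numeric ranges below are invented for illustration; the actual measures belong to the related applications.

```python
# Names, the sign-change "complexity", and the database contents below
# are illustrative assumptions only.

def window_complexity(samples, start, length):
    """Complexity of one substantially small part of the stream,
    approximated here by counting slope sign changes."""
    w = samples[start:start + length]
    diffs = [b - a for a, b in zip(w, w[1:])]
    return sum(1 for a, b in zip(diffs, diffs[1:]) if a * b < 0)

# stand-in for predefined database 205: expected range per population key
PREDEFINED_DB = {"adult_rest_ecg": (2, 8)}

def indicative_parameter(samples, key, start=0, length=32):
    c = window_complexity(samples, start, length)
    lo, hi = PREDEFINED_DB[key]
    return c, lo <= c <= hi

def final_result(samples, key):
    """Convert the indicative parameter into a displayable result 207."""
    c, normal = indicative_parameter(samples, key)
    return f"complexity={c}: " + (
        "within predefined range" if normal else "flagged for review")

stream = [0, 3, 1, 4, 2, 5, 3, 6] * 4    # stand-in for a DEDR stream 201
print(final_result(stream, "adult_rest_ecg"))
```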
The present invention also provides for a system and method for analysis and evaluation of an Audio Data Stream (ADS) obtained from various instruments measuring analog audio signals, such as a stethoscope, a fetoscope, etc. The system and method can be used for non-invasive diagnosis. The invention discloses a system and method according to which an audio data complexity calculation can be implemented on the audio data recording streams of medical examination tools, as well as of other tools measuring and displaying audio data records. The input data is recorded in real time via audio sensitive instruments previously described in the art and mentioned hereinabove. Streaming data of different analog audio frequencies, referred to as an Audio Data Stream (ADS), is recorded and then converted to a digital form by an audio to digital conversion element. The digital recording is received by the application. A complexity calculation of at least a
part of the data is performed. An indicative parameter is calculated using the complexity calculation according to predefined information obtained beforehand. The indicative parameter is manipulated so as to supply the output result. The output result is useful in the analysis, quantification and evaluation of the Audio Data Stream (ADS).
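The conversion of the analog ADS to digital form can be pictured as uniform sampling followed by quantization, with a complexity value then taken over the digital stream. The sketch below is illustrative only: the sampling rate, level count and the crude "fraction of level changes" complexity are assumptions, not part of the described system.

```python
import math

# Illustrative stand-in for the audio to digital conversion element and
# a subsequent complexity value; all parameters are assumptions.

def sample_and_quantize(analog, rate_hz, seconds, levels=16):
    """Uniformly sample a callable analog signal in [-1, 1] and map
    each sample to an integer level."""
    out = []
    for n in range(int(rate_hz * seconds)):
        x = analog(n / rate_hz)
        q = min(levels - 1, int((x + 1) / 2 * levels))
        out.append(q)
    return out

def level_change_complexity(digital):
    """Crude complexity: fraction of consecutive samples whose
    quantized level differs."""
    changes = sum(1 for a, b in zip(digital, digital[1:]) if a != b)
    return changes / max(1, len(digital) - 1)

tone = lambda t: math.sin(2 * math.pi * 5 * t)   # 5 Hz test tone
digital = sample_and_quantize(tone, rate_hz=200, seconds=1.0)
print(round(level_change_complexity(digital), 3))
```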
Turning now to Fig. 9, wherein parts of the system of the present invention and an exemplary operation are disclosed and referenced 100. Input device 101 is an Audio Medical Diagnostic Tool (AMDT) such as a stethoscope, such as a 3M stethoscope from Littmann, a fetoscope, a vascular Doppler, as well as any other instrument capable of receiving and preferably enhancing audio signals, such as analog audio signals, to produce data. Data could include, for example, a heart beat murmur, a blood vessel flow pattern, and the like. In Fig. 9 input device 101 is typically manipulated by user 106 and is placed preferably in close proximity to the area of interest of subject 102. Input device 101 is preferably manipulated by the user 106 such that an optimal location is chosen for Audio Data Stream (ADS) collection. In Fig. 9 only one input device is depicted for the sake of clarity. It will be evident to the person skilled in the art that any number of input devices, as well as different types of input devices, can be connected to the other elements of the system, such as the user 106. Analog data obtained by input device 101 is transferred, preferably via an acoustic conductor or any other suitable form known in the art, to the processing unit 105. Processing unit 105 is functional in converting the ADS from analog to digital format, as well as in enhancing and filtering the ADS and in transmitting said data to computing device 103 and user 106 via a suitable cable, an Infra Red (IR) apparatus, a modem device and similar transfer means of digital information. The parameters used by processing unit 105 can be located within the processing unit 105, received from user 106 by way of user interface device 104, stored on storage device 107, as well as in other locations outside the proposed system (not shown). Processing unit 105 can be located in substantial proximity to input device 101 such that sound propagation
need not be enhanced. Computing device 103 is a software program or a hardware device such as a PC computer, a hand held computer such as a Pocket PC, and the like. Within computing device 103 the input received from processing unit 105 is processed and output data is transferred to the interface device 104. Interface device 104 may be a computer screen, a hand held computer such as a Palm Pilot, a monitor screen, a printer device, a speaker system, headphones, as well as other interface devices capable of transferring visual, audio and other information to the human user 106. The output data can be stored on a storage device 107 such as a computer hard disk, as well as on any other storage device. The output data can also be sent for storage, viewing and manipulation to other parties by hard wire (not shown), IR device (not shown) or any other transfer modality. Interface device 104 can be used by user 106 to alter the operation of input device 101, computing device 103, processing unit 105 or any of them. Such activity can be done by the user 106 via direct human interaction with the interface device, such as by touch, speech, manipulation of an attached mouse device and the like. Operation alterations can include manipulating sensitivity range, enhancing range, filtering modes and the like. User 106 can directly manipulate input device 101, typically by touch but also by any other means compatible with input device 101. User 106 can also receive the raw analog ADS directly from input device 101, preferably by an acoustic cable device such as the hollow polymer tube used with a stethoscope, by air conduction through an analog audio amplifier-speaker device (not shown), as well as by any other acceptable means of sound transport. Output information of computing device 103, as well as other data information of storage device 107, can be viewed on the user interface device 104 as graphs, pictures, summary analyses and the like, as well as manipulated by the user 106 for other purposes such as transferring the data, saving the output data, and the like. Said output information as well as data information can also become available to the user as audio data
excerpts via the user interface device 104, an analog audio amplifier-speaker device (not shown), etc.
It may be evident to the person skilled in the art that the audio-digital converter, as well as an amplifier and the processing unit 105, can be located in any number of locations throughout the system such that data can be used and transferred efficiently. For example, it is preferable that the processing unit 105 be located close to input device 101 such that analog data will not be transferred over long distances, thus reducing the amount of data lost by natural dissipation of sound, as well as reducing external interferences. Another example can relate to user 106, such that the user may receive digital as well as analog data through the user interface device 104, the input device 101, as well as from processing unit 105. User 106 can use such aids as a digital to audio converter (not shown) and an audio transferring means such as a stethoscope hollow polymer tube. User 106 can also perceive analog audio signals transferred via air from an amplifier and microphone located on user interface device 104 as well as on other elements of the system.
Turning now to Fig. 10, where the operation of the system 200 of the present invention is disclosed. An Audio Data Stream (ADS) 201, such as a discussion between user 106 and subject 102, both of Fig. 9, during a psychiatric interview, is obtained by input device 110 of Fig. 9. A Video Data Stream (VDS) 211, such as the continuous video of the subject 102 of Fig. 9 during a psychiatric interview, is obtained by input device 101 of Fig. 9. The ADS 201 and VDS 211 are optionally transferred by suitable means to the processing unit 105 of Fig. 9, where manipulation 202 of the received data is then performed. The manipulations can include amplification, filtering, audio to digital conversion, color correction, and any other manipulations that can be done on audio and video data for the purpose of receiving pure digitalized audio and video data from the preferred target. Working parameters and databases for the manipulation process 202 are obtained from predefined data located in the processing unit 105 of Fig. 9, database
107, as well as directly from user 106 of Fig. 9 and through user interface device 104. It can be easily understood by the person skilled in the art that any of the above mentioned operations can be performed in other locations within the system, such as within computing device 103, also of Fig. 9, as well as in other locations and outside the said system. The manipulated ADS is then transferred to computing device 103, as described also in Fig. 9, and then undergoes a complexity calculation 203 and an indicative parameter calculation 204. The complexity calculation 203 performed on the ADS stream 201 is preferably done on at least one substantially small part of the data. Complexity calculation 203 can be performed automatically, as predefined in parameters within computing device 103, also of Fig. 9. The calculation can be accomplished on at least one substantially small selected part of said data, as predefined in database 205. Said calculation can also be performed on at least one substantially small selected region of interest 206 of said data by the user (not shown) using the user interface device 104, also of Fig. 9. Indicative calculation 204, previously described in the related applications hereinabove, is a quantitative data element calculated with respect to predefined parameters such as previously inputted ADS streams (i.e. normal breathing sounds of an 18 year old White female, etc.), predefined formulas describing known and predicted ADS stream behavior and patterns (i.e. pneumonia in a 60 year old African American obese male, intussusception obstruction in a 5 year old male, vascular narrowing of the dorsalis pedis artery of a patient with vascular insufficiency, etc.), as well as other parameters such as age, social circumstances, body stature, racial origin, occupation, previous illnesses, current illnesses and the like. Said data can be stored beforehand as well as stored continuously during operation. Said data can be stored on the predefined database 205 as well as on any database device (not shown) connected to computing device 103 of Fig. 9, as well as on any remote database devices (also not shown). Calculated indicative parameter 204 can be displayed to the user in a raw state (not shown) on the user interface device 104, also of Fig. 9. Said
parameter can also be saved on the computing device 103 of Fig. 9, as well as sent to other computer devices (not shown) by methods known in the art. Calculated indicative parameter 204 can be converted to an easy to understand final result 207, such as the certainty of an auscultatory audio stream diagnosis with enhancement of abnormal findings; an audio and image representation of the ADS, such as an audiogram, such as an audiogram manipulated to a graphic representation on paper; a summary of the selected streaming ADS input; a region of interest of the streaming ADS selected by predefined parameters located within the predefined database 205; a suggested immediate therapy indication; and the like. Final result 207 is then transferred to the user interface 104, also of Fig. 9, and is played and displayed 208 to the user (not shown). The audio data stream 201, as well as the manipulated ADS 202, can be directly transferred to the user (not shown) and to the user interface device 104, also of Fig. 9, for observation and supervision, as well as for the manipulation of the location of the input device 101 of Fig. 9 and of the manipulation processes 202. User 106 of Fig. 9 can preferably control all steps of information acquisition, manipulation, viewing, listening, storing, sending and the like. The ADS stream 202 can be played to the user as a sound track 210 as well as displayed as an image display 209 during the system's operation, thus allowing the user to observe and, if needed, to manipulate the system operation in real time using the user interface device 104, also of Fig. 9, as well as his hands, as previously discussed.
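The manipulation step 202 (amplification and filtering applied before the complexity calculation) can be sketched in a few lines; the gain value and the moving-average window below are illustrative assumptions, not parameters of the described system.

```python
# Illustrative stand-in for manipulation step 202: gain followed by a
# simple FIR low-pass filter. Window size and gain are assumptions.

def amplify(samples, gain):
    """Scale every sample by a fixed gain."""
    return [gain * x for x in samples]

def moving_average(samples, width=3):
    """Mean over a sliding window, shrinking at the edges."""
    half = width // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

raw = [0, 10, 0, 10, 0, 10]               # noisy stand-in for an ADS
clean = moving_average(amplify(raw, 2.0))
print(clean)
```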
The present invention also provides for a system and method for analysis and evaluation of Human Behavior Stigmata (HBS) obtained from various instruments measuring audio and visual output emanating from a patient, more specifically video and sound capturing instruments. The system and method can be used for non-invasive diagnosis, prognosis and treatment evaluation. The invention discloses a system and method according to which an audio and video complexity calculation can be implemented on audio and video data recordings of human psychiatric patients, as well as of other subjects for whom the study of
behaviour patterns is of relevance. The input data is recorded in real time via audio and video sensitive instruments previously described. The streaming audio and video data is then recorded digitally. The digital recording is received by the application. A complexity calculation of at least a part of the data is performed. An indicative parameter is calculated using the complexity calculation according to predefined information obtained beforehand. The indicative parameter is used for calculation and transformation such that a final result can be displayed to the user. The final result can point to areas of interest in the HBS stream, facilitate diagnosis, suggest treatment, and be used as a prognostic marker, as well as provide other forms of medically relevant data. Thus, the output of the system is useful in the evaluation and quantification of behavior, more specifically of Human Behavior Stigmata (HBS), more specifically of the psychiatric disturbances of human behavior.
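For HBS the complexity calculation runs on both an audio and a video stream and yields one indicative parameter. One plausible reading is a weighted combination of two per-stream complexities, sketched below; the toy complexity measure and the equal weighting are assumptions made for illustration only.

```python
# Illustrative combination of audio and video complexities into one
# indicative parameter; the measure and the weights are assumptions.

def stream_complexity(values):
    """Toy complexity: number of distinct consecutive-value pairs."""
    pairs = set(zip(values, values[1:]))
    return len(pairs)

def hbs_indicative_parameter(audio, video, w_audio=0.5):
    """Weighted blend of the two per-stream complexities."""
    ca = stream_complexity(audio)
    cv = stream_complexity(video)
    return w_audio * ca + (1 - w_audio) * cv

audio = [1, 2, 1, 2, 1]       # stand-in for a digitized ADS
video = [5, 5, 6, 7, 5]       # stand-in for a per-frame VDS feature
print(hbs_indicative_parameter(audio, video))
```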
Turning now to Fig. 11, wherein parts of the system of the present invention are disclosed and referenced 100. User 106 is interacting with subject 102. Such interaction is verbal. Input devices 101 and 110 are typically directed at subject 102 and situated in such a location as to maximize data collection and minimize interference with the interaction of user 106 and subject 102, whereby audio and visual data is obtained. Input device 101 is a visual capturing device such as a video camera, such as a Sony camcorder, manufactured in Japan, as well as any other instrument capable of capturing streaming visual signals. Input device 110 is an audio capturing device such as a tape recorder, such as a Sony tape recorder, a microphone device such as a wireless microphone from Polycome, as well as any streaming audio capturing device. In Fig. 11 only two input devices are depicted for the sake of clarity. It will be evident to the person skilled in the art that any number of input devices, as well as different types of input devices, can be connected to the computing device 103 via processing unit 105. Furthermore, it will be appreciated by the person skilled in the art that any device combining an audio as well as a video device can be used in place of the two input devices 101 and
110 illustrated in Fig. 11. Data obtained by input devices 101 and 110 is transferred via cable, modem, Infra Red (IR) or any other form known in the art to a processing unit 105. Analog data obtained by input devices 101 and 110 can be transformed into a digital format therein, or is preferably transferred to the processing unit 105. Processing unit 105 is functional in converting audio and video data from analog to digital format, as well as in enhancing and filtering said data and in transmitting said data to computing device 103 and user 106 via a suitable cable, an IR apparatus, a modem device and similar transfer means of digital information. The parameters used by processing unit 105 can be located within the processing unit 105, received from user 106 by way of user interface device 104, stored on storage device 107, as well as in other locations outside the proposed system (not shown). It will be evident to the person skilled in the art that many known input devices contain therein processing units such as processing unit 105, such that with many such input devices the existence of processing unit 105 in system 100 is optional, and input devices 101 and 110 can transfer digital format, enhanced and filtered information directly to computing device 103. Computing device 103 is a software program or a hardware device such as a PC computer, a hand held computer such as a Pocket PC, and the like. Within the computing device 103 the input received from input devices 101 and 110 is processed and output data is transferred to the interface devices 104. Interface devices may be a computer screen such as an LG Studioworks 57I, a hand held computer such as a Palm Pilot manufactured by the Palm Corporation, a monitor screen, a television device, an interactive LCD screen, a paper record, a speaker device, as well as other interface devices functional in conveying video as well as audio information. The output data can be stored on a storage device 107 such as a computer hard disk, as well as on any other storage device. The output data can also be sent for storage, viewing and manipulation to other parties by hard wire (not shown), IR device (not shown) or any other transfer modality, including via a data network (not shown). Interface
device 104 may be used to alter the operation of input devices 101 and 110, of computing device 103, or of any other part of the system. Such activity can be done by the user 106 via direct human interaction, such as by touch, speech, manipulation of an attached mouse device and the like. Output information can be viewed as graphs, pictures, audio excerpts, summary analyses and the like, as well as manipulated by the user 106 for other purposes such as transferring the data, saving the output and the like.
Turning now to Fig. 12, where the operation of the system 200 of the present invention is disclosed. An Audio Data Stream (ADS) 201, such as a discussion between user 106 and subject 102, both of Fig. 11, during a psychiatric interview, is obtained by input device 110 of Fig. 11. A Video Data Stream (VDS) 211, such as the continuous video of the subject 102 of Fig. 11 during a psychiatric interview, is obtained by input device 101 of Fig. 11. The ADS 201 and VDS 211 are optionally transferred by suitable means to the processing unit 105 of Fig. 11, where manipulation 202 of the received data is then performed. The manipulations can include amplification, filtering, analog to digital conversion, color correction, and any other manipulations that can be done on audio and video data for the purpose of receiving pure digitalized audio and video data from the preferred target. Working parameters and a database for the manipulation process 202 are obtained from predefined data located in processing unit 105 of Fig. 11, database 107, as well as directly from user 106 of Fig. 11 and through a user interface device 104, also of Fig. 11. It can be easily understood by the person skilled in the art that any of the above mentioned operations can be performed in other locations within the system, such as within the computing device 103, also of Fig. 11, as well as in other locations and outside the said system (not shown). The manipulated ADS and VDS are then transferred to the computing device 103, as described also in Fig. 11. The manipulated ADS and VDS then undergo a complexity calculation 203. The complexity calculation 203 performed on the ADS and VDS stream 201 is preferably done on
at least one substantially small part of the data. Complexity calculation 203 can be performed automatically according to parameters predefined within the computing device 103, also of Fig. 11, or as predefined in database 205. Said complexity calculation can also be performed on at least one substantially small selected region of interest 206 of said data by the user (not shown) using the user interface device 104, also of Fig. 11. Complexities obtained at step 203 can be stored in computing device 103, database 205 or other appropriate locations in system 200 for further use and reference. Indicative parameter calculation 204 is then calculated from the resulting complexities obtained at step 203. The indicative calculation 204 is a quantitative and qualitative data element, calculated according to predefined parameters such as previously inputted ADS and VDS streams (i.e. the normal mimics, gestures and oration of a healthy young adult), predefined formulas describing known and predicted ADS and VDS stream behaviors and patterns (i.e. the typical body gestures as well as speech of a manic patient, etc.), as well as other parameters such as age, social circumstances, racial origin, occupation, previous illnesses, concurrent illnesses and the like. Said data can be stored beforehand as well as stored continuously during operation. Said data can then be stored on the predefined database 205 as well as on any database device (not shown) connected to the computing device 103 of Fig. 11, as well as on any remote database devices (also not shown). Calculated indicative parameter 204 can then be displayed to the user in a raw state (not shown) on the user interface device 104, also of Fig. 11. Said parameter can also be saved on the computing device 103 of Fig. 11, as well as sent to other computer devices (not shown) by methods known in the art. Calculated indicative parameter 204 can then be converted to an easy to understand final result 207, such as an audio and image replay of a part of the interview with enhancement of abnormal findings; a probable diagnosis; an audio and image representation of the findings, such as an exemplary image of a certain gesture, mimic, word use, etc.; a summary of the streaming ADS and VDS inputs selected; a region of interest of the streaming
ADS and VDS selected by predefined parameters located within the predefined database 205; a suggested therapy indication; a statistical probability of response to therapy; and the like. Final result 207 is then transferred to the user interface 104, also of Fig. 11, and is then played and displayed 208 to the user (not shown). The ADS 201 and VDS 211, as well as the ADS manipulated at manipulation process 202, can be directly transferred to the user (not shown) and to the user interface device 104, also of Fig. 11, for observation and supervision, as well as for the manipulation of the type, location and other properties of input devices 101 and 110 of Fig. 11 and of the manipulation processes 202. User 106 of Fig. 11 can preferably control all steps of information acquisition, manipulation, viewing, listening, storing, sending and the like. The ADS stream 202 can be played to the user as a sound track 210 as well as displayed as a video or image display 209 during system 200 operation, thus allowing the user to observe and, if needed, to manipulate system 200 operation in real time using the user interface device 104, also of Fig. 11, as well as through other forms of communication with user interface device 104, as previously discussed.
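The final result 207 described above includes regions of interest of the streams selected by predefined parameters. A minimal sketch of such selection, assuming a per-window variability score and a fixed threshold (both invented for illustration), might look like this:

```python
# Illustrative region-of-interest selection; window size, the spread
# score and the threshold are assumptions, not taken from the text.

def windowed(values, size):
    """Split a stream into consecutive non-overlapping windows."""
    return [values[i:i + size] for i in range(0, len(values) - size + 1, size)]

def regions_of_interest(values, size, threshold):
    """Return (start_index, window) pairs whose value spread exceeds
    the threshold, as candidate regions for replay to the user."""
    rois = []
    for k, w in enumerate(windowed(values, size)):
        spread = max(w) - min(w)
        if spread > threshold:
            rois.append((k * size, w))
    return rois

stream = [1, 1, 1, 9, 1, 1, 1, 1]    # stand-in for a per-frame VDS feature
print(regions_of_interest(stream, size=4, threshold=5))
```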
The person skilled in the art will appreciate that what has been shown is not limited to the description above. Many modifications and other embodiments of the invention will be appreciated by those skilled in the art to which this invention pertains. It will be apparent that the present invention is not limited to the specific embodiments disclosed, and those modifications and other embodiments are intended to be included within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
It should be evident to the person skilled in the art that the illustration brought hereinabove is but an example, and many other uses and configurations of the system are possible within the scope of the present invention.