Patent 2656305 Summary

(12) Patent Application: (11) CA 2656305
(54) English Title: AN AUTOMATED REPORTING SYSTEM
(54) French Title: SYSTEME DE TRANSMISSION DE DONNEES AUTOMATISEES
Status: Dead
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06Q 10/00 (2012.01)
(72) Inventors :
  • LAWIE, DAVID (Australia)
(73) Owners :
  • IOGLOBAL PTY LTD (Not Available)
(71) Applicants :
  • IOGLOBAL PTY LTD (Australia)
(74) Agent: MBM INTELLECTUAL PROPERTY LAW LLP
(74) Associate agent:
(45) Issued:
(22) Filed Date: 2009-02-24
(41) Open to Public Inspection: 2010-08-24
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract




An automated reporting system is disclosed which comprises
means for receiving quality control/quality assurance
related data on-line from a remote location, means for
facilitating on-line definition by a user of report
criteria for a quality control/quality assurance report
based on the received data, and a report engine arranged
to produce a quality control/quality assurance report
according to the defined criteria and based on the
received data. A corresponding method is also disclosed.


Claims

Note: Claims are shown in the official language in which they were submitted.





Claims:


1. An automated reporting system comprising:
means for receiving quality control/quality assurance
related data on-line from a remote location;
means for facilitating on-line definition by a user
of report criteria for a quality control/quality assurance
report based on the received data; and
a report engine arranged to produce a quality
control/quality assurance report according to the defined
criteria and based on the received data.


2. An automated reporting system as claimed in claim 1,
wherein the quality control/quality assurance related data
is assay data.


3. An automated reporting system as claimed in claim 2,
wherein the quality control/quality assurance related data
comprises mining related data, laboratory related data, or
metallurgy related data.


4. An automated reporting system as claimed in any one
of the preceding claims, wherein the means for
facilitating definition by a user of report criteria
comprises a web interface accessible through the Internet.

5. An automated reporting system as claimed in claim 4,
wherein the web interface comprises a web server.


6. An automated reporting system as claimed in any one
of the preceding claims, wherein the means for receiving
quality control/quality assurance related data on-line is
arranged to receive quality control/quality assurance
related data by email.


7. An automated reporting system as claimed in any one
of the preceding claims, wherein the means for receiving
quality control/quality assurance related data on-line is
arranged to receive quality control/quality assurance
related data through the web interface.


8. An automated reporting system as claimed in any one
of the preceding claims, wherein the system further
comprises a system database arranged to store quality
control/quality assurance related data received from at
least one client.


9. An automated reporting system as claimed in any one
of the preceding claims, wherein the means for receiving
quality control/quality assurance related data on-line is
arranged to extract quality control/quality assurance data
from a client hosted database.


10. An automated reporting system as claimed in claim 9,
comprising a data abstraction layer arranged to directly
interact with the client database.


11. An automated reporting system as claimed in claim 9
or claim 10, wherein the means for receiving quality
control/quality assurance related data is arranged to
extract quality control/quality assurance data from a
client hosted database on-the-fly based on criteria
defined for a client report.


12. An automated reporting system as claimed in any one
of the preceding claims, wherein the means for
facilitating definition by a user of report criteria is
arranged to facilitate selection on-line by a user of
extraction criteria defining data to be included in a
report.


13. An automated reporting system as claimed in claim 12,
wherein the extraction criteria include one or more of
project name, sample taken date or sample reported date,
element or compound, test type, laboratory, batch number,
analytical method, blank type, standard name, field
duplicate name or laboratory duplicate name.


14. An automated reporting system as claimed in any one
of the preceding claims, wherein the means for
facilitating definition by a user of report criteria is
arranged to facilitate definition of an exception profile
indicative of the level at which test data suggests a
potential area of concern.


15. An automated reporting system as claimed in claim 14,
wherein the exception profile is effective for all
elements and compounds included in the or each test.


16. An automated reporting system as claimed in any one
of the preceding claims, wherein the means for
facilitating definition by a user of report criteria is
arranged to facilitate definition of at least one
exception parameter specific to one or more element or
compound and for at least one test.


17. An automated reporting system as claimed in any one
of the preceding claims, wherein the system comprises at
least one report procedure usable by the report engine to
produce a report.


18. An automated reporting system as claimed in any one
of the preceding claims, wherein the report engine
comprises a report application usable to create required
report content and/or present report content.


19. An automated reporting system as claimed in any one
of the preceding claims, wherein the report engine
comprises a format generator application usable to export
a report in a desired format.





20. An automated reporting system as claimed in any one
of the preceding claims, wherein the system is arranged to
facilitate downloading of a report by a user.


21. A method of generating a quality control/quality
assurance report comprising:
receiving quality control/quality assurance related
data on-line from a remote location;
facilitating on-line definition by a user of report
criteria for a quality control/quality assurance report
based on the received data; and
producing a quality control/quality assurance report
according to the defined criteria and based on the
received data.


22. A method as claimed in claim 21, wherein the quality
control/quality assurance related data is assay data.


23. A method as claimed in claim 22, wherein the quality
control/quality assurance related data comprises mining
related data, laboratory related data, or metallurgy
related data.


24. A method as claimed in any one of claims 21 to 23,
wherein the step of facilitating definition by a user of
report criteria comprises providing a web interface
accessible through the Internet.


25. A method as claimed in claim 24, wherein the web
interface comprises a web server.


26. A method as claimed in any one of claims 21 to 24,
comprising receiving quality control/quality assurance
related data by email.


27. A method as claimed in any one of claims 21 to 26,
comprising receiving quality control/quality assurance
related data through the web interface.


28. A method as claimed in any one of claims 21 to 27,
comprising storing quality control/quality assurance
related data received from at least one client.


29. A method as claimed in any one of claims 21 to 28,
comprising extracting quality control/quality assurance
data from a client hosted database.


30. A method as claimed in claim 29, comprising providing
a data abstraction layer arranged to directly interact
with the client database.


31. A method as claimed in claim 29 or claim 30,
comprising extracting quality control/quality assurance
data from a client hosted database on-the-fly based on
criteria defined for a client report.


32. A method as claimed in any one of claims 21 to 31,
facilitating selection on-line by a user of extraction
criteria defining data to be included in a report.


33. A method as claimed in claim 32, wherein the
extraction criteria include one or more of project name,
sample taken date or sample reported date, element or
compound, test type, laboratory, batch number, analytical
method, blank type, standard name, field duplicate name or
laboratory duplicate name.


34. A method as claimed in any one of claims 21 to 33,
comprising facilitating definition of an exception profile
indicative of the level at which test data suggests a
potential area of concern.


35. A method as claimed in claim 34, wherein the
exception profile is effective for all elements and
compounds included in the or each test.


36. A method as claimed in any one of claims 21 to 35,
comprising facilitating definition of at least one
exception parameter specific to one or more element or
compound and for at least one test.


37. A method as claimed in any one of claims 21 to 36,
comprising providing at least one report procedure usable
to produce a report.


38. A method as claimed in any one of claims 21 to 37,
comprising facilitating downloading of a report by a user.

39. An automated reporting system substantially as
hereinbefore described with reference to, and as shown in,
the accompanying drawings.


40. A method of generating a quality control/quality
assurance report substantially as hereinbefore described
with reference to, and as shown in, the accompanying
drawings.

Description

Note: Descriptions are shown in the official language in which they were submitted.




AN AUTOMATED REPORTING SYSTEM
Field of the Invention

The present invention relates to an automated reporting
system and, in particular, to an automated reporting
system for use in generating quality assurance/quality
control reports from mining related data.

Background of the Invention

Quality assurance/quality control reports have
traditionally taken significant amounts of time to
complete, are often inconsistent and in some cases include
mathematically incorrect data.

Summary of the Invention

In accordance with an aspect of the present invention,
there is provided an automated reporting system
comprising:
means for receiving quality control/quality assurance
related data on-line from a remote location;
means for facilitating on-line definition by a user
of report criteria for a quality control/quality assurance
report based on the received data; and
a report engine arranged to produce a quality
control/quality assurance report according to the defined
criteria and based on the received data.
In one embodiment, the quality control/quality assurance
related data is assay data, and may be mining related
data, laboratory related data, metallurgy related data, or
any other data derived from at least one quality
control/quality assurance test.

In one embodiment, the means for facilitating definition
by a user of report criteria comprises a web interface
accessible through the Internet. The web interface may
comprise a web server.

In one embodiment, the means for receiving quality
control/quality assurance related data on-line is arranged
to receive quality control/quality assurance related data
by email.

In one embodiment, the means for receiving quality
control/quality assurance related data on-line is arranged
to receive quality control/quality assurance related data
through the web interface.

In one embodiment, the system further comprises a system
database arranged to store quality control/quality
assurance related data received from at least one client.
In addition, or alternatively, the means for receiving
quality control/quality assurance related data on-line is
arranged to extract quality control/quality assurance data
from a client hosted database. With this embodiment, the
system may further comprise a data abstraction layer
arranged to directly interact with the client database.
The means for receiving quality control/quality assurance
related data of this embodiment may be arranged to extract
quality control/quality assurance data from a client
hosted database on-the-fly based on criteria defined for a
client report.

The means for facilitating definition by a user of report
criteria may be arranged to facilitate selection on-line
by a user of extraction criteria defining data to be
included in a report.

The extraction criteria may include one or more of project
name, sample taken date or sample reported date, element
or compound, test type, laboratory, batch number,
analytical method, blank type, standard name, field
duplicate name or laboratory duplicate name.
The means for facilitating definition by a user of report
criteria may be arranged to facilitate definition of an
exception profile indicative of the level at which test
data suggests a potential area of concern.
The exception profile may be effective for all elements
and compounds included in the or each test.

The means for facilitating definition by a user of report
criteria may be arranged to facilitate definition of at
least one exception parameter specific to one or more
element or compound and for at least one test.

In one arrangement, the system comprises at least one
report procedure usable by the report engine to produce a
report.

The report engine may comprise at least one report
application usable to create required report content
and/or present report content, such as a chart generation
application and a format generator application usable to
export a report in a desired format.

The system may be arranged to facilitate downloading of a
report by a user.

The system may comprise a control unit arranged to control
and coordinate operations in the system.

Brief Description of the Drawings

The present invention will now be described, by way of
example only, with reference to the accompanying drawings,
in which:

Figure 1 is a schematic block diagram of an automated
reporting system in accordance with an embodiment of the
present invention;
Figure 2 is a diagrammatic representation of a log in
screen produced by the system shown in Figure 1;
Figure 3 is a diagrammatic representation of a home
page produced by the system shown in Figure 1, and showing
an exception profiles box;
Figure 4 is a diagrammatic representation of a
profile definition screen produced by the system shown in
Figure 1;
Figure 5 is a diagrammatic representation of a
compounds box produced by the system shown in Figure 1;
Figure 6 is a diagrammatic representation of an
element parameters box produced by the system shown in
Figure 1;
Figure 7 is a diagrammatic representation of a home
page produced by the system shown in Figure 1 and showing
a criteria box;
Figure 8 is a diagrammatic representation of a home
page produced by the system shown in Figure 1 and showing
an advanced criteria box;
Figure 9 is a diagrammatic representation of a home
page produced by the system shown in Figure 1 and showing
a report box;
Figure 10 is a diagrammatic representation of a
report summary produced by the system shown in Figure 1;
Figure 11 is a diagrammatic representation of a home
page produced by the system shown in Figure 1 and showing
a quality control report;
Figure 12 is a diagrammatic representation of a
sizing graph produced by the system shown in Figure 1;
Figure 13 is a diagrammatic representation of a
contamination graph produced by the system shown in Figure 1;
Figure 14 is a diagrammatic representation of a
standards graph produced by the system shown in Figure 1;
Figure 15 is a diagrammatic representation of a
multiple elements standard graph produced by the system
shown in Figure 1;
Figure 16 is a diagrammatic representation of an
individual run chart produced by the system shown in
Figure 1;
Figures 17 to 22 show field repeatability charts
produced by the system shown in Figure 1;
Figures 23 to 25 show laboratory repeatability charts
produced by the system shown in Figure 1; and
Figure 26 is a schematic block diagram of an
automated reporting system in accordance with an
alternative embodiment of the present invention.

Description of an Embodiment of the Invention

Referring to the drawings, in Figure 1 there is shown a
schematic block diagram of an automated reporting system
10 usable on-line to generate quality control/quality
assurance reports based on stored data.

In this example, an embodiment is described wherein the
stored data is mining related data derived from mining
related tests and the reports are quality
assurance/quality control (QA/QC) reports based on the
mining related data. However, it will be understood that
the system may also be arranged to produce QA/QC reports
based on other types of data, such as metallurgy related
data, laboratory related data, or any other data derived
from at least one quality control/quality assurance test.

The system 10 includes a database 12 arranged to store
data used to form the basis of a report, and a web
interface device 14, which in this example includes a web
server 15, for facilitating interaction with the system 10
by remote computing devices 16 through a communications
network, in this example the Internet 18. The web
interface also includes an upload application 17 arranged
to facilitate uploading of QA/QC data from a user personal
computing device 16.

In this example, the upload application 17 is arranged to
facilitate reception of QA/QC data from a client by email,
or by any suitable file transfer mechanism which may be
implemented through the web interface.

It will be understood that the computing devices 16 may be
any communications enabled computing device such as
personal computing devices, PDAs, web enabled mobile
phones, and so on.

The system 10 also includes a control unit 20 which may
include a processor and associated operational software
for controlling and coordinating operations in the system
10 including controlling operation of the web interface
14, controlling operation of the upload application 17,
and controlling communications between the web interface
14 and the database 12.

The control unit 20 also interacts with a report engine 22
arranged to generate reports in response to instructions
received from the control unit 20. The report engine 22
operates in accordance with procedures stored in a
procedures database 24, the procedures defining the
framework for extraction of desired data from the database
12, processing of the data so as to produce transformed
data as required, and presentation of the extracted data
in a report. The report engine 22 may include one or more
dependent applications usable to create required report
content from the extracted data and/or to create
particular content presentation. In this example, the
report engine 22 includes a chart generator application 26
which is executed by the report engine 22 as required to
generate defined charts from the extracted data, and a
format generator application 28 which may be executed to
export a report in a selected format.
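
The flow just described, in which a stored procedure drives extraction, a chart generator creates content, and a format generator exports the result, can be summarised in code. The sketch below is illustrative only; every name in it (ReportEngine, Procedure, the callables) is an assumption, not an identifier from the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Procedure:
    """Framework for one report: extraction, transformation, presentation."""
    extract: Callable[[Dict], List[dict]]          # pull rows matching report criteria
    transform: Callable[[List[dict]], List[dict]]  # produce transformed data, e.g. CV%
    present: Callable[[List[dict]], str]           # lay out content, e.g. via charts


class ReportEngine:
    """Runs a stored procedure against criteria and exports the result."""

    def __init__(self, procedures: Dict[str, Procedure]):
        self.procedures = procedures          # stand-in for the procedures database 24

    def run(self, name: str, criteria: Dict, fmt: str = "html") -> str:
        proc = self.procedures[name]
        rows = proc.extract(criteria)         # extract required data from the database
        rows = proc.transform(rows)           # process into transformed data as required
        content = proc.present(rows)          # create report content (charts, tables)
        return self._export(content, fmt)     # format generator: export in desired format

    def _export(self, content: str, fmt: str) -> str:
        # Trivial stand-in for the format generator application.
        return f"[{fmt} report]\n{content}"
```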

It will be appreciated that the database 12 may include
data associated with one or multiple users, or data
associated with multiple organisations such that only
authorised users or authorised users associated with an
organisation are permitted to access the respective data.
Hereinafter, individual users, multiple users and users
associated with organisations will be referred to as
"users" for ease of reference.

During use, a user desiring to create a report from data
stored in the database 12 connects to the web interface 14
through the Internet 18 using a computing device 16.
After appropriate identification of the user at a log in
screen 40 shown in Figure 2, for example by entering user
name and password details into log in boxes 42 and
clicking a log in button 44, the user is granted access to
data associated with the user stored on the database 12.
In particular, a user is able to use the web interface 14
to define criteria for a desired report and to
subsequently view the created report on the web interface
14.

When the report criteria have been defined, the report
criteria are passed from the web interface 14 to the
report engine 22 by the control unit 20 and the report
engine 22, using the procedures stored in the procedures
database 24, extracts the required data from the database
12 according to the report criteria. The report engine 22
then creates a report using the extracted data and using
any required report application, such as the chart
generator 26. The created report is then passed back to
the web interface 14 through the control unit 20 and is
served to the user's computing device 16 through the
Internet 18 for viewing on the computing device 16.

A specific example of the system 10 during operation will
now be described in relation to Figures 3 to 25 which
represent screens served to a user computing device 16 by
the web server 15 during use.

After successful log in using a log in screen 40 as shown
in Figure 2, a home page 50 as shown in Figure 3 is served
to the user's computing device 16 by the web server 15.
It will be understood that the home page 50 includes
information and selection options which are specific to
the user in that only data associated with the user is
accessible, and user specific profiles and other user
defined report specific information are displayed, as
discussed in more detail below.

The home page 50 in this example is displayed using a
conventional web browser and, as such, includes an address
box 52 usable to enter a web page address, and navigation
buttons 54 usable to navigate between web pages. The home
page 50 also includes a links bar 56 on which several
links 57 are disposed, the links being usable to navigate
to different parts of the system.

The home page 50 also includes a report definition section
58 which is used to define profiles and criteria for a
report to be created by the system 10, and in this example
includes an exception profiles button 60, a criteria
button 62, an advanced criteria button 64, a run/export
button 66 and a notes button 68.

Activation of the exception profiles button 60, for
example using a mouse, causes a profiles box 70 to be
displayed as shown in Figure 3, the profiles box 70
including a profile selection drop down box 72 usable to
select from one or more existing predefined profiles.
Selection of a predefined profile causes profile
parameters 74 to be displayed.

The report profile parameters displayed in the profiles
box 70 define boundary limits for report data indicative
of the level at which a particular test indicates an
exception, that is, test results which indicate a
potential area of concern.

In the present example, the data stored in the database
relates to the following tests:
i) a sizing test usable to determine the efficiency
of particle size reduction processes and whether
the process conforms to specification;
ii) a contamination test usable to test for
contamination introduced between taking the
sample and carrying out analysis. Blanks may be
introduced into the laboratory either as coarse
material that will require sample preparation,
or as a pulp that will require digestion and
analysis;
iii) an accuracy test useable to determine the
accuracy of a particular laboratory over time in
detecting an analyte by using one or more
standards wherein the amount of an analyte
present in the standard is known;
iv) a field repeatability test usable to determine
process sampling precision by comparing original
and duplicate samples;
v) a laboratory repeatability test usable to
determine the sampling precision of a laboratory
by comparing results obtained at the laboratory
from original and duplicate samples.

An exception profile may be modified and new profiles may
be added by activating an open button 80, which causes a
profile definition screen 82 as shown in Figure 4 to be
displayed. The profile definition screen 82 includes an
element selection box 84 usable to select elements to be
included in the exception profile and thereby made
available for selection in a report created based on the
exception profile. In this example, selected elements 86,
elements available for selection 88 and elements not
available for selection 90 are represented differently so
that they are distinguishable from each other by a user.
An exception profile which is applicable for all selected
elements in the profile is definable using a profile
parameters box 92 which permits selection of an exception
parameter for each available test.

In this example, the exception parameters include a sizing
parameter 94 which defines the percentage amount of
particles passing a maximum size limit, such as 2mm for
coarse crush particles and 75µm for pulp; a standards
parameter 96 which defines an exception limit based on a
percentage deviation from an expected value or based on a
specific number multiple of standard deviations from the
expected value; a field repeatability parameter 98 which
defines a percentage relative difference between duplicate
samples, or is based on an expected relative error model;
a laboratory repeatability parameter 100 which defines
either a specific number multiple of the analytical
detection limit, a percentage relative difference between
duplicate samples, or defines an expected relative error
model; a blank parameter 102 which relates to a
contamination test and defines a maximum limit based on a
specific number multiple of an analytical detection limit
or based on an absolute value. A profile name may be
defined using a profile name box 108.
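
Gathered into one structure, the parameters above might look like the following sketch; the field names are taken from the prose (sizing percentage, standards model, repeatability limits, blank limit) but the shape itself is an assumption, not the patent's.

```python
from dataclasses import dataclass


@dataclass
class ExceptionProfile:
    name: str                    # profile name (box 108)
    sizing_pct_passing: float    # e.g. 90.0, the % passing the size limit
    sieve_size_um: float         # e.g. 75.0 for pulp, 2000.0 for coarse crush
    standards_limit: float       # %EV deviation, or N standard deviations
    standards_model: str         # "%EV" or "SD"
    field_rep_limit: float       # %RD limit, or %RE model parameter
    field_rep_model: str         # "%RD" or "%RE"
    lab_rep_limit: float         # x DL multiple, %RD limit, or %RE parameter
    lab_rep_model: str           # "xDL", "%RD" or "%RE"
    blank_limit: float           # x DL multiple or absolute value
    blank_model: str             # "xDL" or "AbsVal"


profile = ExceptionProfile(
    name="Default pulp profile", sizing_pct_passing=90.0, sieve_size_um=75.0,
    standards_limit=10.0, standards_model="%EV",
    field_rep_limit=20.0, field_rep_model="%RD",
    lab_rep_limit=10.0, lab_rep_model="%RD",
    blank_limit=10.0, blank_model="xDL",
)
```
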
If an element is selected having compounds associated with
it, clicking on the element will cause a compounds box 110
as shown in Figure 5 to be displayed. Using the compounds
box 110, a user is able to select an element and/or one or
more compounds including the element to be included in the
profile by marking check boxes 112.

It will be understood that the profile definition screen
82 is used to define exception profiles to apply to all
elements and compounds in a report. However, exception
parameters specific to one or more elements or compounds
may alternatively be defined by activating an individual
analyte button 114. This causes an element parameters box
116 as shown in Figure 6 to be displayed.

The element parameters box 116 enables the type of test
model for use in a particular test to be selected and also
enables individual exception parameters to be selected for
each element using parameter selection boxes 120.

In the present example, the models available for the
contamination test are based on a specific number multiple
of the analytical detection limit (x DL) or based on an
absolute value (Abs Val), the models available for the
accuracy test are based on an exception limit for a
percentage deviation from an expected value (%EV) or based
on a specific number multiple of standard deviations from
an expected value (SD), the models available for the field
repeatability test are based on a percentage relative
difference between duplicate samples (%RD), or based on an
expected relative error model (%RE), and the models
available for the laboratory repeatability test are based
on a percentage relative difference between duplicate
samples (%RD), or based on an expected relative error
model (%RE).
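
Read as formulas, these abbreviated models admit a compact sketch. The functions below are plain readings of the abbreviations (x DL, Abs Val, %EV, SD, %RD) and are assumptions about the intended arithmetic rather than definitions quoted from the patent; the %RE error-model case is omitted because its exact form is not given here.

```python
def exceeds_xdl(value: float, detection_limit: float, multiple: float) -> bool:
    """Model 'x DL': value exceeds N x the analytical detection limit."""
    return value > multiple * detection_limit


def exceeds_abs(value: float, absolute_limit: float) -> bool:
    """Model 'Abs Val': value exceeds an absolute limit."""
    return value > absolute_limit


def exceeds_pct_ev(value: float, expected: float, pct: float) -> bool:
    """Accuracy model '%EV': deviation from the expected value beyond pct %."""
    return abs(value - expected) / expected * 100.0 > pct


def exceeds_sd(value: float, expected: float, sd: float, n_sd: float) -> bool:
    """Accuracy model 'SD': more than n_sd standard deviations from expected."""
    return abs(value - expected) > n_sd * sd


def exceeds_pct_rd(original: float, duplicate: float, pct: float) -> bool:
    """Repeatability model '%RD': relative difference between a duplicate
    pair, as a percentage of the pair mean, beyond pct %."""
    mean = (original + duplicate) / 2.0
    return abs(original - duplicate) / mean * 100.0 > pct
```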

Selection of the criteria button 62 causes a criteria box
122 as shown in Figure 7 to be displayed. The criteria
box 122 is usable to define the data to be included in the


CA 02656305 2009-02-24
- 13 -

report and, for this purpose, a project selection box 124
is provided so as to allow a user to include data in the
report from all projects associated with the user or
select particular projects associated with the user, a
date range usable to restrict data in the report based on
dates reported or dates sampled and a specified date range
using date selection fields 126, an elements selection box
128 usable to select one or more elements to be included
in the report, and a report sections drop down box 130
usable to select the report sections to be included in the
report, each report section in this example relating to a
particular test.

Selection of the advanced criteria button 64 causes an
advanced criteria box 140 as shown in Figure 8 to be
displayed, the advanced criteria box 140 including
advanced criteria buttons 142 usable to define the data to
be included in the report with more granularity.

The system may also enable a user to select an appropriate
scale for plots, charts, and so on, created by the system
so that the report data is represented in a more
meaningful way.

Selection of the run/export buttons 66 causes a report box
150 as shown in Figure 9 to be displayed. The report box
150 includes a summary button 152, a view report button
154, an export type drop down box 156, an export report
button 158, and an extract data button 160.

Activation of the summary button 152 causes a report
summary 170 as shown in Figure 10 to be displayed. The
report summary 170 provides an indication of the projects
to be included in the report, and the amount of data to be
included for the selected tests. This allows a user to
obtain an indication of the nature and size of a report
before the report is initiated, and in particular of the
type and quantum of QC data associated with the report.

Activation of the view report button 154 causes the
control unit 20 to send a communication to the report
engine 22 to extract data from the database 12 according
to the report criteria defined using the criteria box 122
and the advanced criteria box 140, and to create a report
according to report creation procedures defined in the
procedures database 24 and including exception profiles
defined by the selected exception profile.

Using the extract data button 160, a user is able to
extract raw data from the system.

As shown in Figure 11, the created report 180 is displayed
in a report window 181 and the report may be navigated
using report navigation icons 182 and report navigation
buttons 184. Each of the report navigation icons 182
includes a heading 186 representing a section of the
report, and, in particular, each test included in the
report has an associated heading. Each heading may be
expanded so as to provide sub-headings which may be used
to navigate to specific sub-sections of the report.

The results 190 of an exemplary sizing test represented in
bar chart form are shown in Figure 12. In this example,
the test relates to pulp, the selected exception
parameter, that is the percentage of pulp passing a given
sieve size, is 90%, and the sieve size is 75µm. The
exception parameter is represented by horizontal exception
line 192. As can be seen, in this example 29.79% of
samples fail the test, although the graph shows an
improvement in pulverizing performance over time.
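
The pass/fail arithmetic behind that figure is simple to reproduce; the sketch below uses the 90% / 75µm parameters from the example and invented per-sample values.

```python
# Each sample fails the sizing test if less than the profile's percentage
# (here 90%) of its material passes the 75 um sieve. Values are invented.
samples_pct_passing = [95.2, 88.1, 91.4, 86.0, 93.7, 89.9, 92.3]  # % < 75 um
threshold = 90.0                                                  # profile limit

failures = [p for p in samples_pct_passing if p < threshold]
pct_failing = 100.0 * len(failures) / len(samples_pct_passing)
print(f"{pct_failing:.2f}% of samples fail the sizing test")
```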

The section of the report relating to contamination is
organized by blank type (coarse or pulp), laboratory
analytical method, and analyte combination. The results
196 of a contamination test represented in bar chart form
are shown in Figure 13. In this example, the test relates
to 17 samples analysed by Lab 1 in relation to element Au
and method FA. The selected exception parameter is 10
times the detection limit and is represented by horizontal
exception line 198. In this test, no samples exceed the
exception line 198 and, accordingly, no samples fail the
test.

The section of the report relating to accuracy is divided
into two summary sections, multiple standards per
laboratory, element and method, and multiple elements per
standard, lab and method.

The results of an accuracy test are shown in a standards
graph 200 shown in Figure 14. The standards graph 200
represents all standards for a laboratory analysed for the
same analyte by the same method. In this example, the
test relates to three standards analysed by Lab 1 for
element Fe using method XRF1. The standards graph
includes guide lines 202 at +/- 10% deviation from an
expected value and guide lines 204 at +/- 20% deviation
from an expected value. The average of the percent
relative difference between analytical values and the
expected value for each standard in a batch is calculated
and plotted against the batch in batch submission order.

It will be understood that the chart provides a display of
the variability of a laboratory over time for a single
element across multiple standards, and any bias affecting
the whole grade range, or a specific standard or grade
range, will be apparent.
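
The per-batch statistic plotted in the standards graph can be sketched as follows; the data and the grouping are invented for illustration, and the percent relative difference formula is the usual one.

```python
from collections import defaultdict

expected_value = 10.0                  # known analyte content of the standard
results = [                            # (batch number, analytical value)
    (1, 10.2), (1, 9.9), (2, 10.6), (2, 10.4), (3, 9.5),
]

by_batch = defaultdict(list)
for batch, value in results:
    pct_rel_diff = (value - expected_value) / expected_value * 100.0
    by_batch[batch].append(pct_rel_diff)

for batch in sorted(by_batch):         # batch submission order
    avg = sum(by_batch[batch]) / len(by_batch[batch])
    print(f"batch {batch}: average % relative difference = {avg:+.1f}%")
```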

An alternative standards graph 210 showing the results of
an accuracy test is shown in Figure 15. The standards
graph 210 represents multiple analytes for a standard,
laboratory and method using box and whisker plots 212.
Each box and whisker plot 212 represents a highest value
214, a lowest value 216, an average value 218, a median
value 220, and upper 222 and lower 224 boundaries of 50%
of the results.
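
Computed with the standard library, the numbers behind one such plot might look like the sketch below; reading the upper and lower boundaries of 50% of the results as the quartiles is an assumption.

```python
import statistics

values = [9.8, 10.1, 10.4, 9.6, 10.0, 10.7, 9.9, 10.2]

q1, q2, q3 = statistics.quantiles(values, n=4)  # quartile cut points
summary = {
    "highest": max(values),
    "lowest": min(values),
    "average": statistics.mean(values),
    "median": q2,
    "upper_50pct_boundary": q3,   # top of the box
    "lower_50pct_boundary": q1,   # bottom of the box
}
print(summary)
```
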
It will be understood that the chart provides a snapshot
of the performance of all of the selected analytes in a
particular standard or laboratory method combination.

An individual run chart 230 showing a temporal plot of
analytical results of an element for a particular standard
is shown in Figure 16. The charts are grouped by
laboratory, standard, method and element, with each
combination shown on a separate plot. Each run chart 230
also includes a 3-point moving average 232, an expected
value line 234, and standard deviation lines 236.
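
The 3-point moving average 232 is straightforward to reproduce; the sketch below assumes a trailing window and uses invented values.

```python
def moving_average(values, window=3):
    """Trailing moving average; the first window-1 points have no value."""
    out = []
    for i in range(len(values)):
        if i < window - 1:
            out.append(None)          # not enough points yet
        else:
            out.append(sum(values[i - window + 1:i + 1]) / window)
    return out

run = [10.1, 9.8, 10.3, 10.6, 9.9, 10.0]
print(moving_average(run))  # [None, None, 10.07, 10.23, 10.27, 10.17] (rounded)
```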

Also shown in Figure 16 is a table 238 which displays, for
the plotted data, the average bias %, CV %, plot % bias,
plot % CV and the % of values outside the control limits.

In the present example shown in Figure 16, the standard is
STD01, the laboratory is Lab 1, the method is FA and the
element is Au.

The section of the report relating to precision is
organized into two categories - field repeatability and
laboratory repeatability, with sub-categories for each
duplicate type (such as blasthole duplicates, RC
duplicates and soil duplicates), element and method.

The results of a field repeatability test are shown in a
linear original vs duplicate scatter chart 240 using a
linear/linear scale and a log scatter chart 242 using a
log/log scale in Figure 17. Guide lines 246 are shown at
-20%, -10%, 0%, +10% and +20%, which present as lines with
slopes of 0.8, 0.9, 1, 1.1 and 1.2 on the linear/linear
chart and parallel lines on the log/log plot.

The field repeatability test results may also be
represented using linear/linear 250 and linear/log 252 %
relative difference vs mean concentration plots as shown
in Figure 18. The plots 250, 252 display the % difference
of an original value derived from a first sample and a
duplicate value from a duplicate sample. Guide lines are
drawn at -20%, -10%, 0, +10% and +20%. These plots will
show whether there is an analytical bias, as it would be
expected that there are approximately the same number of
points above and below the zero line.
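
The plotted quantity, a signed percent relative difference against the pair mean, can be sketched as follows for invented original/duplicate pairs; the sign is kept so that a bias shows as an imbalance about zero.

```python
pairs = [(10.0, 10.4), (5.2, 5.0), (20.1, 19.5), (8.0, 8.1)]  # (original, duplicate)

for original, duplicate in pairs:
    mean = (original + duplicate) / 2.0
    pct_rd = (original - duplicate) / mean * 100.0  # signed, so bias is visible
    print(f"mean={mean:.2f}  %RD={pct_rd:+.1f}")
```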

The field repeatability test results may also be
represented using plots 260 of calculated coefficient of
variation (CV%) versus the order in which the samples were
analysed as shown in Figure 19. This plot shows a time
plot of analytical performance.

An error model plot 270 as shown in Figure 20 may also be
produced. The error model plot 270 displays the
calculated CV% versus the ranking % of the CV% sorted in
ascending order. A table 272 is also displayed showing
the number of pairs available after filtering for data
close to the detection limit or above a specified absolute
value and the associated calculated RMS CV% and robust
CV%. The curved lines 274 represent the pattern that the
CVs should make if the analytical data conforms
respectively to 1, 2, 5, 10, 20, 30, 40 or 50 % relative
error, assuming normally distributed errors.
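
The data behind such an error model plot can be sketched as below. The per-pair CV formula used here, |o - d| / (sqrt(2) x mean), is a common convention and an assumption, as is the RMS summary; neither is quoted from the patent.

```python
import math

pairs = [(10.0, 10.4), (5.2, 5.0), (20.1, 19.5), (8.0, 8.1), (3.3, 3.0)]

cvs = []
for o, d in pairs:
    mean = (o + d) / 2.0
    cv_pct = abs(o - d) / (math.sqrt(2) * mean) * 100.0  # assumed pair CV%
    cvs.append(cv_pct)

cvs.sort()                                  # ascending CV%
for i, cv in enumerate(cvs, start=1):
    ranking_pct = 100.0 * i / len(cvs)      # x-axis of the error model plot
    print(f"ranking {ranking_pct:5.1f}%  CV% = {cv:.2f}")

rms_cv = math.sqrt(sum(cv * cv for cv in cvs) / len(cvs))
print(f"RMS CV% = {rms_cv:.2f}")
```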

For the field repeatability test, a user is able to
define, on an element by element basis, an exception limit
based on a multiple of the analytical detection limit, or
based on a percent relative difference or an expected
relative error model.



An exception model plot 280 for a lower cut value and
based on a percent relative difference is shown in Figure
21. The plot 280 shows exception zones 282 defined by
exception parameters of 10 x the detection limit (10xDL)
represented by vertical exception line 284 and 20%
relative difference represented by horizontal exception
lines 286.
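
Read as a rule, those exception zones combine a lower cut with a relative-difference band: a pair is flagged only when its mean exceeds 10 x the detection limit and its %RD exceeds 20%. The sketch below is that plain reading of the figure, not a quoted rule.

```python
def flag_pair(original, duplicate, detection_limit,
              dl_multiple=10.0, rd_limit_pct=20.0):
    mean = (original + duplicate) / 2.0
    if mean <= dl_multiple * detection_limit:  # below the lower cut: ignore
        return False
    pct_rd = abs(original - duplicate) / mean * 100.0
    return pct_rd > rd_limit_pct               # outside the +/-20% band

print(flag_pair(12.0, 8.0, detection_limit=0.5))  # True: mean 10 > 5, %RD 40
```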

An exception model plot 290 for a lower cut value and
based on an expected relative error model is shown in
Figure 22. The plot 290 shows a control line 292 below
which 90% of the plotted CV values would be expected to
occur given a 1 standard deviation relative error model of 5%.

The results of a lab repeatability test are shown in a
linear original vs duplicate scatter chart 300 using a
linear/linear scale and a log scatter chart 302 using a
log/log scale in Figure 23. Guide lines 304 are shown at
-10%, -5%, 0%, +5% and +10%, which present as lines with
slopes of 0.9, 0.95, 1, 1.05 and 1.1 on the linear/linear
chart and parallel lines on the log/log plot. Laboratory
repeatability is different to field repeatability because
sub-sampling is done after a particle size reduction and,
in general, the sampling errors between duplicates are
much reduced relative to the sampling errors obtained for
field repeatability.

The lab repeatability test results may also be represented
using linear/linear 310 and linear/log 312 % relative
difference vs mean concentration plots as shown in Figure
24. The plots 310, 312 display the % difference of an original value derived
from a first sample and a duplicate value from a duplicate
sample. Guide lines are drawn at -20%, -10%, 0, +10% and
+20%. These plots will show whether there is an
analytical bias, as it would be expected that there are
approximately the same number of points above and below the zero line.

The lab repeatability test results may also be represented
using plots 320 of calculated coefficient of variation
(CV%) versus the order in which the samples were analysed
as shown in Figure 25. This plot shows a time plot of
analytical performance.

Error model plots and exception model plots of the type
shown in Figures 20 and 21 in relation to field
repeatability may also be produced for lab repeatability.

The system is arranged to group lab repeatability results
by laboratory and, using the expandable headings 186 in
the report 180, a user is able to view results specific to
a particular laboratory.

It will be appreciated that the present system allows a
user to quickly obtain a customized accurate and
reproducible QA/QC report based on received QA/QC data and
to obtain the report on-line from a remote location.
Since all processing is carried out at a central location
remote from the users, for example by operators of the
system, the users are required only to have on-line access
to the system, for example through the Internet, in order
to obtain the reports.

It will also be appreciated that the system enables a user
to obtain an automatically structured report, wherein the
structure is determined and modified according to the type
of data to be included in the report. In addition, by
defining user selectable and modifiable pre-defined
exception profiles, the system also ensures that the
exception profiles applied to the data are consistent. In
this way, the format and content of the report is
consistent and reproducible.



It will also be appreciated that the data which forms the
basis of a report is stored in one location only, which
ensures that reports created based on the stored data are
consistent and reproducible.

In a variation of the above embodiment, instead of
facilitating reception of quality control/quality
assurance data from a client, for example by enabling a
client to send data to the system using the web interface,
the system may be configured to retrieve quality
control/quality assurance data from a client database in
response to a request by a user such as in response to
definition by a user of report criteria, automatically
based on predefined criteria, or in any other way. The
retrieved data may be stored in the database 12 for
subsequent use in generating one or more reports, or the
quality control/quality assurance data may be retrieved
directly from a client database when required by the
report engine 22.

An example of an alternative automated reporting system
400 is shown in Figure 26. Like and similar features are
indicated with like reference numerals.

The system 400 comprises a data abstraction layer 404
arranged to communicate directly with a client database
406 in order to retrieve quality control/quality assurance
data from the client database 406 when the data is
required during report creation.

It will be appreciated that with this embodiment the
client database 406 is hosted by a client and, as such,
the quality control/quality assurance data stored in the
client database 406 is under the client's control.

During use, a user associated with a client desiring to
create a report from data stored in the client database 406
connects to the web interface 14 through the Internet 18
using a computing device 16 remotely located relative to
the client, or using a client computing device 410 at the
same location as the client database 406. After
appropriate identification of the user at a log in screen
40 shown in Figure 2, the user is granted access to the
system 400 and is able to view high level information
relating to projects associated with the client. The
project information is stored at the client database 406
and is retrieved from the client database after user login
has occurred. Using the web interface 14, the user is
able to select one or more of the client projects which
are to form the basis of a report.

As with the previous embodiment described in relation to
Figures 1 to 25, a user is able to use the web interface
14 to define criteria for the desired report and to
subsequently view the created report on the web interface
14.
With the present embodiment, when the report criteria have
been defined, the report engine 22 instructs the data
abstraction layer 404 to retrieve from the client database
406 data which is required in order to generate the
desired report according to the defined report criteria,
and the data abstraction layer interacts directly with the
client database 406 over the Internet 18 and extracts the
required data from the client database 406. In order to
facilitate interaction with the client database, a network
interface 408 is provided.
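
One possible shape for such a data abstraction layer is sketched below, using an in-memory SQLite database as a stand-in for the client database 406; the class, table and column names are invented for illustration.

```python
import sqlite3
from typing import List


class DataAbstractionLayer:
    """Retrieves QA/QC rows from a client-hosted database on demand."""

    def __init__(self, connection: sqlite3.Connection):
        self.conn = connection

    def fetch(self, project: str, element: str) -> List[tuple]:
        # Criteria from the report definition drive the query directly,
        # so no copy of the client's data needs to live in the system.
        return self.conn.execute(
            "SELECT sample_id, original, duplicate FROM qaqc_results "
            "WHERE project = ? AND element = ?",
            (project, element),
        ).fetchall()


# In-memory stand-in for a client database 406:
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE qaqc_results (sample_id, project, element, original, duplicate)"
)
conn.execute("INSERT INTO qaqc_results VALUES (1, 'P1', 'Au', 10.0, 10.4)")
dal = DataAbstractionLayer(conn)
print(dal.fetch("P1", "Au"))    # [(1, 10.0, 10.4)]
```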

It will be understood that the report criteria may be
defined using a computing device 16 disposed at a location
remote from the client database 406, or using a client
computing device 410 disposed locally.

The data abstraction layer 404 may have an associated
database 412 for temporarily storing data as the data is
extracted from the client database 406.

The report engine 22 then creates a report using the
extracted data and using any required report application,
such as the chart generator 26. The created report is
then passed back to the web interface 14 through the
control unit 20 and is served to the user's computing
device 16 through the Internet 18 for viewing on the
computing device 16.

It will be appreciated that with this embodiment quality
control/quality assurance data for each client is stored
by the client and data required by the system to create a
report is extracted from the client database when
required. In this way, it is not necessary for the system
to include a database for storing client data. Moreover,
since the client database 406 is queried directly by the
data abstraction layer 404, compatibility issues between
client data and system database requirements are
minimized.

In the claims which follow and in the preceding
description of the invention, except where the context
requires otherwise due to express language or necessary
implication, the word "comprise" or variations such as
"comprises" or "comprising" is used in an inclusive sense,
i.e. to specify the presence of the stated features but
not to preclude the presence or addition of further
features in various embodiments of the invention.

It is to be understood that, if any prior art publication
is referred to herein, such reference does not constitute
an admission that the publication forms a part of the
common general knowledge in the art, in Australia or any
other country.



Modifications and variations as would be apparent to a
skilled addressee are deemed to be within the scope of the
present invention.

Administrative Status

Title                            Date
Forecasted Issue Date            Unavailable
(22) Filed                       2009-02-24
(41) Open to Public Inspection   2010-08-24
Dead Application                 2012-02-24

Abandonment History

Abandonment Date   Reason                                       Reinstatement Date
2011-02-24         FAILURE TO PAY APPLICATION MAINTENANCE FEE

Payment History

Fee Type                                   Anniversary Year   Due Date   Amount Paid   Paid Date
Application Fee                                                          $400.00       2009-02-24
Registration of a document - section 124                                 $100.00       2009-05-06
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
IOGLOBAL PTY LTD
Past Owners on Record
LAWIE, DAVID
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents



Document Description      Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Representative Drawing    2010-07-27          1                 4
Abstract                  2009-02-24          1                 14
Description               2009-02-24          22                866
Claims                    2009-02-24          6                 207
Drawings                  2009-02-24          16                681
Cover Page                2010-08-12          2                 32
Abstract                  2012-01-31          1                 14
Description               2012-01-31          22                866
Claims                    2012-01-31          6                 207
Correspondence            2009-03-26          1                 13
Assignment                2009-02-24          2                 99
Assignment                2009-05-06          4                 141
Correspondence            2009-05-06          4                 109
Correspondence            2009-06-10          1                 15