Patent 2802122 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. The text of the Claims and Abstract is posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 2802122
(54) English Title: METHOD AND CONTROL UNIT FOR CONTROLLING A DISPLAY OF A PROXIMITY WARNING SYSTEM
(54) French Title: PROCEDE ET MODULE DE COMMANDE ADAPTES POUR COMMANDER UN AFFICHAGE D'UN DISPOSITIF AVERTISSEUR DE PROXIMITE
Status: Granted
Bibliographic Data
(51) International Patent Classification (IPC):
  • G08G 1/16 (2006.01)
  • B60R 21/013 (2006.01)
  • H04N 7/18 (2006.01)
(72) Inventors :
  • ROTHACHER, URS MARTIN (Switzerland)
  • STEGMAIER, PETER ARNOLD (Switzerland)
(73) Owners :
  • SAFEMINE AG (Switzerland)
(71) Applicants :
  • SAFEMINE AG (Switzerland)
(74) Agent: MARKS & CLERK
(74) Associate agent:
(45) Issued: 2016-05-31
(86) PCT Filing Date: 2010-06-10
(87) Open to Public Inspection: 2011-12-15
Examination requested: 2015-05-22
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/CH2010/000152
(87) International Publication Number: WO2011/153646
(85) National Entry: 2012-12-10

(30) Application Priority Data: None

Abstracts

English Abstract

The present idea refers to a method and a control unit for controlling a display (19) of a proximity warning system. Vehicles and other objects (4a, 4b, 4c, 5, 6, 7, 8), for example, in a surface mine (1), are equipped with cameras (12) for providing images of different scenes. A control unit (13) of such object (4a, 4b, 4c, 5, 6, 7, 8) receives a signal representing positional information of such object from a radio based positioning receiver (11). Dependent on the positional information, a subset of at least one camera (12) is selected, and a control signal is provided for the display (19) to display images provided by the selected subset of one or more cameras (12). By such method, the most relevant scene in terms of collision avoidance can be displayed to the operator.


French Abstract

La présente invention se rapporte à un procédé et à un module de commande adaptés pour commander un affichage (19) d'un dispositif avertisseur de proximité. Dans une mine en surface (1) par exemple, des véhicules et d'autres objets (4a, 4b, 4c, 5, 6, 7, 8) sont équipés de caméras (12) pour fournir des images de différentes scènes. Un module de commande (13) d'un tel objet (4a, 4b, 4c, 5, 6, 7, 8) reçoit un signal représentant des informations de position d'un tel objet en provenance d'un récepteur de positionnement fonctionnant par radio (11). Sur la base des informations de position, un sous-ensemble d'au moins une caméra (12) est sélectionné, et un signal de commande est généré pour commander à l'affichage (19) d'afficher des images fournies par le sous-ensemble sélectionné d'une ou de plusieurs caméras (12). Grâce au procédé selon l'invention, la scène la plus pertinente en termes d'évitement de collision peut être affichée à l'intention de l'opérateur.

Claims

Note: Claims are shown in the official language in which they were submitted.


The embodiments of the invention in which an exclusive
property or privilege is claimed are defined as follows:
1. A method for controlling a display of a proximity
warning system, comprising:
receiving a signal representing first positional
information of a movable object from a radio based
positioning receiver;
dependent on the first positional information selecting
a subset of at least one camera out of at least two cameras
available for providing images of different scenes; and
providing a control signal for the display to display
images provided by the selected subset of one or more
cameras,
wherein the subset of one or more cameras is selected
subject to the first positional information and subject to
location information of stationary objects stored in an
electronic map.
2. A method according to claim 1, wherein the cameras
available for selection are arranged to provide images of
different sections around the movable object.
3. A method according to claim 2, wherein the cameras and
the radio based positioning receiver are attached to the
same movable object.
4. A method according to claim 1 or 3, wherein second
positional information is received with respect to a second
object,
wherein the second positional information originates
from a radio based positioning receiver of the second
object, and

wherein the subset of at least one camera is selected
based on the first positional information and the second
positional information.
5. A method according to claim 4, wherein a distance
between the objects is determined from the first positional
information and the second positional information, and
wherein the subset of at least one camera is selected based
on the distance.
6. A method according to claim 2, wherein second
positional information is received with respect to a second
object,
wherein the second positional information originates
from a radio based positioning receiver of the second
object, and
wherein the subset of at least one camera is selected
based on the first positional information and the second
positional information.
7. A method according to claim 6, wherein one of the
sections is identified as relevant when mapping the second
positional information to the sections, and wherein the
camera associated with the identified section is selected in
the selection step.
8. A method according to claim 6, wherein a distance
between the objects is determined from the first positional
information and the second positional information, and
wherein the subset of at least one camera is selected based
on the distance.

9. A method according to claim 8, wherein one of the
sections is identified as relevant when mapping the second
positional information to the sections, and wherein the
camera associated with the identified section is selected in
the selection step.
10. A method according to claim 9, wherein the camera
associated with the identified section is selected provided
at least one of the distance between the objects and the
distance to a crossing point of their trajectories is below
a threshold.
11. A method according to claim 2, wherein one of the
sections is identified as relevant when mapping the
stationary object location information to the sections, and
wherein the camera associated with the identified section is
selected in the selection step.
12. A method according to claim 11, wherein a distance
between the movable object and the stationary object is
determined from the first positional information and the
stationary object location information, and wherein the
camera associated with the identified section is selected
provided the determined distance between the movable object
and the stationary object is below a threshold.
13. A method according to claim 1, wherein the movable
object the radio based positioning receiver is assigned to
is different to a second object the cameras available for
selection are attached to,
wherein the cameras available for selection are
arranged to provide images of different sections around the
second object,

wherein one of the sections is identified as relevant
when mapping the first positional information to the
sections, and
wherein the camera associated with the identified
section is selected in the selection step.
14. A method according to any one of claims 1 to 13,
wherein a signal is received from at least one sensor for
measuring the distance to another object by means different
to those of the radio based positioning receiver,
wherein the subset of at least one camera is selected
based on the first positional information and the distance
information provided by the sensor, and
wherein the sensor includes at least one of a radio
detection and ranging device, a light detection and ranging
device, and a sound detection and ranging device.
15. A method according to any one of claims 1 to 14,
wherein in a default mode the control signal is designed for
allowing images provided by all the cameras available to be
displayed, and wherein based on the selection step the
control signal is modified for allowing images provided by
the one or more selected cameras only to be displayed.
16. A method according to any one of claims 1 to 15,
wherein the control signal is provided for the display to
display and highlight the images from the selected subset of
one or more cameras.
17. A method according to any one of claims 1 to 16,
wherein the control signal is designed for triggering one of
an acoustic and a visual warning.

18. A computer readable medium on which is stored computer
program code means which, when loaded in a processor unit of
a control unit, configures the control unit to perform a
method as defined in any one of claims 1 to 17.
19. A control unit for controlling a display of a proximity
warning system, comprising:
a receiving unit for receiving a signal representing
first positional information of a movable object from a
radio based positioning receiver, a selection unit for
selecting a subset of at least one camera out of at least
two cameras available for providing images of different
scenes dependent on the first positional information and
subject to location information of stationary objects stored
in an electronic map, and an output for providing a control
signal to a display for displaying images provided by the
selected subset of one or more cameras.
20. A proximity warning system comprising
a display,
at least two cameras for providing images of different
scenes, and
a control unit for controlling the display, the control
unit comprising a receiving unit for receiving a signal
representing first positional information of a movable
object from a radio based positioning receiver, a selection
unit for selecting a subset of at least one camera out of
the at least two cameras available for providing images of
different scenes dependent on the positional information and
subject to location information of stationary objects stored
in an electronic map, and an output for providing a control
signal to the display for displaying images provided by the selected
subset of one or more cameras.

21. A proximity warning system according to claim 20,
wherein the receiving unit is designed for receiving
positional information of a second object.
22. A proximity warning system according to claim 20 or 21,
comprising a log for logging at least one of the first
positional information and the selected camera signal.
23. A movable object, comprising a proximity warning system
as defined in any one of claims 20 to 22, wherein the at
least two cameras are attached to different locations of the
movable object, and
wherein the movable object is a vehicle, a crane, a
dragline, a haul truck, a digger or a shovel.

Description

Note: Descriptions are shown in the official language in which they were submitted.


CA 02802122 2015-05-22
Method and control unit for controlling a display of a
proximity warning system
Technical Field
The invention relates to a method and a control
unit for controlling a display of a proximity warning
system.
Background Art
Surface mines and similar sites or areas are
generally operated by means of a large number of vehicles,
some of which may be exceedingly large and difficult to
maneuver and have very limited visibility for the
operator.
Collision and/or proximity warning systems are
established for conventional automobiles as well as for
extra-large vehicles.
Proximity warning systems in the form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors, such as three-dimensional distance cameras as proposed in WO 2004/021546 A2. There, it is suggested to provide at least a forward, a backward and a sideward looking camera on a passenger car.
For extra-large vehicles used in mining, WO 2004/047047 A2 suggests using satellite-supported radio positioning receivers on board the vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions. Another approach based on GNSS receivers is disclosed in the International Publication No. WO 2010/142046.

Other approaches for extra-large vehicles are introduced in "Avoiding accidents with mining vehicles", retrieved from the Internet at http://www.flir.com/uploadedFiles/Eurasia/MMC/Appl_Stories/AS 0020 EN.pdf on February 2, 2010. Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.
In non-conventional types of vehicles, such as the vehicles used in mining, each camera may display its image on a display installed in the driving cab. The more cameras are available, the more image information the driver is exposed to, such that the driver may be distracted by images not relevant for collision avoidance, or may be overstrained by monitoring the output of all available cameras.
It is generally desirable to overcome or
ameliorate one or more of the above described
difficulties, or to at least provide a useful alternative.
Summary of Invention
According to the present invention, there is
provided a method for controlling a display of a proximity
warning system, comprising:
receiving a signal representing first positional
information of a movable object from a radio based
positioning receiver;
dependent on the first positional information
selecting a subset of at least one camera out of at least
two cameras available for providing images of different
scenes; and

providing a control signal for the display to
display images provided by the selected subset of one or
more cameras,
wherein the subset of one or more cameras is
selected subject to the first positional information and
subject to location information of stationary objects
stored in an electronic map.
According to the present invention, there is
also provided a computer readable medium on which is
stored computer program code means which, when loaded in a
processor unit of a control unit, configures the control
unit to perform a method as described herein.
According to the present invention, there is
also provided a control unit for controlling a display of
a proximity warning system, comprising:
a receiving unit for receiving a signal
representing first positional information of a movable
object from a radio based positioning receiver, a
selection unit for selecting a subset of at least one
camera out of at least two cameras available for providing
images of different scenes dependent on the first
positional information and subject to location information
of stationary objects stored in an electronic map, and an
output for providing a control signal to a display for
displaying images provided by the selected subset of one
or more cameras.
According to the present invention, there is
also provided a proximity warning system comprising
a display,
at least two cameras for providing images of
different scenes, and
a control unit for controlling the display, the
control unit comprising a receiving unit for receiving a

signal representing first positional information of a
movable object from a radio based positioning receiver, a
selection unit for selecting a subset of at least one
camera out of the at least two cameras available for
providing images of different scenes dependent on the
positional information and subject to location information
of stationary objects stored in an electronic map, and an
output for providing a control signal to the display for
displaying images provided by the selected subset of one or
more cameras.
According to the present invention, there is
also provided a movable object, comprising a proximity
warning system as described herein, wherein the at least
two cameras are attached to different locations of the
movable object, and
wherein the movable object is a vehicle, a
crane, a dragline, a haul truck, a digger or a shovel.
In this respect, it is desired to improve means in a multi-camera-based proximity warning system for drawing the attention of the operator to the most relevant camera output(s).
Accordingly, a signal representing positional
information is received from a radio based positioning
receiver. A subset of at least one camera out of at least
two cameras for providing images of different scenes is
selected dependent on the positional information. A control
signal is provided for the display to display images
provided by the selected subset of one or more cameras.
According to another preferred embodiment of the
present invention, a control unit is provided for controlling
a display. Such control unit comprises a receiving unit for
receiving a signal representing positional information of

an object from a radio based positioning receiver. A selection unit is designed for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes subject to the positional information. At an output of the control unit, a control signal is provided to display images provided by the selected subset of one or more cameras.
The basic idea of the present invention is to provide an aid to the operator as to which of the camera outputs to look at, by prioritizing such camera(s). For this reason, a GNSS receiver is used for determining the present location of the object the cameras are assigned to and/or the location of an object different to the one the cameras are assigned to. The location of an object, specifically presented as coordinates in a chosen coordinate system, may advantageously be subsumed under the term "positional information" as presently used.
In an advantageous scenario, the GNSS receiver and the cameras are attached to the same object. An electronic map of preferably stationary objects being critical to traffic on a site may be stored, and the current position of the object, as identified by the GNSS receiver, may be compared or otherwise put into relation to the position of one or more objects listed in such map. For example, in case the distance between the object and a stationary object listed in the map is or possibly becomes critical, e.g. as determined by subtracting the two location data from each other, it is decided to which one of the cameras to draw the operator's attention, which preferably is the camera that looks in the direction in which the critical object is located.
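This map-based selection can be sketched in code. The following is an illustrative sketch only, not part of the patent: the camera names, the four viewing sectors and the 50 m critical distance are invented assumptions, and flat (x, y) site coordinates stand in for projected GNSS fixes.

```python
import math

# Camera viewing sectors around the vehicle, as (name, start_bearing,
# end_bearing) in degrees clockwise from north. Names and boundaries
# are invented for this example.
CAMERA_SECTORS = [
    ("front", 315.0, 45.0),
    ("right", 45.0, 135.0),
    ("rear", 135.0, 225.0),
    ("left", 225.0, 315.0),
]

def bearing_in_sector(bearing, start, end):
    """True if a bearing (degrees) lies inside a sector, handling the
    wrap-around at north (e.g. the 315..45 degree front sector)."""
    if start <= end:
        return start <= bearing < end
    return bearing >= start or bearing < end

def select_camera(own_pos, map_objects, critical_distance=50.0):
    """Return the camera facing the nearest stored map obstacle that is
    within the critical distance, or None if no obstacle is critical.

    own_pos and map_objects use flat (x, y) site coordinates in metres;
    a real system would derive these from the GNSS positional information.
    """
    best = None  # (distance, bearing) of the nearest critical obstacle
    for ox, oy in map_objects:
        dx, dy = ox - own_pos[0], oy - own_pos[1]
        distance = math.hypot(dx, dy)
        if distance >= critical_distance:
            continue
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0  # clockwise from north
        if best is None or distance < best[0]:
            best = (distance, bearing)
    if best is None:
        return None
    for name, start, end in CAMERA_SECTORS:
        if bearing_in_sector(best[1], start, end):
            return name
    return None
```

For instance, an obstacle 10 m east and 30 m north of the vehicle lies at a bearing of about 18 degrees, which falls in the invented front sector, so the sketch would prioritize the front camera.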
In another advantageous scenario, the GNSS receiver and the cameras are attached to the same object, and other objects, including movable objects, may be equipped with GNSS receivers, too, for determining their respective positions and/or trajectories. Such positional information is broadcast or individually transmitted by

these objects to other objects on the site being equipped with a corresponding receiver. By means of such positional information shared amongst objects on the site, the direction and distance, and also any approaching velocity, may be determined at the present object with respect to one or more other objects around. As soon as one or more of these parameters becomes critical in terms of proximity and/or a collision scenario, it is determined again to which of the cameras the operator's attention should be drawn, which preferably is the camera that looks in the direction in which the critical object is located. Summarizing, by means of other objects, e.g. operated and located on the same site, being equipped with GNSS receivers, too, and an infrastructure enabling these objects to exchange information about their current location, information about the existence, the distance to, and the direction of such other objects in the vicinity can be generated.
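The shared-position scenario above amounts to deriving distance, direction and approaching velocity from two GNSS fixes. A minimal sketch follows; the flat site coordinates and per-object velocity estimates are assumptions for illustration, not details taken from the patent.

```python
import math

def relative_kinematics(own, other):
    """Distance (m), bearing (degrees clockwise from north) and closing
    speed (m/s, positive when approaching) of another object relative to
    the present object. Each argument is (x, y, vx, vy): position and
    velocity in flat site coordinates (illustrative assumption)."""
    dx, dy = other[0] - own[0], other[1] - own[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    # Closing speed: the negative component of the relative velocity
    # along the line of sight means the objects are approaching.
    rvx, rvy = other[2] - own[2], other[3] - own[3]
    closing = -(rvx * dx + rvy * dy) / distance if distance > 0 else 0.0
    return distance, bearing, closing
```

A proximity warning system could then flag the other object as critical when any of the three returned parameters crosses a configured threshold, and select the camera facing the returned bearing as in the previous sketch.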
In another advantageous scenario, the GNSS receiver and the cameras are attached to different objects, the object comprising the cameras not necessarily including a GNSS receiver. However, other objects on a site may be equipped with a GNSS receiver and broadcast or otherwise transmit their positional information to other objects on the site being equipped with a corresponding receiving unit. In case the object equipped with the cameras receives such positional information via its receiving unit, the positional information may be evaluated, and the direction and/or the distance and/or the approaching velocity of near-by or distant objects may be considered as critical in terms of a collision or a pre-collision scenario. Again, the selection step can be implemented the same way as described above, and the display is controlled such that, for the operator of the object the cameras are attached to, emphasis is put on the one or more cameras looking in the direction in which another object is detected.

The general purpose of the selection step is to make the operator focus on the one or more cameras by which a potential danger is currently being filmed, under the assumption that there are at least two cameras available filming different scenes, i.e. preferably different scenes around the object the cameras are attached to. Consequently, it is ensured that the image information being most relevant, especially in terms of proximity warning including collision avoidance, is displayed on the display. The determination which one(s) of the cameras currently monitors the most relevant scene is performed by a selection unit comprised in the control unit. In particular, the location information provided by a GNSS receiver is analyzed in terms of proximity to potentially dangerous objects.
By automatically selecting the camera currently monitoring the scene of most interest and by displaying image information delivered from this camera, the personnel in charge of the safe operation of such object, being e.g. a vehicle, may not be distracted by a multitude of image information but instead may focus on the most relevant image information being displayed.

For advantageous embodiments, reference is made to the dependent claims. It is noted that embodiments referred to or claimed only in connection with the method are deemed to be disclosed in connection with the apparatus, too, and vice versa.
Brief Description of the Drawings
A number of embodiments of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which the figures show:
Fig. 1 a schematic representation of a mining
site,

Fig. 2 a block diagram of a monitoring system
according to an embodiment of the present invention,
Fig. 3 a top view on a schematic vehicle with
cameras mounted according to an embodiment of the present
invention,
Fig. 4 a display,
Fig. 5 a block diagram of another monitoring
system according to an embodiment of the present invention, and
Fig. 6 another display.
Modes for Carrying Out the Invention
In the present application, an "image" is understood as being the output of a camera filming a scene. This can be a camera working in the visible range of light, but also a camera working in the infrared range. Such image typically visualizes the scene on a display. When talking about different images, it is inherently understood that these images are generated by different cameras, typically simultaneously. In this respect, "image information" may include any information provided by such camera, and, in particular, the image itself.
The cameras used provide images of "different scenes". A scene is "different" to another scene whenever the cameras involved do not shoot or scan the same perspective. Cameras may not shoot the same perspective, for example, when they are attached to different locations of an object. In the context of the present application, it is preferred that the at least two cameras are mounted on the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such scenario, it is preferred that the cameras are arranged such that they scan different sides of the object they are attached to, in order to detect other objects in proximity at different or even all sides of the object.

A "section" assigned to a camera is understood as, e.g. when shooting with a camera horizontally, the horizontal area in front of the camera in which the camera is able to monitor scenes, and that consequently can be displayed in an image. Typically, a section of a camera may include a sector.
The "control unit" may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized. Its functional building block, the "selection unit", may also be embodied in hardware, in software or both.
The "display" may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously. The "display" also encompasses the totality of a multitude of separated displays which are, for example, distributed in the driver's cab of a vehicle. Summarizing, the display includes any displaying means for displaying image information delivered by the cameras.
The "control signal to display images" triggers at least displaying the image selected for displaying. The control signal may evoke additional action subject to what is displayed during the normal mode of operation, i.e. the default mode when there is no object in the vicinity detectable: If, for example, the display regularly shows images of a single camera source only, the control signal may cause a switch from displaying images from the current camera source to displaying images from the camera source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person. In some embodiments, the control signal causes the selected images to be highlighted for drawing the attention to the subject images, e.g. by a flashing frame or other means. If, for example, the display by default displays images from various sources, the control signal may cause that the

entire display now displays images only from the selected source. Or, the control signal may cause images from other sources to be shut down, completely masked, or visually downsized in order to emphasize the selected images. The selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen. The control signal may include zooming in on the selected image. The control signal may additionally cause acoustic warnings to be issued. The control signal may either comprise the selected image itself, provided the images are supplied by the cameras to the control unit, or it may cause the subject cameras to directly route the requested image to the display, or it may cause the display to accept only display information from the camera as selected. All of the above holds true also for the selection of multiple images, if appropriate.
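The default-mode behaviour described above (show all feeds by default, then restrict and highlight a selected subset) can be sketched roughly as follows; the command dictionary is an invented stand-in for whatever interface an actual display controller would expose:

```python
def display_command(selected, all_cameras):
    """Build a display control command from a camera selection.

    In the default mode (empty selection, no object in the vicinity) all
    camera feeds are shown; once a subset is selected, only those feeds
    are shown, and they are highlighted to draw the operator's attention.
    The dictionary format is an illustrative assumption.
    """
    if not selected:
        return {"show": list(all_cameras), "highlight": []}
    return {"show": list(selected), "highlight": list(selected)}
```

A variant consistent with the passage would keep all feeds visible and only highlight, mask or downsize the non-selected ones; the essential point is that the same control signal drives both the default and the prioritized presentation.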
A "warning system" and a corresponding "warning" activity may refer to any suitable activity for drawing the attention of the driver or operator to what might be identified as a scene that may become critical in terms of collision or proximity, including selecting an image to be displayed. Such warning system may primarily include the display which the cameras supply with images, but may additionally include acoustic means such as a horn, a diaphone or a speaker, and possibly other visual means such as one or more LEDs, a flashlight, etc. The warning character of the images displayed may be especially emphasized by displaying the images intermittently, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning symbols. Any warning in addition to the warning provided by the bare display of the images or selected images may be issued in combination with the control signal, such that the control signal may activate such additional warnings, too. In other embodiments, a control signal separate from the control signal for the display may be issued subject to range information derived from the positional information delivered by

the one or more GNSS receivers. For example, a first control signal for the display may be issued based on a first threshold condition for the object still being distant with respect to the present object, and a separate control signal for an acoustic warning element may be issued based on a second threshold condition for the object being very close with respect to the present object.
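The two-tier thresholding just described might be sketched as follows; the 100 m and 25 m values are purely illustrative assumptions, not figures from the patent:

```python
def warning_signals(distance, display_threshold=100.0, acoustic_threshold=25.0):
    """Two-tier warning: a display control signal once another object
    crosses an outer threshold, plus a separate acoustic warning signal
    once it comes very close. Threshold values are illustrative only."""
    return {
        "display_warning": distance < display_threshold,
        "acoustic_warning": distance < acoustic_threshold,
    }
```

With these assumed values, an object at 60 m would trigger only the display warning, while one at 10 m would trigger both the display and the acoustic warning.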
The term "radio based positioning system" stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system. The term "GNSS" stands for "Global Navigation Satellite System" and encompasses all satellite based navigation systems, including GPS and Galileo. A "receiver" is a receiver designed for receiving information from satellites and for determining its position subject to the signals received.
A "movable object" is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in a significant manner.
Fig. 1 schematically depicts a site 1, such as a surface mine. Typically, such a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3. A plurality of objects is present in the mine, such as:
- Large vehicles, such as haul trucks 4a, cranes 4b or diggers 4c. Vehicles of this type may easily weigh several 100 tons, and they are generally difficult to control, have very large braking distances, and a

large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
- Medium sized vehicles 5, such as regular trucks. These vehicles are easier to control, but they still have several blind spots and require a skilled driver.
- Small vehicles 6. Typically, vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
- Trains 7.
A further type of object within the mine comprises stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.
The risk of accidents in such an environment
is high. In particular, the large sized vehicles can easily collide with other vehicles or obstacles.
For this reason, objects present in a mine 1 and subject to potential collision may, according to an embodiment, be equipped with at least one GNSS receiver 11, a control unit per object, at least two cameras (not shown in Fig. 1) and a display per object (not shown in Fig. 1). Large objects may carry more than one GNSS receiver 11 per object, as shown in Fig. 1. The entirety of these elements per object for generating a proximity warning is called a monitoring system. The GNSS receivers 11 interact with satellites 30 for determining the positional information of the object they are mounted to.
Figure 2 illustrates a block diagram of a
monitoring system including a control unit 13 according
to an embodiment of the present invention. A receiver 17
of the control unit 13 is connected to cameras 12. An
output 16 of the control unit 13 is connected to a dis-
play 19 and a beeper as warning means. Both connections
may be implemented as wireless connections or as wired
connections. One or more connections can be implemented
via bus connections. Each camera 12 delivers a series of
images with respect to the scene monitored by the respec-

CA 02802122 2012-12-10
WO 2011/153646
PCT/CH2010/000152
11
tive camera 12. Preferably, each of the cameras 12 looks in a different direction for monitoring different scenes with respect to the object these cameras are attached to.
The monitoring system further comprises a ra-
dio based positioning receiver 11, attached to the pre-
sent object. The receiver 11 provides a signal comprising
positional information, i.e. the position of the present
object, determined in combination with satellites 30 as
shown in Figure 1. Such signal may be received by a re-
ceiving unit 15 in the control unit 13.
The control unit 13 comprises a microproces-
sor system 14, which controls the operations of the con-
trol unit 13. A memory 18 comprises programs as well as various parameters, such as unique identifiers of the cameras. Such programs may comprise instructions for
evaluating the positional information, and for selecting
a subset of cameras currently providing the most signifi-
cant image information.
The radio based positioning receiver 11 may provide positional information representing the current location of the object it is attached to. Provided that other moving or stationary objects on the site are equipped with such receivers 11, too, the positional information related to the various objects may be shared between the control units of these objects, such that proximity and even approximation can be detected by comparing positional information stemming from positioning receivers located on different objects. For further details, reference is made to PCT/CH2009/000394, which is incorporated herein by reference.
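The comparison of shared positional information can be sketched in a few lines. The following is an illustrative outline only; the function names, the use of latitude/longitude pairs and the 100 m threshold are assumptions of this sketch, not taken from the patent:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in metres between two positions
    given in decimal degrees, as a GNSS receiver would report them."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def is_proximate(own_pos, other_pos, threshold_m=100.0):
    """Flag a proximity situation when another object's shared position
    lies within the (assumed) threshold distance of the own position."""
    return distance_m(*own_pos, *other_pos) < threshold_m
```

Repeating this check against every position report received over the wireless interface yields the set of currently proximate objects.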
Position information of the present object
provided by the radio based positioning receiver 11 is
transferred to the control unit 13 and evaluated there.
Advantageously, such evaluation takes into account posi-
tional information received from other objects gathered
by their own radio based positioning receivers and trans-

CA 02802122 2012-12-10
WO 2011/153646
PCT/CH2010/000152
12
mitted e.g. by a wireless interface not shown in Figure
2. By way of evaluating the positional information from
these different sources, a proximity situation may be de-
tected. If such a proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying the image from the camera looking in the direction where the proximate object is located. The selected image represents the camera that currently films the proximate object, which is of most interest to be monitored by the operator in order to avoid a collision. This is why image information stemming from this camera is emphasized in being presented to the personnel via the display.
Figure 2 shows an electronic map 40 stored in the control unit 13 which holds location information indicative of stationary objects located on the site where the monitoring system of Figure 2 is in use. The positional information supplied by the GNSS receiver 11 is compared or otherwise put in relation to the location information of the stationary objects. In case sufficient proximity or approximation is detected between these objects, the camera 12 looking into the direction of the stationary object is selected for displaying its image exclusively on the display 19.
In another embodiment, the object is equipped
with another sensor (not shown) for measuring the dis-
tance to another object, such as a radio detection and
ranging device, a light detection and ranging device, or a sound detection and ranging device. A signal is received from such a sensor, and the subset of one or more cameras may additionally be selected based on the distance information provided by such a sensor. There may be
multiple sensors arranged at different sides of a vehi-
cle. These sensors may operate for detecting near-by ob-
jects, and in particular objects not tagged with a GNSS receiver, thereby providing additional information on the surroundings of the vehicle. Such sensors may individually trigger the selection of the camera(s) through the con-

CA 02802122 2012-12-10
WO 2011/153646
PCT/CH2010/000152
13
trol unit 13 and preferably cover similar sectors as the cameras.
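A minimal sketch of how such ranging sensors could feed the selection, assuming one sensor per side covering the same sector as the camera on that side (the names and the trigger distance are illustrative assumptions, not the patent's):

```python
def cameras_triggered_by_sensors(sensor_readings_m, trigger_distance_m=30.0):
    """sensor_readings_m maps a side ('front', 'rear', 'left', 'right') to
    the measured distance in metres, or None when nothing is detected.
    Returns the set of sides whose camera should be selected."""
    return {
        side
        for side, reading in sensor_readings_m.items()
        if reading is not None and reading < trigger_distance_m
    }
```

Because the sensors work without any tag on the detected object, this path also covers obstacles that carry no GNSS receiver.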
Figure 3 illustrates a schematic top view on
a vehicle 6 equipped with four cameras 12, one located at
each side of the vehicle 6, and a single GNSS receiver
11. Sections monitored by each camera 12 are indicated by
sector lines and referred to by 121. This makes each cam-
era 12 scan a different scene at each side of the vehicle
6. Alternatively, the cameras 12 can be located at the
edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the object for proximity and collision detection purposes.
Provided that second positional information is received from an object different to the present vehicle 6, the selection of the camera may be based on the positional information with respect to the present vehicle 6 and such second positional information. Analyzing
the positional information of both of the objects may al-
low identification of the direction the other object is
located at with respect to the vehicle 6, and the dis-
tance between the vehicle and the other object. In case
the other object is located at a position 200 to the left
hand side of the vehicle 6, the section 121 of the left
hand camera 12 is identified as relevant section 121 when
mapping the position of the other object 200 to the sec-
tions 121 of the cameras 12 of the vehicle. For such mapping, it is beneficial to permanently monitor the orientation of the vehicle 6, which may change as the vehicle moves. This may involve e.g. a compass or any other means for determining the orientation of the vehicle with respect to the coordinate system the GNSS makes use of.
The camera 12 associated with the identified section 121 becomes the preferred camera for selection. As a result, this camera 12 will exclusively provide images of this proximity situation to the operator, provided the distance to the object 200 is not so large that any selection is suppressed. The first and second positional information may be used for determining the distance between the ot-

CA 02802122 2012-12-10
WO 2011/153646 -
PCT/CH2010/000152
19
her object and the vehicle. The distance information may
be included in the selection step, and the image of the
camera corresponding to the identified section may only
be selected when the determined distance between the ob-
jects is below a given threshold. Otherwise, it is assumed that the other object is still too far away to justify a warning to the operator.
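Under the assumptions of this sketch (a local east/north grid instead of raw GNSS coordinates, four 90-degree sectors as in Figure 3, and an illustrative warning distance), the mapping and threshold steps described above could look like this:

```python
import math

SECTORS = ["front", "right", "rear", "left"]  # sector centres at 0/90/180/270 deg

def bearing_deg(own, other):
    """Bearing from own to other, positions as (east_m, north_m) pairs
    in a local grid; 0 deg is north, angles grow clockwise."""
    dx, dy = other[0] - own[0], other[1] - own[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def select_camera(own, other, heading_deg, warn_distance_m=100.0):
    """Return the camera sector facing the other object, or None when the
    object is still too far away to justify a warning.  heading_deg is
    the vehicle orientation, e.g. from a compass."""
    if math.hypot(other[0] - own[0], other[1] - own[1]) >= warn_distance_m:
        return None
    relative = (bearing_deg(own, other) - heading_deg) % 360.0
    return SECTORS[int(((relative + 45.0) % 360.0) // 90.0)]
```

The compass correction matters: an object due east selects the right-hand camera of a north-facing vehicle, but the front camera once the vehicle itself faces east.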
Given that a third object 300 is in proximity to the vehicle 6 and given that the second object 200 still is at its position as illustrated in Figure 3, the position of the third object 300 may be determined with respect to the sections 121 of the cameras 12 of the vehicle 6. Hence, the section 121 to the right hand side of the vehicle 6 is identified as the section the object 300 maps to. The right hand side camera 12 is associated with this section 121. For this example, the object 200 now is assumed to be at a distance from the vehicle 6 which justifies issuing a warning to the operator.
Subject to the display/warning strategy, both cameras, i.e. the left hand and the right hand camera 12, may be selected for delivering images to the display, e.g. compared to a default display mode where all four cameras 12 show their images on the display. However, following another strategy, only the object closest to the vehicle 6 shall be displayed. By determining the distances between the vehicle 6 and the objects 200 and 300, only the camera monitoring the closest object will be allowed to display the scene it monitors, i.e. the camera 12 to the right hand side, as the object 300 is closer to the vehicle 6 than the object 200.
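The "closest object only" strategy reduces to picking the minimum distance among the proximate objects already mapped to camera sectors. A sketch under assumed names:

```python
def camera_for_closest(proximate_objects):
    """proximate_objects: list of (sector, distance_m) pairs, one per
    proximate object.  Returns the sector whose camera should display,
    or None when no object is proximate."""
    if not proximate_objects:
        return None
    # The camera facing the nearest object wins the display.
    return min(proximate_objects, key=lambda item: item[1])[0]
```

In the Figure 3 situation this selects the right-hand camera, object 300 being closer than object 200.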
In the above examples, the radio based posi-
tioning receiver 11 always is present at the vehicle 6 /
object the cameras are attached to. In another embodi-
ment, no such radio based positioning receiver 11 is attached to the object holding the cameras. Instead, the
selection of cameras only relies on positional informa-
tion received from other objects. Such positional infor-
mation may be sufficient for selecting the one or more

CA 02802122 2012-12-10
WO 2011/153646
PCT/CH2010/000152
cameras by a mapping step equivalent to the one described in connection with the embodiment above. This holds for other objects providing their position information not in an absolute measure but e.g. in relation to the present object, or to any other known object. Or, preferably, means other than radio based positioning means may be provided for allowing an assessment of the position of the other object with respect to the own position. In case the own position is a priori limited to a rather small area, e.g. when the vehicle may move only within a limited radius, no such additional means are needed at all, as the own position may be known in advance, stored in the control unit and used for putting the position of the other object, provided in absolute coordinates, into relation with its own position.
The display 19 in Figure 4 represents a flat panel display capable of displaying images from e.g. 8 cameras across its screen. Once the control signal is received from the control unit 13, and provided the control signal identifies only one camera 12 for providing the image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the subject image information. In Figure 4, the screen of the display 19 is divided, and the image information selected is displayed on portion 20 of the display. Portion 21 may be reserved for issuing visual warnings, such as a bold "DANGER" symbol or text or other kinds of visual warnings.
The block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the
control signal carrying the image information itself, the
control signal now acts on AND gates 22 each of which AND
gates is connected with one of the cameras 12. By acti-
vating one of the AND gates by a corresponding control
signal, the subject AND gate allows for the associated
camera 12 to provide image information to the display 19,
while, for example, all the other AND gates are blocked
and do not allow for displaying image information from

CA 02802122 2012-12-10
WO 2011/153646
PCT/CH2010/000152
16
the other cameras 12. There is no need for providing a
receiver 17 for the image information in the control unit
13.
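In software terms, the AND-gate arrangement behaves like a per-camera pass gate: the control signal names one camera, and only that camera's frames reach the display. An illustrative sketch with assumed names, not the patent's circuit:

```python
def gate_frames(frames_by_camera, selected_camera):
    """frames_by_camera maps a camera id to its latest frame.  Only the
    frame of the selected camera passes; all other feeds are blocked,
    mirroring one enabled AND gate among several."""
    return {
        cam: frame
        for cam, frame in frames_by_camera.items()
        if cam == selected_camera
    }
```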
Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying information from an assigned camera. In this embodiment, the control signal only highlights, by a blinking frame or similar, the sub-display 192 which displays image information from the camera 12 selected as most critical in terms of a potential collision.
While presently preferred embodiments of the invention are shown and described, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.

Administrative Status

Title Date
Forecasted Issue Date 2016-05-31
(86) PCT Filing Date 2010-06-10
(87) PCT Publication Date 2011-12-15
(85) National Entry 2012-12-10
Examination Requested 2015-05-22
(45) Issued 2016-05-31

Abandonment History

There is no abandonment history.

Maintenance Fee

Last Payment of $263.14 was received on 2023-05-30


 Upcoming maintenance fee amounts

Description Date Amount
Next Payment if small entity fee 2024-06-10 $125.00
Next Payment if standard fee 2024-06-10 $347.00

Note : If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Application Fee $400.00 2012-12-10
Maintenance Fee - Application - New Act 2 2012-06-11 $100.00 2012-12-10
Maintenance Fee - Application - New Act 3 2013-06-10 $100.00 2012-12-10
Registration of a document - section 124 $100.00 2013-04-09
Maintenance Fee - Application - New Act 4 2014-06-10 $100.00 2014-05-23
Request for Examination $800.00 2015-05-22
Maintenance Fee - Application - New Act 5 2015-06-10 $200.00 2015-05-27
Final Fee $300.00 2016-03-16
Maintenance Fee - Patent - New Act 6 2016-06-10 $200.00 2016-06-09
Maintenance Fee - Patent - New Act 7 2017-06-12 $200.00 2017-05-30
Maintenance Fee - Patent - New Act 8 2018-06-11 $200.00 2018-05-28
Maintenance Fee - Patent - New Act 9 2019-06-10 $200.00 2019-05-27
Maintenance Fee - Patent - New Act 10 2020-06-10 $250.00 2020-05-29
Maintenance Fee - Patent - New Act 11 2021-06-10 $255.00 2021-06-07
Maintenance Fee - Patent - New Act 12 2022-06-10 $254.49 2022-05-30
Maintenance Fee - Patent - New Act 13 2023-06-12 $263.14 2023-05-30
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
SAFEMINE AG
Past Owners on Record
None
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents



Document Description  Date (yyyy-mm-dd)  Number of pages  Size of Image (KB)
Abstract 2012-12-10 2 67
Claims 2012-12-10 4 167
Drawings 2012-12-10 4 45
Description 2012-12-10 16 744
Representative Drawing 2012-12-10 1 14
Cover Page 2013-02-07 2 45
Claims 2012-12-11 5 169
Description 2015-05-22 18 803
Claims 2015-05-22 5 160
Claims 2015-10-15 6 203
Description 2015-10-15 18 819
Representative Drawing 2016-04-12 1 6
Cover Page 2016-04-12 2 43
Final Fee 2016-03-16 1 32
Prosecution-Amendment 2012-12-10 6 195
Assignment 2012-12-10 2 113
PCT 2012-12-10 12 394
Assignment 2013-04-09 5 236
Prosecution-Amendment 2015-05-22 13 509
Examiner Requisition 2015-06-26 5 313
Amendment 2015-10-15 19 700