Patent 2810910 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 2810910
(54) English Title: SYSTEM AND METHOD FOR ROTATING A USER INTERFACE FOR A MOBILE DEVICE
(54) French Title: SYSTEME ET PROCEDE POUR FAIRE TOURNER UNE INTERFACE UTILISATEUR POUR DISPOSITIF MOBILE
Status: Granted
Bibliographic Data
(51) International Patent Classification (IPC):
  • H04M 1/03 (2006.01)
  • G06F 1/16 (2006.01)
  • H04M 1/725 (2006.01)
(72) Inventors:
  • DELUCA, MICHAEL JOSEPH (United States of America)
(73) Owners:
  • BLACKBERRY LIMITED (Canada)
(71) Applicants:
  • RESEARCH IN MOTION LIMITED (Canada)
(74) Agent: RIDOUT & MAYBEE LLP
(74) Associate agent:
(45) Issued: 2017-10-31
(86) PCT Filing Date: 2011-09-21
(87) Open to Public Inspection: 2012-03-29
Examination requested: 2013-03-07
Availability of licence: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2011/052611
(87) International Publication Number: WO2012/040363
(85) National Entry: 2013-03-07

(30) Application Priority Data:
Application No. Country/Territory Date
10178685.3 European Patent Office (EPO) 2010-09-23
12/888,915 United States of America 2010-09-23

Abstracts

English Abstract

A system and method (1100) for determining the orientation of a mobile device (100) for displaying a graphical user interface and for activating an audio user interface in response to an incoming call or outgoing call. For the graphical user interface, depending on the detected orientation of the mobile device (100), the graphical user interface can be displayed in a first vertical orientation, a second vertical orientation, a first horizontal orientation, and a second horizontal orientation. For the audio user interface, depending on the detected orientation of the mobile device (100), a speaker (214) and a microphone can be activated based on the detected vertical orientation so that the activated speaker (214) is near the top of the mobile device (100) and the activated microphone is near the bottom of mobile device (100).


French Abstract

L'invention concerne un système et un procédé (1100) qui permettent de déterminer l'orientation d'un dispositif mobile (100) afin d'afficher une interface utilisateur graphique et d'activer une interface utilisateur audio en réponse à un appel entrant ou à un appel sortant. Pour l'interface utilisateur graphique, selon l'orientation détectée du dispositif mobile (100), l'interface utilisateur graphique peut être affichée dans une première orientation verticale, une seconde orientation verticale, une première orientation horizontale et une seconde orientation horizontale. Pour l'interface utilisateur audio, selon l'orientation détectée du dispositif mobile (100), un haut-parleur (214) et un microphone peuvent être activés sur la base de l'orientation verticale détectée, de telle manière que le haut-parleur activé (214) est proche du haut du dispositif mobile (100) et que le microphone activé est proche du bas du dispositif mobile (100).

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS:

1. A mobile device having first and second opposing sides, the mobile device comprising:
a first speaker positioned at the first side;
a second speaker positioned at the second side;
a first microphone positioned at the first side;
a second microphone positioned at the second side;
an orientation component configured to determine an orientation of the mobile device; and
a processor configured to:
activate the first microphone and the second microphone;
receive a first audio signal from the first microphone;
receive a second audio signal from the second microphone;
make a determination, based on comparing the first audio signal to the second audio signal to determine that the second audio signal has a level greater than a level of the first audio signal, that the mobile device is in a first orientation; and
in response to the determination, activate the first speaker and deactivate the first microphone.

2. The mobile device of claim 1 wherein the orientation component comprises an accelerometer.

3. The mobile device of any one of claims 1-2 wherein the first speaker and the first microphone is a first transducer and the second speaker and the second microphone is a second transducer wherein, in the event the orientation component detects the first side is higher than the second side, the processor is further configured to activate the first transducer to function as a speaker and to cause the second transducer to function as a microphone.

4. The mobile device of any one of claims 1-3 wherein the first speaker and the first microphone is a first transducer and the second speaker and the second microphone is a second transducer wherein in the event the orientation component detects the second side is higher than the first side, the processor is further configured to activate the second transducer to function as a speaker and to cause the first transducer to function as a microphone.

5. The mobile device of any one of claims 1-4 wherein the orientation component is an accelerometer.

6. The mobile device of any one of claims 1-5 wherein the mobile device further comprises third and fourth opposing sides, wherein the orientation component is further configured to detect which side is higher than the other sides and the processor is configured to display call information in accordance with the detection of the higher side.

7. A processor-implemented method for a mobile device having first and second opposing sides, the method comprising:
receiving, at a processor, an orientation signal from an orientation component, wherein the orientation signal indicates which of a first side or a second side of the mobile device is higher;
activating, by the processor, a first microphone on the first side and a second microphone on the second side;
receiving, by the processor, a first audio signal from the first microphone;
receiving, by the processor, a second audio signal from the second microphone;
making a determination, by the processor based on comparing the first audio signal to the second audio signal to determine that the second audio signal has a level greater than a level of the first audio signal, that the mobile device is in a first orientation; and
by the processor in response to the determination, activating the first speaker and deactivating the first microphone.

8. The processor-implemented method of claim 7 wherein the orientation component comprises an accelerometer.

9. The processor-implemented method of claim 7 wherein the orientation component comprises one of a gyroscope and a mercury switch.

Description

Note: Descriptions are shown in the official language in which they were submitted.


SYSTEM AND METHOD FOR ROTATING A USER
INTERFACE FOR A MOBILE DEVICE
CLAIM FOR PRIORITY
[0001] This application claims priority to U.S. Application No. 12/888,915,
filed September 23,
2010 and European Application No. 10178685.3, filed September 23, 2010, both
of which are
entitled SYSTEM AND METHOD FOR ROTATING A USER INTERFACE FOR A MOBILE
DEVICE.
FIELD OF TECHNOLOGY
[0002] The present disclosure relates to mobile devices, and more specifically
to rotating a user
interface for a mobile device based on the orientation of the mobile device.
BACKGROUND
[0003] Mobile devices are becoming more prevalent and more advanced. Mobile
devices can
include, but are not limited to, cellular telephones, smart telephones,
wireless personal digital
assistants (PDAs), and laptop computers with wireless 802.11 or Bluetooth
capabilities. These
devices can run on a wide variety of networks from data-only networks such as
Mobitex and
DataTAC networks to complex voice and data networks such as GSM/GPRS, CDMA,
EDGE,
UMTS and CDMA2000 networks. As the technology associated with mobile devices
continues
to advance, users of these devices are becoming more reliant on these devices.
Along with this
reliance, there is an increase in the popularity of touch-sensitive display or touchscreen based mobile devices due to their larger display screens. Typically, these touchscreen
mobile devices are
substantially rectangular having two shorter sides and two longer sides with
the touchscreen
between the four sides with a microphone on one of the shorter sides and a
speaker on the
opposite shorter side. In response to an incoming call or an outgoing call,
call information can be
displayed on the display screen. The call information can be displayed in a
vertical orientation
with the speaker near the top of the mobile device and the microphone near the
bottom of the
mobile device. Thus, when a user attempts to use a mobile device with a
touchscreen to place a
call or receive a call, the user must determine the proper orientation of the
mobile device. For
example, the user has to determine the proper vertical orientation of the
mobile device with the
speaker near the top and the microphone near the bottom. Typically, the user
is able to determine
the proper vertical orientation based on call information displayed by the
user interface.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Embodiments of the present disclosure will now be described, by way of
example only,
with reference to the attached Figures, wherein:
[0005] Figure 1 is a block diagram of a mobile device in a communication
network in accordance
with an exemplary embodiment;
[0006] Figure 2 is a front view of a mobile device in a first vertical
orientation, with the mobile
device having two microphones and two speakers in accordance with an exemplary
embodiment;
[0007] Figure 3 is a front view of a mobile device in a second vertical
orientation, with the mobile
device having two microphones and two speakers in accordance with an exemplary
embodiment;
[0008] Figure 4 is a front view of a mobile device in a first horizontal
orientation, with the mobile
device having two microphones and two speakers in accordance with an exemplary
embodiment;
[0009] Figure 5 is a front view of a mobile device in a second horizontal
orientation, with the mobile
device having two microphones and two speakers in accordance with an exemplary
embodiment;
[0010] Figure 6 is a front view of a mobile device in a first vertical
orientation, with the mobile
device having two transducers in accordance with an exemplary embodiment;
[0011] Figure 7 is a front view of a mobile device in a second vertical
orientation, with the mobile
device having two transducers in accordance with an exemplary embodiment;
[0012] Figure 8 is a front view of a mobile device in a first horizontal
orientation, with the mobile
device having two transducers in accordance with an exemplary embodiment;
[0013] Figure 9 is a front view of a mobile device in a second horizontal
orientation, with the
mobile device having two transducers in accordance with an exemplary
embodiment;
[0014] Figure 10 is a flowchart of a first method for displaying a graphical
user interface and
activating an audio user interface in accordance with an exemplary embodiment;
[0015] Figure 11 is a flowchart of a second method for displaying a graphical
user interface and
activating an audio user interface in accordance with an exemplary embodiment;
and
[0016] Figure 12 is a flowchart of a method for activating an audio user
interface in the event the
orientation component does not provide a definitive orientation of the mobile
device in
accordance with an exemplary embodiment.
DETAILED DESCRIPTION
[0017] As will be appreciated for simplicity and clarity of illustration,
where appropriate,
reference numerals have been repeated among the different figures to indicate
corresponding or
analogous elements. In addition, numerous specific details are set forth in
order to provide a
thorough understanding of the implementations described herein. However, those
of ordinary
skill in the art will understand that the implementations described herein can
be practiced without
these specific details. In other instances, methods, procedures and components
have not been
described in detail so as not to obscure the related relevant feature being
described. Also, the
description is not to be considered as limiting the scope of the
implementations described herein.
[0018] Several definitions that apply throughout this disclosure will now be
presented. The word
"coupled" is defined as connected, whether directly or indirectly through
intervening components,
and is not necessarily limited to physical connections. The term
"communicatively coupled" is
defined as connected, whether directly or indirectly through intervening components, is not necessarily limited to a physical connection, and allows for the transfer of data. The term "mobile
device" is defined as any electronic device that is capable of at least
accepting information entries
from a user and includes the device's own power source. A "wireless
communication" means
communication that occurs without wires using electromagnetic radiation. The
term "memory"
refers to transitory memory and non-transitory memory. For example, non-
transitory memory can
be implemented as Random Access Memory (RAM), Read-Only Memory (ROM), flash,
ferromagnetic, phase-change memory, and other non-transitory memory
technologies. The term
"media" is defined as visual, audio, or combined visual and audio data which
can be outputted by
a mobile device.
[0019] The present disclosure provides a system and method for determining the
orientation of a
mobile device 100 for displaying a graphical user interface and for activating
an audio user
interface in response to an incoming call or outgoing call. For the graphical
user interface,
depending on the detected orientation of the mobile device 100, the graphical
user interface can
be displayed in a first vertical orientation, a second vertical orientation, a
first horizontal
orientation, and a second horizontal orientation. For the audio user
interface, depending on the
detected orientation of the mobile device 100, a speaker and a microphone can
be activated based
on the detected vertical orientation so that the activated speaker is at a
higher short side of the
mobile device 100 and the activated microphone is at a lower short side of the
mobile device 100.
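Read as a whole, the paragraph above describes a single dispatch that runs when a call starts. The following minimal Python sketch is an editorial illustration only; the helper names are invented placeholders rather than the patent's interfaces:

```python
# Hypothetical overview sketch of the flow described above; all function
# names are invented placeholders, not the patent's actual interfaces.
def handle_call_event(orientation_component, display, audio_subsystem) -> None:
    """On an incoming or outgoing call, read the device orientation once and
    configure both the graphical and the audio user interface from it."""
    orientation = orientation_component.detect()      # e.g. "first_vertical"
    display.show_call_information(orientation)        # rotate the graphical UI to match
    if orientation in ("first_vertical", "second_vertical"):
        # Speaker at the higher short side, microphone at the lower short side.
        audio_subsystem.activate_for(orientation)
```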
[0020] Referring to Figure 1, a block diagram of a mobile device in a
communication network in
accordance with an exemplary embodiment is illustrated. As shown, the mobile
device 100 can
include a microprocessor 338 that controls the operation of the mobile device
100, such as
facilitating communications, providing a graphical user interface, executing
programs, and so
forth. A communication subsystem 311 performs communication transmission and
reception with
the wireless network 319. The microprocessor 338 further can be coupled with
an auxiliary
input/output (I/O) subsystem 328 that can be coupled to the mobile device 100.
Additionally, in
at least one embodiment, the microprocessor 338 can be coupled to a serial
port (for example, a
Universal Serial Bus port) 330 that facilitates communication with other
devices or systems via
the serial port 330. A display 322 can be communicatively coupled to the
microprocessor 338 to
facilitate display of information to an operator of the mobile device 100.
When the mobile device
100 is equipped with a keyboard 332, which may be physical or virtual (e.g.,
displayed), the
keyboard 332 can be communicatively coupled to the microprocessor 338. The
mobile device
100 can include one or more speakers 334 and one or more microphones 336,
which may
advantageously be communicatively coupled to the microprocessor 338 and are
discussed in further
detail below. Additionally, a vibrator 360, such as a vibrator motor, can be
communicatively
coupled to the microprocessor 338 to generate vibrations in the mobile device
100. Other similar
components can be provided on or within the mobile device 100 and are
optionally
communicatively coupled to the microprocessor 338. Other communication
subsystems 340 and
other communication device subsystems 342 are generally indicated as
communicatively coupled
with the microprocessor 338. An example of a communication subsystem 340 is a
short-range
communication system such as a BLUETOOTH communication module or a WIFI
communication module (a communication module in compliance with IEEE 802.11b)
and
associated circuits and components. Additionally, the microprocessor 338 can
perform operating
system functions and execute programs or software applications on the mobile
device 100. In
some embodiments, not all of the above components are included in the mobile
device 100. The
auxiliary I/O subsystem 328 can take the form of one or more different
navigation tools (multi-
directional or single-directional), external display devices such as
keyboards, and other
subsystems capable of providing input or receiving output from the mobile
device 100.
[0021] The mobile device 100 can be equipped with components to enable
operation of various
programs, as shown in Figure 1. As shown, the memory 324 can provide storage
for the
operating system 350, device programs 358, data, and so forth. The operating
system 350 can be
generally configured to manage other programs 358 that are also stored in
memory 324 and
executable on the processor 338. The operating system 350 can handle requests
for services
made by programs 358 through predefined program 358 interfaces. More
specifically, the
operating system 350 can typically determine the order in which multiple
programs 358 are
executed on the processor 338 and the execution time allotted for each program
358, manage the sharing of memory 324 among multiple programs 358, handle input and output to
and from other
device subsystems 342, and so forth. In addition, operators can interact
directly with the
operating system 350 through a user interface, typically including the
keyboard 332 and display
screen 322. The operating system 350, programs 358, data, and other
information can be stored
in memory 324, RAM 326, read-only memory (ROM), or another suitable storage
element (not
shown). An address book 352, personal information manager (PIM) 354, and other
information
356 can also be stored.
[0022] The mobile device 100 can be enabled for two-way communication within
voice, data, or
voice and data communication systems. A Subscriber Identity Module (SIM) or
Removable User
Identity Module (RUIM) can be utilized to authorize communication with the
communication
network 319. A SIM/RUIM interface 344 within the mobile device 100 can
interface a
SIM/RUIM card to the microprocessor 338 and facilitate removal or insertion
of a SIM/RUIM
card (not shown). The SIM/RUIM card features memory and can hold key
configurations 351,
and other information 353 such as identification and subscriber related
information. The mobile
device 100 can be equipped with an antenna 318 for transmitting signals to the
communication
network 319 and another antenna 316 for receiving communication from the
communication
network 319. Alternatively, a single antenna (not shown) can be utilized to
transmit and receive
signals. A communication subsystem 311 can include a transmitter 314 and
receiver 312, one or
more antennae 316, 318, local oscillators (LOs) 313, and a processing module
320 such as a
digital signal processor (DSP) 320.
[0023] The mobile device 100 can include a touch-sensitive display or
touchscreen 224 that
includes one or more touch location sensors 364, an overlay 226, and a display
322, such as a
liquid crystal display (LCD) or light emitting diode (LED) display, such as
shown in Figure 2.
The touch location sensor(s) 364 can be a capacitive, resistive, infrared,
surface acoustic wave
(SAW), or other type of touch-sensitive sensor and can be integrated into the
overlay 226. The
overlay 226, or cover, can be comprised of laminated glass, plastic, or other
suitable material(s)
and is advantageously translucent or transparent. A touch, or touch contact,
can be detected by
the touchscreen 224 and processed by the processor 338, for example, to
determine a location of
the touch. Touch location data can include the center of the area of contact
or the entire area of
contact for further processing. A touch may be detected from a contact member,
such as a body
part of a user, for example a finger or thumb, or other objects, for example a
stylus, pen, or other
pointer, depending on the nature of the touch location sensor.
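As a small illustration of the touch-location processing mentioned above, the hypothetical sketch below (not part of the patent text) reduces a set of sampled contact points to the centre of the contact area:

```python
# Hypothetical sketch: reducing touch location data to the centre of the
# contact area, one of the options mentioned in the paragraph above.
def contact_centre(contact_points: list[tuple[float, float]]) -> tuple[float, float]:
    """Return the centroid of the contact points reported by the touch
    location sensor(s) 364 (x, y in display coordinates)."""
    if not contact_points:
        raise ValueError("no contact points reported")
    xs, ys = zip(*contact_points)
    return sum(xs) / len(xs), sum(ys) / len(ys)
```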
[0024] Referring to Figures 2-9, front views of a mobile device in vertical
and horizontal
orientations in accordance with exemplary embodiments are illustrated. As
shown, the mobile
device 100 can include a substantially rectangular frame or body 202 having a
first short side 204,
a second short side 206, a first long side 208, and a second long side 210.
The frame 202 can be
a single structure or formed using multiple structures. The first short side
204 and second short
side 206 can be on opposite sides of each other. The first long side 208 and
second long side 210
can be on opposite sides of each other. A touchscreen 224 can be interposed
between the first
short side 204, the second short side 206, the first long side 208, and the
second long side 210.
The mobile device 100 can include audio components including at least one
speaker and at least
one microphone.
[0025] Referring to Figures 2-5, front views of a mobile device having two
speakers and two
microphones in vertical and horizontal orientations in accordance with
exemplary embodiments
are illustrated. As shown, the mobile device 100 can include a first speaker
214, a second speaker
216, a first microphone 218, and a second microphone 220. The first speaker
214 and the second
speaker 216 can be on opposite sides of each other and on the short sides of
the mobile device
100. For example, the first speaker 214 can be on the first short side 204 and
the second speaker
216 can be on the second short side 206. The first microphone 218 and the
second microphone
220 can be on opposite sides of each other and on the short sides of the
mobile device 100. For
example, the first microphone 218 can be on the first short side 204 and the
second microphone
220 can be on the second short side 206. In one or more embodiments, a speaker
and a
microphone can be paired to form an audio pairing, with the speaker and
microphone being on
opposite sides of each other. For example, a first audio pairing can include
the first speaker 214
on the first short side 204 and the second microphone 220 on the second short
side 206 and a
second audio pairing can include the second speaker 216 on the second short
side 206 and the
first microphone 218 on the first short side 204.
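The two audio pairings described above can be captured in a simple lookup, as in this hypothetical sketch; the identifiers echo the reference numerals but are otherwise invented:

```python
# Hypothetical sketch of the two audio pairings; each pairing places the
# speaker and the microphone on opposite short sides of the device.
AUDIO_PAIRINGS = {
    "first_pairing": {"speaker": "first_speaker_214", "microphone": "second_microphone_220"},
    "second_pairing": {"speaker": "second_speaker_216", "microphone": "first_microphone_218"},
}

def pairing_for_higher_side(first_short_side_is_higher: bool) -> dict:
    # The active pairing keeps the speaker at the higher short side.
    return AUDIO_PAIRINGS["first_pairing" if first_short_side_is_higher else "second_pairing"]
```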
[0026] As shown in Figures 2-5, each audio component 214, 216, 218, 220 is
shown on a top
surface 222 of the mobile device 100. Although the audio components 214, 216,
218, 220 are
shown on the top surface 222, one or more audio components 214, 216, 218, 220
can be on or
about one or more of the top surface 222, side surface, bottom surface, or any combination thereof. In one or more embodiments, the mobile device can include more or fewer audio
components. As discussed below, depending on the orientation of the mobile
device 100, each
audio component 214, 216, 218, 220 can be activated or deactivated.
[0027] Referring to Figures 6-9, front views of a mobile device, having two
transducers, in
vertical and horizontal orientations in accordance with exemplary embodiments
are illustrated. As
shown, the mobile device can include a first transducer 402 at about the first
short side 204 of the
mobile device 100 and a second transducer 404 at about the second short side
206 of the mobile
device 100. Although the first transducer 402 and the second transducer 404
are shown as being
on the top surface, the first transducer 402, the second transducer 404, or
both can be on the top
surface 222, side, or bottom surface of the mobile device 100. As discussed
below, depending on
the orientation of the mobile device 100, each transducer 402, 404 can
function as a speaker or a
microphone.
[0028] The mobile device 100 can include one or more orientation components
366 to detect the
orientation of the mobile device 100. An orientation component 366 can detect
which short side
204, 206 is higher than the other short side 204, 206. For example, the first
short side 204 of the
mobile device 100 is higher or substantially higher than the second short
side 206 as shown in
Figures 2 and 6 and the second short side 206 of the mobile device 100 is
higher or substantially
higher than the first short side 204 as shown in Figures 3 and 7. The first vertical orientation and the second vertical orientation can be approximately 180° apart. In one or more
embodiments, the orientation component 366 can detect a horizontal
orientation, a first horizontal
orientation or a second horizontal orientation. In the horizontal orientation,
the first short side
204 and the second short side 206 can be even with each other or substantially
even with each
other. For example, the mobile device 100 can be in a first horizontal
orientation with the first
short side 204 on the left and the second short side 206 on the right as shown
in Figures 4 and 8
or can be in a second horizontal orientation with the second short side 206 on
the left and the first
short side 204 on the right as shown in Figures 5 and 9. Although the vertical orientations are shown 180° apart, each orientation can include a range; for example, the range can be +/- 180° for the audio user interface and, for the vertical orientations and horizontal orientations, the range can be +/- 45° for the graphical user interface.
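One way the +/- 45° ranges could be applied is sketched below. This is a hypothetical illustration: it assumes an accelerometer-derived "up" direction expressed in device axes (x toward the first short side 204, y toward the first long side 208), and the axis convention and function name are assumptions, not the patent's.

```python
import math

def classify_orientation(up_x: float, up_y: float) -> str:
    """Bucket the in-plane 'up' direction (opposite of gravity) into one of
    the four orientations using +/- 45 degree ranges."""
    angle = math.degrees(math.atan2(up_y, up_x)) % 360.0
    if angle < 45 or angle >= 315:
        return "first_vertical"       # first short side 204 is highest
    if angle < 135:
        return "first_horizontal"     # first long side 208 is highest
    if angle < 225:
        return "second_vertical"      # second short side 206 is highest
    return "second_horizontal"        # second long side 210 is highest
```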
[0029] The orientation component can include one or more accelerometers, gyroscopes, mercury switches, any combination thereof, or any other device or devices that can detect which short side 204, 206 is higher than the other short side 204, 206, detect which side 204, 206, 208, 210 is higher than the other sides 204, 206, 208, 210, or both. In one or more
embodiments, the mobile
device 100 can include a manual switch (not shown) which can set the user
interface in a single
orientation. For example, a graphical user interface can be set to be
displayed in a first vertical
orientation, a second vertical orientation, a first horizontal orientation, or
a second horizontal
orientation. For the audio user interface, a speaker 214, 216 and a microphone 218, 220 can be activated based on a first vertical orientation or a second vertical orientation.
[0030] By knowing the orientation of the mobile device 100, a graphical user
interface can be
displayed on the display 322 and an audio user interface can be activated by
activating one or
more speakers and one or more microphones in accordance with the determined
vertical
orientation of the mobile device 100. The graphical user interface can cause
the display of
information, such as an image or message, based on the determined orientation
of the mobile
device 100. For example, as shown in Figures 2-9, the information, "INCOMING
CALL Bob
Smith 555-555-1234," is displayed in accordance with the determined
orientation. Thus, if the
mobile device 100 of Figure 2 is rotated 90° clockwise, then the information is rotated 90° clockwise and displayed as shown in Figure 5. If the mobile device 100 of Figure 5 is rotated 90° clockwise, then the information is rotated 90° clockwise and is displayed as shown in Figure 3. If the mobile device 100 of Figure 3 is rotated 90° clockwise, then the information is rotated 90° clockwise and is displayed as shown in Figure 4. If the mobile device 100 of Figure 4 is rotated 90° clockwise, then the information is rotated 90° clockwise and is displayed as shown in Figure 2. Similarly, if the mobile device 100 of Figure 6 is rotated 90° clockwise, then the information is rotated 90° clockwise and displayed as shown in Figure 9. If the mobile device 100 of Figure 9 is rotated 90° clockwise, then the information is rotated 90° clockwise and is displayed as shown in Figure 7. If the mobile device 100 of Figure 7 is rotated 90° clockwise, then the information is rotated 90° clockwise and is displayed as shown in Figure 8. If the mobile device 100 of Figure 8 is rotated 90° clockwise, then the information is rotated 90° clockwise and is displayed as shown
in Figure 6. As the mobile device 100 is rotated, the orientation component
366 can detect the
orientation of the mobile device 100 and display the information in accordance
with the
orientation of the mobile device 100.
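The redraw-on-rotation behaviour described above could be wired up as in the following hypothetical, event-driven sketch; the display API, the callback, and the orientation-to-rotation mapping are invented for illustration:

```python
# Hypothetical sketch: redraw the call information whenever the orientation
# component 366 reports a new orientation. The rotation values assume the
# first vertical orientation is the unrotated reference.
ROTATION_FOR_ORIENTATION = {
    "first_vertical": 0,
    "second_horizontal": 90,
    "second_vertical": 180,
    "first_horizontal": 270,
}

class CallInfoView:
    def __init__(self, display):
        self.display = display
        self.text = "INCOMING CALL Bob Smith 555-555-1234"

    def on_orientation_changed(self, orientation: str) -> None:
        # Rotate the rendered text so it stays upright for the user.
        self.display.draw_text(self.text,
                               rotation_deg=ROTATION_FOR_ORIENTATION[orientation])
```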
[0031] By knowing the orientation of the mobile device 100, an audio user
interface can be
enabled by activating one or more microphones and activating one or more
speakers. For
example, if the orientation component 366 determines that the mobile device
100 is in the first
vertical orientation as shown in Figure 2, then the first speaker 214 and the
second microphone
220 can be activated. In addition, the second speaker 216 and first microphone
220 can remain
deactivated. If the orientation component 366 determines that the mobile
device 100 is in the
second vertical orientation as shown in Figure 3, then the second speaker 216
and first
microphone 218 can be activated. In addition, the first speaker 214 and the
second microphone
220 can remain deactivated. If mobile device 100 contains transducers, based
on the orientation,
one transducer can be configured to function as a speaker and the other
transducer can be
configured to function as a microphone. For example, in the event the
orientation component 366
determines that the mobile device 100 is in a first vertical orientation as
shown in Figure 6, the
first transducer 402 can be configured to function as a speaker and the second
transducer 404 can
be configured to function as a microphone. In another example, in the event
the orientation
component 366 determines that the mobile device 100 is in a second vertical
orientation as shown
in Figure 7, the first transducer 402 can be configured to function as a
microphone and the second
transducer 404 can be configured to function as a speaker.
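The activation rules in this paragraph can be summarised as a small routing table. The sketch below is illustrative only and uses invented identifiers for the components:

```python
# Hypothetical sketch of the audio activation rules above, for both the
# two-speaker/two-microphone variant and the two-transducer variant.
def route_audio(orientation: str, has_transducers: bool) -> dict:
    if orientation == "first_vertical":       # first short side 204 is higher
        if has_transducers:
            return {"speaker_role": "first_transducer", "microphone_role": "second_transducer"}
        return {"activate": ["first_speaker_214", "second_microphone_220"],
                "keep_off": ["second_speaker_216", "first_microphone_218"]}
    if orientation == "second_vertical":      # second short side 206 is higher
        if has_transducers:
            return {"speaker_role": "second_transducer", "microphone_role": "first_transducer"}
        return {"activate": ["second_speaker_216", "first_microphone_218"],
                "keep_off": ["first_speaker_214", "second_microphone_220"]}
    # Horizontal: defer audio activation until the device is rotated vertical.
    return {"activate": [], "keep_off": []}
```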
[0032] In the event the mobile device 100 is in a horizontal orientation as shown in Figures 4, 5, 8, or 9, then, once the orientation of the mobile device 100 is detected as being in a vertical orientation, one or more of the audio components 214, 216, 218, 220 can
be activated
accordingly in response to an incoming call or outgoing call. For example, in
the event the mobile
device 100 of Figure 4 is in the telephone mode and the mobile device 100 is
rotated to the first
vertical orientation as shown in Figure 2, then the first speaker 214 and
second microphone 220
can be activated. In addition, the second speaker 216 and first microphone 218
can remain
deactivated. In another example, in the event the mobile device 100 of Figure 4 is in
the telephone mode
and the mobile device 100 is rotated to the second vertical orientation as
shown in Figure 3, then
the second speaker 216 and first microphone 218 can be activated. In addition,
the first speaker
214 and second microphone 220 can remain deactivated. In another example, in
the event the
mobile device 100 of Figure 4 is rotated to the first vertical orientation as
shown in Figure 6, then
the first transducer 602 can be configured to function as a speaker and the
second transducer 604
can be configured to function as a microphone. In yet another example, in the
event the mobile
device 100 of Figure 4 is rotated to the second vertical orientation as shown
in Figure 7, then the
first transducer 602 can be configured to function as a microphone and the
second transducer 604
can be configured to function as a speaker. Regardless of the determined
orientation of the
mobile device 100, in the event the mobile device is in a music playing mode,
then one or more
speakers can be activated. For example, both speakers 214, 216 shown in
Figures 2-5 can be
activated or both transducers 402, 404 shown in Figures 6-9 can be configured
to function as
speakers.
[0033] Referring to Figure 10, a flowchart of a first method for displaying a
graphical user
interface and activating an audio user interface in accordance with an
exemplary embodiment is
illustrated. This exemplary method 1000 can be used when the processor 338 is
configured to
only determine two orientations based on the information provided from the
orientation
component 366 which can be from an orientation signal. The two orientations
can be when the
first short side 204 is higher than the second short side 206 and the second
short side 206 is
higher than the first short side 204. In the event the first short side 204 and the second short side 206 are exactly even or substantially even, the processor 338 can have a
default mode such as
the first short side 204 being higher than the second short side 206 and thus
a user of the mobile
device 100 can recognize the proper orientation based on how the call information is displayed, for example with the call information shown oriented with the first short side
204 being higher than the second short side 206 as shown in Figures 2 and 5.
In other
embodiments, one or more other defaults can be used such as the method shown
in Figure 12.
The exemplary method 1000 is provided by way of example, as there are a
variety of ways to
carry out the method. The method 1000 described below can be carried out using
the
communication devices and communication network shown in Figures 1-9 by way of
example,
and various elements of these figures are referenced in explaining exemplary
method 1000. Each
block shown in Figure 10 represents one or more processes, methods or
subroutines carried out in
exemplary method 1000. The exemplary method 1000 may begin at block 1002.
[0034] At block 1002, the orientation component is triggered. For example, in
response to an
incoming call or an outgoing call, the processor 338 can trigger the
orientation component 366 to
determine which short side 204, 206 is higher. After triggering the
orientation component 366,
the method 1000 can proceed to block 1004.
[0035] At block 1004, the speaker at the higher short side is activated and a
microphone at the
other short side is activated. The processor 338 can receive an orientation
signal from the
orientation component 366 with the orientation signal indicating whether the
first short side 204 is
higher than the second short side 206 as shown in Figures 2 and 6 or in the
event the second short
side 206 is higher than the first short side 204 as shown in Figures 3 and 7.
For example, the
processor 338 can activate the first speaker 214 and activate the second microphone 220 of
Figure 2. In another example, the first transducer 602 can be configured to
function as a speaker
and the second transducer 604 can be configured to function as a microphone.
In addition, the
non-activated audio components can remain non-activated. For example, in the
event the first
side 204 is higher than the second short side 206 as shown in Figure 2, then
the second speaker
216 and the first microphone 218 remain non-activated. After activating the
speaker at the
higher short side and the microphone at the other short side, the method 1000
proceeds to block
1006.
[0036] At block 1006, call information is displayed in accordance with the
vertical orientation of
the mobile device. The processor 338 can cause the display of the call
information in accordance
with the vertical orientation of the mobile device 100. For example, as shown
in Figures 2 and 6,
the processor 338 can cause the display of the call information, "INCOMING
CALL Bob Smith
555-555-1234," in accordance with the vertical orientation of mobile device,
for example, the
first short side 204 being higher than the second short side 206. For example,
as shown in
Figures 3 and 7, the processor 338 can cause the display of the call
information, "INCOMING
CALL Bob Smith 555-555-1234," in accordance with the vertical orientation of
mobile device, for
example, the second short side 206 being higher than the first short side 204.
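Blocks 1002-1006 of method 1000 can be summarised in a short sketch. The default-orientation handling follows the description above; the helper objects and their methods are invented for illustration:

```python
# Hypothetical sketch of the two-orientation flow of Figure 10
# (blocks 1002, 1004 and 1006).
def method_1000(orientation_component, audio, display) -> None:
    higher_side = orientation_component.higher_short_side()  # "first", "second" or None
    if higher_side is None:
        # Default mode: treat the first short side 204 as the higher side.
        higher_side = "first"
    # Block 1004: speaker at the higher short side, microphone at the other side.
    audio.activate_pairing(speaker_side=higher_side)
    # Block 1006: call information drawn for the chosen vertical orientation.
    display.show_call_information(vertical_orientation=higher_side)
```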
[0037] Referring to Figure 11, a flowchart of a second method for displaying a
graphical user
interface and activating an audio user interface in accordance with an
exemplary embodiment is
illustrated. This exemplary method 1100 can be used when the processor 338 is
configured to
determine four orientations based on the information provided from the
orientation component
366 which can be from an orientation signal. The four orientations can be when
the first short
side 204 is higher than the rest of the sides 206, 208, 210, when the second
short side 206 is
higher than the rest of the sides 204, 208, 210, when the first long side 208
is higher than the rest
of the sides 204, 206, 210, and when the second long side 210 is higher than
the rest of the sides
204, 206, 208. In this exemplary method 1100, the call information can be
displayed in
accordance with the orientation and in the event the orientation is horizontal
(one of the long
sides 208, 210 is higher than the other sides 204, 206, 208, 210), then the
activation of a speaker
and microphone can be delayed while the user rotates the mobile device 100
into a vertical
orientation (the first short side 204 being higher than the second short side
206 or the second
short side 206 being higher than the first short side 204). The exemplary
method 1100 is
provided by way of example, as there are a variety of ways to carry out the
method. The method
1100 described below can be carried out using the communication devices and
communication
network shown in Figures 1-9 by way of example, and various elements of these
figures are
referenced in explaining exemplary method 1100. Each block shown in Figure 11
represents one
or more processes, methods or subroutines carried out in exemplary method
1100. The
exemplary method 1100 may begin at block 1102.
[0038] At block 1102, the orientation component is triggered. For example, in
response to an
incoming call or an outgoing call, the processor 338 can trigger the
orientation component 366 to
determine which short side 204, 206 is higher. In another example, the
orientation component
366 can be triggered again after call information "INCOMING CALL Bob Smith 555-
555-1234,"
is displayed in accordance with a horizontal orientation as shown in Figures 4, 5, 8, and 9. After
triggering the orientation component 366, the method 1100 can proceed to block
1104.
[0039] At block 1104, a determination is made whether a short side is the
higher side. For
example, the processor 338 can receive the orientation signal from the
orientation component 366
with the orientation signal indicating which side 204, 206, 208, 210 is higher
than the other sides
204, 206, 208, 210. In the event one of the short sides 204, 206 is higher as
shown in Figures 2,
3, 6, and 7, then the method 1100 can proceed to block 1106. In the event one
of the long sides
208, 210 is higher as shown in Figures 4, 5, 8, and 9, then the method 1100
can proceed to block
1110.
[0040] At block 1106, the speaker at the higher short side is activated and a
microphone at the
other short side is activated. The processor 338 can receive an orientation
signal from the
orientation component 366 with the orientation signal indicating whether the
first short side 204 is
higher than the second short side 206 as shown in Figures 2 and 6 or in the
event the second short
side 206 is higher than the first short side 204 as shown in Figures 3 and 7.
For example, the
processor 338 can activate the first speaker 214 and activate the second microphone 220 of
Figure 2. In another example, the first transducer 602 can be configured to
function as a speaker
and the second transducer 604 can be configured to function as a microphone.
In addition, the
non-activated audio components can remain non-activated. For example, in the
event the first
side 204 is higher than the second short side 206 as shown in Figure 2, then
the second speaker
216 and the first microphone 218 remain non-activated. After activating the
speaker at the
higher short side and the microphone at the other short side, the method 1100
proceeds to block
1108.
[0041] At block 1108, call information is displayed in accordance with the
vertical orientation of
the mobile device. The processor 338 can cause the display of the call
information in accordance
with the vertical orientation of the mobile device 100. For example, as shown
in Figures 2 and 6,
the processor 338 can cause the display of the call information, "INCOMING
CALL Bob Smith
555-555-1234," in accordance with the vertical orientation of mobile device,
for example, the
first side 204 being higher than the second side 206. For example, as shown in
Figures 3 and 7,
the processor 338 can cause the display of the call information, "INCOMING
CALL Bob Smith
555-555-1234," in accordance with the vertical orientation of mobile device,
for example, the
second side 206 being higher than the first side 204. After displaying the
call information, the
method can proceed to block 1102.
[0042] At block 1110, call information is displayed in accordance with the
horizontal orientation
of the mobile device. The processor 338 can cause the display of the call
information in
accordance with the horizontal orientation of the mobile device 100. For
example, as shown in
Figures 5 and 9, the processor 338 can cause the display of the call
information, "INCOMING
CALL Bob Smith 555-555-1234," in accordance with the horizontal orientation of
mobile device,
for example, the first long side 208 being higher than the second long side
210. For example, as
shown in Figures 4 and 8, the processor 338 can cause the display of the call
information,
"INCOMING CALL Bob Smith 555-555-1234," in accordance with the vertical
orientation of
mobile device, for example, the second long side 210 being higher than the
first side 208.
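Method 1100 (blocks 1102-1110) can likewise be sketched as a loop that defers audio activation while the device is horizontal. The helpers below are invented placeholders, and in practice the loop would be driven by re-triggering the orientation component rather than by busy polling:

```python
# Hypothetical sketch of the four-orientation flow of Figure 11.
def method_1100(orientation_component, audio, display) -> None:
    while True:
        higher = orientation_component.higher_side()  # one of the four sides
        if higher in ("first_short", "second_short"):
            # Blocks 1106 and 1108: activate audio and draw the call
            # information for the detected vertical orientation.
            audio.activate_pairing(speaker_side=higher)
            display.show_call_information(orientation=higher)
            return
        # Block 1110: horizontal display only; keep audio deferred and
        # re-trigger the orientation component (back to block 1102) until
        # the user rotates the device into a vertical orientation.
        display.show_call_information(orientation=higher)
```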
[0043] Referring to Figure 12, a flowchart of a method for activating an audio
user interface in
the event the orientation component does not provide a definitive orientation
of the mobile device
in accordance with an exemplary embodiment is illustrated. This exemplary
method 1200 can be
used when the detected orientation is indeterminate, for example in the event
the mobile device
100 is lying flat with no side 204, 206, 208, 210 being higher than the other
sides 204, 206, 208,
210. In alternate embodiments the process can eliminate the need for an
accelerometer or other
device that utilizes gravity to determine orientation. The exemplary method
1200 is provided by
way of example, as there are a variety of ways to carry out the method. The
method 1200
described below can be carried out using the communication devices and
communication network
shown in Figures 1-9 by way of example, and various elements of these figures
are referenced in
explaining exemplary method 1200. Each block shown in Figure 12 represents one
or more
processes, methods or subroutines carried out in exemplary method 1200. The
exemplary method
1200 may begin at block 1202.
[0044] At block 1202, a phone call is started. For example, the mobile device
100 can receive a
phone call or can initiate a phone call. After the phone call is started, the method 1200 can
proceed to block 1204.
[0045] At block 1204, both speakers and both microphones are turned on or
activated. For
example, the processor 338 can activate the first speaker 214, the second
speaker 216, the first
microphone 218, and the second microphone 220. After activating
the first speaker
214, the second speaker 216, the first microphone 218, and the second
microphone 220, the
method 1200 can proceed to block 1206. In the event the mobile device 100 has
a first
transducer 602 and a second transducer 604, then the processor 338 can
configure both
transducers 602, 604 to function as microphones. After activating both
speakers 214, 216 and
both microphones 218, 220, the method can proceed to block 1206. In the event
both
transducers 602, 604 are configured to function as microphones, the method
1200 can proceed to
block 1208 (not shown).
[0046] At block 1206, the received audio is played through both speakers. For
example, in the
event audio from another communication device, such as a mobile phone or
landline telephone, is
received by the mobile device 100, the processor 338 can play the audio via
the first speaker 214
and the second speaker 216. After playing the received audio, the method 1200
can proceed to
block 1208.
[0047] At block 1208, audio is processed or received on both microphones. For
example, in the
event audio is received at the first microphone 218, the second microphone 220
or both, the
processor 338 can receive the audio and process the received audio.
Processing the received
audio can include the processor 338 determining the signal to noise ratio
(SNR) for each
microphone 218, 220. After processing the audio, the method 1200 can proceed
to block 1210.
[0048] At block 1210, a determination is made as to whether the audio level at
one microphone is
substantially greater than the audio level at the other microphone; this inequality would
correspond to the user speaking with one microphone located closer to the
mouth of the user than
the other microphone. For example, the processor 338 can determine if the
audio, for example,
the SNR, is substantially greater at the first microphone 218 or the second
microphone 220. The
processor 338 can compare the SNR associated with the first microphone 218
with the SNR
associated with the second microphone 220 to determine if one is substantially
greater than the
other. In the event one SNR is not substantially greater, for example, the two SNRs are within a predetermined range, then the method 1200 can proceed to block 1208. In the
event that one SNR
is substantially greater, for example, one SNR is above a predetermined range
higher than the
other SNR, the method 1200 can proceed to block 1212. In alternate
embodiments, audio level
determining approaches other than SNR are anticipated. Such approaches include
signal to
interference or total signal level. Furthermore, the audio level determination
may be enhanced by
performing the determination by either subtracting a signal substantially
equivalent to the audio
produced by the speakers and received by the microphones from the audio signal
received by the
microphones, or by performing the determination during relative quiet portions
of audio produced
by the speakers, or by using a combination thereof.
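The comparison in block 1210 amounts to checking whether one microphone's level exceeds the other's by a margin. The sketch below is a hypothetical rendering of that test; the SNR inputs and the margin value are illustrative assumptions:

```python
# Hypothetical sketch of the block 1210 comparison. The device is treated as
# oriented with the louder microphone near the user's mouth (at the bottom)
# only when the levels differ by more than an assumed margin.
def infer_vertical_orientation(snr_first_mic: float, snr_second_mic: float,
                               margin_db: float = 6.0):
    """Return "first_vertical" when the second microphone (220) is clearly
    louder, "second_vertical" when the first microphone (218) is clearly
    louder, or None when the levels are within the margin (keep sampling)."""
    if snr_second_mic - snr_first_mic > margin_db:
        return "first_vertical"    # user's mouth nearer the second short side 206
    if snr_first_mic - snr_second_mic > margin_db:
        return "second_vertical"   # user's mouth nearer the first short side 204
    return None
```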
[0049] At block 1212, the microphone at the other side is turned off and the
adjacent speaker is
turned off. For example, based on the SNR, the processor 338 can turn off the
microphone 218,
220 with the lower SNR and can turn off the speaker 214, 216 adjacent to the
microphone 218,
220 that is not turned off. For example, in the event the processor 338
determines that the mobile
device 100 is in the first vertical orientation as shown in Figure 2, the
processor 338 can turn off
the first microphone 218 and can turn off the second speaker 216, and in the
event the processor
338 determines that the mobile device 100 is in the second vertical
orientation as shown in Figure
3, the processor 338 turns off the second microphone 220 and turns off the
first speaker 214. In
the event the mobile device 100 has a first transducer 602 and a second
transducer 604, then the
processor 338 can configure the transducers 602, 604 to function based on the
orientation, for
example, the first vertical orientation as shown in Figure 6 or the second
vertical orientation as
shown in Figure 7. In the event the processor 338 determines the mobile device
100 is in the first
vertical orientation, then the first transducer 602 can be configured to
function as a speaker and
the second transducer can be configured to function as a microphone.
Conversely, in the event
the processor 338 determines the mobile device 100 is in the second vertical
orientation, then the
first transducer 602 can be configured to function as a microphone and the
second transducer can
be configured to function as a speaker. After turning off the appropriate
components, the method
1200 can proceed to block 1214.
[0050] At block 1214, the phone call is continued.
[0051] When a mobile device 100 having multiple speakers 214, 216 and multiple
microphones
218, 220 is used to make or receive a phone call, the user must determine the
proper orientation
of the mobile device 100. When the mobile device 100 includes a touchscreen
224, the user of
the mobile device 100 can experience difficulty in determining the proper
orientation. For example, the user must decide which way to hold the mobile device 100 so that the side with the active speaker 214, 216 is higher than the side with the active microphone 218, 220. By including an orientation
component 366 in the mobile device 100, the mobile device 100 and more
specifically the
processor 338 can automatically determine how to display a graphical user
interface and activate
the audio user interface based on the orientation of the mobile device 100.
Thus, the user does
not need to determine a proper orientation since the mobile device 100
determines the orientation
based on the position of the mobile device 100. As a result, the user can
respond to a call or
make a call more quickly and thereby shorten the length of the call. By
reducing the length of the
call, the user of the mobile device 100 is able to reduce congestion on the
network.
[0053] Reference was made above in detail to implementations of the
technology. Each
example was provided by way of explanation of the technology only, not as a
limitation of the
technology. It will be apparent to those skilled in the art that various
modifications and
variations can be made in the present technology without departing from the
scope of the
technology. For instance, features described as part of one implementation can
be used on
another implementation to yield a still further implementation. Thus, it is
intended that the
present technology cover such modifications and variations that come within
the scope of the
technology.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Administrative Status, Maintenance Fee and Payment History, should be consulted.


Title Date
Forecasted Issue Date 2017-10-31
(86) PCT Filing Date 2011-09-21
(87) PCT Publication Date 2012-03-29
(85) National Entry 2013-03-07
Examination Requested 2013-03-07
(45) Issued 2017-10-31

Abandonment History

There is no abandonment history.

Maintenance Fee

Last Payment of $263.14 was received on 2023-09-15


 Upcoming maintenance fee amounts

Description Date Amount
Next Payment if standard fee 2024-09-23 $347.00
Next Payment if small entity fee 2024-09-23 $125.00

Note : If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Patent fees are adjusted on the 1st of January every year. The amounts above are the current amounts if received by December 31 of the current year.
Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Payment History

Fee Type Anniversary Year Due Date Amount Paid Paid Date
Request for Examination $800.00 2013-03-07
Registration of a document - section 124 $100.00 2013-03-07
Registration of a document - section 124 $100.00 2013-03-07
Application Fee $400.00 2013-03-07
Maintenance Fee - Application - New Act 2 2013-09-23 $100.00 2013-03-07
Maintenance Fee - Application - New Act 3 2014-09-22 $100.00 2014-09-08
Maintenance Fee - Application - New Act 4 2015-09-21 $100.00 2015-09-04
Maintenance Fee - Application - New Act 5 2016-09-21 $200.00 2016-08-31
Registration of a document - section 124 $100.00 2017-05-12
Maintenance Fee - Application - New Act 6 2017-09-21 $200.00 2017-09-06
Final Fee $300.00 2017-09-18
Maintenance Fee - Patent - New Act 7 2018-09-21 $200.00 2018-09-17
Maintenance Fee - Patent - New Act 8 2019-09-23 $200.00 2019-09-13
Maintenance Fee - Patent - New Act 9 2020-09-21 $200.00 2020-09-11
Maintenance Fee - Patent - New Act 10 2021-09-21 $255.00 2021-09-17
Maintenance Fee - Patent - New Act 11 2022-09-21 $254.49 2022-09-16
Maintenance Fee - Patent - New Act 12 2023-09-21 $263.14 2023-09-15
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
BLACKBERRY LIMITED
Past Owners on Record
RESEARCH IN MOTION LIMITED
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents

List of published and non-published patent-specific documents on the CPD.

If you have any difficulty accessing content, you can call the Client Service Centre at 1-866-997-1936 or send them an e-mail at CIPO Client Service Centre.


Document Description   Date (yyyy-mm-dd)   Number of pages   Size of Image (KB)
Abstract 2013-03-07 2 76
Claims 2013-03-07 4 141
Representative Drawing 2013-03-07 1 23
Description 2013-03-07 18 968
Drawings 2013-03-07 12 258
Cover Page 2013-05-21 2 51
Claims 2015-02-24 3 117
Claims 2015-12-07 11 436
Claims 2016-09-23 3 83
Description 2016-09-23 18 969
Final Fee 2017-09-18 1 50
Representative Drawing 2017-10-04 1 12
Cover Page 2017-10-04 1 47
Assignment 2013-03-07 13 607
PCT 2013-03-07 4 178
Prosecution-Amendment 2014-08-27 2 69
Amendment 2016-09-23 12 409
Prosecution-Amendment 2015-02-24 13 497
Prosecution-Amendment 2015-06-10 3 225
Amendment 2015-12-07 16 614
Examiner Requisition 2016-03-23 5 337