Patent 2917542 Summary

Third-party information liability

Some of the information on this Web page has been provided by external sources. The Government of Canada is not responsible for the accuracy, reliability or currency of the information supplied by external sources. Users wishing to rely upon this information should consult directly with the source of the information. Content provided by external sources is not subject to official languages, privacy and accessibility requirements.

Claims and Abstract availability

Any discrepancies in the text and image of the Claims and Abstract are due to differing posting times. Text of the Claims and Abstract are posted:

  • At the time the application is open to public inspection;
  • At the time of issue of the patent (grant).
(12) Patent: (11) CA 2917542
(54) English Title: MOTION EVENT RECOGNITION AND VIDEO SYNCHRONIZATION SYSTEM AND METHOD
(54) French Title: SYSTEME ET PROCEDE DE RECONNAISSANCE D'EVENEMENT DE MOUVEMENT ET DE SYNCHRONISATION VIDEO
Status: Granted and Issued
Bibliographic Data
(51) International Patent Classification (IPC):
  • G06F 3/01 (2006.01)
  • H04N 21/242 (2011.01)
  • H04N 21/80 (2011.01)
  • H04W 4/00 (2018.01)
(72) Inventors :
  • BENTLEY, MICHAEL (United States of America)
  • KAPS, RYAN (United States of America)
  • BOSE, BHASKAR (United States of America)
  • ALAM, SHEEHAN (United States of America)
  • GILLIAN, MICHAEL (United States of America)
  • ABDEL-RAHMAN, MAZEN (United States of America)
(73) Owners :
  • BLAST MOTION INC.
(71) Applicants :
  • BLAST MOTION INC. (United States of America)
(74) Agent: SMITHS IP
(74) Associate agent: OYEN WIGGS GREEN & MUTALA LLP
(45) Issued: 2022-08-16
(86) PCT Filing Date: 2015-04-21
(87) Open to Public Inspection: 2015-10-29
Examination requested: 2020-04-14
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): Yes
(86) PCT Filing Number: PCT/US2015/026896
(87) International Publication Number: WO 2015164389
(85) National Entry: 2016-01-05

(30) Application Priority Data:
Application No. Country/Territory Date
14/257,959 (United States of America) 2014-04-21

Abstracts

English Abstract

Enables recognition of events within motion data obtained from portable wireless motion capture elements and video synchronization of the events with video, as the events occur or at a later time, based on the location and/or time of the event or both. May use a camera integrated in, or external to, a mobile device to automatically generate generally smaller event videos on the mobile device or a server. Also enables analysis or comparison of movement associated with the same user, another user, a historical user or a group of users. Provides low memory and power utilization and greatly reduces storage for video data that corresponds to events such as a shot, move or swing of a player, a concussion of a player or other medical related events, or events such as the first steps of a child or falling events.


French Abstract

The present invention relates to a system and method enabling the recognition of events within motion data obtained from portable wireless motion capture elements, and the video synchronization of those events with video as the events occur or at a later time, based on the location and/or time of the event, or both. The invention may use a camera integrated in, or external to, a mobile device to automatically generate generally smaller event videos of the event on the mobile device or a server. The invention also enables the analysis or comparison of movement associated with the same user, another user, a historical user or a group of users. The invention provides low memory and power utilization and greatly reduces the storage of video data corresponding to events such as a shot, move or swing of a player, a concussion of a player or other medical related events, or events such as the first steps of a child or falling events.

Claims

Note: Claims are shown in the official language in which they were submitted.


CLAIMS
1. A motion event recognition and video synchronization system comprising:
    at least one motion capture element configured to couple with a user or piece of equipment or mobile device coupled with the user, wherein said at least one motion capture element comprises
        a memory;
        a sensor configured to capture any combination of values associated with an orientation, position, velocity and acceleration of said at least one motion capture element;
        a radio;
        a microcontroller coupled with said memory, said sensor and said radio, wherein said microcontroller is configured to
            collect data that comprises sensor values from said sensor;
            store said data in said memory;
            analyze said data and recognize an event within said data to determine event data;
            transmit said event data associated with said event via said radio;
    a mobile device comprising
        a computer;
        a wireless communication interface configured to communicate with said radio to obtain said event data associated with said event;
    wherein said computer is coupled with said wireless communication interface, wherein said computer is configured to
        receive said event data from said wireless communication interface;
        analyze said event data to form motion analysis data;
        store said event data, or said motion analysis data, or both said event data and said motion analysis data;
        obtain an event start time and an event stop time of said event from said event data;
        request image data captured with a camera comprising a video captured at least during a timespan from said event start time to said event stop time;
        display an event video on a display comprising both of
            said event data, said motion analysis data or any combination thereof that occurs during said timespan from said event start time to said event stop time, and
            said video captured during said timespan from said event start time to said event stop time; and,
    wherein said computer is further configured to
        discard at least a portion of said video outside of said event start time to said event stop time as non-event related video based on said motion analysis data, and
        store said event video
            without said portion of said video outside of said event start time to said event stop time, and
            with said motion analysis data that occurs from said event start time to said event stop time.

2. The system of claim 1 wherein said computer is further configured to determine a clock difference between said at least one motion capture element and said mobile device and synchronize said motion analysis data with said video.

3. The system of claim 1 wherein said computer is further configured to save said video from said event start time to said event stop time with said motion analysis data that occurs from said event start time to said event stop time without said portion of said video outside of said event start time to said event stop time to a server computer external to said computer.

4. The system of claim 1 wherein said system further comprises a server computer remote to said mobile device and wherein said server computer is configured to save said video from said event start time to said event stop time with said motion analysis data that occurs from said event start time to said event stop time and return said video captured during said timespan from said event start time to said event stop time to said computer in said mobile device.

5. The system of claim 1 wherein said computer is further configured, while a communication link is open between said at least one motion capture element and said mobile device, to discard at least a portion of said video outside of said event start time to said event stop time, and to save said video from said event start time to said event stop time with said motion analysis data that occurs from said event start time to said event stop time.

6. The system of claim 1 wherein said computer is further configured, while a communication link is not open between said at least one motion capture element and said mobile device, to save said video, and after said event is received after said communication link is open, to then discard at least a portion of said video outside of said event start time to said event stop time and to save said video from said event start time to said event stop time with said motion analysis data that occurs from said event start time to said event stop time.

7. The system of claim 1 wherein said video is obtained from a camera coupled with said mobile device.

8. The system of claim 1 wherein said video is obtained from a camera remote from said mobile device.

9. The system of claim 1 wherein said video is obtained from a server computer remote to said mobile device.

10. The system of claim 1 wherein said at least one motion capture element is further configured to be worn near the user's head and wherein said recognize said event within said data comprises calculation of a location of impact on the user's head.

11. The system of claim 1 wherein said at least one motion capture element is further configured to couple with a helmet or hat or cap or mouthpiece.

12. The system of claim 1 wherein said at least one motion capture element is further configured to couple with a helmet and wherein said recognize said event within said data comprises calculation of the location of impact on the user's head based on the physical geometry of the helmet.

13. The system of claim 1 further comprising an isolator configured to surround said at least one motion capture element to approximate physical acceleration dampening of cerebrospinal fluid around said user's brain to minimize translation of linear acceleration and rotational acceleration of said event data to obtain an observed linear acceleration and an observed rotational acceleration of the user's brain.

14. The system of claim 1 wherein said computer is further configured to synchronize said video and said event data, or said motion analysis data, via image analysis to more accurately determine a start event frame or stop event frame in said video or both, that is most closely associated with said event start time or said event stop time or both.

15. The system of claim 1 wherein said system further comprises a server computer, wherein said server computer is configured to synchronize said video and said event data, or said motion analysis data, via image analysis to more accurately determine a start event frame or stop event frame in said video or both, that is most closely associated with said event start time or said event stop time or both.

16. The system of claim 1 wherein said computer is further configured to
    access previously stored event data or motion analysis data associated with said user or piece of equipment;
    wherein said display said event video further comprises a presentation of said event data associated with said user on a display based on
        said event data or motion analysis data associated with said user or piece of equipment
        and
            said previously stored event data or motion analysis data associated with said user or piece of equipment
        or
            said previously stored event data or motion analysis data associated with at least one other user or other piece of equipment.

17. The system of claim 1 wherein said microcontroller in said at least one motion capture element is further configured to transmit said event to at least one other motion capture element or at least one other mobile device or any combination thereof, and wherein said at least one other motion capture element or said at least one other mobile device or any combination thereof is configured to save data associated with said event based on said event start time and said event stop time.

18. The system of claim 1 further comprising:
    an identifier coupled with said at least one motion capture element or said user or said piece of equipment;
    wherein said computer is further configured to
        receive said identifier; and
        associate said identifier with said event data and said motion analysis data.

19. The system of claim 1 wherein said at least one motion capture element further comprises a light emitting element configured to output light if said event occurs.

20. The system of claim 1 wherein said at least one motion capture element further comprises an audio output element configured to output sound if said event occurs or if said at least one motion capture element is out of range of said computer, or wherein said computer is configured to display an alert if said at least one motion capture element is out of range of said computer, or any combination thereof.

21. The system of claim 1 wherein said at least one motion capture element further comprises a location determination element configured to determine a location that is coupled with said microcontroller and wherein said microcontroller is further configured to transmit said location to said computer, or wherein said system further comprises a server computer and wherein said microcontroller is further configured to transmit said location to said server computer and wherein said computer or server computer is further configured to form at least one event video from portions of at least one video based on said location and said event start time and said event stop time.

22. The system of claim 1 wherein said microcontroller or said computer is further configured to determine a location of said event or wherein said microcontroller and said computer are further configured to determine the location of said event and correlate the location.

23. The system of claim 1 wherein said computer is further configured to request at least one image or video that contains said event from at least one camera proximal to said event or from a server computer.

24. The system of claim 1 wherein said computer is further configured to broadcast a request for camera locations proximal to said event or oriented to view said event.

25. The system of claim 1 wherein said computer is further configured to display a list of one or more times at which said event has occurred or wherein one or more events has occurred.

26. The system of claim 1 wherein said at least one motion capture element is physically coupled with said mobile device.

27. The system of claim 1 wherein the microcontroller is coupled with a temperature sensor and wherein said microcontroller is further configured to transmit a temperature obtained from the temperature sensor as a temperature event.

28. The system of claim 1 wherein said event data comprises
    motion data associated with said at least one motion capture element coupled with any combination of said user, or said piece of equipment or said mobile device
    or
    motion data indicative of standing, walking, falling, heat stroke, a seizure, violent shaking, a concussion, a collision, abnormal gait, abnormal or non-existent breathing
    or
    any combination thereof.

Description

Note: Descriptions are shown in the official language in which they were submitted.


MOTION EVENT RECOGNITION AND VIDEO SYNCHRONIZATION
SYSTEM AND METHOD
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
[001] One or more embodiments pertain to the field of motion capture data analysis and displaying information based on events recognized within the motion capture data or within motion analysis data associated with a user or piece of equipment, and/or based on previous motion analysis data from the user or other user(s) and/or piece of equipment. More particularly, but not by way of limitation, one or more embodiments enable a motion event recognition and video synchronization system and method that enables recognition of events within motion data, including but not limited to a shot, move or swing of a player, a concussion of a player, boxer, rider or driver, or a heat stroke, hypothermia, seizure, asthma attack, epileptic attack or any other sporting or physical motion related event including walking and falling. Motion events may be correlated and/or otherwise synchronized with image(s) or video, as the events happen or at a later time based on the location and/or time of the event or both, for example on the mobile device or on a remote server, and as captured from internal/external camera(s) or a nanny cam, for example to enable saving video of the event, such as the first steps of a child, violent shaking events, sporting, military or other motion events including concussions, or falling events associated with an elderly person, and for example discarding non-event related video data, to greatly reduce storage requirements for event videos.

DESCRIPTION OF THE RELATED ART
[002] Existing motion capture systems process and potentially store enormous amounts of data with respect to the actual events of interest. For example, known systems capture accelerometer data from sensors coupled to a user or piece of equipment and analyze or monitor movement. In these scenarios, thousands or millions of motion capture samples are associated with the user at rest or not moving in a manner that is related to a particular event that the existing systems are attempting to analyze. For example, if monitoring a football player, a large amount of motion data is not related to a concussion event; for a baby, a large amount of motion data is not related in general to a shaking event or non-motion event such as sudden infant death syndrome (SIDS); for a golfer, a large amount of motion data captured by a sensor mounted on the player's golf club is of low acceleration value, e.g., associated with the player standing or waiting for a play or otherwise not moving or accelerating in a manner of interest. Hence, capturing, transferring and storing non-event related data increases requirements for power, bandwidth and memory.

[003] In addition, video capture of a user performing some type of motion may include even larger amounts of data, much of which has nothing to do with an actual event, such as a swing of a baseball bat or a home run. There are no known systems that automatically trim video, e.g., save event related video and discard non-event related video, to generate smaller video segments that correspond to the events that occur in the video, for example as detected through analysis of the motion capture data.

[004] Some systems that are related to monitoring impacts are focused on linear acceleration related impacts. These systems are unable to monitor rotational accelerations or velocities and are therefore unable to detect certain types of events that may produce concussions. In addition, many of these types of systems do not produce event related, connectionless messages for low power and longevity considerations. Hence, these systems are limited in their use based on their lack of robust characteristics.

[005] Known systems also do not contemplate data mining of events within motion data to form a representation of a particular movement, for example a swing of an average player or average professional player level, or any player level based on a function of events recognized within previously stored motion data. Thus, it is difficult and time consuming, and requires manual labor, to find, trim and designate particular motion related events for use in virtual reality for example. Hence, current systems do not easily enable a particular user to play against a previously stored motion event of the same user or other user, along with a historical player for example. Furthermore, known systems do not take into account cumulative impacts, for example with respect to data mined information related to concussions, to determine if a series of impacts may lead to impaired brain function over time.

[006] Other types of motion capture systems include video systems that are directed at analyzing and teaching body mechanics. These systems are based on video recording of an athlete and analysis of the recorded video. This technique has various limitations, including inaccurate and inconsistent subjective analysis based on video for example. Another technique includes motion analysis, for example using at least two cameras to capture three-dimensional points of movement associated with an athlete. Known implementations utilize a stationary multi-camera system that is not portable and thus cannot be utilized outside of the environment where the system is installed, for example during an athletic event such as a golf tournament or football game, or to monitor a child or elderly person. In general, video based systems do not also utilize digital motion capture data from sensors on the object undergoing motion, since they are directed at obtaining and analyzing images having visual markers instead of electronic sensors. These fixed installations are extremely expensive as well.

[007] Regardless of the motion capture data obtained, the data is generally analyzed on a per user or per swing basis that does not contemplate processing on a mobile phone, so that a user would only buy a motion capture sensor and an "app" for a pre-existing mobile phone. In addition, existing solutions do not contemplate mobile use, analysis and messaging and/or comparison to or use of previously stored motion capture data from the user or other users, or data mining of large data sets of motion capture data, for example to obtain or create motion capture data associated with a group of users, for example professional golfers, tennis players, baseball players or players of any other sport, to provide events associated with a "professional level" average or exceptional virtual reality opponent. To summarize, motion capture data is generally used for immediate monitoring or sports performance feedback, and generally has had limited and/or primitive use in other fields.

[008] Known motion capture systems generally utilize several passive or active markers or several sensors. There are no known systems that utilize as little as one visual marker or sensor and an app, that for example executes on a mobile device that a user already owns, to analyze and display motion capture data associated with a user and/or piece of equipment. The data is generally analyzed in a laboratory on a per user or per swing basis, is not used for any other purpose besides motion analysis or representation of motion of that particular user, and is generally not subjected to data mining.

[009] There are no known systems that allow for motion capture elements such as wireless sensors to seamlessly integrate or otherwise couple with a user or shoes, gloves, shirts, pants, belts, or other equipment, such as a baseball bat, tennis racquet, golf club, mouth piece for a boxer, football or soccer player, or protective mouthpiece utilized in any other contact sport, for local analysis or later analysis, in such a small format that the user is not aware that the sensors are located in or on these items. There are no known systems that provide seamless mounts, for example in the weight port of a golf club or at the end shaft near the handle, so as to provide a wireless golf club configured to capture motion data. Data derived from existing sensors is not saved in a database for a large number of events and is not used relative to anything but the performance at which the motion capture data was acquired.

[0010] In addition, for sports that utilize a piece of equipment and a ball, there are no known portable systems that allow the user to obtain immediate visual feedback regarding ball flight distance, swing speed, swing efficiency of the piece of equipment or how centered an impact of the ball is, i.e., where on the piece of equipment the collision with the ball has taken place. These systems do not allow for users to play games with the motion capture data acquired from other users, or historical players, or from their own previous performances. Known systems do not allow for data mining motion capture data from a large number of swings to suggest or allow the searching for better or optimal equipment to match a user's motion capture data, and do not enable original equipment manufacturers (OEMs) to make business decisions, e.g., improve their products, compare their products to other manufacturers, up-sell products or contact users that may purchase different or more profitable products.

[0011] In addition, there are no known systems that utilize motion capture data mining for equipment fitting and subsequent point-of-sale decision making for instantaneous purchasing of equipment that fits an athlete. Furthermore, no known systems allow for custom order fulfillment such as assemble-to-order (ATO) of sporting equipment, for example equipment that is built to customer specifications based on motion capture data mining, and shipped to the customer to complete the point of sales process, for example during play or virtual reality play.

[0012] In addition, there are no known systems that use a mobile device and RFID tags for passive compliance and monitoring applications.

[0013] There are no known systems that enable data mining for a large number of users related to their motion or the motion of associated equipment, to find patterns in the data that allow for business strategies to be determined based on heretofore undiscovered patterns related to motion. There are no known systems that enable obtaining payment from OEMs, medical professionals, gaming companies or other end users to allow data mining of motion data. For at least the limitations described above there is a need for a motion event recognition and video synchronization system and method.

BRIEF SUMMARY OF THE INVENTION
[0014] Embodiments of the invention enable a motion event recognition and video synchronization system and method that provides intelligent recognition of events within motion data, including but not limited to motion capture data obtained from portable wireless motion capture elements such as visual markers and sensors, radio frequency identification tags and mobile device computer systems, or calculated based on analyzed movement associated with the same user, or compared against the user or another user, historical user or group of users. Embodiments enable low memory utilization for event data and video data by trimming motion data and videos to correspond to the detected events. This may be performed on the mobile device or on a remote server, based on the location and/or time of the event and on the location and/or time of the video, and may optionally include the orientation of the camera to further limit the videos that may include the motion events. Embodiments enable event based viewing and low power transmission of events, and communication with an app executing on a mobile device and/or with external cameras to designate windows that define the events. Embodiments enable recognition of motion events, and designation of events within images or videos, such as a shot, move or swing of a player, a concussion of a player, boxer, rider or driver, or a heat stroke, hypothermia, seizure, asthma attack, epileptic attack or any other sporting or physical motion related event including walking and falling. Events may be correlated with one or more images or video as captured from internal/external camera or cameras or a nanny cam, for example to enable saving video of the event, such as the first steps of a child, violent shaking events, sporting events including concussions, or falling events associated with an elderly person. Concussion related events and other events may be monitored for linear acceleration thresholds and/or patterns as well as rotational acceleration and velocity thresholds and/or patterns, and/or saved on an event basis and/or transferred over lightweight connectionless protocols, or any combination thereof.

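The trimming described above can be pictured with a short sketch. The following Python fragment is not from the patent; it is a minimal illustration, with hypothetical names and data, of keeping only the video frames whose timestamps fall within an event's start/stop window (plus a small pad) and discarding the rest as non-event video.

    from dataclasses import dataclass

    @dataclass
    class Frame:
        timestamp: float  # seconds since video start (hypothetical layout)
        data: bytes

    def trim_to_event(frames, event_start, event_stop, padding=0.5):
        """Keep only frames inside the event window plus optional padding;
        everything else is treated as non-event video and discarded."""
        return [f for f in frames
                if event_start - padding <= f.timestamp <= event_stop + padding]

    # Example: a 60-second clip at 1 frame/s, with an event from 30.0 s to 32.0 s.
    video = [Frame(float(t), b"") for t in range(60)]
    event_video = trim_to_event(video, event_start=30.0, event_stop=32.0)
    print(len(video), "->", len(event_video))  # 60 -> 3 frames kept
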
[0015] Embodiments of the invention enable a user to purchase an application or "app" and a motion capture element and immediately utilize the system with their existing mobile computer, e.g., mobile phone. Embodiments of the invention may display motion information to a monitoring user, or user associated with the motion capture element or piece of equipment. Embodiments may also display information based on motion analysis data associated with a user or piece of equipment, based on (via a function such as but not limited to a comparison) previously stored motion capture data or motion analysis data associated with the user or piece of equipment, or previously stored motion capture data or motion analysis data associated with at least one other user. This enables sophisticated monitoring, compliance, and interaction with actual motion capture data or patterns obtained from other user(s), for example to play a virtual game using real motion data obtained from the user, with responses generated based thereon using real motion data captured from the user previously or from other users (or equipment). This capability provides for playing against historical players, for example a game of virtual tennis, or playing against an "average" professional sports person, and is unknown in the art until now.

[0016] For example, one or more embodiments include at least one motion capture element configured to couple with a user or piece of equipment or mobile device coupled with the user, wherein the at least one motion capture element includes a memory, a sensor configured to capture any combination of values associated with an orientation, position, velocity and acceleration (linear and/or rotational) of the at least one motion capture element, a radio, and a microcontroller coupled with the memory, the sensor and the radio. The microcontroller is configured to collect data that includes sensor values from the sensor, store the data in the memory, analyze the data and recognize an event within the data to determine event data, and transmit the event data associated with the event via the radio. Embodiments of the system may also include an application configured to execute on a mobile device, wherein the mobile device includes a computer and a wireless communication interface configured to communicate with the radio to obtain the event data associated with the event. The computer is coupled with the wireless communication interface, wherein the computer executes the application or "app" to configure the computer to receive the event data from the wireless communication interface, analyze the event data to form motion analysis data, store the event data, or the motion analysis data, or both the event data and the motion analysis data, and display information comprising the event data, or the motion analysis data, or both associated with the at least one user on a display.

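As a concrete, simplified illustration of the collect/store/recognize/transmit flow just described, the Python sketch below implements a threshold-based recognizer. It is not the patent's algorithm; the window size, threshold and payload fields are assumptions made for illustration.

    import collections
    import math

    WINDOW = 50              # samples retained in memory (hypothetical size)
    EVENT_THRESHOLD_G = 4.0  # acceleration magnitude that flags an event

    buffer = collections.deque(maxlen=WINDOW)  # plays the role of the memory

    def transmit(event_data):
        print("event ->", event_data)          # stand-in for a radio transmission

    def on_sample(ax, ay, az, t):
        """Collect a sensor sample, store it, and recognize an event within it."""
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        buffer.append((t, ax, ay, az, magnitude))
        if magnitude > EVENT_THRESHOLD_G:
            transmit({
                "start": buffer[0][0],          # oldest buffered sample
                "stop": t,                      # sample that triggered recognition
                "peak_g": magnitude,
            })

    # Feed in quiet samples, then a spike that should be recognized as an event.
    for i in range(10):
        on_sample(0.0, 0.0, 1.0, t=i * 0.01)
    on_sample(3.0, 2.0, 4.0, t=0.10)            # |a| is about 5.4 g -> transmitted
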
[0017] One or more embodiments include at least one motion capture sensor that is configured to be placed near the user's head, wherein the microcontroller is further configured to calculate a location of impact on the user's head. Embodiments of the at least one motion capture sensor may be configured to be coupled on a hat or cap, within a protective mouthpiece, using any type of mount, enclosure or coupling mechanism. One or more embodiments of the at least one motion capture sensor may be configured to be coupled with a helmet on the user's head, wherein the calculation of the location of impact on the user's head is based on the physical geometry of the user's head and/or helmet. Embodiments may include a temperature sensor coupled with the at least one motion capture sensor or with the microcontroller for example.

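One simple way to picture the impact-location calculation is to approximate the head/helmet as a sphere and take the impact point to lie opposite the sensed acceleration direction. The Python sketch below is an illustration under that simplifying assumption only, not the patent's method; the radius and axis conventions are hypothetical.

    import math

    HELMET_RADIUS_M = 0.12  # approximate helmet radius; an assumed constant

    def impact_location(ax, ay, az):
        """Estimate the impact point on a sphere approximating the helmet.
        The impact is taken to lie opposite the sensed acceleration vector,
        a simplification for illustration only."""
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag == 0.0:
            raise ValueError("no acceleration, no impact direction")
        # Unit vector pointing from the head's center toward the impact.
        ux, uy, uz = -ax / mag, -ay / mag, -az / mag
        return (HELMET_RADIUS_M * ux, HELMET_RADIUS_M * uy, HELMET_RADIUS_M * uz)

    # A blow from the front-left accelerates the head toward the back-right,
    # so the estimated impact point lies on the front-left of the helmet.
    print(impact_location(-6.0, -2.0, 0.0))
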
[0018] Embodiments of the invention may also utilize an isolator configured to surround the at least one motion capture element to approximate physical acceleration dampening of cerebrospinal fluid around the user's brain, to minimize translation of linear acceleration and rotational acceleration of the event data to obtain an observed linear acceleration and an observed rotational acceleration of the user's brain. Thus, embodiments may eliminate processing to translate forces or acceleration values or any other values from the helmet based acceleration to the observed brain acceleration values. Therefore, embodiments utilize less power and storage to provide event specific data, which in turn minimizes the amount of data transfer, which yields lower transmission power utilization and even lower total power utilization. Different isolators may be utilized on a football/hockey/lacrosse player's helmet based on the type of padding inherent in the helmet. Other embodiments utilized in sports where helmets are not worn, or occasionally worn, may also utilize at least one motion capture sensor on a cap or hat, for example on a baseball player's hat, along with at least one sensor mounted on a batting helmet. Headband mounts may also be utilized in sports where a cap is not utilized, such as soccer, to also determine concussions. In one or more embodiments, the isolator utilized on a helmet may remain in the enclosure attached to the helmet, and the sensor may be removed and placed on another piece of equipment that does not make use of an isolator that matches the dampening of a user's brain fluids. Embodiments may automatically detect a type of motion and determine the type of equipment that the motion capture sensor is currently attached to, based on characteristic motion patterns associated with certain types of equipment, i.e., surfboard versus baseball bat.

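The automatic equipment detection described above can be sketched as template matching: compare the captured acceleration-magnitude trace against stored characteristic patterns and pick the best match. The Python fragment below is a minimal illustration with made-up, unitless template values; a real implementation would use richer features and longer traces.

    def correlate(signal, template):
        """Normalized dot product between two equal-length magnitude traces."""
        num = sum(s * t for s, t in zip(signal, template))
        den = (sum(s * s for s in signal) ** 0.5) * \
              (sum(t * t for t in template) ** 0.5)
        return num / den if den else 0.0

    # Characteristic acceleration-magnitude templates (hypothetical values).
    TEMPLATES = {
        "baseball bat": [0.1, 0.3, 2.5, 6.0, 1.0, 0.2],  # sharp swing spike
        "surfboard":    [0.8, 1.0, 1.2, 1.1, 0.9, 1.0],  # sustained rolling motion
    }

    def classify_equipment(trace):
        """Pick the template whose shape best matches the captured motion."""
        return max(TEMPLATES, key=lambda name: correlate(trace, TEMPLATES[name]))

    print(classify_equipment([0.2, 0.4, 2.0, 5.5, 0.8, 0.1]))  # -> "baseball bat"
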
[0019] Embodiments of the invention may be configured to obtain/calculate a linear acceleration value or a rotational acceleration value or both. This enables rotational events to be monitored for concussions as well as linear accelerations. Other events may make use of the linear and/or rotational acceleration and/or velocity, for example as compared against patterns or templates, to not only switch sensor personalities during an event to alter the capture characteristics dynamically, but also to characterize the type of equipment currently being utilized with the current motion capture sensor. This enables a single motion capture element purchased by a user to instrument multiple pieces of equipment or clothing, by enabling the sensor to automatically determine what type of equipment or piece of clothing the sensor is coupled to, based on the motion captured by the sensor when compared against characteristic patterns or templates of motion.

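A minimal sketch of monitoring both linear and rotational acceleration follows, so that purely rotational impacts are not missed. The threshold values are illustrative placeholders only, not medical criteria from the patent.

    # Illustrative thresholds only; real concussion criteria are far more involved.
    LINEAR_THRESHOLD_G = 60.0             # peak linear acceleration, in g
    ROTATIONAL_THRESHOLD_RAD_S2 = 5000.0  # peak rotational acceleration, rad/s^2

    def is_concussion_candidate(peak_linear_g, peak_rotational_rad_s2):
        """Flag an event if either the linear or the rotational peak exceeds
        its threshold, so rotational-only impacts are still detected."""
        return (peak_linear_g > LINEAR_THRESHOLD_G or
                peak_rotational_rad_s2 > ROTATIONAL_THRESHOLD_RAD_S2)

    print(is_concussion_candidate(25.0, 7500.0))  # True: rotational-only impact
    print(is_concussion_candidate(20.0, 1000.0))  # False: below both thresholds
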
[0020] Embodiments of the invention may transmit the event data associated with the event using a connectionless broadcast message. In one or more embodiments, depending on the wireless communication employed, broadcast messages may include payloads with a limited amount of data that may be utilized to avoid handshaking and overhead of a connection based protocol. In other embodiments, connectionless or connection based protocols may be utilized in any combination.

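A connectionless broadcast benefits from a small, fixed-size payload. The sketch below packs hypothetical event fields into 15 bytes with Python's struct module; the field layout is an assumption for illustration, not a format defined by the patent.

    import struct

    # Compact, fixed-size event payload suitable for a connectionless broadcast.
    # Format: sensor id (uint16), event type (uint8), start and stop times
    # (two uint32 values in milliseconds), peak acceleration (float32) = 15 bytes.
    EVENT_FORMAT = "<HBIIf"

    def pack_event(sensor_id, event_type, start_ms, stop_ms, peak_g):
        return struct.pack(EVENT_FORMAT, sensor_id, event_type,
                           start_ms, stop_ms, peak_g)

    def unpack_event(payload):
        return struct.unpack(EVENT_FORMAT, payload)

    payload = pack_event(sensor_id=7, event_type=2,
                         start_ms=120000, stop_ms=121500, peak_g=5.4)
    print(len(payload), "bytes ->", unpack_event(payload))
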
[0021] In one or more embodiments, the computer may access previously stored event data or motion analysis data associated with the user or piece of equipment, for example to determine the number of concussions or falls or other swings, or any other motion event. Embodiments may also present event data associated with the at least one user on a display based on the event data or motion analysis data associated with the user or piece of equipment, and the previously stored event data or motion analysis data associated with the user or piece of equipment or with at least one other user or other piece of equipment. This enables comparison of motion events, in number or quantitative value, e.g., the maximum rotational acceleration observed by the user or other users in a particular game or historically. In addition, patterns or templates that define characteristic motion of particular pieces of equipment for typical events may be dynamically updated, for example on a central server or locally, and dynamically updated in motion capture sensors via the wireless interface in one or more embodiments. This enables sensors to improve over time.

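Comparing a new event against previously stored events can be as simple as the following sketch, which reports the count of prior events and whether the current peak rotational acceleration sets a personal maximum; the function and field names are hypothetical.

    def compare_to_history(current_peak, stored_peaks):
        """Summarize how the current event's peak rotational acceleration
        compares with previously stored events for the same user."""
        if not stored_peaks:
            return {"count": 0, "is_personal_max": True}
        historical_max = max(stored_peaks)
        return {
            "count": len(stored_peaks),
            "historical_max": historical_max,
            "is_personal_max": current_peak > historical_max,
        }

    # Previously stored peaks for this user (hypothetical values, rad/s^2).
    history = [3200.0, 4100.0, 2800.0]
    print(compare_to_history(4500.0, history))
    # {'count': 3, 'historical_max': 4100.0, 'is_personal_max': True}
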
[0022] Embodiments of the invention may transmit the information for display on a visual display coupled with the computer or a remote computer, for example over broadcast television or the Internet. Embodiments of the display may also be configured to accept sub-event time locations to provide discrete scrolling along the timeline of the whole event. For example, a golf swing may include sub-events such as an address, swing back, swing forward, strike and follow through. The system may display time locations for the sub-events and accept user input near the location to assert that the video should start or stop at that point in time, or scroll to or back to that point in time, for ease of viewing sub-events for example.

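The discrete timeline scrolling can be pictured as snapping a user's scrub position to the nearest stored sub-event time, as in this small sketch with hypothetical sub-event timestamps.

    # Sub-event time locations within a golf swing event (hypothetical values,
    # in seconds from the event start).
    SUB_EVENTS = {
        "address": 0.0, "swing back": 0.8, "swing forward": 1.6,
        "strike": 1.9, "follow through": 2.2,
    }

    def snap_to_sub_event(requested_time):
        """Snap a user's timeline input to the nearest sub-event, providing
        the discrete scrolling described above."""
        label = min(SUB_EVENTS,
                    key=lambda k: abs(SUB_EVENTS[k] - requested_time))
        return label, SUB_EVENTS[label]

    print(snap_to_sub_event(1.8))  # -> ('strike', 1.9)
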
[0023] Embodiments of the invention may also include an identifier coupled with the at least one motion capture sensor or the user or the piece of equipment. In one or more embodiments, the identifier may include a team and jersey number or student identifier number or license number or any other identifier that enables relatively unique identification of a particular event from a particular user or piece of equipment. This enables team sports or locations with multiple players or users to be identified with respect to the app that is configured to receive data associated with a particular player or user. One or more embodiments receive the identifier, for example a passive RFID identifier or MAC address or other serial number associated with the player or user, and associate the identifier with the event data and motion analysis data.

[0024] One or more embodiments of the at least one motion capture element may further include a light emitting element configured to output light if the event occurs. This may be utilized to display a potential, mild or severe level of concussion on the outer portion of the helmet, without any required communication to any external device for example. Different colors or flashing intervals may also be utilized to relay information related to the event. Alternatively, or in combination, the at least one motion capture element may further include an audio output element configured to output sound if the event occurs or if the at least one motion capture sensor is out of range of the computer, or wherein the computer is configured to display an alert if the at least one motion capture sensor is out of range of the computer, or any combination thereof. Embodiments of the sensor may also utilize an LCD that outputs a coded analysis of the current event, for example in a Quick Response (QR) code or bar code, so that a referee may obtain a snapshot of the analysis code on a mobile device locally, and so that the event is not viewed in a readable form on the sensor or wirelessly transmitted and intercepted by anyone else.

[0025] In one or more embodiments, the at least one motion capture element further includes a location determination element coupled with the microcontroller. This may include a GPS (Global Positioning System) device for example. Alternatively, or in combination, the computer may triangulate the location in concert with another computer, or obtain the location from any other triangulation type of receiver, or calculate the location based on images captured via a camera coupled with the computer and known to be oriented in a particular direction, wherein the computer calculates an offset from the mobile device based on the direction and size of objects within the image for example.

[0026] In one or more embodiments, the computer is further configured to request at least one image or video that contains the event from at least one camera proximal to the event. This may include a broadcast message requesting video from a particular proximal camera or a camera that is pointing in the direction of the event. In one or more embodiments, the computer is further configured to broadcast a request for camera locations proximal to the event or oriented to view the event, and optionally display the available cameras, or videos therefrom, for the time duration around the event of interest. In one or more embodiments, the computer is further configured to display a list of one or more times at which the event has occurred, which enables the user to obtain the desired event video via the computer, and/or to independently request the video from a third party with the desired event times.

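Selecting cameras proximal to, and oriented toward, an event can be sketched as a distance plus field-of-view test. The Python fragment below uses simple planar (x, y) coordinates and assumed limits for illustration; a deployed system would use GPS coordinates and proper bearing calculations.

    import math

    def is_candidate_camera(cam_pos, cam_heading_deg, event_pos,
                            max_distance_m=100.0, fov_deg=70.0):
        """Accept a camera if it is near the event and roughly pointed at it."""
        dx, dy = event_pos[0] - cam_pos[0], event_pos[1] - cam_pos[1]
        if math.hypot(dx, dy) > max_distance_m:
            return False
        bearing = math.degrees(math.atan2(dy, dx)) % 360.0
        offset = abs((bearing - cam_heading_deg + 180.0) % 360.0 - 180.0)
        return offset <= fov_deg / 2.0

    event = (50.0, 50.0)
    print(is_candidate_camera((0.0, 0.0), 45.0, event))   # True: close, facing it
    print(is_candidate_camera((0.0, 0.0), 225.0, event))  # False: facing away
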
[0027] In one or more embodiments, the at least one motion capture sensor is coupled with the mobile device, and for example uses an internal motion sensor within or coupled with the mobile device. This enables motion capture and event recognition with minimal and ubiquitous hardware, e.g., using a mobile device with a built-in accelerometer. In one or more embodiments, a first mobile device may be coupled with a user recording motion data, while a second mobile device is utilized to record a video of the motion. In one or more embodiments, the user undergoing motion may gesture, e.g., tap N times on the mobile device, to indicate that the second user's mobile device should start recording video or stop recording video. Any other gesture may be utilized to communicate event related or motion related indications between mobile devices.

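The N-tap gesture can be recognized by counting acceleration spikes that arrive close together in time, as in the following sketch; the threshold and gap values are assumptions for illustration.

    TAP_THRESHOLD_G = 2.5  # spike magnitude that counts as a tap (assumed)
    MAX_GAP_S = 0.4        # taps further apart than this are separate gestures

    def count_taps(samples):
        """Count acceleration spikes that arrive close together in time.
        `samples` is a list of (timestamp_s, magnitude_g) tuples."""
        taps = [t for t, g in samples if g > TAP_THRESHOLD_G]
        count, last = 0, None
        for t in taps:
            count = count + 1 if last is not None and t - last <= MAX_GAP_S else 1
            last = t
        return count

    # Three quick taps: a gesture that could tell a second device to record.
    stream = [(0.00, 1.0), (0.10, 3.1), (0.35, 3.4), (0.60, 3.2), (2.0, 1.0)]
    if count_taps(stream) == 3:
        print("gesture recognized: start recording video")
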
[0028] Embodiments of the at least one motion capture sensor may include a temperature sensor, or the microcontroller may otherwise be coupled with a temperature sensor. In these embodiments, the microcontroller is configured to transmit a temperature obtained from the temperature sensor as a temperature event, for example as a potential indication of heat stroke or hypothermia.

[0029] Thus embodiments of the invention may recognize any type of motion event, including events related to motion associated with the at least one motion capture sensor coupled with any combination of the user, or the piece of equipment or the mobile device, or motion that is indicative of standing, walking, falling, a heat stroke, seizure, violent shaking, a concussion, a collision, abnormal gait, abnormal or non-existent breathing, or any combination thereof, or any other type of event having a duration of time during which motion occurs.

[0030] Embodiments of the invention may utilize data mining on the motion capture data to obtain patterns for users or equipment, or use the motion capture data or events of a given user or other user in particular embodiments of the invention. Data mining relates to discovering new patterns in large databases wherein the patterns are previously unknown. Many methods may be applied to the data to discover new patterns, including statistical analysis, neural networks and artificial intelligence for example. Due to the large amount of data, automated data mining may be performed by one or more computers to find unknown patterns in the data. Unknown patterns may include groups of related data, anomalies in the data, dependencies between elements of the data, classifications and functions that model the data with minimal error, or any other type of unknown pattern. Displays of data mining results may include displays that summarize newly discovered patterns in a way that is easier for a user to understand than large amounts of pure raw data. Results of the data mining process include improved market research reports, product improvement, lead generation and targeted sales. Generally, any type of data that will be subjected to data mining must be cleansed and data mined, and the results are generally validated. Businesses may increase profits using data mining. Examples of benefits of embodiments of the invention include customer relationship management, to highly target individuals based on patterns discovered in the data. In addition, market basket analysis data mining enables identifying products that are purchased or owned by the same individuals, which can be utilized to offer products to users that own one product but who do not own another product that is typically owned by other users.

[0031] Other areas of data mining include analyzing large sets of motion data from different users to suggest exercises to improve performance based on performance data from other users. For example, if one user has less rotation of the hips during a swing versus the average user, then exercises to improve flexibility or strength may be suggested by the system. In a golf course embodiment, golf course planners may determine, over a large number of users on a golf course, which holes should be adjusted in length or difficulty to obtain more discrete values for the average number of shots per hole, or for determining the amount of time between golfers, for example at a certain time of day or for golfers of a certain age. In addition, sports and medical applications of data mining include determining morphological changes in user performance over time, for example versus diet or exercise changes, to determine what improves performance the most, or for example what times of the day, temperatures, or other conditions produce swing events that result in the furthest drive or lowest score. Use of motion capture data for a particular user or with respect to other users enables healthcare compliance, for example to ensure a person with diabetes moves a certain amount during the day, and morphological analysis to determine how a user's motion or range of motion has changed over time. Games may be played with motion capture data that enables virtual reality play against historical greats or other users. For example, a person may play against a previous performance of the same person or against the motion capture data of a friend. This allows users to play a game in a historic stadium or venue in a virtual reality environment, but with motion capture data acquired from the user or other users previously, for example. Military planners may utilize the motion capture data to determine which soldiers are most fit and therefore eligible for special operations, or which ones should retire; the data may also be used by coaches to determine when a player should rest, based on the concussion events and severity thereof sustained by a player for example, and potentially based on a mined time period where other users have increased performance after a concussion related event.

[0032] Embodiments of the system perform motion capture and/or display with an application, for example, that executes on a mobile device that may include a visual display and an optional camera, and which is capable of obtaining data from at least one motion capture element such as a visual marker and/or a wireless sensor. The system can also integrate with standalone cameras, or cameras on multiple mobile devices. The system also enables the user to analyze and display the motion capture data in a variety of ways that provide immediate, easy to understand graphical information associated with the motion capture data. Motion capture elements utilized in the system intelligently store data, for example related to events associated with striking a ball, making a ski turn, jumping, etc., and eliminate false events, and greatly improve memory usage and minimize storage requirements. In addition, the data may be stored for example for more than one event associated with the sporting equipment, for example multiple bat swings or an entire round of golf or more if necessary, at least until the data is downloaded to a mobile device or to the Internet. Data compression of captured data may also be utilized to store more motion capture data in a given amount of memory. Motion capture elements utilized in the system may also be configured to intelligently power down portions of their circuitry to save power, for example powering down transceivers until motion of a certain type is detected. Embodiments of the invention may also utilize flexible battery connectors to couple two or more batteries in parallel to increase the time the system may be utilized before replacing the batteries. Motion capture data is generally stored in memory such as a local database or in a network accessible database, any of which enables the data mining described above. Any other type of data mining may be performed using embodiments of the invention, including searching for temporal changes of data related to one or more users and/or simply searching for data related to a particular user or piece of equipment.

[0033] Other embodiments may display information such as music selections or music playlists to be played based on the motion related data. This for example enables a performance to be compared to another user's performance and the selection of the type of music the other user plays, or the comparison of the performance relative to a threshold that determines what type of music selection to suggest or display.

[0034] Embodiments of the invention directed at sports, for example, enable RFID or passive RFID tags to be placed on items that a user moves, wherein embodiments of the system keep track of the motion. For example, by placing passive RFID tags on a particular helmet or cap, or protective mouthpiece for boxing, football, soccer or other contact sport, or on particular dumbbells at a gym, and by wearing motion capture elements such as gloves, and with a pre-existing mobile device, for example an IPHONE®, embodiments of the invention provide automatic safety compliance or fitness and/or healthcare compliance. This is achieved by keeping track of the motion, and via RFID or passive RFID, the weight that the user is lifting. Embodiments of the invention may thus add the number of repetitions multiplied by the amount of weight indicated by each RFID tag to calculate the number of calories burned by the user. In another example, with an RFID tag coupled with a stationary bike, or wherein the stationary bike can mimic the identifier and/or communicate wirelessly to provide performance data, and wherein the mobile computer includes an RFID reader, the number of rotations of the user's legs may be counted. Any other use of RFID or passive RFID is in keeping with the spirit of the invention. This enables doctors to remotely determine whether a user has complied with their medical recommendations, or exceeded linear or rotational acceleration indicative of a concussion for example. Embodiments may thus be utilized by users to ensure compliance and by doctors to lower their malpractice insurance rates, since they are ensuring that their patients are complying with their recommendations, albeit remotely. Embodiments of the invention do not require RFID tags for medical compliance, but may utilize them. Embodiments of the invention directed at golf also enable golf shots for each club associated with a golfer to be counted through use of an identifier such as RFID tags on each club (or optionally via an identifier associated with motion capture electronics on a golf club or obtained remotely over the radio) and a mobile computer, for example an IPHONE® equipped with an RFID reader, which concentrates the processing for golf shot counting on the mobile computer instead of on each golf club. Embodiments of the invention may also allow for the measurement of orientation (North/South, and/or two horizontal axes and the vertical axis) and acceleration using an inertial measurement unit, or accelerometers and/or magnetometers, and/or gyroscopes. This is not required for golf shot counting, although one or more embodiments may determine when the golf club has struck a golf ball, through vibration analysis for example, and then query a golfer whether to count a shot or not. This functionality may be combined with speed or acceleration threshold or range detection, for example to determine whether the golf club was travelling within an acceptable speed or range, or acceleration or range, for the "hit" to count. Wavelets may also be utilized to compare valid swing signatures to eliminate miscounted shots or false strikes for example. This range may vary between different clubs; for example, a driver speed range may be "greater than 30 mph" while a putter speed range may be "less than 20 mph"; any range may be utilized with any club as desired, or the speed range may be ignored for example. Alternatively or in combination, the mobile computer may only query the golfer to count a shot if the golfer is not moving laterally, i.e., in a golf cart or walking, and/or wherein the golfer may have rotated or taken a shot as determined by an orientation or gyroscope sensor coupled with the mobile computer. The position of the stroke may be shown on a map on the mobile computer for example. In addition, GPS receivers with wireless radios may be placed within the tee markers and in the cups to give daily updates of distances and help with reading putts and greens for example. The golfer may also wear virtual glasses that allow the golfer to see the golf course map, current location, distance to the hole, number of shots on the current hole, total number of shots and any other desired metric. If the user moves a certain distance, as determined by GPS for example, from the shot without counting the shot, the system may prompt the user on whether to count the shot or not. The system does not require a user to initiate a switch on a club to count a shot, and does not require LEDs or active or battery powered electronics on each club to count shots. The mobile computer may also accept gestures from the user to count a shot or not count a shot, so that the golfer does not have to remove any gloves to operate the mobile computer. For embodiments that utilize position/orientation sensors, the system may only count shots when a club is oriented vertically, for example when an impact is detected. The apparatus may also include identifiers that enable a specific apparatus to be identified. The identifiers may be a serial number for example. The identifier for example may originate from an RFID tag on each golf club, or optionally may include a serial number or other identifier associated with motion capture elements associated with a golf club. Utilizing this apparatus enables the identification of a specific golfer and specific club, and also enables motion capture and/or display with a system that includes a television and/or mobile device having a visual display and an optional camera, and capable of obtaining data from at least one motion capture element such as a visual marker and/or a wireless sensor. The system can also integrate with standalone cameras, or cameras on multiple mobile devices. The system also enables the user to analyze and display the motion capture data in a variety of ways that provide immediate and easy to understand graphical information associated with the motion capture data. The apparatus enables the system to also determine how "centered" an impact is with respect to a ball and a piece of equipment, such as a golf club for example. The system also allows for fitting of equipment including shoes, clubs, etc., and immediate purchasing of the equipment, even if the equipment requires a custom assemble-to-order request from a vendor. Once the motion capture data, videos or images and shot count indications are obtained by the system, they may be stored locally, for example in a local database, or sent over a telephonic or wireless interface to a remote database for example. Once in a database, the various elements, including any data associated with the user, such as age, sex, height, weight, address, income or any other related information, may be utilized in embodiments of the invention and/or subjected to data mining. One or more embodiments enable users or OEMs for example to pay for access to the data mining capabilities of the system.

[0035] For example, embodiments that utilize motion capture elements allow for
analyzing the
data obtained from the apparatus and enable the presentation of unique
displays associated with
the user, such as 3D overlays onto images of the body of the user to visually
depict the captured
motion data. In addition, these embodiments may also utilize active wireless
technology such as
BLUETOOTH® Low Energy for a range of up to 50 meters to communicate with a
golfer's
mobile computer. Embodiments of the invention also allow for display of
queries for counting a
stroke for example as a result of receiving a golf club ID, for example via an
RFID reader or
alternatively via wireless communication using BLUETOOTH® or IEEE 802.11 for
example.
Use of BLUETOOTH® Low Energy chips allows for a club to be in sleep mode for
up to 3
years with a standard coin cell battery, thus reducing required maintenance.
One or more
embodiments of the invention may utilize more than one radio, of more than one
technology for
example. This allows for a level of redundancy that increases robustness of
the system. For
example, if one radio no longer functions, e.g., the BLUETOOTH® radio, then the
IEEE 802.11 radio may be utilized to transfer data and warn the golfer that
one of the radios is
not functioning, while still allowing the golfer to record motion data and
count shots associated
with the particular club. For embodiments of the invention that utilize a
mobile device (or more
than one mobile device) without camera(s), sensor data may be utilized to
generate displays of
the captured motion data, while the mobile device may optionally obtain images
from other
cameras or other mobile devices with cameras. For example, display types that
may or may not
utilize images of the user may include ratings, calculated data and time line
data. Ratings
associated with the captured motion can also be displayed to the user in the
form of numerical or
graphical data with or without a user image, for example an "efficiency"
rating. Other ratings
may include linear acceleration and/or rotational acceleration values for the
determination of
concussions and other events for example. Calculated data, such as predicted ball flight path data, can be calculated and displayed on the mobile device with or without
utilizing images of the
user's body. Data depicted on a time line can also be displayed with or
without images of the
user to show the relative peaks of velocity for various parts of the equipment
or user's body for
example. Images from multiple cameras including multiple mobile devices, for
example from a
crowd of golf fans, may be combined into a BULLET TIME® visual effect
characterized by
slow motion of the golf swing shown from around the golfer at various angles
at normal speed.
All analyzed data may be displayed locally, or uploaded to the database along
with the motion
capture data, images/videos, shot count and location data where it may undergo
data mining
processes, wherein the system may charge a fee for access to the results for
example.
[0036] In one or more embodiments, a user may play a golf course or hit tennis
balls, or
alternatively simply swing to generate motion capture data for example and
when wearing
virtual reality glasses, see an avatar of another user, whether virtual or
real in an augmented
reality environment. In other embodiments, the user moves a piece of equipment
associated with
any sport or simply moves the user's own body coupled with motion capture sensors and views a
virtual reality environment displayed in virtual reality glasses of the user's
movement or
movement of a piece of equipment so instrumented. Alternatively or in
combination, a virtual
reality room or other environment may be utilized to project the virtual
reality avatars and
motion data. Hence, embodiments of the system may allow a user on a real golf
course to play
along with another user at a different location who is not actually hitting balls, along with a historical player whose motion data has been analyzed, or a data-mining-constructed user based on one or more motion capture data sequences, and utilized by an embodiment of
the system to
project an avatar of the historical player. Each of the three players may play
in turn, as if they
were located in the same place.
[0037] Motion capture data and/or events can be displayed in many ways, for
example tweeted,
to a social network during or after motion capture. For example, if a certain
amount of exercise
or motion is performed, or calories burned, or a new sports power factor
maximum has been
obtained, the system can automatically tweet the new information to a social
network site so that
anyone connected to the Internet may be notified. The data uploaded to the
Internet, i.e., a
remote database or remote server or memory remote to the system may be viewed,
analyzed or
data mined by any computer that may obtain access to the data. This allows for
remote
compliance tweeting and/or compliance and/or original equipment manufacturers
to determine
for a given user what equipment for compliance or sporting equipment for
sports related
embodiments is working best and/or what equipment to suggest. Data mining also
enables
suggestions for users to improve their compliance and/or the planning of
sports venues,
including golf courses based on the data and/or metadata associated with
users, such as age, or
any other demographics that may be entered into the system. Remote storage of
data also
enables medical applications such as morphological analysis, range of motion
over time, and
diabetes prevention and exercise monitoring and compliance applications as
stated. Other
applications also allow for games that use real motion capture data from other
users, or historical
players whether alive or dead after analyzing videos of the historical players
for example. Virtual
reality and augmented virtual reality applications may also utilize the motion
capture data or
historical motion data. Military personnel such as commanders and/or doctors
may utilize the
motion and/or images in determining what type of G-forces a person has undergone
from an
explosion near an Improvised Explosive Device and automatically route the best type of medical aid to the location of the motion capture sensor. One or more
embodiments of the
system may relay motion capture data over a G-force or velocity threshold, to
their commanding
officer or nearest medical personnel for example via a wireless communication
link.
Alternatively, embodiments of the invention may broadcast lightweight
connectionless
concussion related messages to any mobile devices listening, e.g., a referee's
mobile phone, to aid in the assistance of the injured player, wherein the lightweight message
includes an optional
team/jersey number and an acceleration related number such as a
potential/probable concussion
warning or indicator.
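For illustration, a minimal Python sketch of such a lightweight connectionless broadcast follows; the message fields, the severity threshold and the port number are hypothetical examples rather than required elements of any embodiment.

    # Illustrative sketch only: broadcast a small, connectionless,
    # concussion-related message to any listening mobile devices (e.g., a
    # referee's phone). Field names, the 90 g threshold and the port are
    # hypothetical examples.
    import json
    import socket

    def broadcast_concussion_warning(jersey_number, peak_g, port=9999):
        message = json.dumps({
            "type": "concussion_warning",
            "jersey": jersey_number,      # optional team/jersey number
            "peak_linear_g": peak_g,      # acceleration related number
            "severity": "probable" if peak_g > 90 else "potential",
        }).encode("utf-8")
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(message, ("255.255.255.255", port))
        sock.close()

    broadcast_concussion_warning(jersey_number=42, peak_g=95.0)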
[0038] In one or more embodiments of the invention, fixed cameras such as at a
tennis
tournament, football game, baseball game, car or motorcycle race, golf
tournament or other
sporting event can be utilized with a wireless interface located near the
player/equipment having
motion capture elements so as to obtain, analyze and display motion capture
data. In this
embodiment, real-time or near real-time motion data can be displayed on the
video for
augmented video replays. An increase in the entertainment level is thus
created by visually
displaying how fast equipment is moving during a shot, for example with rings
drawn around a
player's hips and shoulders. Embodiments of the invention also allow images or
videos from
other players having mobile devices to be utilized on a mobile device related
to another user so
that users don't have to switch mobile phones for example. In one embodiment,
a video obtained by a second user, for a piece of sporting equipment in motion that is not associated with the second user having the video camera equipped mobile phone, may automatically transfer to the first user for display with motion capture data associated with
the first user. Video
and images may be uploaded into the database and data mined through image
analysis to
determine the types/colors of clothing or shoes for example that users are
wearing.
[0039] Based on the display of data, the user can determine the equipment that
fits the best and
immediately purchase the equipment, via the mobile device. For example, when
deciding
between two sets of skis, a user may try out both pairs that are instrumented
with motion capture
elements wherein the motion capture data is analyzed to determine which pair
of skis enables
more efficient movement. For golf embodiments, when deciding between two golf
clubs, a user
can take swings with different clubs and, based on the analysis of the captured motion data, quantitatively determine which club performs better. Custom equipment may be
ordered
through an interface on the mobile device from a vendor that can assemble-to-order custom-built equipment and ship the equipment to the user for example. Shaft lengths
for putters for
example that are a standard length can be custom made for a particular user
based on captured
motion data as a user putts with an adjustable length shaft for example. Data mining of the motion capture data and shot count data and distances, for example, allows users having similar swing characteristics to be compared against a current user wherein
equipment that
delivers longer shots for a given swing velocity for a user of a particular
size and age for
example may be suggested or searched for by the user to improve performance.
OEMs may
determine, for given swing speeds, which make and model of club delivers
the best overall
performance as well. One skilled in the art will recognize that this applies
to all activities
involving motion, not just golf.
[0040] Embodiments of the system may utilize a variety of sensor types. In one
or more
embodiments of the invention, active sensors may integrate with a system that
permits passive or
active visual markers to be utilized to capture motion of particular points on
a user's body or
equipment. This may be performed in a simple two-dimensional manner or in a
three-
dimensional manner if the mobile device is configured with two or more
cameras, or if multiple
cameras or mobile devices are utilized to capture images such as video and
share the images in
order to create triangulated three-dimensional motion data from a set of two-
dimensional images
obtained from each camera. Another embodiment of the invention may utilize
inertial
measurement units (IMU) or any other sensors that can produce any combination
of orientation,
position, velocity and/or acceleration information to the mobile device. The
sensors may thus
obtain data that may include any combination of one or more values associated
with orientation
(vertical or North/South or both), position (either through the Global Positioning System, i.e., "GPS", or through triangulation), velocity (in all three axes), and acceleration (in all three axes). All
motion capture data obtained from the various sensor types may be saved in a
database for
analysis, monitoring, compliance, game playing or other use and/or data
mining, regardless of
the sensor type.
[0041] In one or more embodiments of the invention, a sensor may be utilized
that includes a
passive marker or active marker on an outside surface of the sensor, so that
the sensor may also
be utilized for visual tracking (either two-dimensional or three-dimensional)
and for orientation,
position, velocity, acceleration or any other physical quantity produced by
the sensor. Visual
marker embodiments of the motion capture element(s) may be passive or active,
meaning that
they may either have a visual portion that is visually trackable or may
include a light emitting
element such as a light emitting diode (LED) that allows for image tracking in
low light
conditions. This for example may be implemented with a graphical symbol or
colored marker at
the end of the shaft near the handle or at the opposing end of the golf club
at the head of the club.
Images or videos of the markers may be analyzed locally or saved in the
database and analyzed
and then utilized in data mining. In addition, for concussion related
embodiments, the visual
marker may emit a light that is indicative of a concussion, for example
flashing yellow for a
moderate concussion and fast flashing red for a severe concussion or any other
visual or optional
audio event indicators or both. As previously discussed, an LCD may output a
local visual
encoded message so that it is not intercepted or otherwise readable by anyone
not having a
mobile device local and equipped to read the code. This enables sensitive
medical messages to
only be read by a referee or local medical personnel for a concussion or
paralysis related event
for example.
[0042] Embodiments of the motion capture sensors may be generally mounted on
or near one or
more ends or opposing ends of sporting equipment, for example such as a golf club and/or anywhere in between (for EI measurements) and may integrate with other sensors
coupled to
equipment, such as weapons, medical equipment, wristbands, shoes, pants,
shirts, gloves, clubs,
bats, racquets, balls, helmets, caps, mouthpieces, etc., and/or may be
attached to a user in any
possible manner. For example, a sensor may be coupled with a rifle to determine where the rifle was pointing when a recoil was detected by the motion capture sensor. This data may be transmitted to a
central server, for
example using a mobile computer such as a mobile phone or other device and
analyzed for war
games practice for example. In addition, one or more embodiments of the sensor
can fit into a
weight port of a golf club, and/or in the handle end of the golf club. Other
embodiments may fit
into the handle of, or end of, a tennis racquet or baseball bat for example.
Embodiments that are
related to safety or health monitoring may be coupled with a cap, helmet,
and/or mouthpiece or
in any other type of enclosure. One or more embodiments of the invention may
also operate
with balls that have integrated sensors as well. One or more embodiments of
the mobile device
may include a small mountable computer such as an IPOD® SHUFFLE or IPOD® NANO that may or may not have integrated displays, and which are small enough to
mount on a shaft of
a piece of sporting equipment and not affect a user's swing. Alternatively,
the system may
calculate the virtual flight path of a ball that has come in contact with
equipment moved by a
player. For example with a baseball bat or tennis racquet or golf club having
a sensor integrated
into a weight port or other portion of the end of the club striking the golf
ball and having a
second sensor located in the tip of the handle of the golf club, or in one or
more gloves worn by
the player, an angle of impact can be calculated for the club. By knowing the
loft of the face of
the club, an angle of flight may be calculated for the golf ball. In addition,
by sampling the
sensor at the end of the club at a high enough speed to determine oscillations
indicative of where
on the face of the club the golf ball was struck, a quality of impact may be
determined. These
types of measurements and the analysis thereof help an athlete improve, and
for fitting purposes,
allow an athlete to immediately purchase equipment that fits correctly.
Centering data may be
uploaded to the database and data mined for patterns related to the bats,
racquets or clubs with
the best centering on average, or the lowest torsion values for example on a
manufacturer basis
for product improvement. Any other unknown patterns in the data that are
discovered may also
be presented or suggested to users or searched on by users, or paid for, for
example by
manufacturers or users.
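For illustration, a minimal Python sketch of the two calculations described above follows; the oscillation-based quality score and the loft-plus-lean launch model are crude, hypothetical simplifications, not a definitive implementation.

    # Illustrative sketch only: (1) judge the "centeredness" of an impact
    # from the amplitude of shaft oscillations sampled at high rate just
    # after the strike; (2) estimate a launch angle from the face loft and
    # the shaft angle at impact. Both are hypothetical first-order models.
    def impact_quality(post_impact_accel_g):
        """post_impact_accel_g: acceleration samples (in g) from the handle
        sensor in a short window after impact. An off-center strike excites
        larger oscillations, so a smaller peak-to-peak swing scores higher."""
        peak_to_peak = max(post_impact_accel_g) - min(post_impact_accel_g)
        return 1.0 / (1.0 + peak_to_peak)   # maps to (0, 1]

    def launch_angle_deg(club_loft_deg, shaft_lean_deg):
        """Effective loft = static loft + shaft lean at impact (forward lean
        negative); taken here as a rough proxy for the launch angle."""
        return club_loft_deg + shaft_lean_deg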
[0043] One or more embodiments of the sensor may contain charging features
such as
mechanical eccentric weight, as utilized in some watches known as "automatic"
or "self-
winding" watches, optionally including a small generator, or inductive
charging coils for indirect
electromechanical charging of the sensor power supply. Other embodiments may
utilize plugs
for direct charging of the sensor power supply or electromechanical or
microelectromechanical
(MEMS) based charging elements. Any other type of power micro-harvesting
technologies may
be utilized in one or more embodiments of the invention. One or more
embodiments of the
sensor may utilize power saving features including gestures that power the
sensor on or off.
Such gestures may include motion, physical switches, contact with the sensor,
wireless
commands to the sensor, for example from a mobile device that is associated
with the particular
sensors. Other elements that may couple with the sensor include a battery, low power microcontroller, antenna and radio, heat sink, recharger and overcharge sensor
for example. In
addition, embodiments of the invention allow for power down of some or all of
the components
of the system until an electronic signal from accelerometers or a mechanical
switch determines
that the club has moved for example.
[0044] One or more embodiments of the invention enable Elasticity Inertia or
EI measurement
of sporting equipment and even body parts for example. Placement of
embodiments of the
sensor along the shaft of a golf club, tennis racquet, baseball bat, hockey
stick, shoe, human arm
or any other item that is not perfectly stiff enables measurement of the
amount of flex at points
where sensors are located or between sensors. The angular differences in each sensor over
time allow for not only calculation of a flex profile, but also a flex profile
that is dependent on
time or force. For example, known EI machines use static weights between two support points to determine an EI profile. These machines therefore cannot detect whether the EI
profile is
dependent upon the force applied or is dependent on the time at which the
force is applied, for
example EI profiles may be non-linear with respect to force or time. Example
materials that are
known to have different physical properties with respect to time include
Maxwell materials and
non-Newtonian fluids.
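For illustration, a minimal Python sketch of such a time- and force-dependent flex profile follows; the sample layout, units and values are hypothetical.

    # Illustrative sketch only: derive a flex profile from the angular
    # difference between two sensors mounted along a shaft, sampled over
    # time. Unlike a static EI machine, the profile is kept as a function of
    # both time and applied force, so non-linear behavior remains visible.
    def flex_profile(samples):
        """samples: (time_s, angle_a_deg, angle_b_deg, force_n) tuples from
        two shaft-mounted sensors plus an applied-force estimate. Returns
        (time_s, force_n, flex_deg) tuples."""
        return [(t, f, abs(a - b)) for (t, a, b, f) in samples]

    profile = flex_profile([
        (0.00, 0.0, 0.0, 0.0),
        (0.01, 1.2, 0.4, 40.0),
        (0.02, 2.5, 0.9, 80.0),  # force- or time-dependence shows up here
    ])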
[0045] A user may also view the captured motion data in a graphical form on
the display of the
mobile device or for example on a set of glasses that contains a video
display. The captured
motion data obtained from embodiments of the motion capture element may also
be utilized to
augment a virtual reality display of the user in a virtual environment. Virtual
reality or augmented
reality views of patterns that are found in the database via data mining are
also in keeping with the spirit of the invention. Users may also see augmented information such as an aim assist or aim guide that shows, for example, where a shot should be placed based on existing wind conditions, or to account for hazards, e.g., trees that are in the way of a desired destination for a ball, i.e., the golf hole for example.
[0046] One or more embodiments of the invention include a motion event
recognition and video
synchronization system that includes at least one motion capture element
configured to couple
with a user or piece of equipment or mobile device coupled with the user. The
at least one
motion capture element may include a memory, a sensor configured to capture
any combination
of values associated with an orientation, position, velocity and acceleration
of the at least one
motion capture element, a radio, a microcontroller coupled with the memory,
the sensor and the
radio. The microcontroller may be configured to collect data that includes
sensor values from
the sensor, store the data in the memory, analyze the data and recognize an
event within the data
to determine event data, and transmit the event data associated with the event via
the radio. The
system may also include a mobile device that includes a computer, a wireless
communication
interface configured to communicate with the radio to obtain the event data
associated with the
event, wherein the computer is coupled with the wireless communication interface,
wherein the
computer is configured to receive the event data from the wireless
communication interface.
The computer may also analyze the event data to form motion analysis data,
store the event data,
or the motion analysis data, or both the event data and the motion analysis
data, obtain an event
start time and an event stop time from the event, request image data from
a camera that includes a
video captured at least during a timespan from the event start time to the
event stop time and
display an event video on a display that includes the event data, the
motion analysis data or
any combination thereof that occurs during the timespan from the event start
time to the event
stop time and the video captured during the timespan from the event start time
to the event stop
time.
[0047] Embodiments may synchronize clocks in the system using any type of
synchronization
methodology and in one or more embodiments the computer on the mobile device
is further
configured to determine a clock difference between the motion capture element
and the mobile
device and synchronize the motion analysis data with the video. For example,
one or more
embodiments of the invention provide procedures for multiple recording
devices to synchronize
information about the time, location, or orientation of each device, so that
data recorded about
events from different devices can be combined. Such recording devices may be
embedded
sensors, mobile phones with cameras or microphones, or more generally any
devices that can
record data relevant to an activity of interest. In one or more embodiments,
this synchronization
is accomplished by exchanging information between devices so that the devices
can agree on a
common measurement for time, location, or orientation. For example, a mobile
phone and an
embedded sensor may exchange messages with the current timestamps of their
internal clocks;
these messages allow a negotiation to occur wherein the two devices agree on a
common time.
Such messages may be exchanged periodically as needed to account for clock
drift or motion of
the devices after a previous synchronization. In other embodiments, multiple
recording devices
may use a common server or set of servers to obtain standardized measures of
time, location, or
orientation. For example, devices may use a GPS system to obtain absolute
location information
for each device. GPS systems may also be used to obtain standardized time. NTP
(Network
Time Protocol) servers may also be used as standardized time servers. Using
servers allows
devices to agree on common measurements without necessarily being configured
at all times to
communicate with one another.
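For illustration, a minimal Python sketch of one such two-way timestamp exchange follows, in the spirit of the NTP estimate mentioned above; the transport and the send_request() helper are hypothetical.

    # Illustrative sketch only: estimate the clock offset between a mobile
    # device and an embedded sensor from one round-trip timestamp exchange,
    # assuming a roughly symmetric network delay. Repeat periodically to
    # track clock drift.
    import time

    def estimate_clock_offset(send_request, now=time.monotonic):
        """send_request() is a hypothetical helper returning the sensor's
        clock readings (t_sensor_rx, t_sensor_tx) for our request/reply."""
        t0 = now()                          # request leaves the mobile device
        t_sensor_rx, t_sensor_tx = send_request()
        t1 = now()                          # reply arrives back
        # Sensor clock minus local clock, as in the classic NTP estimate.
        return ((t_sensor_rx - t0) + (t_sensor_tx - t1)) / 2.0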
[0048] In one or more embodiments of the invention, some of the recording
devices are
configured to detect the occurrence of various events of interest. Some such
events may occur at
specific moments in time; others may occur over a time interval, wherein the
detection includes
detection of the start of an event and of the end of an event. These devices
are configured to
record any combination of the time, location, or orientation of the recording
device along with
the event data, using the synchronized measurement bases for time, location,
and orientation
described above.
[0049] Embodiments of the computer on the mobile device may be further
configured to discard
at least a portion of the video outside of the event start time to the event stop time. For example, in
one or more embodiments of the invention, some of the recording devices
capture data
continuously to memory while awaiting the detection of an event. To conserve
memory, some
devices may be configured to store data to a more permanent local storage
medium, or to a
server, only when this data is proximate in time to a detected event. For
example, in the absence
of an event detection, newly recorded data may ultimately overwrite previously
recorded data in
memory. A circular buffer may be used in some embodiments as a typical
implementation of
such an overwriting scheme. When an event detection occurs, the recording
device may store
some configured amount of data prior to the start of the event, and some
configured amount of
data after the end of the event, in addition to storing the data captured
during the event itself.
Any pre or post time interval is considered part of the event start time and
event stop time so that
context of the event is shown in the video for example.
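For illustration, a minimal Python sketch of such an overwriting scheme follows; the class layout and parameters are hypothetical.

    # Illustrative sketch only: capture continuously into a circular buffer,
    # committing samples to permanent storage only when they are proximate
    # in time to a detected event (a configured pre-event interval plus a
    # configured post-event interval).
    from collections import deque

    class EventCapture:
        def __init__(self, sample_rate_hz, pre_s, post_s):
            self.post_n = int(sample_rate_hz * post_s)
            # Old samples are silently overwritten while awaiting an event.
            self.buffer = deque(maxlen=int(sample_rate_hz * pre_s))
            self.remaining_post = 0
            self.saved = []                  # stands in for permanent storage

        def add_sample(self, sample, event_detected=False):
            if event_detected:
                self.saved.extend(self.buffer)   # commit pre-event context
                self.buffer.clear()
                self.remaining_post = self.post_n
            if self.remaining_post > 0:
                self.saved.append(sample)        # event and post-event data
                self.remaining_post -= 1
            else:
                self.buffer.append(sample)       # keep overwriting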
[0050] Embodiments of the system may further comprise a server computer remote
to the
mobile device and wherein the server computer is configured to discard at
least a portion of the
video outside of the event start time to the event stop time and return the video
captured during the
timespan from the event start time to the event stop time to the computer in
the mobile device.
[0051] Embodiments of the at least one motion capture element may be
configured to transmit
the event to at least one other motion capture sensor or at least one other
mobile device or any
combination thereof, and wherein the at least one other motion capture sensor
or the at least one
other mobile device or any combination thereof is configured to save data
associated with said
event. For example, in embodiments with multiple recording devices operating
simultaneously,
one such device may detect an event and send a message to other recording
devices that such an
event detection has occurred. This message can include the timestamp of the
start and/or stop of
the event, using the synchronized time basis for the clocks of the various
devices. The receiving
devices, e.g., other motion capture sensors and/or cameras may use the event
detection message
to store data associated with the event to nonvolatile storage or to a server.
The devices may be
configured to store some amount of data prior to the start of the event and
some amount of data
after the end of the event, in addition to the data directly associated with
the event. In this way
all devices can record data simultaneously, but use an event trigger from only
one of the devices
to initiate saving of distributed event data from multiple sources.
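For illustration, a minimal Python sketch of such distributed triggering follows; the message fields and the peer and recorder interfaces are hypothetical.

    # Illustrative sketch only: the detecting device saves its own data for
    # the event window and notifies the other recording devices, which save
    # the same (synchronized) window from their own buffers.
    def on_event_detected(event_start, event_stop, peers, local_recorder):
        local_recorder.save_window(event_start, event_stop)
        message = {"event_start": event_start, "event_stop": event_stop}
        for peer in peers:
            peer.send(message)           # e.g., over BLE or IEEE 802.11

    def on_event_message(message, local_recorder):
        # Receivers typically pad the window with pre/post intervals so the
        # context of the event is preserved.
        local_recorder.save_window(message["event_start"],
                                   message["event_stop"])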
[0052] Embodiments of the computer may be further configured to save the video
from the
event start time to the event stop time with the motion analysis data that
occurs from the event
start time to the event stop time or a remote server may be utilized to save
the video. In one or
more embodiments of the invention, some of the recording devices may not be in
direct
communication with each other throughout the time period in which events may
occur. In these
situations, devices can be configured to save complete records of all of the
data they have
recorded to permanent storage or to a server. Saving of only data associated
with events may not
be possible in these situations because some devices may not be able to
receive event trigger
messages. In these situations, saved data can be processed after the fact to
extract only the
relevant portions associated with one or more detected events. For example,
multiple mobile
devices might record video of a player or performer, and upload this video
continuously to a
server for storage. Separately the player or performer may be equipped with an
embedded
sensor that is able to detect events such as particular motions or actions.
Embedded sensor data
may be uploaded to the same server either continuously or at a later time.
Since all data,
including the video streams as well as the embedded sensor data, is generally
timestamped, video
associated with the events detected by the embedded sensor can be extracted
and combined on
the server.
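For illustration, a minimal Python sketch of such after-the-fact extraction follows; the video object attributes are hypothetical.

    # Illustrative sketch only: because every uploaded stream is timestamped
    # against a common basis, video overlapping a detected event's window
    # can be selected on the server without live trigger messages.
    def clips_for_event(event_start, event_stop, videos):
        """videos: objects with .start and .stop timestamps in seconds on
        the common time basis. Returns (video, clip_start, clip_stop)."""
        clips = []
        for v in videos:
            if v.stop >= event_start and v.start <= event_stop:
                clips.append((v, max(v.start, event_start),
                              min(v.stop, event_stop)))
        return clips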
[0053] Embodiments of the server or computer may be further configured while a
communication link is open between the at least one motion capture sensor and
the mobile
device to discard at least a portion of the video outside of the event start
time to the event stop time
and save the video from the event start time to the event stop time with the
motion analysis data
that occurs from the event start time to the event stop time. Alternatively,
if the communication
link is not open, embodiments of the computer may be further configured to
save video and after
the event is received after the communication link is open, then discard at
least a portion of the
video outside of the event start time to the event stop time and save the video
from the event start time
to the event stop time with the motion analysis data that occurs from the
event start time to the
event stop time. For example, in some embodiments of the invention, data may
be uploaded to a
server as described above, and the location and orientation data associated
with each device's
data stream may be used to extract data that is relevant to a detected event.
For example, a large
set of mobile devices may be used to record video at various locations
throughout a golf
tournament. This video data may be uploaded to a server either continuously or
after the
tournament. After the tournament, sensor data with event detections may also
be uploaded to the
same server. Post-processing of these various data streams can identify
particular video streams
that were recorded in the physical proximity of events that occurred and at
the same time.
Additional filters may select video streams where a camera was pointing in the
correct direction
to observe an event. These selected streams may be combined with the sensor
data to form an
aggregate data stream with multiple video angles showing an event.
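For illustration, a minimal Python sketch of such proximity and orientation filters follows; the 50 meter radius and 30 degree half-angle are arbitrary example thresholds, and the camera/event attributes are hypothetical.

    # Illustrative sketch only: keep a video stream if its camera was close
    # enough to the event and pointing roughly toward it at the event time.
    import math

    def camera_saw_event(cam, event, max_dist_m=50.0, max_angle_deg=30.0):
        dx, dy = event.x - cam.x, event.y - cam.y
        if math.hypot(dx, dy) > max_dist_m:
            return False                     # recorded too far away
        bearing_to_event = math.degrees(math.atan2(dy, dx))
        # Smallest absolute difference between two headings, in degrees.
        diff = abs((bearing_to_event - cam.heading_deg + 180.0) % 360.0
                   - 180.0)
        return diff <= max_angle_deg         # camera pointed at the event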
[0054] The system may obtain video from a camera coupled with the mobile
device, or any
camera that is separate from or otherwise remote from the mobile device. In
one or more
embodiments, the video is obtained from a server remote to the mobile device,
for example
obtained after a query for video at a location and time interval.
[0055] Embodiments of the server or computer may be configured to synchronize
said video
and said event data, or said motion analysis data via image analysis to more
accurately determine
a start event frame or stop event frame in said video or both, that is most
closely associated with
said event start time or said event stop time or both. In one or more
embodiments of the
invention, synchronization of clocks between recording devices may be
approximate. It may be
desirable to improve the accuracy of synchronizing data feeds from multiple
recording devices
based on the view of an event from each device. In one or more embodiments,
processing of
multiple data streams is used to observe signatures of events in the different
streams to assist
with fine-grained synchronization. For example, an embedded sensor may be
synchronized with
a mobile device including a video camera, but the time synchronization may be
accurate only to
within 100 milliseconds. If the video camera is recording video at 30 frames
per second, the
video frame corresponding to an event detection on the embedded sensor can
only be determined
to within 3 frames based on the synchronized timestamps alone. In one embodiment
of the device,
video frame image processing can be used to determine the precise frame
corresponding most
closely to the detected event. For instance, a shock from a snowboard hitting
the ground that is
detected by an inertial sensor may be correlated with the frame at which the
geometric boundary
of the snowboard makes contact with the ground. Other embodiments may use
other image
processing techniques or other methods of detecting event signatures to
improve synchronization
of multiple data feeds.
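For illustration, a minimal Python sketch of this two-stage synchronization follows; frame_signal() stands in for a hypothetical image-processing routine, such as the measured gap between the snowboard's boundary and the ground in a given frame.

    # Illustrative sketch only: coarse clock synchronization narrows the
    # search to a few candidate frames (e.g., 0.1 s * 30 fps = 3 frames
    # each way); an image-derived signature then picks the exact frame.
    def refine_event_frame(event_time_s, fps, sync_error_s, frame_signal):
        """event_time_s must already be expressed in the video's time basis.
        frame_signal(frame_index) returns a value minimized at the event,
        e.g., the board-to-ground gap for a landing impact."""
        center = round(event_time_s * fps)
        radius = int(round(sync_error_s * fps))
        candidates = range(center - radius, center + radius + 1)
        return min(candidates, key=frame_signal)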
[0056] Embodiments of the at least one motion capture element may include a
location
determination element configured to determine a location that is coupled with
the
microcontroller and wherein the microcontroller is configured to transmit the
location to the
computer on the mobile device. In one or more embodiments, the system further
includes a
server wherein the microcontroller is configured to transmit the location to
the server, either
directly or via the mobile device, and wherein the computer or server is
configured to form the
event video from portions of the video based on the location and the event
start time and the
event stop time. For example, in one or more embodiments, the event video may
be trimmed to
a particular length of the event, and transcoded to any video quality, and
overlaid or otherwise
integrated with motion analysis data or event data, e.g., velocity or
acceleration data in any
manner. Video may be stored locally in any resolution, depth, or image quality, or with any compression type, to maximize storage capacity or frame rate or to minimize storage, whether a communication link is open or not between the
mobile device, at least one motion capture sensor and/or server. In one or
more embodiments,
the velocity or other motion analysis data may be overlaid or otherwise
combined, e.g., on a
portion beneath the video, that includes the event start and stop time, that
may include any
number of seconds before and/or after the actual event to provide video of the
swing before a
ball strike event for example. In one or more embodiments, the at least one
motion capture
sensor and/or mobile device(s) may transmit events and video to a server
wherein the server may
determine that particular videos and sensor data occurred in a particular
location at a particular
time and construct event videos from several videos and several sensor events.
The sensor
events may be from one sensor or multiple sensors coupled with a user and/or
piece of
equipment for example. Thus the system may construct short videos that
correspond to the
events, which greatly decreases video storage requirements for example.
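For illustration, a minimal Python sketch of such trimming follows, delegating the cut to the widely available ffmpeg tool with stream copy so no re-encoding occurs; the paths and the padding values are hypothetical.

    # Illustrative sketch only: trim a stored video to the event window plus
    # configurable pre/post context. Times are relative to the start of the
    # source file (i.e., already mapped from the common time basis).
    import subprocess

    def trim_event_video(src, dst, event_start_s, event_stop_s,
                         pre_s=2.0, post_s=2.0):
        start = max(0.0, event_start_s - pre_s)
        stop = event_stop_s + post_s
        subprocess.run([
            "ffmpeg", "-i", src,
            "-ss", str(start), "-to", str(stop),
            "-c", "copy",                   # avoid re-encoding
            dst,
        ], check=True)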
[0057] In one or more embodiments, the microcontroller or the computer is
configured to
determine a location of the event or the microcontroller and the computer are
configured to
determine the location of the event and correlate the location, for example by correlating or averaging the locations to provide a central point of the event, so that erroneous location data from initializing GPS sensors may be minimized. In this manner, a group of
users with mobile
devices may generate videos of a golfer teeing off, wherein the event location
of the at least one
motion capture device may be utilized and wherein the server may obtain videos
from the
spectators and generate an event video of the swing and ball strike of the
professional golfer,
wherein the event video may utilize frames from different cameras to generate
a BULLET TIME® video from around the golfer as the golfer swings. The resulting video or
videos may be
trimmed to the duration of the event, e.g., from the event start time to the
event stop time and/or
with any pre or post predetermined time values around the event to ensure that
the entire event is
captured including any setup time and any follow through time for the swing or
other event.
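For illustration, a minimal Python sketch of such location correlation follows; the spread threshold is an arbitrary example value.

    # Illustrative sketch only: average the location reports for one event
    # from several devices, first discarding fixes far from the median,
    # since GPS receivers that are still initializing can report wildly
    # wrong positions.
    def central_event_location(fixes, max_spread_deg=0.001):
        """fixes: list of (lat, lon) tuples; returns a central (lat, lon)."""
        lats = sorted(lat for lat, _ in fixes)
        lons = sorted(lon for _, lon in fixes)
        med_lat, med_lon = lats[len(lats) // 2], lons[len(lons) // 2]
        good = [(la, lo) for la, lo in fixes
                if abs(la - med_lat) < max_spread_deg
                and abs(lo - med_lon) < max_spread_deg]
        if not good:
            return med_lat, med_lon          # fall back to the median fix
        return (sum(la for la, _ in good) / len(good),
                sum(lo for _, lo in good) / len(good))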
[0058] In one or more embodiments, the computer on the mobile device may
request at least one
image or video that contains the event from at least one camera proximal to
the event directly by
broadcasting a request for any videos taken in the area by any cameras,
optionally that may
include orientation information related to whether the camera was not only
located proximally to
the event, but also oriented or otherwise pointing at the event. In other
embodiments, the video
may be requested by the computer on the mobile device from a remote server. In
this scenario,
any location and/or time associated with an event may be utilized to return
images and/or video
near the event or taken at a time near the event, or both. In one or more
embodiments, the
computer or server may trim the video to correspond to the event duration and
again, may utilize
image processing techniques to further synchronize portions of an event, such
as a ball strike
with the corresponding frame in the video that matches the acceleration data
corresponding to
the ball strike on a piece of equipment for example.
[0059] Embodiments of the computer on the mobile device or on the server may
be configured
to display a list of one or more times at which an event has occurred or
wherein one or more
events have occurred. In this manner, a user may find events from a list
access the event videos
in rapid fashion.
[0060] Embodiments of the invention may include at least one motion capture
sensor that is
physically coupled with said mobile device. These embodiments enable any type
of mobile
phone or camera system with an integrated sensor, such as any type of helmet
mounted camera
or any mount that includes both a camera and a motion capture sensor to
generate event data and
video data.
BRIEF DESCRIPTION OF THE DRAWINGS
[0061] The above and other aspects, features and advantages of the ideas
conveyed through this
disclosure will be more apparent from the following more particular
description thereof,
presented in conjunction with the following drawings wherein:
[0062] Figure 1 illustrates an embodiment of the motion event recognition and
video
synchronization system.
[0063] Figure 1A illustrates a logical hardware block diagram of an embodiment
of the
computer.
[0064] Figure 1B illustrates an architectural view of an embodiment of the
database utilized in
embodiments of the system.
[0065] Figure 1C illustrates a flow chart for an embodiment of the processing
performed by
embodiments of the computers in the system as shown in Figures 1 and 1A.
[0066] Figure 1D illustrates a data flow diagram for an embodiment of the
system.
[0067] Figure 2A illustrates a helmet based mount that surrounds the head of a
user wherein the
helmet based mount holds a motion capture sensor. Figure 2B illustrates a neck
insert based
mount that enables retrofitting existing helmets with a motion capture sensor.
[0068] Figure 3 illustrates a close-up of the mount of Figures 2A-B showing
the isolator
between the motion capture sensor and external portion of the helmet.
[0069] Figure 4A illustrates a top cross sectional view of the helmet,
padding, cranium, and
brain of a user. Figure 4B illustrates a rotational concussion event for the
various elements
shown in Figure 4A.
[0070] Figure 5 illustrates the input force to the helmet, G1, versus the
observed force within the
brain and as observed by the sensor when mounted within the isolator.
[0071] Figure 6 illustrates the rotational acceleration values of the 3 axes
along with the total
rotational vector amount along with video of the concussion event as obtained
from a camera
and displayed with the motion event data.
[0072] Figure 7 illustrates a timeline display of a user along with peak and
minimum angular
speeds shown as events along the timeline. In addition, a
graph showing the
lead and lag of the golf club along with the droop and drift of the golf club
is shown in the
bottom display wherein these values determine how much the golf club shaft is
bending in two
axes as plotted against time.
[0073] Figure 8 illustrates a sub-event scrub timeline that enables inputs
near the start/stop
points in time associated with sub-events to be scrolled to, played to or
from, to easily enable
viewing of sub-events.
[0074] Figure 9 illustrates the relative locations along the timeline where
sub-events start and
stop and the gravity associated with the start and stop times, which enable
user inputs near those
points to gravitate to the start and stop times.
[0075] Figure 10 illustrates an embodiment that utilizes a mobile device as
the motion capture
element and another mobile device as the computer that receives the motion
event data and video
of the first user event.
[0076] Figure 11 illustrates an embodiment of the memory utilized to store
data related to a
potential event.
[0077] Figure 12 shows a flow chart of an embodiment of the functionality
specifically
programmed into the microcontroller to determine whether a prospective event
has occurred.
[0078] Figure 13 illustrates a typical event signature or template, which is
compared to motion
capture data to eliminate false positive events.
[0079] Figure 14 illustrates an embodiment of the motion capture element
configured with
optional LED visual indicator for local display and viewing of event related
information and an
optional LCD configured to display a text or encoded message associated with
the event.
[0080] Figure 15 illustrates an embodiment of templates characteristic of
motion events
associated with different types of equipment and/or instrumented clothing
along with areas in
which the motion capture sensor personality may change to more accurately or
more efficiently
capture data associated with a particular period of time and/or sub-event.
[0081] Figure 16 illustrates an embodiment of a protective mouthpiece in front
view and at the
bottom portion of the figure in top view, for example as worn in any contact
sport such as, but
not limited to soccer, boxing, football, wrestling or any other sport for
example.
[0082] Figure 17 illustrates an embodiment of the algorithm utilized by any
computer in Figure
1 that is configured to display motion images and motion capture data in a
combined format.
[0083] Figure 18 illustrates an embodiment of the synchronization architecture
that may be
utilized by one or more embodiments of the invention.
[0084] Figure 19 illustrates the detection of an event by one of the motion
capture sensors,
transmission of the event detection to other motion capture sensors and/or
cameras, saving of the
event motion data and trimming of the video to correspond to the event.
DETAILED DESCRIPTION OF THE INVENTION
[0085] A motion event recognition and video synchronization system and method
will now be
described. In the following exemplary description numerous specific details
are set forth in
order to provide a more thorough understanding of the ideas described
throughout this
specification. It will be apparent, however, to an artisan of ordinary skill
that embodiments of
ideas described herein may be practiced without incorporating all aspects of
the specific details
described herein. In other instances, specific aspects well known to those of
ordinary skill in the
art have not been described in detail so as not to obscure the disclosure.
Readers should note that
although examples of the innovative concepts are set forth throughout this
disclosure, the claims,
and the full scope of any equivalents, are what define the invention.
[0086] Figure 1 illustrates an embodiment of the motion event recognition and
video
synchronization system 100. Embodiments enable event based viewing and low
power
transmission of events and communication with an app executing on a mobile
device and/or with
external cameras to designate windows that define the events. The system enables recognition of motion events, and designation of events within images or videos, such as a shot,
move or swing of a
player, a concussion of a player, boxer, rider or driver, or a heat stroke,
hypothermia, seizure,
asthma attack, epileptic attack or any other sporting or physical motion
related event including
walking and falling. Events may be correlated with one or more images or video
as captured
from internal/external camera or cameras or nanny cam, for example to enable
saving video of
the event, such as the first steps of a child, violent shaking events,
sporting events including
concussions, or falling events associated with an elderly person. As shown,
embodiments of the
system generally include a mobile device 101 and applications that execute
thereon, that includes
computer 160, shown as located internally in mobile device 101 as a dotted
outline (i.e., also see the functional view of computer 160 in Figure 1A), display 120 coupled to computer
160 and a
wireless communications interface (generally internal to the mobile device,
see element 164 in
Figure 1A) coupled with the computer. Since mobile phones having mobile
computers are
ubiquitous, users of the system may purchase one or more motion capture
elements and an
application, a.k.a., "app", that they install on their pre-existing phone to
implement an
embodiment of the system. Motion capture capabilities are thus available at an
affordable price
for any user that already owns a mobile phone, tablet computer, music player,
etc., which has
never been possible before.
[0087] Each mobile device 101, 102, 102a, 102b may optionally include an
internal identifier
reader 190, for example an RFID reader, or may couple with an identifier
reader or RFID reader
(see mobile device 102) to obtain identifier 191. Alternatively, embodiments
of the invention
may utilize any wireless technology in any of the devices to communicate an
identifier that
identifies equipment 110 to the system. Embodiments of the invention may also
include any
other type of identifier coupled with the at least one motion capture sensor
or the user or the
piece of equipment. In one or more embodiments, the identifier may include a
team and jersey
number or student identifier number or license number or any other identifier
that enables
relatively unique identification of a particular event from a particular user
or piece of equipment.
This enables team sports or locations with multiple players or users to be
identified with respect
to the app that is configured to receive data associated with a particular
player or user. One or
more embodiments receive the identifier, for example a passive RFID identifier
or MAC address
or other serial number associated with the player or user and associate the
identifier with the
event data and motion analysis data.
[0088] The system generally includes at least one motion capture element 111
that couples with
user 150 or with piece of equipment 110, via mount 192, for example to a golf
club, or baseball
bat, tennis racquet, hockey stick, weapon, stick, sword, or any other piece of
equipment for any
sport, or other sporting equipment such as a shoe, belt, gloves, glasses, hat,
or any other item.
The at least one motion capture element 111 may be placed at one end, both
ends, or anywhere
between both ends of piece of equipment 110 or anywhere on user 150, e.g., on
a cap, headband,
helmet, mouthpiece or any combination thereof, and may also be utilized for EI
measurements of
any item. The motion capture element may optionally include a visual marker,
either passive or
active, and/or may include a wireless sensor, for example any sensor capable
of providing any
combination of one or more values associated with an orientation (North/South
and/or up/down),
position, velocity and/or acceleration of the motion capture element. The
computer may be
configured to obtain data associated with an identifier unique to each piece
of equipment 110,
e.g., clothing, bat, etc., for example from an RFID coupled with club 110,
i.e., identifier 191, and
optionally associated with the at least one motion capture element, either
visually or wirelessly,
analyze the data to form motion analysis data and display the motion analysis
data on display
120 of mobile device 101. Motion capture element 111 may be mounted on or near
the
equipment or on or near the user via motion capture mount 192. Motion capture
element 111
mounted on a helmet for example may include an isolator comprising a material
that is
configured to surround the motion capture element to approximate physical
acceleration
dampening of cerebrospinal fluid around the user's brain to minimize
translation of linear
acceleration and rotational acceleration of event data to obtain an observed
linear acceleration
and an observed rotational acceleration of the user's brain. This lowers
processing requirements
on the motion capture element microcontroller for example and enables low
memory utilization
and lower power requirements for event based transmission of event data. The
motion capture
data from motion capture element 111, any data associated with the piece of
equipment 110,
such as identifier 191 and any data associated with user 150, or any number of
such users 150,
such as second user 152 may be stored locally in memory, or in a database
local to the
computer or in a remote database, for example database 172 that
may be coupled
with a server. Data may be stored in database 172 from each user 150, 152 for
example when a
network or telephonic network link is available from motion capture element
111 to mobile
device 101 and from mobile device 101 to network 170 or Internet 171 and to
database 172.
Data mining is then performed on a large data set associated with any number
of users and their
specific characteristics and performance parameters. For example, in a golf
embodiment of the
invention, a club ID is obtained from the golf club and a shot is detected by
the motion capture
element. Mobile computer 101 stores images/video of the user and receives the
motion capture
data for the events/hits/shots/motion and the location of the event on the
course and subsequent
shots and determines any parameters for each event, such as distance or speed
at the time of the
event and then performs any local analysis and displays performance data on the
mobile device.
When a network connection from the mobile device to network 170 or Internet
171 is available
or for example after a round of golf, the images/video, motion capture data
and performance data
are uploaded to database 172, for later analysis and/or display and/or data
mining. In one or more
embodiments, users 151, such as original equipment manufacturers, pay for
access to the
database, for example via a computer such as computer 105 or mobile computer
101 or from any
other computer capable of communicating with database 172 for example via
network 170,
Internet 171 or via website 173 or a server that forms part of or is coupled
with database 172.
Data mining may execute on database 172, for example that may include a local
server
computer, or may be run on computer 105 or mobile device 101, 102, 102a or
102b and access a
standalone embodiment of database 172 for example. Data mining results may be
displayed on
mobile device 101, computer 105, television broadcast or web video originating
from camera
130, 130a and 130b, or 104 or accessed via website 173 or any combination
thereof.
[0089] One or more embodiments of the at least one motion capture element may
further
include a light emitting element configured to output light if the event
occurs. This may be
utilized to display a potential, mild or severe level of concussion on the
outer portion of the
helmet without any required communication to any external device for example.
Different
colors or flashing intervals may also be utilized to relay information related
to the event.
Alternatively, or in combination, the at least one motion capture element may
further include an
audio output element configured to output sound if the event occurs or if the
at least one motion
capture sensor is out of range of the computer or wherein the computer is
configured to display
an alert if the at least one motion capture sensor is out of range of the
computer, or any
combination thereof. Embodiments of the sensor may also utilize an LCD that
outputs a coded
analysis of the current event, for example in a Quick Response (QR) code or
bar code, so that a referee may obtain a snapshot of the analysis code on a
mobile device locally,
and so that the event is not viewed in a readable form on the sensor or
wirelessly transmitted and
intercepted by anyone else.
[0090] One or more embodiments of the system may utilize a mobile device that
includes at
least one camera 130, for example coupled to the computer within the mobile
device. This
allows for the computer within mobile device 101 to command the camera 130 to
obtain an
image or images, for example of the user during an athletic movement. The
image(s) of the user
may be overlaid with displays and ratings to make the motion analysis data
more understandable
to a human for example. Alternatively, detailed data displays without images
of the user may
also be displayed on display 120 or for example on the display of computer
105. In this manner
two-dimensional images and subsequent display thereof are enabled. If mobile
device 101
contains two cameras, as shown in mobile device 102, i.e., cameras 130a and
130b, then the
cameras may be utilized to create a three-dimensional data set through image
analysis of the
visual markers for example. This allows for distances and positions of visual
markers to be
ascertained and analyzed. Images and/or video from any camera in any
embodiments of the
invention may be stored on database 172, for example associated with user 150,
for data mining
purposes. In one or more embodiments of the invention, image analysis on the
images and/or
video may be performed to determine make/models of equipment, clothes, shoes,
etc., that are
utilized, for example per age of user 150 or time of day of play, or to
discover any other pattern
in the data.
[0091] Alternatively, for embodiments of mobile devices that have only one
camera, multiple
mobile devices may be utilized to obtain two-dimensional data in the form of
images that is
triangulated to determine the positions of visual markers. In one or more
embodiments of the
system, mobile device 101 and mobile device 102a share image data of user 150
to create three-
dimensional motion analysis data. By determining the positions of mobile
devices 101 and 102a
(via position determination elements such as GPS chips in the devices as is
common, or via cell
tower triangulation and which are not shown for brevity but are generally
located internally in
mobile devices just as computer 160 is), and by obtaining data from motion
capture element 111
for example locations of pixels in the images where the visual markers are in
each image,
distances and hence speeds are readily obtained as one skilled in the art will
recognize.
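For illustration, a minimal two-dimensional Python sketch of this triangulation follows; a real system would work in three dimensions and derive each bearing from the marker's pixel position and the camera's field of view, which are assumed here.

    # Illustrative sketch only: with two camera positions known (e.g., from
    # GPS) and the bearing from each camera to a visual marker, the marker's
    # position is the intersection of the two bearing rays.
    import math

    def intersect_bearings(p1, bearing1_deg, p2, bearing2_deg):
        """p1, p2: (x, y) camera positions; bearings measured in degrees
        from the +x axis. Returns the (x, y) intersection, or None if the
        rays are parallel."""
        d1 = (math.cos(math.radians(bearing1_deg)),
              math.sin(math.radians(bearing1_deg)))
        d2 = (math.cos(math.radians(bearing2_deg)),
              math.sin(math.radians(bearing2_deg)))
        denom = d1[0] * d2[1] - d1[1] * d2[0]
        if abs(denom) < 1e-9:
            return None
        t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
        return (p1[0] + t * d1[0], p1[1] + t * d1[1])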
[0092] Camera 103 may also be utilized either for still images or as is now
common, for video.
In embodiments of the system that utilize external cameras, any method of
obtaining data from
the external camera is in keeping with the spirit of the system including
wireless communication
of the data, or via wired communication as when camera 103 is docked with
computer 105 for
example, which then may transfer the data to mobile device 101.
[0093] In one or more embodiments of the system, the mobile device on which
the motion
analysis data is displayed is not required to have a camera, i.e., mobile
device 102b may display
data even though it is not configured with a camera. As such, mobile device
102b may obtain
images from any combination of cameras on mobile device 101, 102, 102a, camera
103 and/or
television camera 104 so long as any external camera may communicate images to
mobile device
102b. Alternatively, no camera is required at all to utilize the system. See
also Figure 17.
[0094] For television broadcasts, motion capture element 111 wirelessly
transmits data that is
received by antenna 106. The wireless sensor data thus obtained from motion
capture element
111 is combined with the images obtained from television camera 104 to produce
displays with
augmented motion analysis data that can be broadcast to televisions, computers
such as computer
105, mobile devices 101, 102, 102a, 102b or any other device configured to
display images. The
motion analysis data can be positioned on display 120 for example by knowing
the location of a
camera (for example via GPS information), and by knowing the direction and/or
orientation that
the camera is pointing so long as the sensor data includes location data (for
example GPS
information). In other embodiments, visual markers or image processing may be
utilized to lock
the motion analysis data to the image, e.g., the golf club head can be tracked
in the images and
the corresponding high, middle and low position of the club can be utilized to
determine the
orientation of user 150 to camera 130 or 104 or 103 for example to correctly
plot the augmented
data onto the image of user 150. By time stamping images and time stamping
motion capture
data, for example after synchronizing the timer in the microcontroller with
the timer on the
mobile device and then scanning the images for visual markers or sporting
equipment at various
positions, simplified motion capture data may be overlaid onto the images. Any
other method of
combining images from a camera and motion capture data may be utilized in one
or more
embodiments of the invention. Any other algorithm for properly positioning the
motion analysis
data on display 120 with respect to a user (or any other display such as on
computer 105) may be
utilized in keeping with the spirit of the system. For example, when obtaining
events or groups
of events via the sensor, after the app receives the events and/or time ranges
to obtain images,
the app may request image data from that time span from its local memory, any
other mobile
device, any other type of camera that may be communicated with and/or post
event
locations/times so that external camera systems local to the event(s) may
provide image data for
the times of the event(s).
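A minimal sketch of the timer synchronization and event-window lookup described in this paragraph, assuming the sensor and mobile device share a common sync instant and that the video's start time and frame rate are known; all names are hypothetical:

```python
def clock_offset(sensor_time_at_sync, device_time_at_sync):
    """Offset that converts sensor timestamps into the mobile device's time base."""
    return device_time_at_sync - sensor_time_at_sync

def frames_for_event(event_start, event_stop, offset, video_start, fps):
    """Map a sensor-time event window onto a range of video frame indices."""
    first = int((event_start + offset - video_start) * fps)
    last = int((event_stop + offset - video_start) * fps)
    return max(0, first), max(0, last) + 1  # half-open frame range
```

Frames in this range can then be scanned for visual markers or sporting equipment, and the motion capture data overlaid on the matching frames.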
[0095] One such display that may be generated and displayed on mobile device 101 includes a BULLET TIME® view using two or more cameras selected from mobile devices 101,
102,
102a, camera 103, and/or television camera 104 or any other external camera.
In this
embodiment of the system, the computer is configured to obtain two or more
images of user 150
and data associated with the at least one motion capture element (whether a
visual marker or
wireless sensor), wherein the two or more images are obtained from two or more
cameras and
wherein the computer is configured to generate a display that shows slow
motion of user 150
from around the user at various angles at normal speed. Such an
embodiment for
example allows a group of fans to create their own BULLET TIME® shot of a
golf pro at a
tournament for example. The shots may be sent to computer 105 and any image
processing
required may be performed on computer 105 and broadcast to a television
audience for example.
In other embodiments of the system, the users of the various mobile devices
share their own set
of images, and/or upload their shots to a website for later viewing, for
example. Embodiments of
the invention also allow images or videos from other players having mobile
devices to be
utilized on a mobile device related to another user so that users don't have
to switch mobile
phones for example. In one embodiment, a video obtained with a second user's camera-equipped mobile phone, of a piece of equipment in motion that is associated with a first user rather than with the second user, may automatically be transferred to the first user for display with motion capture data associated with the first user. Alternatively, the first user's mobile phone
may be utilized as a
motion sensor in place of or in addition to motion capture element 111 and the
second user's
mobile phone may be utilized to capture video of the first user while in
motion. The first user
may optionally gesture on the phone, tap/shake, etc., to indicate that the
second mobile phone
should start/stop motion capture for example.
[0096] Figure 1A shows an embodiment of computer 160. Computer 160 includes
processor
161 that executes software modules, commonly also known as applications,
generally stored as
computer program instructions within main memory 162. Display interface 163
drives display
120 of mobile device 101 as shown in Figure 1. Optional orientation/position
module 167 may
include a North/South or up/down orientation chip or both. In one or more
embodiments, the
orientation/position module may include a location determination element
coupled with the
microcontroller. This may include a GPS device for example. Alternatively, or
in combination,
the computer may triangulate the location in concert with another computer, or
obtain the
location from any other triangulation type of receiver, or calculate the
location based on images
captured via a camera coupled with the computer and known to be oriented in a
particular
direction, wherein the computer calculates an offset from the mobile device
based on the
direction and size of objects within the image for example. An optional temperature sensor may be coupled with processor 161 via a wired or wireless link and may be utilized
for example as an
indicator of hypothermia or heat stroke alone or in combination with any
motion detected that
may be indicative of shaking or unconsciousness for example. Communication
interface 164
may include wireless or wired communications hardware protocol chips and/or an
RFID reader, or an RFID reader may couple to computer 160 externally or in any other manner
for example.
In one or more embodiments of the system, communication interface 164 may include telephonic and/or data communications hardware. In one or more embodiments, communication
interface
164 may include a Wi-Fi™ or other IEEE 802.11 device and/or a BLUETOOTH® wireless communications interface or a ZigBee® wireless device or any other wireless technology. BLUETOOTH® class 1 devices have a range of approximately 100 meters; class 2 devices have a range of approximately 10 meters. BLUETOOTH® Low Power devices have a range of
approximately 50 meters. Any wireless network protocol or type may be utilized
in
embodiments of the system so long as mobile device 101 and motion capture
element 111 can
communicate with one another. Processor 161, main memory 162, display
interface 163,
communication interface 164 and orientation/position module 167 may
communicate with one
another over communication infrastructure 165, which is commonly known as a "bus".
Communications path 166 may include a wired or wireless medium that allows for
communication
with other wired or wireless devices over network 170. Network 170 may
communicate with
Internet 171 and/or database 172. Database 172 may be utilized to save or
retrieve images or
videos of users, or motion analysis data, or users displayed with motion
analysis data in one form
or another. The data uploaded to the Internet, i.e., a remote database or
remote server or memory
remote to the system may be viewed, analyzed or data mined by any computer
that may obtain
access to the data. This allows for original equipment manufacturers to
determine for a given
user what sporting equipment is working best and/or what equipment to suggest.
Data mining
also enables the planning of golf courses based on the data and/or metadata
associated with
users, such as age, or any other demographics that may be entered into the
system. Remote
storage of data also enables medical applications such as morphological
analysis, range of
motion over time, and diabetes prevention and exercise monitoring and
compliance applications.
Data mining based applications also allow for games that use real motion
capture data from other
users, one or more previous performances of the same user, or historical
players whether alive or
dead after analyzing motion pictures or videos of the historical players for
example. Virtual
reality and augmented virtual reality applications may also utilize the motion
capture data or
historical motion data. The system also enables uploading of performance
related events and/or
motion capture data to database 172, which for example may be implemented as a
social
networking site. This allows for the user to "tweet" high scores, or other
metrics during or after
play to notify everyone on the Internet of the new event. For example, one or
more
embodiments include at least one motion capture element 111 configured to
couple with a user
or piece of equipment or mobile device coupled with the user, wherein the at
least one motion
capture element includes a memory, a sensor configured to capture any
combination of values
associated with an orientation, position, velocity, acceleration of the at
least one motion capture
element, a radio, and a microcontroller coupled with the memory, the sensor
and the radio. The
microcontroller is configured to collect data that includes sensor values from
the sensor, store the
data in the memory, analyze the data and recognize an event within the data to
determine event
data and transmit the event data associated with the event via the radio.
Embodiments of the
system may also include an application configured to execute on a mobile
device wherein the
mobile device includes a computer, a wireless communication interface
configured to
communicate with the radio to obtain the event data associated with the event.
The computer is
coupled with the wireless communication interface, wherein the computer executes
the application or
"app" to configure the computer to receive the event data from the wireless
communication
interface, analyze the event data to form motion analysis data, store the
event data, or the motion
analysis data, or both the event data and the motion analysis data, and
display information
comprising the event data, or the motion analysis data, or both associated
with the at least one
user on a display.
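The collect/store/analyze/transmit behavior of the microcontroller described above might look like the following sketch (shown in Python for readability; real firmware would be equivalent code on the microcontroller, and the threshold value is an assumption):

```python
ACCEL_EVENT_THRESHOLD_G = 8.0  # assumed threshold; tuned per activity in practice

def sensor_loop(read_accel, store, transmit):
    """Collect samples, keep them in local memory, and transmit only recognized events."""
    while True:
        sample = read_accel()            # (t, ax, ay, az) in g
        store(sample)                    # e.g., into a ring buffer (see Figure 11)
        t, ax, ay, az = sample
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        if magnitude > ACCEL_EVENT_THRESHOLD_G:
            transmit({"time": t, "peak_g": magnitude})  # event data only, not the raw stream
```

Transmitting only event data, rather than a continuous raw stream, is what yields the low power and memory utilization discussed throughout.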
[0097] Figure 1B illustrates an architectural view of an embodiment of
database 172 utilized in
embodiments of the system. As shown tables 180-186 include information related
to N number
of users, M pieces of equipment per user, P number of sensors per user or
equipment, S number
of sensor data per sensor, T number of patterns found in the other tables, D
number of data users
and V videos. All tables shown in Figure 1B are exemplary and may include more
or less
information as desired for the particular implementation. Specifically, table
180 includes
information related to user 150 which may include data related to the user
such as age, height,
weight, sex, address or any other data. Table 181 includes information related
to M number of
pieces of equipment 110, which may include clubs, racquets, bats, shirts,
pants, shoes, gloves,
helmets, etc., for example the manufacturer of the equipment, model of the
equipment, and type
of the equipment. For example, in a golf embodiment, the manufacturer may be
the name of the
manufacturer, the model may be a name or model number and the type may be the
club number,
i.e., 9 iron, the equipment ID may be identifier 191 in one or more
embodiments of the
invention. Table 182 may include information related to P number of sensors
111 on user 150 or
equipment 110 or mobile computer 101. The sensors associated with user 150 may
include
clothing, clubs, helmets, caps, headbands, mouthpieces, etc., the sensors
associated with
equipment 110 may for example be motion capture data sensors, while the
sensors associated
with mobile computer 101 may include sensors 167 for position/orientation and
sensors 130 for
images/video for example. Table 183 may include information related to S
number of sensor
data per user per equipment, wherein the table may include the time and
location of the sensor
data, or any other metadata related to the sensor data such as temperature,
weather, humidity, as
obtained locally via the temperature sensor shown in Figure 1A, or via
wireless communications
or in any other manner for example, or the sensor data may include this
information or any
combination thereof. The table may also contain a myriad of other fields, such
as ball type, i.e.,
in a golf embodiment the type of golf ball utilized may be saved and later
data mined for the best
performing ball types, etc. This table may also include an event type as
calculated locally, for
example a potential concussion event. Table 184 may include information
related to T number
of patterns that have been found in the data mining process for example. This
may include fields
that have been searched in the various tables with a particular query and any
resulting related
results. Any data mining results table type may be utilized in one or more
embodiments of the
invention as desired for the particular implementation. This may include
search results of any
kind, including EI measurements, which also may be calculated on computer 160
locally, or any
other search value from simple queries to complex pattern searches. Table 185
may include
information related to D number of data mining users 151 and may include their
access type, i.e.,
full database or pattern table, or limited to a particular manufacturer, etc. The table may also
include payment requirements and/or receipts for the type of usage that the
data mining user has
paid for or agreed to pay for and any searches or suggestions related to any
queries or patterns
found for example. Any other schema, including object oriented database
relationships or
memory based data structures that allow for data mining of sensor data
including motion capture
data is in keeping with the spirit of the invention. Although exemplary
embodiments for
particular activities are given, one skilled in the art will appreciate that
any type of motion based
activity may be captured and analyzed by embodiments of the system using a
motion capture
element and app that runs on a user's existing cell phone 101, 102 or other
computer 105 for
example. Embodiments of the database may include V number of videos 179 as
held in table
186 for example that include the user that generated the video, the video
data, time and location
of the video. The fields are optional and in one or more embodiments, the
videos may be stored
on any of the mobile devices in the system or any combination of the mobile
devices and
server/DB 172. In one or more embodiments, the videos may be broken into a
subset of videos
that are associated with the "time" field of the sensor data table 183,
wherein the time field may
include an event start time and event stop time. In this scenario, large
videos may be trimmed
into one or more smaller event videos that correspond to generally smaller
time windows
associated with events of the event type held in table 183 to greatly reduce
video storage
requirements of the system.
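A minimal sketch of the event-based trimming described above, assuming each event carries start/stop times in the same time base as the video; the padding values and names are assumptions:

```python
def trim_to_events(video_start, video_stop, events, pre_s=2.0, post_s=3.0):
    """Return (start, stop) windows within a long video, one per event."""
    clips = []
    for ev in events:  # each ev has "start" and "stop" from the sensor data table
        start = max(video_start, ev["start"] - pre_s)
        stop = min(video_stop, ev["stop"] + post_s)
        if start < stop:
            clips.append((start, stop))
    return clips
```

Keeping only these windows, rather than the full recording, is what reduces video storage as described.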
[0098] There are a myriad of applications that benefit from and are enabled by embodiments of
the system that provide for viewing and analyzing motion capture data on the
mobile computer
or server/database, for example for data mining database 172 by users 151. For
example, users
151 may include compliance monitors, including for example parents, children
or elderly,
managers, doctors, insurance companies, police, military, or any other entity
such as equipment
manufacturers that may data mine for product improvement. For example in a
tennis
embodiment by searching for top service speeds for users of a particular size
or age, or in a golf
embodiment by searching for distances, i.e., differences in sequential
locations in table 183
based on swing speed in the sensor data field in table 183 to determine which
manufacturers
have the best clubs, or best clubs per age or height or weight per user, or a
myriad of other
patterns. Other embodiments related to compliance enable messages to be generated from mobile computer 101 or from the server/database when thresholds for G-forces (high, zero, or any other levels) are crossed, and sent to compliance monitors, managers, doctors, insurance companies, etc., as
previously described. Users 151 may include marketing personnel that determine
which pieces
of equipment certain users own and which related items that other similar
users may own, in
order to target sales at particular users. Users 151 may include medical
personnel that may
determine how much a sensor, for example one coupled with a shoe, i.e., a type of equipment, of a diabetic child has moved and how this movement relates to that of the average
non-diabetic child, wherein suggestions as per table 185 may include giving
incentives to the
diabetic child to exercise more, etc., to bring the child in line with healthy
children. Sports
physicians, physiologists or physical therapists may utilize the data per
user, or search over a
large number of users and compare a particular movement of a user or range of
motion for
example to other users to determine what areas a given user can improve on
through stretching
or exercise and which range of motion areas change over time per user or per
population and for
example what type of equipment a user may utilize to account for changes over
time, even
before those changes take place. Data mining motion capture data and image
data related to
motion provides unique advantages to users 151. Data mining may be performed
on flex
parameters measured by the sensors to determine if sporting equipment, shoes,
human body parts
or any other item changes in flexibility over time or between equipment
manufacturers or any
combination thereof.
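For illustration only, the golf-distance query sketched above could be expressed as follows, assuming rows joined from the equipment and sensor data tables; the field names are hypothetical:

```python
from collections import defaultdict

def best_clubs_by_distance(rows, speed_lo, speed_hi):
    """Average shot distance per club make/model for users in a swing-speed band."""
    distances = defaultdict(list)
    for row in rows:  # e.g., joined from tables 181 and 183
        if speed_lo <= row["swing_speed"] <= speed_hi:
            distances[(row["manufacturer"], row["model"])].append(row["distance"])
    ranked = [(sum(d) / len(d), club) for club, d in distances.items()]
    return sorted(ranked, reverse=True)  # best performing clubs first
```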
[0099] To ensure that analysis of user 150 during a motion capture includes
images that are
relatively associated with the horizon, i.e., not tilted, the system may
include an orientation
module that executes on computer 160 within mobile device 101 for example. The
computer is
configured to prompt a user to align the camera along a horizontal plane based
on orientation
data obtained from orientation hardware within mobile device 101. Orientation
hardware is
common on mobile devices as one skilled in the art will appreciate. This
allows the image so
captured to remain relatively level with respect to the horizontal plane. The
orientation module
may also prompt the user to move the camera toward or away from the user, or
zoom in or out to
the user to place the user within a graphical "fit box", to somewhat normalize
the size of the user
to be captured. Images may also be utilized by users to prove that they have
complied with
doctors' orders, for example to meet certain motion requirements.
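As a sketch of the leveling prompt described above, assuming the device reports a roll angle from its orientation hardware; the names and tolerance are assumptions:

```python
LEVEL_TOLERANCE_DEG = 2.0  # assumed tolerance

def prompt_until_level(read_roll_deg, show_message):
    """Prompt the user to tilt the device until the camera is level with the horizon."""
    while True:
        roll = read_roll_deg()  # from the device's orientation hardware
        if abs(roll) <= LEVEL_TOLERANCE_DEG:
            show_message("Level - hold steady")
            return
        direction = "left" if roll > 0 else "right"
        show_message(f"Tilt {direction} {abs(roll):.0f} degrees")
```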
[00100] Embodiments of the system are further configured to recognize the at
least one motion
capture element associated with user 150 or piece of equipment 110 and
associate at least one
motion capture element 111 with assigned locations on user 150 or piece of
equipment 110. For
example, the user can shake a particular motion capture element when prompted
by the computer
within mobile device 101 to acknowledge which motion capture element the
computer is
requesting an identity for. Alternatively, motion sensor data may be analyzed
for position and/or
speed and/or acceleration when performing a known activity and automatically
classified as to
the location of mounting of the motion capture element automatically, or by
prompting the user
to acknowledge the assumed positions. Sensors may be associated with a
particular player by
team name and jersey number for example and stored in the memory of the motion
capture
sensor for transmission of events. Any computer shown in Figure 1 may be
utilized to program
the identifier associated with the particular motion capture sensor in keeping
with the spirit of
the invention.
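The shake-to-identify step might be sketched as below, assuming each candidate sensor can report its recent peak acceleration; the threshold, window and sensor interface are assumptions:

```python
import time

SHAKE_THRESHOLD_G = 3.0  # assumed shake threshold

def identify_shaken_sensor(sensors, window_s=5.0):
    """Return the ID of whichever sensor the user shakes while prompted."""
    deadline = time.time() + window_s
    while time.time() < deadline:
        for sensor in sensors:
            if sensor.peak_accel_g() > SHAKE_THRESHOLD_G:  # hypothetical interface
                return sensor.id
    return None  # no sensor was shaken within the window
```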
[00101] In one or more embodiments, the computer in mobile device 101 is configured to obtain
at least one image of user 150 and display a three-dimensional overlay onto
the at least one
image of user 150 wherein the three-dimensional overlay is associated with the
motion analysis
data. Various displays may be displayed on display 120. The display of motion
analysis data
may include a rating associated with the motion analysis data, and/or a
display of a calculated
ball flight path associated with the motion analysis data and/or a display of
a time line showing
points in time along a time axis where peak values associated with the motion
analysis data
occur and/or a suggested training regimen to aid the user in improving the mechanics
of the user.
These filtered or analyzed data sensor results may be stored in database 172,
for example in table
183, or the raw data may be analyzed on the database (or server associated
with the database or
in any other computer or combination thereof in the system shown in Figure 1
for example), and
then displayed on mobile computer 101 or on website 173, or via a television
broadcast from
camera 104 for example. Data mining results may be combined in any manner with
the unique
displays of the system and shown in any desired manner as well.
[00102] Embodiments of the system may also present an interface to enable user
150 to
purchase piece of equipment 110 over the wireless interface of mobile device
101, for example
via the Internet, or via computer 105 which may be implemented as a server of
a vendor. In
addition, for custom fitting equipment, such as putter shaft lengths, or any
other custom sizing of
any type of equipment, embodiments of the system may present an interface to
enable user 150
to order a custom fitted piece of equipment over the wireless interface of
mobile device 101.
Embodiments of the invention also enable mobile device 101 to suggest better
performing
equipment to user 150 or to allow user 150 to search for better performing
equipment as
determined by data mining of database 172 for distances of golf shots per club
for users with
swing velocities within a predefined range of user 150. This allows for real
life performance
data to be mined and utilized for example by users 151, such as OEMs to
suggest equipment to
user 150, and be charged for doing so, for example by paying for access to
data mining results as
displayed in any computer shown in Figure 1 or via website 173 for example. In
one or more
embodiments of the invention database 172 keeps track of OEM data mining and
is configured
to bill users 151 for the amount of access each of users 151 has purchased
and/or used for
example over a given billing period. See Figure 1B for example.
[00103] Embodiments of the system are configured to analyze the data obtained
from at least
one motion capture element and determine how centered a collision between a
ball and the piece
of equipment is based on oscillations of the at least one motion capture
element coupled with the
piece of equipment and display an impact location based on the motion analysis
data. This
performance data may also be stored in database 172 and used by OEMs or
coaches for example
to suggest clubs with higher probability of a centered hit as data mined over
a large number of
collisions for example.
[00104] While Figure 1A depicts a physical device, the scope of the systems
and methods set
forth herein may also encompass a virtual device, virtual machine or simulator
embodied in one
or more computer programs executing on a computer or computer system and
acting or
providing a computer system environment compatible with the methods and
processes
implementing the disclosed ideas. Where a virtual machine, process, device or
otherwise
performs substantially similarly to that of a physical computer system of the
system, such a
virtual platform will also fall within the scope of a system of the
disclosure, notwithstanding the
description herein of a physical system such as that in Figure 1A.
[00105] Figure 1C illustrates a flow chart for an embodiment of the processing
performed and
enabled by embodiments of the computers utilized in the system. In one or more
embodiments
of the system, a plurality of motion capture elements are optionally
calibrated at 301. In some
embodiments this means calibrating multiple sensors on a user or piece of
equipment to ensure
that the sensors are aligned and/or set up with the same speed or acceleration
values for a given
input motion. In other embodiments of the invention, this means placing
multiple motion
capture sensors on a calibration object that moves and calibrates the
orientation, position, speed,
acceleration, or any combination thereof at the same time. This step generally
includes providing
motion capture elements and optional mount (or alternatively allowing a mobile
device with
motion capture sensing capabilities to be utilized), and an app for example
that allows a user
with an existing mobile phone or computer to utilize embodiments of the system
to obtain
motion capture data, and potentially analyze and/or send messages based
thereon. In one or
more embodiments, users may simply purchase a motion capture element and an
app and begin
immediately using the system. The system captures motion data with motion
capture element(s)
at 302, recognizes any events within the motion capture data, i.e., a linear
and/or rotational
acceleration over a threshold indicative of a concussion for example at 303,
and sends the motion
capture data to a mobile computer 101, 102 or 105 for example, which may
include an IPOD®, ITOUCH®, IPAD®, IPHONE®, ANDROID® phone or any other type of computer that a
user
may utilize to locally collect data at 304. In one or more embodiments the
sensor may transmit
an event to any other motion capture sensor to start an event data storage
process on the other
sensors for example. In other embodiments, the sensor may transmit the event
to other mobile
devices to signify that videos for the event should be saved with unneeded
portions of the video
discarded for example, to enable the video to be trimmed either near the point
in time of the
event or at a later time. In one or more embodiments, the system minimizes the
complexity of
the sensor and offloads processing to extremely capable computing elements
found in existing
mobile phones and other electronic devices for example. The transmitting of
data from the
motion capture elements to the user's computer may happen when possible,
periodically, on an
event basis, when polled, or in any other manner as will be described in
various sections herein.
This saves a great amount of power compared to known systems that continuously
send raw data
in two ways, first data may be sent in event packets, within a time window
around a particular
motion event which greatly reduces the data to a meaningful small subset of
total raw data, and
secondly the data may be sent less than continuously, or at defined times, or
when asked for data
so as to limit the total number of transmissions. In one or more embodiments,
the event may be displayed locally, for example with an LED flashing on the motion capture sensor 111, for example yellow slow flashing for a potential concussion or red fast flashing for a probable concussion at 305. Alternatively, or in combination, the alert or event may be
transmitted and
displayed on any other computer or mobile device shown in Figure 1 for
example.
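The yellow/red LED classification at 305 might be sketched as follows; the numeric thresholds below are placeholders for illustration only, not clinically validated values:

```python
# Placeholder thresholds for illustration only; real values must be validated.
POTENTIAL_G = 60.0
PROBABLE_G = 90.0
ROTATIONAL_THRESHOLD = 4500.0  # rad/s^2, also a placeholder

def classify_head_impact(linear_g, rotational_rad_s2):
    """Map linear/rotational acceleration to the LED alert level at step 305."""
    if linear_g >= PROBABLE_G or rotational_rad_s2 >= 1.5 * ROTATIONAL_THRESHOLD:
        return "red_fast_flash"      # probable concussion
    if linear_g >= POTENTIAL_G or rotational_rad_s2 >= ROTATIONAL_THRESHOLD:
        return "yellow_slow_flash"   # potential concussion
    return "none"
```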
[00106] The main intelligence in the system is generally in the mobile
computer or server where
more processing power may be utilized, so as to take advantage of the
communications
capabilities that are ubiquitous in existing mobile computers for example. In
one or more
embodiments of the system, the mobile computer may optionally obtain an
identifier from the
user or equipment at 306, or this identifier may be transmitted as part of
step 305, such as a
passive RFID or active RFID or other identifier such as a team/jersey number
or other player ID,
which may be utilized by the mobile computer to determine what user has just
been potentially
injured, or what weight a user is lifting, or what shoes a user is running with, or what weapon a user is using, or what type of activity a user is engaged in, based on the identifier of the equipment.
The mobile computer may analyze the motion capture data locally at 307 and
display, i.e., show
or send information such as a message for example when a threshold is observed
in the data, for
example when too many G-forces have been registered by a player, soldier or
race car driver, or
when not enough motion is occurring (either at the time or based on the
patterns of data in the
database as discussed below based on the user's typical motion patterns or
other users' motion
patterns for example.) In other embodiments, once a user has performed a
certain amount of
motion, a message may be sent to safety or compliance monitor(s) at 307 to
store or otherwise
display the data, including for example referees, parents, children or
elderly, managers, doctors,
insurance companies, police, military, or any other entity such as equipment
manufacturers. The
message may be an SMS message, or email, or tweet or any other type of
electronic
communication. If the particular embodiment is configured for remote analysis
or only remote
analysis, then the motion capture data may be sent to the server/database at
308. If the
implementation does not utilize a remote database, the analysis on the mobile
computer is local.
If the implementation includes a remote database, then the analysis may be
performed on the
mobile computer or server/database or both at 309. Once the database obtains
the motion
capture data, then the data may be analyzed and a message may be sent from the
server/database
to compliance personnel or business entities as desired to display the event
alone or in
combination or with respect to previous event data associated with the user or
other users at 310.
[00107] Embodiments of the invention make use of the data from the mobile
computer and/or
server for gaming, morphological comparing, compliance, tracking calories
burned, work
performed, monitoring of children or elderly based on motion or previous
motion patterns that
vary during the day and night, safety monitoring for players, troops when G-
forces exceed a
threshold or motion stops, local use of running, jumping or throwing motion
capture data for
example on a cell phone including virtual reality applications that make use
of the user's current
and/or previous data or data from other users, or play music or select a play
list based on the type
of motion a user is performing or data mining. For example if motion is
similar to a known
player in the database, then that user's playlist may be sent to the user's
mobile computer 101.
The processing may be performed locally so if the motion is fast, fast music
is played and if the
motion is slow, then slow music may be played. Any other algorithm for playing
music based
on the motion of the user is in keeping with the spirit of the invention. Any
use of motion
capture data obtained from a motion capture element and app on an existing
user's mobile
computer is in keeping with the spirit of the invention, including using the
motion data in virtual
reality environments to show relative motion of an avatar of another player
using actual motion
data from the user in a previous performance or from another user including a
historical player
for example. Display of information is generally performed via three
scenarios, wherein display
information is based on the user's motion analysis data or related to the
user's piece of
equipment and previous data, wherein previous data may be from the same
user/equipment or
one or more other users/equipment. Under this scenario, a comparison of the
current motion
analysis data with previous data associated with this user/equipment allows
for patterns to be
analyzed with an extremely cost effective system having a motion capture
sensor and app.
Under another scenario, the display of information is a function of the
current user's
performance, so that the previous data selected from the user or another
user/equipment is based
on the current user's performance. This enables highly realistic game play,
for example a virtual
tennis game against a historical player wherein the swings of a user are
effectively responded to
by the captured motion from a historical player. This type of realistic game
play with actual data, both current and previously stored, for example a user playing against an average pattern of a top 10 player in tennis, i.e., the speed of serves and the speed and angle of return shots for a given input shot of a user, makes for game play that is as realistic as possible.
Television images
may be for example analyzed to determine swing speeds and types of shots taken
by historical
players that may no longer be alive to test one's skills against a master, as
if the master was still
alive and currently playing the user. Compliance and monitoring by the user or
a different user
may be performed in a third scenario without comparison to the user's previous
or other users' previous data, wherein the different user does not have access to or does not own, for example, the mobile
computer. In other words, the mobile phone is associated with the user being
monitored and the
different user is obtaining information related to the current performance of
a user for example
wearing a motion capture element, such as a baby, or a diabetes patient.
[00108] Figure 1D illustrates a data flow diagram for an embodiment of the
system. As shown
motion capture data is sent from a variety of motion capture elements 111 on
many different
types of equipment 110 or associated with user 150, for example on clothing, a
helmet,
headband, cap, mouthpiece or anywhere else coupled with the user. The
equipment or user may
optionally have an identifier 191 that enables the system to associate a value
with the motion,
i.e., the weight being lifted, the type of racquet being used, the type of
electronic device being
used, i.e., a game controller or other object such as baby pajamas associated
with second user
152, e.g., a baby. In one or more embodiments, elements 191 in the figure may
be replaced or
augmented with motion capture elements 111 as one skilled in the art will
appreciate. In one or
more embodiments of the system, mobile computer 101 receives the motion
capture data, for
example in event form and for example on an event basis or when requested by
mobile computer
101, e.g., after motion capture element 111 declares that there is data and turns on a receiver for a fixed amount of time to field requests so as to not waste power, and if no requests are received, then turns the receiver off for a period of time. Once the data is in mobile
computer 101, then the
data is analyzed, for example to take raw or event based motion capture data
and for example
determine items such as average speed, etc., that are more humanly
understandable in a concise
manner. The data may be stored, shown to the right of mobile computer 101 and
then the data
may be displayed to user 150, or 151, for example in the form of a monitor or
compliance text or
email or on a display associated with mobile computer 101 or computer 105.
This enables users
not associated with the motion capture element and optionally not even the
mobile computer
potentially to obtain monitor messages, for example saying that the baby is
breathing slowly, or
for example to watch a virtual reality match or performance, which may include
a user supplying
motion capture data currently, a user having previously stored data or a
historical player, such as
a famous golfer, etc., after analysis of motion in video from past tournament
performance(s). In
gaming scenarios, where the data is obtained currently, for example from user 150 or equipment 110, the display of data, for example on virtual reality glasses, may make use
of the previous data
from that user/equipment or another user/equipment to respond to the user's
current motion data,
i.e., as a function of the user's input. The previous data may be stored
anywhere in the system,
e.g., in the mobile computer 101, computer 105 or on the server or database
172 (see Fig. 1).
The previous data may be utilized for example to indicate to user 151 that
user 150 has
undergone a certain number of potential concussion events, and therefore must
heal for a
particular amount of time before playing again. Insurance companies may demand
such
compliance to lower medical expenses for example. Video may be stored and
retrieved from
mobile device 101, computer 105 or as shown in Figure 1, on server or in
database coupled with
server 172 to form event videos that include the event data and the video of
the event shown
simultaneously for example on a display, e.g., overlaid or shown in separate
portions of the
display of mobile computer 101 or computer 105 generally.
[00109] Figure 2A illustrates a helmet 110a based mount that surrounds the
head 150a of a user
wherein the helmet based mount holds a motion capture sensor 111, for example
as shown on the
rear portion of the helmet. Figure 2B illustrates a neck insert based mount,
shown at the bottom
rear portion of the helmet, that enables retrofitting existing helmets with a
motion capture sensor
111. In embodiments that include at least one motion capture sensor that is
configured to be
coupled with or otherwise worn near the user's head 150a, the microcontroller
may be further
configured to calculate a location of impact on the user's head. The calculation of the location
calculation of the location
of impact on the user's head is based on the physical geometry of the user's
head and/or helmet.
For example, if motion capture element 111 indicates a rearward acceleration
with no rotation
(to the right in the figure as shown), then the location of impact may be
calculated by tracing the
vector of acceleration back to the direction of the outside perimeter of the
helmet or user's head.
This non-rotational calculation effectively indicates that the line of force
passes near or through
the center of gravity of the user's head/helmet, otherwise rotational forces
are observed by
motion capture element 111. If a sideward vector is observed at the motion
capture element 111,
then the impact point is calculated to be at the side of the helmet/head and
through the center of
gravity. Hence, any other impact that does not impart a rotational
acceleration to the motion
capture sensor over at least a time period near the peak of the acceleration
for example, or during
any other time period, may be assumed to be imparted in a direction to the
helmet/head that
passes through the center of gravity. Hence, the point of impact is calculated as the intersection with the outer perimeter of the helmet/head of the detected force vector traversed backwards, by calculating the distance and angle back from the
center of gravity. For example, if the acceleration vector is at 45 degrees
with no rotation, then
the point of impact is 45 degrees back from the center of gravity of the
helmet/head, hence
calculating the sine of 45, approximately 0.7 multiplied by the radius of the
helmet or 5 inches,
results in an impact about 3.5 inches from the front of the helmet.
Alternatively, the location of
impact may be kept in angular format to indicate that the impact was at 45
degrees from the front
of the helmet/head. Conversely, if rotational acceleration is observed
without linear
acceleration, then the helmet/head is rotating about the sensor. In this
scenario, the force
required to rotate the brain passes in front of the center of gravity and is
generally orthogonal to
a line defined as passing through the center of gravity and the sensor, e.g.,
a side impact,
otherwise translational linear acceleration would be observed. In this case, the
location of impact
then is on the side of the helmet/head opposite the direction of the
acceleration. Hence, these
two calculations of location of impact are examples of simplified methods of calculation that may be utilized, although any other vector based algorithm that takes into
account the mass of the
head/helmet and the size of the head/helmet may be utilized. One such
algorithm may utilize
any mathematical equations such as F = m * a, i.e., Force equals mass times
acceleration, and
Torque = r X F, where r is the position vector at the outer portion of the
head/helmet, X is the
cross product and F is the Force vector, to calculate the force vector and
translate back to the
outer perimeter of the helmet/head to calculate the Force vector imparted at
that location if
desired. Although described with respect to a helmet, other embodiments of the
at least one
motion capture sensor may be configured to be coupled with a hat or cap,
within a protective
mouthpiece, using any type of mount, enclosure or coupling mechanism. Similar
calculations
may be utilized for the hat/cap/mouthpiece to determine a location/direction
of impact, linear or
rotational forces from the accelerations or any other quantities that may be
indicative of
concussion related events for example. Embodiments may include a temperature
sensor coupled
with the at least one motion capture sensor or with the microcontroller for
example as shown in
Figure 1A. The temperature sensor may be utilized alone or in combination with
the motion
capture element, for example to determine if the body or head is shivering,
i.e., indicative of
hypothermia, or if no movement is detected and the temperature for example
measured wirelessly
or via a wire based temperature sensor indicates that the body or brain is
above a threshold
indicative of heat stroke.
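The two simplified calculations above can be sketched directly from the equations given in the text (F = m * a and Torque = r X F); the helmet radius matches the 5 inch example, and the function names are assumptions:

```python
import math
import numpy as np

HELMET_RADIUS_IN = 5.0  # radius used in the example above

def impact_offset_from_front(accel_angle_deg):
    """Arc offset of a non-rotational impact from the front of the helmet.

    Reproduces the example: sin(45 deg) ~ 0.7 times a 5 inch radius ~ 3.5 inches.
    """
    return math.sin(math.radians(accel_angle_deg)) * HELMET_RADIUS_IN

def force_and_torque(mass_kg, accel_ms2, r_m):
    """F = m * a and Torque = r x F at the helmet/head perimeter."""
    force = mass_kg * np.asarray(accel_ms2, dtype=float)
    torque = np.cross(np.asarray(r_m, dtype=float), force)
    return force, torque
```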
[00110] Embodiments of the invention may also utilize an isolator configured
to surround the at
least one motion capture element to approximate physical acceleration
dampening of
cerebrospinal fluid around the user's brain to minimize translation of linear
acceleration and
rotational acceleration of the event data to obtain an observed linear
acceleration and an
observed rotational acceleration of the user's brain. Thus embodiments do not
have to translate
forces or acceleration values or any other values from the helmet based
acceleration to the
observed brain acceleration values and thus embodiments of the invention
utilize less power and
storage to provide event specific data, which in turn minimizes the amount of
data transfer which
yields lower transmission power utilization. Different isolators may be
utilized on a
football/hockey/lacrosse player's helmet based on the type of padding inherent
in the helmet.
Other embodiments utilized in sports where helmets are not worn, or
occasionally worn, may also
utilize at least one motion capture sensor on a cap or hat, for example on a
baseball player's hat,
along with at least one sensor mounted on a batting helmet. Headband mounts
may also be
utilized in sports where a cap is not utilized, such as soccer to also
determine concussions. In
one or more embodiments, the isolator utilized on a helmet may remain in the
enclosure attached
to the helmet and the sensor may be removed and placed on another piece of
equipment that does
not make use of an isolator that matches the dampening of a user's brain
fluids. Embodiments
may automatically detect a type of motion and determine the type of equipment
that the motion
capture sensor is currently attached to based on characteristic motion
patterns associated with
certain types of equipment, i.e., surfboard versus baseball bat. In one or
more embodiments an
algorithm that may be utilized to calculate the physical characteristics of an
isolator may include
mounting a motion capture sensor on a helmet and mounting a motion capture
sensor in a
headform in a crash test dummy head wherein the motion capture sensor in the
headform is
enclosed in an isolator. By applying linear and rotational accelerations to
the helmet and
observing the difference in values obtained by the helmet sensor and observed
by the sensor in
the headform for example with respect to a sensor placed in a cadaver head
within a helmet, the
isolator material of the best matching dampening value may be obtained that
most closely
matches the dampening effect of a human brain.
[00111] Figure 3 illustrates a close-up of the mount of Figures 2A-B showing
the isolator
between the motion capture sensor and external portion of the helmet.
Embodiments of the
invention may be configured to obtain/calculate a linear acceleration value or
a rotational
acceleration value or both. This enables rotational events to be monitored for
concussions as
well as linear accelerations. As shown, an external acceleration G1 may impart
a lower
acceleration more associated with the acceleration observed by the human
brain, namely G2 on
sensor 111 by utilizing isolator 111c within sensor mount 111b. This enables
rotational events to
be monitored for concussions as well as linear accelerations. Other events may
make use of the
linear and/or rotational acceleration and/or velocity, for example as compared
against patterns or
templates to not only switch sensor personalities during an event to alter the
capture
characteristics dynamically, but also to characterize the type of equipment
currently being
utilized with the current motion capture sensor. This enables a single motion
capture element
purchase by a user to instrument multiple pieces of equipment or clothing by
enabling the sensor
to automatically determine what type of equipment or piece of clothing the
sensor is coupled to
based on the motion captured by the sensor when compared against
characteristic patterns or
templates of motion.
[00112] Figure 4A illustrates a top cross sectional view of the motion capture
element 111
mounted on helmet 110a having padding 110a1 that surrounds cranium 401, and
brain 402 of a
user. Figure 4B illustrates a rotational concussion event for the various
elements shown in
Figure 4. As shown, different acceleration values may be imparted on the human
brain 402 and
cranium 401 having center of gravity 403 and surrounded by padding 110a1 in
helmet 110a. As
shown, to move within a unit time period, the front portion of the brain must
accelerate at a
higher rate G2a, than the rear portion of the brain at G2c or at G2b at the
center of gravity.
Hence, for a given rotational acceleration value different areas of the brain
may be affected
differently. One or more embodiments of the invention may thus transmit
information not only
related to linear acceleration, but also with rotational acceleration.
[00113] Figure 5 illustrates the input force to the helmet, G1, e.g., as shown
at 500 g, versus the
observed force within the brain G2, and as observed by the sensor when mounted
within the
isolator and as confirmed with known headform acceleration measurement
systems. The upper
right graph shows that two known headform systems confirm acceleration values
observed by an
isolator based motion capture element 111 shown in Figure 4A with respect to
headform
mounted accelerometers.
[00114] Figure 6 illustrates the rotational acceleration values of the 3 axes
along with the total
rotational vector amount along with video of the concussion event as obtained
from a camera
and displayed with the motion event data. In one or more embodiments, the
acceleration values
from a given sensor may be displayed for rotational (as shown) or linear
values, for example by
double tapping a mobile device screen, or in any other manner. Embodiments of
the invention
may transmit the event data associated with the event using a connectionless
broadcast message.
In one or more embodiments, depending on the wireless communication employed,
broadcast
messages may include payloads with a limited amount of data that may be
utilized to avoid
handshaking and overhead of a connection based protocol. In other embodiments
connectionless
or connection based protocols may be utilized in any combination. In this
manner, a referee may
obtain nearly instantaneous readouts of potential concussion related events on
a mobile device,
which allows the referee to obtain medical assistance in rapid fashion.
[00115] In one or more embodiments, the computer may access previously stored
event data or
motion analysis data associated with the user or piece of equipment, for
example to determine
the number of concussions or falls or other swings, or any other motion event.
Embodiments
may also present event data associated with the at least one user on a display
based on the event
data or motion analysis data associated with the user or piece of equipment
and the previously
stored event data or motion analysis data associated with the user or piece of
equipment or with
at least one other user or other piece of equipment. This enables comparison
of motion events,
in number or quantitative value, e.g., the maximum rotational acceleration
observed by the user
or other users in a particular game or historically. In addition, patterns or
templates that define
characteristic motion of particular pieces of equipment for typical events may
be dynamically
updated, for example on a central server or locally, and dynamically updated
in motion capture
sensors via the wireless interface in one or more embodiments. This enables
sensors to improve
over time. Hence, the display shown in Figure 6 may also indicate the number
of concussions
previously stored for a given boxer/player and enable the referee/doctor to
make a decision as to
whether or not the player may keep playing.
[00116] Embodiments of the invention may transmit the information to a visual display coupled with the computer or a remote computer, for example over broadcast television or the Internet. Hence, the display in Figure 6 may also be shown
to a viewing
audience, for example in real-time to indicate the amount of force imparted
upon the
boxer/player/rider, etc.
[00117] Figure 7 illustrates a timeline display 2601 of a user along with peak and minimum angular speeds shown as events along the timeline. In
addition, a graph
showing the lead and lag of the golf club 2602 along with the droop and drift
of the golf club is
shown in the bottom display wherein these values determine how much the golf
club shaft is
bending in two axes as plotted against time. An embodiment of the display is
shown in Figure 8
with simplified time line and motion related event (maximum speed of the
swing) annotated on
the display.
[00118] Figure 8 illustrates a sub-event scrub timeline that enables inputs
near the start/stop
points 802a-d in time, i.e., sub-event time locations shown in Figure 7 and
associated with sub-
events to be scrolled to, played to or from, to easily enable viewing of sub-
events. For example
a golf swing may include sub-events such as an address, swing back, swing
forward, strike,
follow through. The system may display time locations for the sub-events 802a-
d and accept
user input near the location to assert that the video should start or stop at
that point in time, or
scroll to or back to that point in time for ease of viewing sub-events for
example. User input
element 801 may be utilized to drag the time to a nearby sub-event for example
to position the
video at a desired point in time. Alternatively, or in combination a user
input such as asserting a
finger press near another sub-event point in time while the video is playing,
may indicate that the
video should stop at the next sub-event point in time. The user interface may
also be utilized to
control-drag the points to more precisely synchronize the video to the frame
in which a particular
sub-event or event occurs. For example, the user may hold the control key and
drag a point 802b
to the left or right to match the frame of the video to the actual point in
time where the velocity
of the club head is zero for example to more closely synchronize the video to
the actual motion
analysis data shown, here Swing Speed in miles per hour. Any other user
gesture may be
utilized in keeping with the spirit of the invention to synchronize a user
frame to the motion
analysis data, such as voice control, arrow keys, etc.
[00119] Figure 9 illustrates the relative locations along the timeline where
sub-events 802a and
802b start and stop and the gravity associated with the start and stop times,
which enable user
inputs near those points to gravitate to the start and stop times. For
example, when dragging the
user interface element 801 left and right along the time line, the user
interface element may
appear to move toward the potential wells 802a and 802b, so that the user
interface element is
easier to move to the start/stop point of a sub-event.
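The "potential well" behavior might be sketched as a simple snap function; the capture radius below is an assumption:

```python
SNAP_RADIUS_S = 0.25  # assumed capture radius of each potential well, in seconds

def snap_to_sub_event(drag_time, sub_event_times):
    """Pull a dragged timeline position toward the nearest sub-event start/stop."""
    if not sub_event_times:
        return drag_time
    nearest = min(sub_event_times, key=lambda t: abs(t - drag_time))
    return nearest if abs(nearest - drag_time) <= SNAP_RADIUS_S else drag_time
```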
[00120] In one or more embodiments, the computer is further configured to
request at least one
image or video that contains the event from at least one camera proximal to
the event. This may
include a broadcast message requesting video from a particular proximal camera
or a camera that
is pointing in the direction of the event. In one or more embodiments, the
computer is further
configured to broadcast a request for camera locations proximal to the event
or oriented to view
the event, and optionally display the available cameras, or videos therefrom
for the time duration
around the event of interest. In one or more embodiments, the computer is
further configured to
display a list of one or more times at which the event has occurred, which
enables the user to obtain
the desired event video via the computer, and/or to independently request the
video from a third
party with the desired event times. The computer may obtain videos from the
server 172 as well
and locally trim the video to the desired events. This may be utilized to
obtain third party videos
or videos from systems that do not directly interface with the computer, but
which may be in
communication with the server 172.
[00121] Figure 10 illustrates an embodiment that utilizes a mobile device 102b
as the motion
capture element 111a and another mobile device 102a as the computer that
receives the motion
event data and video of the first user event. The view from mobile device 102a
is shown in the
left upper portion of the figure. In one or more embodiments, the at least one
motion capture
sensor is coupled with the mobile device and for example uses an internal
motion sensor 111a
within or coupled with the mobile device. This enables motion capture and
event recognition
with minimal and ubiquitous hardware, e.g., using a mobile device with a built-
in accelerometer.
In one or more embodiments, a first mobile device 102b may be coupled with a
user recording
motion data, here shown skateboarding, while a second mobile device 102a is
utilized to record a
video of the motion. In one or more embodiments, the user undergoing motion
may gesture,
e.g., tap N times on the mobile device to indicate that the second user's
mobile device should
start recording video or stop recording video. Any other gesture may be
utilized to communicate
event related or motion related indications between mobile devices.
[00122] Thus embodiments of the invention may recognize any type of motion
event, including
events related to motion that is indicative of standing, walking, falling, a
heat stroke, seizure,
violent shaking, a concussion, a collision, abnormal gait, abnormal or non-
existent breathing or
any combination thereof or any other type of event having a duration of time
during which motion
occurs. Events may also be of any granularity, for example include sub-events
that have known
signatures, or otherwise match a template or pattern of any type, including
amplitude and/or time
thresholds in particular sets of linear or rotational axes. For example,
events indicating a
skateboard push-off or series of pushes may be grouped into a sub-event such
as "prep for
maneuver", while rotational axes in X for example may indicate "skateboard
flip/roll". In one or
more embodiments, the events may be grouped and stored/sent.
[00123] Figure 11 illustrates an embodiment of the memory utilized to store
data. Memory
4601 may for example be integral to the microcontroller in motion capture
element 111 or may
couple with the microcontroller, as for example a separate memory chip. Memory
4601 as
shown may be configured to include one or more memory buffer 4610, 4611 and
4620, 4621
respectively. One embodiment of the memory buffer that may be utilized is a
ring buffer. The
ring buffer may be implemented to be overwritten multiple times until an event
occurs. The
length of the ring buffer may be from 0 to N memory units. There may for
example be M ring
buffers, for M strike events for example. The number M may be any number
greater than zero.
In one or more embodiments, the number M may be equal to or greater than the
number of
expected events, e.g., number of hits, or shots for a round of golf, or any
other number for
example that allows all motion capture data to be stored on the motion capture
element until
downloaded to a mobile computer or the Internet after one or more events. In
one embodiment,
a pointer, for example called HEAD, keeps track of the head of the buffer. As
data is recorded in
the buffer, the HEAD is moved forward by the appropriate amount pointing to
the next free
memory unit. When the buffer becomes full, the pointer wraps around to the
beginning of the
buffer and overwrites previous values as it encounters them. Although the data
is being
overwritten, at any instance in time (t), there is recorded sensor data from
time (t) back
depending on the size of the buffer and the rate of recording. As the sensor
records data in the
buffer, an "Event" in one or more embodiments stops new data from overwriting
the buffer.
Upon the detection of an Event, the sensor can continue to record data in a
second buffer 4611 to
record post Event data, for example for a specific amount of time at a
specific capture rate to
complete the recording of a prospective shot. Memory buffer 4610 now contains
a record of
data for a desired amount of time from the Event backwards, depending on the
size of the buffer
and capture rate along with post Event data in the post event buffer 4611.
Video may also be
stored in a similar manner and later trimmed, see Figure 19 for example.
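The buffering scheme described above may be sketched as follows; the buffer sizes and class interface are illustrative assumptions rather than required values:

    from collections import deque

    class EventRingBuffer:
        """Pre-event ring buffer plus a post-event buffer, per Figure 11."""

        def __init__(self, pre_frames=512, post_frames=256):
            self.pre = deque(maxlen=pre_frames)  # overwritten until an event
            self.post = []
            self.post_frames = post_frames
            self.event_active = False

        def record(self, frame):
            if not self.event_active:
                # HEAD advances implicitly; the deque drops the oldest frame on wrap.
                self.pre.append(frame)
            elif len(self.post) < self.post_frames:
                self.post.append(frame)

        def on_event(self):
            """Freeze the pre-event data and start filling the post-event buffer."""
            self.event_active = True

        def snapshot(self):
            """Return pre- and post-event data for one detected event."""
            return list(self.pre), list(self.post)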
[00124] For example, in a golf swing, the event can be the impact of the club
head with the ball.
Alternatively, the event can be the impact of the club head with the ground,
which may give rise
to a false event. In other embodiments, the event may be an acceleration of a
user's head which
may be indicative of a concussion event, or a shot fired from a weapon, or a
ball striking a
baseball bat or when a user moves a weight to the highest point and descends
for another
repetition. The Pre-Event buffer stores the sensor data up to the event of
impact; the Post-Event
buffer stores the sensor data after the impact event. One
or more embodiments of the
microcontroller are configured to analyze the event and determine if the event
is a repetition,
firing or event such as a strike or a false strike. If the event is considered
a valid event according
to a pattern or signature or template (see Figures 13 and 15), and not a false
event, then another
memory buffer 4620 is used for motion capture data up until the occurrence of
a second event.
After that event occurs, the post event buffer 4621 is filled with captured
data.
[00125] Specifically, the motion capture element 111 may be implemented as one
or more
MEMs sensors. The sensors may be commanded to collect data at specific time
intervals. At
each interval, data is read from the various MEMs devices, and stored in the
ring buffer. A set of
values read from the MEMs sensors is considered a FRAME of data. A FRAME of
data can be
0, 1, or multiple memory units depending on the type of data that is being
collected and stored in
the buffer. A FRAME of data is also associated with a time interval. Therefore
frames are also
associated with a time element based on the capture rate from the sensors. For
example, if each
Frame is filled at 2ms intervals, then 1000 FRAMES would contain 2000ms of
data (2 seconds).
In general, a FRAME does not have to be associated with time.
[00126] Data can be constantly stored in the ring buffer and written out to
non-volatile memory
or sent over a wireless or wired link over a radio/antenna to a remote memory
or device for
example at specified events, times, or when communication is available over a
radio/antenna to a
mobile device or any other computer or memory, or when commanded for example
by a mobile
device, i.e., "polled", or at any other desired event.
[00127] Figure 12 shows a flow chart of an embodiment of the functionality
specifically
programmed into the microcontroller to determine whether an event that is to be transmitted for the particular application, for example a prospective event, has occurred.
The motion, acceleration or shockwave that occurs from an impact to the
sporting equipment is
transmitted to the sensor in the motion capture element, which records the
motion capture data as
is described in Figure 11 above. The microcontroller is configured to then
analyze the event and
determine whether the event is a prospective event or not.
[00128] One type of event that occurs is acceleration of a
head/helmet/cap/mouthpiece based
sensor over a specified linear or rotational value, or the impact of the
clubface when it impacts a
golf ball. In other sports that utilize a ball and a striking implement, the
same analysis is applied,
but tailored to the specific sport and sporting equipment. In tennis a
prospective strike can be the
racquet hitting the ball, for example as opposed to spinning the racquet
before receiving a serve.
In other applications, such as running shoes, the impact detection algorithm
can detect the shoe
hitting the ground when someone is running. In exercise it can be a particular
motion being
achieved; this allows, for example, the counting of repetitions while lifting
weights or riding a
stationary bike.
[00129] In one or more embodiments of the invention, processing starts at
4701. The
microcontroller compares the motion capture data in memory 4610 with linear
velocity over a
certain threshold at 4702, within a particular impact time frame and searches
for a discontinuity
threshold where there is a sudden change in velocity or acceleration above a
certain threshold at
4703. If no discontinuity in velocity or for example acceleration occurs in
the defined time
window, then processing continues at 4702. If a discontinuity does occur, then
the prospective
impact is saved in memory and post impact data is saved for a given time P at
4704. For
example, if the impact threshold is set to 12G, the discontinuity threshold is set to 6G, and the impact time frame is 10 frames, then microcontroller 3802 signals an impact
after detection of a
12G acceleration in at least one axis or all axes within 10 frames followed by
a discontinuity of
6G. In a typical event, the accelerations build with characteristic
acceleration curves. Impact is
signaled as a quick change in acceleration/velocity. These changes are
generally distinct from
the smooth curves created by incrementally increasing or decreasing motion
of a particular
non-event. For concussion based events, linear or rotational acceleration in
one or more axes is
over a threshold. For golf related events, if the acceleration curves are those
of a golf swing, then
particular axes have particular accelerations that fit within a signature,
template or other pattern
and a ball strike results in a large acceleration strike indicative of a hit.
If the data matches a
given template, then it is saved; if not, processing continues back at
4702. If data is to be
saved externally as determined at 4705, i.e., there is a communication link to
a mobile device
and the mobile device is polling or has requested impact data when it occurs
for example, then
the event is transmitted to an external memory, or the mobile device or saved
externally in any
other location at 4706 and processing continues again at 4702 where the
microcontroller
analyzes collected motion capture data for subsequent events. If data is not
to be saved
externally, then processing continues at 4702 with the impact data saved
locally in memory
4601. If sent externally, the other motion capture devices may also save their
motion data for the
event detected by another sensor. This enables sensors with finer resolution
or more motion for
example to alert other sensors associated with the user or piece of equipment
to save the event
even if the motion capture data does not reach a particular threshold or
pattern, for example see
Figure 15. This type of processing provides more robust event detection as
multiple sensors may
be utilized to detect a particular type of event and notify other sensors that
may not match the
event pattern for one reason or another. In addition, cameras may be notified
and trim or
otherwise discard unneeded video and save event related video, which may lower
memory
utilization not only of events but also for video. In one or more embodiments
of the invention,
noise may be filtered from the motion capture data before sending, and the
sample rate may be
varied based on the data values obtained to maximize accuracy. For example,
some sensors
output data that is not accurate under high sampling rates and high G-forces.
Hence, by lowering
the sampling rate at high G-forces, accuracy is maintained. In one or more
embodiments of the
invention, the microcontroller associated with motion capture element 111 may
sense high G
forces and automatically switch the sampling rate. In one or more embodiments,
instead of
using accelerometers with 6G/12G/24G ranges or 2G/4G/8G/16G ranges,
accelerometers with 2
ranges, for example 2G and 24G may be utilized to simplify the logic of
switching between
ranges.
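A simplified sketch of this detection loop, reusing the 12G impact, 6G discontinuity and 10 frame values from the example above, follows; the per-frame data format and helper names are assumptions for illustration:

    IMPACT_G = 12.0        # impact threshold from the example above
    DISCONTINUITY_G = 6.0  # minimum sudden change between frames
    WINDOW_FRAMES = 10     # impact time frame

    def detect_impact(frames):
        """Scan per-frame peak accelerations (in g) for an impact signature.

        Returns the index of the frame where the impact was detected, or None.
        """
        for i in range(1, len(frames)):
            window = frames[max(0, i - WINDOW_FRAMES):i + 1]
            if max(window) >= IMPACT_G:
                # Discontinuity: a sudden change in acceleration between frames.
                if abs(frames[i] - frames[i - 1]) >= DISCONTINUITY_G:
                    return i
        return None

    def choose_sample_rate(current_g, low_rate_hz=100, high_rate_hz=1000):
        """Lower the sampling rate under high G to preserve accuracy, as above."""
        return low_rate_hz if current_g > IMPACT_G else high_rate_hz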
[00130] One or more embodiments of the invention may transmit the event to a
mobile device
and/or continue to save the events in memory, for example for a round of golf
or until a mobile
device communication link is achieved.
[00131] For example, with the sensor mounted in a particular mount, a typical
event signature is
shown in Figure 13, also see Figure 15 for comparison of two characteristic
motion types as
shown via patterns or templates associated with different pieces of equipment
or clothing for
example. In one or more embodiments, the microcontroller is configured to
execute a pattern
matching algorithm to follow the curves for each of the axes and use segments of one or more axes
to determine if a characteristic swing has taken place, in either linear or
rotational acceleration or
any combination thereof. If the motion capture data in memory 4601 is within a
range close
enough to the values of a typical swing as shown in Figure 13, then the motion
is consistent with
an event. Embodiments of the invention thus reduce the number of false
positives in event
detection, after first characterizing the angular and/or linear velocity
signature of the movement,
and then utilizing elements of this signature to determine if similar
signatures for future events
have occurred.
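One illustrative realization of the "within a range close enough" comparison is a per-sample tolerance band around a stored template, as sketched below; the tolerance value and the multi-axis combination are assumptions:

    def matches_template(signal, template, tolerance=0.15):
        """Return True if every sample lies within a band around the template.

        `signal` and `template` are equal-length sequences for one axis; the
        band is +/- `tolerance` of the template's peak amplitude.
        """
        if len(signal) != len(template):
            return False
        band = tolerance * max(abs(v) for v in template)
        return all(abs(s - t) <= band for s, t in zip(signal, template))

    # A swing event could require a match on several axes at once:
    def is_characteristic_swing(axes, templates):
        return all(matches_template(axes[a], templates[a]) for a in templates)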
[00132] The motion capture element collects data from various sensors. The
data capture rate
may be high, in which case significant amounts of data are captured. Embodiments
of the invention may use both lossless and lossy compression algorithms to
store the data on the
sensor depending on the particular application. The compression algorithms
enable the motion
capture element to capture more data within the given resources. Compressed
data is also what
is transferred to the remote computer(s). Compressed data transfers faster.
Compressed data is
also stored on the Internet "in the cloud", or in the database, using up less
space locally.
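As an illustrative sketch of these two paths, standard zlib compression may be applied to packed sensor frames, with an optional quantization step for the lossy case; the frame layout and quantization step size are assumptions, not the algorithms required by this disclosure:

    import struct
    import zlib

    def pack_frames(frames):
        """Pack (ax, ay, az) float triples into bytes."""
        return b"".join(struct.pack("<fff", *f) for f in frames)

    def compress_lossless(frames):
        return zlib.compress(pack_frames(frames), level=9)

    def compress_lossy(frames, step=0.05):
        """Quantize to `step` g before compressing; smaller, but not exact."""
        quantized = [tuple(round(v / step) * step for v in f) for f in frames]
        return zlib.compress(pack_frames(quantized), level=9)

    frames = [(0.01 * i, 0.0, 9.81) for i in range(1000)]
    print(len(pack_frames(frames)), len(compress_lossless(frames)),
          len(compress_lossy(frames)))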
[00133] Figure 14 illustrates an embodiment of the motion capture element 111
configured with
optional LED visual indicator 1401 for local display and viewing of event
related information
and an optional LCD 1402 configured to display a text or encoded message
associated with the
event. In one or more embodiments, the LED visual indicator may flash slow
yellow for a
moderate type of concussion, and flash fast red for a severe type of
concussion to give a quick
overall view of the event without requiring any wireless communications. In
addition, the LED
may be asserted with a number of flashes or other colors to indicate any
temperature related
event or other event. One or more embodiments may also employ LCD 1402 for
example that
may show text, or alternatively may display a coded message for sensitive
health related
information that a referee or medical personnel may read or decode with an
appropriate reader
app on a mobile device for example. In the lower right portion of the figure,
the LCD display
may produce an encoded message that states "Potential Concussion 1500
degree/s/s rotational
event detected - alert medical personnel immediately". Other paralysis
diagnostic messages or any
other type of message that may be sensitive may be encoded and displayed
locally so that
medical personnel may immediately begin assessing the user/player/boxer
without alarming
other players with the diagnostic message for example, or without transmitting
the message over
the air wirelessly to avoid interception.
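The flash encoding may be driven by a small severity-to-pattern table, as in the following sketch; the colors, flash periods and LED interface are illustrative assumptions:

    import time

    # (color, on/off period in seconds) per event severity, per the example above
    FLASH_PATTERNS = {
        "moderate_concussion": ("yellow", 0.5),  # slow yellow
        "severe_concussion":   ("red",    0.1),  # fast red
    }

    def flash_led(set_led, severity, flashes=10):
        """Drive an LED via `set_led(color_or_None)` for a detected event."""
        color, period = FLASH_PATTERNS[severity]
        for _ in range(flashes):
            set_led(color)
            time.sleep(period)
            set_led(None)  # off
            time.sleep(period)

    # e.g., flash_led(print, "severe_concussion", flashes=2)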
[00134] Figure 15 illustrates an embodiment of templates characteristic of
motion events
associated with different types of equipment and/or instrumented clothing
along with areas in
which the motion capture sensor personality may change to more accurately or
more efficiently
capture data associated with a particular period of time and/or sub-event. As
shown, the
characteristic push off for a skateboard is shown in acceleration graphs 1501
that display the X,
Y and Z axes linear acceleration and rotational acceleration values in the top
6 timelines,
wherein time increases to the right. As shown, discrete positive x-axis
acceleration captured is
shown at 1502 and 1503 while the user pushes the skateboard with each step,
followed by
negative acceleration as the skateboard slows between each push. In addition,
y-axis wobbles
during each push are also captured while there is no change in the z axis
linear acceleration and
no rotational accelerations in this characteristic template or pattern of a
skateboard push off or
drive. Alternatively, the pattern may include a group of threshold
accelerations in x at
predefined time windows with other thresholds or no threshold for wobble for
example that the
captured data is compared against to determine automatically the type of
equipment that the
motion capture element is mounted to or that the known piece of equipment is
experiencing
currently. This enables event based data saving and transmission for example.
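Determining the type of equipment automatically may amount to scoring the captured data against each stored template and selecting the best match above a floor, as in the following sketch; the scoring function and floor value are illustrative assumptions:

    def match_score(signal, template):
        """Negative mean absolute error between a signal and a template."""
        n = min(len(signal), len(template))
        return -sum(abs(signal[i] - template[i]) for i in range(n)) / n

    def classify_equipment(signal, templates, floor=-0.5):
        """Return the best-matching equipment type, or None below the floor.

        `templates` maps names like "skateboard_push" or "running" to
        reference single-axis acceleration traces.
        """
        best = max(templates, key=lambda name: match_score(signal, templates[name]))
        return best if match_score(signal, templates[best]) >= floor else None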
[00135] The pattern or template in graphs 1511, however, shows a running event as
the user
slightly accelerates up and down during a running event. Since the user's
speed is relatively
constant there is relatively no acceleration in x and since the user is not
turning, there is
relatively no acceleration in y (left/right). This pattern may be utilized to
compare within ranges
for running for example wherein the pattern includes z axis accelerations in
predefined time
windows. Hence, the top three graphs of graphs 1511 may be utilized as a
pattern to notate a
running event. The bottom three graphs may show captured data that are
indicative of the user
looking from side to side when the motion capture element is mounted in a
helmet and/or
mouthpiece at 1514 and 1515, while captured data 1516 may be indicative of a
moderate or
severe concussion observed via a rotational motion of high enough angular
degrees per second
squared. In addition, the sensor personality may be altered dynamically at
1516 or at any other
threshold for example to change the motion capture sensor rate of capture or
bit size of capture
to capture the event more accurately in amplitude or time. This enables
dynamic alteration of
quality of capture and/or dynamic change of power utilization for periods of
interest, which is
unknown in the art. In one or more embodiments, a temperature timeline may
also be recorded
for embodiments of the invention that utilize temperature sensors, either
mounted within a
helmet, mouthpiece or in any other piece of equipment or within the user's
body for example.
[00136] Figure 16 illustrates an embodiment of a protective mouthpiece 1601 in
front view and
at the bottom portion of the figure in top view, for example as worn in any
contact sport such as,
but not limited to soccer, boxing, football, wrestling or any other sport for
example.
Embodiments of the mouthpiece may be worn in addition to any other headgear
with or without
a motion capture element to increase the motion capture data associated with
the user and
correlate or in any other way combine or compare the motion data and or events
from any or all
motion capture elements worn by the user. Embodiments of the mouthpiece and/or
helmet
shown in Figures 2A-B or in any other piece of equipment may also include a
temperature sensor
for example and as previously discussed.
[00137] Figure 17 illustrates an embodiment of the algorithm utilized by any
computer in Figure
1 that is configured to display motion images and motion capture data in a
combined format. In
one or more embodiments, the motion capture data and any event related
start/stop times may be
saved on the motion capture element 111. One or more embodiments of the
invention include a
motion event recognition and video synchronization system that includes at
least one motion
capture element configured to couple with a user or piece of equipment or
mobile device coupled
with the user. The at least one motion capture element may include a memory, a
sensor
configured to capture any combination of values associated with an
orientation, position,
velocity and acceleration of the at least one motion capture element, a radio,
a microcontroller
coupled with the memory, the sensor and the radio. The microcontroller may be
configured to
collect data that includes sensor values from the sensor, store the data in
the memory, analyze the
data and recognize an event within the data to determine event data, transmit
the event data
associated with the event via the radio. The system may also include a mobile
device that
includes a computer, a wireless communication interface configured to
communicate with the
radio to obtain the event data associated with the event, wherein the computer
is coupled with
wireless communication interface, wherein the computer is configured to
receive the event data
from the wireless communication interface. The computer may also analyze the
event data to
form motion analysis data, store the event data, or the motion analysis data,
or both the event
data and the motion analysis data, obtain an event start time and an event
stop time from the
event, request image data from a camera that includes a video captured at least
during a timespan
from the event start time to the event stop time and display an event video on
a display that
includes both the event data, the motion analysis data or any combination
thereof that occurs
during the timespan from the event start time to the event stop time and the
video captured
during the timespan from the event start time to the event stop time.
[00138] When a communication channel is available, motion capture data and any
event related
start/stop times are pushed to, or obtained by or otherwise received by any
computer, e.g., 101,
102, 102a, 102b, 105 at 1701. The clock difference between the clock on the
sensor and/or in
motion capture data times may also be obtained. This may be performed by
reading a current
time stamp in the incoming messages and comparing the incoming message time
with the current
time of the clock of the local computer, see also Figure 18 for example for
more detail on
synchronization. The difference in clocks from the sensor and computer may be
utilized to
request image data from any camera local to or pointing at the location of the
event for the
adjusted times to take into account any clock difference at 1702. For example,
the computer may
request images taken at the time/location by querying all cameras 103, 104, or
on devices 101,
102 and/or 102a for any or all such devices having images taken nearby, e.g.,
based on GPS
location or wireless range, and/or pointed at the event obtained from motion
capture element
111. If a device is not nearby, but is pointing at the location of the event,
as determined by its
location and orientation when equipped with a magnetometer for example, then
it may respond
as well with images for the time range. Any type of camera that may
communicate
electronically may be queried, including nanny cameras, etc. For example, a
message may be
sent by mobile computer 101 after receiving events from motion capture sensor
111 wherein the
message may be sent to any cameras for example within wireless range of mobile
device 101.
Alternatively, or in combination, mobile device 101 may send a broadcast
message asking for
the identities of any cameras that are within a predefined distance from the location
of the event or
query for any cameras pointed in the direction of the event even if not
relatively close. Upon
receiving the list of potential cameras, mobile device 101 may query them for
any images
obtained in a predefined window around the event for example. The computer may
receive
image data or look up the images locally if the computer is coupled with a
camera at 1703. In
one or more embodiments, the server 172 may iterate through videos and events
to determine
any that correlate and automatically trim the videos to correspond to the
durations of the event
start and stop times. Although wireless communications may be utilized, any
other form of
transfer of image data is in keeping with the spirit of the invention. The
data from the event
whether in numerical or graphical overlay format or any other format including
text may be
shown with or otherwise overlaid onto the corresponding image for that time at
1704. This is
shown graphically at time 1710, i.e., the current time, which may be
scrollable for example, for
image 1711 showing a frame of a motion event with overlaid motion capture data
1712. See
Figure 6 for combined or simultaneously non-overlaid data for example.
[00139] Figure 18 illustrates an embodiment of the synchronization
architecture that may be
utilized by one or more embodiments of the invention. Embodiments may
synchronize clocks in
the system using any type of synchronization methodology and in one or more
embodiments the
computer 160 on the mobile device 101 is further configured to determine a
clock difference
between the motion capture element 111 and the mobile device and synchronize
the motion
analysis data with the video. For example, one or more embodiments of the
invention provides
procedures for multiple recording devices to synchronize information about the
time, location, or
orientation of each device, so that data recorded about events from different
devices can be
combined. Such recording devices may be embedded sensors, mobile phones with
cameras or
microphones, or more generally any devices that can record data relevant to an
activity of
interest. In one or more embodiments, this synchronization is accomplished by
exchanging
information between devices so that the devices can agree on a common
measurement for time,
location, or orientation. For example, a mobile phone and an embedded sensor
may exchange
messages across link 1802, e.g., wirelessly, with the current timestamps of
their internal clocks;
these messages allow a negotiation to occur wherein the two devices agree on a
common time.
Such messages may be exchanged periodically as needed to account for clock
drift or motion of
the devices after a previous synchronization. In other embodiments, multiple
recording devices
may use a common server or set of servers 1801 to obtain standardized measures
of time,
location, or orientation. For example, devices may use a GPS system to obtain
absolute location
information for each device. GPS systems may also be used to obtain
standardized time. NTP
(Network Time Protocol) servers may also be used as standardized time servers.
Using servers
allows devices to agree on common measurements without necessarily being
configured at all
times to communicate with one another.
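A minimal sketch of such a timestamp exchange, using the classic NTP-style offset estimate, follows; the message mechanics are assumptions, since the disclosure requires only that the devices agree on a common time basis:

    import time

    def estimate_offset(send_request, local_clock=time.time):
        """Estimate the remote clock's offset from the local clock.

        `send_request()` must return (t1, t2) = (remote receive time,
        remote transmit time) stamped by the peer's clock.
        """
        t0 = local_clock()       # local transmit time
        t1, t2 = send_request()  # peer receive / transmit times
        t3 = local_clock()       # local receive time
        # Standard NTP offset: remote minus local, net of symmetric delay.
        return ((t1 - t0) + (t2 - t3)) / 2.0

    # Applying the offset lets event timestamps from the sensor be mapped
    # into the mobile device's timeline before requesting video:
    def to_local_time(remote_timestamp, offset):
        return remote_timestamp - offset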
[00140] Figure 19 illustrates the detection of an event by one of the motion
capture sensors 111,
transmission of the event detection, here shown as arrows emanating from the
centrally located
sensor 111 in the figure, to other motion capture sensors 111 and/or cameras,
e.g., on mobile
device 101, saving of the event motion data and trimming of the video to
correspond to the
event. In one or more embodiments of the invention, some of the recording
devices are
configured to detect the occurrence of various events of interest. Some such
events may occur at
specific moments in time; others may occur over a time interval, wherein the
detection includes
detection of the start of an event and of the end of an event. These devices
are configured to
record any combination of the time, location, or orientation of the recording
device, for example
included in memory buffer 4610 for example along with the event data, or in
any other data
structure, using the synchronized measurement bases for time, location, and
orientation
described above.
[00141] Embodiments of the computer on the mobile device may be further
configured to
discard at least a portion of the video outside of the event start time to the
event stop time, for
example portions 1910 and 1911 before and after the event, or the event with
predefined pre and post
intervals 1902 and 1903. For example, in one or more embodiments of the
invention, some of
the recording devices capture data continuously to memory while awaiting the
detection of an
event. To conserve memory, some devices may be configured to store data to a
more permanent
local storage medium, or to server 172, only when this data is proximate in
time to a detected
event. For example, in the absence of an event detection, newly recorded data
may ultimately
overwrite previously recorded data in memory, depending on the amount of
memory in each
device that is recording motion data or video data. A circular buffer may be
used in some
embodiments as a typical implementation of such an overwriting scheme. When an
event
detection occurs, the recording device may store some configured amount of
data prior to the
start of the event, near start of pre interval 1902 and some configured amount
of data after the
end of the event, near 1903, in addition to storing the data captured during
the event itself,
namely 1901. Any pre or post time interval is considered part of the event
start time and event
stop time so that context of the event is shown in the video for example. This
gives context to
the event, for example the amount of pre time interval may be set per sport
for example to enable
a setup for a golf swing to be part of the event video even though it occurs
before the actual
event of striking the golf ball. The follow through may be recorded as per the
amount of interval
allotted for the post interval as well.
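Computing the retained window is then simple arithmetic, as in the following sketch; the per-sport pre and post interval values are placeholders:

    def event_window(event_start, event_stop, pre_s=5.0, post_s=3.0):
        """Return the (start, stop) video window including pre/post context.

        `pre_s` might be larger for golf, so the setup before the ball
        strike (region 1902) stays in the event video.
        """
        return event_start - pre_s, event_stop + post_s

    def trim_indices(video_start, fps, event_start, event_stop, **kw):
        """Map the window onto frame indices; frames outside are discarded."""
        w0, w1 = event_window(event_start, event_stop, **kw)
        first = max(0, int((w0 - video_start) * fps))
        last = int((w1 - video_start) * fps)
        return first, last

    # e.g., a 30 fps video starting at t=100.0 s, event from 130.2 to 131.0 s:
    print(trim_indices(100.0, 30, 130.2, 131.0))  # -> (756, 1020)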
[00142] Embodiments of the system may further comprise a server computer
remote to the
mobile device and wherein the server computer is configured to discard at
least a portion of the
video outside of the event start time to the event stop time and return the video
captured during the
timespan from the event start time to the event stop time to the computer in
the mobile device.
The server or mobile device may combine or overlay the motion analysis data or
event data, for
example velocity or raw acceleration data with or onto the video to form event
video 1900,
which may thus greatly reduce the amount of video storage required as portions
1910 and 1911
may be of much greater length in time than the event in general.
[00143] Embodiments of the at least one motion capture element may be
configured to transmit
the event to at least one other motion capture sensor or at least one other
mobile device or any
combination thereof, and wherein the at least one other motion capture sensor
or the at least one
other mobile device or any combination thereof is configured to save data
associated with said
event. For example, in embodiments with multiple recording devices operating
simultaneously,
one such device may detect an event and send a message to other recording
devices that such an
event detection has occurred. This message can include the timestamp of the
start and/or stop of
the event, using the synchronized time basis for the clocks of the various
devices. The receiving
devices, e.g., other motion capture sensors and/or cameras may use the event
detection message
to store data associated with the event to nonvolatile storage, for example
within motion capture
element 111 or mobile device 101 or server 172. The devices may be configured
to store some
amount of data prior to the start of the event and some amount of data after
the end of the event,
1902 and 1903 respectively, in addition to the data directly associated with
the event 1901. In
this way all devices can record data simultaneously, but use an event trigger
from only one of the
devices to initiate saving of distributed event data from multiple sources.
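The trigger fan-out may be sketched as follows, where one sensor's detection message causes each listening device to freeze and persist its own buffers (compare the EventRingBuffer sketch above); the message format and persistence callback are assumptions:

    import json

    def make_event_message(sensor_id, start_ts, stop_ts):
        """Event notification using the synchronized time basis."""
        return json.dumps({"type": "EVENT", "sensor": sensor_id,
                           "start": start_ts, "stop": stop_ts}).encode()

    def on_event_message(raw, ring_buffer, persist):
        """Receiver side: save pre/post buffers even if the local data never
        crossed this device's own detection threshold."""
        msg = json.loads(raw)
        if msg.get("type") != "EVENT":
            return
        ring_buffer.on_event()  # stop overwriting pre-event data
        pre, post = ring_buffer.snapshot()
        persist({"event": msg, "pre": pre, "post": post})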
[00144] Embodiments of the computer may be further configured to save the
video from the
event start time to the event stop time with the motion analysis data that
occurs from the event
start time to the event stop time or a remote server may be utilized to save
the video. In one or
more embodiments of the invention, some of the recording devices may not be in
direct
communication with each other throughout the time period in which events may
occur. In these
situations, devices can be configured to save complete records of all of the
data they have
recorded to permanent storage or to a server. Saving of only data associated
with events may not
be possible in these situations because some devices may not be able to
receive event trigger
messages. In these situations, saved data can be processed after the fact to
extract only the
relevant portions associated with one or more detected events. For example,
multiple mobile
devices might record video of a player or performer, and upload this video
continuously to server
172 for storage. Separately the player or performer may be equipped with an
embedded sensor
that is able to detect events such as particular motions or actions. Embedded
sensor data may be
uploaded to the same server either continuously or at a later time. Since all
data, including the
video streams as well as the embedded sensor data, is generally timestamped,
video associated
with the events detected by the embedded sensor can be extracted and combined
on the server.
Embodiments of the server or computer may be further configured while a
communication link
is open between the at least one motion capture sensor and the mobile device
to discard at least a
portion of the video outside of the event start time to the event stop time and
save the video from the
event start time to the event stop time with the motion analysis data that
occurs from the event
start time to the event stop time. Alternatively, if the communication link is
not open,
embodiments of the computer may be further configured to save the video and, after the event data is received once the communication link is open, to discard at least a portion
of the video outside
of the event start time to the event stop time and save the video from the event
start time to the event
stop time with the motion analysis data that occurs from the event start time
to the event stop
time. For example, in some embodiments of the invention, data may be uploaded
to a server as
described above, and the location and orientation data associated with each
device's data stream
may be used to extract data that is relevant to a detected event. For example,
a large set of
mobile devices may be used to record video at various locations throughout a
golf tournament.
This video data may be uploaded to a server either continuously or after the
tournament. After
the tournament, sensor data with event detections may also be uploaded to the
same server.
Post-processing of these various data streams can identify particular video
streams that were
recorded in the physical proximity of events that occurred and at the same
time. Additional
filters may select video streams where a camera was pointing in the correct
direction to observe
an event. These selected streams may be combined with the sensor data to form
an aggregate
data stream with multiple video angles showing an event.
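Such post-processing may correlate uploaded streams by time and place along the following lines; the distance threshold and record shapes are illustrative assumptions:

    import math

    def haversine_m(a, b):
        """Great-circle distance in meters between (lat, lon) pairs."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2 +
             math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371000 * math.asin(math.sqrt(h))

    def correlate(events, videos, max_dist_m=200.0):
        """Pair each sensor event with videos that overlap it in time and space.

        `events`: dicts with "start", "stop", "location";
        `videos`: dicts with "start", "stop", "location".
        """
        pairs = []
        for e in events:
            for v in videos:
                overlaps = v["start"] <= e["stop"] and v["stop"] >= e["start"]
                nearby = haversine_m(e["location"], v["location"]) <= max_dist_m
                if overlaps and nearby:
                    pairs.append((e, v))
        return pairs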
[00145] The system may obtain video from a camera coupled with the mobile
device, or any
camera that is separate from or otherwise remote from the mobile device. In
one or more
embodiments, the video is obtained from a server remote to the mobile device,
for example
obtained after a query for video at a location and time interval.
[00146] Embodiments of the server or computer may be configured to synchronize
said video
and said event data, or said motion analysis data via image analysis to more
accurately determine
a start event frame or stop event frame in said video or both, that is most
closely associated with
said event start time or said event stop time or both. In one or more
embodiments of the
invention, synchronization of clocks between recording devices may be
approximate. It may be
desirable to improve the accuracy of synchronizing data feeds from multiple
recording devices
based on the view of an event from each device. In one or more embodiments,
processing of
multiple data streams is used to observe signatures of events in the different
streams to assist
with fine-grained synchronization. For example, an embedded sensor may be
synchronized with
a mobile device including a video camera, but the time synchronization may be
accurate only to
within 100 milliseconds. If the video camera is recording video at 30 frames
per second, the
video frame corresponding to an event detection on the embedded sensor can
only be determined
within 3 frames based on the synchronized timestamps alone. In one embodiment
of the device,
video frame image processing can be used to determine the precise frame
corresponding most
closely to the detected event. See Figure 8 and description thereof for more
detail. For instance,
a shock from a snowboard hitting the ground as shown in Figure 17, that is
detected by an
inertial sensor may be correlated with the frame at which the geometric
boundary of the
snowboard makes contact with the ground. Other embodiments may use other image
processing
techniques or other methods of detecting event signatures to improve
synchronization of multiple
data feeds.
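Such fine-grained synchronization might search the few candidate frames around the sensor timestamp for the largest inter-frame change, on the theory that an impact produces an abrupt visual discontinuity; in the sketch below frames are taken as grayscale pixel sequences, and the window of plus or minus 3 frames follows the 100 millisecond, 30 frames-per-second example above:

    def frame_diff(f0, f1):
        """Mean absolute pixel difference between two grayscale frames."""
        return sum(abs(a - b) for a, b in zip(f0, f1)) / len(f0)

    def refine_event_frame(frames, approx_index, window=3):
        """Pick the frame near `approx_index` with the sharpest visual change."""
        lo = max(1, approx_index - window)
        hi = min(len(frames) - 1, approx_index + window)
        return max(range(lo, hi + 1),
                   key=lambda i: frame_diff(frames[i - 1], frames[i]))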
[00147] Embodiments of the at least one motion capture element may include a
location
determination element configured to determine a location that is coupled with
the
microcontroller and wherein the microcontroller is configured to transmit the
location to the
computer on the mobile device. In one or more embodiments, the system further
includes a
server wherein the microcontroller is configured to transmit the location to
the server, either
directly or via the mobile device, and wherein the computer or server is
configured to form the
event video from portions of the video based on the location and the event
start time and the
event stop time. For example, in one or more embodiments, the event video may
be trimmed to
a particular length of the event, and transcoded to any desired video quality, for
example on mobile
device 101 or on server 172 or on computer 105 or any other computer coupled
with the system,
and overlaid or otherwise integrated with motion analysis data or event data,
e.g., velocity or
acceleration data in any manner. Video may be stored locally in any
resolution, depth, or image
quality or compression type, or using any other technique, to maximize storage capacity or frame rate, or with any compression type to minimize storage, whether a
communication link
is open or not between the mobile device, at least one motion capture sensor
and/or server. In
one or more embodiments, the velocity or other motion analysis data may be
overlaid or
otherwise combined, e.g., on a portion beneath the video, that includes the
event start and stop
time, that may include any number of seconds before and/or after the actual
event to provide
video of the swing before a ball strike event for example. In one or more
embodiments, the at
least one motion capture sensor and/or mobile device(s) may transmit events
and video to a
server wherein the server may determine that particular videos and sensor data
occurred in a
particular location at a particular time and construct event videos from
several videos and several
sensor events. The sensor events may be from one sensor or multiple sensors
coupled with a
user and/or piece of equipment for example. Thus the system may construct
short videos that
correspond to the events, which greatly decreases video storage requirements
for example.
[00148] In one or more embodiments, the microcontroller or the computer is
configured to
determine a location of the event or the microcontroller and the computer are
configured to
determine the location of the event and correlate the location, for example by
correlating or
averaging the locations to provide a central point of the event, so that erroneous location data from initializing GPS sensors may be minimized. In this manner, a group of
users with mobile
devices may generate videos of a golfer teeing off, wherein the event location
of the at least one
motion capture device may be utilized and wherein the server may obtain videos
from the
spectators and generate an event video of the swing and ball strike of the
professional golfer,
wherein the event video may utilize frames from different cameras to generate
a BULLET TIME® video from around the golfer as the golfer swings. The resulting video or
videos may be
trimmed to the duration of the event, e.g., from the event start time to the
event stop time and/or
with any pre or post predetermined time values around the event to ensure that
the entire event is
captured including any setup time and any follow through time for the swing or
other event.
[00149] In one or more embodiments, the computer on the mobile device may
request at least
one image or video that contains the event from at least one camera proximal
to the event
directly by broadcasting a request for any videos taken in the area by any
cameras, optionally
that may include orientation information related to whether the camera was not
only located
proximally to the event, but also oriented or otherwise pointing at the event.
In other
embodiments, the video may be requested by the computer on the mobile device
from a remote
server. In this scenario, any location and/or time associated with an event
may be utilized to
return images and/or video near the event or taken at a time near the event,
or both. In one or
more embodiments, the computer or server may trim the video to correspond to
the event
duration and again, may utilize image processing techniques to further
synchronize portions of
an event, such as a ball strike with the corresponding frame in the video that
matches the
acceleration data corresponding to the ball strike on a piece of equipment for
example.
[00150] Embodiments of the computer on the mobile device or on the server may
be configured
to display a list of one or more times at which an event has occurred or
wherein one or more
events has occurred. In this manner, a user may find events from a list to
access the event videos
in rapid fashion.
[00151] Embodiments of the invention may include at least one motion capture
sensor that is
physically coupled with said mobile device. These embodiments enable any type
of mobile
phone or camera system with an integrated sensor, such as any type of helmet
mounted camera
or any mount that includes both a camera and a motion capture sensor to
generate event data and
video data.
[00152] While the ideas herein disclosed have been described by means of
specific embodiments
and applications thereof, numerous modifications and variations could be made
thereto by those
skilled in the art without departing from the scope of the invention set forth
in the claims.

Representative Drawing
A single figure which represents the drawing illustrating the invention.
Administrative Status

2024-08-01:As part of the Next Generation Patents (NGP) transition, the Canadian Patents Database (CPD) now contains a more detailed Event History, which replicates the Event Log of our new back-office solution.

Please note that "Inactive:" events refers to events no longer in use in our new back-office solution.

For a clearer understanding of the status of the application/patent presented on this page, the site Disclaimer, as well as the definitions for Patent, Event History, Maintenance Fee and Payment History, should be consulted.

Event History

Description Date
Inactive: IPC expired 2023-01-01
Inactive: Grant downloaded 2022-08-16
Letter Sent 2022-08-16
Grant by Issuance 2022-08-16
Inactive: Cover page published 2022-08-15
Pre-grant 2022-05-31
Inactive: Final fee received 2022-05-31
Notice of Allowance is Issued 2022-05-30
Letter Sent 2022-05-30
Notice of Allowance is Issued 2022-05-30
Inactive: Q2 passed 2022-04-11
Inactive: Approved for allowance (AFA) 2022-04-11
Inactive: Associate patent agent added 2022-02-22
Appointment of Agent Requirements Determined Compliant 2021-12-31
Revocation of Agent Requirements Determined Compliant 2021-12-31
Inactive: IPC deactivated 2021-11-13
Amendment Received - Voluntary Amendment 2021-09-14
Amendment Received - Response to Examiner's Requisition 2021-09-14
Examiner's Report 2021-05-14
Inactive: Report - No QC 2021-05-07
Common Representative Appointed 2020-11-07
Inactive: IPC assigned 2020-09-18
Letter Sent 2020-05-12
Inactive: COVID 19 - Deadline extended 2020-04-28
Request for Examination Requirements Determined Compliant 2020-04-14
Request for Examination Received 2020-04-14
All Requirements for Examination Determined Compliant 2020-04-14
Inactive: COVID 19 - Deadline extended 2020-03-29
Common Representative Appointed 2019-10-30
Change of Address or Method of Correspondence Request Received 2019-02-19
Inactive: IPC expired 2018-01-01
Inactive: Cover page published 2016-02-26
Inactive: IPC removed 2016-01-21
Inactive: IPC assigned 2016-01-21
Inactive: IPC removed 2016-01-21
Inactive: IPC assigned 2016-01-20
Inactive: IPC removed 2016-01-20
Inactive: First IPC assigned 2016-01-20
Inactive: IPC assigned 2016-01-20
Inactive: IPC assigned 2016-01-20
Inactive: IPC assigned 2016-01-20
Inactive: First IPC assigned 2016-01-18
Inactive: Notice - National entry - No RFE 2016-01-18
Inactive: IPC assigned 2016-01-18
Inactive: IPC assigned 2016-01-18
Inactive: IPC assigned 2016-01-18
Application Received - PCT 2016-01-18
National Entry Requirements Determined Compliant 2016-01-05
Application Published (Open to Public Inspection) 2015-10-29

Abandonment History

There is no abandonment history.

Maintenance Fee

The last payment was received on 2022-04-19

Note : If the full payment has not been received on or before the date indicated, a further fee may be required which may be one of the following

  • the reinstatement fee;
  • the late payment fee; or
  • additional fee to reverse deemed expiry.

Please refer to the CIPO Patent Fees web page to see all current fee amounts.

Fee History

Fee Type Anniversary Year Due Date Paid Date
Basic national fee - standard 2016-01-05
MF (application, 2nd anniv.) - standard 02 2017-04-21 2017-04-17
MF (application, 3rd anniv.) - standard 03 2018-04-23 2018-04-03
MF (application, 4th anniv.) - standard 04 2019-04-23 2019-04-15
MF (application, 5th anniv.) - standard 05 2020-04-21 2020-04-07
Request for examination - standard 2020-05-19 2020-04-14
MF (application, 6th anniv.) - standard 06 2021-04-21 2021-04-14
MF (application, 7th anniv.) - standard 07 2022-04-21 2022-04-19
Final fee - standard 2022-10-03 2022-05-31
MF (patent, 8th anniv.) - standard 2023-04-21 2023-04-18
MF (patent, 9th anniv.) - standard 2024-04-22 2024-04-10
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
BLAST MOTION INC.
Past Owners on Record
BHASKAR BOSE
MAZEN ABDEL-RAHMAN
MICHAEL BENTLEY
MICHAEL GILLIAN
RYAN KAPS
SHEEHAN ALAM
Past Owners that do not appear in the "Owners on Record" listing will appear in other documentation within the application.
Documents


List of published and non-published patent-specific documents on the CPD.



Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Description 2016-01-05 66 4,382
Drawings 2016-01-05 25 2,397
Claims 2016-01-05 9 405
Abstract 2016-01-05 1 78
Representative drawing 2016-01-05 1 36
Cover Page 2016-02-26 1 57
Description 2021-09-14 66 4,483
Claims 2021-09-14 6 249
Cover Page 2022-07-20 1 57
Representative drawing 2022-07-20 1 18
Maintenance fee payment 2024-04-10 3 87
Notice of National Entry 2016-01-18 1 192
Reminder of maintenance fee due 2016-12-22 1 111
Courtesy - Acknowledgement of Request for Examination 2020-05-12 1 433
Commissioner's Notice - Application Found Allowable 2022-05-30 1 575
Electronic Grant Certificate 2022-08-16 1 2,527
Patent cooperation treaty (PCT) 2016-01-05 8 622
National entry request 2016-01-05 5 144
International search report 2016-01-05 1 54
Maintenance fee payment 2017-04-17 1 25
Maintenance fee payment 2018-04-03 1 25
Maintenance fee payment 2019-04-15 1 25
Request for examination 2020-04-14 4 128
Examiner requisition 2021-05-14 3 151
Amendment / response to report 2021-09-14 27 1,111
Maintenance fee payment 2022-04-19 1 27
Final fee 2022-05-31 4 134