CA 02857517 2014-05-29
WO 2013/090946
PCT/US2012/070214
1
SYSTEMS AND METHODS INVOLVING FEATURES OF SEARCH
AND/OR SEARCH INTEGRATION
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims benefit/priority of U.S. provisional patent
application No. 61/576,352,
filed 15 December 2011, which is incorporated herein by reference in its entirety.
BACKGROUND
Field:
Aspects of the present innovations relate to computer networking searches,
and, more
particularly, to associated systems and methods, such as processing search
information,
providing interactive search results, and search integration.
Description of Related Information:
The web has evolved into a rich, multi-media experience, but the process of
searching online
and associated drawbacks have changed little in the last fifteen years. Search
is still primarily
text based (captions) with only small thumbnail images (or previews) appearing
as a visual
search result. Text captions are machine generated and are not a rich or
efficient user
experience. Also, humans process visual information much faster than text, but
there is limited visual information in search results. Search engines have
tried to remedy this
problem by providing "live previews" of the source web pages and presenting
them in text and
graphical form. Unfortunately, this process is expensive, storage heavy and
adds little value for
the end user. Further, Internet searches often result in lists of
hyperlinks that are not very
informative to the searching user.
For example, FIGs. 1 and 2 show exemplary screenshots of prior art search
result pages. These
prior art examples show how generally, when an end user performs an Internet
search, the
search engine produces a search results page (also called a "SERP"). The prior art, as shown in FIGs 1 and 2, contains lists of results with hyperlinks and a sentence or two about each result, 101 and 201. That text, 101, 201, is machine-selected by proprietary algorithms unique to each search engine, as opposed to being curated by humans, and is sometimes a random and inadequate description of the linked page. As such, there is no end-user control of the displayed text.
The selected text is called a "caption" as shown in FIGs 1 at 101, and FIG 2
at 201. Captions
were first used when there was no rich media on the web and, therefore, were
only text-based.
Because of this legacy architecture, search results are mostly text-based captions, as shown in FIGs 1 and 2, and the way users consume this media is in a limited format, meaning that they can only view search results as one form of media at any given time, such as just video or just text.
Continuing with FIGs 1 and 2, the prior art presented results as text, still images or video. There is not a great deal of context to the captions in search results, and the presentation of those results is different for every search engine, even though each search engine has its own proprietary search algorithms. In order to refine a search in the prior art systems, one must start a search over or hit the "back" button to return to earlier results. Further, searches from mobile devices only compound these problems, given limited screen real estate, proprietary operating systems, limited bandwidth, and a variety of interfaces such as touch, voice, and keyboards, both on-screen and physical.
FIGs 3 and 4 are illustrations of exemplary prior art web page previews. FIGs
3 and 4 show that
even when an entire page is presented as a live preview, 301, 401, as it is with the example company SERPs, there is not much value added to the user's search. The information is densely packed and the graphics are too small to be useful. Only the general layout of the page is discernible, which does little in terms of adding content or context.
Another problem is that search engine results are often inaccurate and
imperfect. Text captions
do not always accurately represent the content on a site because they lack
context and
richness. As a result, a search may not be efficient. Users often waste time
uncovering the
actual context of individual search results.
Currently, companies or website publishers do not have control over how their
caption(s) appear
within a SERP. The captions are algorithmically machine generated and cannot
be curated by
the owner of a site.
In sum, there is a need for systems and methods that address the above
drawbacks and/or
provide other beneficial functionality or advantages to parties involved with
search.
SUMMARY
Systems and methods consistent with the present innovations are directed to
implementations
such as processing search information, providing interactive search results,
and search
integration, among others. According to some implementations, systems and
methods herein
may allow for search results of improved nature, such as results that are
interactive, expanded,
deeper and/or richer as a function of mixed-media components, as well as
improved user
experience and/or improved value to various participants, among other
benefits.
It is to be understood that both the foregoing general description and the
following detailed
description are exemplary and explanatory only and are not restrictive of the
inventions, as
described. Further features and/or variations may be provided in addition to
those set forth
herein. For example, the present inventions may be directed to various
combinations and
subcombinations of the disclosed features and/or combinations and
subcombinations of several
further features disclosed below in the detailed description.
DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which constitute a part of this specification,
illustrate various
implementations and features of the present inventions and, together with the
description,
explain aspects of innovations herein. In the drawings:
FIG. 1A is an exemplary screenshot of a prior art search result page.
FIG. 1B is a block diagram of FIG. 1A.
FIG. 2A is an exemplary screenshot of a prior art search result page.
FIG. 2B is a block diagram of FIG. 2A.
FIG. 3A is an illustration of exemplary prior art web page previews.
FIG. 3B is a block diagram of FIG. 3A.
FIG. 4A is an illustration of exemplary prior art web page previews.
FIG. 4B is a block diagram of FIG. 4A.
FIG. 5A is an illustration of a search engine results page with integration features consistent with certain aspects of the innovations herein.
FIG. 5B is a block diagram of FIG. 5A consistent with certain aspects of the innovations herein.
FIG. 6A is an illustration of a live preview showing an example search engine results page consistent with certain aspects of the innovations herein.
FIG. 6B is a block diagram of FIG. 6A consistent with certain aspects of the innovations herein.
FIG. 7A is a diagram illustrating an example search engine results page from a re-query consistent with certain aspects related to the innovations herein.
FIG. 7B is a block diagram of FIG. 7A consistent with certain aspects of the innovations herein.
FIG. 8A is an example showing ad placement in an implementation consistent with certain aspects related to the innovations herein.
FIG. 8B is a block diagram of FIG. 8A consistent with certain aspects of the innovations herein.
FIG. 9 is an exemplary screenshot showing an illustrative mobile device
display including a
search engine results page with integrated mixed-media component consistent
with certain
aspects related to the innovations herein.
FIG. 10 is an illustration of an exemplary search engine results page showing
user action with a
mobile device display search results page consistent with certain aspects
related to the
innovations herein.
FIG. 11 is an exemplary screenshot illustrating further mobile device display
functionality
consistent with certain aspects related to the innovations herein.
FIG. 12 is an exemplary screenshot illustrating mobile device display of search results content, such as a mixed-media module, consistent with certain aspects related to the innovations herein.
FIG. 13 is an exemplary screenshot of an illustrative mobile device display showing user interaction with a mixed-media module from the search results consistent with certain aspects related to the innovations herein.
FIG. 14 is an exemplary screenshot of a mobile device display showing an illustrative result of a user interaction consistent with certain aspects related to the innovations herein.
FIG. 15 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG. 16 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG. 17 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG. 18 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG. 19 is an illustration of an exemplary search engine results page showing integration/position aspects consistent with certain aspects related to the innovations herein.
DETAILED DESCRIPTION OF ILLUSTRATIVE IMPLEMENTATIONS
Reference will now be made in detail to the invention, examples of which are
illustrated in the
accompanying drawings. The implementations set forth in the following
description do not
represent all implementations consistent with the claimed invention. Instead,
they are merely
some examples consistent with certain aspects related to the invention.
Wherever possible, the
same reference numbers will be used throughout the drawings to refer to the
same or like parts.
According to some implementations, systems and methods consistent with the
innovations
herein are directed to providing search results with improved features. For
example, aspects
herein may relate to innovative integration of a rich, mixed-media,
interactive component, also
sometimes referred to as a 'Qwiki'TM component or module, into search results
pages. In some
implementations, this component or module may be an interactive narrative
presentation of the
content that is being searched and it may feature an interactive layer which
allows the recipient
of the search result to receive more detailed information without leaving the
search engine
results page ("SERP"). According to certain embodiments, systems and methods
involving
search results integrated with these component(s) may include features that
are innovative over
existing systems as a function of the information density and mixed-
media/multimedia
capabilities of such "mixed-media" integrated component(s).
As set forth herein, implementations may involve the integration of such a
component into a
search engine results page (SERP). This can be any existing or future SERP
including those
popular today. Moreover, various SERP-component integrated systems and methods
herein
provide display of search engine results in an interactive playable format
compatible with mobile
devices and their variety of interfaces.
FIGs. 5A and 5B are illustrations of an exemplary search engine results page
including an
integrated mixed-media module 501 consistent with aspects of the innovations
herein. Such
implementations allow the user to stay on the search page and efficiently
interact with the
search engine in a way that is beneficial for that search engine through
deeper, more refined
searches, increased ad views and clickthrough rates (CTR). Further, in various
embodiments
set forth herein, the integrated component may include features that serve as
an "interactive
summary" of a web page/search result which enhances the utility of the search
experience. This
results in higher quality searches and increased revenue for the search
provider, such as
through re-queries (deeper searches in the existing topic).
In one illustrative implementation, for example, there is provided a method of processing search information comprising processing information to return, to a user, search results via a search engine, in a results page. The search results page, in one example, includes at least one pre-prepared, non-rendered narrative multimedia presentation. The example method further comprises providing, for display via a search results page, at least one interactive multimedia presentation selectable by the user. In particular, such a multimedia presentation may be a mixed-media module as specified herein. Additionally, the example method further comprises providing, as a function of a first interaction of the user with a selected multimedia presentation, user access to at least one of third party information, web sites, content, applications and/or other multimedia. Also, the example method could include providing, as a function of a second interaction of the user with the selected multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
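By way of a non-limiting sketch, the two-interaction flow described above may be expressed in Python; all class names, sample data, and URLs here are hypothetical illustrations, not part of any claimed implementation.

```python
class MixedMediaModule:
    """A pre-prepared narrative presentation attached to one search result."""
    def __init__(self, title, links, related_terms):
        self.title = title
        self.links = links                  # third-party sites/content
        self.related_terms = related_terms  # candidate re-query terms

# Hypothetical engine data: query -> matching URLs, URL -> its module (if any).
SAMPLE_RESULTS = {"tokyo": ["example.com/tokyo"],
                  "shibuya": ["example.com/shibuya"]}
MODULE_INDEX = {"example.com/tokyo":
                MixedMediaModule("Tokyo", ["example.com/map"], ["shibuya"])}

def build_results_page(query):
    """Return a results page: each URL paired with its module, if one exists."""
    return [(url, MODULE_INDEX.get(url)) for url in SAMPLE_RESULTS.get(query, [])]

def first_interaction(module):
    """First interaction: surface linked third-party content on the SERP itself."""
    return module.links

def second_interaction(module, term):
    """Second interaction: treat a selected term as a new query (a re-query)."""
    return build_results_page(term)

page = build_results_page("tokyo")
url, module = page[0]
print(first_interaction(module))              # linked third-party content
print(second_interaction(module, "shibuya"))  # a new results page
```

The point of the sketch is that both interactions are served from the same results page: the first exposes linked content without navigation, the second generates a fresh results page from within the module.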
Further integrations of such components with search results also involve
creation of new ad
units (and thereby, in some implementations, new sources of revenue) inside of
the mixed-
media interactive summary, as explained further below.
Consistent with certain implementations, another way systems and methods
herein may depart
from the traditional search experience, and from online video, is that the end
user does not have
to experience the mixed-media module in a linear fashion. The user can choose
their path
through the content with various functionality, such as clicking on hyperlinks
within the mixed-
media module, via navigation functionality/gestures, and the like. This allows
the end-user to
explore the information that is of the most interest to them, in greater
detail and in their preferred
format, e.g., text, photos, video, etc.
Turning again to FIGs. 5A and 5B, a mixed-media module 501 can be a
controllable media
component within a search results page. Here, for example, a mixed-media
module may give
the publisher of a site control over its brand and its content as it appears
on the search results
page within such mixed-media module. This may be accomplished via creator
tools associated
with creation of such modules that generate an embeddable interactive object,
or via markup
language that publishers can include on their site that is recognized by
search engine crawlers.
This also leads to a better search experience for the end user.
Implementations include the
integration of a multimedia component such as a mixed-media module into the
SERP of an
Internet search engine as illustrated in FIGs. 5A-5B. Such component/module's
interactive
summary creates a playable caption that surfaces the best content from the
page 501. The
title in this illustration, for example, "Watch the Qwiki [module], Tokyo," may be specified by the creator 505.
FIGs. 6A and 6B depict exemplary preview illustrations showing illustrative
search engine
results pages with mixed-media modules. Consistent with this basic preview as
explained
herein, a mixed-media module integrated into the search results page provides
for a richer user
experience and increases traffic for that page. Further, implementations may
include playable
captions that provide more context than regular text captions used in existing
systems.
Consistent with the innovations herein, systems and methods are provided
involving procedures
and/or mechanisms for enhancing search results via novel integration of mixed-
media modules.
Such implementations may present coordinated text, images, video, documents,
narrations and
links all available in one interactive screen format or window. Examples of
these can be seen in
Figs. 6A and 6B. Here, for example, the mixed-media module may be a rich
multimedia visual
and interactive piece of content. A search results page, SERP, integrated with
such mixed-
media module acts as an interactive multimedia summary of a search result
rather than just a
text based caption integrated into an SERP as previously done.
As seen in connection with FIGs. 6A and 6B, the typical search engine result
is augmented or
even replaced by a mixed-media module 601 that enhances the results.
Navigation to a desired
result, e.g. a selected mixed-media module, may be an expansion inside the
normal search
results into a larger display. Further, a 'new search' button, icon or
functionality may be
included within mixed-media modules, e.g., a magnifying glass icon 603. This
may be
configured to allow for further searching or re-querying within the mixed-
media module. Further,
a media/asset loading bar 605 may also be included, allowing for audio and/or
video to play in
the mixed-media module or in another window. The mixed-media module may also
include one
or more hyperlinks 610 to other web pages. An expander button 615 may also be
included to
allow for the mixed-media information in the module to be displayed in a full
screen format.
With regard to these implementations, such as 'new search' functionality,
systems and methods
herein may involve methods of processing search information comprising a
computer server
configured to communicate with at least one search engine web crawler.
Exemplary methods
also may include the computer server configured to receive the search engine
web crawler
results from at least a first query, and to generate search results for
display in a browser window
based on the first query. Methods may also include embodiments involving
provision of search
results including at least a customizable caption, various multimedia content,
and at least one
hyperlink configured to cause a re-query of the search engine web crawler.
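As a minimal sketch of this server-side method, assuming hypothetical field names for the crawler output, each result may carry a publisher-curated caption and a hyperlink that triggers a re-query:

```python
def render_result(result, engine_url="https://search.example.com"):
    """Build one SERP entry from crawler output: a customizable caption
    plus a hyperlink configured to cause a re-query."""
    # Prefer the publisher-curated caption over the machine-generated one.
    caption = result.get("publisher_caption") or result["auto_caption"]
    requery_link = f'{engine_url}/search?q={result["requery_term"]}'
    return {"caption": caption,
            "media": result.get("media", []),
            "requery_link": requery_link}

crawl_result = {
    "auto_caption": "Machine-selected text about Tokyo...",
    "publisher_caption": "Watch the Tokyo module",
    "requery_term": "tokyo+travel",
}
entry = render_result(crawl_result)
print(entry["caption"])       # the curated caption wins
print(entry["requery_link"])
```

The fallback to the machine-generated caption mirrors the document's point that curation is optional: where no publisher caption exists, the entry degrades to the conventional text caption.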
Referring still to FIGs. 6A and 6B, in accordance with some aspects of the
innovations herein, a
mixed-media module integrated SERP also improves the usefulness of search. As seen in FIGs. 6A and 6B, such an interactive component has a higher density of information than the prior art, which proves to be more valuable to the end user, online content providers, and the search engines. The search engine crawlers can detect certain mixed-media modules, such as via detection of metadata associated with QwikiTM modules, and embed them in a search results page (SERP). Further, implementations herein may utilize the mixed-media module as an interactive and playable caption, 605.
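One way a crawler might detect such a module is via page metadata; the tag name and attribute below are invented for illustration and are not the actual markup recognized by any particular engine:

```python
import re

# Hypothetical publisher markup announcing a mixed-media module for a page.
MODULE_TAG = re.compile(
    r'<meta\s+name="mixed-media-module"\s+content="([^"]+)"', re.IGNORECASE)

def extract_module_url(page_html):
    """Return the publisher-declared module URL, or None if the page has none."""
    match = MODULE_TAG.search(page_html)
    return match.group(1) if match else None

sample = ('<head><meta name="mixed-media-module" '
          'content="https://example.com/tokyo.module"></head>')
print(extract_module_url(sample))           # the declared module URL
print(extract_module_url("<head></head>"))  # None: fall back to a text caption
```

This illustrates the division of labor the document describes: the publisher declares the module on its own page, and the crawler only needs to recognize the declaration in order to embed the module in the SERP.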
According to implementations herein, once played, the mixed-media module may
expand within
the page as shown in FIG. 6 at 601 and can offer the user a variety of options
to explore related
content triggering new search queries, 603, media/asset loading, 605, links to
related pages,
610, and playback options, 615. Further, component video or audio files may be
played within
the mixed-media module on the SERP, without need for loading an external page.
In addition to the display of related media/links in a new window on the same
page, further
implementations provide functionality to display associated content on the
same SERP by
instantly modifying it or the contents around it. Here, for example, this new
integrated content
may be displayed without triggering new tabs.
Additionally, in contrast to existing SERP functionality where captions are
algorithmically
machine generated and cannot be curated by relevant parties, systems and
methods herein
may provide a controllable interactive media component within a search results
page. For the
first time, then, implementations herein involving the mixed-media modules
allow the publisher
of a site control over its brand and its content as it appears on the search
results page within the
mixed-media module.
Further, consistent with certain aspects related to the innovations herein,
present
implementations improve upon and enhance existing search technologies because they provide narrative context to search results, something lacking until now. Results herein are
Results herein are
a richer experience with more visual, linked information and interactive
features.
As a function of the present mixed-media module embodiments, which may be
created by
participants of the search process, search results may be more accurate and
provide better
context. Consistent with implementations herein, brand managers and content
publishers can
control their story within a search engine result without purchasing expensive
search
advertising. This is particularly valuable because existing captions are often not relevant for a search engine user; they add little or no value to the process. They contain a
limited amount of
data and few clues as to the overall content of the site those captions are
supposed to
summarize. In other words, captions lack the context and visual richness
provided via the
innovations herein. Additionally, search engine results are clustered in a way
that isn't helpful
and can be overwhelming. Users get results that don't help with a decision
because they are
unrelated to what the user actually needs. The limited text in a caption often
doesn't reveal
enough information. As a result the user must select links, search that site
and, if it is not the
desired result, back up to the original search results or begin a new search
from scratch. It's time consuming, awkward, and makes it easy for a user to get lost.
According to further embodiments, a search result enhanced via present mixed-
media
module(s) implementations may also involve innovations associated with second
or follow-up
queries, referred to herein as a "re-query." FIG. 7A is a diagram illustrating an
example search
engine results page associated with a re-query, consistent with aspects of the
innovations
herein. Notably, a re-query allows a search engine user to refine their search
results without
losing the original search. Clicking on a hyperlink within the mixed-media
module allows the
user to "re-query" the search engine and dig deeper into a subject by
searching the mixed-
media/interactive components within a module. Implementations are also
configured such that
this opens a new window without closing the original one, thereby reducing
the need to
constantly hit the "back" button in order to return to the original results.
This enables the ability
to search, and then re-search specific details of interest within a search
result without getting
distracted or lost.
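One possible way to model this behavior (an assumption for illustration, not the claimed design) is a stack of open result contexts, so that a re-query never discards the original search:

```python
class SearchSession:
    """Keeps every opened results context so no search is lost."""
    def __init__(self, engine):
        self.engine = engine
        self.windows = []  # each entry: (query, results)

    def query(self, q):
        self.windows.append((q, self.engine(q)))
        return self.windows[-1]

    def requery(self, q):
        # Opens a new context; earlier contexts remain in `windows`,
        # so there is no need to hit "back" to recover them.
        return self.query(q)

# Hypothetical stand-in for the underlying search engine.
engine = lambda q: [f"{q}-result-1", f"{q}-result-2"]
session = SearchSession(engine)
session.query("tokyo")
session.requery("tokyo temples")
print(len(session.windows))    # both contexts remain open
print(session.windows[0][0])   # the original query is preserved
```

Keeping the contexts in a list rather than replacing them is the whole design point here: drilling deeper adds a window instead of overwriting the one the user started from.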
In one illustrative implementation, for example, there is provided a method of
processing search
information comprising processing information to return, to a user, search
results via a search
engine, in a results page. This example method could also include where the
results page
includes at least one pre-prepared, non-rendered narrative multimedia
presentation, such as a
mixed-media module. Further, the example method could include providing, for
display via a
search results page, at least one interactive multimedia presentation
selectable by the user.
Also, the example method could include providing, as a function of a first interaction of the user with a
selected multimedia
presentation, user access to at least one of third party information, web
sites, content,
applications and/or other multimedia. And the example method could also
include providing, as
a function of a second interaction of the user with the selected multimedia
presentation,
functionality configured to receive a new search query and generate a new
search results page.
Various "re-query" implementations also allow users to stay on a search
page and refine their
searches in new windows without losing the original search or getting lost.
This is more efficient
for users and less frustrating as they are more likely to find their desired
results. Systems and
methods herein may be configured to refine a SERP via such functionality,
allowing for high
information density. For example, the re-query can show selected caption with
images 705. It
can also show video or animation 710. Moreover, specific concepts may
even be suggested for
further re-query 715. In addition to the display of related media/links in a
new window on the
same page, there is an option to display associated content on the same SERP by
instantly
modifying it or the contents around it. This new integrated content is
displayed without
triggering new tabs.
These "re-query" innovations may also drive a deeper understanding of a
queried subject matter
by displaying related search topics. As such, systems and methods herein
provide for a mixed
media/multi-media capability which can illustrate/enhance a selected search
result with images,
videos, animations, documents and even narrations. Specific concepts can be
suggested for re-
query, driving additional search engine traffic. This additional traffic
yields higher advertising
rates on the re-query pages as the searches are more specific and focused by a
more specific
customer interest. The richness of the re-queried media also achieves
beneficial advertising
results, given that richer media fetches an increased CPM/CPT (Cost Per
Thousand
impressions) that advertisers are willing to pay.
Systems and methods herein overcome other issues with search engine results,
namely
problems associated with search placement. Placement on the SERP is important
because the
higher the placement, the more likely a site will be selected by a user. For
this reason, the top
of the page is seen as the most valuable real estate on an SERP. Entire
industries have been
created just to place a search result in a higher position in the SERP as
processing pages and
pages of text results is time consuming.
Presently search engines consider it a success when a user spends a minimal
amount of time
on their page. This might be counterintuitive, but it's because a quick search
process means
that the user is finding the information that they need and moving on. The
down side to this is
that the search engine only has a limited amount of time to display ads and
monetize the
interaction. As such, implementations herein provide an innovative and ideal
scenario for
search engines, e.g., keeping users on their site through a layer of
interactivity that allows for a
deeper exploration of search results without leaving the original search time
and time again.
While visual results in searches can yield better results, previews of
websites are very
expensive for search engines to create, maintain and store. Bandwidth is also
an issue when
end users access search engines via mobile devices. On mobile devices and
smart phones, in
particular, there is limited screen real estate and text-based search results
are tiny and difficult
to read. It's even more difficult for a user to differentiate between search
results when looking at
a tiny screen.
Moreover, many search engines are adding video content to their search
results. Video is
becoming more prevalent online because publishers don't want to present text-
only sites and
there is a desire to differentiate/supplement search placement; however,
traditional streaming
video is time-consuming to create and view. Video content is also highly
compressed on mobile
devices resulting in poor streaming and picture quality. Video is also hard to
interact with
because there is no standard, universal layer for interactivity. For the most
part, video is a
stand-alone experience because of the lack of an interactive layer. In
addition, similar to
exploring component web pages, watching and re-searching for appropriate
videos is very time
consuming; because of limited previews, users often don't know if they have
discovered the
right or wrong video related to their topic, as the videos are indexed and
retrieved via keyword,
not according to the content of the pages also part of the same search result.
Embodiments herein address these issues and drawbacks, as well. In one
illustrative
implementation, for example, there is provided a method of processing search
information
comprising a computer server configured to communicate with at least one
search engine web
crawler. The example method could also have the computer server configured to
interact with
the search engine web crawler search results by causing display of the search
results. And the
example method may include wherein the search results include interactive
multimedia content,
e.g., one or more mixed-media modules, and/or associated content such as at
least one
hyperlink, etc.
Especially in view of the issues with traditional video content noted above, systems and methods herein are an improvement on other rich media, such as online video technology, because they use less bandwidth, are easily customizable and flexible, and incorporate interactive video, images, text and other types of media.
In still other exemplary embodiments herein, mixed-media module integrated
implementations
can incorporate interactive images, text and other types of media. Further, given that such implementations operate without large-bandwidth video transmissions, especially rendered video content for an audiovisual/multimedia experience, systems and methods herein
provide an
expanded interactive search with other mixed media, thus allowing for quicker
loads and
consumption of less bandwidth during utilization.
FIG. 8 is an example showing illustrative ad placement 801 features,
consistent with aspects of
the innovations herein. The integration of mixed-media module interactive
summaries into a
SERP creates additional advertising monetization units; these units can be
presented as
interactive captions on the CPC/PPC (Cost Per Click/Pay Per Click)
advertisements that
traditionally are placed alongside search results, or the CPC/PPC ads (and
other promotional
units) can be placed within the mixed-media module itself, as shown in FIG 8.
For example, the
interactive summary can be presented as a caption on the CPC advertisements that are
traditionally placed alongside organic search results 801. In some
implementations, the CPC
ads can be placed within the multimedia presentation or mixed-media module
805, itself.
It should be noted that FIG 8 may give the appearance that the CPC ad is
loading within the
Wikipedia result. However, implementations may include the CPC ad displaying
its own mixed-
media module. Loading the CPC ad into the Wikipedia mixed-media module is a
different
embodiment from such implementations.
Referring now to FIGs. 9-14, implementations herein with mixed-media module
integrations
involving video can yield improved/higher quality on mobile devices,
consistent with aspects of
the innovations herein. In one illustrative implementation, for example, there
is provided a
method of processing search information comprising returning search results in
a search results
page including one or more pre-prepared narrative multimedia presentations.
The example
method could also include providing at least one integrated multimedia
presentation selected by
a user, and also providing access to at least one of additional third party
information, sites,
content, applications and other multimedia. Further, the example method could
include
wherein the multimedia presentations are configured in association with other
features for low-
bandwidth (e.g., non-rendered, etc.) display for use on a mobile device.
Also, given the flexible and non-rendered nature of the mixed-media modules,
streaming and
picture quality can be easily optimized for specific mobile devices. Further,
such
implementations allow ease of interactions by providing a standard universal
layer for
interactivity. In other embodiments, systems and methods herein may include features and implementations involving interactive and coordinated hyperlinks for deeper exploration of the content within the video; this feature of coordinating links/content inside of the mixed-media module interactive summary allows new attribution and monetization capabilities for content creators and search engines utilizing the underlying model(s).
Here, it should be noted that a "mobile device" can be any kind of smartphone,
tablet computer,
laptop, notebook, or any kind of similar device. These devices are typically touch-screen enabled and retain internet connectivity through a shorter-range radio such as those used in WiFi technologies, through cellular telephone connections, or both. The device may connect to the internet in any fashion.
FIG. 9 depicts an illustrative SERP with mixed-media module implementation
formatted for a
mobile smartphone or tablet computer, consistent with aspects of the
innovations herein. As
shown, for example, an illustrative "Play Qwiki module" icon is shown directly
beneath the first
search result in the search result screen.
FIG. 10 is an illustration of a search engine results page with the integration of touch-enabled functionality consistent with aspects of the innovations herein. In FIG. 10, a user is shown tapping the "Play Qwiki module" icon using a finger. Touch-enabled screens allow such interaction with a stylus or other such device as well, and such features may also be navigated with various cursor-based functionality.
FIG. 11 is an illustration of exemplary mobile device display and functionality consistent with aspects of the innovations herein. In the example in FIG. 11, the mobile smartphone may be rotated to initiate a specified function associated with the SERP or simply to allow for a landscape display instead of a portrait display.
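The rotation behavior just described is essentially a small dispatch decision, which can be sketched as follows. This is an illustrative sketch only; the function and action names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: dispatching a device-rotation event either to a
# SERP-specific function (if one is registered) or to a plain layout change.
# Names here are illustrative assumptions.
def on_orientation_change(orientation: str, serp_action=None) -> str:
    """Rotating to landscape may trigger a specified SERP function when one
    is registered; otherwise the display simply re-lays-out."""
    if orientation == "landscape" and serp_action is not None:
        return serp_action()          # e.g., auto-play the mixed-media module
    return f"layout:{orientation}"    # plain portrait/landscape re-layout
```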
FIG. 12 is an exemplary screenshot illustrating mobile device display of search result content, such as a mixed-media module, consistent with certain aspects related to the innovations herein.
FIG. 13 is an exemplary screenshot of an illustrative mobile device display showing user interaction with a mixed-media module from the search results consistent with certain aspects related to the innovations herein. FIG. 13 shows a user interacting with a portion of the mixed-media module, here tapping the particular media or object with respect to which additional content (details, information, etc.) or further functionality ("re-query", etc.) is desired.
As set forth elsewhere herein, the search engine may be configured to
interoperate with such
action in a variety of ways.
FIG. 14 is an exemplary screenshot of a mobile device display showing an illustrative result of a user interaction consistent with certain aspects related to the innovations herein. Here, this example shows an illustrative re-direct, associated with the tapped object, to a particular web page. The result shows multimedia text and an image or video within the web page.
Turning to some more general aspects, an illustrative multimedia presentation
herein may be
configured as an interactive system of mixed-media/interactive content with
clickable
components. Various mixed-media modules, here, may also provide a visual confirmation of search results, which means less frustration and more productivity for the user. These mixed-media modules may also provide visual relevancy: the multimedia nature of such interactive components provides more in-depth detail of a topic than text alone.
Further, it is noted that pages with multi-media components are often ranked
higher in search
engine results. In accordance with aspects of the present innovations herein,
systems and
methods herein provide ways for content creators to provide interactive multi-
media content
and, in some implementations, improve their search engine ranking through
increased meta-
data information. The visual nature of embodiments herein also means that such
result would
not have to be ranked at the very top of an SERP to catch the attention of a
search engine user
since visual images are more efficiently scanned than text. For online
advertisers, better search
results will mean greater return on investment. Online ads will be viewed
within a more
appropriate context and, therefore, more likely to target the right consumers.
Interactions with
the associated mixed-media modules can also provide additional data to rank
pages.
In accordance with aspects of the present innovations, mixed-media module interactive summaries as integrated herein are lightweight: they use less bandwidth than pure video and provide a rich, interactive, multi-media experience. Viewing such mixed-media modules is faster and easier than viewing video alone because they are interactive and have more discrete sets of contents that can easily be traversed, beyond the simple play bar associated with most traditional video. Mixed-media modules herein also contain more information (meta-data) than video because of their multitude of components (mixed media), their interactive nature and the ability to re-query.
With regard to certain aspects of the innovations herein, another way that implementations herein are an improvement over the traditional search experience, especially from online video, is that the end user does not experience the mixed-media module in a linear fashion. A user can readily jump to different collections of media once a quick scan assures them the preset set of options will not yield the desired results. The user can also choose their path through the content by clicking on hyperlinks (meta-data) within the mixed-media module. This allows the end-user to explore the information that is of the most interest to them, in greater detail and in their preferred format (i.e., text, photos, or video). Innovations herein also work across multiple platforms. For example, mixed-media module interactive components herein can run inside a standard web browser, and the player software can be integrated into mobile devices, TV devices, video game units, etc. Further, such mixed-media module(s) may be configured as a universal component across all media and devices.
In accordance with aspects of the present innovations, mixed-media modules
herein can act as
an "interactive summary/caption" which highlights the curated content from a
search result and
presents it in narrative form. As such, users may "preview" the contents of
the search in an
engaging, interactive experience on multiple devices. In certain
implementations, an interaction
a user may have with the mixed-media module is via "Gestures", such as set
forth in connection
with FIGs. 15-19. These Gestures may include various touch-screen enabled
interactions
whereby a user is able to tap, pinch, tap and hold, and swipe or scroll the
mixed-media module.
Various search engines, servers and/or intermediaries may be configured to respond to or interact in accordance with these Gestures in different ways, such as the examples described in the Figures and associated descriptions herein. Thus, some implementations herein include methods wherein the interactive multimedia content is configured to allow a new search query and generate a new search results page.
FIG. 15 shows an example Gesture consistent with aspects of the innovations
herein. Here,
within a search result expanded to the selected mixed-media module, systems
and methods
herein may be configured to respond to a user tap or click of an object in the
grid or in the feed
to open another mixed-media module, webpage, video, or detailed animation in
an overlay over
the current screen. Thus, some embodiments include methods wherein the
interaction includes
a tap of a portion, button or link of the selected multimedia presentation
used in the generation
of the new search results page.
FIG. 16 shows another example Gesture consistent with aspects of the innovations herein. Here, a user can pinch into an object in the grid to see detailed or related information on the object, including its source and related media, or to access interactive animations, view full video, read the full article, and the like. Thus, some embodiments include methods wherein interactions include a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
FIG. 17 shows another example Gesture consistent with aspects of the
innovations herein.
Here, for example, systems and methods herein may be configured such that a
user can tap or
click and hold on an element in the grid or in the feed to provide various or
additional options.
Such options may include, though are not limited to, open now, queue for
later, add to favorites,
etc. Thus, some embodiments include methods wherein interactions include a tap
and hold of a
portion, button or link of the selected multimedia presentation used in the
generation of the new
search results page.
FIG. 18 shows another example Gesture consistent with aspects of the
innovations herein.
Here, a user can swipe or scroll with one finger left or right over the grid
to advance or rewind
the presentation of the mixed-media. Thus, some embodiments include methods
wherein
interactions include a swipe or scroll of a portion, button or link of the
selected multimedia
presentation used in the generation of the new search results page.
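The gesture handling described in connection with FIGs. 15-18 amounts to a mapping from touch gestures to module actions, which can be sketched as follows. This is a minimal sketch; the action names are hypothetical placeholders chosen for illustration and do not appear in the disclosure.

```python
# Hypothetical sketch of the gesture-to-action mapping described for
# FIGs. 15-18. Action names are illustrative placeholders.
GESTURE_ACTIONS = {
    "tap":          "open_overlay",       # FIG. 15: open a module/webpage/video in an overlay
    "pinch":        "show_detail",        # FIG. 16: source, related media, full article
    "tap_and_hold": "show_options",       # FIG. 17: open now, queue for later, add to favorites
    "swipe":        "advance_or_rewind",  # FIG. 18: advance or rewind the presentation
}

def handle_gesture(gesture: str, target: str) -> str:
    """Resolve a touch gesture on a mixed-media module element to an action;
    unknown gestures fall back to a no-op."""
    action = GESTURE_ACTIONS.get(gesture, "ignore")
    return f"{action}:{target}"
```

A server or intermediary could interpret the resulting action (for example, treating a tap as a re-query that generates a new search results page), consistent with the implementations described above.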
FIG. 19 shows another example of an illustrative interface involving a sample search result and mixed-media module presentation, consistent with aspects of the innovations herein. Here, for example, the mixed-media module may be presented as the foremost piece of content, such as the first item to select in the upper-left portion of the search result. Such placement yields easy user access to greater content in the mixed-media module, and all of the associated benefits therein to the search engine/provider and ad revenue partners.
In the description here, it is to be understood that both mouse/cursor-enabled computing devices and those without cursors that instead use touch-screen technologies are fully supported. To that end, the terms "click", "tap" and "touch" can be used synonymously and interchangeably. Thus, a clickthrough is the same as a tap-through or any other term with the equivalent meaning. The mobile wireless devices can be touch-screen enabled, using a stylus, finger or other such implement to interact with the screen and the objects on it. Touch-screen technologies also allow for pinching in or out to zoom in or out, or to enlarge or shrink an object or the display. Sliding a touch can scroll in vertical or horizontal directions, or any other direction supported by the system. The touch screens can also detect a prolonged tap, opening further functionality when a tap and hold occurs on an object. In devices that do not support a touch screen, such functionality can be accomplished by a cursor or pointer of some sort, typically controlled by a mouse, pointer stick, roller ball, etc. There may also be additional functionality embedded into the display objects to allow for some of this functionality, such as a scroll bar or zoom buttons, etc. These functionalities are also fully supported here and can be used interchangeably with the touch-screen enabled technologies.
In the present description, the terms component, module, device, etc. may
refer to any type of
logical or functional process or blocks that may be implemented in a variety
of ways. For
example, the functions of various blocks can be combined with one another into
any other
number of modules. Each module can be implemented as a software program stored
on a
tangible memory (e.g., random access memory, read only memory, CD-ROM memory,
hard
disk drive) within or associated with the computing elements, sensors,
receivers, etc. disclosed
above, e.g., to be read by a processing unit to implement the functions of the
innovations
herein. Or, the modules can comprise programming instructions transmitted to a
general
purpose computer or to processing hardware via a transmission carrier wave.
Also, the
modules can be implemented as hardware logic circuitry implementing the
functions
encompassed by the innovations herein. Finally, the modules can be implemented using special-purpose instructions (SIMD instructions), field programmable logic arrays or any mix thereof which provides the desired level of performance and cost.
As disclosed herein, implementations and features of the invention may be
implemented
through computer-hardware, software and/or firmware. For example, the systems
and methods
disclosed herein may be embodied in various forms including, for example, a
data processor,
such as a computer that also includes a database, digital electronic
circuitry, firmware,
software, or in combinations of them. Further, while some of the disclosed
implementations
describe components such as software, systems and methods consistent with the
innovations
herein may be implemented with any combination of hardware, software and/or
firmware.
Moreover, the above-noted features and other aspects and principles of the
innovations herein
may be implemented in various environments. Such environments and related
applications
CA 02857517 2014-05-29
WO 2013/090946
PCT/US2012/070214
18
may be specially constructed for performing the various processes and
operations according to
the invention or they may include a general-purpose computer or computing
platform
selectively activated or reconfigured by code to provide the necessary
functionality. The
processes disclosed herein are not inherently related to any particular
computer, network,
architecture, environment, or other apparatus, and may be implemented by a
suitable
combination of hardware, software, and/or firmware. For example, various
general-purpose
machines may be used with programs written in accordance with teachings of the
invention, or
it may be more convenient to construct a specialized apparatus or system to
perform the
required methods and techniques.
Aspects of the method and system described herein, such as the location
estimate features,
may be implemented as functionality programmed into any of a variety of
circuitry, including
programmable logic devices ("PLDs"), such as field programmable gate arrays
("FPGAs"),
programmable array logic ("PAL") devices, electrically programmable logic and
memory
devices and standard cell-based devices, as well as application specific
integrated circuits.
Some other possibilities for implementing aspects include: memory devices,
microcontrollers
with memory (such as EEPROM), embedded microprocessors, firmware, software,
etc.
Furthermore, aspects may be embodied in microprocessors having software-based
circuit
emulation, discrete logic (sequential and combinatorial), custom devices,
fuzzy (neural) logic,
quantum devices, and hybrids of any of the above device types. The underlying
device
technologies may be provided in a variety of component types, e.g., metal-
oxide
semiconductor field-effect transistor ("MOSFET") technologies like
complementary metal-oxide
semiconductor ("CMOS"), bipolar technologies like emitter-coupled logic
("ECL"), polymer
technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-
metal
structures), mixed analog and digital, and so on.
It should also be noted that the various logic and/or functions disclosed
herein may be enabled
using any number of combinations of hardware, firmware, and/or as data and/or
instructions
embodied in various machine-readable or computer-readable media, in terms of
their
behavioral, register transfer, logic component, and/or other characteristics.
Computer-readable
media in which such formatted data and/or instructions may be embodied to
include, but are
not limited to, non-volatile storage media in various forms (e.g., optical,
magnetic or
semiconductor storage media), though do not include non-tangible media.
Unless the context clearly requires otherwise, throughout the description and
the claims, the
words "comprise," "comprising," and the like are to be construed in an
inclusive sense as
opposed to an exclusive or exhaustive sense; that is to say, in a sense of
"including, but not
limited to." Words using the singular or plural number also include the plural
or singular
number respectively. Additionally, the words "herein," "hereunder," "above,"
"below," and words
of similar import refer to this application as a whole and not to any
particular portions of this
application. When the word "or" is used in reference to a list of two or more
items, that word
covers all of the following interpretations of the word: any of the items in
the list, all of the items
in the list and any combination of the items in the list.
Other implementations of the invention will be apparent to those skilled in
the art from
consideration of the specification and practice of the invention disclosed
herein. It is intended
that the specification and examples be considered as exemplary only, with a
true scope and
spirit of the invention being indicated by the disclosure above in combination
with the following
paragraphs describing the scope of one or more implementations of the
following invention