docs: Revised "Optimizing Content for the Assistant" page.

Bug: 25080159
Change-Id: Ie4167ba732ffbd401cabcb7f0ba3bac41a029506
Kevin Hufnagle
2016-08-04 14:33:36 -07:00
parent ab35b73d1b
commit d4197234bb


<div id="tb">
<h2>In this document</h2>
<ol>
<li><a href="#assist_api">Using the Assistant</a>
<ol>
<li><a href="#source_app">Source app</a></li>
<li><a href="#destination_app">Destination app</a></li>
</ol>
</li>
<li><a href="#implementing_your_own_assistant">Implementing your
own assistant</a></li>
</ol>
</div>
</div>
<p>
Android 6.0 Marshmallow introduces a new way for users to engage with apps
through the assistant. The assistant is a top-level window that users can view to obtain
contextually relevant actions for the current activity. These actions might include deep links
to other apps on the device.</p>
<p>
Users activate the assistant with a long press on the Home button or by saying a
<a href="{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase</a>.
In response, the system opens a top-level window that displays contextually
relevant actions.
</p>
<p>
Google App implements the assistant overlay window through a feature called
Now on Tap, which works with the Android platform-level functionality. The system allows
the user to select an assistant app, which obtains contextual information from your app
using Android's Assist API.
</p>
<p>
This guide explains how Android apps use Android's Assist API to improve the assistant
user experience.
</p>
<h2 id="assist_api">Using the Assistant</h2>
<p>
Figure 1 illustrates a typical user interaction with the assistant. When the user long-presses
the Home button, the Assist API callbacks are invoked
in the <em>source</em> app (step 1). The assistant renders the overlay window (steps 2 and 3),
and then the user selects the action to perform. The assistant executes the selected action,
such as firing an intent with a deep link to the (<em>destination</em>) restaurant app (step 4).
</p>
<div>
<img src="{@docRoot}images/training/assistant/image01.png">
<p class="img-caption" style="text-align:center;">
Figure 1. Assistant interaction example with the Now on Tap feature of
the Google App
</p>
</div>
<p>
Users can configure the assistant by selecting <strong>Settings &gt; Apps &gt; Default Apps &gt;
Assist &amp; voice input</strong>. Users can change system options such as accessing
the screen contents as text and accessing a screenshot, as shown in Figure 2.
</p>
<p>
From there, the assistant receives the information only when the user
activates assistance, such as when they tap and hold the Home button (shown
in Figure 1, step 1).
</p>
<div id="assist-input-settings" style="float:right;margin:1em;max-width:300px">
<img src="{@docRoot}images/training/assistant/image02.png">
<p class="img-caption" style="text-align:center;">
Figure 2. Assist &amp; voice input settings
</p>
</div>
<h3 id="source_app">Source app</h3>
<p>
To ensure that your app works with the assistant as a source of information for the user,
you need only follow <a href=
"{@docRoot}guide/topics/ui/accessibility/apps.html">accessibility best
practices</a>. This section describes how to provide additional information
to help improve the assistant user experience as well as scenarios
that need special handling, such as custom Views.
</p>
<h4 id="share_additional_information_with_the_assistant">Share additional information
with the assistant</h4>
<p>
In addition to the text and the screenshot, your app can share
other information with the assistant. For example, your music
app can choose to pass current album information so that the assistant can
suggest smarter actions tailored to the current activity.
</p>
To provide additional information to the assistant, your app provides
<em>global application context</em> by registering an app listener and
supplies activity-specific information with activity callbacks as shown in
Figure 3:
</p>
<div>
<img src="{@docRoot}images/training/assistant/image03.png">
<p class="img-caption" style="text-align:center;">
Figure 3. Assist API lifecycle sequence diagram
</p>
</div>
To provide global application context, the app creates an implementation of
{@link android.app.Application.OnProvideAssistDataListener} and registers it
using {@link
android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener) registerOnProvideAssistDataListener()}.
To provide activity-specific contextual information, the activity
overrides {@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()}
and {@link
android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}.
The two activity methods are called <em>after</em> the optional global
callback is invoked. Because the callbacks execute on the main thread, they should
complete <a href="{@docRoot}training/articles/perf-anr.html">promptly</a>.
The callbacks are invoked only when the activity is <a href=
"{@docRoot}reference/android/app/Activity.html#ActivityLifecycle">running</a>.
</p>
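<p>
  As an illustration, the global registration described above might look like the
  following sketch. The application subclass name and the bundle key are hypothetical,
  not part of the Assist API:
</p>
<pre class="prettyprint">
public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Global callback, invoked before the activity-specific callbacks.
        registerOnProvideAssistDataListener(new OnProvideAssistDataListener() {
            @Override
            public void onProvideAssistData(Activity activity, Bundle data) {
                // Runs on the main thread, so keep the work brief.
                data.putString("com.example.ASSIST_APP_STATE", "hypothetical state");
            }
        });
    }
}
</pre>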
<h5 id="providing_context">Providing context</h5>
<p>
When the user activates the assistant,
{@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} is called to build a full
{@link
android.content.Intent#ACTION_ASSIST} Intent with all of the context of the
current application represented as an instance of the {@link
android.app.assist.AssistStructure}. You can override this method to place
anything you like into the bundle to appear in the
{@link android.content.Intent#EXTRA_ASSIST_CONTEXT} part of the assist intent.
</p>
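<p>
  For example, an activity could add an extra to the bundle as in the following sketch;
  the key and value are illustrative only, not defined by the platform:
</p>
<pre class="prettyprint">
@Override
public void onProvideAssistData(Bundle data) {
    super.onProvideAssistData(data);
    // Appears under EXTRA_ASSIST_CONTEXT in the ACTION_ASSIST intent.
    data.putString("com.example.CURRENT_ALBUM", "Album Title");
}
</pre>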
<h5 id="describing_content">Describing content</h5>
<p>
Your app can implement {@link
android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}
to improve the assistant user experience by providing references to content
related to the current activity. You can describe the app content using the
common vocabulary defined by <a href="https://schema.org" class="external-link">Schema.org</a>
through a JSON-LD object. In the example below, a music app provides
structured data to describe the music album that the user is currently
viewing:
</p>
<pre class="prettyprint">
@Override
public void onProvideAssistContent(AssistContent <strong>assistContent</strong>) {
  super.onProvideAssistContent(<strong>assistContent</strong>);

  try {
    String structuredJson = new JSONObject()
        .put("@type", "MusicRecording")
        .put("@id", "https://example.com/music/recording")
        .put("name", "Album Title")
        .toString();
    <strong>assistContent</strong>.setStructuredData(structuredJson);
  } catch (JSONException e) {
    // Skip the structured data if the JSON object cannot be built.
  }
}
</pre>
<p>
You can also improve the user experience with custom implementations of
{@link
android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()},
which can provide the following benefits:
</p>
<ul>
<li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setIntent(android.content.Intent)">
Adjusts the provided content
intent</a> to
better reflect the top-level context of the activity.</li>
<li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setWebUri(android.net.Uri)">
Supplies the URI</a>
of the displayed content.</li>
<li>Fills in {@link
android.app.assist.AssistContent#setClipData(android.content.ClipData) setClipData()} with additional
content of interest that the user is currently viewing.</li>
</ul>
<p class="note">
<strong>Note: </strong>Apps that use a custom text selection implementation likely need
to implement {@link
android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}
and call {@link android.app.assist.AssistContent#setClipData(android.content.ClipData) setClipData()}.
</p>
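<p>
  A minimal sketch combining these calls might look like the following; the URI and
  clip text are hypothetical values for a music app:
</p>
<pre class="prettyprint">
@Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    // Public web URI for the content currently displayed.
    assistContent.setWebUri(Uri.parse("https://example.com/music/album/1"));
    // Additional content of interest, such as the user's current selection.
    assistContent.setClipData(ClipData.newPlainText("album", "Album Title"));
}
</pre>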
<h4 id="default_implementation">Default implementation</h4>
<p>
If neither the {@link
android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} nor the {@link
android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}
callback is implemented, the system still proceeds and passes the
automatically collected information to the assistant unless the current
window is flagged as <a href="#excluding_views">secure</a>.
As shown in Figure 3, the system uses the default implementations of {@link
android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()} and {@link
android.view.View#onProvideVirtualStructure(android.view.ViewStructure) onProvideVirtualStructure()} to
collect text and view hierarchy information. If your view implements custom
text drawing, override {@link
android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()} to provide
the assistant with the text shown to the user by calling {@link
android.view.ViewStructure#setText(java.lang.CharSequence) setText(CharSequence)}.
</p>
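<p>
  For instance, a view that draws its own text could expose that text to the assistant
  as in the following sketch; the view class and its text value are hypothetical:
</p>
<pre class="prettyprint">
public class BannerView extends View {
    private String mBannerText = "Hypothetical banner text";

    public BannerView(Context context) {
        super(context);
    }

    @Override
    public void onProvideStructure(ViewStructure structure) {
        super.onProvideStructure(structure);
        // Report the custom-drawn text, which the default
        // implementation cannot collect on its own.
        structure.setText(mBannerText);
    }
}
</pre>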
<p>
<em>In most cases, implementing accessibility support enables the
assistant to obtain the information it needs.</em> To implement accessibility support,
observe the best practices described in <a href=
"{@docRoot}guide/topics/ui/accessibility/apps.html">Making Applications
Accessible</a>, including the following:</p>
<ul>
<li>Provide {@link android.R.attr#contentDescription
android:contentDescription} attributes.</li>
<li>Populate {@link
android.view.accessibility.AccessibilityNodeInfo} for custom views.</li>
<li>Make
sure that custom {@link android.view.ViewGroup ViewGroup} objects correctly
<a href="{@docRoot}reference/android/view/ViewGroup.html#getChildAt(int)">expose</a>
their children.</li>
</ul>
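<p>
  For example, an image-only control can describe itself to both accessibility services
  and the assistant in one call; the resource IDs here are hypothetical:
</p>
<pre class="prettyprint">
ImageButton playButton = (ImageButton) findViewById(R.id.play);
// Read by accessibility services and collected by the assistant.
playButton.setContentDescription(getString(R.string.play_description));
</pre>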
<h4 id="excluding_views">Excluding views from the assistant</h4>
<p>
To handle sensitive information, your app can exclude the current view from the assistant
by setting the {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
FLAG_SECURE} layout parameter of the {@link android.view.WindowManager}. You must set {@link
android.view.WindowManager.LayoutParams#FLAG_SECURE
FLAG_SECURE} explicitly for
every window created by the activity, including dialogs. Your app can also use
{@link android.view.SurfaceView#setSecure(boolean) setSecure()} to exclude
a surface from the assistant. There is no
global (app-level) mechanism to exclude all views from the assistant. Note
that {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
FLAG_SECURE} does not cause the Assist API callbacks to stop
firing. The activity that uses {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
FLAG_SECURE} can still explicitly
provide information to the assistant using the callbacks described earlier in
this guide.
</p>
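<p>
  For example, an activity showing sensitive content could set the flag in
  <code>onCreate()</code>, as in this sketch (the layout resource is hypothetical):
</p>
<pre class="prettyprint">
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Exclude this window from the assistant, screenshots, and
    // non-secure displays. Repeat for each dialog the activity creates.
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_SECURE,
            WindowManager.LayoutParams.FLAG_SECURE);
    setContentView(R.layout.activity_payment);
}
</pre>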
<p class="note"><strong>Note: </strong>For enterprise accounts (Android for Work),
the administrator can disable
the collection of assistant data for the work profile by using the {@link
android.app.admin.DevicePolicyManager#setScreenCaptureDisabled(android.content.ComponentName, boolean)
setScreenCaptureDisabled()} method of the {@link android.app.admin.DevicePolicyManager} API.</p>
<h4 id="voice_interactions">Voice interactions</h4>
<p>
Assist API callbacks are also invoked upon
<a href="{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase
detection</a>. For more information, see the
<a href="https://developers.google.com/voice-actions/" class="external-link">Voice
Actions</a> documentation.
</p>
<h4 id="z-order_considerations">Z-order considerations</h4>
<p>
The assistant uses a lightweight overlay window displayed on top of the
current activity. Because the user can activate the assistant at any time,
don't create permanent <a
href="{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">
system alert</a> windows that interfere with the overlay window, as shown in
Figure 4.
</p>
<div>
<img src="{@docRoot}images/training/assistant/image04.png">
<p class="img-caption" style="text-align:center;">
Figure 4. Assist layer Z-order
</p>
</div>
<p>
If your app uses <a
href="{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">
system alert</a> windows, remove them promptly because leaving them on the
screen degrades the user experience.
</p>
<h3 id="destination_app">Destination app</h3>
<p>
The assistant typically takes advantage of deep linking to find destination apps. To make your
app a potential destination app, consider adding <a href=
"{@docRoot}training/app-indexing/deep-linking.html">deep linking</a> support. The matching
between the current user context and deep links or other potential actions displayed in the
overlay window (shown in step 3 in Figure 1) is specific to the assistant's implementation. For
example, the Google App uses deep linking and <a href=
"https://developers.google.com/app-indexing/" class="external-link">Firebase App Indexing</a> in order to
drive traffic to destination apps.
</p>
<h2 id="implementing_your_own_assistant">Implementing your own assistant</h2>
<p>
You may wish to implement your own assistant. As shown in <a href="#assist-input-settings">Figure
2</a>, the user can select the active assistant app. The
assistant app must provide an implementation of {@link
android.service.voice.VoiceInteractionSessionService} and {@link
android.service.voice.VoiceInteractionSession} as shown in <a href=
"https://android.googlesource.com/platform/frameworks/base/+/marshmallow-release/tests/VoiceInteraction/" class="external-link">
this <code>VoiceInteraction</code> example</a>. It also requires the {@link
android.Manifest.permission#BIND_VOICE_INTERACTION} permission. The assistant can then
receive the text and view hierarchy represented as an instance of the {@link
android.app.assist.AssistStructure} in {@link
android.service.voice.VoiceInteractionSession#onHandleAssist(android.os.Bundle,
android.app.assist.AssistStructure,android.app.assist.AssistContent) onHandleAssist()}.
It receives the screenshot through {@link
android.service.voice.VoiceInteractionSession#onHandleScreenshot(android.graphics.Bitmap)
onHandleScreenshot()}.
</p>
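<p>
  A skeletal session implementation might look like the following sketch. The class
  name is hypothetical, and a real assistant also needs the corresponding
  <code>VoiceInteractionSessionService</code> and manifest declarations:
</p>
<pre class="prettyprint">
public class MyAssistSession extends VoiceInteractionSession {
    public MyAssistSession(Context context) {
        super(context);
    }

    @Override
    public void onHandleAssist(Bundle data, AssistStructure structure,
            AssistContent content) {
        if (structure == null) {
            return;  // The source window may be flagged as secure.
        }
        // Walk the text and view hierarchy captured from the source app.
        for (int i = 0; i &lt; structure.getWindowNodeCount(); i++) {
            AssistStructure.ViewNode root =
                    structure.getWindowNodeAt(i).getRootViewNode();
            // Inspect root.getText(), root.getChildCount(), and so on.
        }
    }

    @Override
    public void onHandleScreenshot(Bitmap screenshot) {
        // Analyze the screenshot of the source activity here.
    }
}
</pre>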