diff --git a/docs/html/training/articles/assistant.jd b/docs/html/training/articles/assistant.jd index a1fbd6ba1634b..703b3778a2753 100644 --- a/docs/html/training/articles/assistant.jd +++ b/docs/html/training/articles/assistant.jd @@ -11,110 +11,92 @@ page.article=true

In this document

  1. Using the Assistant
     1. Assist API Lifecycle
     2. Source app
     3. Destination app
  2. Implementing Your Own Assistant

Android 6.0 Marshmallow introduces a new way for users to engage with apps through the assistant. The assistant is a top-level window that users can view to obtain contextually relevant actions for the current activity. These actions might include deep links to other apps on the device.


Users activate the assistant with a long press on the Home button or by saying a keyphrase. In response, the system opens a top-level window that displays contextually relevant actions.

The Google App implements the assistant overlay window through a feature called Now on Tap, which works with the Android platform-level functionality. The system allows the user to select an assistant app, which obtains contextual information from your app using Android's Assist API.


This guide explains how Android apps use Android's Assist API to improve the assistant user experience.


Using the Assistant

Figure 1 illustrates a typical user interaction with the assistant. When the user long-presses the Home button, the Assist API callbacks are invoked in the source app (step 1). The assistant renders the overlay window (steps 2 and 3), and then the user selects the action to perform. The assistant executes the selected action, such as firing an intent with a deep link to the (destination) restaurant app (step 4).


Figure 1. Assistant interaction example with the Now on Tap feature of the Google App

Users can configure the assistant by selecting Settings > Apps > Default Apps > Assist & voice input. Users can change system options such as accessing the screen contents as text and accessing a screenshot, as shown in Figure 2.


Figure 2. Assist & voice input settings


Assist API Lifecycle


Source app


In most cases, your app does not need to do anything extra to integrate with the assistant if you already follow accessibility best practices. This section describes how to provide additional information to help improve the assistant user experience as well as scenarios that need special handling, such as custom Views.


Share additional information with the assistant

In addition to the text and the screenshot, your app can share other information with the assistant. For example, your music app can choose to pass current album information so that the assistant can suggest smarter actions tailored to the current activity.

To provide additional information to the assistant, your app provides global application context by registering an app listener and supplies activity-specific information with activity callbacks, as shown in Figure 3:

Figure 3. Assist API lifecycle sequence diagram

To provide global application context, the app creates an implementation of {@link android.app.Application.OnProvideAssistDataListener} and registers it using {@link android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener) registerOnProvideAssistDataListener()}. To provide activity-specific contextual information, the activity overrides {@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} and {@link android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}. The two activity methods are called after the optional global callback is invoked. Because the callbacks execute on the main thread, they should complete promptly. The callbacks are invoked only when the activity is running.
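As a minimal sketch, the global listener might be registered as follows. The extra key and value are hypothetical, not part of the API:

```java
public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Optional app-wide listener; it runs on the main thread before the
        // activity-specific callbacks, so keep the work lightweight.
        registerOnProvideAssistDataListener(
                new Application.OnProvideAssistDataListener() {
                    @Override
                    public void onProvideAssistData(Activity activity, Bundle data) {
                        // Hypothetical app-wide context key.
                        data.putString("com.example.ASSIST_APP_STATE", "browsing");
                    }
                });
    }
}
```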

Providing context

When the user activates the assistant, {@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} is called to build a full {@link android.content.Intent#ACTION_ASSIST} intent with all of the context of the current application represented as an instance of the {@link android.app.assist.AssistStructure}. You can override this method to place anything you like into the bundle to appear in the {@link android.content.Intent#EXTRA_ASSIST_CONTEXT} part of the assist intent.
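For example, an activity override might look like the following sketch; the extra key and album name are hypothetical:

```java
@Override
public void onProvideAssistData(Bundle data) {
    super.onProvideAssistData(data);
    // Hypothetical extra that ends up in EXTRA_ASSIST_CONTEXT.
    data.putString("com.example.music.CURRENT_ALBUM", "Jazz Classics");
}
```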

Describing content

Your app can implement {@link android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()} to improve the assistant user experience by providing references to content related to the current activity. You can describe the app content using the common vocabulary defined by Schema.org through a JSON-LD object. In the example below, a music app provides structured data to describe the music album that the user is currently viewing:

```java
@Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    try {
        // Hypothetical Schema.org identifiers for the current album.
        String structuredJson = new JSONObject()
                .put("@type", "MusicRecording")
                .put("@id", "https://example.com/music/recording")
                .put("name", "Album Title")
                .toString();
        assistContent.setStructuredData(structuredJson);
    } catch (JSONException e) {
        // If the JSON cannot be built, the assistant simply
        // receives no structured data.
    }
}
```
 

You can also improve the user experience with custom implementations of {@link android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}, which can provide the following benefits:

- Adjusting the provided {@link android.app.assist.AssistContent#setIntent(android.content.Intent) content intent} to better reflect the top-level context of the activity.
- Supplying {@link android.app.assist.AssistContent#setWebUri(android.net.Uri) the URI} of the displayed content.
- Filling in {@link android.app.assist.AssistContent#setClipData(android.content.ClipData) setClipData()} with additional content of interest that the user is currently viewing.
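As a minimal sketch of supplying a content URI from onProvideAssistContent() (the URI shown is illustrative):

```java
@Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    // Hypothetical canonical web URI for the content on screen.
    assistContent.setWebUri(
            Uri.parse("https://example.com/album/jazz-classics"));
}
```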

Note: Apps that use a custom text selection implementation likely need to implement {@link android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()} and call {@link android.app.assist.AssistContent#setClipData(android.content.ClipData) setClipData()}.


Default implementation

If neither the {@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} nor the {@link android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()} callback is implemented, the system still proceeds and passes the automatically collected information to the assistant unless the current window is flagged as secure. As shown in Figure 3, the system uses the default implementations of {@link android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()} and {@link android.view.View#onProvideVirtualStructure(android.view.ViewStructure) onProvideVirtualStructure()} to collect text and view hierarchy information. If your view implements custom text drawing, override {@link android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()} to provide the assistant with the text shown to the user by calling {@link android.view.ViewStructure#setText(java.lang.CharSequence) setText(CharSequence)}.
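A sketch of such an override follows; the TickerView class is a hypothetical custom view that draws its own text:

```java
public class TickerView extends View {
    // Text this view renders itself; the default implementation
    // cannot discover custom-drawn text.
    private CharSequence displayedText = "";

    public TickerView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public void onProvideStructure(ViewStructure structure) {
        super.onProvideStructure(structure);
        // Report the custom-drawn text to the Assist API explicitly.
        structure.setText(displayedText);
    }
}
```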

In most cases, implementing accessibility support enables the assistant to obtain the information it needs. To implement accessibility support, observe the best practices described in Making Applications Accessible, including the following:

- Provide {@link android.R.attr#contentDescription android:contentDescription} attributes.
- Populate {@link android.view.accessibility.AccessibilityNodeInfo} for custom views.
- Make sure that custom {@link android.view.ViewGroup ViewGroups} correctly {@link android.view.ViewGroup#getChildAt(int) expose} their children.

Excluding views from the assistant

To handle sensitive information, your app can exclude the current view from the assistant by setting the {@link android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} layout parameter of the {@link android.view.WindowManager}. You must set {@link android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} explicitly for every window created by the activity, including dialogs. Your app can also use {@link android.view.SurfaceView#setSecure(boolean) setSecure()} to exclude a surface from the assistant. There is no global (app-level) mechanism to exclude all views from the assistant. Note that {@link android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} does not cause the Assist API callbacks to stop firing. The activity that uses {@link android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} can still explicitly provide information to the assistant using the callbacks described earlier in this guide.
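Setting the flag in an activity can be sketched as follows; the layout resource name is hypothetical:

```java
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Exclude this window's contents from the assistant. Repeat for every
    // window the activity creates, including dialogs.
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_SECURE,
            WindowManager.LayoutParams.FLAG_SECURE);
    setContentView(R.layout.activity_checkout);  // hypothetical layout
}
```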


Note: For enterprise accounts (Android for Work), the administrator can disable the collection of assistant data for the work profile by using the {@link android.app.admin.DevicePolicyManager#setScreenCaptureDisabled(android.content.ComponentName, boolean) setScreenCaptureDisabled()} method of the {@link android.app.admin.DevicePolicyManager} API.
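Assuming the calling app is the profile owner and MyDeviceAdminReceiver is its admin receiver (both hypothetical names), the call looks like this sketch:

```java
void disableAssistDataForProfile(Context context) {
    DevicePolicyManager dpm = (DevicePolicyManager)
            context.getSystemService(Context.DEVICE_POLICY_SERVICE);
    // MyDeviceAdminReceiver is a hypothetical DeviceAdminReceiver subclass;
    // the caller must be the profile or device owner.
    ComponentName admin = new ComponentName(context, MyDeviceAdminReceiver.class);
    dpm.setScreenCaptureDisabled(admin, true);
}
```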


Voice interactions

Assist API callbacks are also invoked upon keyphrase detection. For more information, see the Voice Actions documentation.

Z-order considerations

The assistant uses a lightweight overlay window displayed on top of the current activity. Because the user can activate the assistant at any time, don't create permanent system alert windows that interfere with the overlay window, as shown in Figure 4.

Figure 4. Assist layer Z-order

If your app uses system alert windows, remove them promptly because leaving them on the screen degrades the user experience.


Destination app

The assistant typically takes advantage of deep linking to find destination apps. To make your app a potential destination app, consider adding deep linking support. The matching between the current user context and deep links or other potential actions displayed in the overlay window (shown in step 3 in Figure 1) is specific to the assistant's implementation. For example, the Google App uses deep linking and Firebase App Indexing in order to drive traffic to destination apps.
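A sketch of a destination activity handling a deep link fired by the assistant; the class name, URI, and showRestaurant() helper are hypothetical:

```java
public class RestaurantActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent intent = getIntent();
        if (Intent.ACTION_VIEW.equals(intent.getAction())
                && intent.getData() != null) {
            // e.g. https://example.com/restaurants/123 from the assistant.
            String restaurantId = intent.getData().getLastPathSegment();
            showRestaurant(restaurantId);  // hypothetical helper
        }
    }

    private void showRestaurant(String id) {
        // Render the restaurant screen for the given id.
    }
}
```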


Implementing Your Own Assistant

You may wish to implement your own assistant. As shown in Figure 2, the user can select the active assistant app. The assistant app must provide an implementation of {@link android.service.voice.VoiceInteractionSessionService} and {@link android.service.voice.VoiceInteractionSession}, as shown in this VoiceInteraction example (https://android.googlesource.com/platform/frameworks/base/+/marshmallow-release/tests/VoiceInteraction/). It also requires the {@link android.Manifest.permission#BIND_VOICE_INTERACTION} permission. The assistant can then receive the text and view hierarchy represented as an instance of the {@link android.app.assist.AssistStructure} in {@link android.service.voice.VoiceInteractionSession#onHandleAssist(android.os.Bundle, android.app.assist.AssistStructure, android.app.assist.AssistContent) onHandleAssist()}. It receives the screenshot through {@link android.service.voice.VoiceInteractionSession#onHandleScreenshot(android.graphics.Bitmap) onHandleScreenshot()}.
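A minimal sketch of the two classes (class names and the log tag are hypothetical; both must be declared in the manifest with the BIND_VOICE_INTERACTION permission):

```java
public class MyInteractionSessionService extends VoiceInteractionSessionService {
    @Override
    public VoiceInteractionSession onNewSession(Bundle args) {
        return new MySession(this);
    }
}

class MySession extends VoiceInteractionSession {
    MySession(Context context) {
        super(context);
    }

    @Override
    public void onHandleAssist(Bundle data, AssistStructure structure,
            AssistContent content) {
        // Text and view hierarchy of the source activity.
        if (structure != null) {
            Log.d("MySession", "Windows: " + structure.getWindowNodeCount());
        }
    }

    @Override
    public void onHandleScreenshot(Bitmap screenshot) {
        // Screenshot of the source activity (null if FLAG_SECURE is set).
    }
}
```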