diff --git a/docs/html/resources/resources-data.js b/docs/html/resources/resources-data.js index 3e673a554d0d0..b15e8474a353b 100644 --- a/docs/html/resources/resources-data.js +++ b/docs/html/resources/resources-data.js @@ -608,7 +608,7 @@ var ANDROID_RESOURCES = [ } }, { - tags: ['sample', 'accountsync'], + tags: ['sample', 'accountsync', 'updated'], path: 'samples/SampleSyncAdapter/index.html', title: { en: 'SampleSyncAdapter' diff --git a/docs/html/sdk/android-4.0.jd b/docs/html/sdk/android-4.0.jd index 619c907bcc62c..b4fbe724979bb 100644 --- a/docs/html/sdk/android-4.0.jd +++ b/docs/html/sdk/android-4.0.jd @@ -10,6 +10,7 @@ sdk.platform.apiLevel=14
For a high-level introduction to the new user and developer features in Android 4.0, see the -Platform Highlights.
-Reminder: If you've already published an Android application, please test your application on Android {@sdkPlatformVersion} as soon as possible to be sure your application provides the best experience possible on the latest Android-powered devices.
+For a high-level introduction to the new user and developer features in Android 4.0, see the +Platform Highlights.
+Important: To download the new Android -4.0 system components from the Android SDK Manager, you must first update the -SDK tools to revision 14 and restart the Android SDK Manager. If you do not, -the Android 4.0 system components will not be available for download.
-The Contact APIs that are defined by the {@link android.provider.ContactsContract} provider have -been extended to support new features such as a personal profile for the device owner, large contact -photos, and the ability for users to invite individual contacts to social networks that are -installed on the device.
+The contact APIs that are defined by the {@link android.provider.ContactsContract} provider have +been extended to support new features such as a personal profile for the device owner, high +resolution contact photos, and the ability for users to invite individual contacts to social +networks that are installed on the device.
Android now includes a personal profile that represents the device owner, as defined by the -{@link -android.provider.ContactsContract.Profile} table. Social apps that maintain a user identity can -contribute to the user's profile data by creating a new {@link +{@link android.provider.ContactsContract.Profile} table. Social apps that maintain a user identity +can contribute to the user's profile data by creating a new {@link android.provider.ContactsContract.RawContacts} entry within the {@link android.provider.ContactsContract.Profile}. That is, raw contacts that represent the device user do not belong in the traditional raw contacts table defined by the {@link android.provider.ContactsContract.RawContacts} Uri; instead, you must add a profile raw contact in the table at {@link android.provider.ContactsContract.Profile#CONTENT_RAW_CONTACTS_URI}. Raw -contacts in this table are then aggregated into the single user-visible profile information.
+contacts in this table are then aggregated into the single user-visible profile labeled "Me".

Adding a new raw contact for the profile requires the {@link android.Manifest.permission#WRITE_PROFILE} permission. Likewise, in order to read from the profile table, you must request the {@link android.Manifest.permission#READ_PROFILE} permission. However, -reading the user profile should not be required by most apps, even when contributing data to the -profile. Reading the user profile is a sensitive permission and users will be very skeptical of apps -that request reading their profile information.
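As a sketch of the write path, a social app might insert its profile raw contact as follows (the account name and type are placeholder values for your own sync account, and {@code getContentResolver()} assumes an Activity or Service context):

```java
import android.content.ContentProviderOperation;
import android.provider.ContactsContract;
import android.provider.ContactsContract.Profile;
import android.provider.ContactsContract.RawContacts;

import java.util.ArrayList;

// Requires android.permission.WRITE_PROFILE. The account values are
// placeholders for your own sync account.
ArrayList<ContentProviderOperation> ops = new ArrayList<ContentProviderOperation>();
ops.add(ContentProviderOperation
        .newInsert(Profile.CONTENT_RAW_CONTACTS_URI) // not RawContacts.CONTENT_URI
        .withValue(RawContacts.ACCOUNT_NAME, "me@example.com")
        .withValue(RawContacts.ACCOUNT_TYPE, "com.example.account")
        .build());
try {
    getContentResolver().applyBatch(ContactsContract.AUTHORITY, ops);
} catch (Exception e) {
    // applyBatch throws RemoteException and OperationApplicationException
}
```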
+most apps should not need to read the user profile, even when contributing data to the +profile. Reading the user profile is a sensitive permission and you should expect users to be +skeptical of apps that request it. +Android now supports high resolution photos for contacts. Now, when you push a photo into a -contact -record, the system processes it into both a 96x96 thumbnail (as it has previously) and a 256x256 -"display photo" stored in a new file-based photo store (the exact dimensions that the system chooses -may vary in the future). You can add a large photo to a contact by putting a large photo in the -usual {@link android.provider.ContactsContract.CommonDataKinds.Photo#PHOTO} column of a data row, -which the system will then process into the appropriate thumbnail and display photo records.
+contact record, the system processes it into both a 96x96 thumbnail (as it has previously) and a +256x256 "display photo" that's stored in a new file-based photo store (the exact dimensions that the +system chooses may vary in the future). You can add a large photo to a contact by putting a large +photo in the usual {@link android.provider.ContactsContract.CommonDataKinds.Photo#PHOTO} column of a +data row, which the system will then process into the appropriate thumbnail and display photo +records. +The {@link android.provider.ContactsContract.Intents#INVITE_CONTACT} intent action allows you to -invoke an action that indicates the user wants to add a contact to a social network that understand -this intent and use it to invite the contact specified in the contact to that social network.
+The {@link android.provider.ContactsContract.Intents#INVITE_CONTACT} intent action allows an app +to invoke an action that indicates the user wants to add a contact to a social network. The app +receiving the intent uses it to invite the specified contact to that +social network. Most apps will be on the receiving end of this operation. For example, the +built-in People app invokes the invite intent when the user selects "Add connection" for a specific +social app that's listed in a person's contact details.
+ +To make your app visible in the "Add connection" list, your app must provide a sync adapter to +sync contact information from your social network. You must then indicate to the system that your +app responds to the {@link android.provider.ContactsContract.Intents#INVITE_CONTACT} intent by +adding the {@code inviteContactActivity} attribute to your app’s sync configuration file, with a +fully-qualified name of the activity that the system should start when sending the invite intent. +The activity that starts can then retrieve the URI for the contact in question from the intent’s +data and perform the necessary work to invite that contact to the network or add the person to the +user’s connections.
+ +See the Sample Sync +Adapter app for an example (specifically, see the contacts.xml +file).
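A minimal sketch of such a sync configuration file (the element name follows the SampleSyncAdapter's contacts.xml; the activity name is a placeholder for your own invite activity):

```xml
<!-- res/xml/contacts.xml, referenced from the sync adapter's meta-data -->
<ContactsAccountType xmlns:android="http://schemas.android.com/apk/res/android"
    android:inviteContactActivity="com.example.app.activities.InviteContactActivity" />
```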
-Apps that use a sync adapter to provide information about contacts can register with the system -to -receive the invite intent when there’s an opportunity for the user to “invite” a contact to the -app’s social network (such as from a contact card in the People app). To receive the invite intent, -you simply need to add the {@code inviteContactActivity} attribute to your app’s XML sync -configuration file, providing a fully-qualified name of the activity that the system should start -when the user wants to “invite” a contact in your social network. The activity that starts can then -retrieve the URI for the contact in question from the intent’s data and perform the necessary work -to -invite that contact to the network or add the person to the user’s connections.
The new {@link android.provider.ContactsContract.DataUsageFeedback} APIs allow you to help track how often the user uses particular methods of contacting people, such as how often the user uses each phone number or e-mail address. This information helps improve the ranking for each contact -method associated with each person and provide such contact methods as suggestions.
+method associated with each person and provide better suggestions for contacting each person. -The new calendar API allows you to access and modify the user’s calendars and events. The -calendar -APIs are provided with the {@link android.provider.CalendarContract} provider. Using the calendar -provider, you can:
-The new calendar APIs allow you to access and modify the user’s calendars and events using the +Calendar Provider. You can read, add, modify and delete calendars, events, attendees, reminders and +alerts.
-{@link android.provider.CalendarContract} defines the data model of calendar and event-related -information. All of the user’s calendar data is stored in a number of tables defined by subclasses -of {@link android.provider.CalendarContract}:
+A variety of apps and widgets can use these APIs to read and modify calendar events. However, +some of the most compelling use cases are sync adapters that synchronize the user's calendar from +other calendar services with the Calendar Provider, in order to offer a unified location for +all the user's events. Google Calendar, for example, uses a sync adapter to synchronize Google +Calendar events with the Calendar Provider, which can then be viewed with Android's built-in +Calendar app.
+ +The data model for calendars and event-related information in the Calendar Provider is +defined by {@link android.provider.CalendarContract}. All the user’s calendar data is stored in a +number of tables defined by various subclasses of {@link android.provider.CalendarContract}:
To access a user’s calendar data with the calendar provider, your application must request
-permission from the user by declaring
To access a user’s calendar data with the Calendar Provider, your application must request +the {@link android.Manifest.permission#READ_CALENDAR} permission (for read access) and +{@link android.Manifest.permission#WRITE_CALENDAR} (for write access).
-However, if all you want to do is add an event to the user’s calendar, you can instead use an -INSERT -{@link android.content.Intent} to start an activity in the Calendar app that creates new events. -Using the intent does not require the WRITE_CALENDAR permission and you can specify the {@link -android.provider.CalendarContract#EXTRA_EVENT_BEGIN_TIME} and {@link -android.provider.CalendarContract#EXTRA_EVENT_END_TIME} extra fields to pre-populate the form with -the time of the event. The values for these times must be in milliseconds from the epoch. You must -also specify {@code “vnd.android.cursor.item/event”} as the intent type.
+If all you want to do is add an event to the user’s calendar, you can use an +{@link android.content.Intent#ACTION_INSERT} intent with a {@code "vnd.android.cursor.item/event"} +MIME type to start an activity in the Calendar app that creates new events. Using the intent does +not require any permission and you can specify event details with the following extras:
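A sketch of this intent, assuming an Activity context and precomputed epoch-millisecond times:

```java
import android.content.Intent;
import android.provider.CalendarContract;

// startMillis and endMillis are the event's times in milliseconds from the epoch.
Intent intent = new Intent(Intent.ACTION_INSERT)
        .setType("vnd.android.cursor.item/event")
        .putExtra(CalendarContract.EXTRA_EVENT_BEGIN_TIME, startMillis)
        .putExtra(CalendarContract.EXTRA_EVENT_END_TIME, endMillis)
        .putExtra(CalendarContract.Events.TITLE, "Lunch"); // optional title extra
startActivity(intent); // opens the Calendar app's event editor, pre-populated
```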
+ +The new voicemail APIs allow applications to add voicemails to a content provider on the device. +Because the APIs currently do not allow third party apps to read all the voicemails from the system, +the only third-party apps that should use the voicemail APIs are those that have voicemail to +deliver to the user. For instance, it’s possible that a user has multiple voicemail sources, such as +one provided by the phone’s service provider and others from VoIP or other alternative voice +services. These apps can use the APIs to add their voicemails to the system for quick playback. The +built-in Phone application presents all voicemails from the Voicemail Provider in a single list. +Although the system’s Phone application is the only application that can read all the voicemails, +each application that provides voicemails can read those that it has added to the system (but cannot +read voicemails from other services).
+ +The {@link android.provider.VoicemailContract} class defines the content provider for the +voicemail APIs. The subclasses {@link android.provider.VoicemailContract.Voicemails} and {@link +android.provider.VoicemailContract.Status} provide tables in which the Voicemail Providers can +insert voicemail data for storage on the device. For an example of a voicemail provider app, see the +Voicemail Provider +Demo.
The {@link android.hardware.Camera} APIs now support face detection and control for metering and -focus areas.
+The {@link android.hardware.Camera} class now includes APIs for detecting faces and controlling +focus and metering areas.
-Camera apps can now enhance their abilities with Android’s face detection software, which not -only -detects the face of a subject, but also specific facial features, such as the eyes and mouth.
+Camera apps can now enhance their abilities with Android’s face detection APIs, which not +only detect the face of a subject, but also specific facial features, such as the eyes and mouth. +
To detect faces in your camera application, you must register a {@link android.hardware.Camera.FaceDetectionListener} by calling {@link @@ -276,41 +327,38 @@ android.hardware.Camera#setFaceDetectionListener setFaceDetectionListener()}. Yo your camera surface and start detecting faces by calling {@link android.hardware.Camera#startFaceDetection}.
-When the system detects a face, it calls the {@link +
When the system detects one or more faces in the camera scene, it calls the {@link android.hardware.Camera.FaceDetectionListener#onFaceDetection onFaceDetection()} callback in your implementation of {@link android.hardware.Camera.FaceDetectionListener}, including an array of {@link android.hardware.Camera.Face} objects.
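Put together, a minimal sketch (assuming {@code camera} is an open android.hardware.Camera with a preview running):

```java
import android.hardware.Camera;
import android.util.Log;

camera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
    @Override
    public void onFaceDetection(Camera.Face[] faces, Camera camera) {
        for (Camera.Face face : faces) {
            // face.rect bounds the face in the camera driver's
            // coordinate space (-1000 to 1000 on each axis)
            Log.d("FaceDemo", "Face at " + face.rect + ", score " + face.score);
        }
    }
});
camera.startFaceDetection(); // call after startPreview()
```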
An instance of the {@link android.hardware.Camera.Face} class provides various information about -the -face detected by the camera, including:
+the face detected, including:

Camera apps can now control the areas that the camera uses for focus and when metering white +
Camera apps can now control the areas that the camera uses for focus and for metering white balance -and auto-exposure (when supported by the hardware). Both features use the new {@link -android.hardware.Camera.Area} class to specify the region of the camera’s current view that should -be focused or metered. An instance of the {@link android.hardware.Camera.Area} class defines the -bounds of the area with a {@link android.graphics.Rect} and the weight of the -area—representing the level of importance of that area, relative to other areas in -consideration—with an integer.
+and auto-exposure. Both features use the new {@link android.hardware.Camera.Area} class to specify +the region of the camera’s current view that should be focused or metered. An instance of the {@link +android.hardware.Camera.Area} class defines the bounds of the area with a {@link +android.graphics.Rect} and the area's weight—representing the level of importance of that +area, relative to other areas in consideration—with an integer.

Before setting either a focus area or metering area, you should first call {@link android.hardware.Camera.Parameters#getMaxNumFocusAreas} or {@link android.hardware.Camera.Parameters#getMaxNumMeteringAreas}, respectively. If these return zero, then -the device does not support the respective feature.
+the device does not support the corresponding feature.

To specify the focus or metering areas to use, simply call {@link android.hardware.Camera.Parameters#setFocusAreas setFocusAreas()} or {@link @@ -318,17 +366,17 @@ android.hardware.Camera.Parameters#setFocusAreas setMeteringAreas()}. Each take java.util.List} of {@link android.hardware.Camera.Area} objects that indicate the areas to consider for focus or metering. For example, you might implement a feature that allows the user to set the focus area by touching an area of the preview, which you then translate to an {@link -android.hardware.Camera.Area} object and set the focus to that spot. The focus or exposure in that -area will continually update as the scene in the area changes.
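A sketch of setting a single focus area (assuming {@code camera} is an open Camera; the coordinates and weight are illustrative):

```java
import android.graphics.Rect;
import android.hardware.Camera;

import java.util.ArrayList;
import java.util.List;

Camera.Parameters params = camera.getParameters();
if (params.getMaxNumFocusAreas() > 0) { // zero means focus areas are unsupported
    List<Camera.Area> areas = new ArrayList<Camera.Area>();
    // Area coordinates span -1000..1000 across the camera's field of view;
    // the weight (1..1000) ranks this area against any others in the list.
    areas.add(new Camera.Area(new Rect(-100, -100, 100, 100), 1000));
    params.setFocusAreas(areas);
    camera.setParameters(params);
}
```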
+android.hardware.Camera.Area} object and request that the camera focus on that area of the scene. +The focus or exposure in that area will continually update as the scene in the area changes. -Android 4.0 adds several new APIs for applications that interact with media such as photos, -videos, -and music.
+videos, and music. -Android 4.0 adds support for:
The new {@link android.media.RemoteControlClient} allows media players to enable playback -controls -from remote control clients such as the device lock screen. Media players can also expose +controls from remote control clients such as the device lock screen. Media players can also expose information about the media currently playing for display on the remote control, such as track information and album art.
To enable remote control clients for your media player, instantiate a {@link -android.media.RemoteControlClient} with a {@link android.app.PendingIntent} that broadcasts {@link +android.media.RemoteControlClient} with its constructor, passing it a {@link +android.app.PendingIntent} that broadcasts {@link android.content.Intent#ACTION_MEDIA_BUTTON}. The intent must also declare the explicit {@link android.content.BroadcastReceiver} component in your app that handles the {@link android.content.Intent#ACTION_MEDIA_BUTTON} event.
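A sketch of that registration (MediaButtonReceiver stands in for your app's {@link android.content.Intent#ACTION_MEDIA_BUTTON} receiver; {@code context} is your service or activity):

```java
import android.app.PendingIntent;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.media.AudioManager;
import android.media.RemoteControlClient;

// MediaButtonReceiver is your own BroadcastReceiver for ACTION_MEDIA_BUTTON.
ComponentName receiver = new ComponentName(context, MediaButtonReceiver.class);
Intent mediaButtonIntent = new Intent(Intent.ACTION_MEDIA_BUTTON);
mediaButtonIntent.setComponent(receiver); // the explicit component is required

RemoteControlClient rcc = new RemoteControlClient(
        PendingIntent.getBroadcast(context, 0, mediaButtonIntent, 0));

AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
am.registerMediaButtonEventReceiver(receiver);
am.registerRemoteControlClient(rcc);
```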
@@ -424,21 +478,19 @@ android.media.MediaMetadataRetriever}.For a sample implementation, see the Random Music Player, which -provides compatibility logic such that it enables the remote control client while continuing to -support Android 2.1 devices.
+provides compatibility logic such that it enables the remote control client on Android 4.0 +devices while continuing to support devices back to Android 2.1.

A new media effects framework allows you to apply a variety of visual effects to images and -videos. -The system performs all effects processing on the GPU to obtain maximum performance. Applications in -Android 4.0 such as Google Talk or the Gallery editor make use of the effects API to apply real-time -effects to video and photos.
+videos. The system performs all effects processing on the GPU to obtain maximum performance. +New applications for Android 4.0 such as Google Talk and the Gallery editor make use of the +effects API to apply real-time effects to video and photos.

For maximum performance, effects are applied directly to OpenGL textures, so your application -must -have a valid OpenGL context before it can use the effects APIs. The textures to which you apply effects may be from bitmaps, videos or even the camera. However, there are certain restrictions that textures must meet:
An {@link android.media.effect.Effect} object defines a single media effect that you can apply to -an -image frame. The basic workflow to create an {@link android.media.effect.Effect} is:
+an image frame. The basic workflow to create an {@link android.media.effect.Effect} is:

Not all devices support all effects, so you must first check if the desired effect is supported -by -calling {@link android.media.effect.EffectFactory#isEffectSupported isEffectSupported()}.
+by calling {@link android.media.effect.EffectFactory#isEffectSupported isEffectSupported()}. -You can adjust the effect’s parameters by calling {@link android.media.effect.Effect#setParameter +
You can adjust an effect’s parameters by calling {@link android.media.effect.Effect#setParameter setParameter()} and passing a parameter name and parameter value. Each type of effect accepts different parameters, which are documented with the effect name. For example, {@link android.media.effect.EffectFactory#EFFECT_FISHEYE} has one parameter for the {@code scale} of the @@ -480,8 +529,8 @@ texture. The input texture must be bound to a {@link android.opengl.GLES20#GL_T image (usually done by calling the {@link android.opengl.GLES20#glTexImage2D glTexImage2D()} function). You may provide multiple mipmap levels. If the output texture has not been bound to a texture image, it will be automatically bound by the effect as a {@link -android.opengl.GLES20#GL_TEXTURE_2D}. It will contain one mipmap level (0), which will have the same -size as the input.
+android.opengl.GLES20#GL_TEXTURE_2D} and with one mipmap level (0), which will have the same +size as the input. @@ -501,7 +550,7 @@ android.bluetooth.BluetoothProfile.ServiceListener} and the {@link android.bluetooth.BluetoothProfile#HEALTH} profile type to establish a connection with the profile proxy object. -Once you’ve acquired the Health profile proxy (the {@link android.bluetooth.BluetoothHealth} +
Once you’ve acquired the Health Profile proxy (the {@link android.bluetooth.BluetoothHealth} object), connecting to and communicating with paired health devices involves the following new Bluetooth classes:
For more information about using the Bluetooth Health profile, see the documentation for {@link +
For more information about using the Bluetooth Health Profile, see the documentation for {@link android.bluetooth.BluetoothHealth}.
+Android Beam allows you to send NDEF messages (an NFC standard for data stored on NFC tags) from -one -device to another (a process also known as “NDEF Push”). The data transfer is initiated when two +
Android Beam is a new NFC feature that allows you to send NDEF messages from one device to +another (a process also known as “NDEF Push”). The data transfer is initiated when two Android-powered devices that support Android Beam are in close proximity (about 4 cm), usually with their backs touching. The data inside the NDEF message can contain any data that you wish to share between devices. For example, the People app shares contacts, YouTube shares videos, and Browser @@ -531,29 +580,30 @@ shares URLs using Android Beam.
To transmit data between devices using Android Beam, you need to create an {@link android.nfc.NdefMessage} that contains the information you want to share while your activity is in -the foreground. You must then pass the -{@link android.nfc.NdefMessage} to the system in one of two ways:
+the foreground. You must then pass the {@link android.nfc.NdefMessage} to the system in one of two +ways:

Call {@link android.nfc.NfcAdapter#setNdefPushMessage setNdefPushMessage()} at any time to set -the -message you want to send. For instance, you might call this method and pass it your {@link android.nfc.NdefMessage} during your activity’s {@link android.app.Activity#onCreate onCreate()} -method. Then, whenever Android Beam is activated with another device while your activity is in the -foreground, the system sends that {@link android.nfc.NdefMessage} to the other device.
Implement {@link android.nfc.NfcAdapter.CreateNdefMessageCallback}, in which the {@link -android.nfc.NfcAdapter.CreateNdefMessageCallback#createNdefMessage createNdefMessage()} callback +
Implement {@link android.nfc.NfcAdapter.CreateNdefMessageCallback}, in which your +implementation of the {@link +android.nfc.NfcAdapter.CreateNdefMessageCallback#createNdefMessage createNdefMessage()} method returns the {@link android.nfc.NdefMessage} you want to send. Then pass the {@link -android.nfc.NfcAdapter.CreateNdefMessageCallback} to {@link -android.nfc.NfcAdapter#setNdefPushMessageCallback setNdefPushMessageCallback()}. In this case, when -Android Beam is activated with another device while your activity is in the foreground, the system -calls {@link android.nfc.NfcAdapter.CreateNdefMessageCallback#createNdefMessage createNdefMessage()} -to retrieve the {@link android.nfc.NdefMessage} you want to send. This allows you to create a -different {@link android.nfc.NdefMessage} for each occurrence, depending on the user context (such -as which contact in the People app is currently visible).
In this case, when Android Beam is activated with another device while your activity is in the +foreground, the system calls {@link +android.nfc.NfcAdapter.CreateNdefMessageCallback#createNdefMessage createNdefMessage()} to retrieve +the {@link android.nfc.NdefMessage} you want to send. This allows you to define the {@link +android.nfc.NdefMessage} to deliver only once Android Beam is initiated, in case the contents +of the message might vary throughout the life of the activity.
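As a sketch, here is the static-message approach from the first option, using an Android application record (the package name is a placeholder; a real message would usually carry additional data records):

```java
import android.app.Activity;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.os.Bundle;

public class BeamActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
        if (adapter != null) { // null on devices without NFC
            NdefMessage message = new NdefMessage(new NdefRecord[] {
                // Guarantees this package handles the message on the receiving device
                NdefRecord.createApplicationRecord("com.example.beam")
            });
            // Sent whenever Beam is activated while this activity is in the foreground
            adapter.setNdefPushMessage(message, this);
        }
    }
}
```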
In case you want to run some specific code once the system has successfully delivered your NDEF @@ -567,7 +617,7 @@ onNdefPushComplete()} when the message is delivered.
tags. The system invokes an intent with the {@link android.nfc.NfcAdapter#ACTION_NDEF_DISCOVERED} action to start an activity, with either a URL or a MIME type set according to the first {@link android.nfc.NdefRecord} in the {@link android.nfc.NdefMessage}. For the activity you want to -respond, you can set intent filters for the URLs or MIME types your app cares about. For more +respond, you can declare intent filters for the URLs or MIME types your app cares about. For more information about Tag Dispatch see the NFC developer guide. @@ -578,46 +628,51 @@ a special format that you want your application to also receive during an Androi should create an intent filter for your activity using the same URI scheme in order to receive the incoming NDEF message. -You may also want to pass an “Android application record” with your {@link -android.nfc.NdefMessage} -in order to guarantee a specific application handles an NDEF message, regardless of whether other -applications filter for the same intent. You can create an Android application record by calling -{@link android.nfc.NdefRecord#createApplicationRecord createApplicationRecord()}, passing it the -application’s package name. When the other device receives the NDEF message with this record, the -system automatically starts the application matching the package name. If the target device does not -currently have the application installed, the system uses the Android application record to launch -Android Market and take the user to the application to install it.
+You should also pass an “Android application record” with your {@link android.nfc.NdefMessage} in +order to guarantee that your application handles the incoming NDEF message, even if other +applications filter for the same intent action. You can create an Android application record by +calling {@link android.nfc.NdefRecord#createApplicationRecord createApplicationRecord()}, passing it +your application’s package name. When the other device receives the NDEF message with the +application record and multiple applications contain activities that handle the specified intent, +the system always delivers the message to the activity in your application (based on the matching +application record). If the target device does not currently have your application installed, the +system uses the Android application record to launch Android Market and take the user to the +application in order to install it.
If your application doesn’t use NFC APIs to perform NDEF Push messaging, then Android provides a default behavior: When your application is in the foreground on one device and Android Beam is invoked with another Android-powered device, then the other device receives an NDEF message with an Android application record that identifies your application. If the receiving device has the application installed, the system launches it; if it’s not installed, Android Market opens and takes -the user to your application so they can install it.
+the user to your application in order to install it. +For some example code, see the Android +Beam Demo sample app.
Android now supports Wi-Fi Direct™ for peer-to-peer (P2P) connections between -Android-powered +
Android now supports Wi-Fi Direct for peer-to-peer (P2P) connections between Android-powered devices and other device types without a hotspot or Internet connection. The Android framework provides a set of Wi-Fi P2P APIs that allow you to discover and connect to other devices when each -device supports Wi-Fi Direct™, then communicate over a speedy connection across distances much -longer than a Bluetooth connection.
+device supports Wi-Fi Direct, then communicate over a speedy connection across distances much longer +than a Bluetooth connection.A new package, {@link android.net.wifi.p2p}, contains all the APIs for performing peer-to-peer connections with Wi-Fi. The primary class you need to work with is {@link -android.net.wifi.p2p.WifiP2pManager}, for which you can get an instance by calling {@link +android.net.wifi.p2p.WifiP2pManager}, which you can acquire by calling {@link android.app.Activity#getSystemService getSystemService(WIFI_P2P_SERVICE)}. The {@link -android.net.wifi.p2p.WifiP2pManager} provides methods that allow you to:
+android.net.wifi.p2p.WifiP2pManager} includes APIs that allow you to:The Android system also broadcasts several different actions during certain Wi-Fi P2P events:
See the {@link android.net.wifi.p2p.WifiP2pManager} documentation for more information. Also -look -at the Wi-Fi Direct sample -application for example code.
+look at the Wi-Fi Direct Demo +sample application. @@ -685,20 +741,20 @@ application for example code.Android 4.0 gives users precise visibility of how much network data applications are using. The -Settings app provides controls that allow users to manage set limits for network data usage and even -disable the use of background data for individual apps. In order to avoid users disabling your app’s -access to data from the background, you should develop strategies to use use the data connection -efficiently and vary your usage depending on the type of connection available.
+Android 4.0 gives users precise visibility of how much network data their applications are using. +The Settings app provides controls that allow users to set limits for network data usage and +even disable the use of background data for individual apps. In order to avoid users disabling your +app’s access to data from the background, you should develop strategies to use the data +connection efficiently and adjust your usage depending on the type of connection available.
If your application performs a lot of network transactions, you should provide user settings that allow users to control your app’s data habits, such as how often your app syncs data, whether to perform uploads/downloads only when on Wi-Fi, whether to use data while roaming, etc. With these controls available to them, users are much less likely to disable your app’s access to data when they approach their limits, because they can instead precisely control how much data your app uses. -When you provide an activity with these settings, you should include in its manifest declaration an -intent filter for the {@link android.content.Intent#ACTION_MANAGE_NETWORK_USAGE} action. For -example:
+If you provide a preference activity with these settings, you should include in its manifest +declaration an intent filter for the {@link android.content.Intent#ACTION_MANAGE_NETWORK_USAGE} +action. For example:<activity android:name="DataPreferences" android:label="@string/title_preferences"> @@ -709,10 +765,10 @@ example: </activity>-
This intent filter indicates to the system that this is the application that controls your +
This intent filter indicates to the system that this is the activity that controls your application’s data usage. Thus, when the user inspects how much data your app is using from the -Settings app, a “View application settings” button is available that launches your activity so the -user can refine how much data your app uses.
+Settings app, a “View application settings” button is available that launches your +preference activity so the user can refine how much data your app uses.

Also beware that {@link android.net.ConnectivityManager#getBackgroundDataSetting()} is now deprecated and always returns true—use {@link @@ -720,7 +776,7 @@ android.net.ConnectivityManager#getActiveNetworkInfo()} instead. Before you attempt network transactions, you should always call {@link android.net.ConnectivityManager#getActiveNetworkInfo()} to get the {@link android.net.NetworkInfo} that represents the current network and query {@link android.net.NetworkInfo#isConnected()} to check whether the device has a -connection. You can then check various other connection properties, such as whether the device is
@@ -729,43 +785,10 @@ roaming or connected to Wi-Fi. -Two new sensor types have been added in Android 4.0: {@link -android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} and {@link -android.hardware.Sensor#TYPE_RELATIVE_HUMIDITY}.
+{@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} is a temperature sensor that provides -the ambient (room) temperature near a device. This sensor reports data in degrees Celsius. {@link -android.hardware.Sensor#TYPE_RELATIVE_HUMIDITY} is a humidity sensor that provides the relative -ambient (room) humidity. The sensor reports data as a percentage. If a device has both {@link -android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} and {@link -android.hardware.Sensor#TYPE_RELATIVE_HUMIDITY} sensors, you can use them to calculate the dew point -and the absolute humidity.
- -The existing temperature sensor ({@link android.hardware.Sensor#TYPE_TEMPERATURE}) has been -deprecated. You should use the {@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} sensor -instead.
- -Additionally, Android’s three synthetic sensors have been improved so they now have lower latency -and smoother output. These sensors include the gravity sensor ({@link -android.hardware.Sensor#TYPE_GRAVITY}), rotation vector sensor ({@link -android.hardware.Sensor#TYPE_ROTATION_VECTOR}), and linear acceleration sensor ({@link -android.hardware.Sensor#TYPE_LINEAR_ACCELERATION}). The improved sensors rely on the gyroscope -sensor to improve their output so the sensors appear only on devices that have a gyroscope. If a -device already provides one of the sensors, then that sensor appears as a second sensor on the -device. The three improved sensors have a version number of 2.
- - - - - - - - -Three major features have been added to Renderscript:
+Three major features have been added to RenderScript:
The {@link android.renderscript.Allocation} class now supports a {@link android.renderscript.Allocation#USAGE_GRAPHICS_RENDER_TARGET} memory space, which allows you to render things directly into the {@link android.renderscript.Allocation} and use it as a framebuffer -object.
+object. -{@link android.renderscript.RSTextureView} provides a means to display Renderscript graphics -inside -of a normal View, unlike {@link android.renderscript.RSSurfaceView}, which creates a separate -window. This key difference allows you to do things such as move, transform, or animate an {@link -android.renderscript.RSTextureView} as well as draw Renderscript graphics inside the View alongside -other traditional View widgets.
+{@link android.renderscript.RSTextureView} provides a means to display RenderScript graphics +inside of a {@link android.view.View}, unlike {@link android.renderscript.RSSurfaceView}, which +creates a separate window. This key difference allows you to do things such as move, transform, or +animate an {@link android.renderscript.RSTextureView} as well as draw RenderScript graphics inside +a view that lies within an activity layout.
-The {@link android.renderscript.Script#forEach forEach()} method allows you to call Renderscript -compute scripts from the VM level and have them automatically delegated to available cores on the -device. You do not use this method directly, but any compute Renderscript that you write will have a -{@link android.renderscript.Script#forEach forEach()} method that you can call in the reflected -Renderscript class. You can call the reflected {@link android.renderscript.Script#forEach forEach()} -method by passing in an input {@link android.renderscript.Allocation} to process, an output {@link -android.renderscript.Allocation} to write the result to, and a data structure if the Renderscript -needs more information in addition to the {@link android.renderscript.Allocation}s to. Only one of -the {@link android.renderscript.Allocation}s is necessary and the data structure is optional.
+The {@link android.renderscript.Script#forEach Script.forEach()} method allows you to call +RenderScript compute scripts from the VM level and have them automatically delegated to available +cores on the device. You do not use this method directly, but any compute RenderScript that you +write will have a {@link android.renderscript.Script#forEach forEach()} method that you can call in +the reflected RenderScript class. You can call the reflected {@link +android.renderscript.Script#forEach forEach()} method by passing in an input {@link +android.renderscript.Allocation} to process, an output {@link android.renderscript.Allocation} to +write the result to, and a {@link android.renderscript.FieldPacker} data structure in case the +RenderScript needs more information. Only one of the {@link android.renderscript.Allocation}s is +necessary and the data structure is optional.
@@ -802,118 +825,154 @@ the {@link android.renderscript.Allocation}s is necessary and the data structureAndroid 4.0 improves accessibility for users with disabilities with the Touch Exploration service -and provides extended APIs for developers of new accessibility services.
- -Users with vision loss can now explore applications by touching areas of the screen and hearing -voice descriptions of the content. The “Explore by Touch” feature works like a virtual cursor as the -user drags a finger across the screen.
- -You don’t have to use any new APIs to enhance touch exploration in your application, because the -existing {@link android.R.attr#contentDescription android:contentDescription} -attribute and {@link android.view.View#setContentDescription setContentDescription()} method is all -you need. Because touch exploration works like a virtual cursor, it allows screen readers to -identify the descriptive the same way that screen readers can when navigating with a d-pad or -trackball. So this is a reminder to provide descriptive text for the views in your application, -especially for {@link android.widget.ImageButton}, {@link android.widget.EditText}, {@link -android.widget.CheckBox} and other interactive widgets that might not contain text information by -default.
- -Developers of custom Views, ViewGroups and widgets can make their components compatible with -accessibility services like Touch Exploration. For custom views and widgets targeted for Android 4.0 -and later, developers should implement the following accessibility API methods in their classes:
-Developers who want to maintain compatibility with Android versions prior to 4.0, while still -providing support for new the accessibility APIs, can use the {@link -android.view.View#setAccessibilityDelegate(android.view.View.AccessibilityDelegate) -setAccessibilityDelegate()} method to provide an {@link android.view.View.AccessibilityDelegate} -containing implementations of the new accessibility API methods while maintaining compatibility with -prior releases.
+Android 4.0 improves accessibility for sight-impaired users with a new explore-by-touch mode +and extended APIs that allow you to provide more information about view content or +develop advanced accessibility services.
+Users with vision loss can now explore the screen by touching and dragging a finger across the +screen to hear voice descriptions of the content. Because the explore-by-touch mode works like a +virtual cursor, it allows screen readers to identify the descriptive text the same way that screen +readers can when the user navigates with a d-pad or trackball—by reading information provided +by {@link android.R.attr#contentDescription android:contentDescription} and {@link +android.view.View#setContentDescription setContentDescription()} upon a simulated "hover" event. So, +consider this a reminder that you should provide descriptive text for the views in your +application, especially for {@link android.widget.ImageButton}, {@link android.widget.EditText}, +{@link android.widget.ImageView} and other widgets that might not naturally contain descriptive +text.
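In layout XML, providing that descriptive text is a one-attribute change. The id, drawable, and string names in this sketch are hypothetical; the attribute that matters is {@code android:contentDescription}:

```xml
<!-- Hypothetical names; the key attribute is android:contentDescription,
     which explore-by-touch and other screen readers speak on "hover". -->
<ImageButton
    android:id="@+id/button_play"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_play"
    android:contentDescription="@string/description_play" />
```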
-Accessibility events have been significantly improved to provide better information for -accessibility services. In particular, events are generated based on view composition, providing -better context information and allowing accessibility service developers to traverse view -hierarchies to get additional view information and deal with special cases.
-To access additional content information and traverse view hierarchies, accessibility service -application developers should use the following procedure.
+To enhance the information available to accessibility services such as screen readers, you can +implement new callback methods for accessibility events in your custom {@link +android.view.View} components.
+ +It's important to first note that the behavior of the {@link +android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} method has changed in Android +4.0. As with previous versions of Android, when the user enables accessibility services on the device +and an input event such as a click or hover occurs, the respective view is notified with a call to +{@link android.view.View#sendAccessibilityEvent sendAccessibilityEvent()}. Previously, the +implementation of {@link android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} would +initialize an {@link android.view.accessibility.AccessibilityEvent} and send it to {@link +android.view.accessibility.AccessibilityManager}. The new behavior involves some additional callback +methods that allow the view and its parents to add more contextual information to the event:
Custom implementations of {@link android.view.View} might want to implement {@link +android.view.View#onInitializeAccessibilityEvent onInitializeAccessibilityEvent()} to +attach additional accessibility information to the {@link +android.view.accessibility.AccessibilityEvent}, but should also call the super implementation to +provide default information such as the standard content description, item index, and more. +However, you should not add additional text content in this callback—that happens +next.
Custom implementations of {@link android.view.View} should usually implement {@link +android.view.View#onPopulateAccessibilityEvent onPopulateAccessibilityEvent()} to add additional +text content to the {@link android.view.accessibility.AccessibilityEvent} if the {@link +android.R.attr#contentDescription android:contentDescription} text is missing or +insufficient. To add more text description to the +{@link android.view.accessibility.AccessibilityEvent}, call {@link +android.view.accessibility.AccessibilityEvent#getText()}.{@link java.util.List#add add()}.
+In order to retrieve {@link android.view.accessibility.AccessibilityNodeInfo} information, your -application must request permission to retrieve application window content through a manifest -declaration that includes a new, separate xml configuration file, which supercedes {@link -android.accessibilityservice.AccessibilityServiceInfo}. For more information, see {@link +
In addition to the new methods above, which are useful when extending the {@link +android.view.View} class, you can also intercept these event callbacks on any {@link +android.view.View} by extending {@link +android.view.View.AccessibilityDelegate AccessibilityDelegate} and setting it on the view with +{@link android.view.View#setAccessibilityDelegate setAccessibilityDelegate()}. +When you do, each accessibility method in the view defers the call to the corresponding method in +the delegate. For example, when the view receives a call to {@link +android.view.View#onPopulateAccessibilityEvent onPopulateAccessibilityEvent()}, it passes it to the +same method in the {@link android.view.View.AccessibilityDelegate}. Any methods not handled by +the delegate are given right back to the view for default behavior. This allows you to override only +the methods necessary for any given view without extending the {@link android.view.View} class.
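The deferral contract described above can be sketched in plain Java. These are NOT the real {@code android.view.View} / {@code View.AccessibilityDelegate} classes, just an illustration of the pattern: when a delegate is set, the view's accessibility callback defers to it, and the delegate can still fall back to the view's default behavior.

```java
// Plain-Java sketch of the accessibility-delegate pattern (hypothetical
// class names; not the real android.view classes).
class SketchView {
    private Delegate delegate;

    void setDelegate(Delegate d) { delegate = d; }

    // Public entry point: defers to the delegate when one is installed,
    // otherwise falls back to the view's own default behavior.
    String populateEvent() {
        return (delegate != null) ? delegate.populateEvent(this)
                                  : defaultPopulateEvent();
    }

    // Default behavior that a delegate may also call back into.
    String defaultPopulateEvent() { return "default text"; }

    interface Delegate {
        String populateEvent(SketchView host);
    }
}
```

The design pays off exactly as the paragraph above describes: you can customize one callback for a given view instance without subclassing the view itself.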
+ + +If you want to maintain compatibility with Android versions prior to 4.0, while also supporting +the new accessibility APIs, you can do so with the latest version of the v4 support +library (in Compatibility Package, r4) +using a set of utility classes that provide the new accessibility APIs in a backward-compatible +design.
+ + + +If you're developing an accessibility service, the information about various accessibility events +has been significantly expanded to enable more advanced accessibility feedback for users. In +particular, events are generated based on view composition, providing better context information and +allowing accessibility services to traverse view hierarchies to get additional view information and +deal with special cases.
+ +If you're developing an accessibility service (such as a screen reader), you can access +additional content information and traverse view hierarchies with the following procedure:
+An {@link android.view.accessibility.AccessibilityNodeInfo} represents a single node +of the window content in a format that allows you to query accessibility information about that +node. The {@link android.view.accessibility.AccessibilityNodeInfo} object returned from {@link +android.view.accessibility.AccessibilityEvent} describes the event source, whereas the source from +an {@link android.view.accessibility.AccessibilityRecord} describes the predecessor of the event +source.
In order for your application to publish itself to the system as an accessibility service, it +must declare an XML configuration file that corresponds to {@link +android.accessibilityservice.AccessibilityServiceInfo}. For more information about creating an +accessibility service, see {@link android.accessibilityservice.AccessibilityService} and {@link android.accessibilityservice.AccessibilityService#SERVICE_META_DATA -AccessibilityService.SERVICE_META_DATA}.
+SERVICE_META_DATA} for information about the XML configuration. +If you're interested in the device's accessibility state, the {@link +android.view.accessibility.AccessibilityManager} has some new APIs such as:
+Android 4.0 expands the capabilities for enterprise applications with the following features.
-The new {@link android.net.VpnService} allows applications to build their own VPN (Virtual -Private -Network), running as a {@link android.app.Service}. A VPN service creates an interface for a virtual -network with its own address and routing rules and performs all reading and writing with a file -descriptor.
+Private Network), running as a {@link android.app.Service}. A VPN service creates an interface for a +virtual network with its own address and routing rules and performs all reading and writing with a +file descriptor.To create a VPN service, use {@link android.net.VpnService.Builder}, which allows you to specify the network address, DNS server, network route, and more. When complete, you can establish the @@ -941,7 +999,7 @@ the system is granted this permission—apps cannot request it). To then use users must manually enable it in the system settings.
-Applications that manage the device restrictions can now disable the camera using {@link android.app.admin.DevicePolicyManager#setCameraDisabled setCameraDisabled()} and the {@link @@ -949,54 +1007,46 @@ android.app.admin.DeviceAdminInfo#USES_POLICY_DISABLE_CAMERA} property (applied <disable-camera />} element in the policy configuration file).
-The new {@link android.security.KeyChain} class provides APIs that allow you to import and access -certificates and key stores in credential storage. See the {@link android.security.KeyChain} +certificates in the system key store. Certificates streamline the installation of both client +certificates (to validate the identity of the user) and certificate authority certificates (to +verify server identity). Applications such as web browsers or email clients can access the installed +certificates to authenticate users to servers. See the {@link android.security.KeyChain} documentation for more information.
-A new voicemail APIs allows applications to add voicemails to the system. Because the APIs -currently -do not allow third party apps to read all the voicemails from the system, the only third-party apps -that should use the voicemail APIs are those that have voicemail to deliver to the user. For -instance, it’s possible that a users have multiple voicemail sources, such as one provided by their -phone’s service provider and others from VoIP or other alternative services. These kinds of apps can -use the APIs to add voicemail to the system. The built-in Phone application can then present all -voicemails to the user with a single list. Although the system’s Phone application is the only -application that can read all the voicemails, each application that provides voicemails can read -those that it has added to the system.
- -The {@link android.provider.VoicemailContract} class defines the content provider for the -voicemail -APIs. The subclasses {@link android.provider.VoicemailContract.Voicemails} and {@link -android.provider.VoicemailContract.Status} provide tables in which the voicemail providers can -insert voicemail data for storage on the device. For an example of a voicemail provider app, see the -Voicemail Provider -Demo.
+Two new sensor types have been added in Android 4.0:
-The new spell checker framework allows apps to create spell checkers in a manner similar to the -input method framework. To create a new spell checker, you must override the {@link -android.service.textservice.SpellCheckerService.Session} class to provide spelling suggestions based -on text provided by the interface callback methods, returning suggestions as a {@link -android.view.textservice.SuggestionsInfo} object.
+Applications with a spell checker service must declare the {@link
-android.Manifest.permission#BIND_TEXT_SERVICE} permission as required by the service, such that
-other services must have this permission in order for them to bind with the spell checker service.
-The service must also declare an intent filter with
If a device has both {@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} and {@link +android.hardware.Sensor#TYPE_RELATIVE_HUMIDITY} sensors, you can use them to calculate the dew point +and the absolute humidity.
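One common way to derive both quantities from the two sensor readings is the Magnus approximation. The constants below are one published parameter set for water vapor, not values mandated by the Android APIs; in an app, the two inputs would come from {@code SensorEvent} values for the sensor types above.

```java
/**
 * Dew point and absolute humidity from ambient temperature (degrees C) and
 * relative humidity (%), using the Magnus approximation (assumed parameter
 * set; other published constants differ slightly).
 */
class Humidity {
    private static final double A = 17.62;   // Magnus parameter (dimensionless)
    private static final double B = 243.12;  // Magnus parameter (degrees C)

    /** Dew point in degrees Celsius. */
    static double dewPoint(double tCelsius, double rhPercent) {
        double gamma = Math.log(rhPercent / 100.0) + A * tCelsius / (B + tCelsius);
        return B * gamma / (A - gamma);
    }

    /** Absolute humidity in grams of water vapor per cubic meter of air. */
    static double absoluteHumidity(double tCelsius, double rhPercent) {
        return 216.7 * (rhPercent / 100.0 * 6.112
                * Math.exp(A * tCelsius / (B + tCelsius)) / (273.15 + tCelsius));
    }
}
```

For example, at 25 degrees C and 50% relative humidity this gives a dew point near 13.9 degrees C and an absolute humidity near 11.5 g/m³.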
+ +The previous temperature sensor, {@link android.hardware.Sensor#TYPE_TEMPERATURE}, has been +deprecated. You should use the {@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} sensor +instead.
+ +Additionally, Android’s three synthetic sensors have been improved so they now have lower latency +and smoother output. These sensors include the gravity sensor ({@link +android.hardware.Sensor#TYPE_GRAVITY}), rotation vector sensor ({@link +android.hardware.Sensor#TYPE_ROTATION_VECTOR}), and linear acceleration sensor ({@link +android.hardware.Sensor#TYPE_LINEAR_ACCELERATION}). The improved sensors rely on the gyroscope +sensor to improve their output, so the sensors appear only on devices that have a gyroscope.
@@ -1004,22 +1054,20 @@ checker.Android’s text-to-speech (TTS) APIs have been greatly extended to allow applications to more -easily -implement custom TTS engines, while applications that want to use a TTS engine have a couple new -APIs for selecting the engine.
+Android’s text-to-speech (TTS) APIs have been significantly extended to allow applications to +more easily implement custom TTS engines, while applications that want to use a TTS engine have a +couple new APIs for selecting an engine.
In previous versions of Android, you could use the {@link android.speech.tts.TextToSpeech} class -to -perform text-to-speech (TTS) operations using the TTS engine provided by the system or set a custom -engine using {@link android.speech.tts.TextToSpeech#setEngineByPackageName -setEngineByPackageName()}. -In Android 4.0, the {@link android.speech.tts.TextToSpeech#setEngineByPackageName -setEngineByPackageName()} method has been deprecated and you can now specify the engine to use with -a new {@link android.speech.tts.TextToSpeech} that accepts the package name of a TTS engine.
+to perform text-to-speech (TTS) operations using the TTS engine provided by the system or set a +custom engine using {@link android.speech.tts.TextToSpeech#setEngineByPackageName +setEngineByPackageName()}. In Android 4.0, the {@link +android.speech.tts.TextToSpeech#setEngineByPackageName setEngineByPackageName()} method has been +deprecated and you can now specify the engine to use with a new {@link +android.speech.tts.TextToSpeech} constructor that accepts the package name of a TTS engine.You can also query the available TTS engines with {@link android.speech.tts.TextToSpeech#getEngines()}. This method returns a list of {@link @@ -1029,30 +1077,29 @@ icon, label, and package name.
Previously, custom engines required that the engine be built using native code, based on a TTS -engine header file. In Android 4.0, there is a framework API for building TTS engines.
+Previously, custom engines required that the engine be built using an undocumented native header +file. In Android 4.0, there is a complete set of framework APIs for building TTS engines.
The basic setup requires an implementation of {@link android.speech.tts.TextToSpeechService} that responds to the {@link android.speech.tts.TextToSpeech.Engine#INTENT_ACTION_TTS_SERVICE} intent. The primary work for a TTS engine happens during the {@link -android.speech.tts.TextToSpeechService#onSynthesizeText onSynthesizeText()} callback in the {@link -android.speech.tts.TextToSpeechService}. The system delivers this method two objects:
+android.speech.tts.TextToSpeechService#onSynthesizeText onSynthesizeText()} callback in a service +that extends {@link android.speech.tts.TextToSpeechService}. The system delivers this method two +objects:Now that the framework supports a true API for creating TTS engines, support for the previous -technique using native code has been removed. Watch for a blog post about the compatibility layer -that you can use to convert TTS engines built using the previous technique to the new framework.
+Now that the framework supports a true API for creating TTS engines, support for the native code +implementation has been removed. Look for a blog post about a compatibility layer +that you can use to convert your old TTS engines to the new framework.
For an example TTS engine using the new APIs, see the Text To Speech Engine sample app.
@@ -1062,6 +1109,27 @@ href=”{@docRoot}resources/samples/TtsEngine/index.html”>Text To Speech Engin +A new spell checker framework allows apps to create spell checkers in a manner similar to the +input method framework. To create a new spell checker, you must implement a service that extends +{@link android.service.textservice.SpellCheckerService} and extend the {@link +android.service.textservice.SpellCheckerService.Session} class to provide spelling suggestions based +on text provided by interface callback methods. In the {@link +android.service.textservice.SpellCheckerService.Session} callback methods, you must return the +spelling suggestions as {@link android.view.textservice.SuggestionsInfo} objects.
+ +Applications with a spell checker service must declare the {@link +android.Manifest.permission#BIND_TEXT_SERVICE} permission as required by the service, such that +other services must have this permission in order for them to bind with the spell checker service. +The service must also declare an intent filter with {@code <action +android:name="android.service.textservice.SpellCheckerService" />} as the intent’s action and should +include a {@code <meta-data>} element that declares configuration information for the spell +checker.
+ + + + @@ -1071,34 +1139,36 @@ href=”{@docRoot}resources/samples/TtsEngine/index.html”>Text To Speech EnginThe {@link android.app.ActionBar} has been updated to support several new behaviors. Most importantly, the system gracefully manages the action bar’s size and configuration when running on -smaller screens in order to provide an optimal user experience. For example, when the screen is -narrow (such as when a handset is in portrait orientation), the action bar’s navigation tabs appear -in a “stacked bar,” which appears directly below the main action bar. You can also opt-in to a -“split action bar,” which will place all action items in a separate bar at the bottom of the screen -when the screen is narrow.
+smaller screens in order to provide an optimal user experience on all screen sizes. For example, +when the screen is narrow (such as when a handset is in portrait orientation), the action bar’s +navigation tabs appear in a “stacked bar,” which appears directly below the main action bar. You can +also opt-in to a “split action bar,” which places all action items in a separate bar at the bottom +of the screen when the screen is narrow. -If your action bar includes several action items, not all of them will fit into the action bar -when on a narrow screen, so the system will place them into the overflow menu. However, Android 4.0 +
If your action bar includes several action items, not all of them will fit into the action bar on +a narrow screen, so the system will place more of them into the overflow menu. However, Android 4.0 allows you to enable “split action bar” so that more action items can appear on the screen in a separate bar at the bottom of the screen. To enable split action bar, add {@link android.R.attr#uiOptions android:uiOptions} with {@code ”splitActionBarWhenNarrow”} to either your -{@code <application>} tag or individual {@code <activity>} tags in your manifest file. -When enabled, the system will enable the additional bar for action items when the screen is narrow -and add all action items to the new bar (no action items will appear in the primary action bar).
+{@code <application>} tag or +individual {@code <activity>} tags +in your manifest file. When enabled, the system will add an additional bar at the bottom of the +screen for all action items when the screen is narrow (no action items will appear in the primary +action bar).If you want to use the navigation tabs provided by the {@link android.app.ActionBar.Tab} APIs, -but -don’t want the stacked bar—you want only the tabs to appear, then enable the split action bar -as described above and also call {@link android.app.ActionBar#setDisplayShowHomeEnabled -setDisplayShowHomeEnabled(false)} to disable the application icon in the action bar. With nothing -left in the main action bar, it disappears—all that’s left are the navigation tabs at the top -and the action items at the bottom of the screen.
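In the manifest, that opt-in is a single attribute. The activity name here is hypothetical; the attribute value is the one named above:

```xml
<!-- Enable the split action bar for one activity (name is illustrative). -->
<activity
    android:name=".MainActivity"
    android:uiOptions="splitActionBarWhenNarrow" />
```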
+but don’t need the main action bar on top (you want only the tabs to appear at the top), then enable +the split action bar as described above and also call {@link +android.app.ActionBar#setDisplayShowHomeEnabled setDisplayShowHomeEnabled(false)} to disable the +application icon in the action bar. With nothing left in the main action bar, it +disappears—all that’s left are the navigation tabs at the top and the action items at the +bottom of the screen. -If you want to apply custom styling to the action bar, you can use new style properties {@link android.R.attr#backgroundStacked} and {@link android.R.attr#backgroundSplit} to apply a background @@ -1108,31 +1178,38 @@ setStackedBackgroundDrawable()} and {@link android.app.ActionBar#setSplitBackgro setSplitBackgroundDrawable()}.
-The new {@link android.view.ActionProvider} class facilitates user actions to which several -different applications may respond. For example, a “share” action in your application might invoke -several different apps that can handle the {@link android.content.Intent#ACTION_SEND} intent and the -associated data. In this case, you can use the {@link android.widget.ShareActionProvider} (an -extension of {@link android.view.ActionProvider}) in your action bar, instead of a traditional menu -item that invokes the intent. The {@link android.widget.ShareActionProvider} populates a drop-down -menu with all the available apps that can handle the intent.
+The new {@link android.view.ActionProvider} class allows you to create a specialized handler for +action items. An action provider can define an action view, a default action behavior, and a submenu +for each action item to which it is associated. When you want to create an action item that has +dynamic behaviors (such as a variable action view, default action, or submenu), extending {@link +android.view.ActionProvider} is a good solution in order to create a reusable component, rather than +handling the various action item transformations in your fragment or activity.
+ +For example, the {@link android.widget.ShareActionProvider} is an extension of {@link +android.view.ActionProvider} that facilitates a “share” action from the action bar. Instead of using +traditional action item that invokes the {@link android.content.Intent#ACTION_SEND} intent, you can +use this action provider to present an action view with a drop-down list of applications that handle +the {@link android.content.Intent#ACTION_SEND} intent. When the user selects an application to use +for the action, {@link android.widget.ShareActionProvider} remembers that selection and provides it +in the action view for faster access to sharing with that app.
To declare an action provider for an action item, include the {@code android:actionProviderClass} -attribute in the {@code <item>} element for your activity’s options menu, with the class name -of the action provider as the attribute value. For example:
+attribute in the {@code +<item>} element for your activity’s options menu, with the class name of the action +provider as the value. For example:
<item android:id="@+id/menu_share"
android:title="Share"
- android:icon="@drawable/ic_share"
android:showAsAction="ifRoom"
android:actionProviderClass="android.widget.ShareActionProvider" />
In your activity’s {@link android.app.Activity#onCreateOptionsMenu onCreateOptionsMenu()} -callback -method, retrieve an instance of the action provider from the menu item and set the intent:
+callback method, retrieve an instance of the action provider from the menu item and set the +intent:
public boolean onCreateOptionsMenu(Menu menu) {
@@ -1151,17 +1228,18 @@ href=”{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/app/Ac
class in ApiDemos.
-Collapsible Action Views
+Collapsible action views
-Menu items that appear as action items can now toggle between their action view state and
+
Action items that provide an action view can now toggle between their action view state and
traditional action item state. Previously only the {@link android.widget.SearchView} supported
collapsing when used as an action view, but now you can add an action view for any action item and
switch between the expanded state (action view is visible) and collapsed state (action item is
visible).
To declare that an action item that contains an action view be collapsible, include the {@code
-“collapseActionView”} flag in the {@code android:showAsAction} attribute for the {@code
-<item>} element in the menu’s XML file.
+“collapseActionView”} flag in the {@code android:showAsAction} attribute for the {@code
+<item>} element in the menu’s XML file.
To receive callbacks when an action view switches between expanded and collapsed, register an
instance of {@link android.view.MenuItem.OnActionExpandListener} with the respective {@link
@@ -1178,20 +1256,20 @@ android.view.CollapsibleActionView} interface to receive callbacks when the view
collapsed.
-Other APIs for Action Bar
+Other APIs for action bar
Android 4.0 introduces a variety of new views and other UI components.
+Since the early days of Android, the system has managed a UI component known as the status @@ -1219,8 +1298,8 @@ Back, and so forth) and also an interface for elements traditionally provided by Android 4.0, the system provides a new type of system UI called the navigation bar. The navigation bar shares some qualities with the system bar, because it provides navigation controls for devices that don’t have hardware counterparts for navigating the system, but the navigation -controls is all that it provides (a device with the navigation bar, thus, also includes the status -bar at the top of the screen).
+controls are all that the navigation bar offers (a device with the navigation bar, thus, also
+includes the status bar at the top of the screen).
To this day, you can hide the status bar on handsets using the {@link
android.view.WindowManager.LayoutParams#FLAG_FULLSCREEN} flag. In Android 4.0, the APIs that control
@@ -1228,32 +1307,31 @@ the system bar’s visibility have been updated to better reflect the behavior o
and navigation bar:
-You can set each of these flags for the system bar by calling {@link
-android.view.View#setSystemUiVisibility setSystemUiVisibility()} on any view in your activity
-window. The window manager will combine (OR-together) all flags from all views in your window and
+You can set each of these flags for the system bar and navigation bar by calling {@link
+android.view.View#setSystemUiVisibility setSystemUiVisibility()} on any view in your activity. The
+window manager will combine (OR-together) all flags from all views in your window and
apply them to the system UI as long as your window has input focus. When your window loses input
focus (the user navigates away from your app, or a dialog appears), your flags cease to have
effect. Similarly, if you remove those views from the view hierarchy their flags no longer apply.
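A minimal sketch of this call pattern, inside an activity (the flag constants below are the API level 14 names; using the decor view is just one convenient choice of view):

```java
// Dim the system UI while the user watches content.
// SYSTEM_UI_FLAG_LOW_PROFILE was added in API level 14.
View decorView = getWindow().getDecorView();
decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);

// Later, restore the default visibility:
decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_VISIBLE);
```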
To synchronize other events in your activity with visibility changes to the system UI (for
-example,
-hide the action bar or other UI controls when the system UI hides), you can register a {@link
-android.view.View.OnSystemUiVisibilityChangeListener} to get a callback when the visibility
-changes.
+example, hide the action bar or other UI controls when the system UI hides), you should register a
+{@link android.view.View.OnSystemUiVisibilityChangeListener} to be notified when the visibility
+of the system bar or navigation bar changes.
See the
@@ -1263,8 +1341,7 @@ OverscanActivity class for a demonstration of different system UI options.
{@link android.widget.GridLayout} is a new view group that places child views in a rectangular
-grid.
-Unlike {@link android.widget.TableLayout}, {@link android.widget.GridLayout} relies on a flat
+grid. Unlike {@link android.widget.TableLayout}, {@link android.widget.GridLayout} relies on a flat
hierarchy and does not make use of intermediate views such as table rows for providing structure.
Instead, children specify which row(s) and column(s) they should occupy (cells can span multiple
rows and/or columns), and by default are laid out sequentially across the grid’s rows and columns.
@@ -1282,11 +1359,10 @@ for samples using {@link android.widget.GridLayout}.
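For illustration, a {@code GridLayout} can also be populated programmatically; this sketch (with a placeholder {@code context}) shows spec-based cell placement, including a row span:

```java
// Build a 2x2 grid; GridLayout.spec() defines where a child sits.
GridLayout grid = new GridLayout(context); // "context" is a placeholder Activity/Context
grid.setRowCount(2);
grid.setColumnCount(2);

Button button = new Button(context);
GridLayout.LayoutParams params = new GridLayout.LayoutParams(
        GridLayout.spec(0, 2),   // start at row 0, spanning 2 rows
        GridLayout.spec(0));     // column 0
grid.addView(button, params);
```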
{@link android.view.TextureView} is a new view that allows you to display a content stream, such
-as
-a video or an OpenGL scene. Although similar to {@link android.view.SurfaceView}, {@link
+as a video or an OpenGL scene. Although similar to {@link android.view.SurfaceView}, {@link
android.view.TextureView} is unique in that it behaves like a regular view, rather than creating a
separate window, so you can treat it like any other {@link android.view.View} object. For example,
-you can apply transforms, animate it using {@link android.view.ViewPropertyAnimator}, or easily
+you can apply transforms, animate it using {@link android.view.ViewPropertyAnimator}, or
Beware that {@link android.view.TextureView} works only within a hardware accelerated window.
@@ -1294,16 +1370,14 @@ adjust its opacity with {@link android.view.View#setAlpha setAlpha()}.
For more information, see the {@link android.view.TextureView} documentation.
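A brief sketch of typical {@code TextureView} setup (the {@code context} reference is a placeholder; the camera call in the comment is one common use):

```java
// A TextureView works only inside a hardware accelerated window.
TextureView textureView = new TextureView(context);
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        // Start streaming content here, e.g. camera.setPreviewTexture(surface).
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return true; // Stop rendering; returning true releases the surface.
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) { }
});

// Because it behaves like a regular view, view properties apply directly:
textureView.setAlpha(0.5f);
```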
The new {@link android.widget.Switch} widget is a two-state toggle that users can drag to one
-side
-or the other (or simply tap) to toggle an option between two states.
+side or the other (or simply tap) to toggle an option between two states.
-You can declare a switch in your layout with the {@code <Switch>} element. You can use the
-{@code android:textOn} and {@code android:textOff} attributes to specify the text to appear on the
-switch when in the on and off setting. The {@code android:text} attribute also allows you to place a
-label alongside the switch.
+You can use the {@code android:textOn} and {@code android:textOff} attributes to specify the text
+to appear on the switch when in the on and off setting. The {@code android:text} attribute also
+allows you to place a label alongside the switch.
For a sample using switches, see the switches.xml layout file
@@ -1312,12 +1386,11 @@ href=”{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/view/S
activity.
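As a sketch, the same attributes have programmatic equivalents; this hypothetical "Wi-Fi" toggle (with a placeholder {@code context}) is illustrative, not taken from the sample above:

```java
// Create a Switch in code; the XML equivalents are android:textOn,
// android:textOff, and android:text.
Switch wifiSwitch = new Switch(context); // "context" is a placeholder Activity/Context
wifiSwitch.setTextOn("ON");
wifiSwitch.setTextOff("OFF");
wifiSwitch.setText("Wi-Fi");
wifiSwitch.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
        // React to the toggle here.
    }
});
```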
Android 3.0 introduced {@link android.widget.PopupMenu} to create short contextual menus that pop
-up
-at an anchor point you specify (usually at the point of the item selected). Android 4.0 extends the
-{@link android.widget.PopupMenu} with a couple useful features:
+up at an anchor point you specify (usually at the point of the item selected). Android 4.0 extends
+the {@link android.widget.PopupMenu} with a couple useful features:
A new {@link android.preference.TwoStatePreference} abstract class serves as the basis for
@@ -1337,10 +1411,10 @@ preference screen or dialog. For example, the Settings application uses a {@link
android.preference.SwitchPreference} for the Wi-Fi and Bluetooth settings.
The {@link android.view.View} class now supports “hover” events to enable richer interactions
-through the use of pointer devices (such as a mouse or other device that drives an on-screen
+through the use of pointer devices (such as a mouse or other devices that drive an on-screen
cursor).
To receive hover events on a view, implement the {@link android.view.View.OnHoverListener} and
@@ -1360,8 +1434,7 @@ android.view.View.OnHoverListener#onHover onHover()} if it handles the hover eve
listener returns false, then the hover event will be dispatched to the parent view as usual.
If your application uses buttons or other widgets that change their appearance based on the
-current
-state, you can now use the {@code android:state_hovered} attribute in a state list drawable to
+current state, you can now use the {@code android:state_hovered} attribute in a state list drawable to
provide a different background drawable when a cursor hovers over the view.
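A minimal sketch of the hover listener pattern (the {@code view} reference is a placeholder for any view in your hierarchy):

```java
view.setOnHoverListener(new View.OnHoverListener() {
    @Override
    public boolean onHover(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_HOVER_ENTER:
                // The cursor entered the view's bounds.
                break;
            case MotionEvent.ACTION_HOVER_MOVE:
                // The cursor is moving within the view.
                break;
            case MotionEvent.ACTION_HOVER_EXIT:
                // The cursor left the view's bounds.
                break;
        }
        return false; // Let the event also dispatch to the parent view as usual.
    }
});
```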
@@ -1370,11 +1443,10 @@ href=”{@docRoot}samples/ApiDemos/src/com/example/android/apis/view/Hover.html
ApiDemos.
Android now provides APIs for receiving input from a stylus input device such as a digitizer
-tablet
-peripheral or a stylus-enabled touch screen.
+tablet peripheral or a stylus-enabled touch screen.
Stylus input operates in a similar manner to touch or mouse input. When the stylus is in contact
with the digitizer, applications receive touch events just like they would when a finger is used to
@@ -1393,18 +1465,16 @@ can choose to handle stylus input in different ways from finger or mouse input.<
Your application can also query which mouse or stylus buttons are pressed by querying the “button
state” of a {@link android.view.MotionEvent} using {@link
android.view.MotionEvent#getButtonState getButtonState()}. The currently defined button states are: {@link
-android.view.MotionEvent#BUTTON_PRIMARY}, {@link
-android.view.MotionEvent#BUTTON_SECONDARY}, {@link
-android.view.MotionEvent#BUTTON_TERTIARY}, {@link android.view.MotionEvent#BUTTON_BACK},
-and {@link android.view.MotionEvent#BUTTON_FORWARD}.
-For convenience, the back and forward mouse buttons are automatically mapped to the {@link
-android.view.KeyEvent#KEYCODE_BACK} and {@link android.view.KeyEvent#KEYCODE_FORWARD} keys. Your
-application can handle these keys to support mouse button based back and forward navigation.
+android.view.MotionEvent#BUTTON_PRIMARY}, {@link android.view.MotionEvent#BUTTON_SECONDARY}, {@link
+android.view.MotionEvent#BUTTON_TERTIARY}, {@link android.view.MotionEvent#BUTTON_BACK}, and {@link
+android.view.MotionEvent#BUTTON_FORWARD}. For convenience, the back and forward mouse buttons are
+automatically mapped to the {@link android.view.KeyEvent#KEYCODE_BACK} and {@link
+android.view.KeyEvent#KEYCODE_FORWARD} keys. Your application can handle these keys to support
+mouse button based back and forward navigation.
In addition to precisely measuring the position and pressure of a contact, some stylus input
-devices
-also report the distance between the stylus tip and the digitizer, the stylus tilt angle, and the
-stylus orientation angle. Your application can query this information using {@link
+devices also report the distance between the stylus tip and the digitizer, the stylus tilt angle,
+and the stylus orientation angle. Your application can query this information using {@link
android.view.MotionEvent#getAxisValue getAxisValue()} with the axis codes {@link
android.view.MotionEvent#AXIS_DISTANCE}, {@link android.view.MotionEvent#AXIS_TILT}, and {@link
android.view.MotionEvent#AXIS_ORIENTATION}.
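Putting the button-state and axis queries together, a sketch of stylus handling inside a custom view's touch handler (the brush-related comment is a hypothetical use, not from the platform):

```java
@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS) {
        // Distinguish a normal tip press from the stylus side button.
        boolean sideButtonPressed =
                (event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0;

        // Extra axes reported by some stylus devices:
        float distance = event.getAxisValue(MotionEvent.AXIS_DISTANCE);
        float tilt = event.getAxisValue(MotionEvent.AXIS_TILT);
        float orientation = event.getAxisValue(MotionEvent.AXIS_ORIENTATION);
        // Use the values here, e.g. vary a brush shape by tilt and orientation.
    }
    return super.onTouchEvent(event);
}
```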
@@ -1479,29 +1549,31 @@ href="{@docRoot}guide/topics/manifest/application-element.html">{@code <appli
element. You can alternatively disable hardware acceleration for individual views by calling
{@link android.view.View#setLayerType setLayerType(LAYER_TYPE_SOFTWARE)}.
+For more information about hardware acceleration, including a list of unsupported drawing
+operations, see the Hardware
+Acceleration document.
+
-In previous versions of Android, JNI local references weren’t indirect handles; we used direct
-pointers. This didn’t seem like a problem as long as we didn’t have a garbage collector that moves
-objects, but it was because it meant that it was possible to write buggy code that still seemed to
-work. In Android 4.0, we’ve moved to using indirect references so we can detect these bugs before we
-need third-party native code to be correct.
+In previous versions of Android, JNI local references weren't indirect handles; Android used
+direct pointers. This wasn't a problem as long as the garbage collector didn't move objects, but
+it made it possible to write buggy code that still seemed to work. In Android 4.0, the system now
+uses indirect references in order to detect these bugs.
-The ins and outs of JNI local references are described in “Local and Global References” in
-JNI Tips. In Android 4.0, CheckJNI
-has been
-enhanced to detect these errors. Watch the Android
-Developers Blog for an upcoming post about common errors with JNI references and how you can fix
-them.
+The ins and outs of JNI local references are described in "Local and Global References" in JNI Tips. In Android 4.0,
+CheckJNI has been enhanced to detect these errors. Watch the Android Developers Blog for an upcoming post
+about common errors with JNI references and how you can fix them.
This change in the JNI implementation only affects apps that target Android 4.0 by setting either
-the {@code targetSdkVersion} or
-{@code minSdkVersion} to
-{@code “14”} or higher. If you’ve set these attributes to any lower
-value, then JNI local references will behave the same as in previous versions.
+the {@code
+targetSdkVersion} or {@code
+minSdkVersion} to {@code "14"} or higher. If you've set these attributes to any lower value,
+then JNI local references behave the same as in previous versions.
@@ -1569,8 +1641,115 @@ Wi-Fi for peer-to-peer communications.
+
+In addition to everything above, Android 4.0 naturally supports all APIs from previous releases.
+Because the Android 3.x (Honeycomb) platform is available only for large-screen devices, if you've
+been developing primarily for handsets, then you might not be aware of all the APIs added to Android
+in these recent releases.
+Here's a look at some of the most notable APIs you might have missed that are now available
+on handsets as well:
+ +<application>
+element or for individual <activity>
+elements. This results
+in smoother animations, smoother scrolling, and overall better performance and response to user
+interaction.
+Note: If you set your application's {@code minSdkVersion} or {@code targetSdkVersion} to
+{@code "14"} or higher, hardware acceleration is enabled by default.
-For a detailed view of all API changes in Android {@sdkPlatformVersion} (API
-Level
+For a detailed view of all API changes in Android {@sdkPlatformVersion} (API Level
{@sdkPlatformApiLevel}), see the API
-Differences Report.
-
-
+href="{@docRoot}sdk/api_diff/{@sdkPlatformApiLevel}/changes.html">API Differences Report.
The Android {@sdkPlatformVersion} platform delivers an updated version of the framework API. The
-Android {@sdkPlatformVersion} API is assigned an integer identifier —
-{@sdkPlatformApiLevel} — that is stored in the system itself. This
-identifier, called the "API Level", allows the system to correctly determine whether an application
-is compatible with the system, prior to installing the application.
+The Android {@sdkPlatformVersion} API is assigned an integer
+identifier—{@sdkPlatformApiLevel}—that is stored in the system itself.
+This identifier, called the "API level", allows the system to correctly determine whether an
+application is compatible with the system, prior to installing the application.
To use APIs introduced in Android {@sdkPlatformVersion} in your application, you need to compile the
-application against the Android library that is provided in the Android {@sdkPlatformVersion} SDK
-platform. Depending on your needs, you might also need to add an
+application against an Android platform that supports API level {@sdkPlatformApiLevel} or
+higher. Depending on your needs, you might also need to add an
android:minSdkVersion="{@sdkPlatformApiLevel}" attribute to the
-<uses-sdk> element in the application's manifest.
For more information about how to use API Level, see the API Levels document.
+For more information, see the API Levels
+document.
The system image included in the downloadable SDK platform provides a variety
-of
-built-in locales. In some cases, region-specific strings are available for the
-locales. In other cases, a default version of the language is used. The
-languages that are available in the Android 3.0 system
-image are listed below (with language_country/region locale
-descriptor).
+The system image included in the downloadable SDK platform provides a variety of built-in
+locales. In some cases, region-specific strings are available for the locales. In other cases, a
+default version of the language is used. The languages that are available in the Android
+{@sdkPlatformVersion} system image are listed below (with language_country/region locale
+descriptor).