From 99b70f3f5d051261229d1792c169a374fc23326b Mon Sep 17 00:00:00 2001
From: Joe Fernandez
Date: Mon, 22 Aug 2011 15:49:52 -0700
Subject: [PATCH] DO NOT MERGE cherrypick from master

Change-Id: I63bc055991405c56e9dcc83a54b106b870cf6b29
Change-Id: Ia5150288ce6fe57460159dd7555c6786023b1d9e
---
 docs/html/guide/appendix/media-formats.jd     |    2 +-
 docs/html/guide/guide_toc.cs                  |   23 +-
 docs/html/guide/topics/media/audio-capture.jd |  253 ++++
 docs/html/guide/topics/media/camera.jd        | 1055 +++++++++++++++++
 docs/html/guide/topics/media/index.jd         |  987 +--------------
 docs/html/guide/topics/media/jetplayer.jd     |   70 ++
 docs/html/guide/topics/media/mediaplayer.jd   |  747 ++++++++++++
 7 files changed, 2185 insertions(+), 952 deletions(-)
 create mode 100644 docs/html/guide/topics/media/audio-capture.jd
 create mode 100644 docs/html/guide/topics/media/camera.jd
 create mode 100644 docs/html/guide/topics/media/jetplayer.jd
 create mode 100644 docs/html/guide/topics/media/mediaplayer.jd

diff --git a/docs/html/guide/appendix/media-formats.jd b/docs/html/guide/appendix/media-formats.jd
index ccc63a214a647..137f13811d11c 100644
--- a/docs/html/guide/appendix/media-formats.jd
+++ b/docs/html/guide/appendix/media-formats.jd
@@ -14,7 +14,7 @@ page.title=Android Supported Media Formats

See also

-  1. Audio and Video
+  1. Multimedia and Camera

Key classes

diff --git a/docs/html/guide/guide_toc.cs b/docs/html/guide/guide_toc.cs
index 18d9a48e06c0b..2f50ce7ccb080 100644
--- a/docs/html/guide/guide_toc.cs
+++ b/docs/html/guide/guide_toc.cs
@@ -268,9 +268,26 @@
-  • Media
+  • Multimedia and Camera updated
    + +
  • Copy and Paste

diff --git a/docs/html/guide/topics/media/audio-capture.jd b/docs/html/guide/topics/media/audio-capture.jd
new file mode 100644
index 0000000000000..75d294b570921
--- /dev/null
+++ b/docs/html/guide/topics/media/audio-capture.jd
@@ -0,0 +1,253 @@
+page.title=Audio Capture
+parent.title=Multimedia and Camera
+parent.link=index.html
+@jd:body
    +
    + +

In this document
  1. Performing Audio Capture
     1. Code Example

Key classes
  1. {@link android.media.MediaRecorder}

See also
  1. Android Supported Media Formats
  2. Data Storage
  3. MediaPlayer

    The Android multimedia framework includes support for capturing and encoding a variety of common +audio formats, so that you can easily integrate audio into your applications. You can record audio +using the {@link android.media.MediaRecorder} APIs if supported by the device hardware.

    + +

This document shows you how to write an application that captures audio from a device +microphone, saves the audio, and plays it back.

    + +

Note: The Android Emulator does not have the ability to capture +audio, but actual devices are likely to provide this capability.

    + +

    Performing Audio Capture

    + +

    Audio capture from the device is a bit more complicated than audio and video playback, but still +fairly simple:

    +
      +
1. Create a new instance of {@link android.media.MediaRecorder android.media.MediaRecorder}.
2. Set the audio source using {@link android.media.MediaRecorder#setAudioSource MediaRecorder.setAudioSource()}. You will probably want to use MediaRecorder.AudioSource.MIC.
3. Set the output file format using {@link android.media.MediaRecorder#setOutputFormat MediaRecorder.setOutputFormat()}.
4. Set the output file name using {@link android.media.MediaRecorder#setOutputFile MediaRecorder.setOutputFile()}.
5. Set the audio encoder using {@link android.media.MediaRecorder#setAudioEncoder MediaRecorder.setAudioEncoder()}.
6. Call {@link android.media.MediaRecorder#prepare MediaRecorder.prepare()} on the MediaRecorder instance.
7. To start audio capture, call {@link android.media.MediaRecorder#start MediaRecorder.start()}.
8. To stop audio capture, call {@link android.media.MediaRecorder#stop MediaRecorder.stop()}.
9. When you are done with the MediaRecorder instance, call {@link android.media.MediaRecorder#release MediaRecorder.release()} on it. Calling {@link android.media.MediaRecorder#release MediaRecorder.release()} is always recommended to free the resource immediately.
    17. +
    + +

    Example: Record audio and play the recorded audio

    +

The example class below illustrates how to set up, start, and stop audio capture, and how to play back the +recorded audio file.

    +
    +/*
    + * The application needs to have the permission to write to external storage
    + * if the output file is written to the external storage, and also the
    + * permission to record audio. These permissions must be set in the
    + * application's AndroidManifest.xml file, with something like:
    + *
    + * <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    + * <uses-permission android:name="android.permission.RECORD_AUDIO" />
    + *
    + */
    +package com.android.audiorecordtest;
    +
    +import android.app.Activity;
    +import android.widget.LinearLayout;
    +import android.os.Bundle;
    +import android.os.Environment;
    +import android.view.ViewGroup;
    +import android.widget.Button;
    +import android.view.View;
    +import android.view.View.OnClickListener;
    +import android.content.Context;
    +import android.util.Log;
    +import android.media.MediaRecorder;
    +import android.media.MediaPlayer;
    +
    +import java.io.IOException;
    +
    +
    +public class AudioRecordTest extends Activity
    +{
    +    private static final String LOG_TAG = "AudioRecordTest";
    +    private static String mFileName = null;
    +
    +    private RecordButton mRecordButton = null;
    +    private MediaRecorder mRecorder = null;
    +
    +    private PlayButton   mPlayButton = null;
    +    private MediaPlayer   mPlayer = null;
    +
    +    private void onRecord(boolean start) {
    +        if (start) {
    +            startRecording();
    +        } else {
    +            stopRecording();
    +        }
    +    }
    +
    +    private void onPlay(boolean start) {
    +        if (start) {
    +            startPlaying();
    +        } else {
    +            stopPlaying();
    +        }
    +    }
    +
    +    private void startPlaying() {
    +        mPlayer = new MediaPlayer();
    +        try {
    +            mPlayer.setDataSource(mFileName);
    +            mPlayer.prepare();
    +            mPlayer.start();
    +        } catch (IOException e) {
    +            Log.e(LOG_TAG, "prepare() failed");
    +        }
    +    }
    +
    +    private void stopPlaying() {
    +        mPlayer.release();
    +        mPlayer = null;
    +    }
    +
    +    private void startRecording() {
    +        mRecorder = new MediaRecorder();
    +        mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    +        mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    +        mRecorder.setOutputFile(mFileName);
    +        mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    +
    +        try {
    +            mRecorder.prepare();
    +        } catch (IOException e) {
    +            Log.e(LOG_TAG, "prepare() failed");
    +        }
    +
    +        mRecorder.start();
    +    }
    +
    +    private void stopRecording() {
    +        mRecorder.stop();
    +        mRecorder.release();
    +        mRecorder = null;
    +    }
    +
    +    class RecordButton extends Button {
    +        boolean mStartRecording = true;
    +
    +        OnClickListener clicker = new OnClickListener() {
    +            public void onClick(View v) {
    +                onRecord(mStartRecording);
    +                if (mStartRecording) {
    +                    setText("Stop recording");
    +                } else {
    +                    setText("Start recording");
    +                }
    +                mStartRecording = !mStartRecording;
    +            }
    +        };
    +
    +        public RecordButton(Context ctx) {
    +            super(ctx);
    +            setText("Start recording");
    +            setOnClickListener(clicker);
    +        }
    +    }
    +
    +    class PlayButton extends Button {
    +        boolean mStartPlaying = true;
    +
    +        OnClickListener clicker = new OnClickListener() {
    +            public void onClick(View v) {
    +                onPlay(mStartPlaying);
    +                if (mStartPlaying) {
    +                    setText("Stop playing");
    +                } else {
    +                    setText("Start playing");
    +                }
    +                mStartPlaying = !mStartPlaying;
    +            }
    +        };
    +
    +        public PlayButton(Context ctx) {
    +            super(ctx);
    +            setText("Start playing");
    +            setOnClickListener(clicker);
    +        }
    +    }
    +
    +    public AudioRecordTest() {
    +        mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
    +        mFileName += "/audiorecordtest.3gp";
    +    }
    +
    +    @Override
    +    public void onCreate(Bundle icicle) {
    +        super.onCreate(icicle);
    +
    +        LinearLayout ll = new LinearLayout(this);
    +        mRecordButton = new RecordButton(this);
    +        ll.addView(mRecordButton,
    +            new LinearLayout.LayoutParams(
    +                ViewGroup.LayoutParams.WRAP_CONTENT,
    +                ViewGroup.LayoutParams.WRAP_CONTENT,
    +                0));
    +        mPlayButton = new PlayButton(this);
    +        ll.addView(mPlayButton,
    +            new LinearLayout.LayoutParams(
    +                ViewGroup.LayoutParams.WRAP_CONTENT,
    +                ViewGroup.LayoutParams.WRAP_CONTENT,
    +                0));
    +        setContentView(ll);
    +    }
    +
    +    @Override
    +    public void onPause() {
    +        super.onPause();
    +        if (mRecorder != null) {
    +            mRecorder.release();
    +            mRecorder = null;
    +        }
    +
    +        if (mPlayer != null) {
    +            mPlayer.release();
    +            mPlayer = null;
    +        }
    +    }
    +}
    +
\ No newline at end of file

diff --git a/docs/html/guide/topics/media/camera.jd b/docs/html/guide/topics/media/camera.jd
new file mode 100644
index 0000000000000..877bded96393f
--- /dev/null
+++ b/docs/html/guide/topics/media/camera.jd
@@ -0,0 +1,1055 @@
+page.title=Camera
+parent.title=Multimedia and Camera
+parent.link=index.html
+@jd:body
    +
    +

In this document
  1. Considerations
  2. The Basics
  3. Manifest Declarations
  4. Using Existing Camera Apps
     1. Image capture intent
     2. Video capture intent
     3. Receiving camera intent result
  5. Building a Camera App
     1. Detecting camera hardware
     2. Accessing cameras
     3. Checking camera features
     4. Creating a preview class
     5. Placing preview in a layout
     6. Capturing pictures
     7. Capturing videos
     8. Releasing the camera
  6. Saving Media Files

Key Classes
  1. {@link android.hardware.Camera}
  2. {@link android.view.SurfaceView}
  3. {@link android.media.MediaRecorder}
  4. {@link android.content.Intent}

See also
  1. Camera
  2. MediaRecorder
  3. Data Storage

    The Android framework includes support for various cameras and camera features available on +devices, allowing you to capture pictures and videos in your applications. This document discusses a +quick, simple approach to image and video capture and outlines an advanced approach for creating +custom camera experiences for your users.

    + +

    Considerations

    +

    Before enabling your application to use cameras on Android devices, you should consider a few +questions about how your app intends to use this hardware feature.

    + + + + + +

    The Basics

    +

    The Android framework supports capturing images and video through the +{@link android.hardware.Camera} API or camera {@link android.content.Intent}. Here are the relevant +classes:

    + +
    +
    {@link android.hardware.Camera}
    +
This class is the primary API for controlling device cameras. It is used to take +pictures or videos when you are building a camera application.
    + +
    {@link android.view.SurfaceView}
    +
    This class is used to present a live camera preview to the user.
    + +
    {@link android.media.MediaRecorder}
    +
    This class is used to record video from the camera.
    + +
    {@link android.content.Intent}
    +
    An intent action type of {@link android.provider.MediaStore#ACTION_IMAGE_CAPTURE +MediaStore.ACTION_IMAGE_CAPTURE} or {@link android.provider.MediaStore#ACTION_VIDEO_CAPTURE +MediaStore.ACTION_VIDEO_CAPTURE} can be used to capture images or videos without directly +using the {@link android.hardware.Camera} object.
    +
    + + +

    Manifest Declarations

    +

    Before starting development on your application with the Camera API, you should make sure +your manifest has the appropriate declarations to allow use of camera hardware and other +related features.

    + + + + +
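For example, an application that captures photos and video with a custom camera interface typically needs declarations along these lines; this is a sketch, and the exact permissions and features depend on what your app actually does:

```xml
<!-- Permission to access camera hardware -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Needed only if your app also records audio with video -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Needed only if your app saves captured media to external storage -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Declare camera use so application markets can filter out devices without one -->
<uses-feature android:name="android.hardware.camera" />
```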

    Using Existing Camera Apps

    +

    A quick way to enable taking pictures or videos in your application without a lot of extra code +is to use an {@link android.content.Intent} to invoke an existing Android camera application. A +camera intent makes a request to capture a picture or video clip through an existing camera app and +then returns control back to your application. This section shows you how to capture an image or +video using this technique.

    + +

    The procedure for invoking a camera intent follows these general steps:

    + +
      +
1. Compose a Camera Intent - Create an {@link android.content.Intent} that requests an image or video, using one of these intent types:
   • {@link android.provider.MediaStore#ACTION_IMAGE_CAPTURE MediaStore.ACTION_IMAGE_CAPTURE} - Intent action type for requesting an image from an existing camera application.
   • {@link android.provider.MediaStore#ACTION_VIDEO_CAPTURE MediaStore.ACTION_VIDEO_CAPTURE} - Intent action type for requesting a video from an existing camera application.
2. Start the Camera Intent - Use the {@link android.app.Activity#startActivityForResult(android.content.Intent, int) startActivityForResult()} method to execute the camera intent. After you start the intent, the Camera application user interface appears on the device screen and the user can take a picture or video.
3. Receive the Intent Result - Set up an {@link android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} method in your application to receive the callback and data from the camera intent. When the user finishes taking a picture or video (or cancels the operation), the system calls this method.
    + + +

    Image capture intent

    +

Capturing images using a camera intent is a quick way to enable your application to take pictures +with minimal coding. An image capture intent can include the following extra information:

    + + + +

The following example demonstrates how to construct an image capture intent and execute it. +The {@code getOutputMediaFileUri()} method in this example refers to the sample code shown in Saving Media Files.

    + +
    +private static final int CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE = 100;
    +private Uri fileUri;
    +
    +@Override
    +public void onCreate(Bundle savedInstanceState) {
    +    super.onCreate(savedInstanceState);
    +    setContentView(R.layout.main);
    +
    +    // create Intent to take a picture and return control to the calling application
    +    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    +
    +    fileUri = getOutputMediaFileUri(MEDIA_TYPE_IMAGE); // create a file to save the image
    +    intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri); // set the image file name
    +
    +    // start the image capture Intent
    +    startActivityForResult(intent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);
    +}
    +
    + +

    When the {@link android.app.Activity#startActivityForResult(android.content.Intent, int) +startActivityForResult()} method is executed, users see a camera application interface. +After the user finishes taking a picture (or cancels the operation), the user interface returns to +your application, and you must intercept the {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} +method to receive the result of the intent and continue your application execution. For information +on how to receive the completed intent, see Receiving Camera Intent +Result.

    + + +

    Video capture intent

    +

    Capturing video using a camera intent is a quick way to enable your application to take videos +with minimal coding. A video capture intent can include the following extra information:

    + + + +

    The following example demonstrates how to construct a video capture intent and execute it. +The {@code getOutputMediaFileUri()} method in this example refers to the sample code shown in Saving Media Files.

    + +
    +private static final int CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE = 200;
    +private Uri fileUri;
    +
    +@Override
    +public void onCreate(Bundle savedInstanceState) {
    +    super.onCreate(savedInstanceState);
    +    setContentView(R.layout.main);
    +
    +    //create new Intent
    +    Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
    +
    +    fileUri = getOutputMediaFileUri(MEDIA_TYPE_VIDEO);  // create a file to save the video
+    intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri);  // set the video file name
+
+    intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1); // set the video quality to high
    +
    +    // start the Video Capture Intent
    +    startActivityForResult(intent, CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE);
    +}
    +
    + +

    When the {@link +android.app.Activity#startActivityForResult(android.content.Intent, int) +startActivityForResult()} method is executed, users see a modified camera application interface. +After the user finishes taking a video (or cancels the operation), the user interface +returns to your application, and you must intercept the {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} +method to receive the result of the intent and continue your application execution. For information +on how to receive the completed intent, see the next section.

    + +

    Receiving camera intent result

    +

    Once you have constructed and executed an image or video camera intent, your application must be +configured to receive the result of the intent. This section shows you how to intercept the callback +from a camera intent so your application can do further processing of the captured image or +video.

    + +

In order to receive the result of an intent, you must override the {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} method in the +activity that started the intent. The following example demonstrates how to override {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} to +capture the result of the image capture intent or video capture intent examples shown in the previous sections.

    + +
    +private static final int CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE = 100;
    +private static final int CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE = 200;
    +
    +@Override
    +protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    +    if (requestCode == CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE) {
    +        if (resultCode == RESULT_OK) {
    +            // Image captured and saved to fileUri specified in the Intent
    +            Toast.makeText(this, "Image saved to:\n" +
    +                     data.getData(), Toast.LENGTH_LONG).show();
    +        } else if (resultCode == RESULT_CANCELED) {
    +            // User cancelled the image capture
    +        } else {
    +            // Image capture failed, advise user
    +        }
    +    }
    +
    +    if (requestCode == CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE) {
    +        if (resultCode == RESULT_OK) {
    +            // Video captured and saved to fileUri specified in the Intent
    +            Toast.makeText(this, "Video saved to:\n" +
    +                     data.getData(), Toast.LENGTH_LONG).show();
    +        } else if (resultCode == RESULT_CANCELED) {
    +            // User cancelled the video capture
    +        } else {
    +            // Video capture failed, advise user
    +        }
    +    }
    +}
    +
    + +

    Once your activity receives a successful result, the captured image or video is available in the +specified location for your application to access.

    + + + +
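For instance, a successful image capture result could be displayed with a brief sketch like the following; the fileUri field (set when the intent was started) and the imageView control are illustrative assumptions, not part of the examples above:

```java
// Hypothetical follow-up in onActivityResult() after RESULT_OK:
// decode the saved file and show it. fileUri and imageView are assumed names.
Bitmap bitmap = BitmapFactory.decodeFile(fileUri.getPath());
if (bitmap != null) {
    imageView.setImageBitmap(bitmap);
}
```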

    Building a Camera App

    +

    Some developers may require a camera user interface that is customized to the look of their +application or provides special features. Creating a customized camera activity requires more +code than using an intent, but it can provide a more compelling experience +for your users.

    + +

    The general steps for creating a custom camera interface for your application are as follows:

    + + + +

Camera hardware is a shared resource that must be carefully managed so your application does +not collide with other applications that may also want to use it. The following sections discuss +how to detect camera hardware, how to request access to a camera, and how to release it when your +application is done using it.

    + +

Caution: Remember to release the {@link android.hardware.Camera} +object by calling {@link android.hardware.Camera#release() Camera.release()} when your +application is done using it! If your application does not properly release the camera, all +subsequent attempts to access the camera, including those by your own application, will fail and may +cause your application or other applications to be shut down.

    + + +

    Detecting camera hardware

    +

    If your application does not specifically require a camera using a manifest declaration, you +should check to see if a camera is available at runtime. To perform this check, use the {@link +android.content.pm.PackageManager#hasSystemFeature(java.lang.String) +PackageManager.hasSystemFeature()} method, as shown in the example code below:

    + +
    +/** Check if this device has a camera */
    +private boolean checkCameraHardware(Context context) {
    +    if (context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA)){
    +        // this device has a camera
    +        return true;
    +    } else {
    +        // no camera on this device
    +        return false;
    +    }
    +}
    +
    + +

Android devices can have multiple cameras, for example, a back-facing camera for photography and a +front-facing camera for video calls. Android 2.3 (API Level 9) and later allow you to check the +number of cameras available on a device using the {@link +android.hardware.Camera#getNumberOfCameras() Camera.getNumberOfCameras()} method.

    + +
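As a brief sketch of this API (a hypothetical helper, assuming API Level 9 or higher), you can walk the available cameras and pick one by facing:

```java
/** Hypothetical helper: returns the ID of the first front-facing camera, or -1 if none. */
private int findFrontFacingCamera() {
    int cameraCount = Camera.getNumberOfCameras();
    for (int cameraId = 0; cameraId < cameraCount; cameraId++) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            return cameraId; // pass this ID to Camera.open(int)
        }
    }
    return -1; // device has no front-facing camera
}
```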

    Accessing cameras

    +

    If you have determined that the device on which your application is running has a camera, you +must request to access it by getting an instance of {@link android.hardware.Camera} (unless you +are using an intent to access the camera).

    + +

    To access the primary camera, use the {@link android.hardware.Camera#open() Camera.open()} method +and be sure to catch any exceptions, as shown in the code below:

    + +
    +/** A safe way to get an instance of the Camera object. */
    +public static Camera getCameraInstance(){
    +    Camera c = null;
    +    try {
    +        c = Camera.open(); // attempt to get a Camera instance
    +    }
    +    catch (Exception e){
    +        // Camera is not available (in use or does not exist)
    +    }
    +    return c; // returns null if camera is unavailable
    +}
    +
    + +

    Caution: Always check for exceptions when using {@link +android.hardware.Camera#open() Camera.open()}. Failing to check for exceptions if the camera is in +use or does not exist will cause your application to be shut down by the system.

    + +

    On devices running Android 2.3 (API Level 9) or higher, you can access specific cameras using +{@link android.hardware.Camera#open(int) Camera.open(int)}. The example code above will access +the first, back-facing camera on a device with more than one camera.

    + +

    Checking camera features

    +

Once you obtain access to a camera, you can get further information about its capabilities using +the {@link android.hardware.Camera#getParameters() Camera.getParameters()} method and checking the +returned {@link android.hardware.Camera.Parameters} object for supported capabilities. When using +API Level 9 or higher, use the {@link android.hardware.Camera#getCameraInfo(int, +android.hardware.Camera.CameraInfo) Camera.getCameraInfo()} method to determine whether a camera is on the front +or back of the device, and the orientation of the image.

    + + + +
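As a minimal sketch (assuming you already hold a Camera instance named camera; autofocus is used purely as an example feature), a capability check might look like this:

```java
// Query the camera's current parameters and supported capabilities
Camera.Parameters params = camera.getParameters();
List<String> focusModes = params.getSupportedFocusModes();
if (focusModes != null && focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
    // the device supports autofocus; turn it on
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
    camera.setParameters(params);
}
```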

    Creating a preview class

    +

    For users to effectively take pictures or video, they must be able to see what the device camera +sees. A camera preview class is a {@link android.view.SurfaceView} that can display the live image +data coming from a camera, so users can frame and capture a picture or video.

    + +

    The following example code demonstrates how to create a basic camera preview class that can be +included in a {@link android.view.View} layout. This class implements {@link +android.view.SurfaceHolder.Callback SurfaceHolder.Callback} in order to capture the callback events +for creating and destroying the view, which are needed for assigning the camera preview input.

    + +
    +/** A basic Camera preview class */
    +public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
    +    private SurfaceHolder mHolder;
    +    private Camera mCamera;
    +
    +    public CameraPreview(Context context, Camera camera) {
    +        super(context);
    +        mCamera = camera;
    +
    +        // Install a SurfaceHolder.Callback so we get notified when the
    +        // underlying surface is created and destroyed.
    +        mHolder = getHolder();
    +        mHolder.addCallback(this);
    +        // deprecated setting, but required on Android versions prior to 3.0
    +        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    +    }
    +
    +    public void surfaceCreated(SurfaceHolder holder) {
    +        // The Surface has been created, now tell the camera where to draw the preview.
    +        try {
    +            mCamera.setPreviewDisplay(holder);
    +            mCamera.startPreview();
    +        } catch (IOException e) {
    +            Log.d(TAG, "Error setting camera preview: " + e.getMessage());
    +        }
    +    }
    +
    +    public void surfaceDestroyed(SurfaceHolder holder) {
    +        // empty. Take care of releasing the Camera preview in your activity.
    +    }
    +
    +    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    +        // If your preview can change or rotate, take care of those events here.
    +        // Make sure to stop the preview before resizing or reformatting it.
    +
    +        if (mHolder.getSurface() == null){
    +          // preview surface does not exist
    +          return;
    +        }
    +
    +        // stop preview before making changes
    +        try {
    +            mCamera.stopPreview();
    +        } catch (Exception e){
    +          // ignore: tried to stop a non-existent preview
    +        }
    +
    +        // make any resize, rotate or reformatting changes here
    +
    +        // start preview with new settings
    +        try {
    +            mCamera.setPreviewDisplay(mHolder);
    +            mCamera.startPreview();
    +
    +        } catch (Exception e){
    +            Log.d(TAG, "Error starting camera preview: " + e.getMessage());
    +        }
    +    }
    +}
    +
    + + +

    Placing preview in a layout

    +

    A camera preview class, such as the example shown in the previous section, must be placed in the +layout of an activity along with other user interface controls for taking a picture or video. This +section shows you how to build a basic layout and activity for the preview.

    + +

The following layout code provides a very basic view that can be used to display a camera +preview. In this example, the {@link android.widget.FrameLayout} element is meant to be the +container for the camera preview class. This layout type is used so that additional picture +information or controls can be overlaid on the live camera preview images.

    + +
    +<?xml version="1.0" encoding="utf-8"?>
    +<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    +    android:orientation="horizontal"
    +    android:layout_width="fill_parent"
    +    android:layout_height="fill_parent"
    +    >
    +  <FrameLayout
    +    android:id="@+id/camera_preview"
    +    android:layout_width="fill_parent"
    +    android:layout_height="fill_parent"
    +    android:layout_weight="1"
    +    />
    +
    +  <Button
    +    android:id="@+id/button_capture"
    +    android:text="Capture"
    +    android:layout_width="wrap_content"
    +    android:layout_height="wrap_content"
    +    android:layout_gravity="center"
    +    />
    +</LinearLayout>
    +
    + +

    On most devices, the default orientation of the camera preview is landscape. This example layout +specifies a horizontal (landscape) layout and the code below fixes the orientation of the +application to landscape. For simplicity in rendering a camera preview, you should change your +application's preview activity orientation to landscape by adding the following to your +manifest.

    + +
    +<activity android:name=".CameraActivity"
    +          android:label="@string/app_name"
    +
    +          android:screenOrientation="landscape">
    +          <!-- configure this activity to use landscape orientation -->
    +
    +          <intent-filter>
    +        <action android:name="android.intent.action.MAIN" />
    +        <category android:name="android.intent.category.LAUNCHER" />
    +    </intent-filter>
    +</activity>
    +
    + +

Note: A camera preview does not have to be in landscape mode. +Starting in Android 2.2 (API Level 8), you can use the {@link +android.hardware.Camera#setDisplayOrientation(int) setDisplayOrientation()} method to set the +rotation of the preview image. In order to change preview orientation as the user re-orients the +phone, within the {@link +android.view.SurfaceHolder.Callback#surfaceChanged(android.view.SurfaceHolder, int, int, int) +surfaceChanged()} method of your preview class, first stop the preview with {@link +android.hardware.Camera#stopPreview() Camera.stopPreview()}, change the orientation, and then +start the preview again with {@link android.hardware.Camera#startPreview() +Camera.startPreview()}.

    + +

    In the activity for your camera view, add your preview class to the {@link +android.widget.FrameLayout} element shown in the example above. Your camera activity must also +ensure that it releases the camera when it is paused or shut down. The following example shows how +to modify a camera activity to attach the preview class shown in Creating +a preview class.

    + +
    +public class CameraActivity extends Activity {
    +
    +    private Camera mCamera;
    +    private CameraPreview mPreview;
    +
    +    @Override
    +    public void onCreate(Bundle savedInstanceState) {
    +        super.onCreate(savedInstanceState);
    +        setContentView(R.layout.main);
    +
    +        // Create an instance of Camera
    +        mCamera = getCameraInstance();
    +
    +        // Create our Preview view and set it as the content of our activity.
    +        mPreview = new CameraPreview(this, mCamera);
    +        FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
    +        preview.addView(mPreview);
    +    }
    +}
    +
    + +

    Note: The {@code getCameraInstance()} method in the example above +refers to the example method shown in Accessing cameras.

    + + +

    Capturing pictures

    +

    Once you have built a preview class and a view layout in which to display it, you are ready to +start capturing images with your application. In your application code, you must set up listeners +for your user interface controls to respond to a user action by taking a picture.

    + +

    In order to retrieve a picture, use the {@link +android.hardware.Camera#takePicture(android.hardware.Camera.ShutterCallback, +android.hardware.Camera.PictureCallback, android.hardware.Camera.PictureCallback) +Camera.takePicture()} method. This method takes three parameters which receive data from the camera. +In order to receive data in a JPEG format, you must implement an {@link +android.hardware.Camera.PictureCallback} interface to receive the image data and +write it to a file. The following code shows a basic implementation of the {@link +android.hardware.Camera.PictureCallback} interface to save an image received from the camera.

    + +
    +private PictureCallback mPicture = new PictureCallback() {
    +
    +    @Override
    +    public void onPictureTaken(byte[] data, Camera camera) {
    +
    +        File pictureFile = getOutputMediaFile(MEDIA_TYPE_IMAGE);
    +        if (pictureFile == null){
    +            Log.d(TAG, "Error creating media file, check storage permissions");
    +            return;
    +        }
    +
    +        try {
    +            FileOutputStream fos = new FileOutputStream(pictureFile);
    +            fos.write(data);
    +            fos.close();
    +        } catch (FileNotFoundException e) {
    +            Log.d(TAG, "File not found: " + e.getMessage());
    +        } catch (IOException e) {
    +            Log.d(TAG, "Error accessing file: " + e.getMessage());
    +        }
    +    }
    +};
    +
    + +

    Trigger capturing an image by calling the {@link +android.hardware.Camera#takePicture(android.hardware.Camera.ShutterCallback, +android.hardware.Camera.PictureCallback, android.hardware.Camera.PictureCallback) +Camera.takePicture()} method. The following example code shows how to call this method from a +button {@link android.view.View.OnClickListener}.

    + +
    +// Add a listener to the Capture button
    +Button captureButton = (Button) findViewById(R.id.button_capture);
    +captureButton.setOnClickListener(
    +    new View.OnClickListener() {
    +        @Override
    +        public void onClick(View v) {
    +            // get an image from the camera
    +            mCamera.takePicture(null, null, mPicture);
    +        }
    +    }
    +);
    +
    + +

Note: The {@code mPicture} member in the example above refers to the {@link android.hardware.Camera.PictureCallback} implementation shown earlier.

    + +

Caution: Remember to release the {@link android.hardware.Camera} object by calling the {@link android.hardware.Camera#release() Camera.release()} method when your application is done using it! For information about how to release the camera, see Releasing the camera.

    + + +

    Capturing videos

    + +

    Video capture using the Android framework requires careful management of the {@link +android.hardware.Camera} object and coordination with the {@link android.media.MediaRecorder} +class. When recording video with {@link android.hardware.Camera}, you must manage the {@link +android.hardware.Camera#lock() Camera.lock()} and {@link android.hardware.Camera#unlock() +Camera.unlock()} calls to allow {@link android.media.MediaRecorder} access to the camera hardware, +in addition to the {@link android.hardware.Camera#open() Camera.open()} and {@link +android.hardware.Camera#release() Camera.release()} calls.

    + +

    Note: Starting with Android 4.0 (API level 14), the {@link +android.hardware.Camera#lock() Camera.lock()} and {@link android.hardware.Camera#unlock() +Camera.unlock()} calls are managed for you automatically.

    + +

    Unlike taking pictures with a device camera, capturing video requires a very particular call +order. You must follow a specific order of execution to successfully prepare for and capture video +with your application, as detailed below.

    + +
      +
1. Open Camera - Use {@link android.hardware.Camera#open() Camera.open()} to get an instance of the camera object.

2. Connect Preview - Prepare a live camera image preview by connecting a {@link android.view.SurfaceView} to the camera using {@link android.hardware.Camera#setPreviewDisplay(android.view.SurfaceHolder) Camera.setPreviewDisplay()}.

3. Start Preview - Call {@link android.hardware.Camera#startPreview() Camera.startPreview()} to begin displaying the live camera images.

4. Start Recording Video - The following steps must be completed in order to successfully record video:

   a. Unlock the Camera - Unlock the camera for use by {@link android.media.MediaRecorder} by calling {@link android.hardware.Camera#unlock() Camera.unlock()}.

   b. Configure MediaRecorder - Call the following {@link android.media.MediaRecorder} methods in this order. For more information, see the {@link android.media.MediaRecorder} reference documentation.

      1. {@link android.media.MediaRecorder#setCamera(android.hardware.Camera) setCamera()} - Set the camera to be used for video capture; use your application's current instance of {@link android.hardware.Camera}.

      2. {@link android.media.MediaRecorder#setAudioSource(int) setAudioSource()} - Set the audio source; use {@link android.media.MediaRecorder.AudioSource#CAMCORDER MediaRecorder.AudioSource.CAMCORDER}.

      3. {@link android.media.MediaRecorder#setVideoSource(int) setVideoSource()} - Set the video source; use {@link android.media.MediaRecorder.VideoSource#CAMERA MediaRecorder.VideoSource.CAMERA}.

      4. Set the video output format and encoding. For Android 2.2 (API Level 8) and higher, use the {@link android.media.MediaRecorder#setProfile(android.media.CamcorderProfile) MediaRecorder.setProfile()} method, and get a profile instance using {@link android.media.CamcorderProfile#get(int) CamcorderProfile.get()}. For versions of Android prior to 2.2, you must set the video output format and encoding parameters individually:

         - {@link android.media.MediaRecorder#setOutputFormat(int) setOutputFormat()} - Set the output format; specify the default setting or {@link android.media.MediaRecorder.OutputFormat#MPEG_4 MediaRecorder.OutputFormat.MPEG_4}.

         - {@link android.media.MediaRecorder#setAudioEncoder(int) setAudioEncoder()} - Set the sound encoding type; specify the default setting or {@link android.media.MediaRecorder.AudioEncoder#AMR_NB MediaRecorder.AudioEncoder.AMR_NB}.

         - {@link android.media.MediaRecorder#setVideoEncoder(int) setVideoEncoder()} - Set the video encoding type; specify the default setting or {@link android.media.MediaRecorder.VideoEncoder#MPEG_4_SP MediaRecorder.VideoEncoder.MPEG_4_SP}.

      5. {@link android.media.MediaRecorder#setOutputFile(java.lang.String) setOutputFile()} - Set the output file; use {@code getOutputMediaFile(MEDIA_TYPE_VIDEO).toString()} from the example method in the Saving Media Files section.

      6. {@link android.media.MediaRecorder#setPreviewDisplay(android.view.Surface) setPreviewDisplay()} - Specify the {@link android.view.SurfaceView} preview layout element for your application. Use the same object you specified for Connect Preview.

      Caution: You must call these {@link android.media.MediaRecorder} configuration methods in this order, otherwise your application will encounter errors and the recording will fail.

   c. Prepare MediaRecorder - Prepare the {@link android.media.MediaRecorder} with the provided configuration settings by calling {@link android.media.MediaRecorder#prepare() MediaRecorder.prepare()}.

   d. Start MediaRecorder - Start recording video by calling {@link android.media.MediaRecorder#start() MediaRecorder.start()}.

5. Stop Recording Video - Call the following methods in order to successfully complete a video recording:

   a. Stop MediaRecorder - Stop recording video by calling {@link android.media.MediaRecorder#stop() MediaRecorder.stop()}.

   b. Reset MediaRecorder - Optionally, remove the configuration settings from the recorder by calling {@link android.media.MediaRecorder#reset() MediaRecorder.reset()}.

   c. Release MediaRecorder - Release the {@link android.media.MediaRecorder} by calling {@link android.media.MediaRecorder#release() MediaRecorder.release()}.

   d. Lock the Camera - Lock the camera so that future {@link android.media.MediaRecorder} sessions can use it by calling {@link android.hardware.Camera#lock() Camera.lock()}. Starting with Android 4.0 (API level 14), this call is not required unless the {@link android.media.MediaRecorder#prepare() MediaRecorder.prepare()} call fails.

6. Stop the Preview - When your activity has finished using the camera, stop the preview using {@link android.hardware.Camera#stopPreview() Camera.stopPreview()}.

7. Release Camera - Release the camera so that other applications can use it by calling {@link android.hardware.Camera#release() Camera.release()}.
    + +

    Note: It is possible to use {@link android.media.MediaRecorder} +without creating a camera preview first and skip the first few steps of this process. However, +since users typically prefer to see a preview before starting a recording, that process is not +discussed here.

    + +

    Configuring MediaRecorder

    +

    When using the {@link android.media.MediaRecorder} class to record video, you must perform +configuration steps in a specific order and then call the {@link +android.media.MediaRecorder#prepare() MediaRecorder.prepare()} method to check and implement the +configuration. The following example code demonstrates how to properly configure and prepare the +{@link android.media.MediaRecorder} class for video recording.

    + +
    +private boolean prepareVideoRecorder(){
    +
    +    mCamera = getCameraInstance();
    +    mMediaRecorder = new MediaRecorder();
    +
    +    // Step 1: Unlock and set camera to MediaRecorder
    +    mCamera.unlock();
    +    mMediaRecorder.setCamera(mCamera);
    +
    +    // Step 2: Set sources
    +    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    +    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    +
    +    // Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
    +    mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    +
    +    // Step 4: Set output file
    +    mMediaRecorder.setOutputFile(getOutputMediaFile(MEDIA_TYPE_VIDEO).toString());
    +
    +    // Step 5: Set the preview output
    +    mMediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
    +
    +    // Step 6: Prepare configured MediaRecorder
    +    try {
    +        mMediaRecorder.prepare();
    +    } catch (IllegalStateException e) {
    +        Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
    +        releaseMediaRecorder();
    +        return false;
    +    } catch (IOException e) {
    +        Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
    +        releaseMediaRecorder();
    +        return false;
    +    }
    +    return true;
    +}
    +
    + +

Prior to Android 2.2 (API Level 8), you must set the output format and encoding parameters directly, instead of using {@link android.media.CamcorderProfile}. This approach is demonstrated in the following code:

    + +
    +    // Step 3: Set output format and encoding (for versions prior to API Level 8)
    +    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    +    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
    +    mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
    +
    + +

A number of video recording parameters for {@link android.media.MediaRecorder} are given default settings; however, you may want to adjust these settings for your application.

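As an illustrative sketch (not part of this guide's example app), a few of these recorder settings can be overridden explicitly; the specific values below are arbitrary assumptions, and these calls substitute for, rather than follow, a {@code setProfile()} call:

```java
// Sketch only: explicit recorder settings in place of a CamcorderProfile.
// Values are arbitrary examples; call these after setAudioSource()/setVideoSource().
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoSize(640, 480);           // output resolution in pixels
mMediaRecorder.setVideoFrameRate(20);            // frames per second
mMediaRecorder.setVideoEncodingBitRate(1000000); // bits per second (API Level 8+)
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
```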

    Starting and Stopping MediaRecorder

    +

    When starting and stopping video recording using the {@link android.media.MediaRecorder} class, +you must follow a specific order, as listed below.

    + +
      +
1. Unlock the camera with {@link android.hardware.Camera#unlock() Camera.unlock()}.
2. Configure {@link android.media.MediaRecorder} as shown in the code example above.
3. Start recording using {@link android.media.MediaRecorder#start() MediaRecorder.start()}.
4. Record the video.
5. Stop recording using {@link android.media.MediaRecorder#stop() MediaRecorder.stop()}.
6. Release the media recorder with {@link android.media.MediaRecorder#release() MediaRecorder.release()}.
7. Lock the camera using {@link android.hardware.Camera#lock() Camera.lock()}.
    + +

    The following example code demonstrates how to wire up a button to properly start and stop +video recording using the camera and the {@link android.media.MediaRecorder} class.

    + +

    Note: When completing a video recording, do not release the camera +or else your preview will be stopped.

    + +
    +private boolean isRecording = false;
    +
    +// Add a listener to the Capture button
    +Button captureButton = (Button) findViewById(R.id.button_capture);
    +captureButton.setOnClickListener(
    +    new View.OnClickListener() {
    +        @Override
    +        public void onClick(View v) {
    +            if (isRecording) {
    +                // stop recording and release camera
    +                mMediaRecorder.stop();  // stop the recording
    +                releaseMediaRecorder(); // release the MediaRecorder object
    +                mCamera.lock();         // take camera access back from MediaRecorder
    +
    +                // inform the user that recording has stopped
    +                setCaptureButtonText("Capture");
    +                isRecording = false;
    +            } else {
    +                // initialize video camera
    +                if (prepareVideoRecorder()) {
    +                    // Camera is available and unlocked, MediaRecorder is prepared,
    +                    // now you can start recording
    +                    mMediaRecorder.start();
    +
    +                    // inform the user that recording has started
    +                    setCaptureButtonText("Stop");
    +                    isRecording = true;
    +                } else {
    +                    // prepare didn't work, release the camera
    +                    releaseMediaRecorder();
    +                    // inform user
    +                }
    +            }
    +        }
    +    }
    +);
    +
    + +

Note: In the above example, the {@code prepareVideoRecorder()} method refers to the example code shown in Configuring MediaRecorder. This method takes care of unlocking the camera and of configuring and preparing the {@link android.media.MediaRecorder} instance.

    + + +

    Releasing the camera

    +

Cameras are a resource that is shared by applications on a device. Your application can make use of the camera after getting an instance of {@link android.hardware.Camera}, and you must be particularly careful to release the camera object when your application stops using it, and as soon as your application is paused ({@link android.app.Activity#onPause() Activity.onPause()}). If your application does not properly release the camera, all subsequent attempts to access the camera, including those by your own application, will fail and may cause your application or other applications to be shut down.

    + +

    To release an instance of the {@link android.hardware.Camera} object, use the {@link +android.hardware.Camera#release() Camera.release()} method, as shown in the example code below.

    + +
    +public class CameraActivity extends Activity {
    +    private Camera mCamera;
    +    private SurfaceView mPreview;
    +    private MediaRecorder mMediaRecorder;
    +
    +    ...
    +    
    +    @Override
    +    protected void onPause() {
    +        super.onPause();
    +        releaseMediaRecorder();       // if you are using MediaRecorder, release it first
    +        releaseCamera();              // release the camera immediately on pause event
    +    }
    +
    +    private void releaseMediaRecorder(){
    +        if (mMediaRecorder != null) {
    +            mMediaRecorder.reset();   // clear recorder configuration
    +            mMediaRecorder.release(); // release the recorder object
    +            mMediaRecorder = null;
    +            mCamera.lock();           // lock camera for later use
    +        }
    +    }
    +
    +    private void releaseCamera(){
    +        if (mCamera != null){
    +            mCamera.release();        // release the camera for other applications
    +            mCamera = null;
    +        }
    +    }
    +}
    +
    + +

Caution: If your application does not properly release the camera, all subsequent attempts to access the camera, including those by your own application, will fail and may cause your application or other applications to be shut down.

    + + +

    Saving Media Files

    +

Media files created by users, such as pictures and videos, should be saved to the device's external storage directory (SD card) to conserve system space and to allow users to access these files without your application. There are many possible directory locations to save media files on a device; however, there are only two standard locations you should consider as a developer:

• {@link android.os.Environment#getExternalStoragePublicDirectory(java.lang.String) Environment.getExternalStoragePublicDirectory()} - Returns the standard, shared location for saving pictures and videos. Files saved here are public, so other applications can discover and read them, and they persist after your application is uninstalled.
• {@link android.content.Context#getExternalFilesDir(java.lang.String) Context.getExternalFilesDir()} - Returns a location on external storage that is associated with your application. Files saved here are removed when your application is uninstalled.

The following example code demonstrates how to create a {@link java.io.File} or {@link android.net.Uri} location for a media file that can be used when invoking a device's camera with an {@link android.content.Intent} or as part of building a camera app.

    + +
    +public static final int MEDIA_TYPE_IMAGE = 1;
    +public static final int MEDIA_TYPE_VIDEO = 2;
    +
    +/** Create a file Uri for saving an image or video */
    +private static Uri getOutputMediaFileUri(int type){
    +      return Uri.fromFile(getOutputMediaFile(type));
    +}
    +
    +/** Create a File for saving an image or video */
    +private static File getOutputMediaFile(int type){
    +    // To be safe, you should check that the SDCard is mounted
    +    // using Environment.getExternalStorageState() before doing this.
    +
    +    File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
    +              Environment.DIRECTORY_PICTURES), "MyCameraApp");
    +    // This location works best if you want the created images to be shared
    +    // between applications and persist after your app has been uninstalled.
    +
    +    // Create the storage directory if it does not exist
    +    if (! mediaStorageDir.exists()){
    +        if (! mediaStorageDir.mkdirs()){
    +            Log.d("MyCameraApp", "failed to create directory");
    +            return null;
    +        }
    +    }
    +
    +    // Create a media file name
    +    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
    +    File mediaFile;
    +    if (type == MEDIA_TYPE_IMAGE){
    +        mediaFile = new File(mediaStorageDir.getPath() + File.separator +
    +        "IMG_"+ timeStamp + ".jpg");
    +    } else if(type == MEDIA_TYPE_VIDEO) {
    +        mediaFile = new File(mediaStorageDir.getPath() + File.separator +
    +        "VID_"+ timeStamp + ".mp4");
    +    } else {
    +        return null;
    +    }
    +
    +    return mediaFile;
    +}
    +
    + +

    Note: {@link +android.os.Environment#getExternalStoragePublicDirectory(java.lang.String) +Environment.getExternalStoragePublicDirectory()} is available in Android 2.2 (API Level 8) or +higher. If you are targeting devices with earlier versions of Android, use {@link +android.os.Environment#getExternalStorageDirectory() Environment.getExternalStorageDirectory()} +instead. For more information, see Saving Shared Files.

    + +
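For apps that must also run on earlier versions, the fallback mentioned in the note above might look like this sketch (the {@code "Pictures/MyCameraApp"} subpath is an arbitrary assumption):

```java
// Sketch only: choose a storage directory that works before and after API Level 8.
File mediaStorageDir;
if (android.os.Build.VERSION.SDK_INT >= 8) {
    mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
            Environment.DIRECTORY_PICTURES), "MyCameraApp");
} else {
    // Pre-API-8 fallback: build the path from the external storage root.
    mediaStorageDir = new File(Environment.getExternalStorageDirectory(),
            "Pictures/MyCameraApp");
}
```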

    For more information about saving files on an Android device, see Data Storage.

    \ No newline at end of file diff --git a/docs/html/guide/topics/media/index.jd b/docs/html/guide/topics/media/index.jd index 06e6208154706..7c1754feb94f5 100644 --- a/docs/html/guide/topics/media/index.jd +++ b/docs/html/guide/topics/media/index.jd @@ -1,971 +1,62 @@ -page.title=Media +page.title=Multimedia and Camera @jd:body
    -

    Quickview

    -
      -
• MediaPlayer APIs allow you to play and record media
• You can handle data from raw resources, files, and streams
• The platform supports a variety of media formats. See Android Supported Media Formats
    - -

    In this document

    +

    Topics

      -
-  Using MediaPlayer
-    Asynchronous Preparation
-    Managing State
-    Releasing the MediaPlayer
-  Using a Service with MediaPlayer
-    Running asynchronously
-    Handling asynchronous errors
-    Using wake locks
-    Running as a foreground service
-    Handling audio focus
-    Performing cleanup
-  Handling the AUDIO_BECOMING_NOISY Intent
-  Retrieving Media from a Content Resolver
-  Playing JET content
-  Performing Audio Capture
+  MediaPlayer
+  JetPlayer
+  Camera
+  Audio Capture

    Key classes

   {@link android.media.MediaPlayer}
+  {@link android.media.JetPlayer}
+  {@link android.hardware.Camera}
   {@link android.media.MediaRecorder}
   {@link android.media.AudioManager}
-  {@link android.media.JetPlayer}
   {@link android.media.SoundPool}

    See also

      -
   Data Storage
+  Android Supported Media Formats
   JetCreator User Manual
    -

    The Android multimedia framework includes support for encoding and decoding a -variety of common media types, so that you can easily integrate audio, -video and images into your applications. You can play audio or video from media files stored in your -application's resources (raw resources), from standalone files in the filesystem, or from a data -stream arriving over a network connection, all using {@link android.media.MediaPlayer} APIs.

    - -

    You can also record audio and video using the {@link android.media.MediaRecorder} APIs if -supported by the device hardware. Note that the emulator doesn't have hardware to capture audio or -video, but actual mobile devices are likely to provide these capabilities.

    - -

    This document shows you how to write a media-playing application that interacts with the user and -the system in order to obtain good performance and a pleasant user experience.

    - -

    Note: You can play back the audio data only to the standard output -device. Currently, that is the mobile device speaker or a Bluetooth headset. You cannot play sound -files in the conversation audio during a call.

    - - -

    Using MediaPlayer

    - -

One of the most important components of the media framework is the {@link android.media.MediaPlayer MediaPlayer} class. An object of this class can fetch, decode, and play both audio and video with minimal setup. It supports several different media sources, such as:

• Local resources
• Internal URIs, such as one you might obtain from a Content Resolver
• External URLs (streaming)

    - -

    For a list of media formats that Android supports, -see the Android Supported Media -Formats document.

    - -

    Here is an example -of how to play audio that's available as a local raw resource (saved in your application's -{@code res/raw/} directory):

    - -
    MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.sound_file_1);
    -mediaPlayer.start(); // no need to call prepare(); create() does that for you
    -
    - -

    In this case, a "raw" resource is a file that the system does not -try to parse in any particular way. However, the content of this resource should not -be raw audio. It should be a properly encoded and formatted media file in one -of the supported formats.

    - -

    And here is how you might play from a URI available locally in the system -(that you obtained through a Content Resolver, for instance):

    - -
    Uri myUri = ....; // initialize Uri here
    -MediaPlayer mediaPlayer = new MediaPlayer();
    -mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    -mediaPlayer.setDataSource(getApplicationContext(), myUri);
    -mediaPlayer.prepare();
    -mediaPlayer.start();
    - -

    Playing from a remote URL via HTTP streaming looks like this:

    - -
    String url = "http://........"; // your URL here
    -MediaPlayer mediaPlayer = new MediaPlayer();
    -mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    -mediaPlayer.setDataSource(url);
    -mediaPlayer.prepare(); // might take long! (for buffering, etc)
    -mediaPlayer.start();
    - -

    Note: -If you're passing a URL to stream an online media file, the file must be capable of -progressive download.

    - -

    Caution: You must either catch or pass -{@link java.lang.IllegalArgumentException} and {@link java.io.IOException} when using -{@link android.media.MediaPlayer#setDataSource setDataSource()}, because -the file you are referencing might not exist.

    - -
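A sketch of the error handling the caution above calls for, reusing the streaming example (the recovery action, plain logging, is an assumption):

```java
// Sketch only: guard setDataSource() and prepare() against bad inputs.
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
try {
    mediaPlayer.setDataSource(url); // url as in the streaming example above
    mediaPlayer.prepare();
    mediaPlayer.start();
} catch (IllegalArgumentException e) {
    Log.w(TAG, "Malformed data source: " + e.getMessage());
} catch (IOException e) {
    Log.w(TAG, "Could not open data source: " + e.getMessage());
}
```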

    Asynchronous Preparation

    - -

    Using {@link android.media.MediaPlayer MediaPlayer} can be straightforward in -principle. However, it's important to keep in mind that a few more things are -necessary to integrate it correctly with a typical Android application. For -example, the call to {@link android.media.MediaPlayer#prepare prepare()} can -take a long time to execute, because -it might involve fetching and decoding media data. So, as is the case with any -method that may take long to execute, you should never call it from your -application's UI thread. Doing that will cause the UI to hang until the method returns, -which is a very bad user experience and can cause an ANR (Application Not Responding) error. Even if -you expect your resource to load quickly, remember that anything that takes more than a tenth -of a second to respond in the UI will cause a noticeable pause and will give -the user the impression that your application is slow.

    - -

    To avoid hanging your UI thread, spawn another thread to -prepare the {@link android.media.MediaPlayer} and notify the main thread when done. However, while -you could write the threading logic -yourself, this pattern is so common when using {@link android.media.MediaPlayer} that the framework -supplies a convenient way to accomplish this task by using the -{@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. This method -starts preparing the media in the background and returns immediately. When the media -is done preparing, the {@link android.media.MediaPlayer.OnPreparedListener#onPrepared onPrepared()} -method of the {@link android.media.MediaPlayer.OnPreparedListener -MediaPlayer.OnPreparedListener}, configured through -{@link android.media.MediaPlayer#setOnPreparedListener setOnPreparedListener()} is called.

    - -
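A minimal sketch of the asynchronous pattern just described (error handling for {@code setDataSource()} is omitted for brevity):

```java
// Sketch only: prepare in the background, then start playback when ready.
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(url); // may throw IOException; handling omitted
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start(); // invoked once the media is ready for playback
    }
});
mediaPlayer.prepareAsync(); // returns immediately instead of blocking the UI thread
```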

    Managing State

    - -

Another aspect of a {@link android.media.MediaPlayer} that you should keep in mind is that it's state-based. That is, the {@link android.media.MediaPlayer} has an internal state that you must always be aware of when writing your code, because certain operations are only valid when the player is in specific states. If you perform an operation while in the wrong state, the system may throw an exception or cause other undesirable behaviors.

    - -

The documentation in the {@link android.media.MediaPlayer MediaPlayer} class shows a complete state diagram that clarifies which methods move the {@link android.media.MediaPlayer} from one state to another. For example, when you create a new {@link android.media.MediaPlayer}, it is in the Idle state. At that point, you should initialize it by calling {@link android.media.MediaPlayer#setDataSource setDataSource()}, bringing it to the Initialized state. After that, you have to prepare it using either the {@link android.media.MediaPlayer#prepare prepare()} or {@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. When the {@link android.media.MediaPlayer} is done preparing, it enters the Prepared state, which means you can call {@link android.media.MediaPlayer#start start()} to make it play the media. At that point, as the diagram illustrates, you can move between the Started, Paused and PlaybackCompleted states by calling methods such as {@link android.media.MediaPlayer#start start()}, {@link android.media.MediaPlayer#pause pause()}, and {@link android.media.MediaPlayer#seekTo seekTo()}, among others. When you call {@link android.media.MediaPlayer#stop stop()}, however, notice that you cannot call {@link android.media.MediaPlayer#start start()} again until you prepare the {@link android.media.MediaPlayer} again.

    - -

    Always keep the state diagram -in mind when writing code that interacts with a -{@link android.media.MediaPlayer} object, because calling its methods from the wrong state is a -common cause of bugs.

    - -
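For example, here is a sketch of one legal path through the state diagram; note the re-preparation required after {@code stop()}:

```java
// Sketch only: legal state transitions for an already-prepared player.
mediaPlayer.start();   // Prepared -> Started
mediaPlayer.pause();   // Started -> Paused
mediaPlayer.start();   // Paused -> Started (legal)
mediaPlayer.stop();    // Started -> Stopped
mediaPlayer.prepare(); // Stopped -> Prepared (throws IOException; handling omitted)
mediaPlayer.start();   // legal again only now that the player is re-prepared
```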

    Releasing the MediaPlayer

    - -

A {@link android.media.MediaPlayer MediaPlayer} can consume valuable system resources. Therefore, you should always take extra precautions to make sure you are not hanging on to a {@link android.media.MediaPlayer} instance longer than necessary. When you are done with it, you should always call {@link android.media.MediaPlayer#release release()} to make sure any system resources allocated to it are properly released. For example, if you are using a {@link android.media.MediaPlayer} and your activity receives a call to {@link android.app.Activity#onStop onStop()}, you must release the {@link android.media.MediaPlayer}, because it makes little sense to hold on to it while your activity is not interacting with the user (unless you are playing media in the background, which is discussed in the next section). When your activity is resumed or restarted, of course, you need to create a new {@link android.media.MediaPlayer} and prepare it again before resuming playback.

    Here's how you should release and then nullify your {@link android.media.MediaPlayer}:

mediaPlayer.release();
mediaPlayer = null;

As an example, consider the problems that could happen if you forgot to release the {@link android.media.MediaPlayer} when your activity is stopped, but create a new one when the activity starts again. As you may know, when the user changes the screen orientation (or changes the device configuration in another way), the system handles that by restarting the activity (by default), so you might quickly consume all of the system resources as the user rotates the device back and forth between portrait and landscape, because at each orientation change, you create a new {@link android.media.MediaPlayer} that you never release. (For more information about runtime restarts, see Handling Runtime Changes.)

You may be wondering what happens if you want to continue playing "background media" even when the user leaves your activity, much in the same way that the built-in Music application behaves. In this case, what you need is a {@link android.media.MediaPlayer MediaPlayer} controlled by a {@link android.app.Service}, as discussed in Using a Service with MediaPlayer.

    Using a Service with MediaPlayer


If you want your media to play in the background even when your application is not onscreen—that is, you want it to continue playing while the user is interacting with other applications—then you must start a {@link android.app.Service Service} and control the {@link android.media.MediaPlayer MediaPlayer} instance from there. You should be careful about this setup, because the user and the system have expectations about how an application running a background service should interact with the rest of the system. If your application does not fulfill those expectations, the user may have a bad experience. This section describes the main issues that you should be aware of and offers suggestions about how to approach them.

    Running asynchronously


First of all, like an {@link android.app.Activity Activity}, all work in a {@link android.app.Service Service} is done in a single thread by default—in fact, if you're running an activity and a service from the same application, they use the same thread (the "main thread") by default. Therefore, services need to process incoming intents quickly and never perform lengthy computations when responding to them. If any heavy work or blocking calls are expected, you must do those tasks asynchronously: either from another thread you implement yourself, or using the framework's many facilities for asynchronous processing.

For instance, when using a {@link android.media.MediaPlayer} from your main thread, you should call {@link android.media.MediaPlayer#prepareAsync prepareAsync()} rather than {@link android.media.MediaPlayer#prepare prepare()}, and implement a {@link android.media.MediaPlayer.OnPreparedListener MediaPlayer.OnPreparedListener} in order to be notified when the preparation is complete and you can start playing. For example:

public class MyService extends Service implements MediaPlayer.OnPreparedListener {
    private static final String ACTION_PLAY = "com.example.action.PLAY";
    MediaPlayer mMediaPlayer = null;

    public int onStartCommand(Intent intent, int flags, int startId) {
        ...
        if (intent.getAction().equals(ACTION_PLAY)) {
            mMediaPlayer = ... // initialize it here
            mMediaPlayer.setOnPreparedListener(this);
            mMediaPlayer.prepareAsync(); // prepare async to not block main thread
        }
        return START_STICKY;
    }

    /** Called when MediaPlayer is ready */
    public void onPrepared(MediaPlayer player) {
        player.start();
    }
}

    Handling asynchronous errors


In synchronous operations, errors are normally signaled with an exception or an error code, but whenever you use asynchronous resources, you should make sure your application is notified of errors appropriately. In the case of a {@link android.media.MediaPlayer MediaPlayer}, you can accomplish this by implementing a {@link android.media.MediaPlayer.OnErrorListener MediaPlayer.OnErrorListener} and setting it on your {@link android.media.MediaPlayer} instance:

public class MyService extends Service implements MediaPlayer.OnErrorListener {
    MediaPlayer mMediaPlayer;

    public void initMediaPlayer() {
        // ...initialize the MediaPlayer here...

        mMediaPlayer.setOnErrorListener(this);
    }

    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // ... react appropriately ...
        // The MediaPlayer has moved to the Error state, must be reset!
        return true; // true indicates the error was handled
    }
}

It's important to remember that when an error occurs, the {@link android.media.MediaPlayer} moves to the Error state (see the documentation for the {@link android.media.MediaPlayer MediaPlayer} class for the full state diagram) and you must reset it before you can use it again.

    Using wake locks


When you design an application that plays media in the background, the device may go to sleep while your service is running. Because the Android system tries to conserve battery while the device is sleeping, it tries to shut off any of the phone's features that are not necessary, including the CPU and the Wi-Fi hardware. However, if your service is playing or streaming music, you want to prevent the system from interfering with your playback.

In order to ensure that your service continues to run under those conditions, you have to use "wake locks." A wake lock is a way to signal to the system that your application is using some feature that should stay available even if the phone is idle.

Note: You should always use wake locks sparingly and hold them only for as long as truly necessary, because they significantly reduce the battery life of the device.

To ensure that the CPU continues running while your {@link android.media.MediaPlayer} is playing, call the {@link android.media.MediaPlayer#setWakeMode setWakeMode()} method when initializing your {@link android.media.MediaPlayer}. Once you do, the {@link android.media.MediaPlayer} holds the specified lock while playing and releases the lock when paused or stopped:

mMediaPlayer = new MediaPlayer();
// ... other initialization here ...
mMediaPlayer.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK);

However, the wake lock acquired in this example guarantees only that the CPU remains awake. If you are streaming media over the network and you are using Wi-Fi, you probably want to hold a {@link android.net.wifi.WifiManager.WifiLock WifiLock} as well, which you must acquire and release manually. So, when you start preparing the {@link android.media.MediaPlayer} with the remote URL, you should create and acquire the Wi-Fi lock. For example:

WifiLock wifiLock = ((WifiManager) getSystemService(Context.WIFI_SERVICE))
    .createWifiLock(WifiManager.WIFI_MODE_FULL, "mylock");

wifiLock.acquire();

When you pause or stop your media, or when you no longer need the network, you should release the lock:

wifiLock.release();

    Running as a foreground service


Services are often used for performing background tasks, such as fetching email, synchronizing data, or downloading content. In these cases, the user is not actively aware of the service's execution, and probably wouldn't even notice if some of these services were interrupted and later restarted.

But consider the case of a service that is playing music. Clearly this is a service that the user is actively aware of, and the experience would be severely affected by any interruptions. Additionally, it's a service that the user will likely wish to interact with during its execution. In this case, the service should run as a "foreground service." A foreground service holds a higher level of importance within the system—the system will almost never kill the service, because it is of immediate importance to the user. When running in the foreground, the service also must provide a status bar notification to ensure that users are aware of the running service and allow them to open an activity that can interact with the service.

In order to turn your service into a foreground service, you must create a {@link android.app.Notification Notification} for the status bar and call {@link android.app.Service#startForeground startForeground()} from the {@link android.app.Service}. For example:

String songName;
// assign the song name to songName
PendingIntent pi = PendingIntent.getActivity(getApplicationContext(), 0,
                new Intent(getApplicationContext(), MainActivity.class),
                PendingIntent.FLAG_UPDATE_CURRENT);
Notification notification = new Notification();
notification.tickerText = "Playing: " + songName;
notification.icon = R.drawable.play0;
notification.flags |= Notification.FLAG_ONGOING_EVENT;
notification.setLatestEventInfo(getApplicationContext(), "MusicPlayerSample",
                "Playing: " + songName, pi);
startForeground(NOTIFICATION_ID, notification);

While your service is running in the foreground, the notification you configured is visible in the notification area of the device. If the user selects the notification, the system invokes the {@link android.app.PendingIntent} you supplied. In the example above, it opens an activity ({@code MainActivity}).

    Figure 1 shows how your notification appears to the user:


    Figure 1. Screenshots of a foreground service's notification, showing the notification icon in the status bar (left) and the expanded view (right).


You should hold on to the "foreground service" status only while your service is actually performing something the user is actively aware of. Once that is no longer true, you should release it by calling {@link android.app.Service#stopForeground stopForeground()}:

stopForeground(true);

For more information, see the documentation about Services and Status Bar Notifications.

    Handling audio focus


Even though only one activity can run at any given time, Android is a multitasking environment. This poses a particular challenge to applications that use audio, because there is only one audio output and several media services may be competing for its use. Before Android 2.2, there was no built-in mechanism to address this issue, which could in some cases lead to a bad user experience. For example, when a user is listening to music and another application needs to notify the user of something very important, the user might not hear the notification tone due to the loud music. Starting with Android 2.2, the platform offers a way for applications to negotiate their use of the device's audio output. This mechanism is called Audio Focus.

When your application needs to output audio such as music or a notification, you should always request audio focus. Once it has focus, it can use the sound output freely, but it should always listen for focus changes. If it is notified that it has lost audio focus, it should immediately either stop the audio or lower it to a quiet level (known as "ducking"—there is a flag that indicates which one is appropriate) and resume loud playback only after it receives focus again.

Audio Focus is cooperative in nature. That is, applications are expected (and highly encouraged) to comply with the audio focus guidelines, but the rules are not enforced by the system. If an application wants to play loud music even after losing audio focus, nothing in the system will prevent that. However, the user is more likely to have a bad experience and more likely to uninstall the misbehaving application.

To request audio focus, you must call {@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} from the {@link android.media.AudioManager}, as the example below demonstrates:

AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
int result = audioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC,
    AudioManager.AUDIOFOCUS_GAIN);

if (result != AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    // could not get audio focus.
}

The first parameter to {@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} is an {@link android.media.AudioManager.OnAudioFocusChangeListener AudioManager.OnAudioFocusChangeListener}, whose {@link android.media.AudioManager.OnAudioFocusChangeListener#onAudioFocusChange onAudioFocusChange()} method is called whenever there is a change in audio focus. Therefore, you should also implement this interface on your service and activities. For example:

class MyService extends Service
                implements AudioManager.OnAudioFocusChangeListener {
    // ....
    public void onAudioFocusChange(int focusChange) {
        // Do something based on focus change...
    }
}

The focusChange parameter tells you how the audio focus has changed, and can be one of the following values (they are all constants defined in {@link android.media.AudioManager AudioManager}):

AUDIOFOCUS_GAIN: you have gained the audio focus.
AUDIOFOCUS_LOSS: you have lost the audio focus for a presumably long time, so you should stop playback and release your media resources.
AUDIOFOCUS_LOSS_TRANSIENT: you have lost focus for a short time, so you should pause playback but keep your resources, because focus is likely to return soon.
AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK: you have lost focus for a short time, but you may continue playing at an attenuated ("ducked") volume.

    Here is an example implementation:

public void onAudioFocusChange(int focusChange) {
    switch (focusChange) {
        case AudioManager.AUDIOFOCUS_GAIN:
            // resume playback
            if (mMediaPlayer == null) initMediaPlayer();
            else if (!mMediaPlayer.isPlaying()) mMediaPlayer.start();
            mMediaPlayer.setVolume(1.0f, 1.0f);
            break;

        case AudioManager.AUDIOFOCUS_LOSS:
            // Lost focus for an unbounded amount of time: stop playback and release media player
            if (mMediaPlayer.isPlaying()) mMediaPlayer.stop();
            mMediaPlayer.release();
            mMediaPlayer = null;
            break;

        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
            // Lost focus for a short time, but we have to stop
            // playback. We don't release the media player because playback
            // is likely to resume
            if (mMediaPlayer.isPlaying()) mMediaPlayer.pause();
            break;

        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
            // Lost focus for a short time, but it's ok to keep playing
            // at an attenuated level
            if (mMediaPlayer.isPlaying()) mMediaPlayer.setVolume(0.1f, 0.1f);
            break;
    }
}

Keep in mind that the audio focus APIs are available only with API level 8 (Android 2.2) and above, so if you want to support previous versions of Android, you should adopt a backward-compatibility strategy that lets you use this feature where available and fall back seamlessly where it is not.

You can achieve backward compatibility either by calling the audio focus methods through reflection or by implementing all the audio focus features in a separate class (say, AudioFocusHelper). Here is an example of such a class:

public class AudioFocusHelper implements AudioManager.OnAudioFocusChangeListener {
    AudioManager mAudioManager;
    Context mContext;

    // other fields here; you'll probably hold a reference to an interface
    // that you can use to communicate the focus changes to your Service

    public AudioFocusHelper(Context ctx /* , other arguments here */) {
        mContext = ctx;
        mAudioManager = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
        // ...
    }

    public boolean requestFocus() {
        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
            mAudioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC,
            AudioManager.AUDIOFOCUS_GAIN);
    }

    public boolean abandonFocus() {
        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
            mAudioManager.abandonAudioFocus(this);
    }

    @Override
    public void onAudioFocusChange(int focusChange) {
        // let your service know about the focus change
    }
}

You can create an instance of the AudioFocusHelper class only if you detect that the system is running API level 8 or above. For example:

if (android.os.Build.VERSION.SDK_INT >= 8) {
    mAudioFocusHelper = new AudioFocusHelper(getApplicationContext(), this);
} else {
    mAudioFocusHelper = null;
}
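The reflection route mentioned above follows a general Java pattern: look the method up at runtime and fall back when it is absent. The sketch below illustrates that pattern in plain, self-contained Java; it probes java.lang.String purely for illustration, whereas in a real app you would probe android.media.AudioManager for requestAudioFocus the same way:

```java
import java.lang.reflect.Method;

class ReflectionFallback {
    public static void main(String[] args) throws Exception {
        // Probe for a no-arg method at runtime. java.lang.String is used
        // purely for illustration; the same pattern applies to probing
        // android.media.AudioManager for the audio focus methods.
        Method m = lookup(String.class, "isEmpty");
        if (m != null) {
            // The method exists on this runtime: invoke it reflectively.
            boolean empty = (Boolean) m.invoke("");
            System.out.println("isEmpty via reflection: " + empty);
        } else {
            // Older runtime: fall back to behavior that does not need it.
            System.out.println("method unavailable, using fallback");
        }
    }

    // Returns the named no-arg method, or null if this runtime lacks it.
    static Method lookup(Class<?> cls, String name) {
        try {
            return cls.getMethod(name);
        } catch (NoSuchMethodException e) {
            return null;
        }
    }
}
```

The key design point is that the probe happens once, at runtime, so the same binary runs on platforms with and without the API; the separate-class approach shown above achieves the same goal by keeping all references to the new API inside a class that is only loaded when available.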

    Performing cleanup


As mentioned earlier, a {@link android.media.MediaPlayer} object can consume a significant amount of system resources, so you should keep it only for as long as you need it and call {@link android.media.MediaPlayer#release release()} when you are done with it. It's important to call this cleanup method explicitly rather than rely on system garbage collection, because it might take some time before the garbage collector reclaims the {@link android.media.MediaPlayer}, since it's only sensitive to memory needs and not to the shortage of other media-related resources. So, when you're using a service, you should always override the {@link android.app.Service#onDestroy onDestroy()} method to make sure you are releasing the {@link android.media.MediaPlayer}:

public class MyService extends Service {
   MediaPlayer mMediaPlayer;
   // ...

   @Override
   public void onDestroy() {
       super.onDestroy();
       if (mMediaPlayer != null) mMediaPlayer.release();
   }
}

You should always look for other opportunities to release your {@link android.media.MediaPlayer} as well, apart from releasing it when your service is shut down. For example, if you expect not to be able to play media for an extended period of time (after losing audio focus, for example), you should definitely release your existing {@link android.media.MediaPlayer} and create it again later. On the other hand, if you expect to stop playback for only a very short time, you should probably hold on to your {@link android.media.MediaPlayer} to avoid the overhead of creating and preparing it again.

    Handling the AUDIO_BECOMING_NOISY Intent


Many well-written applications that play audio automatically stop playback when an event occurs that causes the audio to become noisy (output through external speakers). For instance, this might happen when a user is listening to music through headphones and accidentally disconnects the headphones from the device. However, this behavior does not happen automatically. If you don't implement this feature, audio plays out of the device's external speakers, which might not be what the user wants.

You can ensure your app stops playing music in these situations by handling the {@link android.media.AudioManager#ACTION_AUDIO_BECOMING_NOISY} intent, for which you can register a receiver by adding the following to your manifest:

<receiver android:name=".MusicIntentReceiver">
   <intent-filter>
      <action android:name="android.media.AUDIO_BECOMING_NOISY" />
   </intent-filter>
</receiver>

This registers the MusicIntentReceiver class as a broadcast receiver for that intent. You should then implement this class:

public class MusicIntentReceiver extends android.content.BroadcastReceiver {
   @Override
   public void onReceive(Context ctx, Intent intent) {
      if (intent.getAction().equals(
                    android.media.AudioManager.ACTION_AUDIO_BECOMING_NOISY)) {
          // signal your service to stop playback
          // (via an Intent, for instance)
      }
   }
}

    Retrieving Media from a Content Resolver


Another feature that may be useful in a media player application is the ability to retrieve music that the user has on the device. You can do that by querying the {@link android.content.ContentResolver} for external media:

ContentResolver contentResolver = getContentResolver();
Uri uri = android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
Cursor cursor = contentResolver.query(uri, null, null, null, null);
if (cursor == null) {
    // query failed, handle error.
} else if (!cursor.moveToFirst()) {
    // no media on the device
} else {
    int titleColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media.TITLE);
    int idColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media._ID);
    do {
       long thisId = cursor.getLong(idColumn);
       String thisTitle = cursor.getString(titleColumn);
       // ...process entry...
    } while (cursor.moveToNext());
}

    To use this with the {@link android.media.MediaPlayer}, you can do this:

long id = /* retrieve it from somewhere */;
Uri contentUri = ContentUris.withAppendedId(
        android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, id);

mMediaPlayer = new MediaPlayer();
mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mMediaPlayer.setDataSource(getApplicationContext(), contentUri);

// ...prepare and start...

    Playing JET content


The Android platform includes a JET engine that lets you add interactive playback of JET audio content in your applications. You can create JET content for interactive playback using the JetCreator authoring application that ships with the SDK. To play and manage JET content from your application, use the {@link android.media.JetPlayer JetPlayer} class.

For a description of JET concepts and instructions on how to use the JetCreator authoring tool, see the JetCreator User Manual. The tool is available on Windows, OS X, and Linux platforms (the Linux version does not support auditioning of imported assets as the Windows and OS X versions do).

    Here's an example of how to set up JET playback from a .jet file stored on the SD card:

JetPlayer jetPlayer = JetPlayer.getJetPlayer();
jetPlayer.loadJetFile("/sdcard/level1.jet");
byte segmentId = 0;

// queue segment 5, repeat once, use General MIDI, transpose by -1 octave
jetPlayer.queueJetSegment(5, -1, 1, -1, 0, segmentId++);
// queue segment 2
jetPlayer.queueJetSegment(2, -1, 0, 0, 0, segmentId++);

jetPlayer.play();

The SDK includes an example application — JetBoy — that shows how to use {@link android.media.JetPlayer JetPlayer} to create an interactive music soundtrack in your game. It also illustrates how to use JET events to synchronize music and game logic. The application is located at <sdk>/platforms/android-1.5/samples/JetBoy.

    Performing Audio Capture


    Audio capture from the device is a bit more complicated than audio and video playback, but still fairly simple:

1. Create a new instance of {@link android.media.MediaRecorder android.media.MediaRecorder}.
2. Set the audio source using {@link android.media.MediaRecorder#setAudioSource MediaRecorder.setAudioSource()}. You will probably want to use MediaRecorder.AudioSource.MIC.
3. Set the output file format using {@link android.media.MediaRecorder#setOutputFormat MediaRecorder.setOutputFormat()}.
4. Set the output file name using {@link android.media.MediaRecorder#setOutputFile MediaRecorder.setOutputFile()}.
5. Set the audio encoder using {@link android.media.MediaRecorder#setAudioEncoder MediaRecorder.setAudioEncoder()}.
6. Call {@link android.media.MediaRecorder#prepare MediaRecorder.prepare()} on the MediaRecorder instance.
7. To start audio capture, call {@link android.media.MediaRecorder#start MediaRecorder.start()}.
8. To stop audio capture, call {@link android.media.MediaRecorder#stop MediaRecorder.stop()}.
9. When you are done with the MediaRecorder instance, call {@link android.media.MediaRecorder#release MediaRecorder.release()} to free its resources immediately.

    Example: Record audio and play the recorded audio


The example class below illustrates how to set up, start, and stop audio capture, and how to play the recorded audio file.

/*
 * The application needs to have the permission to write to external storage
 * if the output file is written to the external storage, and also the
 * permission to record audio. These permissions must be set in the
 * application's AndroidManifest.xml file, with something like:
 *
 * <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
 * <uses-permission android:name="android.permission.RECORD_AUDIO" />
 *
 */
package com.android.audiorecordtest;

import android.app.Activity;
import android.widget.LinearLayout;
import android.os.Bundle;
import android.os.Environment;
import android.view.ViewGroup;
import android.widget.Button;
import android.view.View;
import android.view.View.OnClickListener;
import android.content.Context;
import android.util.Log;
import android.media.MediaRecorder;
import android.media.MediaPlayer;

import java.io.IOException;


public class AudioRecordTest extends Activity
{
    private static final String LOG_TAG = "AudioRecordTest";
    private static String mFileName = null;

    private RecordButton mRecordButton = null;
    private MediaRecorder mRecorder = null;

    private PlayButton   mPlayButton = null;
    private MediaPlayer   mPlayer = null;

    private void onRecord(boolean start) {
        if (start) {
            startRecording();
        } else {
            stopRecording();
        }
    }

    private void onPlay(boolean start) {
        if (start) {
            startPlaying();
        } else {
            stopPlaying();
        }
    }

    private void startPlaying() {
        mPlayer = new MediaPlayer();
        try {
            mPlayer.setDataSource(mFileName);
            mPlayer.prepare();
            mPlayer.start();
        } catch (IOException e) {
            Log.e(LOG_TAG, "prepare() failed");
        }
    }

    private void stopPlaying() {
        mPlayer.release();
        mPlayer = null;
    }

    private void startRecording() {
        mRecorder = new MediaRecorder();
        mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mRecorder.setOutputFile(mFileName);
        mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

        try {
            mRecorder.prepare();
        } catch (IOException e) {
            Log.e(LOG_TAG, "prepare() failed");
        }

        mRecorder.start();
    }

    private void stopRecording() {
        mRecorder.stop();
        mRecorder.release();
        mRecorder = null;
    }

    class RecordButton extends Button {
        boolean mStartRecording = true;

        OnClickListener clicker = new OnClickListener() {
            public void onClick(View v) {
                onRecord(mStartRecording);
                if (mStartRecording) {
                    setText("Stop recording");
                } else {
                    setText("Start recording");
                }
                mStartRecording = !mStartRecording;
            }
        };

        public RecordButton(Context ctx) {
            super(ctx);
            setText("Start recording");
            setOnClickListener(clicker);
        }
    }

    class PlayButton extends Button {
        boolean mStartPlaying = true;

        OnClickListener clicker = new OnClickListener() {
            public void onClick(View v) {
                onPlay(mStartPlaying);
                if (mStartPlaying) {
                    setText("Stop playing");
                } else {
                    setText("Start playing");
                }
                mStartPlaying = !mStartPlaying;
            }
        };

        public PlayButton(Context ctx) {
            super(ctx);
            setText("Start playing");
            setOnClickListener(clicker);
        }
    }

    public AudioRecordTest() {
        mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
        mFileName += "/audiorecordtest.3gp";
    }

    @Override
    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);

        LinearLayout ll = new LinearLayout(this);
        mRecordButton = new RecordButton(this);
        ll.addView(mRecordButton,
            new LinearLayout.LayoutParams(
                ViewGroup.LayoutParams.WRAP_CONTENT,
                ViewGroup.LayoutParams.WRAP_CONTENT,
                0));
        mPlayButton = new PlayButton(this);
        ll.addView(mPlayButton,
            new LinearLayout.LayoutParams(
                ViewGroup.LayoutParams.WRAP_CONTENT,
                ViewGroup.LayoutParams.WRAP_CONTENT,
                0));
        setContentView(ll);
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mRecorder != null) {
            mRecorder.release();
            mRecorder = null;
        }

        if (mPlayer != null) {
            mPlayer.release();
            mPlayer = null;
        }
    }
}

The Android multimedia framework includes support for capturing and playing audio, video, and images in a variety of common media types, so that you can easily integrate them into your applications. You can play audio or video from media files stored in your application's resources, from standalone files in the file system, or from a data stream arriving over a network connection, all using the {@link android.media.MediaPlayer} or {@link android.media.JetPlayer} APIs. You can also record audio and video and take pictures using the {@link android.media.MediaRecorder} and {@link android.hardware.Camera} APIs, if supported by the device hardware.

    The following topics show you how to use the Android framework to implement multimedia capture +and playback.

    + +
    +
    MediaPlayer
    +
    How to play audio and video in your application.
    + +
    JetPlayer
    +
    How to play interactive audio and video in your application using content created with +JetCreator.
    + +
    Camera
    +
    How to use a device camera to take pictures or video in your application.
    + +
    Audio +Capture
    +
    How to record sound in your application.
    +
    \ No newline at end of file diff --git a/docs/html/guide/topics/media/jetplayer.jd b/docs/html/guide/topics/media/jetplayer.jd new file mode 100644 index 0000000000000..f3d55f90f4aeb --- /dev/null +++ b/docs/html/guide/topics/media/jetplayer.jd @@ -0,0 +1,70 @@ +page.title=JetPlayer +parent.title=Multimedia and Camera +parent.link=index.html +@jd:body + +
    +
    + +

    In this document

    +
      +
    1. Playing JET content +
    + +

    Key classes

    +
      +
    1. {@link android.media.JetPlayer}
    2. +
    + +

    Related Samples

    +
      +
    1. JetBoy
    2. +
    + +

    See also

    +
      +
    1. JetCreator User +Manual
    2. +
    3. Android Supported Media Formats
    4. +
    5. Data Storage
    6. +
    7. MediaPlayer
    8. +
    + +
    +
    + +

    The Android platform includes a JET engine that lets you add interactive playback of JET audio +content in your applications. You can create JET content for interactive playback using the +JetCreator authoring application that ships with the SDK. To play and manage JET content from your +application, use the {@link android.media.JetPlayer JetPlayer} class.

    + + +

    Playing JET content

    + +

This section shows you how to write, set up, and play JET content. For a description of JET concepts and instructions on how to use the JetCreator authoring tool, see the JetCreator User Manual. The tool is available on Windows, OS X, and Linux platforms (the Linux version does not support auditioning of imported assets, unlike the Windows and OS X versions).

    + +

    Here's an example of how to set up JET playback from a .jet file stored on the SD +card:

    + +
    +JetPlayer jetPlayer = JetPlayer.getJetPlayer();
    +jetPlayer.loadJetFile("/sdcard/level1.jet");
    +byte segmentId = 0;
    +
    +// queue segment 5, repeat once, use General MIDI, transpose by -1 octave
    +jetPlayer.queueJetSegment(5, -1, 1, -1, 0, segmentId++);
    +// queue segment 2
    +jetPlayer.queueJetSegment(2, -1, 0, 0, 0, segmentId++);
    +
    +jetPlayer.play();
    +
    + +The SDK includes an example application — JetBoy — that shows how to use {@link +android.media.JetPlayer JetPlayer} to create an interactive music soundtrack in your game. It also +illustrates how to use JET events to synchronize music and game logic. The application is located at +JetBoy.

    \ No newline at end of file diff --git a/docs/html/guide/topics/media/mediaplayer.jd b/docs/html/guide/topics/media/mediaplayer.jd new file mode 100644 index 0000000000000..b3ca7dd4c7bbe --- /dev/null +++ b/docs/html/guide/topics/media/mediaplayer.jd @@ -0,0 +1,747 @@ +page.title=Media Playback +parent.title=Multimedia and Camera +parent.link=index.html +@jd:body + + + +

The Android multimedia framework includes support for playing a variety of common media types, so that you can easily integrate audio, video, and images into your applications. You can play audio or video from media files stored in your application's resources (raw resources), from standalone files in the filesystem, or from a data stream arriving over a network connection, all using {@link android.media.MediaPlayer} APIs.

    + +

    This document shows you how to write a media-playing application that interacts with the user and +the system in order to obtain good performance and a pleasant user experience.

    + +

    Note: You can play back the audio data only to the standard output +device. Currently, that is the mobile device speaker or a Bluetooth headset. You cannot play sound +files in the conversation audio during a call.

    + +

    The Basics

    +

    The following classes are used to play sound and video in the Android framework:

    + +
    +
    {@link android.media.MediaPlayer}
    +
    This class is the primary API for playing sound and video.
    +
    {@link android.media.AudioManager}
    +
    This class manages audio sources and audio output on a device.
    +
    + +

    Manifest Declarations

    +

    Before starting development on your application using MediaPlayer, make sure your manifest has +the appropriate declarations to allow use of related features.
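For instance, if your application streams media over the network and keeps the CPU awake during playback with {@link android.media.MediaPlayer#setWakeMode setWakeMode()}, the manifest would include declarations such as the following (a sketch; request only the permissions your application actually uses):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android" ...>
    <!-- Required to stream media from the network -->
    <uses-permission android:name="android.permission.INTERNET" />
    <!-- Required to keep the CPU awake via MediaPlayer.setWakeMode() -->
    <uses-permission android:name="android.permission.WAKE_LOCK" />
    ...
</manifest>
```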

    + + + +

    Using MediaPlayer

    +

    One of the most important components of the media framework is the +{@link android.media.MediaPlayer MediaPlayer} +class. An object of this class can fetch, decode, and play both audio and video +with minimal setup. It supports several different media sources such as: +

1. Local resources
2. Internal URIs, such as one you might obtain from a Content Resolver
3. External URLs (streaming)

    + +

    For a list of media formats that Android supports, +see the Android Supported Media +Formats document.

    + +

    Here is an example +of how to play audio that's available as a local raw resource (saved in your application's +{@code res/raw/} directory):

    + +
    MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.sound_file_1);
    +mediaPlayer.start(); // no need to call prepare(); create() does that for you
    +
    + +

    In this case, a "raw" resource is a file that the system does not +try to parse in any particular way. However, the content of this resource should not +be raw audio. It should be a properly encoded and formatted media file in one +of the supported formats.

    + +

    And here is how you might play from a URI available locally in the system +(that you obtained through a Content Resolver, for instance):

    + +
    Uri myUri = ....; // initialize Uri here
    +MediaPlayer mediaPlayer = new MediaPlayer();
    +mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    +mediaPlayer.setDataSource(getApplicationContext(), myUri);
    +mediaPlayer.prepare();
    +mediaPlayer.start();
    + +

    Playing from a remote URL via HTTP streaming looks like this:

    + +
    String url = "http://........"; // your URL here
    +MediaPlayer mediaPlayer = new MediaPlayer();
    +mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    +mediaPlayer.setDataSource(url);
    +mediaPlayer.prepare(); // might take long! (for buffering, etc)
    +mediaPlayer.start();
    + +

    Note: +If you're passing a URL to stream an online media file, the file must be capable of +progressive download.

    + +

    Caution: You must either catch or pass +{@link java.lang.IllegalArgumentException} and {@link java.io.IOException} when using +{@link android.media.MediaPlayer#setDataSource setDataSource()}, because +the file you are referencing might not exist.

    + +

    Asynchronous Preparation

    + +

Using {@link android.media.MediaPlayer MediaPlayer} can be straightforward in principle. However, it's important to keep in mind that a few more things are necessary to integrate it correctly with a typical Android application. For example, the call to {@link android.media.MediaPlayer#prepare prepare()} can take a long time to execute, because it might involve fetching and decoding media data. So, as is the case with any method that may take a long time to execute, you should never call it from your application's UI thread. Doing so causes the UI to hang until the method returns, which is a very bad user experience and can cause an ANR (Application Not Responding) error. Even if you expect your resource to load quickly, remember that anything that takes more than a tenth of a second to respond in the UI causes a noticeable pause and gives the user the impression that your application is slow.

    + +

    To avoid hanging your UI thread, spawn another thread to +prepare the {@link android.media.MediaPlayer} and notify the main thread when done. However, while +you could write the threading logic +yourself, this pattern is so common when using {@link android.media.MediaPlayer} that the framework +supplies a convenient way to accomplish this task by using the +{@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. This method +starts preparing the media in the background and returns immediately. When the media +is done preparing, the {@link android.media.MediaPlayer.OnPreparedListener#onPrepared onPrepared()} +method of the {@link android.media.MediaPlayer.OnPreparedListener +MediaPlayer.OnPreparedListener}, configured through +{@link android.media.MediaPlayer#setOnPreparedListener setOnPreparedListener()} is called.
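For example, the remote-URL snippet above could be adapted to prepare asynchronously (a sketch; this assumes mediaPlayer has already been given a data source with {@link android.media.MediaPlayer#setDataSource setDataSource()}):

```java
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start(); // invoked on the main thread once the media is ready
    }
});
mediaPlayer.prepareAsync(); // returns immediately; preparation happens in the background
```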

    + +

    Managing State

    + +

Another aspect of a {@link android.media.MediaPlayer} that you should keep in mind is that it's state-based. That is, the {@link android.media.MediaPlayer} has an internal state that you must always be aware of when writing your code, because certain operations are only valid when the player is in specific states. If you perform an operation while in the wrong state, the system may throw an exception or cause other undesirable behaviors.

    + +

The documentation for the {@link android.media.MediaPlayer MediaPlayer} class shows a complete state diagram that clarifies which methods move the {@link android.media.MediaPlayer} from one state to another. For example, when you create a new {@link android.media.MediaPlayer}, it is in the Idle state. At that point, you should initialize it by calling {@link android.media.MediaPlayer#setDataSource setDataSource()}, bringing it to the Initialized state. After that, you have to prepare it using either the {@link android.media.MediaPlayer#prepare prepare()} or {@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. When the {@link android.media.MediaPlayer} is done preparing, it enters the Prepared state, which means you can call {@link android.media.MediaPlayer#start start()} to make it play the media. At that point, as the diagram illustrates, you can move between the Started, Paused, and PlaybackCompleted states by calling methods such as {@link android.media.MediaPlayer#start start()}, {@link android.media.MediaPlayer#pause pause()}, and {@link android.media.MediaPlayer#seekTo seekTo()}, among others. When you call {@link android.media.MediaPlayer#stop stop()}, however, notice that you cannot call {@link android.media.MediaPlayer#start start()} again until you prepare the {@link android.media.MediaPlayer} again.

    + +

    Always keep the state diagram +in mind when writing code that interacts with a +{@link android.media.MediaPlayer} object, because calling its methods from the wrong state is a +common cause of bugs.
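These rules can be summarized in plain Java (a sketch for illustration only; this enum is not part of the Android API):

```java
// Illustrative model of a few MediaPlayer state rules: start() is valid
// only from the Prepared, Started, Paused, or PlaybackCompleted states,
// and a Stopped player must be prepared again before it can start.
class PlayerStateSketch {
    enum State { IDLE, INITIALIZED, PREPARED, STARTED, PAUSED, STOPPED, PLAYBACK_COMPLETED }

    static boolean canStart(State s) {
        switch (s) {
            case PREPARED:
            case STARTED:
            case PAUSED:
            case PLAYBACK_COMPLETED:
                return true;
            default:
                return false; // IDLE, INITIALIZED, STOPPED: prepare() first
        }
    }
}
```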

    + +

    Releasing the MediaPlayer

    + +

    A {@link android.media.MediaPlayer MediaPlayer} can consume valuable +system resources. +Therefore, you should always take extra precautions to make sure you are not +hanging on to a {@link android.media.MediaPlayer} instance longer than necessary. When you +are done with it, you should always call +{@link android.media.MediaPlayer#release release()} to make sure any +system resources allocated to it are properly released. For example, if you are +using a {@link android.media.MediaPlayer} and your activity receives a call to {@link +android.app.Activity#onStop onStop()}, you must release the {@link android.media.MediaPlayer}, +because it +makes little sense to hold on to it while your activity is not interacting with +the user (unless you are playing media in the background, which is discussed in the next section). +When your activity is resumed or restarted, of course, you need to +create a new {@link android.media.MediaPlayer} and prepare it again before resuming playback.

    + +

    Here's how you should release and then nullify your {@link android.media.MediaPlayer}:

    +
    +mediaPlayer.release();
    +mediaPlayer = null;
    +
    + +

    As an example, consider the problems that could happen if you +forgot to release the {@link android.media.MediaPlayer} when your activity is stopped, but create a +new one when the activity starts again. As you may know, when the user changes the +screen orientation (or changes the device configuration in another way), +the system handles that by restarting the activity (by default), so you might quickly +consume all of the system resources as the user +rotates the device back and forth between portrait and landscape, because at each +orientation change, you create a new {@link android.media.MediaPlayer} that you never +release. (For more information about runtime restarts, see Handling Runtime Changes.)
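To guard against this kind of leak, release the player in your activity's {@link android.app.Activity#onStop onStop()} callback (a sketch; mediaPlayer is assumed to be a field of the activity, and this assumes you are not playing media in the background):

```java
@Override
protected void onStop() {
    super.onStop();
    if (mediaPlayer != null) {
        mediaPlayer.release(); // free the codec and other system resources
        mediaPlayer = null;
    }
}
```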

    + +

    You may be wondering what happens if you want to continue playing +"background media" even when the user leaves your activity, much in the same +way that the built-in Music application behaves. In this case, what you need is +a {@link android.media.MediaPlayer MediaPlayer} controlled by a {@link android.app.Service}, as +discussed in Using a Service with MediaPlayer.

    + +

    Using a Service with MediaPlayer

    + +

If you want your media to play in the background even when your application is not onscreen—that is, you want it to continue playing while the user is interacting with other applications—then you must start a {@link android.app.Service Service} and control the {@link android.media.MediaPlayer MediaPlayer} instance from there. You should be careful about this setup, because the user and the system have expectations about how an application running a background service should interact with the rest of the system. If your application does not fulfill those expectations, the user may have a bad experience. This section describes the main issues that you should be aware of and offers suggestions about how to approach them.

    + + +

    Running asynchronously

    + +

    First of all, like an {@link android.app.Activity Activity}, all work in a +{@link android.app.Service Service} is done in a single thread by +default—in fact, if you're running an activity and a service from the same application, they +use the same thread (the "main thread") by default. Therefore, services need to +process incoming intents quickly +and never perform lengthy computations when responding to them. If any heavy +work or blocking calls are expected, you must do those tasks asynchronously: either from +another thread you implement yourself, or using the framework's many facilities +for asynchronous processing.

    + +

    For instance, when using a {@link android.media.MediaPlayer} from your main thread, +you should call {@link android.media.MediaPlayer#prepareAsync prepareAsync()} rather than +{@link android.media.MediaPlayer#prepare prepare()}, and implement +a {@link android.media.MediaPlayer.OnPreparedListener MediaPlayer.OnPreparedListener} +in order to be notified when the preparation is complete and you can start playing. +For example:

    + +
    +public class MyService extends Service implements MediaPlayer.OnPreparedListener {
+    private static final String ACTION_PLAY = "com.example.action.PLAY";
    +    MediaPlayer mMediaPlayer = null;
    +
    +    public int onStartCommand(Intent intent, int flags, int startId) {
    +        ...
    +        if (intent.getAction().equals(ACTION_PLAY)) {
    +            mMediaPlayer = ... // initialize it here
    +            mMediaPlayer.setOnPreparedListener(this);
    +            mMediaPlayer.prepareAsync(); // prepare async to not block main thread
    +        }
    +    }
    +
    +    /** Called when MediaPlayer is ready */
    +    public void onPrepared(MediaPlayer player) {
    +        player.start();
    +    }
    +}
    +
    + + +

    Handling asynchronous errors

    + +

With synchronous operations, errors are normally signaled with an exception or an error code, but whenever you use asynchronous resources, you should make sure your application is notified of errors appropriately. In the case of a {@link android.media.MediaPlayer MediaPlayer}, you can accomplish this by implementing a {@link android.media.MediaPlayer.OnErrorListener MediaPlayer.OnErrorListener} and setting it in your {@link android.media.MediaPlayer} instance:

    + +
    +public class MyService extends Service implements MediaPlayer.OnErrorListener {
    +    MediaPlayer mMediaPlayer;
    +
    +    public void initMediaPlayer() {
    +        // ...initialize the MediaPlayer here...
    +
    +        mMediaPlayer.setOnErrorListener(this);
    +    }
    +
    +    @Override
    +    public boolean onError(MediaPlayer mp, int what, int extra) {
    +        // ... react appropriately ...
    +        // The MediaPlayer has moved to the Error state, must be reset!
+        return true; // true indicates the error was handled; returning false
+                      // would cause the OnCompletionListener to be called
+    }
    +}
    +
    + +

    It's important to remember that when an error occurs, the {@link android.media.MediaPlayer} +moves to the Error state (see the documentation for the +{@link android.media.MediaPlayer MediaPlayer} class for the full state diagram) +and you must reset it before you can use it again. + + +

    Using wake locks

    + +

When your application plays media in the background, the device may go to sleep while your service is running. Because the Android system tries to conserve battery while the device is sleeping, it tries to shut off any of the phone's features that are not necessary, including the CPU and the WiFi hardware. However, if your service is playing or streaming music, you want to prevent the system from interfering with your playback.

    + +

    In order to ensure that your service continues to run under +those conditions, you have to use "wake locks." A wake lock is a way to signal to +the system that your application is using some feature that should +stay available even if the phone is idle.

    + +

Note: You should always use wake locks sparingly and hold them only for as long as truly necessary, because they significantly reduce the battery life of the device.

    + +

    To ensure that the CPU continues running while your {@link android.media.MediaPlayer} is +playing, call the {@link android.media.MediaPlayer#setWakeMode +setWakeMode()} method when initializing your {@link android.media.MediaPlayer}. Once you do, +the {@link android.media.MediaPlayer} holds the specified lock while playing and releases the lock +when paused or stopped:

    + +
    +mMediaPlayer = new MediaPlayer();
    +// ... other initialization here ...
    +mMediaPlayer.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK);
    +
    + +

    However, the wake lock acquired in this example guarantees only that the CPU remains awake. If +you are streaming media over the +network and you are using Wi-Fi, you probably want to hold a +{@link android.net.wifi.WifiManager.WifiLock WifiLock} as +well, which you must acquire and release manually. So, when you start preparing the +{@link android.media.MediaPlayer} with the remote URL, you should create and acquire the Wi-Fi lock. +For example:

    + +
    +WifiLock wifiLock = ((WifiManager) getSystemService(Context.WIFI_SERVICE))
    +    .createWifiLock(WifiManager.WIFI_MODE_FULL, "mylock");
    +
    +wifiLock.acquire();
    +
    + +

    When you pause or stop your media, or when you no longer need the +network, you should release the lock:

    + +
    +wifiLock.release();
    +
    + + +

    Running as a foreground service

    + +

Services are often used for performing background tasks, such as fetching email, synchronizing data, or downloading content. In these cases, the user is not actively aware of the service's execution, and probably wouldn't even notice if some of these services were interrupted and later restarted.

    + +

    But consider the case of a service that is playing music. Clearly this is a service that the user +is actively aware of and the experience would be severely affected by any interruptions. +Additionally, it's a service that the user will likely wish to interact with during its execution. +In this case, the service should run as a "foreground service." A +foreground service holds a higher level of importance within the system—the system will +almost never kill the service, because it is of immediate importance to the user. When running +in the foreground, the service also must provide a status bar notification to ensure that users are +aware of the running service and allow them to open an activity that can interact with the +service.

    + +

    In order to turn your service into a foreground service, you must create a +{@link android.app.Notification Notification} for the status bar and call +{@link android.app.Service#startForeground startForeground()} from the {@link +android.app.Service}. For example:

    + +
    String songName;
    +// assign the song name to songName
    +PendingIntent pi = PendingIntent.getActivity(getApplicationContext(), 0,
    +                new Intent(getApplicationContext(), MainActivity.class),
    +                PendingIntent.FLAG_UPDATE_CURRENT);
    +Notification notification = new Notification();
+notification.tickerText = songName;
    +notification.icon = R.drawable.play0;
    +notification.flags |= Notification.FLAG_ONGOING_EVENT;
    +notification.setLatestEventInfo(getApplicationContext(), "MusicPlayerSample",
    +                "Playing: " + songName, pi);
+startForeground(NOTIFICATION_ID, notification); // NOTIFICATION_ID is an app-defined constant
    +
    + +

    While your service is running in the foreground, the notification you +configured is visible in the notification area of the device. If the user +selects the notification, the system invokes the {@link android.app.PendingIntent} you supplied. In +the example above, it opens an activity ({@code MainActivity}).

    + +

    Figure 1 shows how your notification appears to the user:

    + + +   + +

    Figure 1. Screenshots of a foreground service's +notification, showing the notification icon in the status bar (left) and the expanded view +(right).

    + +

    You should only hold on to the "foreground service" status while your +service is actually performing something the user is actively aware of. Once +that is no longer true, you should release it by calling +{@link android.app.Service#stopForeground stopForeground()}:

    + +
    +stopForeground(true);
    +
    + +

    For more information, see the documentation about Services and +Status Bar Notifications.

    + + +

    Handling audio focus

    + +

    Even though only one activity can run at any given time, Android is a +multi-tasking environment. This poses a particular challenge to applications +that use audio, because there is only one audio output and there may be several +media services competing for its use. Before Android 2.2, there was no built-in +mechanism to address this issue, which could in some cases lead to a bad user +experience. For example, when a user is listening to +music and another application needs to notify the user of something very important, +the user might not hear the notification tone due to the loud music. Starting with +Android 2.2, the platform offers a way for applications to negotiate their +use of the device's audio output. This mechanism is called Audio Focus.

    + +

    When your application needs to output audio such as music or a notification, +you should always request audio focus. Once it has focus, it can use the sound output freely, but it +should +always listen for focus changes. If it is notified that it has lost the audio +focus, it should immediately either kill the audio or lower it to a quiet level +(known as "ducking"—there is a flag that indicates which one is appropriate) and only resume +loud playback after it receives focus again.

    + +

    Audio Focus is cooperative in nature. That is, applications are expected +(and highly encouraged) to comply with the audio focus guidelines, but the +rules are not enforced by the system. If an application wants to play loud +music even after losing audio focus, nothing in the system will prevent that. +However, the user is more likely to have a bad experience and will be more +likely to uninstall the misbehaving application.

    + +

    To request audio focus, you must call +{@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} from the {@link +android.media.AudioManager}, as the example below demonstrates:

    + +
    +AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    +int result = audioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC,
    +    AudioManager.AUDIOFOCUS_GAIN);
    +
    +if (result != AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    +    // could not get audio focus.
    +}
    +
    + +

    The first parameter to {@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} +is an {@link android.media.AudioManager.OnAudioFocusChangeListener +AudioManager.OnAudioFocusChangeListener}, +whose {@link android.media.AudioManager.OnAudioFocusChangeListener#onAudioFocusChange +onAudioFocusChange()} method is called whenever there is a change in audio focus. Therefore, you +should also implement this interface on your service and activities. For example:

    + +
    +class MyService extends Service
    +                implements AudioManager.OnAudioFocusChangeListener {
    +    // ....
    +    public void onAudioFocusChange(int focusChange) {
    +        // Do something based on focus change...
    +    }
    +}
    +
    + +

    The focusChange parameter tells you how the audio focus has changed, and +can be one of the following values (they are all constants defined in +{@link android.media.AudioManager AudioManager}):

1. AUDIOFOCUS_GAIN: You have gained the audio focus.
2. AUDIOFOCUS_LOSS: You have lost the audio focus for a presumably long time. You must stop all audio playback.
3. AUDIOFOCUS_LOSS_TRANSIENT: You have temporarily lost audio focus, but should receive it back shortly. You must stop all audio playback, but you can keep your resources because you will probably get focus back shortly.
4. AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK: You have temporarily lost audio focus, but you are allowed to continue to play audio quietly (at a low volume) instead of stopping completely.

    Here is an example implementation:

    + +
    +public void onAudioFocusChange(int focusChange) {
    +    switch (focusChange) {
    +        case AudioManager.AUDIOFOCUS_GAIN:
    +            // resume playback
    +            if (mMediaPlayer == null) initMediaPlayer();
    +            else if (!mMediaPlayer.isPlaying()) mMediaPlayer.start();
    +            mMediaPlayer.setVolume(1.0f, 1.0f);
    +            break;
    +
    +        case AudioManager.AUDIOFOCUS_LOSS:
    +            // Lost focus for an unbounded amount of time: stop playback and release media player
    +            if (mMediaPlayer.isPlaying()) mMediaPlayer.stop();
    +            mMediaPlayer.release();
    +            mMediaPlayer = null;
    +            break;
    +
    +        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
    +            // Lost focus for a short time, but we have to stop
    +            // playback. We don't release the media player because playback
    +            // is likely to resume
    +            if (mMediaPlayer.isPlaying()) mMediaPlayer.pause();
    +            break;
    +
    +        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
    +            // Lost focus for a short time, but it's ok to keep playing
    +            // at an attenuated level
    +            if (mMediaPlayer.isPlaying()) mMediaPlayer.setVolume(0.1f, 0.1f);
    +            break;
    +    }
    +}
    +
    + +

    Keep in mind that the audio focus APIs are available only with API level 8 (Android 2.2) +and above, so if you want to support previous +versions of Android, you should adopt a backward compatibility strategy that +allows you to use this feature if available, and fall back seamlessly if not.

    + +

    You can achieve backward compatibility either by calling the audio focus methods by reflection +or by implementing all the audio focus features in a separate class (say, +AudioFocusHelper). Here is an example of such a class:

    + +
    +public class AudioFocusHelper implements AudioManager.OnAudioFocusChangeListener {
    +    AudioManager mAudioManager;
    +
    +    // other fields here, you'll probably hold a reference to an interface
    +    // that you can use to communicate the focus changes to your Service
    +
+    public AudioFocusHelper(Context ctx /* , other arguments here */) {
+        mAudioManager = (AudioManager) ctx.getSystemService(Context.AUDIO_SERVICE);
    +        // ...
    +    }
    +
    +    public boolean requestFocus() {
    +        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
+            mAudioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC,
    +            AudioManager.AUDIOFOCUS_GAIN);
    +    }
    +
    +    public boolean abandonFocus() {
    +        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
    +            mAudioManager.abandonAudioFocus(this);
    +    }
    +
    +    @Override
    +    public void onAudioFocusChange(int focusChange) {
    +        // let your service know about the focus change
    +    }
    +}
    +
    + + +

You can create an instance of the AudioFocusHelper class only if you detect that the system is running API level 8 or above. For example:

    + +
    +if (android.os.Build.VERSION.SDK_INT >= 8) {
    +    mAudioFocusHelper = new AudioFocusHelper(getApplicationContext(), this);
    +} else {
    +    mAudioFocusHelper = null;
    +}
    +
    + + +

    Performing cleanup

    + +

As mentioned earlier, a {@link android.media.MediaPlayer} object can consume a significant amount of system resources, so you should keep it only for as long as you need it and call {@link android.media.MediaPlayer#release release()} when you are done with it. It's important to call this cleanup method explicitly rather than rely on system garbage collection, because it might take some time before the garbage collector reclaims the {@link android.media.MediaPlayer}, as it is only sensitive to memory needs and not to shortages of other media-related resources. So, when you're using a service, you should always override the {@link android.app.Service#onDestroy onDestroy()} method to make sure you are releasing the {@link android.media.MediaPlayer}:

    + +
    +public class MyService extends Service {
    +   MediaPlayer mMediaPlayer;
    +   // ...
    +
    +   @Override
    +   public void onDestroy() {
    +       if (mMediaPlayer != null) mMediaPlayer.release();
    +   }
    +}
    +
    + +

You should always look for other opportunities to release your {@link android.media.MediaPlayer} as well, in addition to releasing it when your service is shut down. For example, if you expect not to be able to play media for an extended period of time (after losing audio focus, for example), you should definitely release your existing {@link android.media.MediaPlayer} and create it again later. On the other hand, if you only expect to stop playback for a very short time, you should probably hold on to your {@link android.media.MediaPlayer} to avoid the overhead of creating and preparing it again.

    + + + +

    Handling the AUDIO_BECOMING_NOISY Intent

    + +

Many well-written applications that play audio automatically stop playback when an event occurs that causes the audio to become noisy (output through the external speakers). For instance, this might happen when a user is listening to music through headphones and accidentally disconnects the headphones from the device. However, this behavior does not happen automatically. If you don't implement this feature, audio plays out of the device's external speakers, which might not be what the user wants.

    + +

    You can ensure your app stops playing music in these situations by handling +the {@link android.media.AudioManager#ACTION_AUDIO_BECOMING_NOISY} intent, for which you can +register a receiver by +adding the following to your manifest:

<pre>
<receiver android:name=".MusicIntentReceiver">
   <intent-filter>
      <action android:name="android.media.AUDIO_BECOMING_NOISY" />
   </intent-filter>
</receiver>
</pre>

    This registers the MusicIntentReceiver class as a broadcast receiver for that
intent. You should then implement this class:

<pre>
public class MusicIntentReceiver extends android.content.BroadcastReceiver {
   @Override
   public void onReceive(Context ctx, Intent intent) {
      if (intent.getAction().equals(
                    android.media.AudioManager.ACTION_AUDIO_BECOMING_NOISY)) {
          // signal your service to stop playback
          // (via an Intent, for instance)
      }
   }
}
</pre>
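    One possible way to deliver that signal is sketched below, assuming playback lives in a
service like the MyService class shown earlier; the "com.example.action.STOP_PLAYBACK" action
string is hypothetical and would be defined by your app:

```java
// Sketch: the receiver starts the service with a custom stop action
// (the action string here is hypothetical).
public void onReceive(Context ctx, Intent intent) {
   if (intent.getAction().equals(
                 android.media.AudioManager.ACTION_AUDIO_BECOMING_NOISY)) {
       Intent stop = new Intent(ctx, MyService.class);
       stop.setAction("com.example.action.STOP_PLAYBACK");
       ctx.startService(stop);
   }
}

// ...and the service checks for that action when it is started:
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    if ("com.example.action.STOP_PLAYBACK".equals(intent.getAction())) {
        if (mMediaPlayer != null && mMediaPlayer.isPlaying()) {
            mMediaPlayer.pause(); // or stop(), depending on your design
        }
    }
    return START_NOT_STICKY;
}
```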

    Retrieving Media from a Content Resolver


    Another feature that may be useful in a media player application is the ability to
retrieve music that the user has on the device. You can do that by querying the {@link
android.content.ContentResolver} for external media:

<pre>
ContentResolver contentResolver = getContentResolver();
Uri uri = android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
Cursor cursor = contentResolver.query(uri, null, null, null, null);
if (cursor == null) {
    // query failed, handle error.
} else if (!cursor.moveToFirst()) {
    // no media on the device
} else {
    int titleColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media.TITLE);
    int idColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media._ID);
    do {
       long thisId = cursor.getLong(idColumn);
       String thisTitle = cursor.getString(titleColumn);
       // ...process entry...
    } while (cursor.moveToNext());
}
if (cursor != null) {
    cursor.close(); // release the cursor's resources when you are done
}
</pre>

    To use this with the {@link android.media.MediaPlayer}, you can do this:

<pre>
long id = /* retrieve it from somewhere */;
Uri contentUri = ContentUris.withAppendedId(
        android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, id);

mMediaPlayer = new MediaPlayer();
mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mMediaPlayer.setDataSource(getApplicationContext(), contentUri);

// ...prepare and start...
</pre>
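    The final prepare-and-start step can be filled in with an asynchronous prepare, so the
calling thread is not blocked while the content is opened (a sketch):

```java
// Prepare asynchronously and start playback once the player is ready.
mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    public void onPrepared(MediaPlayer player) {
        player.start();
    }
});
mMediaPlayer.prepareAsync();
```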