diff --git a/media/java/android/media/MediaCodec.java b/media/java/android/media/MediaCodec.java
index 6cab56c1d4669..2de69e7bc1dea 100644
--- a/media/java/android/media/MediaCodec.java
+++ b/media/java/android/media/MediaCodec.java
@@ -75,12 +75,12 @@ import java.util.Map;
 
 Compressed Buffers
 
 Input buffers (for decoders) and output buffers (for encoders) contain compressed data according
- to the {@linkplain MediaFormat#KEY_MIME format's type}. For video types this is a single
+ to the {@linkplain MediaFormat#KEY_MIME format's type}. For video types this is normally a single
 compressed video frame. For audio data this is normally a single access unit (an encoded audio
 segment typically containing a few milliseconds of audio as dictated by the format type), but
 this requirement is slightly relaxed in that a buffer may contain multiple encoded access units
 of audio. In either case, buffers do not start or end on arbitrary byte boundaries, but rather on
- frame/access unit boundaries.
+ frame/access unit boundaries unless they are flagged with {@link #BUFFER_FLAG_PARTIAL_FRAME}.
 
 Raw Audio Buffers
 
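For context, the flag referenced by the new javadoc is set when queueing input buffers. A minimal sketch (hypothetical helper, not part of this patch; assumes API level 26+, where `BUFFER_FLAG_PARTIAL_FRAME` was introduced) of feeding a decoder one access unit split across two input buffers:

```java
// Hypothetical sketch (not from this patch): an access unit split across two
// input buffers. Every chunk except the last carries BUFFER_FLAG_PARTIAL_FRAME;
// the absence of the flag on the final chunk marks the end of the access unit.
import android.media.MediaCodec;

import java.nio.ByteBuffer;

final class PartialFrameExample {
    static void queueSplitAccessUnit(MediaCodec codec, byte[] firstHalf,
            byte[] secondHalf, long presentationTimeUs) {
        // First chunk: not a complete access unit, so flag it as partial.
        int index = codec.dequeueInputBuffer(10_000 /* timeoutUs */);
        if (index >= 0) {
            ByteBuffer buffer = codec.getInputBuffer(index);
            buffer.put(firstHalf);
            codec.queueInputBuffer(index, 0, firstHalf.length,
                    presentationTimeUs, MediaCodec.BUFFER_FLAG_PARTIAL_FRAME);
        }
        // Final chunk: same timestamp, no partial flag.
        index = codec.dequeueInputBuffer(10_000 /* timeoutUs */);
        if (index >= 0) {
            ByteBuffer buffer = codec.getInputBuffer(index);
            buffer.put(secondHalf);
            codec.queueInputBuffer(index, 0, secondHalf.length,
                    presentationTimeUs, 0);
        }
    }
}
```

Support for this flag is optional; a real caller would first check the codec's capabilities (the feature is advertised per-codec) before splitting access units this way.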
diff --git a/media/java/android/media/MediaSync.java b/media/java/android/media/MediaSync.java
index 5522d362bc093..799f4bf4f8661 100644
--- a/media/java/android/media/MediaSync.java
+++ b/media/java/android/media/MediaSync.java
@@ -35,7 +35,7 @@ import java.util.LinkedList;
 import java.util.List;
 
 /**
- * MediaSync class can be used to synchronously playback audio and video streams.
+ * MediaSync class can be used to synchronously play audio and video streams.
  * It can be used to play audio-only or video-only stream, too.
  *
  *

MediaSync is generally used like this: