docs: Created content describing high-performance audio.
am: 44aff87b6d
* commit '44aff87b6decadc91f0725b70cfb7a4d1da67b84':
docs: Created content describing high-performance audio.
Change-Id: Id9d72ce45dff409f88d5a0d6e179eed1001dc829
This commit is contained in:
@@ -60,6 +60,16 @@ toc:
path: /ndk/guides/audio/basics.html
- title: OpenSL ES for Android
path: /ndk/guides/audio/opensl-for-android.html
- title: Audio Input Latency
path: /ndk/guides/audio/input-latency.html
- title: Audio Output Latency
path: /ndk/guides/audio/output-latency.html
- title: Floating-Point Audio
path: /ndk/guides/audio/floating-point.html
- title: Sample Rates
path: /ndk/guides/audio/sample-rates.html
- title: OpenSL ES Programming Notes
path: /ndk/guides/audio/opensl-prog-notes.html

- title: Vulkan
path: /ndk/guides/graphics/index.html
@@ -1,4 +1,4 @@
-page.title=OpenSL ES™ Basics
+page.title=High-Performance Audio Basics
@jd:body

<div id="qv-wrapper">
@@ -6,26 +6,51 @@ page.title=OpenSL ES™ Basics
<h2>On this page</h2>

<ol>
<li><a href="#overview">Building Great Audio Apps</a></li>
<li><a href="#adding">Adding OpenSL ES to Your App</a></li>
<li><a href="#building">Building and Debugging</a></li>
<li><a href="#power">Audio Power Consumption</a></li>
<li><a href="#samples">Samples</a></li>
</ol>
</div>
</div>

<a href="https://www.youtube.com/watch?v=d3kfEeMZ65c" class="notice-developers-video">
<div>
<h3>Video</h3>
<p>Google I/O 2013 - High Performance Audio</p>
</div>
</a>

<p>
-The Khronos Group's OpenSL ES standard exposes audio features
+The Khronos Group's OpenSL ES™ standard exposes audio features
similar to those in the {@link android.media.MediaPlayer} and {@link android.media.MediaRecorder}
APIs in the Android Java framework. OpenSL ES provides a C language interface as well as
C++ bindings, allowing you to call it from code written in either language.
</p>

<p>
-This page describes how to add these audio APIs into your app's source code, and how to incorporate
-them into the build process.
+This page describes the typical use cases for these high-performance audio APIs, how to add them
+into your app's source code, and how to incorporate them into the build process.
</p>
<h2 id="overview">Building Great Audio Apps</h2>

<p>
The OpenSL ES APIs are available to help you develop and improve your app's audio performance.
Some typical use cases include the following:</p>

<ul>
<li>Digital Audio Workstations (DAWs).</li>
<li>Synthesizers.</li>
<li>Drum machines.</li>
<li>Music learning apps.</li>
<li>Karaoke apps.</li>
<li>DJ mixing.</li>
<li>Audio effects.</li>
<li>Video/audio conferencing.</li>
</ul>

<h2 id="adding">Adding OpenSL ES to your App</h2>

<p>
@@ -45,6 +70,18 @@ Android extensions</a> as well, include the {@code OpenSLES_Android.h} header fi
#include <SLES/OpenSLES_Android.h>
</pre>

<p>
When you include the {@code OpenSLES_Android.h} header file, the following headers are included
automatically:
</p>
<pre>
#include <SLES/OpenSLES_AndroidConfiguration.h>
#include <SLES/OpenSLES_AndroidMetadata.h>
</pre>

<p class="note"><strong>Note: </strong>
These headers are not required, but are shown as an aid in learning the API.
</p>
<h2 id="building">Building and Debugging</h2>

@@ -69,9 +106,9 @@ for a given use case.
</p>

<p>
-We use asserts in our <a href="https://github.com/googlesamples/android-ndk">examples</a>, because
-they help catch unrealistic conditions that would indicate a coding error. We have used explicit
-error handling for other conditions more likely to occur in production.
+We use asserts in our <a class="external-link" href="https://github.com/googlesamples/android-ndk">
+examples</a>, because they help catch unrealistic conditions that would indicate a coding error. We
+have used explicit error handling for other conditions more likely to occur in production.
</p>

<p>
@@ -91,18 +128,25 @@ $ adb logcat
</pre>

<p>
-To examine the log from Android Studio, either click the <em>Logcat</em> tab in the
-<a href="{@docRoot}tools/debugging/debugging-studio.html#runDebug"><em>Debug</em></a>
-window, or click the <em>Devices | logcat</em> tab in the
-<a href="{@docRoot}tools/debugging/debugging-studio.html#systemLogView"><em>Android DDMS</em></a>
+To examine the log from Android Studio, either click the <strong>Logcat</strong> tab in the
+<a href="{@docRoot}tools/debugging/debugging-studio.html#runDebug">Debug</a>
+window, or click the <strong>Devices | logcat</strong> tab in the
+<a href="{@docRoot}tools/debugging/debugging-studio.html#systemLogView">Android DDMS</a>
window.
</p>
<h2 id="power">Audio Power Consumption</h2>
<p>Constantly outputting audio incurs significant power consumption. Ensure that you stop the
output in the
<a href="{@docRoot}reference/android/app/Activity.html#onPause()">onPause()</a> method.
Also consider pausing the silent output after some period of user inactivity.
</p>
<h2 id="samples">Samples</h2>

<p>
Supported and tested example code that you can use as a model for your own code resides both locally
-and on GitHub. The local examples are located in
+and on
+<a class="external-link" href="https://github.com/googlesamples/android-audio-high-performance/">
+GitHub</a>. The local examples are located in
{@code platforms/android-9/samples/native-audio/}, under your NDK root installation directory.
On GitHub, they are available from the
<a class="external-link" href="https://github.com/googlesamples/android-ndk">{@code android-ndk}</a>
@@ -122,4 +166,4 @@ Android app.
For more information on differences between the reference specification and the
Android implementation, see
<a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">
-OpenSL ES™ for Android</a>.
+OpenSL ES for Android</a>.
101
docs/html/ndk/guides/audio/floating-point.jd
Normal file
@@ -0,0 +1,101 @@
page.title=Floating-Point Audio
@jd:body

<div id="qv-wrapper">
<div id="qv">
<h2>On this page</h2>

<ol>
<li><a href="#best">Best Practices for Floating-Point Audio</a></li>
<li><a href="#support">Floating-Point Audio in Android SDK</a></li>
<li><a href="#more">For More Information</a></li>
</ol>
</div>
</div>

<a href="https://www.youtube.com/watch?v=sIcieUqMml8" class="notice-developers-video">
<div>
<h3>Video</h3>
<p>Will it Float? The Glory and Shame of Floating-Point Audio</p>
</div>
</a>

<p>Using floating-point numbers to represent audio data can significantly enhance audio
quality in high-performance audio applications. Floating point offers the following
advantages:</p>

<ul>
<li>Wider dynamic range.</li>
<li>Consistent accuracy across the dynamic range.</li>
<li>More headroom to avoid clipping during intermediate calculations and transients.</li>
</ul>

<p>While floating-point can enhance audio quality, it does present certain disadvantages:</p>

<ul>
<li>Floating-point numbers use more memory.</li>
<li>Floating-point operations have unexpected properties; for example, addition is
not associative.</li>
<li>Floating-point calculations can sometimes lose arithmetic precision due to rounding or
numerically unstable algorithms.</li>
<li>Using floating-point effectively requires greater understanding to achieve accurate
and reproducible results.</li>
</ul>

<p>
Formerly, floating-point was notorious for being unavailable or slow. This is
still true for low-end and embedded processors. But processors on modern
mobile devices now have hardware floating-point with performance that is
similar to (or in some cases even faster than) integer performance. Modern CPUs also support
<a href="http://en.wikipedia.org/wiki/SIMD" class="external-link">SIMD</a>
(single instruction, multiple data), which can improve performance further.
</p>

<h2 id="best">Best Practices for Floating-Point Audio</h2>
<p>The following best practices help you avoid problems with floating-point calculations:</p>
<ul>
<li>Use double precision floating-point for infrequent calculations,
such as computing filter coefficients.</li>
<li>Pay attention to the order of operations.</li>
<li>Declare explicit variables for intermediate values.</li>
<li>Use parentheses liberally.</li>
<li>If you get a NaN or infinity result, use binary search to discover
where it was introduced.</li>
</ul>
<h2 id="support">Floating-Point Audio in Android SDK</h2>

<p>For floating-point audio, the audio format encoding
<code>AudioFormat.ENCODING_PCM_FLOAT</code> is used similarly to
<code>ENCODING_PCM_16_BIT</code> or <code>ENCODING_PCM_8_BIT</code> for specifying
AudioTrack data formats. The corresponding overloaded method <code>AudioTrack.write()</code>
takes in a float array to deliver data.</p>

<pre>
public int write(float[] audioData,
        int offsetInFloats,
        int sizeInFloats,
        int writeMode)
</pre>
<h2 id="more">For More Information</h2>

<p>The following Wikipedia pages are helpful in understanding floating-point audio:</p>

<ul>
<li><a href="http://en.wikipedia.org/wiki/Audio_bit_depth" class="external-link">Audio bit depth</a></li>
<li><a href="http://en.wikipedia.org/wiki/Floating_point" class="external-link">Floating point</a></li>
<li><a href="http://en.wikipedia.org/wiki/IEEE_floating_point" class="external-link">IEEE 754 floating-point</a></li>
<li><a href="http://en.wikipedia.org/wiki/Loss_of_significance" class="external-link">Loss of significance</a>
(catastrophic cancellation)</li>
<li><a href="https://en.wikipedia.org/wiki/Numerical_stability" class="external-link">Numerical stability</a></li>
</ul>

<p>The following article provides information on those aspects of floating-point that have a
direct impact on designers of computer systems:</p>
<ul>
<li><a href="http://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html" class="external-link">What every
computer scientist should know about floating-point arithmetic</a>
by David Goldberg, Xerox PARC (edited reprint).</li>
</ul>
@@ -1,15 +1,27 @@
-page.title=NDK Audio: OpenSL ES™
+page.title=NDK High-Performance Audio
@jd:body

<p>The NDK package includes an Android-specific implementation of the
-<a href="https://www.khronos.org/opensles/">OpenSL ES</a> API
-specification from the <a href="https://www.khronos.org">Khronos Group</a>. This library
-allows you to use C or C++ to implement high-performance, low-latency audio in your game or other
-demanding app.</p>
+<a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES™</a> API
+specification from the <a class="external-link" href="https://www.khronos.org">Khronos Group</a>.
+This library allows you to use C or C++ to implement high-performance, low-latency audio, whether
+you are writing a synthesizer, digital audio workstation, karaoke, game,
+or other real-time app.</p>

<p>This section begins by providing some
-<a href="{@docRoot}ndk/guides/audio/basics.html">basic information</a> about the API, including how
-to incorporate it into your app. It then explains what you need to know about the
-<a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">Android-specific implementation</a>
-of OpenSL ES, focusing on differences between this implementation and the reference specification.
-</p>
+<a href="{@docRoot}ndk/guides/audio/basics.html">basic information</a> about the API, including
+typical use cases and how to incorporate it into your app. It then explains what you need to know
+about the <a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">Android-specific
+implementation</a> of OpenSL ES, focusing on the differences between this implementation and the
+reference specification. Next, you'll learn how to minimize
+<a href="{@docRoot}ndk/guides/audio/input-latency.html">input latency</a>
+when using built-in or external microphones
+and some actions that you can take to minimize
+<a href="{@docRoot}ndk/guides/audio/output-latency.html">output latency</a>.
+It describes the reasons that you should use
+<a href="{@docRoot}ndk/guides/audio/floating-point.html">floating-point</a>
+numbers to represent your audio data, and it provides information that will help you choose the
+optimal <a href="{@docRoot}ndk/guides/audio/sample-rates.html">sample rate</a>. This section
+concludes with some supplemental <a href="{@docRoot}ndk/guides/audio/opensl-prog-notes.html">
+programming notes</a> to ensure proper implementation of OpenSL ES.
+</p>
95
docs/html/ndk/guides/audio/input-latency.jd
Normal file
@@ -0,0 +1,95 @@
page.title=Audio Input Latency
@jd:body

<div id="qv-wrapper">
<div id="qv">
<h2>On this page</h2>

<ol>
<li><a href="#check-list">Checklist</a></li>
<li><a href="#ways">Ways to Reduce Audio Input Latency</a></li>
<li><a href="#avoid">What to Avoid</a></li>
</ol>
</div>
</div>

<p>This page provides guidelines to help you reduce audio input latency when recording with a
built-in microphone or an external headset microphone.</p>

<h2 id="check-list">Checklist</h2>

<p>Here are a few important prerequisites:</p>

<ul>
<li>You must use the Android-specific implementation of the
<a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES™</a> API.</li>

<li>If you haven't already done so, download and install the
<a href="{@docRoot}tools/sdk/ndk/index.html">Android NDK</a>.</li>

<li>Many of the same requirements for low-latency audio output also apply to low-latency input,
so read the requirements for low-latency output in
<a href="{@docRoot}ndk/guides/audio/output-latency.html">Audio Output Latency</a>.</li>
</ul>

<h2 id="ways">Ways to Reduce Audio Input Latency</h2>

<p>The following are some methods to help ensure low audio input latency:</p>

<ul>
<li>Suggest to your users, if your app relies on low-latency audio, that they use a headset
(for example, by displaying a <em>Best with headphones</em> screen on first run). Note
that just using the headset doesn’t guarantee the lowest possible latency. You may need to
perform other steps to remove any unwanted signal processing from the audio path, such as by
using the <a href="http://developer.android.com/reference/android/media/MediaRecorder.AudioSource.html#VOICE_RECOGNITION">
VOICE_RECOGNITION</a> preset when recording.</li>

<li>It's difficult to test audio input and output latency in isolation. The best solution to
determine the lowest possible audio input latency is to measure round-trip audio and divide
by two.</li>

<li>Be prepared to handle nominal sample rates of 44,100 and 48,000 Hz as reported by
<a href="{@docRoot}reference/android/media/AudioManager.html#getProperty(java.lang.String)">
getProperty(String)</a> for
<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_SAMPLE_RATE">
PROPERTY_OUTPUT_SAMPLE_RATE</a>. Other sample rates are possible, but rare.</li>

<li>Be prepared to handle the buffer size reported by
<a href="{@docRoot}reference/android/media/AudioManager.html#getProperty(java.lang.String)">
getProperty(String)</a> for
<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_FRAMES_PER_BUFFER">
PROPERTY_OUTPUT_FRAMES_PER_BUFFER</a>. Typical buffer sizes include 96, 128, 160, 192, 240, 256,
or 512 frames, but other values are possible.</li>
</ul>
<h2 id="avoid">What to Avoid</h2>

<p>Be sure to take these things into account to help avoid latency issues:</p>

<ul>
<li>Don’t assume that the speakers and microphones used in mobile devices generally have good
acoustics. Due to their small size, the acoustics are generally poor, so signal processing is
added to improve the sound quality. This signal processing introduces latency.</li>

<li>Don't assume that your input and output callbacks are synchronized. For simultaneous input
and output, separate buffer queue completion handlers are used for each side. There is no
guarantee of the relative order of these callbacks or the synchronization of the audio clocks,
even when both sides use the same sample rate. Your application should buffer the data with
proper buffer synchronization.</li>

<li>Don't assume that the actual sample rate exactly matches the nominal sample rate. For
example, if the nominal sample rate is 48,000 Hz, it is normal for the audio clock to advance
at a slightly different rate than the operating system {@code CLOCK_MONOTONIC}. This is because
the audio and system clocks may derive from different crystals.</li>

<li>Don't assume that the actual playback sample rate exactly matches the actual capture sample
rate, especially if the endpoints are on separate paths. For example, if you are capturing from
the on-device microphone at 48,000 Hz nominal sample rate, and playing on USB audio
at 48,000 Hz nominal sample rate, the actual sample rates are likely to be slightly different
from each other.</li>
</ul>

<p>A consequence of potentially independent audio clocks is the need for asynchronous sample rate
conversion. A simple (though not ideal for audio quality) technique for asynchronous sample rate
conversion is to duplicate or drop samples as needed near a zero-crossing point. More
sophisticated conversions are possible.</p>
@@ -1,4 +1,4 @@
-page.title=Native Audio: OpenSL ES™ for Android
+page.title=OpenSL ES for Android
@jd:body

<div id="qv-wrapper">
@@ -6,23 +6,158 @@ page.title=Native Audio: OpenSL ES™ for Android
<h2>On this page</h2>

<ol>
<li><a href="#getstarted">Getting Started</a></li>
<li><a href="#inherited">Features Inherited from the Reference Specification</a></li>
<li><a href="#planning">Planning for Future Versions of OpenSL ES</a></li>
<li><a href="#ae">Android Extensions</a></li>
<li><a href="#notes">Programming Notes</a></li>
<li><a href="#platform-issues">Platform Issues</a></li>
</ol>
</div>
</div>

<p>
-This page provides details about how the NDK implementation of OpenSL ES™ differs
-from the reference specification for OpenSL ES 1.0.1. When using sample code from the
+This page provides details about how the
+<a href="{@docRoot}tools/sdk/ndk/index.html">NDK</a> implementation of OpenSL
+ES™ differs from the reference specification for OpenSL ES 1.0.1. When using sample code from the
specification, you may need to modify it to work on Android.
</p>

<p>
Unless otherwise noted, all features are available at Android 2.3 (API level 9) and higher.
Some features are only available for Android 4.0 (API level 14); these are noted.
</p>

<p class="note"><strong>Note: </strong>
The Android Compatibility Definition Document (CDD) enumerates the hardware and software
requirements of a compatible Android device. See
<a class="external-link" href="https://source.android.com/compatibility/">Android Compatibility</a>
for more information on the overall compatibility program, and
<a class="external-link" href="https://static.googleusercontent.com/media/source.android.com/en//compatibility/android-cdd.pdf">
CDD</a> for the actual CDD document.
</p>

<p>
<a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES</a> provides a C
language interface that is also accessible using C++. It exposes features similar to the audio
portions of these Android Java APIs:
</p>

<ul>
<li><a href="{@docRoot}reference/android/media/MediaPlayer.html">
android.media.MediaPlayer</a></li>
<li><a href="{@docRoot}reference/android/media/MediaRecorder.html">
android.media.MediaRecorder</a></li>
</ul>

<p>
As with all of the Android Native Development Kit (NDK), the primary purpose of OpenSL ES for
Android is to facilitate the implementation of shared libraries to be called using the Java Native
Interface (<a class="external-link" href="https://en.wikipedia.org/wiki/Java_Native_Interface">JNI
</a>). The NDK is not intended for writing pure C/C++ applications. However, OpenSL ES is a
full-featured API, and we expect that you should be able to accomplish most of your audio needs
using only this API, without up-calls to code running in the Android runtime.
</p>
<p class="note"><strong>Note: </strong>
Though based on OpenSL ES, the Android native audio (high-performance audio) API is not a
conforming implementation of any OpenSL ES 1.0.1 profile (game, music, or phone). This is because
Android does not implement all of the features required by any one of the profiles. Any known cases
where Android behaves differently than the specification are described in the <a href="#ae">
Android extensions</a> section below.
</p>

<h2 id="getstarted">Getting Started</h2>

<p>
This section provides the information needed to get started using the OpenSL ES APIs.
</p>

<h3>Example code</h3>

<p>
We recommend using supported and tested example code that is usable as a model for your own
code, which is located in the NDK folder {@code platforms/android-9/samples/native-audio/}, as well
as in the
<a class="external-link" href="https://github.com/googlesamples/android-ndk/tree/master/audio-echo">audio-echo</a>
and
<a class="external-link" href="https://github.com/googlesamples/android-ndk/tree/master/native-audio">native-audio</a>
folders of the
<a class="external-link" href="https://github.com/googlesamples/android-ndk">android-ndk</a> GitHub
repository.
</p>

<p class="caution"><strong>Caution: </strong>
The OpenSL ES 1.0.1 specification contains example code in the appendices (see
<a class="external-link" href="https://www.khronos.org/registry/sles/">Khronos OpenSL ES Registry</a>
for more details). However, the examples in <em>Appendix B: Sample Code</em> and
<em>Appendix C: Use Case Sample Code</em> use features that are not supported by Android. Some
examples also contain typographical errors, or use APIs that are likely to change. Proceed with
caution when referring to these; though the code may be helpful in understanding the full OpenSL ES
standard, it should not be used as-is with Android.
</p>

<h3>Makefile</h3>

<p>
Modify your {@code Android.mk} file as follows:
</p>
<pre>
LOCAL_LDLIBS += -lOpenSLES
</pre>
<h3>Audio content</h3>

<p>
The following are some of the many ways to package audio content for your application:
</p>

<ul>
<li><strong>Resources</strong>: By placing your audio files into the {@code res/raw/} folder,
they can be accessed easily by the associated APIs for
<a href="{@docRoot}reference/android/content/res/Resources.html">Resources</a>.
However, there is no direct native access to resources, so you must write Java
programming language code to copy them out before use.</li>
<li><strong>Assets</strong>: By placing your audio files into the {@code assets/} folder, they
are directly accessible by the Android native asset manager APIs. See the header files {@code
android/asset_manager.h} and {@code android/asset_manager_jni.h} for more information on these
APIs. The example code located in the NDK folder {@code platforms/android-9/samples/native-audio/}
uses these native asset manager APIs in conjunction with the Android file descriptor data
locator.</li>
<li><strong>Network</strong>: You can use the URI data locator to play audio content directly
from the network. However, be sure to read the <a href="#sandp">Security and permissions</a>
section below.</li>
<li><strong>Local file system</strong>: The URI data locator supports the {@code file:} scheme
for local files, provided the files are accessible by the application. Note that the Android
security framework restricts file access via the Linux user ID and group ID mechanisms.</li>
<li><strong>Recorded</strong>: Your application can record audio data from the microphone input,
store this content, and then play it back later. The example code uses this method for the <em>
Playback</em> clip.</li>
<li><strong>Compiled and linked inline</strong>: You can link your audio content directly into
the shared library, and then play it using an audio player with a buffer queue data locator. This
is most suitable for short PCM format clips. The example code uses this technique for the <em>
Hello</em> and <em>Android</em> clips. The PCM data was converted to hex strings using a
{@code bin2c} tool (not supplied).</li>
<li><strong>Real-time synthesis</strong>: Your application can synthesize PCM data on the fly and
then play it using an audio player with a buffer queue data locator. This is a relatively advanced
technique, and the details of audio synthesis are beyond the scope of this article.</li>
</ul>
<p class="note"><strong>Note: </strong>
Finding or creating useful audio content for your application is beyond the scope of this article.
You can use web search terms such as <em>interactive audio</em>, <em>game audio</em>, <em>sound
design</em>, and <em>audio programming</em> to locate more information.
</p>
<p class="caution"><strong>Caution:</strong> It is your responsibility
to ensure that you are legally permitted to play or record content. There may be privacy
considerations for recording content.
</p>
<h2 id="inherited">Features Inherited from the Reference Specification</h2>

<p>
The Android NDK implementation of OpenSL ES inherits much of the feature set from
-the reference specification, although with certain limitations.
+the reference specification, with certain limitations.
</p>

<h3>Global entry points</h3>

@@ -44,8 +179,9 @@ These entry points include:
<h3>Objects and interfaces</h3>

<p>
-Table 1 shows which objects and interfaces the Android NDK implementation of
-OpenSL ES supports. Green cells indicate features available in this implementation.
+Table 1 shows the objects and interfaces that the Android NDK implementation of
+OpenSL ES supports. If a <em>Yes</em> appears in the cell, then the feature is available in this
+implementation.
</p>

<p class="table-caption" id="Objects-and-interfaces">
@@ -214,7 +350,9 @@ OpenSL ES supports. Green cells indicate features available in this implementati
</tr>
</table>

-The next section explains limitations of some of these features.
+<p>
+The next section explains the limitations for some of these features.
+</p>

<h3>Limitations</h3>
@@ -265,7 +403,7 @@ The Android implementation of OpenSL ES requires you to initialize <code>mimeTyp
to either <code>NULL</code> or a valid UTF-8 string. You must also initialize
<code>containerType</code> to a valid value.
In the absence of other considerations, such as portability to other
-implementations, or content format that an app cannot identify by header,
+implementations or content format that an app cannot identify by header,
we recommend that you
set <code>mimeType</code> to <code>NULL</code> and <code>containerType</code>
to <code>SL_CONTAINERTYPE_UNSPECIFIED</code>.
@@ -275,30 +413,32 @@ OpenSL ES for Android supports the following audio formats, so long as the
Android platform supports them as well:</p>

<ul>
-<li>WAV PCM</li>
-<li>WAV alaw</li>
-<li>WAV ulaw</li>
-<li>MP3</li>
-<li>Ogg Vorbis</li>
-<li>AAC LC</li>
-<li>HE-AACv1 (AAC+)</li>
-<li>HE-AACv2 (enhanced AAC+)</li>
-<li>AMR</li>
-<li>FLAC</li>
+<li><a class="external-link" href="https://en.wikipedia.org/wiki/WAV">WAV</a> PCM.</li>
+<li>WAV alaw.</li>
+<li>WAV ulaw.</li>
+<li>MP3.</li>
+<li>Ogg Vorbis.</li>
+<li>AAC LC.</li>
+<li>HE-AACv1 (AAC+).</li>
+<li>HE-AACv2 (enhanced AAC+).</li>
+<li>AMR.</li>
+<li>FLAC.</li>
</ul>

-<p>
+<p class="note"><strong>Note: </strong>
For a list of audio formats that Android supports, see
<a href="{@docRoot}guide/appendix/media-formats.html">Supported Media Formats</a>.
</p>

<p>
-The following limitations apply to handling of these and other formats in this
+The following limitations apply to the handling of these and other formats in this
implementation of OpenSL ES:
</p>
<ul>
-<li>AAC formats must be reside within an MP4 or ADTS container.</li>
-<li>OpenSL ES for Android does not support MIDI.</li>
+<li><a class="external-link" href="https://en.wikipedia.org/wiki/Advanced_Audio_Coding">AAC</a>
+formats must reside within an MP4 or ADTS container.</li>
+<li>OpenSL ES for Android does not support
+<a class="external-link" href="https://source.android.com/devices/audio/midi.html">MIDI</a>.</li>
+<li>WMA is not part of <a class="external-link" href="https://source.android.com/">AOSP</a>, and we
+have not verified its compatibility with OpenSL ES for Android.</li>
<li>The Android NDK implementation of OpenSL ES does not support direct
@@ -333,13 +473,23 @@ playback configurations have the following characteristics:
<li>8-bit unsigned or 16-bit signed.</li>
|
||||
<li>Mono or stereo.</li>
|
||||
<li>Little-endian byte ordering.</li>
|
||||
<li>Sample rates of: 8,000, 11,025, 12,000, 16,000, 22,050, 24,000, 32,000, 44,100, or
|
||||
48,000 Hz.</li>
|
||||
<li>Sample rates of:
|
||||
<ul>
|
||||
<li>8,000 Hz.</li>
|
||||
<li>11,025 Hz.</li>
|
||||
<li>12,000 Hz.</li>
|
||||
<li>16,000 Hz.</li>
|
||||
<li>22,050 Hz.</li>
|
||||
<li>24,000 Hz.</li>
|
||||
<li>32,000 Hz.</li>
|
||||
<li>44,100 Hz.</li>
|
||||
<li>48,000 Hz.</li>
|
||||
</ul></li>
|
||||
</ul>
|
||||
|
||||
<p>
The configurations that OpenSL ES for Android supports for recording are
device-dependent; usually, 16,000 Hz mono/16-bit signed is available regardless of the device.
</p>
<p>
The value of the <code>samplesPerSec</code> field is in units of milliHz, despite the misleading
@@ -393,7 +543,7 @@ to <code>SL_TIME_UNKNOWN</code>.
An audio player or recorder with a data locator for a buffer queue supports PCM data format only.
</p>

<h4>I/O device data locator</h4>

<p>
OpenSL ES for Android only supports use of an I/O device data locator when you have
@@ -421,6 +571,150 @@ and only for an audio player. You cannot use this data format for an audio recor
We have not verified support for {@code rtsp:} with audio on the Android platform.
</p>

<h4>Data structures</h4>

<p>
Android supports these OpenSL ES 1.0.1 data structures:
</p>
<ul>
<li>{@code SLDataFormat_MIME}</li>
<li>{@code SLDataFormat_PCM}</li>
<li>{@code SLDataLocator_BufferQueue}</li>
<li>{@code SLDataLocator_IODevice}</li>
<li>{@code SLDataLocator_OutputMix}</li>
<li>{@code SLDataLocator_URI}</li>
<li>{@code SLDataSink}</li>
<li>{@code SLDataSource}</li>
<li>{@code SLEngineOption}</li>
<li>{@code SLEnvironmentalReverbSettings}</li>
<li>{@code SLInterfaceID}</li>
</ul>

<h4>Platform configuration</h4>

<p>
OpenSL ES for Android is designed for multi-threaded applications and is thread-safe. It supports a
single engine per application, and up to 32 objects per engine. Available device memory and CPU may
further restrict the usable number of objects.
</p>

<p>
These engine options are recognized but ignored by {@code slCreateEngine}:
</p>

<ul>
<li>{@code SL_ENGINEOPTION_THREADSAFE}</li>
<li>{@code SL_ENGINEOPTION_LOSSOFCONTROL}</li>
</ul>

<p>
OpenMAX AL and OpenSL ES may be used together in the same application. In this case, there is
a single shared engine object internally, and the 32-object limit is shared between OpenMAX AL
and OpenSL ES. The application should first create both engines, use both engines, and finally
destroy both engines. The implementation maintains a reference count on the shared engine so that
it is correctly destroyed during the second destroy operation.
</p>

<h2 id="planning">Planning for Future Versions of OpenSL ES</h2>

<p>
The Android high-performance audio APIs are based on
<a class="external-link" href="https://www.khronos.org/registry/sles/">Khronos Group OpenSL ES
1.0.1</a>. Khronos has released a revised version 1.1 of the standard. The
revised version includes new features, clarifications, corrections of typographical errors, and
some incompatibilities. Most of the expected incompatibilities are relatively minor or are in
areas of OpenSL ES that are not supported by Android.
</p>

<p>
An application
developed with this version should work on future versions of the Android platform, provided
that you follow the guidelines that are outlined in the <a href="#binary-compat">Planning for
binary compatibility</a> section below.
</p>

<p class="note"><strong>Note: </strong>
Future source compatibility is not a goal. That is, if you upgrade to a newer version of the NDK,
you may need to modify your application source code to conform to the new API. We expect that most
such changes will be minor; see details below.
</p>

<h3 id="binary-compat">Planning for binary compatibility</h3>

<p>
We recommend that your application follow these guidelines to improve future binary compatibility:
</p>

<ul>
<li>Use only the documented subset of Android-supported features from OpenSL ES 1.0.1.</li>
<li>Do not depend on a particular result code for an unsuccessful operation; be prepared to deal
with a different result code.</li>
<li>Application callback handlers generally run in a restricted context. They should be written
to perform their work quickly, and then return as soon as possible. Do not run complex operations
within a callback handler. For example, within a buffer queue completion callback, you can
enqueue another buffer, but do not create an audio player.</li>
<li>Callback handlers should be prepared to be called more or less frequently, to receive
additional event types, and should ignore event types that they do not recognize. Callbacks that
are configured with an event mask made of enabled event types should be prepared to be called
with multiple event type bits set simultaneously. Use "&" to test for each event bit rather than
a switch case.</li>
<li>Use prefetch status and callbacks as a general indication of progress, but do not depend on
specific hard-coded fill levels or callback sequences. The meaning of the prefetch status fill
level, and the behavior for errors that are detected during prefetch, may change.</li>
</ul>

<p class="note"><strong>Note: </strong>
See the <a href="#bq-behavior">Buffer queue behavior</a> section below for more details.
</p>

<h3>Planning for source compatibility</h3>

<p>
As mentioned, source code incompatibilities are expected in the next version of OpenSL ES from
Khronos Group. The likely areas of change include:
</p>

<ul>
<li>The buffer queue interface is expected to have significant changes, especially in the areas
of {@code BufferQueue::Enqueue}, the parameter list for {@code slBufferQueueCallback}, and the
name of the field {@code SLBufferQueueState.playIndex}. We recommend that your application code use
Android simple buffer queues instead. In the example
code that is supplied with the NDK, we have used Android simple buffer queues for playback for
this reason. (We also use Android simple buffer queues for recording and decoding to PCM, but that
is because standard OpenSL ES 1.0.1 does not support record or decode to a buffer queue data
sink.)</li>
<li>There will be an addition of {@code const} to the input parameters passed by reference, and
to {@code SLchar *} struct fields used as input values. This should not require any changes to
your code.</li>
<li>There will be a substitution of unsigned types for some parameters that are currently signed.
You may need to change a parameter type from {@code SLint32} to {@code SLuint32} or similar, or
add a cast.</li>
<li>{@code Equalizer::GetPresetName} copies the string to application memory instead of returning
a pointer to implementation memory. This will be a significant change, so we recommend that you
either avoid calling this method, or isolate your use of it.</li>
<li>There will be additional fields in the struct types. For output parameters, these new fields
can be ignored, but for input parameters the new fields will need to be initialized. Fortunately,
all of these are expected to be in areas that are not supported by Android.</li>
<li>Interface <a class="external-link" href="http://en.wikipedia.org/wiki/Globally_unique_identifier">
GUIDs</a> will change. Refer to interfaces by symbolic name rather than GUID to avoid a
dependency.</li>
<li>{@code SLchar} will change from {@code unsigned char} to {@code char}. This primarily affects
the URI data locator and MIME data format.</li>
<li>{@code SLDataFormat_MIME.mimeType} will be renamed to {@code pMimeType}, and
{@code SLDataLocator_URI.URI} will be renamed to {@code pURI}. We recommend that you initialize
the {@code SLDataFormat_MIME} and {@code SLDataLocator_URI} data structures using a
brace-enclosed, comma-separated list of values, rather than by field name, to isolate your code
from this change. This technique is used in the example code.</li>
<li>{@code SL_DATAFORMAT_PCM} does not permit the application to specify the representation of
the data as signed integer, unsigned integer, or floating-point. The Android implementation
assumes that 8-bit data is unsigned integer and 16-bit is signed integer. In addition, the field
{@code samplesPerSec} is a misnomer, as the actual units are milliHz. These issues are expected
to be addressed in the next OpenSL ES version, which will introduce a new extended PCM data
format that permits the application to explicitly specify the representation and corrects the
field name. As this will be a new data format, and the current PCM data format will still be
available (though deprecated), it should not require any immediate changes to your code.</li>
</ul>

<h2 id="ae">Android Extensions</h2>

<p>
@@ -444,8 +738,8 @@ avoiding use of the extensions or by using {@code #ifdef} to exclude them at com

<p>
Table 2 shows the Android-specific interfaces and data locators that Android OpenSL ES supports
for each object type. The <em>Yes</em> values in the cells indicate the interfaces and data
locators that are available for each object type.
</p>

<p class="table-caption" id="Android-extensions">
@@ -523,7 +817,7 @@ object type.
</tr>
</table>

<h3 id="configuration-interface">Android configuration interface</h3>

<p>
The Android configuration interface provides a means to set
@@ -581,6 +875,11 @@ audio effects. Device manufacturers should document any available device-specifi
that they provide.
</p>

<p>
Portable applications should use the OpenSL ES 1.0.1 APIs for audio effects instead of the Android
effect extensions.
</p>

<h3>Android file descriptor data locator</h3>

<p>
@@ -597,9 +896,9 @@ the app reads assets from the APK via a file descriptor.
<p>
The Android simple buffer queue data locator and interface are
identical to those in the OpenSL ES 1.0.1 reference specification, with two exceptions: You
can also use Android simple buffer queues with both audio players and audio recorders. Also, PCM
is the only data format you can use with these queues.
In the reference specification, buffer queues are for audio players only, but they are
compatible with data formats beyond PCM.
</p>
<p>
@@ -613,7 +912,7 @@ compatibility, however, we suggest that applications use Android simple
buffer queues instead of OpenSL ES 1.0.1 buffer queues.
</p>

<h3 id="dynamic-interfaces">Dynamic interfaces at object creation</h3>

<p>
For convenience, the Android implementation of OpenSL ES 1.0.1
@@ -622,7 +921,7 @@ This is an alternative to using <code>DynamicInterfaceManagement::AddInterface()
to add these interfaces after instantiation.
</p>

<h3 id="bq-behavior">Buffer queue behavior</h3>

<p>
The Android implementation does not include the
@@ -641,7 +940,7 @@ you should explicitly call the <code>BufferQueue::Clear()</code> method after a
<p>
Similarly, there is no specification governing whether the trigger for a buffer queue callback must
be a transition to <code>SL_PLAYSTATE_STOPPED</code> or execution of
<code>BufferQueue::Clear()</code>. Therefore, we recommend that you do not create a dependency on
one or the other; instead, your app should be able to handle both.
</p>

@@ -679,27 +978,34 @@ The table below gives recommendations for use of this extension and alternatives
</tr>
<tr>
<td>13 and below</td>
<td>An open-source codec with a suitable license</td>
</tr>
<tr>
<td>14 to 15</td>
<td>An open-source codec with a suitable license</td>
</tr>
<tr>
<td>16 to 20</td>
<td>
The {@link android.media.MediaCodec} class or an open-source codec with a suitable license
</td>
</tr>
<tr>
<td>21 and above</td>
<td>
NDK MediaCodec in the {@code <media/NdkMedia*.h>} header files, the
{@link android.media.MediaCodec} class, or an open-source codec with a suitable license
</td>
</tr>
</table>

<p class="note"><strong>Note: </strong>
There is currently no documentation for the NDK version of the {@code MediaCodec} API. However,
you can refer to the
<a class="external-link" href="https://github.com/googlesamples/android-ndk/tree/master/native-codec">
native-codec</a> sample code for an example.
</p>

<p>
A standard audio player plays back to an audio device, specifying the output mix as the data sink.
The Android extension differs in that an audio player instead
@@ -710,17 +1016,18 @@ an Android simple buffer queue data locator with PCM data format.

<p>
This feature is primarily intended for games to pre-load their audio assets when changing to a
new game level, which is similar to the functionality that the {@link android.media.SoundPool}
class provides.
</p>

<p>
The application should initially enqueue a set of empty buffers in the Android simple
buffer queue. After that, the app fills the buffers with PCM data. The Android simple
buffer queue callback fires after each buffer is filled. The callback handler processes
the PCM data, re-enqueues the now-empty buffer, and then returns. The application is responsible for
keeping track of decoded buffers; the callback parameter list does not include
sufficient information to indicate the buffer that contains data or the buffer that should be
enqueued next.
</p>

<p>
@@ -753,8 +1060,8 @@ PCM data, pausing the decoding process, or terminating the decoder outright.
To decode an encoded stream to PCM but not play back immediately, for apps running on
Android 4.x (API levels 16–20), we recommend using the {@link android.media.MediaCodec} class.
For new applications running on Android 5.0 (API level 21) or higher, we recommend using the NDK
equivalent, {@code <NdkMedia*.h>}. These header files reside in
the {@code media/} directory under your installation root.
</p>

<h3>Decode streaming ADTS AAC to PCM</h3>
@@ -796,7 +1103,7 @@ Each buffer contains one or more complete ADTS AAC frames.
The Android buffer queue callback fires after each buffer is emptied.
The callback handler should refill and re-enqueue the buffer, and then return.
The application need not keep track of encoded buffers; the callback parameter
list includes sufficient information to indicate the buffer that should be enqueued next.
The end of stream is explicitly marked by enqueuing an EOS item.
After EOS, no more enqueues are permitted.
</p>
@@ -812,6 +1119,7 @@ The result of decoder starvation is unspecified.
In all respects except for the data source, the streaming decode method is the same as
the one that <a href="#da">Decode audio to PCM</a> describes.
</p>

<p>
Despite the similarity in names, an Android buffer queue is <em>not</em>
the same as an <a href="#simple">Android simple buffer queue</a>. The streaming decoder
@@ -840,8 +1148,8 @@ available until after the app decodes the first encoded data. A good
practice is to query for the key indices in the main thread after calling the {@code
Object::Realize} method, and to read the PCM format metadata values in the Android simple
buffer queue callback handler when calling it for the first time. Consult the
<a class="external-link" href="https://github.com/googlesamples/android-ndk">example code in the
NDK package</a> for examples of working with this interface.
</p>

<p>
@@ -879,3 +1187,25 @@ SLDataSource audiosrc;
audiosrc.pLocator = ...
audiosrc.pFormat = &pcm;
</pre>

<h2 id="notes">Programming Notes</h2>
<p><a href="{@docRoot}ndk/guides/audio/opensl-prog-notes.html">OpenSL ES Programming Notes</a>
provides supplemental information to ensure proper implementation of OpenSL ES.</p>
<p class="note"><strong>Note: </strong>
For your convenience, we have included a copy of the OpenSL ES 1.0.1 specification with the NDK in
{@code docs/opensles/OpenSL_ES_Specification_1.0.1.pdf}.
</p>

<h2 id="platform-issues">Platform Issues</h2>

<p>
This section describes known issues in the initial platform release that supports these APIs.
</p>

<h3>Dynamic interface management</h3>

<p>
{@code DynamicInterfaceManagement::AddInterface} does not work. Instead, specify the interface in
the array that is passed to Create, as shown in the example code for environmental reverb.
</p>


461 docs/html/ndk/guides/audio/opensl-prog-notes.jd Normal file
@@ -0,0 +1,461 @@
page.title=OpenSL ES Programming Notes
@jd:body

<div id="qv-wrapper">
<div id="qv">
<h2>On this page</h2>

<ol>
<li><a href="#init">Objects and Interface Initialization</a></li>
<li><a href="#prefetch">Audio Player Prefetch</a></li>
<li><a href="#destroy">Destroy</a></li>
<li><a href="#panning">Stereo Panning</a></li>
<li><a href="#callbacks">Callbacks and Threads</a></li>
<li><a href="#perform">Performance</a></li>
<li><a href="#sandp">Security and Permissions</a></li>
</ol>
</div>
</div>

<p>
The notes in this section supplement the
<a class="external-link" href="https://www.khronos.org/registry/sles/">OpenSL ES 1.0.1
specification</a>.
</p>

<h2 id="init">Objects and Interface Initialization</h2>

<p>
Two aspects of the OpenSL ES programming model that may be unfamiliar to new developers are the
distinction between objects and interfaces, and the initialization sequence.
</p>

<p>
Briefly, an OpenSL ES object is similar to the object concept in programming languages such as
Java and C++, except an OpenSL ES object is only visible via its associated interfaces. This
includes the initial interface for all objects, called {@code SLObjectItf}. There is no handle
for an object itself, only a handle to the {@code SLObjectItf} interface of the object.
</p>

<p>
An OpenSL ES object is first <em>created</em>, which returns an {@code SLObjectItf}, then
<em>realized</em>. This is similar to the common programming pattern of first constructing an
object (which should never fail other than for lack of memory or invalid parameters), and then
completing initialization (which may fail due to lack of resources). The realize step gives the
implementation a logical place to allocate additional resources if needed.
</p>

<p>
As part of the API to create an object, an application specifies an array of desired interfaces
that it plans to acquire later. Note that this array does not automatically acquire the
interfaces; it merely indicates a future intention to acquire them. Interfaces are distinguished
as <em>implicit</em> or <em>explicit</em>. An explicit interface must be listed in the array if
it will be acquired later. An implicit interface need not be listed in the object create array,
but there is no harm in listing it there. OpenSL ES has one more kind of interface called
<em>dynamic</em>, which does not need to be specified in the object create array and can be
added later after the object is created. The Android implementation provides a convenience
feature to avoid this complexity; see the
<a href="#dynamic-interfaces">Dynamic interfaces at object creation</a> section above.
</p>

<p>
After the object is created and realized, the application should acquire interfaces for each
feature it needs, using {@code GetInterface} on the initial {@code SLObjectItf}.
</p>

<p>
Finally, the object is available for use via its interfaces, though note that some objects
require further setup. In particular, an audio player with URI data source needs a bit more
preparation in order to detect connection errors. See the
<a href="#prefetch">Audio player prefetch</a> section for details.
</p>

<p>
After your application is done with the object, you should explicitly destroy it; see the
<a href="#destroy">Destroy</a> section below.
</p>

<h2 id="prefetch">Audio Player Prefetch</h2>

<p>
For an audio player with URI data source, {@code Object::Realize} allocates resources but does
not connect to the data source (<em>prepare</em>) or begin pre-fetching data. These occur once
the player state is set to either {@code SL_PLAYSTATE_PAUSED} or {@code SL_PLAYSTATE_PLAYING}.
</p>

<p>
Some information may still be unknown until relatively late in this sequence. In
particular, initially {@code Player::GetDuration} returns {@code SL_TIME_UNKNOWN} and
{@code MuteSolo::GetChannelCount} either returns successfully with channel count zero or the
error result {@code SL_RESULT_PRECONDITIONS_VIOLATED}. These APIs return the proper values
once they are known.
</p>

<p>
Other properties that are initially unknown include the sample rate and actual media content
type based on examining the content's header (as opposed to the application-specified MIME type
and container type). These are also determined later during prepare/prefetch, but there are no
APIs to retrieve them.
</p>

<p>
The prefetch status interface is useful for detecting when all information is available, or your
application can poll periodically. Note that some information, such as the duration of a
streaming MP3, may <em>never</em> be known.
</p>

<p>
The prefetch status interface is also useful for detecting errors. Register a callback
and enable
at least the {@code SL_PREFETCHEVENT_FILLLEVELCHANGE} and {@code SL_PREFETCHEVENT_STATUSCHANGE}
events. If both of these events are delivered simultaneously, and
{@code PrefetchStatus::GetFillLevel} reports a zero level, and
{@code PrefetchStatus::GetPrefetchStatus} reports {@code SL_PREFETCHSTATUS_UNDERFLOW}, then this
indicates a non-recoverable error in the data source. This includes the inability to connect to
the data source because the local filename does not exist or the network URI is invalid.
</p>

<p>
The next version of OpenSL ES is expected to add more explicit support for handling errors in
the data source. However, for future binary compatibility, we intend to continue to support the
current method for reporting a non-recoverable error.
</p>

<p>
In summary, a recommended code sequence is:
</p>

<ol>
<li>{@code Engine::CreateAudioPlayer}</li>
<li>{@code Object::Realize}</li>
<li>{@code Object::GetInterface} for {@code SL_IID_PREFETCHSTATUS}</li>
<li>{@code PrefetchStatus::SetCallbackEventsMask}</li>
<li>{@code PrefetchStatus::SetFillUpdatePeriod}</li>
<li>{@code PrefetchStatus::RegisterCallback}</li>
<li>{@code Object::GetInterface} for {@code SL_IID_PLAY}</li>
<li>{@code Play::SetPlayState} to {@code SL_PLAYSTATE_PAUSED}, or
{@code SL_PLAYSTATE_PLAYING}</li>
</ol>

<p class="note"><strong>Note: </strong>
Preparation and prefetching occur here; during this time your callback is called with
periodic status updates.
</p>

<h2 id="destroy">Destroy</h2>

<p>
Be sure to destroy all objects when exiting from your application. Objects should be destroyed
in reverse order of their creation, as it is not safe to destroy an object that has any
dependent objects. For example, destroy in this order: audio players and recorders, output mix,
and then finally the engine.
</p>

<p>
OpenSL ES does not support automatic garbage collection or
<a class="external-link" href="http://en.wikipedia.org/wiki/Reference_counting">reference
counting</a> of interfaces. After you call {@code Object::Destroy}, all extant interfaces that
are derived from the associated object become undefined.
</p>

<p>
The Android OpenSL ES implementation does not detect the incorrect use of such interfaces.
Continuing to use such interfaces after the object is destroyed can cause your application to
crash or behave in unpredictable ways.
</p>

<p>
We recommend that you explicitly set both the primary object interface and all associated
interfaces to NULL as part of your object destruction sequence, which prevents the accidental
misuse of a stale interface handle.
</p>

<h2 id="panning">Stereo Panning</h2>

<p>
When {@code Volume::EnableStereoPosition} is used to enable stereo panning of a mono source,
there is a 3-dB reduction in total
<a class="external-link" href="http://en.wikipedia.org/wiki/Sound_power_level">sound power
level</a>. This is needed to permit the total sound power level to remain constant as the
source is panned from one channel to the other. Therefore, only enable stereo positioning if
you need it. See the Wikipedia article on
<a class="external-link" href="http://en.wikipedia.org/wiki/Panning_(audio)">audio panning</a>
for more information.
</p>

<h2 id="callbacks">Callbacks and Threads</h2>

<p>
Callback handlers are generally called synchronously with respect to the event; that is, at the
moment and location at which the implementation detects the event. This point is
asynchronous with respect to the application, so you should use a non-blocking synchronization
mechanism to control access to any variables shared between the application and the callback
handler. In the example code, such as for buffer queues, we have either omitted this
synchronization or used blocking synchronization in the interest of simplicity. However, proper
non-blocking synchronization is critical for any production code.
</p>
|
||||
|
||||
<p>
|
||||
Callback handlers are called from internal non-application threads that are not attached to the
|
||||
Android runtime, so they are ineligible to use JNI. Because these internal threads are
|
||||
critical to
|
||||
the integrity of the OpenSL ES implementation, a callback handler should also not block
|
||||
or perform
|
||||
excessive work.
|
||||
</p>
|
||||
|
||||
<p>
|
||||
If your callback handler needs to use JNI or execute work that is not proportional to the
|
||||
callback, the handler should instead post an event for another thread to process. Examples of
|
||||
acceptable callback workload include rendering and enqueuing the next output buffer
|
||||
(for an AudioPlayer), processing the just-filled input buffer and enqueueing the next
|
||||
empty buffer
|
||||
(for an AudioRecorder), or simple APIs such as most of the <em>Get</em> family. See the
|
||||
<a href="#perform">Performance</a> section below regarding the workload.
|
||||
</p>
|
||||
|
||||
<p>
|
||||
Note that the converse is safe: an Android application thread that has entered JNI
|
||||
is allowed to
|
||||
directly call OpenSL ES APIs, including those that block. However, blocking calls are not
|
||||
recommended from the main thread, as they may result in
|
||||
<em>Application Not Responding</em> (ANR).
|
||||
</p>

<p>
The determination regarding the thread that calls a callback handler is largely left up to the
implementation. The reason for this flexibility is to permit future optimizations, especially on
multi-core devices.
</p>

<p>
The thread on which the callback handler runs is not guaranteed to have the same identity across
different calls. Therefore, do not rely on the {@code pthread_t} returned by
{@code pthread_self()} or the {@code pid_t} returned by {@code gettid()} to be consistent across
calls. For the same reason, do not use thread-local storage (TLS) APIs such as
{@code pthread_setspecific()} and {@code pthread_getspecific()} from a callback.
</p>

<p>
The implementation guarantees that concurrent callbacks of the same kind, for the same object,
do not occur. However, concurrent callbacks of different kinds for the same object are possible
on different threads.
</p>

<h2 id="perform">Performance</h2>

<p>
As OpenSL ES is a native C API, non-runtime application threads that call OpenSL ES have no
runtime-related overhead such as garbage collection pauses. Aside from this, and with one
exception described below, using OpenSL ES brings no additional performance benefit. In
particular, it does not guarantee enhancements such as lower audio latency or higher scheduling
priority beyond what the platform generally provides. On the other hand, as the Android platform
and specific device implementations continue to evolve, an OpenSL ES application can expect to
benefit from any future system performance improvements.
</p>

<p>
One such evolution is support for reduced
<a href="{@docRoot}ndk/guides/audio/output-latency.html">audio output latency</a>.
The underpinnings for reduced output latency were first included in Android 4.1 (API level 16),
and then continued progress occurred in Android 4.2 (API level 17). These improvements are
available via OpenSL ES for device implementations that claim feature
{@code android.hardware.audio.low_latency}. If the device doesn't claim this feature but
supports Android 2.3 (API level 9) or later, then you can still use the OpenSL ES APIs but the
output latency may be higher. The lower output latency path is used only if the application
requests a buffer size and sample rate that are compatible with the device's native output
configuration. These parameters are device-specific and should be obtained as described below.
</p>

<p>
Beginning with Android 4.2 (API level 17), an application can query for the platform native or
optimal output sample rate and buffer size for the device's primary output stream. When combined
with the feature test just mentioned, an app can now configure itself appropriately for lower
latency output on devices that claim support.
</p>

<p>
For Android 4.2 (API level 17) and earlier, a buffer count of two or more is required for lower
latency. Beginning with Android 4.3 (API level 18), a buffer count of one is sufficient for
lower latency.
</p>

<p>
All OpenSL ES interfaces for output effects preclude the lower latency path.
</p>

<p>
The recommended sequence is as follows:
</p>

<ol>
  <li>Check for API level 9 or higher to confirm the use of OpenSL ES.</li>
  <li>Check for the {@code android.hardware.audio.low_latency} feature using code such as this:
<pre>import android.content.pm.PackageManager;
...
PackageManager pm = getContext().getPackageManager();
boolean claimsFeature = pm.hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);
</pre></li>
  <li>Check for API level 17 or higher to confirm the use of
  {@code android.media.AudioManager.getProperty()}.</li>
  <li>Get the native or optimal output sample rate and buffer size for this device's primary
  output stream using code such as this:
<pre>import android.media.AudioManager;
...
AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
</pre>
  Note that {@code sampleRate} and {@code framesPerBuffer} are <em>strings</em>. First check for
  null and then convert to int using {@code Integer.parseInt()}.</li>
  <li>Now use OpenSL ES to create an AudioPlayer with the PCM buffer queue data locator.</li>
</ol>
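<p>
The null check and conversion from step 4 can be wrapped in a small helper. The helper name and
fallback values here are our own for illustration; they are not part of any platform API.
</p>

```java
// Hypothetical helper: AudioManager.getProperty() returns a String or null,
// so guard the conversion before handing the values to native code.
class AudioPropertyParser {
    static int parseAudioProperty(String value, int fallback) {
        if (value == null) return fallback;     // property not supported on this device
        try {
            return Integer.parseInt(value.trim());
        } catch (NumberFormatException e) {
            return fallback;                    // malformed value: fall back to a safe default
        }
    }
}
```

<p>
For example, {@code parseAudioProperty(am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE),
44100)} yields a usable sample rate even on devices where the property is absent.
</p>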

<p class="note"><strong>Note: </strong>
You can use the
<a class="external-link"
   href="https://play.google.com/store/apps/details?id=com.levien.audiobuffersize">
Audio Buffer Size</a>
test app to determine the native buffer size and sample rate for OpenSL ES audio
applications on your audio device. You can also visit GitHub to view <a class="external-link"
href="https://github.com/gkasten/high-performance-audio/tree/master/audio-buffer-size">
audio-buffer-size</a> samples.
</p>

<p>
The number of lower latency audio players is limited. If your application requires more than a
few audio sources, consider mixing your audio at the application level. Be sure to destroy your
audio players when your activity is paused, as they are a global resource shared with other apps.
</p>

<p>
To avoid audible glitches, the buffer queue callback handler must execute within a small and
predictable time window. This typically implies no unbounded blocking on mutexes, conditions,
or I/O operations. Instead consider <em>try locks</em>, locks and waits with timeouts, and
<a class="external-link"
href="https://source.android.com/devices/audio/avoiding_pi.html#nonBlockingAlgorithms">
non-blocking algorithms</a>.
</p>

<p>
The computation required to render the next buffer (for an AudioPlayer) or consume the previous
buffer (for an AudioRecorder) should take approximately the same amount of time for each
callback. Avoid algorithms that execute in a non-deterministic amount of time or are
<em>bursty</em> in their computations. A callback computation is bursty if the CPU time spent in
any given callback is significantly larger than the average. In summary, the ideal is for the
CPU execution time of the handler to have variance near zero, and for the handler not to block
for unbounded times.
</p>

<p>
Lower latency audio is possible for these outputs only:
</p>

<ul>
  <li>On-device speakers.</li>
  <li>Wired headphones.</li>
  <li>Wired headsets.</li>
  <li>Line out.</li>
  <li><a class="external-link" href="https://source.android.com/devices/audio/usb.html">
  USB digital audio</a>.</li>
</ul>

<p>
On some devices, speaker latency is higher than other paths due to digital signal processing for
speaker correction and protection.
</p>

<p>
As of API level 21,
<a href="{@docRoot}ndk/guides/audio/input-latency.html">lower latency audio input</a>
is supported on select devices. To take advantage of this feature, first confirm that lower
latency output is available as described above. The capability for lower latency output is a
prerequisite for the lower latency input feature. Then, create an AudioRecorder with the same
sample rate and buffer size as would be used for output. OpenSL ES interfaces for input effects
preclude the lower latency path. The record preset
{@code SL_ANDROID_RECORDING_PRESET_VOICE_RECOGNITION} must be used for lower latency; this
preset disables device-specific digital signal processing that may add latency to the input
path. For more information on record presets, see the
<a href="#configuration-interface">Android configuration interface</a> section above.
</p>

<p>
For simultaneous input and output, separate buffer queue completion handlers are used for each
side. There is no guarantee of the relative order of these callbacks, or the synchronization of
the audio clocks, even when both sides use the same sample rate. Your application should buffer
the data with proper buffer synchronization.
</p>

<p>
One consequence of potentially independent audio clocks is the need for asynchronous sample rate
conversion. A simple (though not ideal for audio quality) technique for asynchronous sample rate
conversion is to duplicate or drop samples as needed near a zero-crossing point. More
sophisticated conversions are possible.
</p>
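<p>
The duplicate-or-drop technique mentioned above can be sketched as follows. This is an
illustration only, not the platform's resampler: it shrinks a block by one sample, removing the
sample closest to zero so the edit is least audible. Duplicating a sample for the stretch case
is symmetric.
</p>

```java
// Naive asynchronous rate adjustment: drop the quietest sample in the block,
// which approximates dropping near a zero crossing.
class NaiveRateAdjust {
    static short[] dropNearZeroCrossing(short[] in) {
        int quietest = 0;
        for (int i = 1; i < in.length; i++) {
            if (Math.abs(in[i]) < Math.abs(in[quietest])) quietest = i;
        }
        short[] out = new short[in.length - 1];
        // copy everything except the sample at index 'quietest'
        System.arraycopy(in, 0, out, 0, quietest);
        System.arraycopy(in, quietest + 1, out, quietest, in.length - quietest - 1);
        return out;
    }
}
```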

<h2 id="sandp">Security and Permissions</h2>

<p>
Security in Android is enforced at the process level. Code written in the Java programming
language cannot do anything more than native code can, nor can native code do anything more than
Java programming language code can. The only differences between them are the available APIs.
</p>

<p>
Applications using OpenSL ES must request the permissions that they would need for similar
non-native APIs. For example, if your application records audio, then it needs the
{@code android.permission.RECORD_AUDIO} permission. Applications that use audio effects need
{@code android.permission.MODIFY_AUDIO_SETTINGS}. Applications that play network URI resources
need {@code android.permission.INTERNET}. See
<a href="https://developer.android.com/training/permissions/index.html">Working with System
Permissions</a> for more information.
</p>

<p>
Depending on the platform version and implementation, media content parsers and software codecs
may run within the context of the Android application that calls OpenSL ES (hardware codecs are
abstracted but are device-dependent). Malformed content designed to exploit parser and codec
vulnerabilities is a known attack vector. We recommend that you play media only from trustworthy
sources or that you partition your application such that code that handles media from
untrustworthy sources runs in a relatively <em>sandboxed</em> environment. For example, you
could process media from untrustworthy sources in a separate process. Though both processes
would still run under the same UID, this separation does make an attack more difficult.
</p>

docs/html/ndk/guides/audio/output-latency.jd (new file, 310 lines)
@@ -0,0 +1,310 @@
page.title=Audio Output Latency
@jd:body

<div id="qv-wrapper">
  <div id="qv">
    <h2>On this page</h2>

    <ol>
      <li><a href="#prereq">Prerequisites</a></li>
      <li><a href="#low-lat-track">Obtain a Low-Latency Track</a></li>
      <li><a href="#buffer-size">Use the Optimal Buffer Size When Enqueuing Audio Data</a></li>
      <li><a href="#warmup-lat">Avoid Warmup Latency</a></li>
    </ol>

    <h2>Also read</h2>

    <ol>
      <li><a href="https://source.android.com/devices/audio/latency_app.html" class="external-link">
      Audio Latency for App Developers</a></li>
      <li><a href="https://source.android.com/devices/audio/latency_contrib.html" class="external-link">
      Contributors to Audio Latency</a></li>
      <li><a href="https://source.android.com/devices/audio/latency_measure.html" class="external-link">
      Measuring Audio Latency</a></li>
      <li><a href="https://source.android.com/devices/audio/warmup.html" class="external-link">
      Audio Warmup</a></li>
      <li><a href="https://en.wikipedia.org/wiki/Latency_%28audio%29" class="external-link">
      Latency (audio)</a></li>
      <li><a href="https://en.wikipedia.org/wiki/Round-trip_delay_time" class="external-link">
      Round-trip delay time</a></li>
    </ol>
  </div>
</div>

<a href="https://www.youtube.com/watch?v=PnDK17zP9BI" class="notice-developers-video">
<div>
  <h3>Video</h3>
  <p>Audio latency: buffer sizes</p>
</div>
</a>

<a href="https://www.youtube.com/watch?v=92fgcUNCHic" class="notice-developers-video">
<div>
  <h3>Video</h3>
  <p>Building great multi-media experiences on Android</p>
</div>
</a>

<p>This page describes how to develop your audio app for low-latency output and how to avoid
warmup latency.</p>

<h2 id="prereq">Prerequisites</h2>

<p>Low-latency audio is currently supported only when using Android's implementation of the
OpenSL ES™ API specification, and the Android NDK:
</p>

<ol>
  <li>Download and install the <a href="{@docRoot}tools/sdk/ndk/index.html">Android NDK</a>.</li>
  <li>Read the <a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">OpenSL ES
  documentation</a>.</li>
</ol>

<h2 id="low-lat-track">Obtain a Low-Latency Track</h2>

<p>Latency is the time it takes for a signal to travel through a system. These are the common
types of latency related to audio apps:</p>

<ul>
  <li><strong>Audio output latency</strong> is the time between an audio sample being generated
  by an app and the sample being played through the headphone jack or built-in speaker.</li>

  <li><strong>Audio input latency</strong> is the time between an audio signal being received by
  a device’s audio input, such as the microphone, and that same audio data being available to an
  app.</li>

  <li><strong>Round-trip latency</strong> is the sum of input latency, app processing time, and
  output latency.</li>

  <li><strong>Touch latency</strong> is the time between a user touching the screen and that
  touch event being received by an app.</li>
</ul>

<p>It is difficult to test audio output latency in isolation since it requires knowing exactly
when the first sample is sent into the audio path (although this can be done using a
<a href="https://source.android.com/devices/audio/testing_circuit.html" class="external-link">
light testing circuit</a> and an oscilloscope). If you know the round-trip audio latency, you
can use the rough rule of thumb: <strong>audio output latency is half the round-trip audio
latency over paths without signal processing</strong>.
</p>

<p>To obtain the lowest latency, you must supply audio data that matches the device's optimal
sample rate and buffer size. For more information, see
<a href="https://source.android.com/devices/audio/latency_design.html" class="external-link">
Design For Reduced Latency</a>.</p>

<h3>Obtain the optimal sample rate</h3>

<p>In Java, you can obtain the optimal sample rate from AudioManager as shown in the following
code example:</p>

<pre>
AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
String frameRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
// Use a default value if the property is not found; parsing null would throw
int frameRateInt = (frameRate == null) ? 44100 : Integer.parseInt(frameRate);
</pre>

<p class="note">
<strong>Note:</strong> The sample rate refers to the rate of each stream. If your source audio
has two channels (stereo), then you will have one stream playing a pair of samples (frame) at
<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_SAMPLE_RATE">
PROPERTY_OUTPUT_SAMPLE_RATE</a>.
</p>

<h3>Use the optimal sample rate when creating your audio player</h3>

<p>Once you have the optimal output sample rate, you can supply it when creating your player
using OpenSL ES:</p>

<pre>
// create buffer queue audio player
void Java_com_example_audio_generatetone_MainActivity_createBufferQueueAudioPlayer
  (JNIEnv* env, jclass clazz, jint sampleRate, jint framesPerBuffer)
{
   ...
   // specify the audio source format
   SLDataFormat_PCM format_pcm;
   format_pcm.numChannels = 2;
   format_pcm.samplesPerSec = (SLuint32) sampleRate * 1000;
   ...
}
</pre>

<p class="note">
<strong>Note:</strong> {@code samplesPerSec} refers to the <em>sample rate per channel in
millihertz</em> (1 Hz = 1000 mHz).
</p>

<h3>Avoid adding output interfaces that involve signal processing</h3>

<p>Only these interfaces are supported by the fast mixer:</p>

<ul>
  <li>SL_IID_ANDROIDSIMPLEBUFFERQUEUE</li>
  <li>SL_IID_VOLUME</li>
  <li>SL_IID_MUTESOLO</li>
</ul>

<p>These interfaces are not allowed because they involve signal processing and will cause
your request for a fast track to be rejected:</p>

<ul>
  <li>SL_IID_BASSBOOST</li>
  <li>SL_IID_EFFECTSEND</li>
  <li>SL_IID_ENVIRONMENTALREVERB</li>
  <li>SL_IID_EQUALIZER</li>
  <li>SL_IID_PLAYBACKRATE</li>
  <li>SL_IID_PRESETREVERB</li>
  <li>SL_IID_VIRTUALIZER</li>
  <li>SL_IID_ANDROIDEFFECT</li>
  <li>SL_IID_ANDROIDEFFECTSEND</li>
</ul>

<p>When you create your player, make sure you only add <em>fast</em> interfaces, as shown in
the following example:</p>

<pre>
const SLInterfaceID interface_ids[2] = { SL_IID_ANDROIDSIMPLEBUFFERQUEUE, SL_IID_VOLUME };
</pre>

<h3>Verify you're using a low-latency track</h3>

<p>Complete these steps to verify that you have successfully obtained a low-latency track:</p>

<ol>
  <li>Launch your app and then run the following command:
<pre>
adb shell ps | grep your_app_name
</pre></li>

  <li>Make a note of your app's process ID.</li>

  <li>Now, play some audio from your app. You have approximately three seconds to run the
  following command from the terminal:
<pre>
adb shell dumpsys media.audio_flinger
</pre></li>

  <li>Scan for your process ID. If you see an <em>F</em> in the <em>Name</em> column, it's on a
  low-latency track (the F stands for <em>fast track</em>).</li>
</ol>

<h3>Measure round-trip latency</h3>

<p>You can measure round-trip audio latency by creating an app that generates an audio signal,
listens for that signal, and measures the time between sending it and receiving it.
Alternatively, you can install this
<a href="https://play.google.com/store/apps/details?id=org.drrickorang.loopback" class="external-link">
latency testing app</a>, which performs a round-trip latency test using the
<a href="https://source.android.com/devices/audio/latency_measure.html#larsenTest" class="external-link">
Larsen test</a>. You can also
<a href="https://github.com/gkasten/drrickorang/tree/master/LoopbackApp" class="external-link">
view the source code</a> for the latency testing app.</p>

<p>Since the lowest latency is achieved over audio paths with minimal signal processing, you may
also want to use an
<a href="https://source.android.com/devices/audio/latency_measure.html#loopback" class="external-link">
Audio Loopback Dongle</a>, which allows the test to be run over the headset connector.</p>

<p>The lowest possible round-trip audio latency varies greatly depending on device model and
Android build. You can measure it yourself using the latency testing app and loopback dongle.
When creating apps for <em>Nexus devices</em>, you can also use the
<a href="https://source.android.com/devices/audio/latency_measurements.html" class="external-link">
published measurements</a>.</p>

<p>You can also get a rough idea of audio performance by testing whether the device reports
support for the
<a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_LOW_LATENCY">
low_latency</a> and
<a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_PRO">
pro</a> hardware features.</p>

<h3>Review the CDD and audio latency</h3>

<p>The Android Compatibility Definition Document (CDD) enumerates the hardware and software
requirements of a compatible Android device.
See <a href="https://source.android.com/compatibility/" class="external-link">
Android Compatibility</a> for more information on the overall compatibility program, and
<a href="https://static.googleusercontent.com/media/source.android.com/en//compatibility/android-cdd.pdf" class="external-link">
CDD</a> for the actual CDD document.</p>

<p>In the CDD, round-trip latency is specified as 20 ms or lower (even though musicians
generally require 10 ms), because important use cases become feasible at 20 ms.</p>

<p>There is currently no API to determine audio latency over any path on an Android device at
runtime. You can, however, use the following hardware feature flags to find out whether the
device makes any guarantees for latency:</p>

<ul>
  <li><a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_LOW_LATENCY">
  android.hardware.audio.low_latency</a> indicates a continuous output latency of 45 ms or
  less.</li>

  <li><a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_PRO">
  android.hardware.audio.pro</a> indicates a continuous round-trip latency of 20 ms or
  less.</li>
</ul>

<p>The criteria for reporting these flags are defined in the CDD in sections <em>5.6 Audio
Latency</em> and <em>5.10 Professional Audio</em>.</p>

<p>Here’s how to check for these features in Java:</p>

<pre>
boolean hasLowLatencyFeature =
    getPackageManager().hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);

boolean hasProFeature =
    getPackageManager().hasSystemFeature(PackageManager.FEATURE_AUDIO_PRO);
</pre>

<p>Regarding the relationship of audio features, the {@code android.hardware.audio.low_latency}
feature is a prerequisite for {@code android.hardware.audio.pro}. A device can implement
{@code android.hardware.audio.low_latency} and not {@code android.hardware.audio.pro}, but not
vice-versa.</p>

<h2 id="buffer-size">Use the Optimal Buffer Size When Enqueuing Audio Data</h2>

<p>You can obtain the optimal buffer size in a similar way to the optimal sample rate, using the
AudioManager API:</p>

<pre>
AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
// Use a default value if the property is not found; parsing null would throw
int framesPerBufferInt = (framesPerBuffer == null) ? 256 : Integer.parseInt(framesPerBuffer);
</pre>

<p>The
<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_FRAMES_PER_BUFFER">
PROPERTY_OUTPUT_FRAMES_PER_BUFFER</a> property indicates the number of audio frames that the HAL
(Hardware Abstraction Layer) buffer can hold. You should construct your audio buffers so that
they contain an exact multiple of this number. If you use the correct number of audio frames,
your callbacks occur at regular intervals, which reduces jitter.</p>
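<p>
One way to honor the exact-multiple rule is to round your desired buffer length up to the
nearest multiple of the HAL value. The helper below is our own illustration, not a platform API.
</p>

```java
// Hypothetical helper: rounds a desired buffer length up to an exact multiple
// of the HAL buffer size reported by PROPERTY_OUTPUT_FRAMES_PER_BUFFER.
class BufferSizing {
    static int roundUpToHalMultiple(int desiredFrames, int halFramesPerBuffer) {
        // ceiling division, then back to frames; always at least one HAL buffer
        int buffers = (desiredFrames + halFramesPerBuffer - 1) / halFramesPerBuffer;
        return Math.max(1, buffers) * halFramesPerBuffer;
    }
}
```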

<p>It is important to use the API to determine buffer size rather than using a hardcoded value,
because HAL buffer sizes differ across devices and across Android builds.</p>

<h2 id="warmup-lat">Avoid Warmup Latency</h2>

<p>When you enqueue audio data for the first time, it takes a small, but still significant,
amount of time for the device audio circuit to warm up. To avoid this warmup latency, you should
enqueue buffers of audio data containing silence, as shown in the following code example:</p>

<pre>
#define CHANNELS 1
static short* silenceBuffer;
int numSamples = frames * CHANNELS;
silenceBuffer = malloc(sizeof(*silenceBuffer) * numSamples);
for (int i = 0; i < numSamples; i++) {
    silenceBuffer[i] = 0;
}
</pre>

<p>At the point when audio should be produced, you can switch to enqueuing buffers containing
real audio data.</p>

docs/html/ndk/guides/audio/sample-rates.jd (new file, 151 lines)
@@ -0,0 +1,151 @@
page.title=Sample Rates
@jd:body

<div id="qv-wrapper">
  <div id="qv">
    <h2>On this page</h2>

    <ol>
      <li><a href="#best">Best Practices for Sampling and Resampling</a></li>
      <li><a href="#info">For More Information</a></li>
    </ol>
  </div>
</div>

<a class="notice-developers-video" href="https://www.youtube.com/watch?v=6Dl6BdrA-sQ">
<div>
  <h3>Video</h3>
  <p>Sample Rates: Why Can't We All Just Agree?</p>
</div>
</a>

<p>As of Android 5.0 (Lollipop), the audio resamplers are entirely based on FIR filters derived
from a Kaiser windowed-sinc function. The Kaiser windowed-sinc offers the following
properties:</p>
<ul>
  <li>It is straightforward to calculate for its design parameters (stopband ripple, transition
  bandwidth, cutoff frequency, filter length).</li>
  <li>It is nearly optimal for reduction of stopband energy compared to overall energy.</li>
</ul>
<p>See P.P. Vaidyanathan, <a class="external-link"
href="https://drive.google.com/file/d/0B7tBh7YQV0DGak9peDhwaUhqY2c/view">
<i>Multirate Systems and Filter Banks</i></a>, p. 50 for discussions of the Kaiser Window and
its optimality and relationship to Prolate Spheroidal Windows.</p>

<p>The design parameters are automatically computed based on internal quality determination and
the sampling ratios desired. Based on the design parameters, the windowed-sinc filter is
generated. For music use, the resampler for 44.1 to 48 kHz and vice versa is generated at a
higher quality than for arbitrary frequency conversion.</p>

<p>The audio resamplers provide increased quality, as well as speed to achieve that quality. But
resamplers can introduce small amounts of passband ripple and aliasing harmonic noise, and they
can cause some high frequency loss in the transition band, so avoid using them
unnecessarily.</p>

<h2 id="best">Best Practices for Sampling and Resampling</h2>
<p>This section describes some best practices to help you avoid problems with sampling rates.</p>

<h3>Choose the sampling rate to fit the device</h3>

<p>In general, it is best to choose the sampling rate to fit the device, typically 44.1 kHz or
48 kHz. Use of a sample rate greater than 48 kHz will typically result in decreased quality
because a resampler must be used to play back the file.</p>

<h3>Use simple resampling ratios (fixed versus interpolated polyphases)</h3>

<p>The resampler operates in one of the following modes:</p>
<ul>
  <li>Fixed polyphase mode. The filter coefficients for each polyphase are precomputed.</li>
  <li>Interpolated polyphase mode. The filter coefficients for each polyphase must be
  interpolated from the nearest two precomputed polyphases.</li>
</ul>
<p>The resampler is fastest in fixed polyphase mode, when the ratio of input rate over output
rate L/M (taking out the greatest common divisor) has M less than 256. For example, for 44,100
to 48,000 conversion, L = 147, M = 160.</p>
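<p>
The rule of thumb above can be sketched in a few lines: reduce the two rates by their greatest
common divisor and check whether M (the reduced output term) stays below 256. The class and
method names are ours, for illustration only.
</p>

```java
// Check whether a rate pair keeps the resampler in fast fixed polyphase mode.
class ResampleRatio {
    static int gcd(int a, int b) { return b == 0 ? a : gcd(b, a % b); }

    static boolean usesFixedPolyphase(int inRate, int outRate) {
        int m = outRate / gcd(inRate, outRate); // M from the reduced ratio L/M
        return m < 256;
    }
}
```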

<p>In fixed polyphase mode, the sampling rate is locked for as many samples converted and does
not change. In interpolated polyphase mode, the sampling rate is approximate. The drift is
generally on the order of one sample over a few hours of playback on a 48-kHz device. This is
not usually a concern because approximation error is much less than frequency error of internal
quartz oscillators, thermal drift, or jitter (typically tens of ppm).</p>

<p>Choose simple-ratio sampling rates such as 24 kHz (1:2) and 32 kHz (2:3) when playing back on
a 48-kHz device, even though other sampling rates and ratios may be permitted through
AudioTrack.</p>

<h3>Use upsampling rather than downsampling when changing sample rates</h3>

<p>Sampling rates can be changed on the fly. The granularity of such change is based on the
internal buffering (typically a few hundred samples), not on a sample-by-sample basis. This can
be used for effects.</p>

<p>Do not dynamically change sampling rates when downsampling. When changing sample rates after
an audio track is created, differences of around 5 to 10 percent from the original rate may
trigger a filter recomputation when downsampling (to properly suppress aliasing). This can
consume computing resources and may cause an audible click if the filter is replaced in real
time.</p>

<h3>Limit downsampling to no more than 6:1</h3>

<p>Downsampling is typically triggered by hardware device requirements. When the sample rate
converter is used for downsampling, try to limit the downsampling ratio to no more than 6:1 for
good aliasing suppression (for example, no greater downsample than 48,000 to 8,000). The filter
lengths adjust to match the downsampling ratio, but you sacrifice more transition bandwidth at
higher downsampling ratios to avoid excessively increasing the filter length. There are no
similar aliasing concerns for upsampling. Note that some parts of the audio pipeline may prevent
downsampling greater than 2:1.</p>
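<p>
The guideline above amounts to a one-line check, shown here as an illustrative helper (the
names are ours, not a platform API):
</p>

```java
// Accept downsampling ratios up to 6:1, per the aliasing-suppression guideline.
class DownsampleCheck {
    static boolean ratioOk(int inRate, int outRate) {
        return inRate <= 6 * outRate;
    }
}
```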

<h3 id="latency">If you are concerned about latency, do not resample</h3>

<p>Resampling prevents the track from being placed in the FastMixer path, which means that
significantly higher latency occurs due to the additional, larger buffer in the ordinary Mixer
path. Furthermore, there is an implicit delay from the filter length of the resampler, though
this is typically on the order of one millisecond or less, which is not as large as the
additional buffering for the ordinary Mixer path (typically 20 milliseconds).</p>

<h2 id="info">For More Information</h2>
<p>This section lists some additional resources about sampling and resampling.</p>

<h3>Sample rates</h3>

<p>
<a href="http://en.wikipedia.org/wiki/Sampling_%28signal_processing%29" class="external-link">
Sampling (signal processing)</a> at Wikipedia.</p>
|
||||
|
||||
<h3>Resampling</h3>
|
||||
|
||||
<p><a href="http://en.wikipedia.org/wiki/Sample_rate_conversion" class="external-link" >
|
||||
Sample rate conversion</a> at Wikipedia.</p>
|
||||
|
||||
<p><a href="http://source.android.com/devices/audio/src.html" class="external-link" >
|
||||
Sample Rate Conversion</a> at source.android.com.</p>
|
||||
|
||||
<h3>The high bit-depth and high kHz controversy</h3>
|
||||
|
||||
<p><a href="http://people.xiph.org/~xiphmont/demo/neil-young.html" class="external-link" >
|
||||
24/192 Music Downloads ... and why they make no sense</a>
|
||||
by Christopher "Monty" Montgomery of Xiph.Org.</p>
|
||||
|
||||
<p><a href="https://www.youtube.com/watch?v=cIQ9IXSUzuM" class="external-link" >
|
||||
D/A and A/D | Digital Show and Tell</a>
|
||||
video by Christopher "Monty" Montgomery of Xiph.Org.</p>
|
||||
|
||||
<p><a href="http://www.trustmeimascientist.com/2013/02/04/the-science-of-sample-rates-when-higher-is-better-and-when-it-isnt/" class="external-link">
|
||||
The Science of Sample Rates (When Higher Is Better - And When It Isn't)</a>.</p>
|
||||
|
||||
<p><a href="http://www.image-line.com/support/FLHelp/html/app_audio.htm" class="external-link" >
|
||||
Audio Myths & DAW Wars</a></p>
|
||||
|
||||
<p><a href="http://forums.stevehoffman.tv/threads/192khz-24bit-vs-96khz-24bit-debate-interesting-revelation.317660/" class="external-link">
|
||||
192kHz/24bit vs. 96kHz/24bit "debate"- Interesting revelation</a></p>
|
||||
@@ -63,13 +63,23 @@
</ul>
</li>

<li class="nav-section">
<div class="nav-section-header"><a href="<?cs var:toroot ?>ndk/guides/audio/index.html">
<span class="en">Audio</span></a></div>
<ul>
<li><a href="<?cs var:toroot ?>ndk/guides/audio/basics.html">Basics</a></li>
<li><a href="<?cs var:toroot ?>ndk/guides/audio/opensl-for-android.html">OpenSL ES for
Android</a></li>
<li><a href="<?cs var:toroot ?>ndk/guides/audio/input-latency.html">Audio Input
Latency</a></li>
<li><a href="<?cs var:toroot ?>ndk/guides/audio/output-latency.html">Audio Output
Latency</a></li>
<li><a href="<?cs var:toroot ?>ndk/guides/audio/floating-point.html">Floating-Point
Audio</a></li>
<li><a href="<?cs var:toroot ?>ndk/guides/audio/sample-rates.html">Sample Rates
</a></li>
<li><a href="<?cs var:toroot ?>ndk/guides/audio/opensl-prog-notes.html">OpenSL ES Programming Notes
</a></li>
</ul>
</li>