From adff7b54478a5f774165a161547cd09caeec9e2f Mon Sep 17 00:00:00 2001
From: Eric Schmidt
+ The Android Runtime (ART) and Dalvik virtual machine use
+ paging
+ and memory-mapping
+ (mmapping) to manage memory. This means that any memory an app
+ modifies—whether by allocating
+ new objects or touching mmapped pages—remains resident in RAM and
+ cannot be paged out. The only way to release memory from an app is to release
+ object references that the app holds, making the memory available to the
+ garbage collector.
+ There is one exception: any files
+ mmapped in without modification, such as code,
+ can be paged out of RAM if the system needs that memory elsewhere.
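+ Releasing the last strong reference is what actually makes memory
+ available to the collector. The effect can be sketched in plain Java
+ (this is not Android-specific; the class and variable names are
+ illustrative):

```java
import java.lang.ref.WeakReference;

public class ReleaseDemo {
    public static void main(String[] args) {
        // A strong reference keeps the allocation resident.
        byte[] data = new byte[1024 * 1024];
        WeakReference<byte[]> ref = new WeakReference<>(data);
        System.out.println(ref.get() != null);

        // Release the only strong reference; the object is now
        // eligible for garbage collection.
        data = null;

        // Nudge the collector (bounded so the demo always terminates).
        for (int i = 0; i < 100 && ref.get() != null; i++) {
            System.gc();
        }
        System.out.println(ref.get() == null ? "reclaimed" : "still held");
    }
}
```

+ The weak reference only observes the object; once the strong
+ reference is nulled, nothing prevents the collector from reclaiming it.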
+
+ This page explains how Android manages app processes and memory
+ allocation. For more information about how to manage memory more efficiently
+ in your app, see
+ Manage Your App's Memory.
+
+ A managed memory environment, like the ART or Dalvik virtual machine,
+ keeps track of each memory allocation. Once it determines
+ that a piece of memory is no longer being used by the program,
+ it frees it back to the heap, without any intervention from the programmer.
+ The mechanism for reclaiming unused memory
+ within a managed memory environment
+ is known as garbage collection. Garbage collection has two goals:
+ find data objects in a program that cannot be accessed in the future; and
+ reclaim the resources used by those objects.
+
+ Android’s memory heap is a generational one, meaning that there are
+ different buckets of allocations that it tracks,
+ based on the expected life and size of an object being allocated.
+ For example, recently allocated objects belong in the Young generation.
+ When an object stays active long enough, it can be promoted
+ to an older generation, followed by a permanent generation.
+
+ Each heap generation has its own dedicated upper limit on the amount
+ of memory that objects there can occupy. Any time a generation starts
+ to fill up, the system executes a garbage collection
+ event in an attempt to free up memory. The duration of the garbage collection
+ depends on which generation of objects it's collecting
+ and how many active objects are in each generation.
+
+ Even though garbage collection can be quite fast, it can still
+ affect your app's performance. You don’t generally control
+ when a garbage collection event occurs from within your code.
+ The system has a running set of criteria for determining when to perform
+ garbage collection. When the criteria are satisfied,
+ the system stops executing the process and begins garbage collection. If
+ garbage collection occurs in the middle of an intensive processing loop
+ like an animation or during music playback, it can increase processing time.
+ This increase can potentially push code execution in your app past the
+ recommended 16ms threshold for efficient and smooth frame rendering.
+
+ Additionally, your code flow may perform kinds of work that
+ force garbage collection events to occur
+ more often or make them last longer than normal.
+ For example, if you allocate multiple objects in the
+ innermost part of a for-loop during each frame of an alpha
+ blending animation, you might pollute your memory heap with a
+ lot of objects.
+ In that circumstance, the garbage collector executes multiple garbage
+ collection events and can degrade the performance of your app.
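+ The pattern above can be avoided by hoisting allocations out of the
+ per-frame loop. A minimal sketch (the blending math is invented; the
+ point is the allocation pattern):

```java
public class AllocationChurn {
    // Allocates a new array every frame: each iteration leaves garbage
    // behind, which under a real workload triggers frequent collections.
    static int[] blendAllocating(int[] src, int frames) {
        int[] out = null;
        for (int f = 0; f < frames; f++) {
            out = new int[src.length];            // fresh garbage each frame
            for (int i = 0; i < src.length; i++) {
                out[i] = (src[i] * (f + 1)) / frames;
            }
        }
        return out;
    }

    // Reuses one scratch buffer across frames: no per-frame garbage.
    static int[] blendReusing(int[] src, int frames) {
        int[] out = new int[src.length];          // allocated once
        for (int f = 0; f < frames; f++) {
            for (int i = 0; i < src.length; i++) {
                out[i] = (src[i] * (f + 1)) / frames;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[] src = {10, 20, 30};
        int[] a = blendAllocating(src, 4);
        int[] b = blendReusing(src, 4);
        // Same result, far fewer allocations.
        System.out.println(java.util.Arrays.equals(a, b));
    }
}
```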
+
+ For more general information about garbage collection, see
+ Garbage collection.
+
+ In order to fit everything it needs in RAM,
+ Android tries to share RAM pages across processes. It
+ can do so in the following ways:
+ Each app process is forked from an existing process called Zygote,
+ which starts when the system boots and loads common framework code and
+ resources. As a result, most of the RAM pages allocated for framework
+ code and resources are shared across all app processes.
+ Most static data, such as Dalvik code, app resources, and native code,
+ is mmapped into a process. This allows that data to be shared across
+ processes and also allows it to be paged out when needed.
+
+ Due to the extensive use of shared memory, determining
+ how much memory your app is using requires
+ care. Techniques to properly determine your app's
+ memory use are discussed in
+ Investigating Your RAM Usage.
+
+ The Dalvik heap is constrained to a
+ single virtual memory range for each app process. This defines
+ the logical heap size, which can grow as it needs to
+ but only up to a limit that the system defines
+ for each app.
+
+ The logical size of the heap is not the same as
+ the amount of physical memory used by the heap.
+ When inspecting your app's heap, Android computes
+ a value called the Proportional Set Size (PSS),
+ which accounts for both dirty and clean pages
+ that are shared with other processes—but only in an
+ amount that's proportional to how many apps share
+ that RAM. This PSS total is what the system
+ considers to be your physical memory footprint.
+ For more information about PSS, see the
+ Investigating Your RAM Usage
+ guide.
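+ The proportional accounting can be illustrated with a toy
+ calculation. The sizes below are invented; real figures are reported
+ by the kernel, but the arithmetic is the same:

```java
public class PssDemo {
    // PSS = private memory plus each shared region divided by the
    // number of processes sharing it. All sizes in kilobytes.
    static long pssKb(long privateKb, long[] sharedKb, int[] sharers) {
        long pss = privateKb;
        for (int i = 0; i < sharedKb.length; i++) {
            pss += sharedKb[i] / sharers[i];
        }
        return pss;
    }

    public static void main(String[] args) {
        // 2,000 KB private, plus a 4,000 KB region shared by 4 processes
        // and a 1,000 KB region shared by 2 processes.
        long pss = pssKb(2000, new long[]{4000, 1000}, new int[]{4, 2});
        System.out.println(pss); // 2000 + 1000 + 500 = 3500
    }
}
```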
+
+ The Dalvik heap does not compact the logical
+ size of the heap, meaning that Android does not
+ defragment the heap to close up space. Android
+ can only shrink the logical heap size when there
+ is unused space at the end of the heap. However,
+ the system can still reduce physical memory used by the heap.
+ After garbage collection, Dalvik
+ walks the heap and finds unused pages, then returns
+ those pages to the kernel using madvise. So, paired
+ allocations and deallocations of large
+ chunks should result in reclaiming all (or nearly all)
+ the physical memory used. However,
+ reclaiming memory from small allocations can be much
+ less efficient because the page used
+ for a small allocation may still be shared with
+ something else that has not yet been freed.
+
+
+ To maintain a functional multi-tasking environment,
+ Android sets a hard limit on the heap size
+ for each app. The exact heap size limit varies
+ between devices based on how much RAM the device
+ has available overall. If your app has reached the
+ heap capacity and tries to allocate more
+ memory, it can receive an {@link java.lang.OutOfMemoryError}.
+
+ In some cases, you might want to query the
+ system to determine exactly how much heap space you
+ have available on the current device—for example, to
+ determine how much data is safe to keep in a
+ cache. You can query the system for this figure by calling
+ {@link android.app.ActivityManager#getMemoryClass()}.
+ This method returns an integer indicating the number of
+ megabytes available for your app's heap.
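+ On Android that figure comes from
+ {@link android.app.ActivityManager#getMemoryClass()}. The sketch below
+ uses {@code Runtime#maxMemory()} as a plain-JVM stand-in (an
+ assumption for the sake of a runnable example); the sizing idea is the
+ same either way:

```java
public class HeapBudget {
    // Size caches as a fraction of the heap limit rather than as a
    // fixed number of bytes, so the app adapts to the device.
    static long cacheBudgetBytes(long heapLimitBytes) {
        return heapLimitBytes / 8;   // e.g. one eighth of the heap
    }

    public static void main(String[] args) {
        // Stand-in for the per-app heap limit; on Android, use
        // ActivityManager#getMemoryClass() (megabytes) instead.
        long limit = Runtime.getRuntime().maxMemory();
        long budget = cacheBudgetBytes(limit);
        System.out.println(budget > 0 && budget <= limit);
    }
}
```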
+
+ When users switch between apps,
+ Android keeps apps that
+ are not foreground—that is, not visible to the user or running a
+ foreground service like music playback—
+ in a least-recently used (LRU) cache.
+ For example, when a user first launches an app,
+ a process is created for it; but when the user
+ leaves the app, that process does not quit.
+ The system keeps the process cached. If
+ the user later returns to the app, the system reuses the process, thereby
+ making app switching faster.
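+ The cache behaves like any least-recently-used structure: touching an
+ entry moves it to safety, and the stalest entry is evicted first. A
+ minimal sketch in plain Java (an illustration of the policy, not the
+ platform's implementation):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
    // A bounded map that evicts the least-recently *accessed* entry,
    // analogous to how the system evicts the least-recently used
    // cached process when memory runs low.
    static class LruCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxEntries;
        LruCache(int maxEntries) {
            super(16, 0.75f, true);            // true = access order
            this.maxEntries = maxEntries;
        }
        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxEntries;
        }
    }

    public static void main(String[] args) {
        LruCache<String, String> cache = new LruCache<>(2);
        cache.put("app A", "process A");
        cache.put("app B", "process B");
        cache.get("app A");                    // A is now most recently used
        cache.put("app C", "process C");       // evicts B, the LRU entry
        System.out.println(cache.containsKey("app A"));
        System.out.println(cache.containsKey("app B"));
    }
}
```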
+
+ If your app has a cached process and it retains memory
+ that it currently does not need,
+ then your app—even while the user is not using it—
+ affects the system's
+ overall performance. As the system runs low on memory,
+ it kills processes in the LRU cache
+ beginning with the process least recently used. The system also
+ accounts for processes that hold onto the most memory
+ and can terminate them to free up RAM.
+
+ Note: When the system begins killing processes in the
+ LRU cache, it primarily works bottom-up. However, it also considers
+ which processes consume more memory, because killing them
+ yields a larger memory gain.
+ The less memory you consume while in the LRU list overall,
+ the better your chances are
+ to remain in the list and be able to quickly resume.
+
+ For more information about how processes are cached while
+ not running in the foreground and how
+ Android decides which ones
+ can be killed, see the
+ Processes and Threads
+ guide.
-Random-access memory (RAM) is a valuable resource in any software development environment, but
-it's even more valuable on a mobile operating system where physical memory is often constrained.
-Although Android's Dalvik virtual machine performs routine garbage collection, this doesn't allow
-you to ignore when and where your app allocates and releases memory. In order for the garbage collector to reclaim memory from your app, you need to avoid
-introducing memory leaks (usually caused by holding onto object references in global members) and
-release any {@link java.lang.ref.Reference} objects at the appropriate time (as defined by
-lifecycle callbacks discussed further below). For most apps, the Dalvik garbage collector takes
-care of the rest: the system reclaims your memory allocations when the corresponding objects leave
-the scope of your app's active threads. This document explains how Android manages app processes and memory allocation, and how you can
-proactively reduce memory usage while developing for Android. For more information about general
-practices to clean up your resources when programming in Java, refer to other books or online
-documentation about managing resource references. If you’re looking for information about how to
-analyze your app’s memory once you’ve already built it, read Investigating Your RAM Usage. Android does not offer swap space for memory, but it does use paging and memory-mapping
-(mmapping) to manage memory. This means that any memory you modify—whether by allocating
-new objects or touching mmapped pages—remains resident in RAM and cannot be paged out.
-So the only way to completely release memory from your app is to release object references you may
-be holding, making the memory available to the garbage collector. That is with one exception:
-any files mmapped in without modification, such as code, can be paged out of RAM if the system
-wants to use that memory elsewhere. In order to fit everything it needs in RAM, Android tries to share RAM pages across processes. It
-can do so in the following ways: Due to the extensive use of shared memory, determining how much memory your app is using requires
-care. Techniques to properly determine your app's memory use are discussed in Investigating Your RAM Usage. Here are some facts about how Android allocates then reclaims memory from your app: To maintain a functional multi-tasking environment, Android sets a hard limit on the heap size
-for each app. The exact heap size limit varies between devices based on how much RAM the device
-has available overall. If your app has reached the heap capacity and tries to allocate more
-memory, it will receive an {@link java.lang.OutOfMemoryError}. In some cases, you might want to query the system to determine exactly how much heap space you
-have available on the current device—for example, to determine how much data is safe to keep in a
-cache. You can query the system for this figure by calling {@link
-android.app.ActivityManager#getMemoryClass()}. This returns an integer indicating the number of
-megabytes available for your app's heap. This is discussed further below, under
-Check how much memory you should use. Instead of using swap space when the user switches between apps, Android keeps processes that
-are not hosting a foreground ("user visible") app component in a least-recently used (LRU) cache.
-For example, when the user first launches an app, a process is created for it, but when the user
-leaves the app, that process does not quit. The system keeps the process cached, so if
-the user later returns to the app, the process is reused for faster app switching. If your app has a cached process and it retains memory that it currently does not need,
-then your app—even while the user is not using it—is constraining the system's
-overall performance. So, as the system runs low on memory, it may kill processes in the LRU cache
-beginning with the process least recently used, but also giving some consideration toward
-which processes are most memory intensive. To keep your process cached as long as possible, follow
-the advice in the following sections about when to release your references. More information about how processes are cached while not running in the foreground and how
-Android decides which ones
-can be killed is available in the Processes and Threads guide. You should consider RAM constraints throughout all phases of development, including during app
-design (before you begin development). There are many
-ways you can design and write code that lead to more efficient results, through aggregation of the
-same techniques applied over and over. You should apply the following techniques while designing and implementing your app to make it
-more memory efficient. If your app needs a service
-to perform work in the background, do not keep it running unless
-it's actively performing a job. Also be careful to never leak your service by failing to stop it
-when its work is done. When you start a service, the system prefers to always keep the process for that service
-running. This makes the process very expensive because the RAM used by the service can’t be used by
-anything else or paged out. This reduces the number of cached processes that the system can keep in
-the LRU cache, making app switching less efficient. It can even lead to thrashing in the system
-when memory is tight and the system can’t maintain enough processes to host all the services
-currently running. The best way to limit the lifespan of your service is to use an {@link
-android.app.IntentService}, which finishes
-itself as soon as it's done handling the intent that started it. For more information, read
-Running in a Background Service
-. Leaving a service running when it’s not needed is one of the worst memory-management
-mistakes an Android app can make. So don’t be greedy by keeping a service for your app
-running. Not only will it increase the risk of your app performing poorly due to RAM constraints,
-but users will discover such misbehaving apps and uninstall them. When the user navigates to a different app and your UI is no longer visible, you should
-release any resources that are used by only your UI. Releasing UI resources at this time can
-significantly increase the system's capacity for cached processes, which has a direct impact on the
-quality of the user experience. To be notified when the user exits your UI, implement the {@link
-android.content.ComponentCallbacks2#onTrimMemory onTrimMemory()} callback in your {@link
-android.app.Activity} classes. You should use this
-method to listen for the {@link android.content.ComponentCallbacks2#TRIM_MEMORY_UI_HIDDEN} level,
-which indicates your UI is now hidden from view and you should free resources that only your UI
-uses. Notice that your app receives the {@link android.content.ComponentCallbacks2#onTrimMemory
-onTrimMemory()} callback with {@link android.content.ComponentCallbacks2#TRIM_MEMORY_UI_HIDDEN}
-only when all the UI components of your app process become hidden from the user.
-This is distinct
-from the {@link android.app.Activity#onStop onStop()} callback, which is called when an {@link
-android.app.Activity} instance becomes hidden, which occurs even when the user moves to
-another activity in your app. So although you should implement {@link android.app.Activity#onStop
-onStop()} to release activity resources such as a network connection or to unregister broadcast
-receivers, you usually should not release your UI resources until you receive {@link
-android.content.ComponentCallbacks2#onTrimMemory onTrimMemory(TRIM_MEMORY_UI_HIDDEN)}. This ensures
-that if the user navigates back from another activity in your app, your UI resources are
-still available to resume the activity quickly. During any stage of your app's lifecycle, the {@link
-android.content.ComponentCallbacks2#onTrimMemory onTrimMemory()} callback also tells you when
-the overall device memory is getting low. You should respond by further releasing resources based
-on the following memory levels delivered by {@link android.content.ComponentCallbacks2#onTrimMemory
-onTrimMemory()}: Your app is running and not considered killable, but the device is running low on memory and the
-system is actively killing processes in the LRU cache. Your app is running and not considered killable, but the device is running much lower on
-memory so you should release unused resources to improve system performance (which directly
-impacts your app's performance). Your app is still running, but the system has already killed most of the processes in the
-LRU cache, so you should release all non-critical resources now. If the system cannot reclaim
-sufficient amounts of RAM, it will clear all of the LRU cache and begin killing processes that
-the system prefers to keep alive, such as those hosting a running service. Also, when your app process is currently cached, you may receive one of the following
-levels from {@link android.content.ComponentCallbacks2#onTrimMemory onTrimMemory()}: The system is running low on memory and your process is near the beginning of the LRU list.
-Although your app process is not at a high risk of being killed, the system may already be killing
-processes in the LRU cache. You should release resources that are easy to recover so your process
-will remain in the list and resume quickly when the user returns to your app. The system is running low on memory and your process is near the middle of the LRU list. If the
-system becomes further constrained for memory, there's a chance your process will be killed. The system is running low on memory and your process is one of the first to be killed if the
-system does not recover memory now. You should release everything that's not critical to
-resuming your app state. Because the {@link android.content.ComponentCallbacks2#onTrimMemory onTrimMemory()} callback was
-added in API level 14, you can use the {@link android.content.ComponentCallbacks#onLowMemory()}
-callback as a fallback for older versions, which is roughly equivalent to the {@link
-android.content.ComponentCallbacks2#TRIM_MEMORY_COMPLETE} event. Note: When the system begins killing processes in the LRU cache,
-although it primarily works bottom-up, it does give some consideration to which processes are
-consuming more memory and will thus provide the system more memory gain if killed.
-So the less memory you consume while in the LRU list overall, the better your chances are
-to remain in the list and be able to quickly resume. As mentioned earlier, each Android-powered device has a different amount of RAM available to the
-system and thus provides a different heap limit for each app. You can call {@link
-android.app.ActivityManager#getMemoryClass()} to get an estimate of your app's available heap in
-megabytes. If your app tries to allocate more memory than is available here, it will receive an
-{@link java.lang.OutOfMemoryError}. In very special situations, you can request a larger heap size by setting the {@code largeHeap}
-attribute to "true" in the manifest {@code However, the ability to request a large heap is intended only for a small set of apps that can
-justify the need to consume more RAM (such as a large photo editing app). Never request a
-large heap simply because you've run out of memory and you need a quick fix—you
-should use it only when you know exactly where all your memory is being allocated and why it must
-be retained. Yet, even when you're confident your app can justify the large heap, you should avoid
-requesting it to whatever extent possible. Using the extra memory will increasingly be to the
-detriment of the overall user experience because garbage collection will take longer and system
-performance may be slower when task switching or performing other common operations. Additionally, the large heap size is not the same on all devices and, when running on
-devices that have limited RAM, the large heap size may be exactly the same as the regular heap
-size. So even if you do request the large heap size, you should call {@link
-android.app.ActivityManager#getMemoryClass()} to check the regular heap size and strive to always
-stay below that limit. When you load a bitmap, keep it in RAM only at the resolution you need for the current device's
-screen, scaling it down if the original bitmap is a higher resolution. Keep in mind that an
-increase in bitmap resolution results in a corresponding (increase2) in memory needed,
-because both the X and Y dimensions increase. Note: On Android 2.3.x (API level 10) and below, bitmap objects
-always appear as the same size in your app heap regardless of the image resolution (the actual
-pixel data is stored separately in native memory). This makes it more difficult to debug the bitmap
-memory allocation because most heap analysis tools do not see the native allocation. However,
-beginning in Android 3.0 (API level 11), the bitmap pixel data is allocated in your app's Dalvik
-heap, improving garbage collection and debuggability. So if your app uses bitmaps and you're having
-trouble discovering why your app is using some memory on an older device, switch to a device
-running Android 3.0 or higher to debug it. For more tips about working with bitmaps, read Managing Bitmap Memory. Take advantage of optimized containers in the Android framework, such as {@link
-android.util.SparseArray}, {@link android.util.SparseBooleanArray}, and {@link
-android.support.v4.util.LongSparseArray}. The generic {@link java.util.HashMap}
-implementation can be quite memory
-inefficient because it needs a separate entry object for every mapping. Additionally, the {@link
-android.util.SparseArray} classes are more efficient because they avoid the system's need
-to autobox
-the key and sometimes value (which creates yet another object or two per entry). And don't be
-afraid of dropping down to raw arrays when that makes sense. Be knowledgeable about the cost and overhead of the language and libraries you are using, and
-keep this information in mind when you design your app, from start to finish. Often, things on the
-surface that look innocuous may in fact have a large amount of overhead. Examples include: A few bytes here and there quickly add up—app designs that are class- or object-heavy will suffer
-from this overhead. That can leave you in the difficult position of looking at a heap analysis and
-realizing your problem is a lot of small objects using up your RAM. Often, developers use abstractions simply as a "good programming practice," because abstractions
-can improve code flexibility and maintenance. However, abstractions come at a significant cost:
-generally they require a fair amount more code that needs to be executed, requiring more time and
-more RAM for that code to be mapped into memory. So if your abstractions aren't supplying a
-significant benefit, you should avoid them. Protocol
-buffers are a language-neutral, platform-neutral, extensible mechanism designed by Google for
-serializing structured data—think XML, but smaller, faster, and simpler. If you decide to use
-protobufs for your data, you should always use nano protobufs in your client-side code. Regular
-protobufs generate extremely verbose code, which will cause many kinds of problems in your app:
-increased RAM use, significant APK size increase, slower execution, and quickly hitting the DEX
-symbol limit. For more information, see the "Nano version" section in the protobuf readme. Using a dependency injection framework such as Guice or
-RoboGuice may be
-attractive because they can simplify the code you write and provide an adaptive environment
-that's useful for testing and other configuration changes. However, these frameworks tend to perform
-a lot of process initialization by scanning your code for annotations, which can require significant
-amounts of your code to be mapped into RAM even though you don't need it. These mapped pages are
-allocated into clean memory so Android can drop them, but that won't happen until the pages have
-been left in memory for a long period of time. External library code is often not written for mobile environments and can be inefficient when used
-for work on a mobile client. At the very least, when you decide to use an external library, you
-should assume you are taking on a significant porting and maintenance burden to optimize the
-library for mobile. Plan for that work up-front and analyze the library in terms of code size and
-RAM footprint before deciding to use it at all. Even libraries supposedly designed for use on Android are potentially dangerous because each
-library may do things differently. For example, one library may use nano protobufs while another
-uses micro protobufs. Now you have two different protobuf implementations in your app. This can and
-will also happen with different implementations of logging, analytics, image loading frameworks,
-caching, and all kinds of other things you don't expect. ProGuard won't save you here because these
-will all be lower-level dependencies that are required by the features for which you want the
-library. This becomes especially problematic when you use an {@link android.app.Activity}
-subclass from a library (which
-will tend to have wide swaths of dependencies), when libraries use reflection (which is common and
-means you need to spend a lot of time manually tweaking ProGuard to get it to work), and so on. Also be careful not to fall into the trap of using a shared library for one or two features out of
-dozens of other things it does; you don't want to pull in a large amount of code and overhead that
-you don't even use. At the end of the day, if there isn't an existing implementation that is a
-strong match for what you need to do, it may be best if you create your own implementation. A variety of information about optimizing your app's overall performance is available
-in other documents listed in Best Practices
-for Performance. Many of these documents include optimizations tips for CPU performance, but
-many of these tips also help optimize your app's memory use, such as by reducing the number of
-layout objects required by your UI. You should also read about optimizing
-your UI with the layout debugging tools and take advantage of
-the optimization suggestions provided by the lint tool. The ProGuard tool shrinks,
-optimizes, and obfuscates your code by removing unused code and renaming classes, fields, and
-methods with semantically obscure names. Using ProGuard can make your code more compact, requiring
-fewer RAM pages to be mapped. If you do any post-processing of an APK generated by a build system (including signing it
-with your final production certificate), then you must run zipalign on it to have it re-aligned.
-Failing to do so can cause your app to require significantly more RAM, because things like
-resources can no longer be mmapped from the APK. Note: Google Play Store does not accept APK files that
-are not zipaligned. Once you achieve a relatively stable build, begin analyzing how much RAM your app is using
-throughout all stages of its lifecycle. For information about how to analyze your app, read Investigating Your RAM Usage. If it's appropriate for your app, an advanced technique that may help you manage your app's
-memory is dividing components of your app into multiple processes. This technique must always be
-used carefully and most apps should not run multiple processes, as it can easily
-increase—rather than decrease—your RAM footprint if done incorrectly. It is primarily
-useful to apps that may run significant work in the background as well as the foreground and can
-manage those operations separately. An example of when multiple processes may be appropriate is when building a music player that
-plays music from a service for long period of time. If
-the entire app runs in one process, then many of the allocations performed for its activity UI must
-be kept around as long as it is playing music, even if the user is currently in another app and the
-service is controlling the playback. An app like this may be split into two process: one for its
-UI, and the other for the work that continues running in the background service. You can specify a separate process for each app component by declaring the {@code android:process} attribute
-for each component in the manifest file. For example, you can specify that your service should run
-in a process separate from your app's main process by declaring a new process named "background"
-(but you can name the process anything you like): Your process name should begin with a colon (':') to ensure that the process remains private to
-your app. Before you decide to create a new process, you need to understand the memory implications.
-To illustrate the consequences of each process, consider that an empty process doing basically
-nothing has an extra memory footprint of about 1.4MB, as shown by the memory information
-dump below. Note: More information about how to read this output is provided
-in Investigating
-Your RAM Usage. The key data here is the Private Dirty and Private
-Clean memory, which shows that this process is using almost 1.4MB of non-pageable RAM
-(distributed across the Dalvik heap, native allocations, book-keeping, and library-loading),
-and another 150K of RAM for code that has been mapped in to execute. This memory footprint for an empty process is fairly significant and it can quickly
-grow as you start doing work in that process. For
-example, here is the memory use of a process that is created only to show an activity with some
-text in it: The process has now almost tripled in size, to 4MB, simply by showing some text in the UI. This
-leads to an important conclusion: If you are going to split your app into multiple processes, only
-one process should be responsible for UI. Other processes should avoid any UI, as this will quickly
-increase the RAM required by the process (especially once you start loading bitmap assets and other
-resources). It may then be hard or impossible to reduce the memory usage once the UI is drawn. Additionally, when running more than one process, it's more important than ever that you keep your
-code as lean as possible, because any unnecessary RAM overhead for common implementations are now
-replicated in each process. For example, if you are using enums (though you should not use enums), all of
-the RAM needed to create and initialize those constants is duplicated in each process, and any
-abstractions you have with adapters and temporaries or other overhead will likewise be replicated. Another concern with multiple processes is the dependencies that exist between them. For example,
-if your app has a content provider that you have running in the default process which also hosts
-your UI, then code in a background process that uses that content provider will also require that
-your UI process remain in RAM. If your goal is to have a background process that can run
-independently of a heavy-weight UI process, it can't have dependencies on content providers or
-services that execute in the UI process.
+ Random-access memory (RAM) is a valuable
+ resource in any software development environment, but
+ it's even more valuable on a mobile operating system
+ where physical memory is often constrained.
+ Although both the Android Runtime (ART) and Dalvik virtual machine perform
+ routine garbage collection, this does not mean you can ignore
+ when and where your app allocates and releases memory.
+ You still need to avoid
+ introducing memory leaks, usually caused by holding onto
+ object references in static member variables, and
+ release any {@link java.lang.ref.Reference} objects at the appropriate
+ time as defined by
+ lifecycle callbacks.
+
+ This page explains how you can
+ proactively reduce memory usage within your app.
+ For more information about general
+ practices to clean up your resources when programming in Java,
+ refer to other books or online
+ documentation about managing resource references.
+ If you’re looking for information about how to
+ analyze memory in a running app, read
+ Tools for analyzing RAM usage.
+ For more detailed information about how the Android Runtime and Dalvik
+ virtual machine manage memory, see the
+ Overview of Android Memory Management.
+
+ The Android framework, Android Studio, and Android SDK
+ can help you analyze and adjust your app's memory usage.
+ The Android framework
+ exposes several APIs that allow your app to reduce its memory usage
+ dynamically during runtime. Android Studio and the Android SDK
+ contain several tools that allow you to investigate how your
+ app uses memory.
+
+ Before you can fix the memory usage problems in your app, you first need
+ to find them. Android Studio and the Android SDK include several tools
+ for analyzing memory usage in your app:
+ For more information about how to use the DDMS tool, see
+ Using DDMS.
+
+ For more information about how to use the Memory Monitor tool, see
+ Viewing Heap Updates.
+
+ For more information about how to use the Traceview viewer, see
+ Profiling with Traceview and dmtracedump.
+
+ For more information about how to use the Allocation Tracker tool, see
+ Allocation Tracker Walkthrough.
-This numbered list of processes is essentially the LRU list of processes that the framework
-provides to the kernel to help it determine which processes it should kill as it needs more RAM.
-The kernel's out-of-memory killer will generally begin from the bottom of this list, killing the
-last process and working its way up. It may not do it in exactly this order, as it can also take
-into consideration other factors such as the relative RAM footprint of processes to some degree.
-There are many other options you can use with the activity command to analyze further details of
-your app's state—use
+ An Android device can run with varying amounts of free memory
+ depending on the physical amount of RAM on the device and how the user
+ operates it. The system broadcasts signals to indicate when it is under
+ memory pressure, and apps should listen for these signals and adjust
+ their memory usage as appropriate.
+In this document
+
+
+See Also
+
+
+Garbage collection
+
+Sharing Memory
+
+
+
+
+ Android can also share static data between processes by mmapping it.
+ Shared static data includes Dalvik code (by placing it in a pre-linked
+ .odex
+ file for direct mmapping), app resources
+ (by designing the resource table to be a structure
+ that can be mmapped and by aligning the zip
+ entries of the APK), and traditional project
+ elements like native code in .so files.
+ Allocating and Reclaiming App Memory
+
+Restricting App Memory
+
+Switching apps
+
+In this document
-
+ See Also
-
-
-How Android Manages Memory
-
-Sharing Memory
-
-
-
-
-Allocating and Reclaiming App Memory
-
-
-
-
-
-Restricting App Memory
-
-Switching Apps
-
-How Your App Should Manage Memory
-
-Use services sparingly
-
-Release memory when your user interface becomes hidden
-
-Release memory as memory becomes tight
-
-
-
-
-
-
-
-Check how much memory you should use
-
-Avoid wasting memory with bitmaps
-
-Use optimized data containers
-
-Be aware of memory overhead
-
-
-
-
-Be careful with code abstractions
-
-Use nano protobufs for serialized data
-
-Avoid dependency injection frameworks
-
-Be careful about using external libraries
-
-Optimize overall performance
-
-Use ProGuard to strip out any unneeded code
-
-Use zipalign on your final APK
-
-Analyze your RAM usage
-
-Use multiple processes
-
-
-<service android:name=".PlaybackService"
- android:process=":background" />
-
-
-
-adb shell dumpsys meminfo com.example.android.apis:empty
-
-** MEMINFO in pid 10172 [com.example.android.apis:empty] **
- Pss Pss Shared Private Shared Private Heap Heap Heap
- Total Clean Dirty Dirty Clean Clean Size Alloc Free
- ------ ------ ------ ------ ------ ------ ------ ------ ------
- Native Heap 0 0 0 0 0 0 1864 1800 63
- Dalvik Heap 764 0 5228 316 0 0 5584 5499 85
- Dalvik Other 619 0 3784 448 0 0
- Stack 28 0 8 28 0 0
- Other dev 4 0 12 0 0 4
- .so mmap 287 0 2840 212 972 0
- .apk mmap 54 0 0 0 136 0
- .dex mmap 250 148 0 0 3704 148
- Other mmap 8 0 8 8 20 0
- Unknown 403 0 600 380 0 0
- TOTAL 2417 148 12480 1392 4832 152 7448 7299 148
-
-
-
-** MEMINFO in pid 10226 [com.example.android.helloactivity] **
- Pss Pss Shared Private Shared Private Heap Heap Heap
- Total Clean Dirty Dirty Clean Clean Size Alloc Free
- ------ ------ ------ ------ ------ ------ ------ ------ ------
- Native Heap 0 0 0 0 0 0 3000 2951 48
- Dalvik Heap 1074 0 4928 776 0 0 5744 5658 86
- Dalvik Other 802 0 3612 664 0 0
- Stack 28 0 8 28 0 0
- Ashmem 6 0 16 0 0 0
- Other dev 108 0 24 104 0 4
- .so mmap 2166 0 2824 1828 3756 0
- .apk mmap 48 0 0 0 632 0
- .ttf mmap 3 0 0 0 24 0
- .dex mmap 292 4 0 0 5672 4
- Other mmap 10 0 8 8 68 0
- Unknown 632 0 412 624 0 0
- TOTAL 5169 4 11832 4032 10152 8 8744 8609 134
-
-
-Monitor Available Memory and Memory Usage
+
+Tools for analyzing RAM usage
+
+
-
-adb shell dumpsys activity -h for help on its use.
+
+Release memory in response to events
+
+
+
+ To listen for these events, implement the
+ {@link android.content.ComponentCallbacks2#onTrimMemory onTrimMemory()}
+ callback in your {@link android.app.Activity}
+ classes, as shown in the following code snippet.
+
+import android.content.ComponentCallbacks2;
+// Other import statements ...
+
+public class MainActivity extends AppCompatActivity
+ implements ComponentCallbacks2 {
+
+ // Other activity code ...
+
+ /**
+ * Release memory when the UI becomes hidden or when system resources become low.
+ * @param level the memory-related event that was raised.
+ */
+ public void onTrimMemory(int level) {
+
+ // Determine which lifecycle or system event was raised.
+ switch (level) {
+
+ case ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN:
+
+ /*
+ Release any UI objects that currently hold memory.
+
+ The user interface has moved to the background.
+ */
+
+ break;
+
+ case ComponentCallbacks2.TRIM_MEMORY_RUNNING_MODERATE:
+ case ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW:
+ case ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL:
+
+ /*
+ Release any memory that your app doesn't need to run.
+
+ The device is running low on memory while the app is running.
+ The event raised indicates the severity of the memory-related event.
+ If the event is TRIM_MEMORY_RUNNING_CRITICAL, then the system will
+ begin killing background processes.
+ */
+
+ break;
+
+ case ComponentCallbacks2.TRIM_MEMORY_BACKGROUND:
+ case ComponentCallbacks2.TRIM_MEMORY_MODERATE:
+ case ComponentCallbacks2.TRIM_MEMORY_COMPLETE:
+
+ /*
+ Release as much memory as the process can.
+
+ The app is on the LRU list and the system is running low on memory.
+ The event raised indicates where the app sits within the LRU list.
+ If the event is TRIM_MEMORY_COMPLETE, the process will be one of
+ the first to be terminated.
+ */
+
+ break;
+
+ default:
+ /*
+ Release any non-critical data structures.
+
+ The app received an unrecognized memory level value
+ from the system. Treat this as a generic low-memory message.
+ */
+ break;
+ }
+ }
+}
+
+
+
+ The
+ {@link android.content.ComponentCallbacks2#onTrimMemory onTrimMemory()}
+ callback was added in Android 4.0 (API level 14). On earlier versions,
+ you can use the
+ {@link android.content.ComponentCallbacks#onLowMemory()}
+ callback as a fallback, which is roughly equivalent to the
+ {@link android.content.ComponentCallbacks2#TRIM_MEMORY_COMPLETE} event.
+
+ To allow multiple running processes, Android sets a hard limit
+ on the heap size allotted for each app. The exact heap size limit varies
+ between devices based on how much RAM the device
+ has available overall. If your app has reached the heap capacity and
+ tries to allocate more
+ memory, the system throws an {@link java.lang.OutOfMemoryError}.
+
+ To avoid running out of memory, you can query the system to determine
+ how much heap space is available on the current device
+ by calling
+ {@link android.app.ActivityManager#getMemoryInfo(android.app.ActivityManager.MemoryInfo) getMemoryInfo()}.
+ This returns an
+ {@link android.app.ActivityManager.MemoryInfo} object that provides
+ information about the device's
+ current memory status, including available memory, total memory, and
+ the memory threshold—the memory level below which the system begins
+ to kill processes. The
+ {@link android.app.ActivityManager.MemoryInfo} class also exposes a simple
+ boolean field,
+ {@link android.app.ActivityManager.MemoryInfo#lowMemory},
+ that tells you whether the device is running low on memory.
+
+ The following code snippet shows an example of how you can use the
+ {@link android.app.ActivityManager#getMemoryInfo(android.app.ActivityManager.MemoryInfo) getMemoryInfo()}
+ method in your application.
+ +
+public void doSomethingMemoryIntensive() {
+
+ // Before doing something that requires a lot of memory,
+ // check to see whether the device is in a low memory state.
+ ActivityManager.MemoryInfo memoryInfo = getAvailableMemory();
+
+ if (!memoryInfo.lowMemory) {
+ // Do memory intensive work ...
+ }
+}
+
+// Get a MemoryInfo object for the device's current memory status.
+private ActivityManager.MemoryInfo getAvailableMemory() {
+ ActivityManager activityManager = (ActivityManager) this.getSystemService(ACTIVITY_SERVICE);
+ ActivityManager.MemoryInfo memoryInfo = new ActivityManager.MemoryInfo();
+ activityManager.getMemoryInfo(memoryInfo);
+ return memoryInfo;
+}
+
+
+
+
+
+ Some Android features, Java classes, and code constructs tend to
+ use more memory than others. You can minimize how
+ much memory your app uses by choosing more efficient alternatives in
+ your code.
+
+ Leaving a service running when it’s not needed is
+ one of the worst memory-management
+ mistakes an Android app can make. If your app needs a
+ service
+ to perform work in the background, do not keep it running unless
+ it needs to run a job. Remember to stop your service when it has completed
+ its task. Otherwise, you can inadvertently cause a memory leak.
+
+ When you start a service, the system prefers to always keep the process
+ for that service running. This behavior
+ makes service processes very expensive
+ because the RAM used by a service remains unavailable to other processes.
+ This reduces the number of cached processes that the system can keep in
+ the LRU cache, making app switching less efficient. It can even lead to
+ thrashing in the system when memory is tight and the system can’t
+ maintain enough processes to host all the services currently running.
+
+ You should generally avoid using persistent services because of
+ the ongoing demands they place on available memory. Instead, we
+ recommend that you use an alternative implementation
+ such as {@link android.app.job.JobScheduler}. For more information about
+ how to use {@link android.app.job.JobScheduler} to schedule background
+ processes, see
+ Background Optimizations.
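+ As a rough sketch of this approach, the following shows how a piece of
+ deferrable background work might be expressed as a scheduled job instead
+ of a long-running service. The class name, job ID, and constraints here
+ are illustrative assumptions, not part of any particular app.

```java
import android.app.job.JobInfo;
import android.app.job.JobParameters;
import android.app.job.JobScheduler;
import android.app.job.JobService;
import android.content.ComponentName;
import android.content.Context;

// Hypothetical job service: the system keeps the process alive only
// while the job is actually running, then releases it.
public class CleanupJobService extends JobService {

    private static final int CLEANUP_JOB_ID = 1; // arbitrary app-defined ID

    @Override
    public boolean onStartJob(JobParameters params) {
        // Do the work on a background thread, then call
        // jobFinished(params, false) when it completes.
        return true; // true: work continues asynchronously
    }

    @Override
    public boolean onStopJob(JobParameters params) {
        return true; // true: reschedule if the job was interrupted
    }

    // Schedules the job to run only while the device is idle and charging.
    public static void schedule(Context context) {
        JobScheduler scheduler =
                (JobScheduler) context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
        JobInfo job = new JobInfo.Builder(
                        CLEANUP_JOB_ID,
                        new ComponentName(context, CleanupJobService.class))
                .setRequiresDeviceIdle(true)
                .setRequiresCharging(true)
                .build();
        scheduler.schedule(job);
    }
}
```

+ Note that a {@link android.app.job.JobService} must also be declared in
+ the manifest with the android.permission.BIND_JOB_SERVICE permission.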
+
+ If you must use a service, the
+ best way to limit its lifespan is to use an {@link
+ android.app.IntentService}, which finishes
+ itself as soon as it's done handling the intent that started it.
+ For more information, read
+ Running in a Background Service.
+
+ Some of the classes provided by the programming language are not optimized for
+ use on mobile devices. For example, the generic
+ {@link java.util.HashMap} implementation can be quite memory
+ inefficient because it needs a separate entry object for every mapping.
+
+ The Android framework includes several optimized data containers, including
+ {@link android.util.SparseArray}, {@link android.util.SparseBooleanArray},
+ and {@link android.support.v4.util.LongSparseArray}.
+ For example, the {@link android.util.SparseArray} classes are more
+ efficient because they avoid the system's need to
+ autobox
+ the key and sometimes the value (which creates yet another object or
+ two per entry).
+
+ If necessary, you can always switch to raw arrays for a really lean data
+ structure.
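+ As a brief illustration (the keys and values here are hypothetical), the
+ following contrasts a boxed {@link java.util.HashMap} with a
+ {@link android.util.SparseArray} holding the same mapping:

```java
import android.util.SparseArray;
import java.util.HashMap;

public class ContainerExample {

    public void compareContainers() {
        // HashMap boxes every int key into an Integer object and
        // allocates a separate entry object per mapping.
        HashMap<Integer, String> boxed = new HashMap<>();
        boxed.put(42, "item"); // autoboxes 42 into an Integer

        // SparseArray keeps primitive int keys in a plain int[] array,
        // avoiding both the boxing and the per-entry objects.
        SparseArray<String> lean = new SparseArray<>();
        lean.put(42, "item");
        String value = lean.get(42);
    }
}
```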
+
+ Developers often use abstractions simply as a good programming practice,
+ because abstractions can improve code flexibility and maintenance.
+ However, abstractions come at a significant cost:
+ they generally require more code to be
+ executed, which takes more time and
+ more RAM to map that code into memory.
+ So if your abstractions aren't supplying a
+ significant benefit, you should avoid them.
+
+ For example, enums often require more than twice as much memory as static
+ constants. You should strictly avoid using enums on Android.
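+ To make the comparison concrete, here is an illustrative sketch (the
+ type and constant names are hypothetical): each enum constant is a full
+ object, plus the enum class metadata and the array backing values(),
+ whereas the static-constant version compiles down to plain ints.

```java
// Each constant below is a distinct object allocated at class load time,
// in addition to the ConnectionStateEnum class itself.
enum ConnectionStateEnum { DISCONNECTED, CONNECTING, CONNECTED }

// The leaner alternative: plain static int constants, which occupy no
// per-constant heap objects at all.
final class ConnectionState {
    private ConnectionState() {} // non-instantiable holder class

    static final int DISCONNECTED = 0;
    static final int CONNECTING = 1;
    static final int CONNECTED = 2;
}
```

+ If you want compile-time checking of such int constants, the support
+ library's @IntDef annotation can recover some of the type safety that
+ enums provide.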
+
+ Protocol buffers
+ are a language-neutral, platform-neutral, extensible mechanism
+ designed by Google for serializing structured data—similar to XML, but
+ smaller, faster, and simpler. If you decide to use
+ protobufs for your data, you should always use nano protobufs in your
+ client-side code. Regular protobufs generate extremely verbose code, which
+ can cause many kinds of problems in your app, such as
+ increased RAM use, significant APK size increase, and slower execution.
+
+ For more information, see the "Nano version" section in the
+ protobuf readme.
+
+ As mentioned previously, garbage collection events don't normally affect
+ your app's performance. However, many garbage collection events that occur
+ over a short period of time can quickly eat up your frame time. The more time
+ the system spends on garbage collection, the less time it has for
+ other tasks such as rendering or streaming audio.
+
+ Often, memory churn can cause a large number of
+ garbage collection events to occur. Memory churn describes the
+ number of temporary objects allocated in a given amount of time.
+ +
+ For example, you might allocate multiple temporary objects within a
+ for loop. Or you might create new
+ {@link android.graphics.Paint} or {@link android.graphics.Bitmap}
+ objects inside the
+ {@link android.view.View#onDraw(android.graphics.Canvas) onDraw()}
+ function of a view.
+ In both cases, the app creates many objects very quickly.
+ These can quickly consume all the available memory in the young generation,
+ forcing a garbage collection event to occur.
+
+
+ Of course, you need to find the places in your code where
+ memory churn is high before you can fix them. Use the tools discussed in
+ Tools for analyzing RAM usage.
+
+ Once you identify the problem areas in your code, try to reduce the number of
+ allocations within performance-critical areas. Consider moving things out of
+ inner loops or perhaps moving them into a
+ Factory-based allocation structure.
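+ As a language-level sketch of the same idea (not tied to any particular
+ view code), the loop on the left-hand method allocates a new temporary
+ String on every iteration, while the right-hand method reuses a single
+ builder so the loop itself allocates nothing per iteration:

```java
public class ChurnExample {

    // High churn: each += creates a new String (and a hidden
    // StringBuilder) per iteration, all of which become garbage.
    public static String joinWithConcat(String[] parts) {
        String result = "";
        for (String p : parts) {
            result += p; // allocates a fresh temporary every pass
        }
        return result;
    }

    // Low churn: one StringBuilder is allocated up front and reused,
    // so the loop body produces no short-lived temporaries.
    public static String joinWithBuilder(String[] parts) {
        StringBuilder sb = new StringBuilder();
        for (String p : parts) {
            sb.append(p);
        }
        return sb.toString();
    }
}
```

+ The same pattern applies to objects like Paint or Bitmap: allocate them
+ once as fields rather than inside a frequently called method.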
+
+ Some resources and libraries within your code can gobble up memory without
+ your knowing it. The overall size of your APK, including third-party libraries
+ or embedded resources, can affect how much memory your app consumes. You can
+ improve your app's memory consumption by removing any redundant, unnecessary,
+ or bloated components, resources, or libraries from your code.
+
+ You can significantly reduce your app's memory usage by reducing the overall
+ size of your app. Bitmap size, resources, animation frames, and third-party
+ libraries can all contribute to the size of your APK.
+ Android Studio and the Android SDK provide multiple tools
+ to help you reduce the size of your resources and external dependencies.
+
+ For more information about how to reduce your overall APK size, see
+ Reduce APK Size.
+
+ Dependency injection frameworks such as
+ Guice
+ or
+ RoboGuice
+ can simplify the code you write and provide an adaptive environment
+ that's useful for testing and other configuration changes. However, dependency
+ injection frameworks aren't always optimized for mobile devices.
+
+ For example, these frameworks tend to initialize processes by
+ scanning your code for annotations, which can require significant
+ amounts of your code to be mapped into RAM unnecessarily. The system
+ allocates these mapped pages into clean memory so Android can drop them, but
+ that can't happen until the pages have remained in memory for a long period
+ of time.
+
+ If you need to use a dependency injection framework in your app, consider
+ using
+ Dagger
+ instead. For example, Dagger does not use reflection to scan your app's code.
+ Dagger's strict implementation means that it can be used in Android apps
+ without needlessly increasing memory usage.
+
+ External library code is often not written for mobile environments and
+ can be inefficient when used
+ for work on a mobile client. When you decide to use an
+ external library, you may need to optimize that library for mobile devices.
+ Plan for that work up front and analyze the library in terms of code size and
+ RAM footprint before deciding to use it at all.
+
+ Even some mobile-optimized libraries can cause problems due to differing
+ implementations. For example, one library may use nano protobufs
+ while another uses micro protobufs, resulting in two different protobuf
+ implementations in your app. This can happen with different
+ implementations of logging, analytics, image loading frameworks,
+ caching, and many other things you don't expect.
+
+ Although ProGuard can
+ help to remove APIs and resources with the right flags, it can't remove a
+ library's large internal dependencies. The features that you want in these
+ libraries may require lower-level dependencies. This becomes especially
+ problematic when you use an {@link android.app.Activity} subclass from a
+ library (which will tend to have wide swaths of dependencies),
+ when libraries use reflection (which is common and means you need to spend a
+ lot of time manually tweaking ProGuard to get it to work), and so on.
+
+ Also avoid using a shared library for just one or two features out of dozens.
+ You don't want to pull in a large amount of code and overhead that
+ you don't even use. When you consider whether to use a library, look for
+ an implementation that strongly matches what you need. Otherwise, you might
+ decide to create your own implementation.
--->
diff --git a/docs/html/training/training_toc.cs b/docs/html/training/training_toc.cs
index d0dccba64d357..39ca6fb405e34 100644
--- a/docs/html/training/training_toc.cs
+++ b/docs/html/training/training_toc.cs
@@ -1887,6 +1887,12 @@ results."
 on a variety of mobile devices."
 >Managing Your App's Memory
+