am ec6d0847: am 5b03460f: Merge "cherrypick from jb-dev-docs: Gestures Class Change-Id: I9abebf58a9607c8f52f72ef2ce46308304386596" into jb-mr1-dev

* commit 'ec6d08475ee1a2020556b43b4698b60dcb56a9c6':
  cherrypick from jb-dev-docs: Gestures Class Change-Id: I9abebf58a9607c8f52f72ef2ce46308304386596
kmccormick@google.com
2012-12-21 12:51:02 -08:00
committed by Android Git Automerger
8 changed files with 1502 additions and 1 deletions


@@ -0,0 +1,341 @@
page.title=Detecting Common Gestures
parent.title=Using Touch Gestures
parent.link=index.html
trainingnavtop=true
next.title=Tracking Movement
next.link=movement.html
@jd:body
<div id="tb-wrapper">
<div id="tb">
<!-- table of contents -->
<h2>This lesson teaches you to</h2>
<ol>
<li><a href="#data">Gather Data</a></li>
<li><a href="#detect">Detect Gestures</a></li>
</ol>
<!-- other docs (NOT javadocs) -->
<h2>You should also read</h2>
<ul>
<li><a href="http://developer.android.com/guide/topics/ui/ui-events.html">Input Events</a> API Guide
</li>
<li><a href="{@docRoot}guide/topics/sensors/sensors_overview.html">Sensors Overview</a></li>
<li><a href="http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html">Making Sense of Multitouch</a> blog post</li>
<li><a href="{@docRoot}training/custom-views/making-interactive.html">Making the View Interactive</a> </li>
<li>Design Guide for <a href="{@docRoot}design/patterns/gestures.html">Gestures</a></li>
<li>Design Guide for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a></li>
</ul>
</div>
</div>
<p>A "touch gesture" occurs when a user places one or more fingers on the touch
screen, and your application interprets
that pattern of touches as a particular gesture. Gesture detection
accordingly has two phases:</p>
<ol>
<li>Gathering data about touch events.</li>
<li>Interpreting the data to see if it meets the criteria for any of the
gestures your app supports. </li>
</ol>
<h4>Support Library Classes</h4>
<p>The examples in this lesson use the {@link android.support.v4.view.GestureDetectorCompat}
and {@link android.support.v4.view.MotionEventCompat} classes. These classes are in the
<a href="{@docRoot}tools/extras/support-library.html">Support Library</a>. You should use
Support Library classes where possible to provide compatibility with devices
running Android 1.6 and higher. Note that {@link android.support.v4.view.MotionEventCompat} is <em>not</em> a
replacement for the {@link android.view.MotionEvent} class. Rather, it provides static utility
methods to which you pass your {@link android.view.MotionEvent} object in order to receive
the desired action associated with that event.</p>
<h2 id="data">Gather Data</h2>
<p>When a user places one or more fingers on the screen, this triggers the
callback {@link android.view.View#onTouchEvent onTouchEvent()}
on the View that received the touch events.
For each sequence of touch events (position, pressure, size, addition of another finger, etc.)
that is ultimately identified as a gesture,
{@link android.view.View#onTouchEvent onTouchEvent()} is fired several times.</p>
<p>The gesture starts when the user first touches the screen, continues as the system tracks
the position of the user's finger(s), and ends by capturing the final event of
the user's fingers leaving the screen. Throughout this interaction,
the {@link android.view.MotionEvent} delivered to {@link android.view.View#onTouchEvent onTouchEvent()}
provides the details of every interaction. Your app can use the data provided by the {@link android.view.MotionEvent}
to determine if a gesture it cares
about happened.</p>
<h3>Capturing touch events for an Activity or View</h3>
<p>To intercept touch events in an Activity or View, override
the {@link android.view.View#onTouchEvent onTouchEvent()} callback.</p>
<p>The following snippet uses
{@link android.support.v4.view.MotionEventCompat#getActionMasked getActionMasked()}
to extract the action the user performed from the {@code event} parameter. This gives you the raw
data you need to determine if a gesture you care about occurred:</p>
<pre>
public class MainActivity extends Activity {
...
    // This example shows an Activity, but you would use the same approach if
    // you were subclassing a View.
    &#64;Override
    public boolean onTouchEvent(MotionEvent event){
        int action = MotionEventCompat.getActionMasked(event);

        switch(action) {
            case (MotionEvent.ACTION_DOWN) :
                Log.d(DEBUG_TAG,"Action was DOWN");
                return true;
            case (MotionEvent.ACTION_MOVE) :
                Log.d(DEBUG_TAG,"Action was MOVE");
                return true;
            case (MotionEvent.ACTION_UP) :
                Log.d(DEBUG_TAG,"Action was UP");
                return true;
            case (MotionEvent.ACTION_CANCEL) :
                Log.d(DEBUG_TAG,"Action was CANCEL");
                return true;
            case (MotionEvent.ACTION_OUTSIDE) :
                Log.d(DEBUG_TAG,"Movement occurred outside bounds " +
                        "of current screen element");
                return true;
            default :
                return super.onTouchEvent(event);
        }
    }
}</pre>
<p>You can then do your own processing on these events to determine if a
gesture occurred. This is the kind of processing you would have to do for a
custom gesture. However, if your app uses
common gestures such as double tap, long press, fling, and so on, you can
take advantage of the {@link
android.view.GestureDetector} class. {@link
android.view.GestureDetector} makes it easy for you to detect common
gestures without processing the individual touch events yourself. This is
discussed below in <a href="#detect">Detect Gestures</a>.</p>
<h3>Capturing touch events for a single view</h3>
<p>As an alternative to {@link android.view.View#onTouchEvent onTouchEvent()},
you can attach an {@link android.view.View.OnTouchListener} object to any {@link
android.view.View} object using the {@link android.view.View#setOnTouchListener
setOnTouchListener()} method. This makes it possible to listen for touch
events without subclassing an existing {@link android.view.View}. For
example:</p>
<pre>View myView = findViewById(R.id.my_view);
myView.setOnTouchListener(new OnTouchListener() {
    public boolean onTouch(View v, MotionEvent event) {
        // ... Respond to touch events
        return true;
    }
});</pre>
<p>Beware of creating a listener that returns {@code false} for the
{@link android.view.MotionEvent#ACTION_DOWN} event. If you do this, the listener will
not be called for the subsequent {@link android.view.MotionEvent#ACTION_MOVE}
and {@link android.view.MotionEvent#ACTION_UP} sequence of events. This is because
{@link android.view.MotionEvent#ACTION_DOWN} is the starting point for all touch events.</p>
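<p>To see why this matters, the dispatch rule can be sketched as a simplified
plain-Java model. This is only an illustration of the behavior described above,
not the framework's actual implementation:</p>

```java
import java.util.ArrayList;
import java.util.List;

public class DispatchModel {
    interface TouchListener {
        boolean onTouch(String action);
    }

    // Simplified model: if the listener declines the DOWN event, the view
    // never delivers the rest of that gesture's events to it.
    static List<String> deliver(List<String> stream, TouchListener listener) {
        List<String> delivered = new ArrayList<>();
        boolean accepted = false;
        for (String action : stream) {
            if (action.equals("DOWN")) {
                delivered.add(action);
                accepted = listener.onTouch(action);
            } else if (accepted) {
                delivered.add(action);
                listener.onTouch(action);
            }
        }
        return delivered;
    }

    public static void main(String[] args) {
        List<String> gesture = List.of("DOWN", "MOVE", "MOVE", "UP");
        // Returning true on DOWN keeps the whole stream flowing.
        System.out.println(deliver(gesture, a -> true));  // [DOWN, MOVE, MOVE, UP]
        // Returning false on DOWN silences the later MOVE and UP events.
        System.out.println(deliver(gesture, a -> false)); // [DOWN]
    }
}
```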
<p>If you are creating a custom View, you can override
{@link android.view.View#onTouchEvent onTouchEvent()},
as described above.</p>
<h2 id="detect">Detect Gestures</h2>
<p>Android provides the {@link android.view.GestureDetector} class for detecting
common gestures. Some of the gestures it supports include {@link
android.view.GestureDetector.OnGestureListener#onDown onDown()}, {@link
android.view.GestureDetector.OnGestureListener#onLongPress onLongPress()},
{@link android.view.GestureDetector.OnGestureListener#onFling onFling()}, and so
on. You can use {@link android.view.GestureDetector} in conjunction with the
{@link android.view.View#onTouchEvent onTouchEvent()}
method described above.</p>
<h3>Detecting All Supported Gestures</h3>
<p>When you instantiate a {@link android.support.v4.view.GestureDetectorCompat}
object, one of the parameters it takes is a class that implements the
{@link android.view.GestureDetector.OnGestureListener} interface.
{@link android.view.GestureDetector.OnGestureListener} notifies users when
a particular touch event has occurred. To make it possible for your
{@link android.view.GestureDetector} object to receive events, you override
the View or Activity's {@link android.view.View#onTouchEvent onTouchEvent()} method,
and pass along all observed events to the detector instance.</p>
<p>In the following snippet, a return value of {@code true} from the individual
{@code on<em>&lt;TouchEvent&gt;</em>} methods indicates that you
have handled the touch event. A return value of {@code false} passes events down
through the view stack until the touch has been successfully handled.</p>
<p>Run the following snippet to get a feel for how actions are triggered when
you interact with the touch screen, and what the contents of the {@link
android.view.MotionEvent} are for each touch event. You will realize how much
data is being generated for even simple interactions.</p>
<pre>public class MainActivity extends Activity implements
        GestureDetector.OnGestureListener,
        GestureDetector.OnDoubleTapListener{

    private static final String DEBUG_TAG = "Gestures";
    private GestureDetectorCompat mDetector;

    // Called when the activity is first created.
    &#64;Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // Instantiate the gesture detector with the
        // application context and an implementation of
        // GestureDetector.OnGestureListener
        mDetector = new GestureDetectorCompat(this,this);
        // Set the gesture detector as the double tap
        // listener.
        mDetector.setOnDoubleTapListener(this);
    }

    &#64;Override
    public boolean onTouchEvent(MotionEvent event){
        this.mDetector.onTouchEvent(event);
        // Be sure to call the superclass implementation
        return super.onTouchEvent(event);
    }

    &#64;Override
    public boolean onDown(MotionEvent event) {
        Log.d(DEBUG_TAG,"onDown: " + event.toString());
        return true;
    }

    &#64;Override
    public boolean onFling(MotionEvent event1, MotionEvent event2,
            float velocityX, float velocityY) {
        Log.d(DEBUG_TAG, "onFling: " + event1.toString()+event2.toString());
        return true;
    }

    &#64;Override
    public void onLongPress(MotionEvent event) {
        Log.d(DEBUG_TAG, "onLongPress: " + event.toString());
    }

    &#64;Override
    public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX,
            float distanceY) {
        Log.d(DEBUG_TAG, "onScroll: " + e1.toString()+e2.toString());
        return true;
    }

    &#64;Override
    public void onShowPress(MotionEvent event) {
        Log.d(DEBUG_TAG, "onShowPress: " + event.toString());
    }

    &#64;Override
    public boolean onSingleTapUp(MotionEvent event) {
        Log.d(DEBUG_TAG, "onSingleTapUp: " + event.toString());
        return true;
    }

    &#64;Override
    public boolean onDoubleTap(MotionEvent event) {
        Log.d(DEBUG_TAG, "onDoubleTap: " + event.toString());
        return true;
    }

    &#64;Override
    public boolean onDoubleTapEvent(MotionEvent event) {
        Log.d(DEBUG_TAG, "onDoubleTapEvent: " + event.toString());
        return true;
    }

    &#64;Override
    public boolean onSingleTapConfirmed(MotionEvent event) {
        Log.d(DEBUG_TAG, "onSingleTapConfirmed: " + event.toString());
        return true;
    }
}</pre>
<h3>Detecting a Subset of Supported Gestures</h3>
<p>If you only want to process a few gestures, you can extend {@link
android.view.GestureDetector.SimpleOnGestureListener} instead of implementing
the {@link android.view.GestureDetector.OnGestureListener} interface. </p>
<p>
{@link
android.view.GestureDetector.SimpleOnGestureListener} provides an implementation
for all of the {@code on<em>&lt;TouchEvent&gt;</em>} methods by returning {@code false}
for all of them. Thus you can override only the methods you care about.
For
example, the snippet below creates a class that extends {@link
android.view.GestureDetector.SimpleOnGestureListener} and overrides {@link
android.view.GestureDetector.OnGestureListener#onFling onFling()} and {@link
android.view.GestureDetector.OnGestureListener#onDown onDown()}.</p>
<p>Whether or not you use {@link android.view.GestureDetector.OnGestureListener},
it's best practice to implement an
{@link android.view.GestureDetector.OnGestureListener#onDown onDown()}
method that returns {@code true}. This is because all gestures begin with an
{@link android.view.GestureDetector.OnGestureListener#onDown onDown()} message. If you return
{@code false} from {@link android.view.GestureDetector.OnGestureListener#onDown onDown()},
as {@link android.view.GestureDetector.SimpleOnGestureListener} does by default,
the system assumes that you want to ignore the rest of the gesture, and the other methods of
{@link android.view.GestureDetector.OnGestureListener} never get called.
This has the potential to cause unexpected problems in your app.
The only time you should return {@code false} from
{@link android.view.GestureDetector.OnGestureListener#onDown onDown()}
is if you truly want to ignore an entire gesture. </p>
<pre>public class MainActivity extends Activity {

    private GestureDetectorCompat mDetector;

    &#64;Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mDetector = new GestureDetectorCompat(this, new MyGestureListener());
    }

    &#64;Override
    public boolean onTouchEvent(MotionEvent event){
        this.mDetector.onTouchEvent(event);
        return super.onTouchEvent(event);
    }

    class MyGestureListener extends GestureDetector.SimpleOnGestureListener {
        private static final String DEBUG_TAG = "Gestures";

        &#64;Override
        public boolean onDown(MotionEvent event) {
            Log.d(DEBUG_TAG,"onDown: " + event.toString());
            return true;
        }

        &#64;Override
        public boolean onFling(MotionEvent event1, MotionEvent event2,
                float velocityX, float velocityY) {
            Log.d(DEBUG_TAG, "onFling: " + event1.toString()+event2.toString());
            return true;
        }
    }
}
</pre>
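<p>{@link android.view.GestureDetector.OnGestureListener#onFling onFling()} hands
you raw velocities; a typical follow-up is to compare them against a threshold
before acting. Here is a sketch of such a check. On a device, the minimum fling
velocity would normally come from
{@link android.view.ViewConfiguration#getScaledMinimumFlingVelocity getScaledMinimumFlingVelocity()};
the value used below is an assumed placeholder:</p>

```java
public class FlingCheck {
    // Treat the gesture as a horizontal fling only if the horizontal
    // speed dominates the vertical speed and exceeds the minimum
    // fling velocity (in pixels per second).
    static boolean isHorizontalFling(float velocityX, float velocityY,
                                     float minFlingVelocityPxPerSec) {
        return Math.abs(velocityX) > Math.abs(velocityY)
                && Math.abs(velocityX) >= minFlingVelocityPxPerSec;
    }

    public static void main(String[] args) {
        // Assumed placeholder; query ViewConfiguration on a real device.
        float minVelocity = 50f;
        System.out.println(isHorizontalFling(400f, 30f, minVelocity)); // true
        System.out.println(isHorizontalFling(20f, 10f, minVelocity));  // false
    }
}
```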


@@ -0,0 +1,94 @@
page.title=Using Touch Gestures
trainingnavtop=true
startpage=true
next.title=Detect Built-in Gestures
next.link=detector.html
@jd:body
<div id="tb-wrapper">
<div id="tb">
<!-- Required platform, tools, add-ons, devices, knowledge, etc. -->
<h2>Dependencies and prerequisites</h2>
<ul>
<li>Android 1.6 (API Level 4) or higher</li>
</ul>
<h2>You should also read</h2>
<ul>
<li><a href="http://developer.android.com/guide/topics/ui/ui-events.html">Input Events</a> API Guide
</li>
<li><a href="{@docRoot}guide/topics/sensors/sensors_overview.html">Sensors Overview</a></li>
<li><a href="http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html">Making Sense of Multitouch</a> blog post</li>
<li><a href="{@docRoot}training/custom-views/making-interactive.html">Making the View Interactive</a> </li>
<li>Design Guide for <a href="{@docRoot}design/patterns/gestures.html">Gestures</a></li>
<li>Design Guide for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a></li>
</ul>
</div>
</div>
<p>This class describes how to let users interact with your app
via touch gestures. Android provides a variety of APIs to
help you create and detect gestures.</p>
<p>Although your app should not depend on touch gestures for basic behaviors (since the gestures
may not be available to all users in all contexts), adding touch-based
interaction to your app can greatly increase its usefulness and appeal.</p>
<p>To
provide users with a consistent, intuitive experience, your app should follow
the accepted Android conventions for touch gestures. The <a
href="{@docRoot}design/patterns/gestures.html">Gestures
design guide</a>
shows you how to use common gestures in Android apps. Also see the Design Guide
for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a>. </p>
<h2>Lessons</h2>
<dl>
<dt>
<strong><a href="detector.html">Detecting Common Gestures</a></strong>
</dt>
<dd>
Learn how to detect basic touch gestures such as scrolling, flinging, and double-tapping, using
{@link android.view.GestureDetector}.
</dd>
<dt>
<strong><a href="movement.html">Tracking Movement</a></strong>
</dt>
<dd>
Learn how to track movement.
</dd>
<dt>
<strong><a href="scroll.html">Animating a Scroll Gesture</a></strong>
</dt>
<dd>
Learn how to use scrollers ({@link android.widget.Scroller} or {@link
android.widget.OverScroller}) to produce a scrolling animation in response to a
touch event. </dd>
<dt>
<strong><a href="multi.html">Handling Multi-Touch Gestures</a></strong>
</dt>
<dd>
Learn how to detect multi-pointer (finger) gestures.
</dd>
<dt>
<strong><a href="scale.html">Dragging and Scaling</a></strong>
</dt>
<dd>
Learn how to implement touch-based dragging and scaling.
</dd>
<dt><strong><a href="viewgroup.html">Managing Touch Events in a ViewGroup</a></strong></dt>
<dd>Learn how to manage touch events in a {@link android.view.ViewGroup} to
ensure that touch events are correctly dispatched to their target views.</dd>
</dl>


@@ -0,0 +1,151 @@
page.title=Tracking Movement
parent.title=Using Touch Gestures
parent.link=index.html
trainingnavtop=true
next.title=Animating a Scroll Gesture
next.link=scroll.html
@jd:body
<div id="tb-wrapper">
<div id="tb">
<!-- table of contents -->
<h2>This lesson teaches you to</h2>
<ol>
<li><a href="#velocity">Track Velocity</a></li>
</ol>
<!-- other docs (NOT javadocs) -->
<h2>You should also read</h2>
<ul>
<li><a href="http://developer.android.com/guide/topics/ui/ui-events.html">Input Events</a> API Guide
</li>
<li><a href="{@docRoot}guide/topics/sensors/sensors_overview.html">Sensors Overview</a></li>
<li><a href="http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html">Making Sense of Multitouch</a> blog post</li>
<li><a href="{@docRoot}training/custom-views/making-interactive.html">Making the View Interactive</a> </li>
<li>Design Guide for <a href="{@docRoot}design/patterns/gestures.html">Gestures</a></li>
<li>Design Guide for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a></li>
</ul>
</div>
</div>
<p>This lesson describes how to track movement in touch events.</p>
<p>A new {@link
android.view.View#onTouchEvent onTouchEvent()} is triggered with an {@link
android.view.MotionEvent#ACTION_MOVE} event whenever the current touch contact
position, pressure, or size changes. As described in <a
href="detector.html">Detecting Common Gestures</a>, all of these events are
recorded in the {@link android.view.MotionEvent} parameter of {@link
android.view.View#onTouchEvent onTouchEvent()}.</p>
<p>Because finger-based touch isn't always the most precise form of interaction,
detecting touch events is often based more on movement than on simple contact.
To help apps distinguish between movement-based gestures (such as a swipe) and
non-movement gestures (such as a single tap), Android includes the notion of
"touch slop." Touch slop refers to the distance in pixels a user's touch can wander
before the gesture is interpreted as a movement-based gesture. For more discussion of this
topic, see <a href="viewgroup.html#vc">Managing Touch Events in a ViewGroup</a>.</p>
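<p>The touch-slop comparison itself is just a distance check. Here is a minimal
sketch in plain Java; on a device, the slop value would normally come from
{@link android.view.ViewConfiguration#getScaledTouchSlop getScaledTouchSlop()},
which is assumed rather than shown here:</p>

```java
public class TouchSlopCheck {
    // Returns true once the pointer has wandered farther than the slop
    // radius from where it first went down; only then should the app
    // treat the gesture as movement-based (e.g. a scroll) rather than a tap.
    static boolean exceedsTouchSlop(float downX, float downY,
                                    float currentX, float currentY,
                                    int touchSlopPx) {
        float dx = currentX - downX;
        float dy = currentY - downY;
        // Compare squared distances to avoid a square root.
        return (dx * dx + dy * dy) > (float) touchSlopPx * touchSlopPx;
    }

    public static void main(String[] args) {
        // With an 8 px slop, a 3 px drift still reads as a tap...
        System.out.println(exceedsTouchSlop(100f, 100f, 103f, 100f, 8)); // false
        // ...but a 10 px drift counts as movement.
        System.out.println(exceedsTouchSlop(100f, 100f, 110f, 100f, 8)); // true
    }
}
```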
<p>There are several different ways to track movement in a gesture, depending on
the needs of your application. For example:</p>
<ul>
<li>The starting and ending position of a pointer (for example, move an
on-screen object from point A to point B).</li>
<li>The direction the pointer is traveling in, as determined by the x and y coordinates.</li>
<li>History. You can find the size of a gesture's history by calling the {@link
android.view.MotionEvent} method {@link android.view.MotionEvent#getHistorySize
getHistorySize()}. You can then obtain the positions, sizes, time, and pressures
of each of the historical events by using the motion event's {@code
getHistorical<em>&lt;Value&gt;</em>} methods. History is useful when rendering a trail of the user's finger,
such as for touch drawing. See the {@link android.view.MotionEvent} reference for
details.</li>
<li>The velocity of the pointer as it moves across the touch screen.</li>
</ul>
<h2 id="velocity">Track Velocity</h2>
<p>A movement-based gesture may be based simply on the distance and/or direction the pointer traveled. But velocity is often a
determining factor in tracking a gesture's characteristics or even deciding
whether the gesture occurred. To make velocity calculation easier, Android
provides the {@link android.view.VelocityTracker} class and the
{@link android.support.v4.view.VelocityTrackerCompat} class in the
<a href="{@docRoot}tools/extras/support-library.html">Support Library</a>.
{@link
android.view.VelocityTracker} helps you track the velocity of touch events. This
is useful for gestures in which velocity is part of the criteria for the
gesture, such as a fling.</p>
<p>Here is a simple example that illustrates the purpose of the methods in the
{@link android.view.VelocityTracker} API:</p>
<pre>public class MainActivity extends Activity {
    private static final String DEBUG_TAG = "Velocity";
    ...
    private VelocityTracker mVelocityTracker = null;

    &#64;Override
    public boolean onTouchEvent(MotionEvent event) {
        int index = event.getActionIndex();
        int action = event.getActionMasked();
        int pointerId = event.getPointerId(index);

        switch(action) {
            case MotionEvent.ACTION_DOWN:
                if(mVelocityTracker == null) {
                    // Retrieve a new VelocityTracker object to watch the velocity of a motion.
                    mVelocityTracker = VelocityTracker.obtain();
                }
                else {
                    // Reset the velocity tracker back to its initial state.
                    mVelocityTracker.clear();
                }
                // Add a user's movement to the tracker.
                mVelocityTracker.addMovement(event);
                break;
            case MotionEvent.ACTION_MOVE:
                mVelocityTracker.addMovement(event);
                // When you want to determine the velocity, call
                // computeCurrentVelocity(). Then call getXVelocity()
                // and getYVelocity() to retrieve the velocity for each pointer ID.
                mVelocityTracker.computeCurrentVelocity(1000);
                // Log velocity in pixels per second.
                // Best practice to use VelocityTrackerCompat where possible.
                Log.d(DEBUG_TAG, "X velocity: " +
                        VelocityTrackerCompat.getXVelocity(mVelocityTracker,
                        pointerId));
                Log.d(DEBUG_TAG, "Y velocity: " +
                        VelocityTrackerCompat.getYVelocity(mVelocityTracker,
                        pointerId));
                break;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                // Return a VelocityTracker object back to be re-used by others.
                mVelocityTracker.recycle();
                // Clear the reference so a recycled tracker is never reused.
                mVelocityTracker = null;
                break;
        }
        return true;
    }
}
</pre>
<p class="note"><strong>Note:</strong> You should calculate velocity after an
{@link android.view.MotionEvent#ACTION_MOVE} event,
not after {@link android.view.MotionEvent#ACTION_UP}. After an {@link android.view.MotionEvent#ACTION_UP},
the X and Y velocities will be 0.
</p>
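<p>The {@code 1000} passed to {@code computeCurrentVelocity()} sets the units:
1000 yields pixels per second, 1 yields pixels per millisecond. A
back-of-the-envelope version of that calculation, shown as a simplified
two-sample model rather than the framework's filtered estimator:</p>

```java
public class VelocityMath {
    // Straight-line velocity between two samples, scaled to the given
    // unit window: units = 1000 -> pixels per second, units = 1 ->
    // pixels per millisecond, mirroring computeCurrentVelocity(units).
    static float velocity(float x0, long t0Ms, float x1, long t1Ms, int units) {
        long dtMs = t1Ms - t0Ms;
        if (dtMs == 0) return 0f;
        return (x1 - x0) / dtMs * units;
    }

    public static void main(String[] args) {
        // Moving 50 px in 100 ms is 500 px/s...
        System.out.println(velocity(0f, 0L, 50f, 100L, 1000)); // 500.0
        // ...or 0.5 px/ms.
        System.out.println(velocity(0f, 0L, 50f, 100L, 1));    // 0.5
    }
}
```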


@@ -0,0 +1,168 @@
page.title=Handling Multi-Touch Gestures
parent.title=Using Touch Gestures
parent.link=index.html
trainingnavtop=true
next.title=Dragging and Scaling
next.link=scale.html
@jd:body
<div id="tb-wrapper">
<div id="tb">
<!-- table of contents -->
<h2>This lesson teaches you to</h2>
<ol>
<li><a href="#track">Track Multiple Pointers</a></li>
<li><a href="#action">Get a MotionEvent's Action</a></li>
</ol>
<!-- other docs (NOT javadocs) -->
<h2>You should also read</h2>
<ul>
<li><a href="http://developer.android.com/guide/topics/ui/ui-events.html">Input Events</a> API Guide
</li>
<li><a href="{@docRoot}guide/topics/sensors/sensors_overview.html">Sensors Overview</a></li>
<li><a href="http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html">Making Sense of Multitouch</a> blog post</li>
<li><a href="{@docRoot}training/custom-views/making-interactive.html">Making the View Interactive</a> </li>
<li>Design Guide for <a href="{@docRoot}design/patterns/gestures.html">Gestures</a></li>
<li>Design Guide for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a></li>
</ul>
</div>
</div>
<p>A multi-touch gesture occurs when multiple pointers (fingers) touch the screen
at the same time. This lesson describes how to detect gestures that involve
multiple pointers.</p>
<h2 id="track">Track Multiple Pointers</h2>
<p>When multiple pointers touch the screen at the same time, the system generates the
following touch events:</p>
<ul>
<li>{@link android.view.MotionEvent#ACTION_DOWN}&mdash;For the first pointer that
touches the screen. This starts the gesture. The pointer data for this pointer is
always at index 0 in the {@link android.view.MotionEvent}.</li>
<li>{@link android.support.v4.view.MotionEventCompat#ACTION_POINTER_DOWN}&mdash;For
extra pointers that enter the screen beyond the first. The pointer data for this
pointer is at the index returned by {@link android.support.v4.view.MotionEventCompat#getActionIndex getActionIndex()}.</li>
<li>{@link android.view.MotionEvent#ACTION_MOVE}&mdash;A change has happened during a press gesture.</li>
<li>{@link android.support.v4.view.MotionEventCompat#ACTION_POINTER_UP}&mdash;Sent when a non-primary pointer goes up.</li>
<li>{@link android.view.MotionEvent#ACTION_UP}&mdash;Sent when the last pointer leaves the screen.</li>
</ul>
<p>You keep track of individual pointers within a {@link
android.view.MotionEvent} via each pointer's index and ID:</p>
<ul>
<li><strong>Index</strong>: A {@link android.view.MotionEvent} effectively
stores information about each pointer in an array. The index of a pointer is its position
within this array. Most of the {@link
android.view.MotionEvent} methods you use to interact with pointers take the
pointer index as a parameter, not the pointer ID. </li>
<li><strong>ID</strong>: Each pointer also has an ID mapping that stays
persistent across touch events to allow tracking an individual pointer across
the entire gesture.</li>
</ul>
<p>The order in which individual pointers appear within a motion event is
undefined. Thus the index of a pointer can change from one event to the
next, but the pointer ID of a pointer is guaranteed to remain constant as long
as the pointer remains active. Use the {@link
android.view.MotionEvent#getPointerId getPointerId()} method to obtain a
pointer's ID to track the pointer across all subsequent motion events in a
gesture. Then for successive motion events, use the {@link
android.view.MotionEvent#findPointerIndex findPointerIndex()} method to obtain
the pointer index for a given pointer ID in that motion event. For example:</p>
<pre>private int mActivePointerId;

public boolean onTouchEvent(MotionEvent event) {
    ...
    // Get the pointer ID
    mActivePointerId = event.getPointerId(0);

    // ... Many touch events later...

    // Use the pointer ID to find the index of the active pointer
    // and fetch its position
    int pointerIndex = event.findPointerIndex(mActivePointerId);
    // Get the pointer's current position
    float x = event.getX(pointerIndex);
    float y = event.getY(pointerIndex);
}</pre>
<h2 id="action">Get a MotionEvent's Action</h2>
<p>You should always use the method
{@link android.view.MotionEvent#getActionMasked getActionMasked()} (or better yet, the compatibility version
{@link android.support.v4.view.MotionEventCompat#getActionMasked MotionEventCompat.getActionMasked()}) to retrieve
the action of a
{@link android.view.MotionEvent}. Unlike the older {@link android.view.MotionEvent#getAction getAction()}
method, {@link android.support.v4.view.MotionEventCompat#getActionMasked getActionMasked()} is designed to work with
multiple pointers. It returns the masked action
being performed, without including the pointer index bits. You can then use
{@link android.support.v4.view.MotionEventCompat#getActionIndex getActionIndex()} to return the index of
the pointer associated with the action. This is illustrated in the snippet below.</p>
<p class="note"><strong>Note:</strong> This example uses the
{@link android.support.v4.view.MotionEventCompat}
class. This class is in the
<a href="{@docRoot}tools/extras/support-library.html">Support Library</a>. You should use
{@link android.support.v4.view.MotionEventCompat} to provide the best support for a wide range of
platforms. Note that {@link android.support.v4.view.MotionEventCompat} is <em>not</em> a
replacement for the {@link android.view.MotionEvent} class. Rather, it provides static utility
methods to which you pass your {@link android.view.MotionEvent} object in order to receive
the desired action associated with that event.</p>
<pre>int action = MotionEventCompat.getActionMasked(event);
// Get the index of the pointer associated with the action.
int index = MotionEventCompat.getActionIndex(event);
int xPos = -1;
int yPos = -1;

Log.d(DEBUG_TAG,"The action is " + actionToString(action));

if (event.getPointerCount() > 1) {
    Log.d(DEBUG_TAG,"Multitouch event");
    // The coordinates of the current screen contact, relative to
    // the responding View or Activity.
    xPos = (int)MotionEventCompat.getX(event, index);
    yPos = (int)MotionEventCompat.getY(event, index);

} else {
    // Single touch event
    Log.d(DEBUG_TAG,"Single touch event");
    xPos = (int)MotionEventCompat.getX(event, index);
    yPos = (int)MotionEventCompat.getY(event, index);
}
...

// Given an action int, returns a string description
public static String actionToString(int action) {
    switch (action) {
        case MotionEvent.ACTION_DOWN: return "Down";
        case MotionEvent.ACTION_MOVE: return "Move";
        case MotionEvent.ACTION_POINTER_DOWN: return "Pointer Down";
        case MotionEvent.ACTION_UP: return "Up";
        case MotionEvent.ACTION_POINTER_UP: return "Pointer Up";
        case MotionEvent.ACTION_OUTSIDE: return "Outside";
        case MotionEvent.ACTION_CANCEL: return "Cancel";
    }
    return "";
}</pre>
<p>For more discussion of multi-touch and some examples, see the lesson <a href="scale.html">Dragging and Scaling</a>.</p>


@@ -0,0 +1,240 @@
page.title=Dragging and Scaling
parent.title=Using Touch Gestures
parent.link=index.html
trainingnavtop=true
next.title=Managing Touch Events in a ViewGroup
next.link=viewgroup.html
@jd:body
<div id="tb-wrapper">
<div id="tb">
<!-- table of contents -->
<h2>This lesson teaches you to</h2>
<ol>
<li><a href="#drag">Drag an Object</a></li>
<li><a href="#scale">Use Touch to Perform Scaling</a></li>
</ol>
<!-- other docs (NOT javadocs) -->
<h2>You should also read</h2>
<ul>
<li><a href="http://developer.android.com/guide/topics/ui/ui-events.html">Input Events</a> API Guide
</li>
<li><a href="{@docRoot}guide/topics/sensors/sensors_overview.html">Sensors Overview</a></li>
<li><a href="http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html">Making Sense of Multitouch</a> blog post</li>
<li><a href="{@docRoot}training/custom-views/making-interactive.html">Making the View Interactive</a> </li>
<li>Design Guide for <a href="{@docRoot}design/patterns/gestures.html">Gestures</a></li>
<li>Design Guide for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a></li>
</ul>
</div>
</div>
<p>This lesson describes how to use touch gestures to drag and scale on-screen
objects, using {@link android.view.View#onTouchEvent onTouchEvent()} to intercept
touch events. Here is the original <a
href="http://code.google.com/p/android-touchexample/">source code</a>
for the examples used in this lesson.
</p>
<h2 id="drag">Drag an Object</h2>
<p class="note">If you are targeting Android 3.0 or higher, you can use the built-in drag-and-drop event
listeners with {@link android.view.View.OnDragListener}, as described in
<a href="{@docRoot}guide/topics/ui/drag-drop.html">Drag and Drop</a>.</p>
<p>A common operation for a touch gesture is to use it to drag an object across
the screen. The following snippet lets the user drag an on-screen image. Note
the following:</p>
<ul>
<li>In a drag (or scroll) operation, the app has to keep track of the original pointer
(finger), even if additional fingers get placed on the screen. For example,
imagine that while dragging the image around, the user places a second finger on
the touch screen and lifts the first finger. If your app is just tracking
individual pointers, it will regard the second pointer as the default and move
the image to that location.</li>
<li>To prevent this from happening, your app needs to distinguish between the
original pointer and any follow-on pointers. To do this, it tracks the
{@link android.view.MotionEvent#ACTION_POINTER_DOWN} and
{@link android.view.MotionEvent#ACTION_POINTER_UP} events described in
<a href="multi.html">Handling Multi-Touch Gestures</a>.
{@link android.view.MotionEvent#ACTION_POINTER_DOWN} and
{@link android.view.MotionEvent#ACTION_POINTER_UP} are
passed to the {@link android.view.View#onTouchEvent onTouchEvent()} callback
whenever a secondary pointer goes down or up. </li>
<li>In the {@link android.view.MotionEvent#ACTION_POINTER_UP} case, the example
extracts the index of the pointer that went up and checks whether it is the
active pointer. If it is, the app selects a different pointer to be active and
saves that pointer's current X and Y position. Since this saved position is
used in the {@link android.view.MotionEvent#ACTION_MOVE}
case to calculate the distance to move the onscreen object, the app always
calculates the distance to move using data from the correct pointer.</li>
</ul>
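The pointer bookkeeping described in the last bullet can be isolated from the Android APIs. The following plain-Java sketch (the class and method names are hypothetical, not framework APIs) captures the decision made in the <code>ACTION_POINTER_UP</code> case: pointer IDs are stable for the life of a touch, while pointer indices are not.

```java
public class ActivePointerDemo {
    // Given the ids of the pointers currently down, the index of the pointer
    // that just went up, and the id of the pointer we were tracking, return
    // the id of the pointer to track from now on.
    public static int chooseNewActivePointer(int[] pointerIds, int upIndex,
            int activePointerId) {
        // Only switch if the pointer that lifted is the one we were tracking.
        if (pointerIds[upIndex] != activePointerId) {
            return activePointerId;
        }
        // Pick a remaining pointer. This mirrors the two-pointer assumption
        // in the snippet below: index 0 unless index 0 is the one that lifted.
        int newIndex = (upIndex == 0) ? 1 : 0;
        return pointerIds[newIndex];
    }

    public static void main(String[] args) {
        // Pointer id 7 (index 0) lifts while we were tracking it: switch to 9.
        System.out.println(chooseNewActivePointer(new int[]{7, 9}, 0, 7)); // 9
        // Pointer id 9 (index 1) lifts, but we were tracking 7: keep 7.
        System.out.println(chooseNewActivePointer(new int[]{7, 9}, 1, 7)); // 7
    }
}
```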
<p>The following snippet enables a user to drag an object around on the screen. It records the initial
position of the active pointer, calculates the distance the pointer traveled, and moves the object to the
new position. It correctly manages the possibility of additional pointers, as described
above.</p>
<p>Notice that the snippet uses the {@link android.view.MotionEvent#getActionMasked getActionMasked()} method.
You should always use this method (or better yet, the compatibility version
{@link android.support.v4.view.MotionEventCompat#getActionMasked MotionEventCompat.getActionMasked()})
to retrieve the action of a
{@link android.view.MotionEvent}. Unlike the older
{@link android.view.MotionEvent#getAction getAction()}
method, {@link android.support.v4.view.MotionEventCompat#getActionMasked getActionMasked()}
is designed to work with multiple pointers. It returns the masked action
being performed, without including the pointer index bits.</p>
<pre>// The active pointer is the one currently moving our object.
private int mActivePointerId = INVALID_POINTER_ID;

&#64;Override
public boolean onTouchEvent(MotionEvent ev) {
    // Let the ScaleGestureDetector inspect all events.
    mScaleDetector.onTouchEvent(ev);

    final int action = MotionEventCompat.getActionMasked(ev);

    switch (action) {
    case MotionEvent.ACTION_DOWN: {
        final int pointerIndex = MotionEventCompat.getActionIndex(ev);
        final float x = MotionEventCompat.getX(ev, pointerIndex);
        final float y = MotionEventCompat.getY(ev, pointerIndex);

        // Remember where we started (for dragging)
        mLastTouchX = x;
        mLastTouchY = y;
        // Save the ID of this pointer (for dragging)
        mActivePointerId = MotionEventCompat.getPointerId(ev, 0);
        break;
    }

    case MotionEvent.ACTION_MOVE: {
        // Find the index of the active pointer and fetch its position
        final int pointerIndex =
                MotionEventCompat.findPointerIndex(ev, mActivePointerId);
        final float x = MotionEventCompat.getX(ev, pointerIndex);
        final float y = MotionEventCompat.getY(ev, pointerIndex);

        // Only move if the ScaleGestureDetector isn't processing a gesture.
        if (!mScaleDetector.isInProgress()) {
            // Calculate the distance moved
            final float dx = x - mLastTouchX;
            final float dy = y - mLastTouchY;

            mPosX += dx;
            mPosY += dy;

            invalidate();
        }
        // Remember this touch position for the next move event
        mLastTouchX = x;
        mLastTouchY = y;
        break;
    }

    case MotionEvent.ACTION_UP:
    case MotionEvent.ACTION_CANCEL: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }

    case MotionEvent.ACTION_POINTER_UP: {
        final int pointerIndex = MotionEventCompat.getActionIndex(ev);
        final int pointerId = MotionEventCompat.getPointerId(ev, pointerIndex);

        if (pointerId == mActivePointerId) {
            // This was our active pointer going up. Choose a new
            // active pointer and adjust accordingly.
            final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
            mLastTouchX = MotionEventCompat.getX(ev, newPointerIndex);
            mLastTouchY = MotionEventCompat.getY(ev, newPointerIndex);
            mActivePointerId = MotionEventCompat.getPointerId(ev, newPointerIndex);
        }
        break;
    }
    }
    return true;
}</pre>
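To see why the masked action matters, consider how a {@link android.view.MotionEvent} action is encoded: the low byte holds the action, and for <code>ACTION_POINTER_DOWN</code>/<code>ACTION_POINTER_UP</code> the next byte holds the pointer index. The constants below copy the documented {@link android.view.MotionEvent} values; the demo class itself is just an illustrative sketch.

```java
public class ActionMaskDemo {
    // These values mirror the documented MotionEvent constants.
    static final int ACTION_MASK = 0xff;
    static final int ACTION_POINTER_INDEX_MASK = 0xff00;
    static final int ACTION_POINTER_INDEX_SHIFT = 8;
    static final int ACTION_POINTER_DOWN = 5;

    // What getActionMasked() returns: the action with index bits removed.
    public static int maskedAction(int rawAction) {
        return rawAction & ACTION_MASK;
    }

    // What getActionIndex() returns: the pointer index packed into the action.
    public static int actionIndex(int rawAction) {
        return (rawAction & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }

    public static void main(String[] args) {
        // Raw action for "pointer at index 1 went down": 0x0105.
        int raw = ACTION_POINTER_DOWN | (1 << ACTION_POINTER_INDEX_SHIFT);
        System.out.println(maskedAction(raw)); // 5 (ACTION_POINTER_DOWN)
        System.out.println(actionIndex(raw));  // 1
    }
}
```

This is why comparing the raw {@link android.view.MotionEvent#getAction getAction()} value directly against <code>ACTION_POINTER_DOWN</code> fails for non-primary pointers: the raw value still includes the index bits.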
<h2 id="scale">Use Touch to Perform Scaling</h2>
<p>As discussed in <a href="detector.html">Detecting Common Gestures</a>,
{@link android.view.GestureDetector} helps you detect common gestures used by
Android such as scrolling, flinging, and long press. For scaling, Android
provides {@link android.view.ScaleGestureDetector}. {@link
android.view.GestureDetector} and {@link android.view.ScaleGestureDetector} can
be used together when you want a view to recognize additional gestures.</p>
<p>To report detected gesture events, gesture detectors use listener objects
passed to their constructors. {@link android.view.ScaleGestureDetector} uses
{@link android.view.ScaleGestureDetector.OnScaleGestureListener}.
Android provides
{@link android.view.ScaleGestureDetector.SimpleOnScaleGestureListener}
as a helper class that you can extend if you don't care about all of the reported events.</p>
<p>The following snippet illustrates the basic idea of how to perform scaling.
The original <a
href="http://code.google.com/p/android-touchexample/">source code</a>
for the examples is also available.</p>
<pre>private ScaleGestureDetector mScaleDetector;
private float mScaleFactor = 1.f;

public MyCustomView(Context context) {
    ...
    // View code goes here
    ...
    mScaleDetector = new ScaleGestureDetector(context, new ScaleListener());
}

&#64;Override
public boolean onTouchEvent(MotionEvent ev) {
    // Let the ScaleGestureDetector inspect all events.
    mScaleDetector.onTouchEvent(ev);
    return true;
}

&#64;Override
public void onDraw(Canvas canvas) {
    super.onDraw(canvas);

    canvas.save();
    canvas.scale(mScaleFactor, mScaleFactor);
    ...
    // onDraw() code goes here
    ...
    canvas.restore();
}

private class ScaleListener
        extends ScaleGestureDetector.SimpleOnScaleGestureListener {
    &#64;Override
    public boolean onScale(ScaleGestureDetector detector) {
        mScaleFactor *= detector.getScaleFactor();

        // Don't let the object get too small or too large.
        mScaleFactor = Math.max(0.1f, Math.min(mScaleFactor, 5.0f));

        invalidate();
        return true;
    }
}</pre>
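The clamping in <code>onScale()</code> above is worth seeing in isolation, because {@link android.view.ScaleGestureDetector#getScaleFactor getScaleFactor()} reports a multiplicative delta that accumulates across events. This sketch (the bounds 0.1 and 5.0 come from the snippet; the class itself is hypothetical) shows the accumulation and the clamping:

```java
public class ScaleClampDemo {
    static float mScaleFactor = 1.0f;

    // Same accumulate-then-clamp logic as onScale() above.
    public static float applyScale(float detectorScaleFactor) {
        mScaleFactor *= detectorScaleFactor;
        mScaleFactor = Math.max(0.1f, Math.min(mScaleFactor, 5.0f));
        return mScaleFactor;
    }

    public static void main(String[] args) {
        applyScale(2.0f);                     // 1.0 * 2.0 = 2.0
        applyScale(4.0f);                     // would be 8.0, clamped to 5.0
        System.out.println(mScaleFactor);     // 5.0
        applyScale(0.01f);                    // would be ~0.05, clamped to 0.1
        System.out.println(mScaleFactor);     // 0.1
    }
}
```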

View File

@@ -0,0 +1,161 @@
page.title=Animating a Scroll Gesture
parent.title=Using Touch Gestures
parent.link=index.html
trainingnavtop=true
next.title=Handling Multi-Touch Gestures
next.link=multi.html
@jd:body
<div id="tb-wrapper">
<div id="tb">
<!-- table of contents -->
<h2>This lesson teaches you to</h2>
<ol>
<li><a href="#scroll">Implement Touch-Based Scrolling</a></li>
</ol>
<!-- other docs (NOT javadocs) -->
<h2>You should also read</h2>
<ul>
<li><a href="http://developer.android.com/guide/topics/ui/ui-events.html">Input Events</a> API Guide
</li>
<li><a href="{@docRoot}guide/topics/sensors/sensors_overview.html">Sensors Overview</a></li>
<li><a href="http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html">Making Sense of Multitouch</a> blog post</li>
<li><a href="{@docRoot}training/custom-views/making-interactive.html">Making the View Interactive</a> </li>
<li>Design Guide for <a href="{@docRoot}design/patterns/gestures.html">Gestures</a></li>
<li>Design Guide for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a></li>
</ul>
</div>
</div>
<p>In Android, scrolling is typically achieved by using the
{@link android.widget.ScrollView}
class. Any standard layout that might extend beyond the bounds of its container should be
nested in a {@link android.widget.ScrollView} to provide a scrollable view that's
managed by the framework. Implementing a custom scroller should only be
necessary for special scenarios. This lesson describes such a scenario: displaying
a scrolling effect in response to touch gestures using <em>scrollers</em>.</p>
<p>You can use scrollers ({@link android.widget.Scroller} or {@link
android.widget.OverScroller}) to collect the data you need to produce a
scrolling animation in response to a touch event. {@link
android.widget.Scroller} and {@link android.widget.OverScroller} are largely
interchangeable&mdash;the difference is that {@link android.widget.OverScroller}
allows temporarily scrolling beyond the minimum/maximum boundaries and springing
back to the bounds. This is normally rendered using a "glow" effect, provided by
the {@link android.widget.EdgeEffect} or {@link
android.support.v4.widget.EdgeEffectCompat} classes. </p>
<p>A scroller is used to animate scrolling over time, using platform-standard
scrolling physics (friction, velocity, etc.). The scroller itself doesn't
actually draw anything. Scrollers track scroll offsets for you over time, but
they don't automatically apply those positions to your view. It's your
responsibility to get and apply new coordinates at a rate that will make the
scrolling animation look smooth.</p>
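That contract (the scroller computes offsets over time; the caller polls it and applies the positions) can be sketched with a toy scroller that interpolates linearly over a fixed duration. This is not the real {@link android.widget.Scroller}, which adds friction and velocity decay; the class and its fields are hypothetical, and only the polling pattern is the point:

```java
public class MinimalScroller {
    private final int startX;
    private final int endX;
    private final long startTime;
    private final long duration;
    int currX;

    public MinimalScroller(int startX, int endX, long startTime, long durationMs) {
        this.startX = startX;
        this.endX = endX;
        this.startTime = startTime;
        this.duration = durationMs;
        this.currX = startX;
    }

    // Analogue of computeScrollOffset(): updates currX and returns true
    // while the scroll is still in progress.
    public boolean computeScrollOffset(long now) {
        long elapsed = now - startTime;
        if (elapsed >= duration) {
            currX = endX;  // snap to the final position
            return false;  // the scroll has finished
        }
        currX = startX + (int) ((endX - startX) * elapsed / duration);
        return true;
    }

    public static void main(String[] args) {
        MinimalScroller s = new MinimalScroller(0, 100, 0L, 1000L);
        s.computeScrollOffset(500L);
        System.out.println(s.currX); // 50: halfway through the animation
    }
}
```

The caller, like <code>computeScroll()</code> in a real view, is responsible for repeatedly polling and redrawing until the method returns {@code false}.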
<p class="note"><strong>Note:</strong> You generally only need to use scrollers
when implementing scrolling yourself. {@link android.widget.ScrollView} and
{@link android.widget.HorizontalScrollView} do all of this for you if you nest your layout within them.</p>
<h2 id="scroll">Implement Touch-Based Scrolling</h2>
<p>This snippet illustrates the basics of using a scroller. It uses a
{@link android.view.GestureDetector}, and overrides the
{@link android.view.GestureDetector.SimpleOnGestureListener} methods
{@link android.view.GestureDetector.OnGestureListener#onDown onDown()} and
{@link android.view.GestureDetector.OnGestureListener#onFling onFling()}. It also
overrides {@link android.view.GestureDetector.OnGestureListener#onScroll onScroll()}
to return {@code false} since you don't need to animate a scroll.</p>
<p>It's common to use scrollers in conjunction with a fling gesture, but they
can be used in pretty much any context where you want the UI to display
scrolling in response to a touch event. For example, you could override {@link
android.view.View#onTouchEvent onTouchEvent()} to process touch events directly,
and produce a scrolling effect in response to those touch events.</p>
<pre>
private OverScroller mScroller = new OverScroller(context);

private GestureDetector.SimpleOnGestureListener mGestureListener
        = new GestureDetector.SimpleOnGestureListener() {
    &#64;Override
    public boolean onDown(MotionEvent e) {
        // Abort any active scroll animations and invalidate.
        mScroller.forceFinished(true);
        // There is also a compatibility version:
        // ViewCompat.postInvalidateOnAnimation
        postInvalidateOnAnimation();
        return true;
    }

    &#64;Override
    public boolean onScroll(MotionEvent e1, MotionEvent e2,
            float distanceX, float distanceY) {
        // You don't use a scroller in onScroll because you don't need to
        // animate a scroll. The scroll occurs instantly in response to
        // touch feedback.
        return false;
    }

    &#64;Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
            float velocityX, float velocityY) {
        // Before flinging, abort the current animation.
        mScroller.forceFinished(true);
        // Begin the scroll animation
        mScroller.fling(
                // Current scroll position
                startX,
                startY,
                // Velocities, negated for natural touch response
                (int) -velocityX,
                (int) -velocityY,
                // Minimum and maximum scroll positions. The minimum scroll
                // position is generally zero and the maximum scroll position
                // is generally the content size less the screen size. So if the
                // content width is 1000 pixels and the screen width is 200
                // pixels, the maximum scroll offset should be 800 pixels.
                minX, maxX,
                minY, maxY,
                // The maximum overscroll bounds. This is useful when using
                // the EdgeEffect class to draw overscroll "glow" overlays.
                mContentRect.width() / 2,
                mContentRect.height() / 2);
        // Invalidate to trigger computeScroll()
        postInvalidateOnAnimation();
        return true;
    }
};

&#64;Override
public void computeScroll() {
    super.computeScroll();

    // Compute the current scroll offsets. If this returns true, then the
    // scroll has not yet finished.
    if (mScroller.computeScrollOffset()) {
        int currX = mScroller.getCurrX();
        int currY = mScroller.getCurrY();

        // Actually render the scrolled viewport, or actually scroll the
        // view using View.scrollTo.

        // If currX or currY are outside the bounds, render the overscroll
        // glow using EdgeEffect.
    } else {
        // The scroll has finished.
    }
}</pre>
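The <code>minX, maxX, minY, maxY</code> arguments above follow the simple rule spelled out in the comment: the maximum scroll position is the content size less the viewport size, floored at zero. As a quick sanity check, using a hypothetical helper:

```java
public class ScrollBoundsDemo {
    // Maximum scroll position: content size minus viewport size, never
    // negative. Matches the comment in the fling() call above.
    public static int maxScroll(int contentSize, int viewportSize) {
        return Math.max(0, contentSize - viewportSize);
    }

    public static void main(String[] args) {
        System.out.println(maxScroll(1000, 200)); // 800, as in the example above
        System.out.println(maxScroll(150, 200));  // 0: content fits, no scrolling
    }
}
```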
<p>For another example of scroller usage, see the <a href="http://github.com/android/platform_frameworks_support/blob/master/v4/java/android/support/v4/view/ViewPager.java">source code</a> for the
{@link android.support.v4.view.ViewPager} class. It scrolls in response to flings,
and uses scrolling to implement the "snapping to page" animation.</p>

View File

@@ -0,0 +1,302 @@
page.title=Managing Touch Events in a ViewGroup
parent.title=Using Touch Gestures
parent.link=index.html
trainingnavtop=true
next.title=
next.link=
@jd:body
<div id="tb-wrapper">
<div id="tb">
<!-- table of contents -->
<h2>This lesson teaches you to</h2>
<ol>
<li><a href="#intercept">Intercept Touch Events in a ViewGroup</a></li>
<li><a href="#vc">Use ViewConfiguration Constants</a></li>
<li><a href="#delegate">Extend a Child View's Touchable Area</a></li>
</ol>
<!-- other docs (NOT javadocs) -->
<h2>You should also read</h2>
<ul>
<li><a href="http://developer.android.com/guide/topics/ui/ui-events.html">Input Events</a> API Guide
</li>
<li><a href="{@docRoot}guide/topics/sensors/sensors_overview.html">Sensors Overview</a></li>
<li><a href="http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html">Making Sense of Multitouch</a> blog post</li>
<li><a href="{@docRoot}training/custom-views/making-interactive.html">Making the View Interactive</a> </li>
<li>Design Guide for <a href="{@docRoot}design/patterns/gestures.html">Gestures</a></li>
<li>Design Guide for <a href="{@docRoot}design/style/touch-feedback.html">Touch Feedback</a></li>
</ul>
</div>
</div>
<p>Handling touch events in a {@link android.view.ViewGroup} takes special care,
because it's common for a {@link android.view.ViewGroup} to have children that
are targets for different touch events than the {@link android.view.ViewGroup}
itself. To make sure that each view correctly receives the touch events intended
for it, override the {@link android.view.ViewGroup#onInterceptTouchEvent
onInterceptTouchEvent()} method.</p>
<h2 id="intercept">Intercept Touch Events in a ViewGroup</h2>
<p>The {@link android.view.ViewGroup#onInterceptTouchEvent onInterceptTouchEvent()}
method is called whenever a touch event is detected on the surface of a
{@link android.view.ViewGroup}, including on the surface of its children. If
{@link android.view.ViewGroup#onInterceptTouchEvent onInterceptTouchEvent()}
returns {@code true}, the {@link android.view.MotionEvent} is intercepted,
meaning it will not be passed on to the child, but rather to the
{@link android.view.View#onTouchEvent onTouchEvent()} method of the parent.</p>
<p>The {@link android.view.ViewGroup#onInterceptTouchEvent onInterceptTouchEvent()}
method gives a parent the chance to see any touch event before its children do.
If you return {@code true} from
{@link android.view.ViewGroup#onInterceptTouchEvent onInterceptTouchEvent()},
the child view that was previously handling touch events
receives an {@link android.view.MotionEvent#ACTION_CANCEL}, and the events from that
point forward are sent to the parent's
{@link android.view.View#onTouchEvent onTouchEvent()} method for the usual handling.
{@link android.view.ViewGroup#onInterceptTouchEvent onInterceptTouchEvent()} can also
return {@code false} and simply spy on events as they travel down the view hierarchy
to their usual targets, which will handle the events with their own
{@link android.view.View#onTouchEvent onTouchEvent()}.</p>
<p>In the following snippet, the class {@code MyViewGroup} extends
{@link android.view.ViewGroup}.
{@code MyViewGroup} contains multiple child views. If you drag your finger across
a child view horizontally, the child view should no longer get touch events, and
{@code MyViewGroup} should handle touch events by scrolling its contents. However,
if you press buttons in the child view, or scroll the child view vertically,
the parent shouldn't intercept those touch events, because the child is the
intended target. In those cases,
{@link android.view.ViewGroup#onInterceptTouchEvent onInterceptTouchEvent()} should
return {@code false}, and {@code MyViewGroup}'s
{@link android.view.View#onTouchEvent onTouchEvent()} won't be called.</p>
<pre>public class MyViewGroup extends ViewGroup {

    private int mTouchSlop;

    ...
    // In the constructor:
    ViewConfiguration vc = ViewConfiguration.get(getContext());
    mTouchSlop = vc.getScaledTouchSlop();
    ...

    &#64;Override
    public boolean onInterceptTouchEvent(MotionEvent ev) {
        /*
         * This method JUST determines whether we want to intercept the motion.
         * If we return true, onTouchEvent will be called and we do the actual
         * scrolling there.
         */

        final int action = MotionEventCompat.getActionMasked(ev);

        // Always handle the case of the touch gesture being complete.
        if (action == MotionEvent.ACTION_CANCEL || action == MotionEvent.ACTION_UP) {
            // Release the scroll.
            mIsScrolling = false;
            return false; // Do not intercept touch event, let the child handle it
        }

        switch (action) {
        case MotionEvent.ACTION_MOVE: {
            if (mIsScrolling) {
                // We're currently scrolling, so yes, intercept the
                // touch event!
                return true;
            }

            // If the user has dragged their finger horizontally more than
            // the touch slop, start the scroll.
            // calculateDistanceX() is left as an exercise for the reader;
            // touch slop should be calculated using ViewConfiguration
            // constants.
            final int xDiff = calculateDistanceX(ev);
            if (xDiff &gt; mTouchSlop) {
                // Start scrolling!
                mIsScrolling = true;
                return true;
            }
            break;
        }
        ...
        }

        // In general, we don't want to intercept touch events. They should be
        // handled by the child view.
        return false;
    }

    &#64;Override
    public boolean onTouchEvent(MotionEvent ev) {
        // Here we actually handle the touch event (e.g. if the action is
        // ACTION_MOVE, scroll this container).
        // This method will only be called if the touch event was intercepted
        // in onInterceptTouchEvent.
        ...
    }
}</pre>
<p>Note that {@link android.view.ViewGroup} also provides a
{@link android.view.ViewGroup#requestDisallowInterceptTouchEvent requestDisallowInterceptTouchEvent()} method.
A child view calls this method on its parent when it does not want the parent and its
ancestors to intercept touch events with
{@link android.view.ViewGroup#onInterceptTouchEvent onInterceptTouchEvent()}.
</p>
<h2 id="vc">Use ViewConfiguration Constants</h2>
<p>The above snippet uses the current {@link android.view.ViewConfiguration} to initialize
a variable called {@code mTouchSlop}. You can use the {@link
android.view.ViewConfiguration} class to access common distances, speeds, and
times used by the Android system.</p>
<p>"Touch slop" refers to the distance in pixels a user's touch can wander
before the gesture is interpreted as scrolling. Touch slop is typically used to
prevent accidental scrolling when the user is performing some other touch
operation, such as touching on-screen elements.</p>
<p>Two other commonly used {@link android.view.ViewConfiguration} methods are
{@link android.view.ViewConfiguration#getScaledMinimumFlingVelocity getScaledMinimumFlingVelocity()}
and {@link android.view.ViewConfiguration#getScaledMaximumFlingVelocity getScaledMaximumFlingVelocity()}.
These methods return the minimum and maximum velocity (respectively) to initiate a fling,
as measured in pixels per second. For example:</p>
<pre>ViewConfiguration vc = ViewConfiguration.get(view.getContext());
private int mSlop = vc.getScaledTouchSlop();
private int mMinFlingVelocity = vc.getScaledMinimumFlingVelocity();
private int mMaxFlingVelocity = vc.getScaledMaximumFlingVelocity();
...
case MotionEvent.ACTION_MOVE: {
    ...
    float deltaX = motionEvent.getRawX() - mDownX;
    if (Math.abs(deltaX) &gt; mSlop) {
        // A swipe occurred, do something
    }
    ...
    break;
}
case MotionEvent.ACTION_UP: {
    ...
    if (mMinFlingVelocity &lt;= velocityX &amp;&amp; velocityX &lt;= mMaxFlingVelocity
            &amp;&amp; velocityY &lt; velocityX) {
        // The criteria have been satisfied, do something
    }
    break;
}</pre>
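Both checks above reduce to simple threshold comparisons. The sketch below uses hard-coded hypothetical values in place of the device-specific ones that {@link android.view.ViewConfiguration} returns, and wraps each check in a helper; the fling check here uses absolute values, which is the usual intent for a fling in either direction:

```java
public class GestureThresholdsDemo {
    // Hypothetical values; on a device these come from ViewConfiguration.
    static final int TOUCH_SLOP = 24;           // px
    static final int MIN_FLING_VELOCITY = 50;   // px/s
    static final int MAX_FLING_VELOCITY = 8000; // px/s

    // A drag only counts as a swipe once it exceeds the touch slop.
    public static boolean isSwipe(float downX, float currentX) {
        return Math.abs(currentX - downX) > TOUCH_SLOP;
    }

    // A fling must fall between the minimum and maximum fling velocities.
    public static boolean isFling(float velocityX) {
        float v = Math.abs(velocityX);
        return v >= MIN_FLING_VELOCITY && v <= MAX_FLING_VELOCITY;
    }

    public static void main(String[] args) {
        System.out.println(isSwipe(100f, 110f)); // false: within touch slop
        System.out.println(isSwipe(100f, 150f)); // true
        System.out.println(isFling(-2000f));     // true
        System.out.println(isFling(20f));        // false: too slow to be a fling
    }
}
```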
<h2 id="delegate">Extend a Child View's Touchable Area</h2>
<p>Android provides the {@link android.view.TouchDelegate} class to make it possible
for a parent to extend the touchable area of a child view beyond the child's bounds.
This is useful when the child has to be small, but should have a larger touch region. You can
also use this approach to shrink the child's touch region if need be.</p>
<p>In the following example, an {@link android.widget.ImageButton} is the
"delegate view" (that is, the child whose touch area the parent will extend).
Here is the layout file:</p>
<pre>
&lt;RelativeLayout xmlns:android=&quot;http://schemas.android.com/apk/res/android&quot;
     xmlns:tools=&quot;http://schemas.android.com/tools&quot;
     android:id=&quot;@+id/parent_layout&quot;
     android:layout_width=&quot;match_parent&quot;
     android:layout_height=&quot;match_parent&quot;
     tools:context=&quot;.MainActivity&quot; &gt;

     &lt;ImageButton android:id=&quot;@+id/button&quot;
          android:layout_width=&quot;wrap_content&quot;
          android:layout_height=&quot;wrap_content&quot;
          android:background=&quot;@null&quot;
          android:src=&quot;@drawable/icon&quot; /&gt;

&lt;/RelativeLayout&gt;
</pre>
<p>The snippet below does the following:</p>
<ul>
<li>Gets the parent view and posts a {@link java.lang.Runnable} on the UI thread. This ensures that the parent lays out its children before calling the {@link android.view.View#getHitRect getHitRect()} method. The {@link android.view.View#getHitRect getHitRect()} method gets the child's hit rectangle (touchable area) in the parent's coordinates.</li>
<li>Finds the {@link android.widget.ImageButton} child view and calls {@link android.view.View#getHitRect getHitRect()} to get the bounds of the child's touchable area.</li>
<li>Extends the bounds of the {@link android.widget.ImageButton}'s hit rectangle.</li>
<li>Instantiates a {@link android.view.TouchDelegate}, passing in the expanded hit rectangle and the {@link android.widget.ImageButton} child view as parameters.</li>
<li>Sets the {@link android.view.TouchDelegate} on the parent view, such that touches within the touch delegate bounds are routed to the child.</li>
</ul>
<p>In its capacity as touch delegate for the {@link android.widget.ImageButton} child view, the
parent view will receive all touch events. If the touch event occurred within the child's hit
rectangle, the parent will pass the touch
event to the child for handling.</p>
<pre>
public class MainActivity extends Activity {

    &#64;Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Get the parent view
        View parentView = findViewById(R.id.parent_layout);

        parentView.post(new Runnable() {
            // Post in the parent's message queue to make sure the parent
            // lays out its children before you call getHitRect()
            &#64;Override
            public void run() {
                // The bounds for the delegate view (an ImageButton
                // in this example)
                Rect delegateArea = new Rect();
                ImageButton myButton = (ImageButton) findViewById(R.id.button);
                myButton.setEnabled(true);

                myButton.setOnClickListener(new View.OnClickListener() {
                    &#64;Override
                    public void onClick(View view) {
                        Toast.makeText(MainActivity.this,
                                "Touch occurred within ImageButton touch region.",
                                Toast.LENGTH_SHORT).show();
                    }
                });

                // The hit rectangle for the ImageButton
                myButton.getHitRect(delegateArea);

                // Extend the touch area of the ImageButton beyond its bounds
                // on the right and bottom.
                delegateArea.right += 100;
                delegateArea.bottom += 100;

                // Instantiate a TouchDelegate.
                // "delegateArea" is the bounds in local coordinates of
                // the containing view to be mapped to the delegate view.
                // "myButton" is the child view that should receive motion
                // events.
                TouchDelegate touchDelegate = new TouchDelegate(delegateArea,
                        myButton);

                // Sets the TouchDelegate on the parent view, such that touches
                // within the touch delegate bounds are routed to the child.
                if (View.class.isInstance(myButton.getParent())) {
                    ((View) myButton.getParent()).setTouchDelegate(touchDelegate);
                }
            }
        });
    }
}</pre>
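The effect of expanding the hit rectangle can be checked with plain hit-testing. The <code>Rect</code> below is a minimal stand-in for <code>android.graphics.Rect</code>, whose <code>contains()</code> treats left/top as inclusive and right/bottom as exclusive; the coordinates and sizes are hypothetical:

```java
public class TouchDelegateDemo {
    // Minimal stand-in for android.graphics.Rect, just enough for hit-testing.
    public static class Rect {
        int left, top, right, bottom;

        public Rect(int l, int t, int r, int b) {
            left = l; top = t; right = r; bottom = b;
        }

        // Same semantics as android.graphics.Rect.contains(x, y):
        // left/top inclusive, right/bottom exclusive.
        public boolean contains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    public static void main(String[] args) {
        // The button's own 48x48 bounds, then expanded by 100px on the
        // right and bottom, as in the snippet above.
        Rect hitRect = new Rect(0, 0, 48, 48);
        Rect delegateArea = new Rect(hitRect.left, hitRect.top,
                hitRect.right + 100, hitRect.bottom + 100);

        // A touch just outside the button still reaches it via the delegate.
        System.out.println(hitRect.contains(100, 100));      // false
        System.out.println(delegateArea.contains(100, 100)); // true
    }
}
```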

View File

@@ -799,7 +799,51 @@
</li>
<!-- End best UX and UI -->
<li class="nav-section">
<div class="nav-section-header">
<a href="<?cs var:toroot ?>training/best-performance.html">
<span class="small">Best Practices for</span><br/>
User Input
</a>
</div>
<ul>
<li class="nav-section">
<div class="nav-section-header">
<a href="<?cs var:toroot ?>training/gestures/index.html"
description=
"How to write apps that allow users to interact with the touch screen via touch gestures."
>Using Touch Gestures</a>
</div>
<ul>
<li><a href="<?cs var:toroot ?>training/gestures/detector.html">
Detecting Common Gestures
</a>
</li>
<li><a href="<?cs var:toroot ?>training/gestures/movement.html">
Tracking Movement
</a>
</li>
<li><a href="<?cs var:toroot ?>training/gestures/scroll.html">
Animating a Scroll Gesture
</a>
</li>
<li><a href="<?cs var:toroot ?>training/gestures/multi.html">
Handling Multi-Touch Gestures
</a>
</li>
<li><a href="<?cs var:toroot ?>training/gestures/scale.html">
Dragging and Scaling
</a>
</li>
<li><a href="<?cs var:toroot ?>training/gestures/viewgroup.html">
Managing Touch Events in a ViewGroup
</a>
</li>
</ul>
</li>
</ul>
</li> <!-- end of User Input -->
<li class="nav-section">
<div class="nav-section-header">