From b4839dee245931effd9b67cf0f34d5b0bca50613 Mon Sep 17 00:00:00 2001
From: Kevin Hufnagle
Date: Mon, 25 Apr 2016 13:33:53 -0700
Subject: [PATCH] docs: Updated descriptions of device orientation angles.

Given recent reversal of "roll" definition (positive angles now represent
counter-clockwise rotation), updated the description for this and the other
orientation angles.

Also updated explanatory text and code samples within the "Position Sensors"
page to reflect the recent deprecation of STRING_TYPE_ORIENTATION.

Bug: 23822069
Change-Id: I083a55011ea41c4a6533b78ee38a32479310f4cf
---
 core/java/android/hardware/SensorManager.java      |  37 ++-
 .../guide/topics/sensors/sensors_position.jd       | 259 ++++++++++++------
 2 files changed, 196 insertions(+), 100 deletions(-)

diff --git a/core/java/android/hardware/SensorManager.java b/core/java/android/hardware/SensorManager.java
index 5d405f92e380b..eff7a9833f107 100644
--- a/core/java/android/hardware/SensorManager.java
+++ b/core/java/android/hardware/SensorManager.java
@@ -1227,20 +1227,35 @@ public abstract class SensorManager {
     /**
      * Computes the device's orientation based on the rotation matrix.
      *

-     * When it returns, the array values is filled with the result:
+     * When it returns, the array values are as follows:
      *

*

-     * Applying these three intrinsic rotations in azimuth, pitch and roll order transforms
-     * identity matrix to the rotation matrix given in input R.
-     * All three angles above are in radians and positive in the
-     * counter-clockwise direction. Range of output is: azimuth from -π to π,
-     * pitch from -π/2 to π/2 and roll from -π to π.
+     * Applying these three rotations in the azimuth, pitch, roll order
+     * transforms an identity matrix to the rotation matrix passed into this
+     * method. Also, note that all three orientation angles are expressed in
+     * radians.
      *
      * @param R
      *        rotation matrix see {@link #getRotationMatrix}.

diff --git a/docs/html/guide/topics/sensors/sensors_position.jd b/docs/html/guide/topics/sensors/sensors_position.jd
index d0ddeadf21661..5ec16c71537c5 100644
--- a/docs/html/guide/topics/sensors/sensors_position.jd
+++ b/docs/html/guide/topics/sensors/sensors_position.jd
@@ -8,7 +8,7 @@ page.tags=sensorevent,orientation,proximity

  1. Using the Game Rotation Vector Sensor
  2. Using the Geomagnetic Rotation Vector Sensor
- 3. Using the Orientation Sensor
+ 3. Computing the Device's Orientation
  4. Using the Geomagnetic Field Sensor
  5. Using the Proximity Sensor
@@ -42,38 +42,49 @@ href="{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/os/Senso

-The Android platform provides two sensors that let you determine the position of a device: the
-geomagnetic field sensor and the orientation sensor. The Android platform also
-provides a sensor that lets you determine how close the face of a device is to an object (known as
-the proximity sensor). The geomagnetic field sensor and the proximity sensor are hardware-based.
-Most
-handset and tablet manufacturers include a geomagnetic field sensor. Likewise, handset manufacturers
-usually include a proximity sensor to determine when a handset is being held close to a user's face
-(for example, during a phone call). The orientation sensor is software-based and derives its data
-from the accelerometer and the geomagnetic field sensor.

+
+  The Android platform provides two sensors that let you determine the position
+  of a device: the geomagnetic field sensor and the accelerometer. The Android
+  platform also provides a sensor that lets you determine how close the face of
+  a device is to an object (known as the proximity sensor). The
+  geomagnetic field sensor and the proximity sensor are hardware-based. Most
+  handset and tablet manufacturers include a geomagnetic field sensor. Likewise,
+  handset manufacturers usually include a proximity sensor to determine when a
+  handset is being held close to a user's face (for example, during a phone
+  call). For determining a device's orientation, you can use the readings from
+  the device's accelerometer and the geomagnetic field sensor.
+

-

-Note: The orientation sensor was deprecated in Android 2.2 (API
-Level 8).

+

+  Note: The orientation sensor was deprecated in Android 2.2
+  (API level 8), and the orientation sensor type was deprecated in Android 4.4W
+  (API level 20).
+

-

-Position sensors are useful for determining a device's physical position in the
-world's frame of reference. For example, you can use the geomagnetic field sensor in
-combination with the accelerometer to determine a device's position relative to
-the magnetic North Pole. You can also use the orientation sensor (or similar sensor-based
-orientation methods) to determine a device's position in your application's frame of reference.
-Position sensors are not typically used to monitor device movement or motion, such as shake, tilt,
-or thrust (for more information, see Motion Sensors).

+

+  Position sensors are useful for determining a device's physical position in
+  the world's frame of reference. For example, you can use the geomagnetic field
+  sensor in combination with the accelerometer to determine a device's position
+  relative to the magnetic north pole. You can also use these sensors to
+  determine a device's orientation in your application's frame of reference.
+  Position sensors are not typically used to monitor device movement or motion,
+  such as shake, tilt, or thrust (for more information, see Motion Sensors).
+

-

-The geomagnetic field sensor and orientation sensor return multi-dimensional arrays of sensor
-values
-for each {@link android.hardware.SensorEvent}. For example, the orientation sensor provides
-geomagnetic
-field strength values for each of the three coordinate axes during a single sensor event. Likewise,
-the orientation sensor provides azimuth (yaw), pitch, and roll values during a single sensor event.
-For more information about the coordinate systems that are used by sensors, see Sensor Coordinate
-Systems. The proximity sensor provides a single value for each sensor event. Table 1 summarizes
-the position sensors that are supported on the Android platform.

+

+  The geomagnetic field sensor and accelerometer return multi-dimensional arrays
+  of sensor values for each {@link android.hardware.SensorEvent}. For example,
+  the geomagnetic field sensor provides geomagnetic field strength values for
+  each of the three coordinate axes during a single sensor event. Likewise, the
+  accelerometer sensor measures the acceleration applied to the device during a
+  sensor event. For more information about the coordinate systems that are used
+  by sensors, see
+  Sensor Coordinate Systems. The proximity sensor provides a single value
+  for each sensor event. Table 1 summarizes the position sensors that are
+  supported on the Android platform.
+

Table 1. Position sensors that are supported on the Android platform.

@@ -174,14 +185,17 @@ the position sensors that are supported on the Android platform.

-

-1 This sensor was deprecated in Android 2.2 (API Level
-8). The sensor framework provides alternate methods for acquiring device orientation, which are
-discussed in Using the Orientation Sensor.

+

+  1 This sensor was deprecated in Android 2.2 (API
+  level 8), and this sensor type was deprecated in Android 4.4W (API level 20).
+  The sensor framework provides alternate methods for acquiring device
+  orientation, which are discussed in Computing
+  the Device's Orientation.
+

2 Some proximity sensors provide only binary values representing near and far.

-

Using the Game Rotation Vector Sensor

The game rotation vector sensor is identical to the
@@ -228,71 +242,106 @@ mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_GEOMAGNETIC_ROTATION_VECTO
-

Using the Orientation Sensor

- -

-The orientation sensor lets you monitor the position of a device relative to the earth's frame of
-reference (specifically, magnetic north). The following code shows you how to get an instance of the
-default orientation sensor:

+

Computing the Device's Orientation

+

+  By computing a device's orientation, you can monitor the position of the
+  device relative to the earth's frame of reference (specifically, the magnetic
+  north pole). The following code shows you how to compute a device's
+  orientation:
+

 private SensorManager mSensorManager;
-private Sensor mSensor;
 ...
-mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
-mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);
+// Rotation matrix based on current readings from accelerometer and magnetometer.
+final float[] rotationMatrix = new float[9];
+SensorManager.getRotationMatrix(rotationMatrix, null,
+  accelerometerReading, magnetometerReading);
+
+// Express the updated rotation matrix as three orientation angles.
+final float[] orientationAngles = new float[3];
+SensorManager.getOrientation(rotationMatrix, orientationAngles);
 
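As an aside to the snippet above (not part of the patch): `getOrientation` reports angles in radians, with the azimuth in the range -π to π. An app that wants a 0–360° compass heading has to convert. A hedged, plain-Java sketch, using a hypothetical `CompassHeading` helper:

```java
// Hypothetical helper (illustration only): convert an azimuth in radians,
// as produced by SensorManager.getOrientation (range -PI..PI), into a
// compass heading in degrees (range 0..360).
public class CompassHeading {
    static float toCompassDegrees(float azimuthRadians) {
        float degrees = (float) Math.toDegrees(azimuthRadians);
        // Shift the -180..180 range into 0..360.
        return (degrees + 360f) % 360f;
    }

    public static void main(String[] args) {
        // Due west: an azimuth of -PI/2 radians maps to a 270-degree heading.
        System.out.println(toCompassDegrees((float) (-Math.PI / 2)));
    }
}
```

The modulo shift is the standard way to fold a signed angle into a compass range without branching on the sign.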
- -

-The orientation sensor derives its data by using a device's geomagnetic field sensor in
-combination with a device's accelerometer. Using these two hardware sensors, an orientation sensor
-provides data for the following three dimensions:

- +

+  The system computes the orientation angles by using a device's geomagnetic
+  field sensor in combination with the device's accelerometer. Using these two
+  hardware sensors, the system provides data for the following three
+  orientation angles:
+

-

-This definition is different from yaw, pitch, and roll used in aviation, where the X axis is
-along the long side of the plane (tail to nose). Also, for historical reasons the roll angle is
-positive in the clockwise direction (mathematically speaking, it should be positive in the
-counter-clockwise direction).

+

+  Note: The sensor's roll definition has changed to reflect the
+  vast majority of implementations in the geosensor ecosystem.
+

-

-The orientation sensor derives its data by processing the raw sensor data from the accelerometer
-and the geomagnetic field sensor. Because of the heavy processing that is involved, the accuracy and
-precision of the orientation sensor is diminished (specifically, this sensor is only reliable when
-the roll component is 0). As a result, the orientation sensor was deprecated in Android 2.2 (API
-level 8). Instead of using raw data from the orientation sensor, we recommend that you use the
-{@link android.hardware.SensorManager#getRotationMatrix getRotationMatrix()} method in conjunction
-with the {@link android.hardware#getOrientation getOrientation()} method to compute orientation
-values. You can also use the {@link android.hardware.SensorManager#remapCoordinateSystem
-remapCoordinateSystem()} method to translate the orientation values to your application's frame of
-reference.

+

+  Note that these angles work off of a different coordinate system than the
+  one used in aviation (for yaw, pitch, and roll). In the aviation system, the
+  x axis is along the long side of the plane, from tail to nose.
+

-

-The following code sample shows how to acquire orientation data directly from the orientation
-sensor. We recommend that you do this only if a device has negligible roll.

+

+  The orientation sensor derives its data by processing the raw sensor data
+  from the accelerometer and the geomagnetic field sensor. Because of the heavy
+  processing that is involved, the accuracy and precision of the orientation
+  sensor is diminished. Specifically, this sensor is reliable only when the roll
+  angle is 0. As a result, the orientation sensor was deprecated in Android
+  2.2 (API level 8), and the orientation sensor type was deprecated in Android
+  4.4W (API level 20).
+  Instead of using raw data from the orientation sensor, we recommend that you
+  use the {@link android.hardware.SensorManager#getRotationMatrix getRotationMatrix()}
+  method in conjunction with the
+  {@link android.hardware.SensorManager#getOrientation getOrientation()} method
+  to compute orientation values, as shown in the following code sample. As part
+  of this process, you can use the
+  {@link android.hardware.SensorManager#remapCoordinateSystem remapCoordinateSystem()}
+  method to translate the orientation values to your application's frame of
+  reference.
+

 public class SensorActivity extends Activity implements SensorEventListener {
 
   private SensorManager mSensorManager;
-  private Sensor mOrientation;
+  private final float[] mAccelerometerReading = new float[3];
+  private final float[] mMagnetometerReading = new float[3];
+
+  private final float[] mRotationMatrix = new float[9];
+  private final float[] mOrientationAngles = new float[3];
 
   @Override
   public void onCreate(Bundle savedInstanceState) {
     super.onCreate(savedInstanceState);
     setContentView(R.layout.main);
-
     mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
-    mOrientation = mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);
   }
 
   @Override
@@ -304,31 +353,63 @@ public class SensorActivity extends Activity implements SensorEventListener {
   @Override
   protected void onResume() {
     super.onResume();
-    mSensorManager.registerListener(this, mOrientation, SensorManager.SENSOR_DELAY_NORMAL);
+
+    // Get updates from the accelerometer and magnetometer at a constant rate.
+    // To make batch operations more efficient and reduce power consumption,
+    // provide support for delaying updates to the application.
+    //
+    // In this example, the sensor reporting delay is small enough such that
+    // the application receives an update before the system checks the sensor
+    // readings again.
+    mSensorManager.registerListener(this,
+      mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
+      SensorManager.SENSOR_DELAY_NORMAL, SensorManager.SENSOR_DELAY_UI);
+    mSensorManager.registerListener(this,
+      mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
+      SensorManager.SENSOR_DELAY_NORMAL, SensorManager.SENSOR_DELAY_UI);
   }
 
   @Override
   protected void onPause() {
     super.onPause();
+
+    // Don't receive any more updates from either sensor.
     mSensorManager.unregisterListener(this);
   }
 
+  // Get readings from accelerometer and magnetometer. To simplify calculations,
+  // consider storing these readings as unit vectors.
   @Override
   public void onSensorChanged(SensorEvent event) {
-    float azimuth_angle = event.values[0];
-    float pitch_angle = event.values[1];
-    float roll_angle = event.values[2];
-    // Do something with these orientation angles.
+    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
+      System.arraycopy(event.values, 0, mAccelerometerReading,
+        0, mAccelerometerReading.length);
+    }
+    else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
+      System.arraycopy(event.values, 0, mMagnetometerReading,
+        0, mMagnetometerReading.length);
+    }
+  }
+
+  // Compute the three orientation angles based on the most recent readings from
+  // the device's accelerometer and magnetometer.
+  public void updateOrientationAngles() {
+    // Update rotation matrix, which is needed to update orientation angles.
+    SensorManager.getRotationMatrix(mRotationMatrix, null,
+      mAccelerometerReading, mMagnetometerReading);
+
+    // "mRotationMatrix" now has up-to-date information.
+
+    SensorManager.getOrientation(mRotationMatrix, mOrientationAngles);
+
+    // "mOrientationAngles" now has up-to-date information.
   }
 }
 
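For intuition about what `getOrientation` extracts from the rotation matrix, here is a hedged plain-Java sketch (a hypothetical `OrientationMath` class, not the platform source; the platform documentation is authoritative for the exact sign conventions, including the roll-sign change this patch describes): for a 3x3 row-major rotation matrix, the azimuth comes from the arctangent of two matrix elements, the pitch from an arcsine, and the roll from another arctangent.

```java
// Hypothetical illustration (not the platform source): the kind of math
// SensorManager.getOrientation applies to a 3x3 row-major rotation matrix.
// Sign conventions are simplified here; rely on the platform docs for the
// authoritative angle definitions and ranges.
public class OrientationMath {
    static float[] anglesFrom(float[] r) {
        float azimuth = (float) Math.atan2(r[1], r[4]);
        float pitch   = (float) Math.asin(-r[7]);
        float roll    = (float) Math.atan2(-r[6], r[8]);
        return new float[] { azimuth, pitch, roll };
    }

    public static void main(String[] args) {
        // With an identity rotation matrix (device flat, facing magnetic
        // north), all three angles come out as zero.
        float[] identity = { 1, 0, 0, 0, 1, 0, 0, 0, 1 };
        float[] angles = anglesFrom(identity);
        System.out.println(angles[0] + " " + angles[1] + " " + angles[2]);
    }
}
```

The identity-matrix case is a useful sanity check when debugging orientation code: a device lying flat and pointing at magnetic north should report angles near (0, 0, 0).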
-

-You do not usually need to perform any data processing or filtering of the raw data that you
-obtain from an orientation sensor, other than translating the sensor's coordinate system to your
-application's frame of reference. The Accelerometer Play sample shows
-you how to translate acceleration sensor data into another frame of reference; the technique is
-similar to the one you might use with the orientation sensor.

+

+  You don't usually need to perform any data processing or filtering of the
+  device's raw orientation angles other than translating the sensor's
+  coordinate system to your application's frame of reference.
+

Using the Geomagnetic Field Sensor