Android Orientation Sensor (Direction Sensor) details and applications




I. Preface



This is my first post in the "Android Advanced" series. It has been a little more than four months since I started learning Android, and I have gained some experience and can now build a few things on my own. Thanks to my company's Android development guidance and to Hongyang for his generous help and advice. This series of posts will record the key knowledge points and techniques I run into while developing and learning; simply put, they are no longer beginner tutorials. Since my current project needs a direction (orientation) sensor, I am taking this opportunity to learn about Android sensors, which is naturally the topic of this post.



II. Sensor Basics



The official documentation clearly states that the Android platform supports three categories of sensors:

A. Motion sensors

B. Environmental sensors

C. Position sensors


From another perspective, Android sensors can be divided into hardware-based and software-based. Hardware-based sensors are implemented with physical components; they obtain data by directly measuring properties of the environment, such as acceleration due to gravity, geomagnetic field strength, or angular change. Software-based sensors do not rely on a dedicated physical device, although they mimic hardware-based sensors: they derive their data from one or more hardware sensors and are sometimes called virtual or synthetic sensors. The linear accelerometer and the gravity sensor are examples of software-based sensors. The following figure shows all the sensor types supported by the Android platform:





When using sensors, you must first understand the sensor API. In Android, a sensor is represented by the Sensor class, which belongs to the android.hardware package; as the name suggests, these are hardware-related classes. The sensor API is not complex and consists of three classes and one interface:

SensorManager

Sensor

SensorEvent

SensorEventListener


Based on the overview in the official documentation, the purpose of these four APIs is as follows (a minimal sketch showing how they fit together appears after the list):

SensorManager: Use this class to get an instance of the sensor service. It provides methods for accessing the list of sensors, registering and unregistering sensor event listeners, and acquiring orientation information.

Sensor: Represents a specific sensor instance. The methods of this class let you determine a sensor's capabilities.

SensorEvent: The system uses this class to create sensor event objects. A sensor event provides the raw sensor data, the type of sensor that generated the event, the accuracy of the data, and the timestamp of the event.

SensorEventListener: Implement this interface's two callback methods to receive sensor event notifications, for example when a sensor's values change or its accuracy changes.
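
Here is a minimal sketch of my own (not from the original post) tying the four pieces together. It assumes the code runs inside an Activity's onCreate(), with the usual android.hardware imports; the variable names are placeholders:

// A minimal sketch: SensorManager + Sensor + SensorEventListener + SensorEvent
SensorManager sensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);   // the sensor service
Sensor accelerometer =
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);  // one concrete Sensor

SensorEventListener listener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds the raw data, event.timestamp the time of the event
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // called when the reported accuracy of this sensor changes
    }
};

// SensorManager delivers SensorEvents to the registered SensorEventListener
sensorManager.registerListener(listener, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);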


After introducing the basic classification, let's look at sensor availability. Sensor support differs across Android versions: some sensors are available in older versions but were deprecated in later ones. For the details, see the official table:



Types marked with a superscript 1 were added in Android 1.5 (API level 3), but were not available for use until Android 2.3 (API level 9).

Types marked with a superscript 2 are deprecated.

Obviously, the sensor we need, the orientation sensor TYPE_ORIENTATION, is one of the deprecated types. I will explain how to replace it later.


Finally, let's take a look at the common sensor methods:

1. Instantiate SensorManager

SensorManager mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);

2. Obtain the list of all sensors supported by the device

List<Sensor> deviceSensors = mSensorManager.getSensorList(Sensor.TYPE_ALL);


The following code uses these two methods to check which sensors the phone supports and to show their names in a list. The code is very simple:

package com.example.sensordemo;

import java.util.ArrayList;
import java.util.List;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.widget.ArrayAdapter;
import android.widget.ListView;

public class MainActivity extends Activity {

    private SensorManager mSensorManager;
    private ListView sensorListView;
    private List<Sensor> sensorList;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        sensorListView = (ListView) findViewById(R.id.lv_all_sensors);
        // Instantiate the sensor manager
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        // Obtain the list of all sensors supported by the device
        sensorList = mSensorManager.getSensorList(Sensor.TYPE_ALL);
        List<String> sensorNameList = new ArrayList<String>();
        for (Sensor sensor : sensorList) {
            sensorNameList.add(sensor.getName());
        }
        ArrayAdapter<String> adapter = new ArrayAdapter<String>(this,
                android.R.layout.simple_list_item_1, sensorNameList);
        sensorListView.setAdapter(adapter);
    }
}

Finally, here is what it looks like on a real device:



With the basics of sensors covered, let's look at the orientation sensor we actually need.



III. Orientation Sensor



Android provides two sensors that let us determine the position of a device: the geomagnetic field sensor and the orientation sensor. The overview of the orientation sensor in the official documentation contains the following sentence:

The orientation sensor is software-based and derives its data from the accelerometer and the geomagnetic field sensor.

As for the specific algorithm, the Android platform takes care of the implementation, so we don't need to worry about it. What we do need to understand is an important concept: the sensor coordinate system.


In Android, the sensor framework generally uses a standard three-axis coordinate system to express its values. Taking the orientation sensor as an example, determining a direction requires three dimensions; after all, our devices are not always held flat. To be precise, Android returns a float array of length 3 containing one value per axis. The following figure shows the coordinate system used by the official sensor API:



Looking closely at this figure, it is not hard to see that the Z axis gives the azimuth (rotation around the axis pointing toward the center of the earth), the X axis gives the pitch angle (tilting forward and backward from the flat position), and the Y axis gives the roll angle (tilting left and right from the flat position). Next, let's see how to obtain the direction through the orientation sensor API. Based on the figure above, take a look at the values returned by an orientation sensor event, as given in the official documentation:


This is consistent with what was described above: a float array of length 3 represents the orientation. The first element is the azimuth (Z axis), the second is the pitch angle (X axis), and the third is the roll angle (Y axis); the short snippet below makes this mapping concrete. After that, let's see how to write the code.
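
A minimal sketch of that mapping (the variable names are mine; it assumes an event delivered to onSensorChanged() for the orientation sensor):

// Inside onSensorChanged(SensorEvent event) for a TYPE_ORIENTATION sensor
float azimuth = event.values[0]; // Z axis: heading in degrees, 0 = magnetic north
float pitch   = event.values[1]; // X axis: tilting forward and backward
float roll    = event.values[2]; // Y axis: tilting left and right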


For more details, see the official guide "Using the Orientation Sensor". First, obtain an orientation sensor instance:

mOrientation = mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);

Although this works, if you type this line into an IDE you will quickly notice that it is marked as deprecated. That doesn't matter for now; let's use it first, and the replacement will be introduced later.


Next, create a custom sensor event listener by implementing the SensorEventListener interface:

class MySensorEventListener implements SensorEventListener {

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Show the three orientation values on the screen
        float a = event.values[0]; // azimuth (Z axis)
        azimuthAngle.setText(a + "");
        float b = event.values[1]; // pitch (X axis)
        pitchAngle.setText(b + "");
        float c = event.values[2]; // roll (Y axis)
        rollAngle.setText(c + "");
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this demo
    }
}

Finally, use SensorManager to register the listener for the Sensor:

mSensorManager.registerListener(new MySensorEventListener(),mOrientation, SensorManager.SENSOR_DELAY_NORMAL);
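
One practical note the line above does not show (my addition, mirroring what the complete example at the end of this post does): sensors consume power, so it is good practice to keep a reference to the listener, register it in onResume(), and unregister it in onPause(). A minimal sketch, assuming mSensorManager, mOrientation, and the listener are fields of the Activity:

private final MySensorEventListener mListener = new MySensorEventListener();

@Override
protected void onResume() {
    super.onResume();
    // Start receiving events while the activity is in the foreground
    mSensorManager.registerListener(mListener, mOrientation, SensorManager.SENSOR_DELAY_NORMAL);
}

@Override
protected void onPause() {
    super.onPause();
    // Stop receiving events to save battery
    mSensorManager.unregisterListener(mListener);
}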


When the device's orientation changes, the listener fires and the values on the screen change. Because the emulator cannot demonstrate sensor behavior, and my real device is not rooted so I cannot use screen casting, I will simply post a representative screenshot. These values change constantly:


When the screenshot was taken, the phone was lying flat, so the last two values were close to 0, and the first value, the azimuth, represents the current heading. Now that the feature basically works, let's deal with the deprecated sensor constant. In the IDE, this line of code looks like this:

mOrientation = mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);

Since something must replace a constant once it is deprecated, we can open the source code and find the following comment:


Clearly, the official recommendation is to use the SensorManager.getOrientation() method in place of the original TYPE_ORIENTATION. Let's look at this method in the source code:

public static float[] getOrientation(float[] R, float values[]) {
    /*
     * 4x4 (length=16) case:
     *   /  R[ 0]   R[ 1]   R[ 2]   0  \
     *   |  R[ 4]   R[ 5]   R[ 6]   0  |
     *   |  R[ 8]   R[ 9]   R[10]   0  |
     *   \      0       0       0   1  /
     *
     * 3x3 (length=9) case:
     *   /  R[ 0]   R[ 1]   R[ 2]  \
     *   |  R[ 3]   R[ 4]   R[ 5]  |
     *   \  R[ 6]   R[ 7]   R[ 8]  /
     */
    if (R.length == 9) {
        values[0] = (float) Math.atan2(R[1], R[4]);
        values[1] = (float) Math.asin(-R[7]);
        values[2] = (float) Math.atan2(-R[6], R[8]);
    } else {
        values[0] = (float) Math.atan2(R[1], R[5]);
        values[1] = (float) Math.asin(-R[9]);
        values[2] = (float) Math.atan2(-R[8], R[10]);
    }
    return values;
}
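
Before moving on to the comments, one practical detail worth noting (my addition, not from the original post): the angles that getOrientation() writes into values[] are in radians, and the azimuth ends up roughly in the -180 to 180 degree range after conversion, rather than the 0 to 360 range of the old orientation sensor. A minimal sketch of the conversion, assuming the rotation matrix R has already been filled:

float[] values = new float[3];
SensorManager.getOrientation(R, values);               // fills values[] from the rotation matrix R
float azimuthDeg = (float) Math.toDegrees(values[0]);  // about -180..180, 0 = magnetic north
float pitchDeg   = (float) Math.toDegrees(values[1]);
float rollDeg    = (float) Math.toDegrees(values[2]);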

Let's look at one sentence in the comment of this method:


The first line of the comment describes what this method does: it computes the device's orientation based on a rotation matrix. The rotation matrix is something the algorithm uses internally when computing the orientation, so we don't need to dig into it. Look at the sentence I marked below: it clearly states that we usually do not need the return value of this method. The method fills values[] based on the data in the R[] parameter, and that array is exactly what we want. Since the return value isn't needed, the real question is how to obtain the two parameters, float[] R and float[] values. Continue reading the comments; the first parameter is R:


Since this method computes the orientation from a rotation matrix, the first parameter R naturally represents that rotation matrix; in practice it holds data derived from the magnetic field and acceleration readings. According to the comments, we can fill the R[] parameter with the getRotationMatrix() method, which is also a static method of SensorManager. Let's look at its source code:

public static boolean getRotationMatrix(float[] R, float[] I,
        float[] gravity, float[] geomagnetic) {
    // TODO: move this to native code for efficiency
    float Ax = gravity[0];
    float Ay = gravity[1];
    float Az = gravity[2];
    final float Ex = geomagnetic[0];
    final float Ey = geomagnetic[1];
    final float Ez = geomagnetic[2];
    float Hx = Ey*Az - Ez*Ay;
    float Hy = Ez*Ax - Ex*Az;
    float Hz = Ex*Ay - Ey*Ax;
    final float normH = (float) Math.sqrt(Hx*Hx + Hy*Hy + Hz*Hz);
    if (normH < 0.1f) {
        // device is close to free fall (or in space?), or close to
        // magnetic north pole. Typical values are  > 100.
        return false;
    }
    final float invH = 1.0f / normH;
    Hx *= invH;
    Hy *= invH;
    Hz *= invH;
    final float invA = 1.0f / (float) Math.sqrt(Ax*Ax + Ay*Ay + Az*Az);
    Ax *= invA;
    Ay *= invA;
    Az *= invA;
    final float Mx = Ay*Hz - Az*Hy;
    final float My = Az*Hx - Ax*Hz;
    final float Mz = Ax*Hy - Ay*Hx;
    if (R != null) {
        if (R.length == 9) {
            R[0] = Hx;     R[1] = Hy;     R[2] = Hz;
            R[3] = Mx;     R[4] = My;     R[5] = Mz;
            R[6] = Ax;     R[7] = Ay;     R[8] = Az;
        } else if (R.length == 16) {
            R[0]  = Hx;    R[1]  = Hy;    R[2]  = Hz;   R[3]  = 0;
            R[4]  = Mx;    R[5]  = My;    R[6]  = Mz;   R[7]  = 0;
            R[8]  = Ax;    R[9]  = Ay;    R[10] = Az;   R[11] = 0;
            R[12] = 0;     R[13] = 0;     R[14] = 0;    R[15] = 1;
        }
    }
    if (I != null) {
        // compute the inclination matrix by projecting the geomagnetic
        // vector onto the Z (gravity) and X (horizontal component
        // of geomagnetic vector) axes.
        final float invE = 1.0f / (float) Math.sqrt(Ex*Ex + Ey*Ey + Ez*Ez);
        final float c = (Ex*Mx + Ey*My + Ez*Mz) * invE;
        final float s = (Ex*Ax + Ey*Ay + Ez*Az) * invE;
        if (I.length == 9) {
            I[0] = 1;     I[1] = 0;     I[2] = 0;
            I[3] = 0;     I[4] = c;     I[5] = s;
            I[6] = 0;     I[7] =-s;     I[8] = c;
        } else if (I.length == 16) {
            I[0] = 1;     I[1] = 0;     I[2] = 0;
            I[4] = 0;     I[5] = c;     I[6] = s;
            I[8] = 0;     I[9] =-s;     I[10]= c;
            I[3] = I[7] = I[11] = I[12] = I[13] = I[14] = 0;
            I[15] = 1;
        }
    }
    return true;
}

This method takes four parameters. Looking at the block where R is filled in, it is easy to see that the rotation matrix is either 3x3 or 4x4, and that inside the if blocks the array elements are simply assigned one by one. Where do those values come from? Reading back from the R assignments to the top of the method, it becomes clear that the data ultimately comes from the last two parameters, float[] gravity and float[] geomagnetic. As usual, let's look at the comments for these two parameters:


It should now be clear that these values come from the accelerometer and the geomagnetic sensor respectively. Obviously, the data should be collected in the listener's onSensorChanged() callback, and, as mentioned above, two sensors are required to determine the direction: the accelerometer (Sensor.TYPE_ACCELEROMETER) and the geomagnetic sensor (Sensor.TYPE_MAGNETIC_FIELD).

With the last two parameters of getRotationMatrix() covered, how should we define the first two parameters, R and I? It is actually very simple. The first parameter R is the array that will later be passed to getOrientation(); its size is 9. The second parameter I is the inclination matrix, which transforms the geomagnetic vector into the same coordinate space as gravity; it is usually not needed and can simply be passed as null. At this point the basic introduction to the orientation sensor is complete. Finally, here is a complete example:

package com.example.sensordemo;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;

public class MainActivity extends Activity {

    private SensorManager mSensorManager;
    private Sensor accelerometer; // acceleration sensor
    private Sensor magnetic; // geomagnetic sensor
    private TextView azimuthAngle;
    private MySensorEventListener mSensorEventListener = new MySensorEventListener();
    private float[] accelerometerValues = new float[3];
    private float[] magneticFieldValues = new float[3];
    private static final String TAG = "---MainActivity";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // Instantiate the sensor manager
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        // Initialize the acceleration sensor
        accelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        // Initialize the geomagnetic sensor
        magnetic = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        azimuthAngle = (TextView) findViewById(R.id.azimuth_angle_value);
        calculateOrientation();
    }

    @Override
    protected void onResume() {
        // Register the listener for both sensors
        mSensorManager.registerListener(mSensorEventListener, accelerometer,
                SensorManager.SENSOR_DELAY_NORMAL);
        mSensorManager.registerListener(mSensorEventListener, magnetic,
                SensorManager.SENSOR_DELAY_NORMAL);
        super.onResume();
    }

    @Override
    protected void onPause() {
        // Unregister the listener
        mSensorManager.unregisterListener(mSensorEventListener);
        super.onPause();
    }

    // Calculate the orientation
    private void calculateOrientation() {
        float[] values = new float[3];
        float[] R = new float[9];
        SensorManager.getRotationMatrix(R, null, accelerometerValues, magneticFieldValues);
        SensorManager.getOrientation(R, values);
        values[0] = (float) Math.toDegrees(values[0]);
        Log.i(TAG, values[0] + "");
        if (values[0] >= -5 && values[0] < 5) {
            azimuthAngle.setText("North");
        } else if (values[0] >= 5 && values[0] < 85) {
            azimuthAngle.setText("Northeast");
        } else if (values[0] >= 85 && values[0] <= 95) {
            azimuthAngle.setText("East");
        } else if (values[0] > 95 && values[0] < 175) {
            azimuthAngle.setText("Southeast");
        } else if ((values[0] >= 175 && values[0] <= 180)
                || (values[0] >= -180 && values[0] < -175)) {
            azimuthAngle.setText("South");
        } else if (values[0] >= -175 && values[0] < -95) {
            azimuthAngle.setText("Southwest");
        } else if (values[0] >= -95 && values[0] < -85) {
            azimuthAngle.setText("West");
        } else if (values[0] >= -85 && values[0] < -5) {
            azimuthAngle.setText("Northwest");
        }
    }

    class MySensorEventListener implements SensorEventListener {

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Cache the latest readings from each sensor, then recompute the orientation
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                accelerometerValues = event.values;
            }
            if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                magneticFieldValues = event.values;
            }
            calculateOrientation();
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Not needed for this demo
        }
    }
}
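
One last remark about the I parameter that calculateOrientation() above passes as null (my addition, a sketch rather than part of the original code): if you do supply an inclination matrix, SensorManager.getInclination() can turn it into the geomagnetic inclination angle:

// Variation on calculateOrientation(): also request the inclination matrix I
float[] R = new float[9];
float[] I = new float[9];
float[] values = new float[3];
if (SensorManager.getRotationMatrix(R, I, accelerometerValues, magneticFieldValues)) {
    SensorManager.getOrientation(R, values);
    // Geomagnetic inclination in degrees, derived from the inclination matrix
    float inclinationDeg = (float) Math.toDegrees(SensorManager.getInclination(I));
}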



IV. Summary



Studying Android sensors for the first time is still somewhat difficult. Let's keep at it; there is a lot more to learn. The next step is to integrate a compass into the map project ~

