Android Notes 7: Android Sensor Introduction (3): Obtaining the Device's Orientation and the Principle of the Compass


 

Good evening ~ Today I will continue with the second important sensor. Getting the direction is actually very easy. In the first article we saw the TYPE_ORIENTATION constant, which suggests we can read the device's orientation directly. However, the latest SDK documentation says: "TYPE_ORIENTATION: This constant is deprecated. Use SensorManager.getOrientation() instead." In other words, this sensor type has been retired, and developers should use SensorManager.getOrientation() to compute the orientation from the raw data.

In fact, Android derives the orientation from the magnetic field sensor and the acceleration sensor; the algorithm is already encapsulated in the SDK. So there are two ways to obtain the user's direction. The first, officially recommended one, uses SensorManager.getOrientation() (it only looks involved because of its parameters, which I will explain later), but it requires the two sensors working together to ensure accuracy. The second is very simple: just like reading acceleration in the previous article, you register an orientation sensor and read the data of its three axes directly.

Still, the first method is the one worth introducing in detail, because it is the option Android will keep going forward, while the second will eventually become history.

 

The orientation data provided by Android is a float array containing values for three axes.

 

 

When your phone lies flat and still, that is the default (static) position: the X and Y angles are 0.

 

values[0] is the angle around the Z axis: the azimuth, which is the value we usually look at to tell which way we are facing. After experimenting I found something interesting: with the first method (magnetic field + accelerometer) the range is -180 to 180, where 0 means north, 90 means east, 180/-180 means south, and -90 means west; with the second method (reading the orientation sensor directly) the range is 0 to 360, where 360/0 means north, 90 means east, 180 means south, and 270 means west.

values[1] is the angle around the X axis: the pitch, measured from the static position as the device tips forward and backward.

values[2] is the angle around the Y axis: the roll, measured from the static position as the device tilts left and right.
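To illustrate the range difference, here is a minimal sketch (the helper name is my own, not part of the SDK) that maps an azimuth reported in the first method's -180 to 180 degree range onto the 0 to 360 compass range used by the orientation sensor:

// Hypothetical helper, not from the SDK: normalize an azimuth in degrees
// from the (-180, 180] range of getOrientation() to the [0, 360) compass range.
static float toCompassDegrees(float azimuthDegrees) {
    return (azimuthDegrees + 360f) % 360f; // e.g. -90 (west) becomes 270
}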

This shows why a unified way of obtaining the direction matters: code written to process azimuths in the first method's range will not port cleanly if it is later fed data from the second method.

See the following method, quoted from the SDK documentation.


public static float[] getOrientation(float[] R, float[] values)

Since: API Level 3

Computes the device's orientation based on the rotation matrix.

When it returns, the array values is filled with the result:

values[0]: azimuth, rotation around the Z axis.

values[1]: pitch, rotation around the X axis.

values[2]: roll, rotation around the Y axis.

The reference coordinate-system used is different from the world coordinate-system defined for the rotation matrix:

X is defined as the vector product Y.Z (it is tangential to the ground at the device's current location and roughly points West).

Y is tangential to the ground at the device's current location and points towards the magnetic North Pole.

Z points towards the center of the Earth and is perpendicular to the ground.

All three angles above are in radians and positive in the counter-clockwise direction.

Generally we do not need the return value of this function. The method fills values[] based on the data in the R[] parameter, and values[] is what we actually want.

So what is R, and how do we obtain it?

R[] is a rotation matrix derived from the magnetic field and accelerometer data; you can loosely think of it as the data before getOrientation() has processed it.

R is obtained from the static method below, which fills R[] for you:

public static boolean getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic)

 

The parameters are as follows. The first is the R array to be filled; its length is 9.

The second is the inclination matrix I, which relates the geomagnetic vector to the gravity coordinate space; it can normally be left as null.

The third is an array of length 3: the accelerometer data obtained in onSensorChanged().

The fourth is an array of length 3: the magnetic field data obtained in onSensorChanged().
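Putting the two calls together, the basic sequence looks like this (a minimal sketch; the array names are placeholders, and the full activity below shows the same calls in context):

float[] R = new float[9];       // rotation matrix, filled by getRotationMatrix()
float[] values = new float[3];  // azimuth, pitch, roll, filled by getOrientation()
// accelerometerValues and magneticFieldValues hold the latest readings from onSensorChanged()
SensorManager.getRotationMatrix(R, null, accelerometerValues, magneticFieldValues);
SensorManager.getOrientation(R, values);  // results are in radians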

 

Well, that is the basic logic. Below is a simple orientation test example that monitors the user's heading continuously.

 

/*
 * @author octobershiner
 * 2011 07 28
 * SE.HIT
 * An example of obtaining orientation data through the magnetic field and acceleration sensors
 */
package uni.sensor;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class OrientationActivity extends Activity {

    private SensorManager sm;
    // Two sensors are required
    private Sensor aSensor;
    private Sensor mSensor;

    float[] accelerometerValues = new float[3];
    float[] magneticFieldValues = new float[3];

    private static final String TAG = "sensor";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        aSensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        mSensor = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);

        sm.registerListener(myListener, aSensor, SensorManager.SENSOR_DELAY_NORMAL);
        sm.registerListener(myListener, mSensor, SensorManager.SENSOR_DELAY_NORMAL);
        // Update the displayed data once at startup
        calculateOrientation();
    }

    // Re-emphasize: remember to unregister the listener when the activity is paused
    @Override
    protected void onPause() {
        sm.unregisterListener(myListener);
        super.onPause();
    }

    final SensorEventListener myListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent sensorEvent) {
            if (sensorEvent.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                magneticFieldValues = sensorEvent.values;
            if (sensorEvent.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                accelerometerValues = sensorEvent.values;
            calculateOrientation();
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) {}
    };

    private void calculateOrientation() {
        float[] values = new float[3];
        float[] R = new float[9];
        SensorManager.getRotationMatrix(R, null, accelerometerValues, magneticFieldValues);
        SensorManager.getOrientation(R, values);

        // The results are in radians and must be converted to degrees
        values[0] = (float) Math.toDegrees(values[0]);
        Log.i(TAG, values[0] + "");
        // values[1] = (float) Math.toDegrees(values[1]);
        // values[2] = (float) Math.toDegrees(values[2]);

        if (values[0] >= -5 && values[0] < 5) {
            Log.i(TAG, "North");
        } else if (values[0] >= 5 && values[0] < 85) {
            Log.i(TAG, "Northeast");
        } else if (values[0] >= 85 && values[0] <= 95) {
            Log.i(TAG, "East");
        } else if (values[0] >= 95 && values[0] < 175) {
            Log.i(TAG, "Southeast");
        } else if ((values[0] >= 175 && values[0] <= 180) || (values[0] >= -180 && values[0] < -175)) {
            Log.i(TAG, "South");
        } else if (values[0] >= -175 && values[0] < -95) {
            Log.i(TAG, "Southwest");
        } else if (values[0] >= -95 && values[0] < -85) {
            Log.i(TAG, "West");
        } else if (values[0] >= -85 && values[0] < -5) {
            Log.i(TAG, "Northwest");
        }
    }
}

 

The practical training is very time-consuming, and finding the time to write and summarize on top of it is tiring, but I feel I gain a lot from it. If I have time I will also share the second method, which is much simpler than this one; in fact you can reuse the code from the previous article entirely: http://www.bkjia.com/kf/201111/110232.html

You do not even need two sensors: just change Sensor.TYPE_ACCELEROMETER to Sensor.TYPE_ORIENTATION. Still, the method shared today is the better one, and it should be the Android standard going forward.
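For reference, here is a minimal sketch of that second approach, assuming the deprecated Sensor.TYPE_ORIENTATION (the variable names are mine, not taken from the earlier article); it would sit inside an Activity, for example in onCreate():

// Deprecated approach: a single orientation sensor reports azimuth, pitch and roll
// directly, already in degrees (azimuth 0-360 as described above).
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor oSensor = sm.getDefaultSensor(Sensor.TYPE_ORIENTATION);
sm.registerListener(new SensorEventListener() {
    public void onSensorChanged(SensorEvent event) {
        Log.i("sensor", "azimuth: " + event.values[0]); // 0 = north, 90 = east, 180 = south, 270 = west
    }
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}, oSensor, SensorManager.SENSOR_DELAY_NORMAL);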

 

 

That is all on sensors for now. Next let's look at processes and threads. There is also a very important class in the hardware package, Camera; you have probably heard of Android scanner apps, which are very powerful. I will share that later.

The next topics planned are threads, Activity, and geocoding.

I don't have a mentor either; I wear myself out digging through the SDK on my own ~ advice is welcome.

 



From: Octobershiner's column
