Good evening! Today I will continue with the second important sensor. Getting the device's orientation used to be very easy: the TYPE_ORIENTATION sensor constant let you read the heading directly. However, the latest SDK adds the note "TYPE_ORIENTATION: this constant is deprecated. Use SensorManager.getOrientation() instead." In other words, the shortcut has been retired, and developers are expected to compute the orientation from raw data via SensorManager.getOrientation().
In fact, Android derives the orientation from the magnetic field sensor and the accelerometer together; the algorithm is already encapsulated in the SDK. So there are two ways to obtain the user's heading. The first, officially recommended way goes through SensorManager.getOrientation(). It looks easy until you meet its parameters (more on that below), and it needs both sensors working together to stay accurate. The second way is very simple: just as with reading acceleration in the previous article, you read the three axis values from the orientation sensor directly.
It is the first method that deserves the detailed introduction, because it is the one Android will keep going forward, while the second will eventually become history.
The orientation data Android provides is a float array holding values for three axes.
When the phone lies flat and still, this is taken as the rest position, i.e. the X and Y angles are 0.
values[0] is the angle around the Z axis: the azimuth, the compass heading we usually care about. In my experiments I found something interesting: with the first method (magnetic field + accelerometer) the range is -180 to 180, where 0 is north, 90 is east, 180/-180 is south, and -90 is west. With the second method (the orientation sensor read directly) the range is 0 to 360, where 360/0 is north, 90 is east, 180 is south, and 270 is west.
values[1] is the angle around the X axis: the pitch, i.e. tilting the phone forward and backward from the rest position.
values[2] is the angle around the Y axis: the roll, i.e. tilting the phone left and right from the rest position.
This is why a unified way of obtaining the orientation matters: an algorithm written to process data in the first convention will not port cleanly to code using the second.
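The two azimuth conventions differ only by a 360° wrap-around, so converting between them is a one-liner. A minimal plain-Java sketch (the class and method names are mine, not from the SDK):

```java
public class AzimuthConvert {
    // Map an azimuth in the (-180, 180] convention (getOrientation, in degrees)
    // to the [0, 360) compass convention used by the old orientation sensor.
    static float toCompass(float deg) {
        return (deg % 360f + 360f) % 360f;
    }

    public static void main(String[] args) {
        System.out.println(toCompass(-90f)); // west
        System.out.println(toCompass(90f));  // east
    }
}
```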
Here is the method's documentation.
public static float[] getOrientation(float[] R, float[] values)    Since: API level 3
Computes the device's orientation based on the rotation matrix.
When it returns, the array values is filled with the result:
- values[0]: azimuth, rotation around the Z axis.
- values[1]: pitch, rotation around the X axis.
- values[2]: roll, rotation around the Y axis.
The reference coordinate-system used is different from the world coordinate-system defined for the rotation matrix:
- X is defined as the vector product Y × Z (it is tangential to the ground at the device's current location and roughly points west).
- Y is tangential to the ground at the device's current location and points towards the magnetic north pole.
- Z points towards the center of the earth and is perpendicular to the ground.
All three angles above are in radians and positive in the counter-clockwise direction.
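Mind the unit: the angles come back in radians, so they must be converted before comparing against compass degrees. A tiny sketch:

```java
public class RadiansDemo {
    public static void main(String[] args) {
        // getOrientation() reports all three angles in radians;
        // Math.toDegrees() converts them before any compass comparison.
        double azimuthRad = Math.PI / 2; // a quarter turn
        System.out.println(Math.toDegrees(azimuthRad)); // roughly 90
    }
}
```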
Generally we do not need the return value of this function. The method fills values[] based on the data in the R[] parameter, and that filled array is what we want.
So what is R, and how do we obtain it?
R[] is a rotation matrix built from the magnetic field and acceleration data; you can think of it as the not-yet-interpreted raw result.
R is filled by the static method below:
public static boolean getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic)
The parameters, in order: the first is the R array to be filled, of size 9.
The second is an inclination matrix that transforms the geomagnetic data into the gravity coordinate frame; it can usually be passed as null.
The third is an array of 3: the accelerometer values received in onSensorChanged().
The fourth is an array of 3: the magnetic sensor values received in onSensorChanged().
That is the basic logic. Below is a simple test example that monitors the user's heading continuously.
/*
 * @author octobershiner
 * 2011 07 28
 * SE.HIT
 * An example of obtaining direction data through the magnetic field and acceleration sensors
 */
package uni.sensor;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class OrientationActivity extends Activity {

    private SensorManager sm;
    // Two sensors are required
    private Sensor aSensor;
    private Sensor mSensor;

    float[] accelerometerValues = new float[3];
    float[] magneticFieldValues = new float[3];

    private static final String TAG = "sensor";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        aSensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        mSensor = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);

        sm.registerListener(myListener, aSensor, SensorManager.SENSOR_DELAY_NORMAL);
        sm.registerListener(myListener, mSensor, SensorManager.SENSOR_DELAY_NORMAL);
        // Compute and log the current heading once
        calculateOrientation();
    }

    // Note: release the listeners when the activity is paused
    @Override
    public void onPause() {
        sm.unregisterListener(myListener);
        super.onPause();
    }

    final SensorEventListener myListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent sensorEvent) {
            if (sensorEvent.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                magneticFieldValues = sensorEvent.values;
            if (sensorEvent.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                accelerometerValues = sensorEvent.values;
            calculateOrientation();
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) {}
    };

    private void calculateOrientation() {
        float[] values = new float[3];
        float[] R = new float[9];
        SensorManager.getRotationMatrix(R, null, accelerometerValues, magneticFieldValues);
        SensorManager.getOrientation(R, values);

        // getOrientation() returns radians; convert to degrees
        values[0] = (float) Math.toDegrees(values[0]);
        Log.i(TAG, values[0] + "");
        // values[1] = (float) Math.toDegrees(values[1]);
        // values[2] = (float) Math.toDegrees(values[2]);

        if (values[0] >= -5 && values[0] < 5) {
            Log.i(TAG, "North");
        } else if (values[0] >= 5 && values[0] < 85) {
            Log.i(TAG, "Northeast");
        } else if (values[0] >= 85 && values[0] <= 95) {
            Log.i(TAG, "East");
        } else if (values[0] >= 95 && values[0] < 175) {
            Log.i(TAG, "Southeast");
        } else if ((values[0] >= 175 && values[0] <= 180) || (values[0] >= -180 && values[0] < -175)) {
            Log.i(TAG, "South");
        } else if (values[0] >= -175 && values[0] < -95) {
            Log.i(TAG, "Southwest");
        } else if (values[0] >= -95 && values[0] < -85) {
            Log.i(TAG, "West");
        } else if (values[0] >= -85 && values[0] < -5) {
            Log.i(TAG, "Northwest");
        }
    }
}
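The compass-point bucketing inside calculateOrientation() is plain arithmetic, so it can be checked off-device. A minimal extraction (the class name is mine), keeping the same boundaries as the listing above:

```java
public class CompassBucket {
    // Mirrors the if/else chain in calculateOrientation(): maps an azimuth
    // in degrees, range (-180, 180], to one of eight compass points.
    static String bucket(float a) {
        if (a >= -5 && a < 5) return "North";
        if (a >= 5 && a < 85) return "Northeast";
        if (a >= 85 && a <= 95) return "East";
        if (a >= 95 && a < 175) return "Southeast";
        if ((a >= 175 && a <= 180) || (a >= -180 && a < -175)) return "South";
        if (a >= -175 && a < -95) return "Southwest";
        if (a >= -95 && a < -85) return "West";
        return "Northwest"; // remaining range: [-85, -5)
    }

    public static void main(String[] args) {
        System.out.println(bucket(0f) + " " + bucket(90f) + " "
                + bucket(180f) + " " + bucket(-90f));
    }
}
```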
The practical training takes a lot of time, and I feel tired squeezing out the hours to write and summarize, but I have gained a lot. If I have time I will share the second method with you; it is much easier than this one. In fact, you can fully reuse the code from the previous article: http://blog.csdn.net/octobershiner/article/details/6639040
You only need to change the sensor setup: replace Sensor.TYPE_ACCELEROMETER with Sensor.TYPE_ORIENTATION and register that single sensor. But the method shared today is the better one; it should be the Android standard going forward.
That is all on sensors for now. Next let's look at processes and threads. There is also a very important class in the hardware package, Camera. I believe you have heard of Android barcode scanners; very powerful. I will share it with you later.
The next topics should be threads, activities, and geocoding.
I have no mentor either; I dug all of this out of the SDK myself, which is tiring. Please advise.