Android: using sensors to get the user's direction of movement (compass principle)


This article illustrates how Android uses sensors to get the direction of the user's movement. It is shared for your reference; the details are as follows:

Today I continue with a second important sensor. Getting the direction should, in principle, be very simple: in the previous article we saw that there is a TYPE_ORIENTATION constant that directly gives the device's orientation. However, the latest version of the SDK adds this note: "TYPE_ORIENTATION: This constant is deprecated. Use SensorManager.getOrientation() instead." In other words, that sensor type has been deprecated, and developers should use SensorManager.getOrientation() to obtain the orientation data.

In fact, Android derives the orientation from the magnetic field sensor and the accelerometer together; the specific algorithm is already encapsulated in the SDK. That means there are now two ways to get the user's direction. The first is the officially recommended one, via SensorManager.getOrientation(). It looks easy on the surface (only because you haven't seen its parameters yet; more on that later), but it actually requires two sensors working together, and it is more accurate. The second method is very simple: just like reading the acceleration in the previous article, you read the three axis values directly from the orientation sensor.

Well, let's start with the harder method, because the first approach is, after all, the one Android will keep in the future; who knows when the second will become history.

The direction that Android gives us is a float array with values for three axes, as described below.

When your phone lies flat and horizontal, that is the default static state, i.e. the X and Y angles are 0.

values[0] is the angle around the Z axis: the azimuth (direction angle). This is the value we normally use to tell east from west. After experimenting, I found something interesting: with the first method (magnetic field + accelerometer), the data range is -180 to 180, i.e. 0 means north, 90 means east, 180/-180 means south, and -90 means west. With the second method (reading the orientation sensor directly), the range is 0 to 360: 360/0 means north, 90 means east, 180 means south, and 270 means west.
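If you need to compare the two conventions, you can map the first method's -180..180 reading onto the 0..360 range of the second. A minimal sketch, assuming the azimuth has already been converted to degrees (the helper name normalizeAzimuth is mine, not from the SDK):

// Hypothetical helper: maps an azimuth in [-180, 180] to [0, 360)
private static float normalizeAzimuth(float azimuthDegrees) {
    return (azimuthDegrees + 360f) % 360f;
}
// e.g. normalizeAzimuth(-90f) returns 270f: west in both conventions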

values[1] is the angle around the X axis: the pitch angle. Starting from the static state, it changes as the device tips forward and backward.
values[2] is the angle around the Y axis: the roll angle. Starting from the static state, it changes as the device rolls left and right.

It is worth settling on a uniform way of getting the direction: code that processes data from the first method will not port cleanly to the second, given the different value ranges.

Look at the following method:

public static float[] getOrientation(float[] R, float[] values)
Since: API level 3
Computes the device's orientation based on the rotation matrix. When it returns, the array values is filled with the result:
values[0]: azimuth, rotation around the Z axis.
values[1]: pitch, rotation around the X axis.
values[2]: roll, rotation around the Y axis.
The reference coordinate system used is different from the world coordinate system defined for the rotation matrix: X is defined as the vector product Y·Z (it is tangential to the ground at the device's current location and roughly points west). Y is tangential to the ground at the device's current location and points towards magnetic north. Z points towards the center of the Earth and is perpendicular to the ground.
All three angles above are in radians and positive in the counter-clockwise direction.

Usually we don't need the return value of this function; it fills the values[] array based on the rotation-matrix parameter R[], and that values[] array is what we want.
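As a minimal sketch of the call (assuming r is a 9-element rotation matrix that has already been filled by getRotationMatrix(), explained next):

float[] values = new float[3];
SensorManager.getOrientation(r, values);
// values[0], values[1], values[2] now hold azimuth, pitch and roll, in radians
float azimuthDegrees = (float) Math.toDegrees(values[0]);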

So what does R[] mean, and how is it obtained?

R[] is a rotation matrix that combines the magnetic field and acceleration data; you can think of it as the raw direction data.
R[] is obtained (i.e. populated) with the following static method:
public static boolean getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic)

The parameters are as follows:

The first is the R[] array we need to populate; its size is 9.
The second is the inclination matrix I[], which transforms the geomagnetic vector into the same coordinate space as gravity; it can be left as null if you don't need it.
The third is an array of size 3 holding the data obtained from the accelerometer in onSensorChanged().
The fourth is an array of size 3 holding the data obtained from the magnetic field sensor in onSensorChanged().
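If you do want the second parameter, here is a minimal sketch that also reads the geomagnetic inclination from it via SensorManager.getInclination(); the arrays accelerometerValues and magneticFieldValues are assumed to be filled in onSensorChanged(), as in the demo below:

float[] r = new float[9];
float[] i = new float[9];
// getRotationMatrix() returns false when the readings are unreliable (e.g. free fall)
if (SensorManager.getRotationMatrix(r, i, accelerometerValues, magneticFieldValues)) {
    float inclination = SensorManager.getInclination(i); // geomagnetic dip angle, in radians
}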

All right, that's the basic logic. Here is a simple example that tests the direction; it continuously monitors the user's direction.

/**
 * A demo of obtaining orientation data from the magnetic field
 * and accelerometer sensors.
 * @author OctoberShiner
 */
package uni.sensor;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class OrientationActivity extends Activity {

    private static final String TAG = "sensor";
    private SensorManager sm;
    // Two sensors are needed: accelerometer and magnetic field
    private Sensor aSensor;
    private Sensor mSensor;
    float[] accelerometerValues = new float[3];
    float[] magneticFieldValues = new float[3];

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        aSensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        mSensor = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        sm.registerListener(myListener, aSensor, SensorManager.SENSOR_DELAY_NORMAL);
        sm.registerListener(myListener, mSensor, SensorManager.SENSOR_DELAY_NORMAL);
        // Compute and log the initial orientation
        calculateOrientation();
    }

    // Again: remember to release the listener when the activity pauses
    @Override
    public void onPause() {
        sm.unregisterListener(myListener);
        super.onPause();
    }

    final SensorEventListener myListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent sensorEvent) {
            // The system reuses the event's values array, so copy it
            if (sensorEvent.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                magneticFieldValues = sensorEvent.values.clone();
            if (sensorEvent.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                accelerometerValues = sensorEvent.values.clone();
            calculateOrientation();
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    };

    private void calculateOrientation() {
        float[] values = new float[3];
        float[] r = new float[9];
        SensorManager.getRotationMatrix(r, null, accelerometerValues, magneticFieldValues);
        SensorManager.getOrientation(r, values);
        // getOrientation() returns radians; convert to degrees
        values[0] = (float) Math.toDegrees(values[0]);
        Log.i(TAG, values[0] + "");
        values[1] = (float) Math.toDegrees(values[1]);
        values[2] = (float) Math.toDegrees(values[2]);
        if (values[0] >= -5 && values[0] < 5) {
            Log.i(TAG, "North");
        } else if (values[0] >= 5 && values[0] < 85) {
            Log.i(TAG, "Northeast");
        } else if (values[0] >= 85 && values[0] <= 95) {
            Log.i(TAG, "East");
        } else if (values[0] > 95 && values[0] < 175) {
            Log.i(TAG, "Southeast");
        } else if ((values[0] >= 175 && values[0] <= 180)
                || (values[0] >= -180 && values[0] < -175)) {
            Log.i(TAG, "South");
        } else if (values[0] >= -175 && values[0] < -95) {
            Log.i(TAG, "Southwest");
        } else if (values[0] >= -95 && values[0] < -85) {
            Log.i(TAG, "West");
        } else if (values[0] >= -85 && values[0] < -5) {
            Log.i(TAG, "Northwest");
        }
    }
}

Training time is very tight, and taking time to write this summary is tiring, but I feel I have learned a lot. If there is time, I also want to share the second method, which is much simpler by comparison. In fact, you can fully reuse the code from the previous article, "Android method for obtaining gravitational induction acceleration based on sensors".

Just change the two occurrences of Sensor.TYPE_ACCELEROMETER into Sensor.TYPE_ORIENTATION. Still, the method shared today is the one best worth mastering; it should be the future Android standard.
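For completeness, here is a minimal sketch of that second method, reusing the sm and TAG fields from the demo above; remember that Sensor.TYPE_ORIENTATION is deprecated, so treat this as legacy code:

// Legacy approach: read the deprecated orientation sensor directly
Sensor oSensor = sm.getDefaultSensor(Sensor.TYPE_ORIENTATION);
sm.registerListener(new SensorEventListener() {
    public void onSensorChanged(SensorEvent event) {
        // values arrive already in degrees: values[0] is azimuth (0..360),
        // values[1] is pitch, values[2] is roll
        Log.i(TAG, "azimuth: " + event.values[0]);
    }
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, oSensor, SensorManager.SENSOR_DELAY_NORMAL);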

That is it for sensors for now; it is time to look at processes and threads. Actually, the hardware package also contains another very important class, Camera. I am sure you have heard of Android scanner apps, which are very powerful. I will share that with you later if I have time.

My next plan should be threads and the Activity, and then GeoCode.

I should say I have no advisor; I have been researching all this from the SDK on my own, and it is a bit tiring. Advice from experts is welcome.

I hope this article helps you with your Android development.
