For an accelerometer, we only need its three-axis data, and the Android upper layers only need to receive that data in a fixed data structure.
1. About the Linux driver layer
The driver registers an input device, reads the three-axis data from the sensor's registers over the I2C interface, and reports the three values through the input subsystem.
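As a minimal sketch of that reporting path (the helper `accel_read_xyz()` and the register layout behind it are hypothetical; a real driver reads whatever registers its datasheet specifies):

```c
#include <linux/input.h>
#include <linux/i2c.h>

/* Hypothetical helper: reads the X/Y/Z registers over I2C and
 * returns 0 on success. */
static int accel_read_xyz(struct i2c_client *client, s16 *x, s16 *y, s16 *z);

static void accel_report(struct input_dev *input, struct i2c_client *client)
{
	s16 x, y, z;

	if (accel_read_xyz(client, &x, &y, &z))
		return;

	/* Report the three axes through the input subsystem. */
	input_report_abs(input, ABS_X, x);
	input_report_abs(input, ABS_Y, y);
	input_report_abs(input, ABS_Z, z);
	input_sync(input);
}
```

The input device itself would have been set up with `input_set_abs_params()` for `ABS_X`/`ABS_Y`/`ABS_Z` at registration time.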
2. About the Android hardware abstraction layer
There is a sensors.h header file under hardware/libhardware/include/hardware; the data structure to fill in is defined there.
The structure is as follows:
typedef struct {
    union {
        float v[3];
        struct {
            float x;
            float y;
            float z;
        };
        struct {
            float azimuth;
            float pitch;
            float roll;
        };
    };
    int8_t status;
    uint8_t reserved[3];
} sensors_vec_t;

/**
 * Union of the various types of sensor data
 * that can be returned.
 */
typedef struct sensors_event_t {
    /* must be sizeof(struct sensors_event_t) */
    int32_t version;

    /* sensor identifier */
    int32_t sensor;

    /* sensor type */
    int32_t type;

    /* reserved */
    int32_t reserved0;

    /* time is in nanosecond */
    int64_t timestamp;

    union {
        float data[16];

        /* acceleration values are in meter per second per second (m/s^2) */
        sensors_vec_t acceleration;

        /* magnetic vector values are in micro-Tesla (uT) */
        sensors_vec_t magnetic;

        /* orientation values are in degrees */
        sensors_vec_t orientation;

        /* gyroscope values are in rad/s */
        sensors_vec_t gyro;

        /* temperature is in degrees centigrade (Celsius) */
        float temperature;

        /* distance in centimeters */
        float distance;

        /* light in SI lux units */
        float light;

        /* pressure in hectopascal (hPa) */
        float pressure;

        /* relative humidity in percent */
        float relative_humidity;
    };

    uint32_t reserved1[4];
} sensors_event_t;
The Android source tree has dedicated sensor classes for this abstraction layer; they process the reported data and hand it to JNI, which forwards it to the upper Android layers.
The code is in the device/samsung/tuna/libsensors folder.
Specific porting is as follows:
static struct sensor_t sSensorList[LOCAL_SENSORS + MPLSensor::numSensors] = {
    { "GP2A Light sensor", "Sharp", 1,
      SENSORS_LIGHT_HANDLE, SENSOR_TYPE_LIGHT,
      3000.0f, 1.0f, 0.75f, 0, { } },
    { "GP2A Proximity sensor", "Sharp", 1,
      SENSORS_PROXIMITY_HANDLE, SENSOR_TYPE_PROXIMITY,
      5.0f, 5.0f, 0.75f, 0, { } },
    { "BMP180 Pressure sensor", "Bosch", 1,
      SENSORS_PRESSURE_HANDLE, SENSOR_TYPE_PRESSURE,
      1100.0f, 0.01f, 0.67f, 20000, { } },
};
You need to change this list to match the sensors on your own device.
private:
    enum {
        mpl = 0,          // all mpl entries must be consecutive and in this order
        mpl_accel,
        mpl_timer,
        light,
        proximity,
        pressure,
        numSensorDrivers, // wake pipe goes here
        mpl_power,        // special handle for MPL PM interaction
        numFds,
    };
Next, map each sensor handle to its driver index:
int handleToDriver(int handle) const {
    switch (handle) {
        case ID_RV:
        case ID_LA:
        case ID_GR:
        case ID_GY:
        case ID_A:
        case ID_M:
        case ID_O:
            return mpl;
        case ID_L:
            return light;
        case ID_P:
            return proximity;
        case ID_PR:
            return pressure;
    }
    return -EINVAL;
}
This function matches each handle ID against the driver indices defined in the enum above. For example, if your three-axis accelerometer has its own driver entry, you would add:

case ID_A:
    return accel;
sensors_poll_context_t::sensors_poll_context_t()
{
    FUNC_LOG;
    MPLSensor* p_mplsen = new MPLSensor();
    // set up the callback object for handling mpl callbacks
    setCallbackObject(p_mplsen);
    numSensors = LOCAL_SENSORS +
        p_mplsen->populateSensorList(sSensorList + LOCAL_SENSORS,
            sizeof(sSensorList[0]) * (ARRAY_SIZE(sSensorList) - LOCAL_SENSORS));
    mSensors[mpl] = p_mplsen;
    mPollFds[mpl].fd = mSensors[mpl]->getFd();
    mPollFds[mpl].events = POLLIN;
    mPollFds[mpl].revents = 0;
    ...
}
In the constructor, register your three-axis accelerometer in the same way.
Create a new class for your sensor: add two files, accelsensor.cpp and accelsensor.h. Model the implementation on the existing sensor classes in libsensors.
After porting, run mm to build. Compilation produces a sensors.*.so, where * is determined by your Android.mk. Then boot the Android system and the sensor can be used.
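A hedged sketch of the relevant Android.mk lines (the module name suffix "mydevice" and the source file names are placeholders; the suffix after "sensors." is what determines the * in sensors.*.so, and it must match the value the HAL loader looks up, such as ro.hardware):

```makefile
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

# "mydevice" is a placeholder; the suffix decides the * in sensors.*.so
LOCAL_MODULE := sensors.mydevice
LOCAL_MODULE_PATH := $(TARGET_OUT_SHARED_LIBRARIES)/hw
LOCAL_SRC_FILES := sensors.cpp accelsensor.cpp
LOCAL_SHARED_LIBRARIES := liblog libcutils libdl
include $(BUILD_SHARED_LIBRARY)
```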
3. Test
You can download an accelerometer tester app for testing, or simply rotate the device: automatic screen rotation relies on the gravity acceleration data.