G-sensor series 1

Android accelerometer (G-sensor)

The Android accelerometer has the sensor type Sensor.TYPE_ACCELEROMETER.

Accelerometer readings are delivered through android.hardware.SensorEvent.

The accelerometer reports values in units of acceleration, m/s^2 (meters per second squared). The values for the three axes are:

values[0]: acceleration along the x-axis

values[1]: acceleration along the y-axis

values[2]: acceleration along the z-axis

The x, y, and z directions are defined relative to the device itself (imagine the phone lying flat, screen up):

The x-axis runs across the phone; toward the right edge is positive.

The y-axis runs along the length of the phone; toward the top of the phone is positive.

The z-axis is perpendicular to the screen; toward the sky is positive and toward the ground is negative.

        y > 0 (toward the top of the phone)
          ^
          |
          |
          O -----------> x > 0 (toward the right edge)
         /
        /
    z > 0 (toward the sky)

O: origin (x = 0, y = 0, z = 0)

Note that because of the Earth's gravitational acceleration g (about 9.8 m/s^2),

the true acceleration applied to the phone along the z-axis is the returned z value minus 9.8 m/s^2.

For example, if you push the phone upward with an acceleration of 2 m/s^2, the returned z value is 11.8 m/s^2.

Conversely, if the phone is accelerating downward at 2 m/s^2, the returned z value is 7.8 m/s^2.

No such offset applies to the x and y readings while the phone is held level.
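
To make the arithmetic concrete, here is a small fragment (a sketch, not from the original text) that removes the gravity offset from the z reading while the phone lies flat; SensorManager.GRAVITY_EARTH is the SDK constant for standard gravity (about 9.8 m/s^2).

// Inside an accelerometer callback such as onSensorChanged(int sensor, float[] values),
// with the phone lying flat, screen up:
float zRaw = values[2];                             // includes gravity, about 9.8 at rest
float zLinear = zRaw - SensorManager.GRAVITY_EARTH; // acceleration actually applied along z
// Examples from the text: zRaw = 11.8 -> zLinear is about +2.0 (pushed upward)
//                         zRaw =  7.8 -> zLinear is about -2.0 (accelerating downward)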

http://blog.csdn.net/stevenliyong/archive/2009/09/13/4547568.aspx

In-depth exploration of Android Sensors
From: http://www.ibm.com/developerworks/cn/opensource/os-android-sensor/index.html

Android is a rich platform for application development, with an attractive set of user interface elements and data management capabilities. Android also offers a healthy assortment of interfacing options. This article describes how to monitor your environment with the various sensor options Android provides; the sample code shows how to record audio on an Android phone. Want to build your own baby monitor, or answer the phone or open a door with your voice? Learn how to put the hardware capabilities of an Android-equipped device to work.

Introduction

For Java developers, the Android platform provides an ideal environment for creating innovative applications that use hardware sensors. This article looks at several of the interfacing options available to Android applications, including the sensor subsystem and recording audio clips.

What applications can be built with the hardware capabilities of an Android-equipped device? Anything that needs electronic eyes and ears comes to mind: baby monitors, security systems, even seismographs. You cannot be in two places at once, but Android offers some practical ways around that. As you read this article, keep in mind that the Android devices you employ are not limited to mobile phones; they might also be deployed in fixed locations with a wireless connection such as EDGE or WiFi. You can download the source files for the examples in this article.


Android sensor function

One of the appeals of the Android platform is that you can access the "good stuff" inside the device. Historically, access to a device's underlying hardware has been a pain point for mobile developers. Although the Android Java environment still sits between you and the hardware, the Android development team has brought many hardware capabilities to the fore. The platform is open source, so you are free to write the code you need to get your job done.

If you have not installed Android, you can download the Android SDK. You can also browse the contents of the android.hardware package while following the examples in this article. The android.media package contains classes that provide useful and novel functionality as well.

The Android SDK provides the following hardware-oriented features.

Table 1. Hardware-oriented features provided by the Android SDK

android.hardware.Camera
A class that lets applications interact with the camera to take photos, grab preview-screen images, and modify the parameters that control how the camera operates.

android.hardware.SensorManager
A class that provides access to the sensors on the Android platform. Not every Android-equipped device supports every sensor defined in SensorManager, although the possibilities are exciting. (See below for an overview of the available sensors.)

android.hardware.SensorListener
The interface a class implements when it wants to receive sensor values as they change in real time. An application implements this interface to monitor one or more of the sensors available in the hardware. The code in this article contains a class that implements this interface; once implemented, it can monitor the device orientation and the built-in accelerometer.

android.media.MediaRecorder
A class used to record media samples. It can be useful for recording audio activity at a particular location, such as a nursery. Audio clips can also be analyzed for identification purposes in an access-control or security application; for example, it could let you open a door with your voice instead of picking up the key from the real-estate agent.

android.media.FaceDetector
A class that performs basic recognition of faces contained in a bitmap. Because no two faces are identical, it could serve as a way to unlock the device without remembering a password: the mobile phone's version of biometric identification.

android.os.*
A package of several useful classes for interacting with the operating environment, including power management, file access, processes, and messaging. Like many mobile devices, Android-powered phones can consume a lot of power, so waking the device at the right moment to monitor an event of interest is a primary design concern.

java.util.Date, java.util.Timer, java.util.TimerTask
Dates and times often matter when measuring real-world events. The java.util.Date class provides a timestamp when a specific event or condition occurs, while java.util.Timer and java.util.TimerTask execute periodic or scheduled tasks, respectively.

android.hardware.SensorManager contains several constants that represent different aspects of the Android sensor system, including:

Sensor type
Orientation, accelerometer, light, magnetic field, proximity, temperature, and so on.
Sampling rate
Fastest, game, normal, user interface. When an application requests a particular sampling rate, it is really only a hint, or suggestion, to the sensor subsystem; there is no guarantee that the requested rate will be delivered.
Accuracy
High, medium, low, and unreliable.
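
For orientation, the snippet below (not from the original article, just a sketch) shows the SensorListener-era constants that the example later in this article relies on; these constants are deprecated in later SDK releases.

// Sensor types are bit-mask values in the old SensorManager API
int sensors = SensorManager.SENSOR_ORIENTATION | SensorManager.SENSOR_ACCELEROMETER;

// Sampling-rate hints passed to registerListener
int rate = SensorManager.SENSOR_DELAY_GAME; // or SENSOR_DELAY_FASTEST, SENSOR_DELAY_NORMAL, SENSOR_DELAY_UI

// Accuracy values delivered to onAccuracyChanged:
// SENSOR_STATUS_ACCURACY_HIGH, SENSOR_STATUS_ACCURACY_MEDIUM,
// SENSOR_STATUS_ACCURACY_LOW, SENSOR_STATUS_UNRELIABLE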

The SensorListener interface is the center of a sensor application. It requires two methods:

* onSensorChanged(int sensor, float values[]) is called whenever a sensor value changes. The method is invoked only for sensors being monitored by the application (more on this below). Its arguments are an integer identifying which sensor changed and an array of float values representing the sensor data itself. Some sensors provide a single data value, while others provide three float values. The orientation and accelerometer sensors each provide three values.
* onAccuracyChanged(int sensor, int accuracy) is called when a sensor's accuracy changes. Its arguments are two integers: one identifying the sensor and the other representing the sensor's new accuracy value.

To interact with a sensor, an application must register to listen for activity related to one or more sensors. Registration is done with the registerListener method of the SensorManager class. The sample code in this article demonstrates how to register and unregister a SensorListener.

Remember, not every Android-powered device supports every sensor defined in the SDK. If a particular sensor is unavailable on a given device, your application should degrade gracefully.
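
One way to degrade gracefully is to ask the SensorManager whether a sensor exists before relying on it. The sketch below uses the newer Sensor class and getDefaultSensor rather than the deprecated SensorListener-era API used in the listing that follows; treat it as an illustrative fragment, not part of the original sample.

// Requires android.hardware.Sensor and android.hardware.SensorManager
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
if (sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) == null) {
    // No accelerometer on this device: disable or hide the motion-based
    // features instead of failing at run time.
}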


Sensor example

The sample application simply monitors changes reported by the orientation and accelerometer sensors (for the source code, see Download). When a change is received, the sensor values are displayed in TextView widgets on the screen. Figure 1 shows the application running.

Figure 1. Monitoring acceleration and orientation

The application was created with the Eclipse environment and the Android Developer Tools plug-in. (For details about developing Android applications with Eclipse, see the references.) Listing 1 shows the application's code.

Listing 1. IBMEyes.java

package com.msi.ibm.eyes;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;
import android.hardware.SensorManager;
import android.hardware.SensorListener;

public class IBMEyes extends Activity implements SensorListener {
    final String TAG = "IBMEyes";
    SensorManager sm = null;
    TextView xViewA = null;
    TextView yViewA = null;
    TextView zViewA = null;
    TextView xViewO = null;
    TextView yViewO = null;
    TextView zViewO = null;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Get a reference to the SensorManager
        sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        setContentView(R.layout.main);
        xViewA = (TextView) findViewById(R.id.xbox);
        yViewA = (TextView) findViewById(R.id.ybox);
        zViewA = (TextView) findViewById(R.id.zbox);
        xViewO = (TextView) findViewById(R.id.xboxo);
        yViewO = (TextView) findViewById(R.id.yboxo);
        zViewO = (TextView) findViewById(R.id.zboxo);
    }

    public void onSensorChanged(int sensor, float[] values) {
        synchronized (this) {
            Log.d(TAG, "onSensorChanged: " + sensor + ", x: " +
                    values[0] + ", y: " + values[1] + ", z: " + values[2]);
            if (sensor == SensorManager.SENSOR_ORIENTATION) {
                xViewO.setText("Orientation X: " + values[0]);
                yViewO.setText("Orientation Y: " + values[1]);
                zViewO.setText("Orientation Z: " + values[2]);
            }
            if (sensor == SensorManager.SENSOR_ACCELEROMETER) {
                xViewA.setText("Accel X: " + values[0]);
                yViewA.setText("Accel Y: " + values[1]);
                zViewA.setText("Accel Z: " + values[2]);
            }
        }
    }

    public void onAccuracyChanged(int sensor, int accuracy) {
        Log.d(TAG, "onAccuracyChanged: " + sensor + ", accuracy: " + accuracy);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register this class as a listener for the orientation and accelerometer sensors
        sm.registerListener(this,
                SensorManager.SENSOR_ORIENTATION | SensorManager.SENSOR_ACCELEROMETER,
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onStop() {
        // Unregister the listener
        sm.unregisterListener(this);
        super.onStop();
    }
}

The application was written as a normal Activity because all it does is update the screen with data obtained from the sensors. Building the application as a Service would be more appropriate in cases where the device might be running other activities in the foreground.

The activity's onCreate method obtains a reference to the SensorManager, which hosts all of the sensor-related functionality. onCreate also obtains references to the six TextView widgets that will be updated with the sensor data values.

The onResume() method uses the SensorManager reference to register for sensor updates with the registerListener method:

* The first parameter is an instance of a class that implements the SensorListener interface.
* The second parameter is a bit mask of the desired sensors. In this case, the application requests data from SENSOR_ORIENTATION and SENSOR_ACCELEROMETER.
* The third parameter is a hint to the system indicating how quickly the application needs the sensor values updated.

When the application (Activity) is paused, it must unregister the listener so that no further sensor updates are received. This is done with the unregisterListener method of SensorManager; the only parameter is the SensorListener instance.

In the calls to both registerListener and unregisterListener, the application passes the keyword this. Note the implements keyword in the class definition, which declares that the class implements the SensorListener interface; that is what makes it valid to pass this to registerListener and unregisterListener.

A SensorListener must implement two methods: onSensorChanged and onAccuracyChanged. The sample application does not care about sensor accuracy, only about the current x, y, and z values of the sensors, so the onAccuracyChanged method does nothing beyond writing a log entry each time it is called.

The onSensorChanged method is called frequently because the accelerometer and orientation sensors send data at a rapid rate. It checks the first parameter to determine which sensor has sent data; once the reporting sensor is identified, the appropriate UI elements are updated with the data contained in the array of floats passed as the second parameter. The example simply displays these values, but a more advanced application could analyze them, compare them against earlier values, or run a pattern-recognition algorithm to determine what the user (or the external environment) is doing.
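
As one illustration of that kind of analysis (a sketch, not part of the original sample), the fragment below flags a simple "shake" whenever the overall acceleration magnitude rises well above gravity; the 15 m/s^2 threshold is an arbitrary assumption.

// Inside onSensorChanged, after confirming sensor == SensorManager.SENSOR_ACCELEROMETER:
double magnitude = Math.sqrt(values[0] * values[0]
        + values[1] * values[1]
        + values[2] * values[2]);
// At rest the magnitude is roughly 9.8 (gravity alone); a much larger value suggests a shake.
if (magnitude > 15.0) { // threshold chosen arbitrarily for illustration
    Log.d(TAG, "shake detected, magnitude: " + magnitude);
}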

Now that you have seen the sensor subsystem, the next section reviews sample code for recording audio on an Android phone. The example was run on the dev1 development device.


Using MediaRecorder

The android.media package contains classes for interacting with the media subsystem. The android.media.MediaRecorder class is used for media sampling, including audio and video. MediaRecorder operates as a state machine: you set the various parameters, such as the input source and output format, and then recording proceeds for as long as desired until it is stopped.

The code in Listing 2 records audio on an Android device. The code shown does not include the application's UI elements (for the complete source code, see Download).

Listing 2. Recording an audio clip

MediaRecorder mrec;
File audiofile = null;
private static final String TAG = "SoundRecordingDemo";

protected void startRecording() throws IOException
{
    mrec = new MediaRecorder(); // instantiate the recorder (see the description below)
    mrec.setAudioSource(MediaRecorder.AudioSource.MIC);
    mrec.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mrec.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    if (audiofile == null)
    {
        File sampleDir = Environment.getExternalStorageDirectory();
        try
        {
            audiofile = File.createTempFile("ibm", ".3gp", sampleDir);
        }
        catch (IOException e)
        {
            Log.e(TAG, "sdcard access error");
            return;
        }
    }
    mrec.setOutputFile(audiofile.getAbsolutePath());
    mrec.prepare();
    mrec.start();
}

protected void stopRecording()
{
    mrec.stop();
    mrec.release();
    processAudioFile();
}

protected void processAudioFile()
{
    ContentValues values = new ContentValues(3);
    long current = System.currentTimeMillis();
    values.put(MediaStore.Audio.Media.TITLE, "audio" + audiofile.getName());
    values.put(MediaStore.Audio.Media.DATE_ADDED, (int) (current / 1000));
    values.put(MediaStore.Audio.Media.MIME_TYPE, "audio/3gpp");
    values.put(MediaStore.Audio.Media.DATA, audiofile.getAbsolutePath());
    ContentResolver contentResolver = getContentResolver();

    Uri base = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
    Uri newUri = contentResolver.insert(base, values);

    sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE, newUri));
}

The startRecording method instantiates and initializes the MediaRecorder instance:

* The input source is set to the microphone (MIC).
* The output format is set to 3GPP (*.3gp files), a media format geared toward mobile devices.
* The encoder is set to AMR_NB, an audio format with an 8-kHz sampling rate; NB stands for narrowband. The SDK documentation explains the available data formats and encoders.

The audio file is stored on the storage card rather than in internal memory. Environment.getExternalStorageDirectory() returns the location of the storage card, and a temporary file name is created there. The file is then associated with the MediaRecorder instance by calling the setOutputFile method; the audio data is stored in this file.

The prepare method initializes the MediaRecorder. The start method is called to begin the recording process; recording to the file on the storage card continues until the stop method is called. The release method frees the resources allocated to the MediaRecorder instance.

Once the audio sample has been taken, there are a few options:

* Add the audio to the media library on the device.
* Perform some pattern-recognition steps to identify the sound:
  o Is that the baby crying?
  o Is that the owner's voice, and should the phone be unlocked?
  o Is that the "open sesame" phrase that opens the door to the "secret passage"?
* Automatically upload the audio file to a network location for processing (a sketch of this step appears after the next paragraph).

In the sample code, the processAudioFile method adds the audio to the media library; an Intent notifies the media application on the device that new content is available.
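
The upload option is not implemented in the sample. A minimal sketch is shown below; the endpoint URL is a placeholder, the code ignores threading and error handling for brevity, and it would also require the android.permission.INTERNET permission.

// Hypothetical helper: POST the recorded clip to a server for processing.
// Requires java.io.*, java.net.HttpURLConnection, and java.net.URL.
protected void uploadAudioFile(String path) throws IOException
{
    HttpURLConnection conn = (HttpURLConnection)
            new URL("http://example.com/upload").openConnection(); // placeholder URL
    conn.setDoOutput(true);
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Content-Type", "audio/3gpp");
    OutputStream out = conn.getOutputStream();
    InputStream in = new FileInputStream(new File(path));
    byte[] buffer = new byte[4096];
    int count;
    while ((count = in.read(buffer)) != -1)
    {
        out.write(buffer, 0, count);
    }
    in.close();
    out.close();
    Log.d(TAG, "upload response code: " + conn.getResponseCode());
    conn.disconnect();
}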

Note that if you try this code as-is, it will not record any audio at first: you will see the file being created, but it will contain no sound, because the required permission is missing. You need to add the permission to the AndroidManifest.xml file:
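
The permission element itself was lost from this copy of the article. The entry needed for audio capture is the standard RECORD_AUDIO permission; on later platform versions, WRITE_EXTERNAL_STORAGE may also be needed to write the file to the storage card:

<!-- Inside the <manifest> element of AndroidManifest.xml -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Possibly also required on newer platform versions: -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />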

Now you have learned a bit about Android sensors and audio recording. The next section takes a broader look at the application architecture for a data-collection and reporting system.


Android as a sensor Platform

The Android platform offers a variety of sensor options for monitoring the environment. With its array of input and analysis options plus strong computing and networking capabilities, Android is a compelling platform for building real-world systems. Figure 2 shows a simple view of the relationships among inputs, application logic, and notification methods or outputs.

Figure 2. Block diagram of an Android-centered sensor system

This architecture is flexible: the application logic can be split between the local Android device and server-side resources, where larger databases and more computing power are available. For example, an audio clip recorded on the local Android device could be posted to a Web server and compared against a database of audio patterns. Obviously, this is just the tip of the iceberg; hopefully it encourages you to dig deeper and take the Android platform beyond the phone.


Conclusion

This article introduced Android sensors. The sample application measured orientation and acceleration, and used the MediaRecorder class to record audio. Android is a flexible and attractive platform for building real-world systems; the Android ecosystem is developing rapidly and continues to grow, so it is well worth watching.

Download

Description              Name                            Size   Download method
Eyes source code         os-android-sensorEyes.zip       28 KB  HTTP
IBMAudio source code     os-android-sensorIBMAudio.zip   33 KB  HTTP
