Android is a rich platform for application development, offering attractive user interface elements, data management functions, and a wide range of interfacing options. This article describes how to monitor your environment with the various Android sensor options, and the sample code shows how to record audio on an Android phone. Want to build your own baby monitor, or answer a call or open a door with your voice? Learn how to use the hardware capabilities of Android-equipped devices.
Introduction
For Java developers, Android is an ideal platform for creating innovative applications that use hardware sensors. This article surveys some of the interfacing options available to Android applications, including the sensor subsystem and recording audio clips.
What kinds of applications can be built with the hardware capabilities of an Android-equipped device? Anything that needs electronic eyes and ears: baby monitors, security systems, even seismographs. You cannot literally be in two places at once, but Android offers some practical ways to come close. As you read this article, keep in mind that the Android devices in question are not limited to mobile phones; they may also be devices deployed in fixed locations and connected over wireless networks such as EDGE or WiFi. Download the source files for the examples in this article.
Android sensor capabilities
One of the novelties of working with the Android platform is that you can get at the "good stuff" inside the device. Historically, the inability to access the underlying hardware has frustrated mobile developers. Although the Android Java environment still sits between you and the device, the Android development team has brought many of the hardware features to the surface. And because the platform is open source, you are free to write the code needed to accomplish your tasks.
If you do not have Android installed, you can download the Android SDK, browse the contents of the android.hardware package, and follow along with the examples in this article. The android.media package also contains classes with useful and novel capabilities.
The Android SDK provides the hardware-oriented features described in Table 1.
Table 1. Hardware-oriented features provided by the Android SDK

| Feature | Description |
| --- | --- |
| android.hardware.Camera | A class that enables an application to interact with the camera to snap a photo, acquire images for a preview screen, and modify the parameters used to govern how the camera operates. |
| android.hardware.SensorManager | A class that permits access to the sensors available on the Android platform. Not every Android-equipped device supports every sensor in SensorManager, though the possibilities are exciting. (See below for an overview of the available sensors.) |
| android.hardware.SensorListener | An interface implemented by a class that wants to receive updates in real time as sensor values change. An application implements this interface to monitor one or more of the sensors available in the hardware. For example, the code in this article implements this interface to monitor the device's orientation and its built-in accelerometer. |
| android.media.MediaRecorder | A class used to record media samples, which can be useful for recording audio activity in a particular location (such as a nursery). Audio clips can also be analyzed for identification purposes in an access-control or security application. For example, it could let you open a door with your voice and spare you the trip to get the key from the real estate agent. |
| android.FaceDetector | A class that permits basic recognition of a person's face as contained in a bitmap. No two faces are identical, so this class could serve as a device-locking mechanism that does not require remembering a password: the biometric identification capability of a mobile phone. |
| android.os.* | A package containing several useful classes for interacting with the operating environment, including power management, file, process, and messaging classes. Like many mobile devices, Android-powered phones can consume a great deal of power, so waking the device at the right time to monitor an event of interest is a primary design concern. |
| java.util.Date, java.util.Timer, java.util.TimerTask | Dates and times are often significant when measuring real-world events. The java.util.Date class, for example, lets you timestamp the occurrence of a particular event or condition, while java.util.Timer and java.util.TimerTask let you perform periodic or one-shot tasks, respectively. (See the sketch after this table.) |
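As a minimal sketch of how java.util.Timer and java.util.TimerTask might drive periodic environment monitoring, consider the following class. It is illustrative only and not part of the article's sample code: the PeriodicMonitor class, its checkEnvironment helper, and the 60-second period are all assumptions made for the example.

import java.util.Date;
import java.util.Timer;
import java.util.TimerTask;

import android.util.Log;

public class PeriodicMonitor {
    private static final String TAG = "PeriodicMonitor";   // hypothetical log tag
    private final Timer timer = new Timer();

    // Schedule a check every 60 seconds. The period is arbitrary; a real
    // application would balance responsiveness against battery drain.
    public void startMonitoring() {
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                checkEnvironment(new Date());   // timestamp the observation
            }
        }, 0, 60 * 1000);
    }

    public void stopMonitoring() {
        timer.cancel();   // stop the periodic wake-ups when they are no longer needed
    }

    // Placeholder: a real implementation might read cached sensor values,
    // write them to storage, or decide whether to start an audio recording.
    private void checkEnvironment(Date when) {
        Log.d(TAG, "Environment check at " + when);
    }
}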
android.hardware.SensorManager contains several constants that represent different aspects of the Android sensor system, including:

- Sensor type: orientation, accelerometer, light, magnetic field, proximity, temperature, and so on.
- Sampling rate: fastest, game, normal, user interface. When an application requests a specific sampling rate, it is really only a hint, or suggestion, to the sensor subsystem; there is no guarantee that the requested rate will be delivered.
- Accuracy: high, low, medium, unreliable.
The SensorListener interface is at the heart of sensor applications. It includes two required methods:
- The onSensorChanged(int sensor, float values[]) method is invoked whenever a sensor value changes. It is invoked only for the sensors being monitored by the application (more on that below). Its arguments are an integer identifying the sensor that changed and an array of float values representing the sensor data itself. Some sensors provide only a single data value, while others provide three float values; the orientation and accelerometer sensors each provide three.
- The onAccuracyChanged(int sensor, int accuracy) method is invoked when the accuracy of a sensor changes. Its arguments are two integers: one identifying the sensor and the other representing the sensor's new accuracy value.
To interact with a sensor, an application must register to listen for activity related to one or more sensors. Registration is done with the registerListener method of the SensorManager class. The sample code in this article demonstrates how to register and unregister a SensorListener.
Remember, not every Android-equipped device supports every sensor defined in the SDK, so your application should degrade gracefully if a particular sensor is not available on a given device.
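One way to degrade gracefully is to check what the device reports before registering. The following is a rough sketch only, assuming the same early-SDK SensorManager API used in Listing 1 (where getSensors() returns a bitmask of supported sensors) and that the method lives inside an Activity that implements SensorListener; it is not part of the downloadable sample.

// Sketch: query the sensor bitmask before registering so the application
// can fall back to reduced functionality when a sensor is missing.
private void registerAvailableSensors() {
    SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
    int available = sm.getSensors();   // bitmask of sensors this device reports

    int wanted = 0;
    if ((available & SensorManager.SENSOR_ORIENTATION) != 0) {
        wanted |= SensorManager.SENSOR_ORIENTATION;
    }
    if ((available & SensorManager.SENSOR_ACCELEROMETER) != 0) {
        wanted |= SensorManager.SENSOR_ACCELEROMETER;
    }

    if (wanted != 0) {
        sm.registerListener(this, wanted, SensorManager.SENSOR_DELAY_NORMAL);
    } else {
        // No usable sensor: continue with a reduced feature set instead of failing.
        Log.w("SensorCheck", "No orientation or accelerometer sensor available");
    }
}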
Sensor example
The sample application simply monitors changes to the orientation and accelerometer sensors (see Download for the source code). When changes are received, the sensor values are displayed on the screen in TextView widgets. Figure 1 shows the application in action.
The application was created with the Eclipse environment and the Android Developer Tools plug-in. (For more about developing Android applications with Eclipse, see Resources.) Listing 1 shows the code for the application.
Listing 1. IBMEyes.java
package com.msi.ibm.eyes;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;
import android.hardware.SensorManager;
import android.hardware.SensorListener;

public class IBMEyes extends Activity implements SensorListener {
    final String tag = "IBMEyes";
    SensorManager sm = null;
    TextView xViewA = null;
    TextView yViewA = null;
    TextView zViewA = null;
    TextView xViewO = null;
    TextView yViewO = null;
    TextView zViewO = null;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // get reference to SensorManager
        sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        setContentView(R.layout.main);
        xViewA = (TextView) findViewById(R.id.xbox);
        yViewA = (TextView) findViewById(R.id.ybox);
        zViewA = (TextView) findViewById(R.id.zbox);
        xViewO = (TextView) findViewById(R.id.xboxo);
        yViewO = (TextView) findViewById(R.id.yboxo);
        zViewO = (TextView) findViewById(R.id.zboxo);
    }

    public void onSensorChanged(int sensor, float[] values) {
        synchronized (this) {
            Log.d(tag, "onSensorChanged: " + sensor + ", x: " + values[0]
                    + ", y: " + values[1] + ", z: " + values[2]);
            if (sensor == SensorManager.SENSOR_ORIENTATION) {
                xViewO.setText("Orientation X: " + values[0]);
                yViewO.setText("Orientation Y: " + values[1]);
                zViewO.setText("Orientation Z: " + values[2]);
            }
            if (sensor == SensorManager.SENSOR_ACCELEROMETER) {
                xViewA.setText("Accel X: " + values[0]);
                yViewA.setText("Accel Y: " + values[1]);
                zViewA.setText("Accel Z: " + values[2]);
            }
        }
    }

    public void onAccuracyChanged(int sensor, int accuracy) {
        Log.d(tag, "onAccuracyChanged: " + sensor + ", accuracy: " + accuracy);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // register this class as a listener for the orientation and accelerometer sensors
        sm.registerListener(this,
                SensorManager.SENSOR_ORIENTATION | SensorManager.SENSOR_ACCELEROMETER,
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onStop() {
        // unregister listener
        sm.unregisterListener(this);
        super.onStop();
    }
}
The application is written as a simple Activity because all it does is update the screen with data obtained from the sensors. In an application where the device may be performing other activities in the foreground, building the application as a Service would be more appropriate.
The activity's onCreate method obtains a reference to the SensorManager, where all of the sensor-related functions take place. The onCreate method also sets up references to the six TextView widgets that will be updated with the sensor data values.
The onResume() method uses the reference to the SensorManager to register for sensor updates with the registerListener method:
- The first argument is an instance of a class that implements the SensorListener interface.
- The second argument is a bitmask of the desired sensors. In this case, the application requests data from SENSOR_ORIENTATION and SENSOR_ACCELEROMETER.
- The third argument is a hint to the system indicating how quickly the application requires updated sensor values.
When the application (activity) is paused, the listener must be unregistered so that no further sensor updates are received. This is accomplished with the unregisterListener method of the SensorManager; its only argument is the SensorListener instance.
In the calls to both registerListener and unregisterListener, the application uses the keyword this. Note the implements keyword in the class definition, which declares that the class implements the SensorListener interface; this is why the class itself can be passed to registerListener and unregisterListener.
A SensorListener must implement two methods, onSensorChanged and onAccuracyChanged. The sample application is not concerned with a sensor's accuracy, only with the current x, y, and z values of the sensors, so the onAccuracyChanged method does nothing beyond adding a log entry each time it is invoked.
The onSensorChanged method is called frequently, because the accelerometer and the orientation sensor send data rapidly. The first argument is checked to determine which sensor is sending data. Once the reporting sensor has been identified, the appropriate UI elements are updated with the data contained in the array of float values passed as the second argument. This example simply displays the values; a more advanced application could analyze them, compare them against earlier values, or run a pattern-recognition algorithm to determine what the user (or the external environment) is doing.
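As one purely illustrative form of such analysis (not part of the downloadable sample), the accelerometer values could be compared against a threshold to flag a possible shake. The sketch below assumes it lives in the IBMEyes activity from Listing 1 (so tag is available), and the 4 m/s^2 threshold is an arbitrary assumption.

// Hypothetical helper that onSensorChanged could call with the accelerometer
// values: a large deviation of the acceleration magnitude from Earth's
// gravity is treated as a possible shake.
private static final float SHAKE_THRESHOLD = 4.0f;   // arbitrary, for illustration

private void analyzeAccelerometer(float[] values) {
    float x = values[0];
    float y = values[1];
    float z = values[2];
    double magnitude = Math.sqrt(x * x + y * y + z * z);
    if (Math.abs(magnitude - SensorManager.GRAVITY_EARTH) > SHAKE_THRESHOLD) {
        Log.d(tag, "Possible shake detected, magnitude: " + magnitude);
        // A real application might count events over a window of time or
        // trigger an action such as starting an audio recording.
    }
}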
Now that you have a feel for the sensor subsystem, the next section reviews sample code for recording audio on an Android phone. This example was run on the DEV1 development device.
Using MediaRecorder
The android.media package contains classes for interacting with the media subsystem. Media sampling, including both audio and video, is done with the android.media.MediaRecorder class. The MediaRecorder operates as a state machine: you set various parameters, such as the source device and the format, and once they are set, recording can proceed for an arbitrary duration until it is stopped.
The code in Listing 2 records audio on an Android device. The code shown does not include the application's UI elements (see Download for the full source code).
Listing 2. Recording an audio clip
MediaRecorder mrec;
File audiofile = null;
private static final String TAG = "SoundRecordingDemo";

protected void startRecording() throws IOException {
    mrec = new MediaRecorder();   // create a new recorder for this session
    mrec.setAudioSource(MediaRecorder.AudioSource.MIC);
    mrec.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mrec.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    if (audiofile == null) {
        File sampleDir = Environment.getExternalStorageDirectory();
        try {
            audiofile = File.createTempFile("ibm", ".3gp", sampleDir);
        } catch (IOException e) {
            Log.e(TAG, "sdcard access error");
            return;
        }
    }
    mrec.setOutputFile(audiofile.getAbsolutePath());
    mrec.prepare();
    mrec.start();
}

protected void stopRecording() {
    mrec.stop();
    mrec.release();
    processaudiofile();
}

protected void processaudiofile() {
    ContentValues values = new ContentValues(3);
    long current = System.currentTimeMillis();
    values.put(MediaStore.Audio.Media.TITLE, "audio" + audiofile.getName());
    values.put(MediaStore.Audio.Media.DATE_ADDED, (int) (current / 1000));
    values.put(MediaStore.Audio.Media.MIME_TYPE, "audio/3gpp");
    values.put(MediaStore.Audio.Media.DATA, audiofile.getAbsolutePath());
    ContentResolver contentResolver = getContentResolver();
    Uri base = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
    Uri newUri = contentResolver.insert(base, values);
    sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE, newUri));
}
In the startRecording method, a MediaRecorder instance is instantiated and initialized:
- The input source is set to the microphone (MIC).
- The output format is set to 3GPP (*.3gp files), a media format geared toward mobile devices.
- The encoder is set to AMR_NB, an audio format sampled at 8 kHz; NB stands for narrow band. The SDK documentation explains the different data formats and the available encoders.
The audio file is stored on the storage card rather than in internal memory. Environment.getExternalStorageDirectory() returns the location of the storage card, where a temporary file name is created. This file is then associated with the MediaRecorder instance through a call to the setOutputFile method; the audio data will be stored in this file.
The prepare method is called to finish the initialization of the MediaRecorder. The start method is called when you are ready to begin recording; the file on the storage card is recorded to until the stop method is called. The release method frees the resources allocated to the MediaRecorder instance.
Once the audio sample has been taken, there are a few things you can do with it:
- Add the audio to the device's media library.
- Perform some pattern-recognition steps to identify the sound:
- Is it the baby crying?
- Is it the owner's voice, meaning the phone should be unlocked?
- Is it "open sesame," the phrase that opens the door to the secret passage?
- Automatically upload the audio file to a network location for processing (a sketch of this step follows the list).
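As a rough sketch of the upload option, the recorded clip could be streamed to a server over HTTP. This is illustrative only: the endpoint URL is a placeholder, the method assumes the usual java.io and java.net imports plus the TAG constant from Listing 2, the application would need the android.permission.INTERNET permission, and a production version would add authentication, retries, and run the transfer off the UI thread.

// Sketch only: stream the recorded 3GP file to a hypothetical server.
private void uploadAudio(String path) {
    HttpURLConnection conn = null;
    try {
        URL url = new URL("http://example.com/upload");   // placeholder endpoint
        conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "audio/3gpp");

        FileInputStream in = new FileInputStream(path);
        OutputStream out = conn.getOutputStream();
        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);   // copy the recorded bytes to the request body
        }
        out.close();
        in.close();

        Log.d(TAG, "Upload response code: " + conn.getResponseCode());
    } catch (IOException e) {
        Log.e(TAG, "Upload failed", e);
    } finally {
        if (conn != null) {
            conn.disconnect();
        }
    }
}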
In this sample code, the processaudiofile method adds the audio to the media library. An Intent is used to notify the on-device media application that new content is available.
Note that if you try this code yourself, it will not record any audio at first: you will see the file created, but it will contain no audio. You need to add a permission to the AndroidManifest.xml file:
<uses-permission android:name="android.permission.RECORD_AUDIO"></uses-permission>
At this point, you have learned something about Android sensors and recording audio. The next section gives a broader view of the application architecture surrounding data-gathering and reporting systems.
Android as a sensor platform
The Android platform contains a variety of sensor options for monitoring the environment. With this array of input options coupled with strong computational and connectivity capabilities, Android is a good platform for building real-world systems. Figure 2 shows a simple view of the relationships between inputs, application logic, and notification methods, or outputs.
This architecture is flexible; the application logic can be split between the local Android device and server-side resources, where larger databases and greater computing power are available. For example, an audio clip recorded on the local Android device might be posted to a web server and compared against a database of audio patterns. Obviously, this only scratches the surface; hopefully it will spur you to dig deeper and take the Android platform beyond the mobile phone.
Conclusion
In this article, we introduced Android sensors. The sample applications measured orientation and acceleration and interacted with the recording capabilities through the MediaRecorder class. Android is a flexible and attractive platform for building real-world systems, and the Android field is developing rapidly and continues to grow.