Android Audio Visualization Development Case Description

Calling Android's built-in audio recording program
Android ships with its own audio recording program; we can start it simply by firing an Intent with the MediaStore.Audio.Media.RECORD_SOUND_ACTION action. Then, in the onActivityResult() method, we retrieve the Intent data, which is the URI of the recorded audio.
Java code:

package eoe.demo;

import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;
import android.provider.MediaStore;
import android.view.View;
import android.widget.Toast;

/**
 * Demonstrates how to call Android's built-in application to record audio.
 * It is really simple: we start an Intent with the
 * MediaStore.Audio.Media.RECORD_SOUND_ACTION action, and the data returned
 * is the URI of the audio we recorded.
 *
 * This approach is not very flexible; we can instead use the MediaRecorder
 * class to implement our own audio recording program. MediaRecorder can
 * record audio as well as video. After creating a MediaRecorder instance,
 * we need to call setAudioSource() and setAudioEncoder() to initialize it.
 * Normally, before starting to record, we also call setOutputFormat() to
 * decide which audio format to use, and setOutputFile() to specify the file
 * that holds the recorded content.
 *
 * The order in which these methods must be invoked is:
 * setAudioSource, setOutputFormat, setAudioEncoder, setOutputFile.
 *
 * @author Administrator
 */
public class AudioRecordDemo extends Activity {

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.audio_record);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        // Here we get the URI of the audio we just recorded; we could play it
        // back, etc. For demonstration we simply show the returned URI.
        if (resultCode == RESULT_OK) {
            Uri audioPath = data.getData();
            Toast.makeText(this, audioPath.toString(), Toast.LENGTH_LONG).show();
        }
    }

    public void onClick(View v) {
        int id = v.getId();
        switch (id) {
            case R.id.btn1: // call Android's built-in audio recording app
                Intent intent = new Intent(MediaStore.Audio.Media.RECORD_SOUND_ACTION);
                startActivityForResult(intent, 0);
                break;
            case R.id.btn2: // implement our own recorder with the MediaRecorder class
                Intent intent2 = new Intent();
                intent2.setClass(this, MyAudioRecord.class);
                startActivityForResult(intent2, 1);
                break;
            case R.id.btn3: // implement our own recorder with the AudioRecord class
                Intent intent3 = new Intent();
                intent3.setClass(this, MyAudioRecord2.class);
                startActivityForResult(intent3, 2);
                break;
        }
    }
}
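
The comments above give MediaRecorder's required call order but no code for it. Below is a minimal sketch of what a MyAudioRecord activity might do; the method name, the output path, and the 3GP/AMR-NB format choices are illustrative assumptions, not taken from the original.
Java code:

// Hedged sketch: the output path and format choices are assumptions.
private void startRecording() throws java.io.IOException {
    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);         // 1. source first
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); // 2. container format
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);    // 3. encoder
    recorder.setOutputFile("/sdcard/demo.3gp");                     // 4. hypothetical output file
    recorder.prepare();                                             // 5. prepare after the setters
    recorder.start();                                               // 6. begin recording
}

When finished, call recorder.stop() and then recorder.release().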

Introduction to Android audio
I have recently been porting Android, and once it could run on the device, the first thing I wanted working was the audio device: with no sound, even the best play cannot go on. This article briefly describes the Android audio adaptation layer.
The world is full of ever-changing audio devices, and Android cannot possibly support every one of them, so it defines a framework for adapting to the underlying audio hardware. The adaptation-layer interface is defined in:

hardware/libhardware_legacy/include/hardware_legacy/AudioHardwareInterface.h

The vendor's underlying audio device must inherit from the classes defined in this file, such as AudioStreamOut, AudioStreamIn, and AudioHardwareInterface, and must implement the createAudioHardware() function.
Let's look at the Android code that creates the audio device, which is located in:

frameworks/base/libs/audioflinger/AudioHardwareInterface.cpp

The file has the following code:
C++ code:

AudioHardwareInterface* AudioHardwareInterface::create()
{
    /*
     * FIXME: This code needs to instantiate the correct audio device
     * interface. For now we use compile-time switches.
     */
    AudioHardwareInterface* hw = 0;
    char value[PROPERTY_VALUE_MAX];
#ifdef GENERIC_AUDIO
    hw = new AudioHardwareGeneric();
#else
    // If running in emulation, use the generic audio driver.
    if (property_get("ro.kernel.qemu", value, 0)) {
        LOGD("Running in emulation - using generic audio driver");
        hw = new AudioHardwareGeneric();
    }
    else {
        LOGV("Creating Vendor Specific AudioHardware");
        hw = createAudioHardware();
    }
#endif
    if (hw->initCheck() != NO_ERROR) {
        LOGW("Using stubbed audio hardware. No sound will be produced.");
        delete hw;
        hw = new AudioHardwareStub();
    }
#ifdef WITH_A2DP
    hw = new A2dpAudioInterface(hw);
#endif
#ifdef ENABLE_AUDIO_DUMP
    // Record the PCM output to a file.
    LOGV("Opening PCM Dump Interface");
    hw = new AudioDumpInterface(hw); // replace interface
#endif
    return hw;
}

From this code we can see that if the GENERIC_AUDIO macro is defined, an AudioHardwareGeneric is created. If the system is running in the emulator, AudioHardwareGeneric will fail its initialization check, and an AudioHardwareStub is created instead. These two classes are the default audio devices that Android provides: the emulator ends up with AudioHardwareStub, which produces no sound output, while real devices get AudioHardwareGeneric, because GENERIC_AUDIO is set by default.
Generally we only care about the AudioHardwareGeneric implementation; who debugs sound in the emulator anyway? I certainly have no such leisure. Note that this adaptation layer only gets audio working on Android: it ensures your audio device produces sound, not that it performs at its best. You will see why from the description that follows. AudioHardwareGeneric is defined in:

frameworks/base/libs/audioflinger/AudioHardwareGeneric.cpp

That is eoe's brief introduction to audio usage; if anything is unclear, read more of the Android source code, which will help your understanding of audio.
First, the effect: a live spectrum drawn in green from microphone input (screenshot omitted). The code follows.

Java code:

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.AsyncTask;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.ImageView;
import ca.uol.aig.fftpack.RealDoubleFFT;

public class FFTActivity extends Activity implements OnClickListener {
    private Button button;
    private ImageView imageView;
    private int frequency = 8000;
    private int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    private int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
    private RealDoubleFFT transformer;
    private int blockSize = 256;
    // volatile: written on the UI thread, read by the recording thread
    private volatile boolean started = false;
    private Canvas canvas;
    private Paint paint;
    private Bitmap bitmap;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.fft);
        button = (Button) findViewById(R.id.fft_button);
        button.setOnClickListener(this);
        imageView = (ImageView) findViewById(R.id.fft_imageview);
        transformer = new RealDoubleFFT(blockSize);
        // 256x100 bitmap: one pixel column per FFT coefficient.
        bitmap = Bitmap.createBitmap(256, 100, Bitmap.Config.ARGB_8888);
        canvas = new Canvas(bitmap);
        paint = new Paint();
        paint.setColor(Color.GREEN);
        imageView.setImageBitmap(bitmap);
    }

    private class RecordAudio extends AsyncTask<Void, double[], Void> {
        @Override
        protected Void doInBackground(Void... params) {
            int bufferSize = AudioRecord.getMinBufferSize(frequency,
                    channelConfiguration, audioEncoding);
            AudioRecord audioRecord = new AudioRecord(
                    MediaRecorder.AudioSource.MIC, frequency,
                    channelConfiguration, audioEncoding, bufferSize);
            short[] buffer = new short[blockSize];
            double[] toTransform = new double[blockSize];
            audioRecord.startRecording();
            while (started) {
                // Read the recorded data into the buffer (the original author
                // remarks that "write" might be a more fitting name, since the
                // call fills our buffer).
                int bufferReadResult = audioRecord.read(buffer, 0, blockSize);
                for (int i = 0; i < bufferReadResult; i++) {
                    // Normalize 16-bit samples to the range [-1, 1].
                    toTransform[i] = (double) buffer[i] / Short.MAX_VALUE;
                }
                transformer.ft(toTransform); // in-place discrete Fourier transform
                publishProgress(toTransform);
            }
            audioRecord.stop();
            return null;
        }

        @Override
        protected void onProgressUpdate(double[]... values) {
            super.onProgressUpdate(values);
            canvas.drawColor(Color.BLACK);
            // Draw one vertical line per coefficient, scaled up from the baseline.
            for (int i = 0; i < values[0].length; i++) {
                int x = i;
                int downy = (int) (100 - (values[0][i]) * 10);
                int upy = 100;
                canvas.drawLine(x, downy, x, upy, paint);
            }
            imageView.invalidate();
        }
    }

    @Override
    public void onClick(View v) {
        started = true;
        new RecordAudio().execute();
    }
}

The principle behind Android audio visualization is the discrete Fourier transform, but students worried about the math need not be: open-source Java code for the discrete Fourier transform already exists! Download www.netlib.org/fftpack/jfftpack.tgz and drag the javasource directory (the ca directory) straight into src. Note that recording from the microphone also requires the android.permission.RECORD_AUDIO permission in the manifest.
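
As a quick sanity check of the jfftpack call used above, the standalone program below feeds RealDoubleFFT a pure tone and prints the first coefficients. This is a sketch assuming the ca.uol.aig.fftpack package layout from the jfftpack javasource directory; FFTSmokeTest is a name introduced here for illustration.
Java code:

import ca.uol.aig.fftpack.RealDoubleFFT;

// Hypothetical standalone check, assuming jfftpack's ca directory is on the classpath.
public class FFTSmokeTest {
    public static void main(String[] args) {
        int n = 256;
        RealDoubleFFT fft = new RealDoubleFFT(n);
        double[] signal = new double[n];
        // A pure tone with 8 cycles across the block.
        for (int i = 0; i < n; i++) {
            signal[i] = Math.sin(2.0 * Math.PI * 8 * i / n);
        }
        fft.ft(signal); // in-place transform, the same call RecordAudio makes
        // The energy should concentrate in the coefficients for the 8th harmonic.
        for (int i = 0; i < 20; i++) {
            System.out.printf("coef[%d] = %.3f%n", i, signal[i]);
        }
    }
}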