Android Audio Development (3): How to play a frame of audio


This article focuses on how to play a frame of audio data on the Android platform. Before reading it, you may want to read "Android Audio Development (1): Fundamentals", because audio development frequently involves those basics; once you have mastered the key concepts there, many of the parameters and steps described below become much easier to understand.


The Android SDK provides three sets of audio playback APIs: MediaPlayer, SoundPool, and AudioTrack. The differences between them are covered in "Intro to the three Android Audio APIs". In short, MediaPlayer is best suited for long-running background playback of local music files or online streaming resources; SoundPool is suited to short audio clips such as game sound effects, key clicks, and ringtone fragments, and it can play several sounds simultaneously; AudioTrack sits closer to the bottom layer, offers very fine-grained control, supports low-latency playback, and fits streaming media and VoIP voice-call scenarios.


Audio development in the broader sense is not limited to playing local files or short clips, so this article focuses on how to use the AudioTrack API to play audio data (note that audio played through AudioTrack must already be decoded to PCM).


1. AudioTrack Workflow


First, let's look at the AudioTrack workflow (a minimal code sketch follows the list):


(1) Configure the parameters and initialize the internal audio playback buffer

(2) Start playback

(3) Run a thread that keeps "writing" audio data into the AudioTrack buffer. Note that the writing must be timely, otherwise an "underrun" error occurs. This error is quite common in audio development; it means the application layer is not "feeding" audio data fast enough, so the internal playback buffer runs empty.

(4) Stop playback and release resources
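Below is a minimal sketch of these four steps under assumed parameters (44100 Hz, stereo, 16-bit PCM) and a generic InputStream as the PCM source; it is only an illustration, and the AudioPlayer class at the end of this article wraps the same flow more completely:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import java.io.InputStream;

public class PcmStreamPlayer {
    // Plays raw 16-bit stereo PCM at 44100 Hz read from an InputStream,
    // following workflow steps (1) to (4). Error handling is omitted.
    public static void play(InputStream pcm) throws Exception {
        int minSize = AudioTrack.getMinBufferSize(44100,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,   // (1) configure and initialize
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                minSize, AudioTrack.MODE_STREAM);
        track.play();                                                         // (2) start playback
        byte[] frame = new byte[minSize];
        int n;
        while ((n = pcm.read(frame)) > 0) {                                   // (3) keep "feeding" PCM; a stall here causes underrun
            track.write(frame, 0, n);
        }
        track.stop();                                                         // (4) stop and release resources
        track.release();
    }
}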


2. Parameter Configuration of AudioTrack


[Figure: the AudioTrack constructor prototype]


The figure above shows the AudioTrack constructor prototype; the relevant parameters are configured mainly through this constructor. They are explained one by one below (again, reading "Android Audio Development (1): Fundamentals" first is recommended):
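Since the original image is not reproduced here, for reference the constructor in question is the classic AudioTrack constructor from the Android SDK:

AudioTrack(int streamType, int sampleRateInHz, int channelConfig,
           int audioFormat, int bufferSizeInBytes, int mode)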


(1) streamType


This parameter indicates which audio management strategy the current application uses. When multiple processes in the system need to play audio, this policy determines the final output. The allowed values are defined as constants in the AudioManager class, mainly:


STREAM_VOICE_CALL: telephone voice

STREAM_SYSTEM: system sounds

STREAM_RING: ringtones

STREAM_MUSIC: music

STREAM_ALARM: alarm sounds

STREAM_NOTIFICATION: notification sounds


(2) sampleRateInHz


The sampling rate. As can be seen from the "audioParamCheck" function in the AudioTrack source code, the sampling rate must lie between 4000 Hz and 192000 Hz.


(3) channelConfig


The channel configuration. The allowed values are defined as constants in the AudioFormat class; for playback the commonly used ones are CHANNEL_OUT_MONO (mono) and CHANNEL_OUT_STEREO (stereo).


(4) audioFormat


This parameter configures the sample bit depth ("data bit width"). The allowed values are likewise defined as constants in the AudioFormat class; the commonly used ones are ENCODING_PCM_16BIT (16 bit) and ENCODING_PCM_8BIT (8 bit). Note that only the former is guaranteed to be supported on all Android phones.


(5) bufferSizeInBytes


This is the most difficult to understand, and also the most important, parameter. It configures the size of AudioTrack's internal audio buffer, which must not be smaller than one "audio frame". As introduced in the previous article, the size of one audio frame is calculated as follows:


int size = sample rate x (bit depth / 8) x number of channels x frame duration (in seconds)


The frame duration is generally between 2.5 ms and 120 ms, decided by the manufacturer or the specific application. Intuitively, the shorter each frame's duration, the smaller the resulting latency, at the cost of more fragmented data.
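As a purely illustrative example: assuming a 44100 Hz sampling rate, 16-bit (2-byte) samples, two channels, and a 20 ms frame, one frame occupies 44100 x 2 x 2 x 0.02 = 3528 bytes.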


In Android development, the AudioTrack class provides a function that helps you determine this bufferSizeInBytes; its prototype is as follows:


int getMinBufferSize(int sampleRateInHz, int channelConfig, int audioFormat);


Different manufacturers' underlying implementations differ, but they are all essentially based on the formula above to obtain the size of one frame; the audio buffer size must then be 2 to N times the size of one frame. Interested readers can dig into the source code to explore further.


In practical development, it is strongly recommended to let this function calculate the bufferSizeInBytes to pass in, rather than computing it by hand.


(6) mode


AudioTrack offers two playback modes: static and streaming. The former requires writing all of the data to the playback buffer at once; it is simple and efficient and is usually used for short clips such as ringtones and system notification sounds. The latter writes audio data continuously at certain intervals and can, in theory, be used for any audio playback scenario.


The allowed values are defined as constants in the AudioTrack class: MODE_STATIC and MODE_STREAM. Pass in the value appropriate to your application (a MODE_STATIC sketch follows below).
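Since the sample code at the end of this article uses MODE_STREAM, here is a minimal MODE_STATIC sketch under assumed parameters (a short 440 Hz mono 16-bit beep generated in memory); it is an illustrative example, not the author's code:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class StaticBeep {
    // Plays a short 440 Hz beep with MODE_STATIC: the whole clip is
    // written into the buffer once, then play() is called.
    public static void playBeep() {
        final int sampleRate = 44100;
        final int samples = sampleRate / 5;                       // 200 ms clip
        byte[] pcm = new byte[samples * 2];                       // 16-bit mono, little-endian
        for (int i = 0; i < samples; i++) {
            short s = (short) (Math.sin(2 * Math.PI * 440 * i / sampleRate) * 8000);
            pcm[2 * i] = (byte) (s & 0xFF);
            pcm[2 * i + 1] = (byte) ((s >> 8) & 0xFF);
        }
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                pcm.length, AudioTrack.MODE_STATIC);              // buffer holds the whole clip
        track.write(pcm, 0, pcm.length);                          // write everything before play()
        track.play();
    }
}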


3. Sample Code


I have wrapped the AudioTrack API in a simple AudioPlayer class, which can be downloaded from my GitHub: https://github.com/Jhuster/Android/blob/master/Audio/AudioPlayer.java


Here's a copy:


/*
 *  COPYRIGHT NOTICE
 *  Copyright (C) 2016, Jhuster <[email protected]>
 *  https://github.com/Jhuster/Android
 *
 *  @license under the Apache License, Version 2.0
 *
 *  @file    AudioPlayer.java
 *
 *  @version 1.0
 *  @author  Jhuster
 *  @date    2016/03/13
 */
package com.jhuster.audiodemo;

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.util.Log;

public class AudioPlayer {

    private static final String TAG = "AudioPlayer";
    private static final int DEFAULT_STREAM_TYPE = AudioManager.STREAM_MUSIC;
    private static final int DEFAULT_SAMPLE_RATE = 44100;
    private static final int DEFAULT_CHANNEL_CONFIG = AudioFormat.CHANNEL_OUT_STEREO; // playback channel masks use CHANNEL_OUT_*
    private static final int DEFAULT_AUDIO_FORMAT = AudioFormat.ENCODING_PCM_16BIT;
    private static final int DEFAULT_PLAY_MODE = AudioTrack.MODE_STREAM;

    private boolean mIsPlayStarted = false;
    private int mMinBufferSize = 0;
    private AudioTrack mAudioTrack;

    public boolean startPlayer() {
        return startPlayer(DEFAULT_STREAM_TYPE, DEFAULT_SAMPLE_RATE, DEFAULT_CHANNEL_CONFIG, DEFAULT_AUDIO_FORMAT);
    }

    public boolean startPlayer(int streamType, int sampleRateInHz, int channelConfig, int audioFormat) {

        if (mIsPlayStarted) {
            Log.e(TAG, "Player already started !");
            return false;
        }

        mMinBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, audioFormat);
        if (mMinBufferSize == AudioTrack.ERROR_BAD_VALUE) {
            Log.e(TAG, "Invalid parameter !");
            return false;
        }
        Log.d(TAG, "getMinBufferSize = " + mMinBufferSize + " bytes !");

        mAudioTrack = new AudioTrack(streamType, sampleRateInHz, channelConfig, audioFormat, mMinBufferSize, DEFAULT_PLAY_MODE);
        if (mAudioTrack.getState() == AudioTrack.STATE_UNINITIALIZED) {
            Log.e(TAG, "AudioTrack initialize fail !");
            return false;
        }

        mIsPlayStarted = true;

        Log.d(TAG, "Start audio player success !");

        return true;
    }

    public int getMinBufferSize() {
        return mMinBufferSize;
    }

    public void stopPlayer() {

        if (!mIsPlayStarted) {
            return;
        }

        if (mAudioTrack.getPlayState() == AudioTrack.PLAYSTATE_PLAYING) {
            mAudioTrack.stop();
        }

        mAudioTrack.release();
        mIsPlayStarted = false;

        Log.d(TAG, "Stop audio player success !");
    }

    public boolean play(byte[] audioData, int offsetInBytes, int sizeInBytes) {

        if (!mIsPlayStarted) {
            Log.e(TAG, "Player not started !");
            return false;
        }

        if (sizeInBytes < mMinBufferSize) {
            Log.e(TAG, "Audio data is not enough !");
            return false;
        }

        if (mAudioTrack.write(audioData, offsetInBytes, sizeInBytes) != sizeInBytes) {
            Log.e(TAG, "Could not write all the samples to the audio device !");
        }

        mAudioTrack.play();

        Log.d(TAG, "OK, played " + sizeInBytes + " bytes !");

        return true;
    }
}


4. Summary


That covers how to use AudioTrack to play a frame of audio data on the Android platform. If anything in the article is unclear, feel free to leave a comment or email [email protected], or follow my Sina Weibo @Lu_Jun or my WeChat public account @Jhuster to get the latest articles and news.


This article is from the "Jhuster column" blog; please retain the source when reprinting: http://ticktick.blog.51cto.com/823160/1750593

