To play and record audio on iOS devices, Apple recommends the AVAudioPlayer and AVAudioRecorder classes in the AVFoundation framework. They are simple to use, but they do not support streaming: playback cannot begin until the entire audio file has finished loading, and recorded data is not available until recording has finished. This is a serious limitation for many applications. To solve this problem, we need Audio Queue Services to play and record audio, and to simplify handling of audio files we also need Audio File Services (I used to think C had no audio-file handling library; it turns out it does).
Before using them, we need to understand the basics of how Audio Queue Services work.
Figure 1: A recording audio queue
As the diagram shows, a recording audio queue consists of a buffer queue holding several buffers, plus a callback. How do they work together?
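Before walking through the process, it helps to make these pieces concrete. Below is a minimal sketch in C of the state a recording app might share with its callback, modeled on the AQRecorderState struct in Apple's Audio Queue Services Programming Guide (listed in the resources at the end); the exact field set here is an assumption for illustration.

    #include <AudioToolbox/AudioToolbox.h>

    #define kNumberBuffers 3  // Apple's guide suggests using three buffers

    // Hypothetical state shared between the app and the recording callback.
    typedef struct {
        AudioStreamBasicDescription mDataFormat;              // format of the recorded audio
        AudioQueueRef               mQueue;                   // the recording audio queue
        AudioQueueBufferRef         mBuffers[kNumberBuffers]; // the buffers in the buffer queue
        AudioFileID                 mAudioFile;               // file the callback writes to
        UInt32                      bufferByteSize;           // size of each buffer, in bytes
        SInt64                      mCurrentPacket;           // packet index for the next write
        bool                        mIsRunning;               // whether recording is in progress
    } AQRecorderState;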
Figure 2: Recording process
1. The audio queue fills the first buffer with incoming audio.
2. When the first buffer is full, the queue automatically starts filling the next buffer, and the callback is triggered.
3. In the callback, the application writes the recorded audio data to disk.
4. Still in the callback, the application puts the buffer back into the buffer queue so it can be reused. The process then repeats from step 2 (see the callback sketch below).
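Steps 2 through 4 all happen inside a single C callback. A minimal sketch, assuming the AQRecorderState struct above and an audio file already opened with Audio File Services; error handling is omitted:

    // Called by the audio queue each time a buffer fills with recorded audio.
    static void HandleInputBuffer(void *aqData,
                                  AudioQueueRef inAQ,
                                  AudioQueueBufferRef inBuffer,
                                  const AudioTimeStamp *inStartTime,
                                  UInt32 inNumPackets,
                                  const AudioStreamPacketDescription *inPacketDesc)
    {
        AQRecorderState *pAqData = (AQRecorderState *)aqData;

        // For constant-bit-rate formats the queue reports 0 packets; compute the count.
        if (inNumPackets == 0 && pAqData->mDataFormat.mBytesPerPacket != 0)
            inNumPackets = inBuffer->mAudioDataByteSize / pAqData->mDataFormat.mBytesPerPacket;

        // Step 3: write the freshly filled buffer to disk.
        if (AudioFileWritePackets(pAqData->mAudioFile, false,
                                  inBuffer->mAudioDataByteSize, inPacketDesc,
                                  pAqData->mCurrentPacket, &inNumPackets,
                                  inBuffer->mAudioData) == noErr)
            pAqData->mCurrentPacket += inNumPackets;

        if (!pAqData->mIsRunning) return;  // the queue is stopping; don't re-enqueue

        // Step 4: put the buffer back into the buffer queue so it can be reused.
        AudioQueueEnqueueBuffer(pAqData->mQueue, inBuffer, 0, NULL);
    }

On the setup side, AudioQueueNewInput creates the queue with HandleInputBuffer as its callback, AudioQueueAllocateBuffer and AudioQueueEnqueueBuffer prime the buffer queue, and AudioQueueStart begins filling the first buffer (step 1).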
Now that we have seen the recording process, let's look at the basic architecture of playback.
Figure 3: A playback audio queue
As the diagram shows, its structure is essentially the same as a recording audio queue's, but the callback fires at a different point, so the workflow differs slightly.
Figure 4: Playback process
1. The application reads audio data into a buffer. Once a buffer is full, it is placed into the buffer queue, where it waits on standby.
2. The application tells the audio queue to start playing.
3. The audio queue takes data from the first buffer and starts playing it.
4. Once a buffer finishes playing, the callback is triggered and the queue starts playing the contents of the next buffer.
5. In the callback, the application reads the next chunk of audio data into the just-played buffer and puts it back into the buffer queue. The process then repeats from step 3 (see the sketch below).
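Steps 4 and 5 again map onto a single callback. A minimal sketch in the same style, with a hypothetical AQPlayerState struct mirroring the recorder's; error handling is again omitted:

    #include <AudioToolbox/AudioToolbox.h>

    // Hypothetical playback state, mirroring the recorder's struct.
    typedef struct {
        AudioQueueRef                 mQueue;            // the playback audio queue
        AudioFileID                   mAudioFile;        // file the callback reads from
        SInt64                        mCurrentPacket;    // next packet to read
        UInt32                        mNumPacketsToRead; // how many packets fit in one buffer
        AudioStreamPacketDescription *mPacketDescs;      // NULL for constant-bit-rate audio
        bool                          mIsRunning;        // whether playback is in progress
    } AQPlayerState;

    // Called each time the queue finishes playing a buffer and needs it refilled.
    static void HandleOutputBuffer(void *aqData,
                                   AudioQueueRef inAQ,
                                   AudioQueueBufferRef inBuffer)
    {
        AQPlayerState *pAqData = (AQPlayerState *)aqData;
        if (!pAqData->mIsRunning) return;

        // Step 5: read the next chunk of audio from disk into the just-played buffer...
        UInt32 numBytesRead = inBuffer->mAudioDataBytesCapacity;
        UInt32 numPackets   = pAqData->mNumPacketsToRead;
        AudioFileReadPackets(pAqData->mAudioFile, false, &numBytesRead,
                             pAqData->mPacketDescs, pAqData->mCurrentPacket,
                             &numPackets, inBuffer->mAudioData);

        if (numPackets > 0) {
            // ...and put the refilled buffer back into the buffer queue.
            inBuffer->mAudioDataByteSize = numBytesRead;
            AudioQueueEnqueueBuffer(pAqData->mQueue, inBuffer,
                                    pAqData->mPacketDescs ? numPackets : 0,
                                    pAqData->mPacketDescs);
            pAqData->mCurrentPacket += numPackets;
        } else {
            // No packets left: stop the queue (asynchronously) and flag the end.
            AudioQueueStop(pAqData->mQueue, false);
            pAqData->mIsRunning = false;
        }
    }

Note that step 1 can be done by simply calling HandleOutputBuffer once for each allocated buffer before AudioQueueStart, which fills the buffers and enqueues them; this is the priming technique Apple's guide uses.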
That covers the basic principles of Audio Queue Services. In a real application you also need to handle various states and error conditions, such as playback being interrupted or no recording device being available. I won't cover the use of Audio File Services here; see the sample source code for details.
Sample source code for this article: http://download.csdn.net/detail/midfar/4044390
Resources:
Audio Queue Services Programming Guide
https://developer.apple.com/library/ios/#documentation/musicaudio/conceptual/audioqueueprogrammingguide/introduction/introduction.html#//apple_ref/doc/uid/tp40005343
Audio Queue Services Reference
https://developer.apple.com/library/ios/#documentation/musicaudio/reference/audioqueuereference/reference/reference.html#//apple_ref/doc/uid/tp40005117
Audio File Services Reference
https://developer.apple.com/library/ios/#documentation/musicaudio/reference/audiofileconvertref/reference/reference.html#//apple_ref/doc/uid/tp40006072