iOS Audio Development: Core Audio

Source: Internet
Author: User

Transferred from: http://www.cnblogs.com/javawebsoa/archive/2013/05/20/3089511.html

Anyone who has worked with iOS audio development knows that Core Audio is the foundation of digital audio processing on iOS and Mac. It provides a set of software frameworks that applications use to process audio; every iOS audio interface is either provided directly by Core Audio or is a wrapper around an interface it exposes. In Apple's words, it is professional-grade technology that combines playback, audio processing, and recording. Through it, a program can record while simultaneously playing one or more audio streams, automatically adapt to hardware such as wired and Bluetooth headsets, respond to phone-call interruptions, mute, and vibration, and even apply 3D effects to music playback.

The Core Audio API is organized into three layers:

The low-level layer exposes the hardware interfaces. These APIs are mainly intended for Mac applications that need real-time audio performance; ordinary applications rarely use them. For the iOS mobile platform, Core Audio provides a higher-level, more compact and efficient API for handling real-time audio.

The mid-level layer is feature-complete, covering audio data format conversion, audio file reading and writing, audio stream parsing, plug-in support, and more:

  • Audio Converter Services converts between audio data formats.

  • Audio File Services reads and writes audio data in files.

  • Audio Unit Services and Audio Processing Graph Services support plug-ins for digital signal processing, such as equalizers and mixers.

  • Audio File Stream Services parses audio streams.

  • Core Audio Clock Services handles audio clock synchronization.

The high-level layer is a group of services built by combining the lower-level interfaces; most everyday audio development work can be done at this level.

Audio Queue Services provides recording, playback, pausing, looping, and synchronization of audio; it automatically handles compressed audio formats through the necessary codecs.
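As a sketch of how Audio Queue Services is typically used for playback (the API names are from Apple's Audio Queue Services C API; the buffer count, buffer size, and callback body are illustrative assumptions, and this requires the iOS/macOS AudioToolbox framework to build):

```c
#include <AudioToolbox/AudioToolbox.h>

// Illustrative callback: refill a buffer with audio data, then re-enqueue it.
static void MyOutputCallback(void *inUserData,
                             AudioQueueRef inAQ,
                             AudioQueueBufferRef inBuffer) {
    // ... read the next chunk of audio into inBuffer->mAudioData and
    // set inBuffer->mAudioDataByteSize here, then hand it back:
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}

void StartPlayback(const AudioStreamBasicDescription *format) {
    AudioQueueRef queue;
    AudioQueueNewOutput(format, MyOutputCallback, NULL,
                        NULL, NULL, 0, &queue);

    // Three buffers is a common choice for smooth streaming.
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buffer;
        AudioQueueAllocateBuffer(queue, 8192, &buffer);
        MyOutputCallback(NULL, queue, buffer);  // prime the buffer
    }
    AudioQueueStart(queue, NULL);
}
```

The queue pulls data by invoking the callback whenever a buffer finishes playing, which is what lets it drive compressed formats through the system codecs without the application decoding anything itself.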

AVAudioPlayer is an Objective-C audio playback class for the iOS platform that can play any audio format iOS supports.

OpenAL is Core Audio's implementation of the OpenAL standard, used for positional (3D) sound mixing and playback.


The Core Audio API is not packaged as a single framework; its interfaces are spread across several frameworks:

AudioToolbox.framework provides Core Audio's high-level API services. The audio session services we frequently deal with live here; they control the application's audio context, letting you declare the program's audio behavior and handle interruption and resumption caused by phone calls and other high-priority audio.

AudioUnit.framework provides DSP-related digital signal processing plug-ins, including codecs, mixing, and audio equalization.

AVFoundation.framework provides a streamlined music playback class that can play any audio format iOS supports.

OpenAL.framework offers 3D sound playback.

Core Audio's design relies heavily on a property mechanism to manage and manipulate the state and behavior of audio objects. You can see this pattern at work in every class:

1. A property key is usually a mnemonic enumeration constant, such as kAudioFilePropertyFileFormat or kAudioQueueDeviceProperty_NumberChannels.

2. A property value is typically a specific data type appropriate for describing the property, such as a void *, a Float64, or an AudioChannelLayout structure.

Core Audio provides accessor functions to read the property value associated with a key, and if the property is writable, a matching setter to change it; Core Audio also provides ordinary interfaces for fetching an object's values. For example, to route the program's audio to the speaker, you set kAudioSessionProperty_OverrideCategoryDefaultToSpeaker:

UInt32 doChangeDefaultRoute = 1;

AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker, sizeof(doChangeDefaultRoute), &doChangeDefaultRoute);
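Reading a property back works the same way through the matching getter. A minimal sketch using Apple's (since-deprecated) C audio session API, with error handling omitted — kAudioSessionProperty_AudioInputAvailable reports whether an input device is present:

```c
#include <AudioToolbox/AudioToolbox.h>

// Ask the audio session whether audio input (e.g. a microphone) is available.
UInt32 inputAvailable = 0;
UInt32 size = sizeof(inputAvailable);
OSStatus status = AudioSessionGetProperty(kAudioSessionProperty_AudioInputAvailable,
                                          &size, &inputAvailable);
if (status == noErr && inputAvailable) {
    // safe to start recording
}
```

Note the in/out size parameter: the caller states how large the value buffer is, and the function writes back the number of bytes it actually filled — a convention shared by most Core Audio getters.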

The Core Audio interfaces also provide a callback mechanism that notifies your application when a property of an audio object changes. For example, when an application uses an audio queue for music playback, it implements a callback function and registers it with the AudioQueue object; the queue then invokes that function whenever the watched property changes during playback. The callback's signature is:

typedef void (*AudioQueuePropertyListenerProc) (

    void *                  inUserData,

    AudioQueueRef           inAQ,

    AudioQueuePropertyID    inID

);
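Wiring this up means implementing a function with that signature and registering it on the queue. A hedged sketch (the property watched here, kAudioQueueProperty_IsRunning, and the function names beyond Apple's API are illustrative choices; this requires the AudioToolbox framework):

```c
#include <AudioToolbox/AudioToolbox.h>

// Called by the queue whenever the watched property changes.
static void MyRunningListener(void *inUserData,
                              AudioQueueRef inAQ,
                              AudioQueuePropertyID inID) {
    UInt32 isRunning = 0;
    UInt32 size = sizeof(isRunning);
    AudioQueueGetProperty(inAQ, kAudioQueueProperty_IsRunning,
                          &isRunning, &size);
    // e.g. update UI state when playback actually starts or stops
}

// Register the listener on an existing queue:
//   AudioQueueAddPropertyListener(queue, kAudioQueueProperty_IsRunning,
//                                 MyRunningListener, NULL);
```

Watching kAudioQueueProperty_IsRunning is a common use of this mechanism, because a queue keeps running briefly after AudioQueueStop() to drain its buffers, and the listener is the reliable way to learn when playback has truly ended.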

