0. Special terms
1. Physical Structure
2. System Architecture
This article uses the ALC5625 codec on the Freescale i.MX platform as an example.
0. Special terms
ALSA - Advanced Linux Sound Architecture
OSS - Open Sound System, the earlier Linux audio architecture; replaced by ALSA, which remains OSS-compatible
Codec - coder/decoder
I2S/PCM/AC97 - audio communication protocols/interfaces/buses between the codec and the CPU
DAI - digital audio interface; in practice this is I2S, PCM, or AC97
DAC - digital-to-analog converter
ADC - analog-to-digital converter
DSP - digital signal processor
Mixer - combines several analog audio signals from different channels into one analog signal
Mute - silences the output by blocking the signal channel
PCM - pulse code modulation, a technique for converting an analog audio signal into a digital one; not to be confused with the PCM communication protocol/interface above
Sampling frequency - the ADC sampling rate, i.e. the number of samples taken per second; a typical value is 44.1 kHz
Quantization depth - for example 24-bit, meaning the analog signal is quantized into 2^24 discrete levels
SSI - Synchronous Serial Interface
DAPM - Dynamic Audio Power Management
1. Physical Structure
The codec processes the audio signal; it contains the ADC, DAC, mixer, DSP, input/output paths, and volume control.
The codec communicates with the processor through the I2C bus and the digital audio interface (DAI):
I2C bus - reads and writes the codec's register data.
DAI - carries the audio data between the CPU and the codec.
Taking the codec as the object of study, its inputs include the microphone (MIC) and the phone-in telephone signal, and its outputs include the headphone (HP), the speaker, and the phone-out telephone signal. Note also that digital audio signals flow in both directions between the codec and the CPU.
The typical use cases are:
1) Playing music
2) Recording
3) Telephone call (signal paths for placing and answering a call)
4) Telephone call through Bluetooth (signal paths for placing and answering a call)
2. System Architecture
The Android audio system has a relatively standard and complete architecture: from the upper-layer application, down through the Java framework service AudioManager, the native service AudioFlinger, the hardware abstraction layer (ALSA HAL), and the external alsa-lib support library, and finally to the codec driver at the bottom.
The following uses the system's startup of AudioFlinger as an example to describe the organization of the ALSA sound system.
The native service AudioFlinger acts as the server and the Java service AudioManager as the client; the two interact through the Binder mechanism. AudioFlinger hands hardware operations over to the hardware abstraction layer (ALSA HAL) to implement hardware functions (for example, setting the telephone/Bluetooth/recording mode in setMode). The abstraction layer can call native standard interfaces such as mALSADevice->route, or call the alsa-lib library directly to operate the underlying driver.
The audio driver structure in Linux is relatively complex. The source code is located in the kernel's sound/soc/ directory: the codecs/ folder holds the platform-independent codec drivers, and the imx/ folder holds the Freescale i.MX platform-specific audio drivers, which can be divided into SSI drivers and DAI drivers.
Starting with the core data structures of the sound-card driver:
1) struct snd_soc_codec - implemented by the platform-independent codec driver.
2) struct snd_soc_platform - implemented by the i.MX platform-specific DAI driver; it mainly implements DMA transfer of the audio data.
3) struct snd_soc_dai_link - associates the platform-specific DAI with the platform-independent codec.