Introduction to Android Multimedia development


Transferred from: http://blog.csdn.net/reiliu/article/details/9060557

I. Multimedia Architecture

Android's multimedia stack is based on OpenCORE from the third-party company PacketVideo. It supports all common audio, video, and still-image formats, including MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, GIF, and so on. Its functionality is divided into two parts: audio/video playback (PlayBack) and audio/video recording (Recorder).

The codecs are extended through the OpenMAX IL interface, allowing easy support of hardware and software codec plug-ins.

The OpenCORE multimedia framework provides a universal, extensible interface for third-party multimedia codecs, input/output devices, and more.

• Multimedia file playback and download, including 3GPP, MPEG-4, AAC, and MP3 containers

• Streaming download and live playback, including 3GPP, HTTP, and RTSP/RTP

• Encoding and decoding of moving video and still images, such as MPEG-4, H.263, AVC (H.264), and JPEG

• Speech encoding formats: AMR-NB and AMR-WB

• Music encoding formats: MP3, AAC, and AAC+

• Video and image formats: 3GP, MPEG-4, and JPEG

• Video conferencing: based on the H.324-M standard

II. Media Framework Terminology

1. OpenCORE Introduction

OpenCORE is the core of the Android multimedia framework; all audio and video capture and playback on the Android platform goes through it. With OpenCORE, programmers can quickly and easily develop the multimedia applications they need, such as audio/video capture, playback, video conferencing, and real-time streaming playback.

2. OpenCORE Code Structure

The OpenCORE code lives in the external/opencore directory of the Android source tree. This is the root directory of OpenCORE, and it contains the following subdirectories:

• android: a top-level library that implements the Android interfaces for audio/video playback, authoring, and DRM (digital rights management).

• baselibs: a low-level library containing basic data structures and threading utilities.

• codecs_v2: audio and video codecs, based on the OpenMAX implementation.

• engines: the core part, implementing the multimedia engines.

• extern_libs_v2: contains the Khronos OpenMAX header files.

• fileformats: file-format parsing (parser) tools.

• nodes: provides the PVMF nodes, mainly for codecs and file parsing.

• oscl: the Operating System Compatibility Library.

• pvmi: abstract interfaces for input/output control.

• protocols: content related to network protocols such as RTSP, RTP, and HTTP.

• pvcommon: contains only the Android.mk file for the pvcommon library; there are no source files.

• pvplayer: contains only the Android.mk file for the pvplayer library; there are no source files.

• pvauthor: contains only the Android.mk file for the pvauthor library; there are no source files.

• tools_v2: build tools and some modules that can be registered.

3. OpenMAX Introduction

• OpenMAX is an API from Khronos, the same group that created OpenGL.

• OpenMAX is a royalty-free, cross-platform API that allows media-acceleration components to be implemented across multiple operating systems and processor hardware platforms during development, integration, and programming, providing comprehensive streaming-media codec support and application portability.

• At the bottom of the architecture, a unified programming interface (the OpenMAX IL API) is defined for multimedia codecs and data processing. It is a system-level abstraction of multimedia data-processing functionality that shields the user from the underlying details. Multimedia applications and multimedia frameworks can therefore use codecs and other multimedia data-processing functions in a unified way through OpenMAX IL, giving them portability across hardware and software platforms.

• Location of OpenMAX in the system

• OpenMAX layers

OpenMAX AL (Application Layer)

The OpenMAX AL API provides a standardized interface between applications and the multimedia middleware that delivers the desired API functionality. AL gives applications portability across multimedia interfaces.

OpenMAX IL (Integration Layer)

OpenMAX IL acts as the interface to audio, video, and image codecs, letting applications and frameworks interact with multimedia codecs and supporting components (such as sources and sinks) in a unified way. Codec vendors would otherwise have to write private or closed interfaces to integrate into mobile devices. The purpose of IL is to give codecs a system abstraction through feature sets, solving the problem of portability across many different media systems.

OpenMAX DL (Development Layer)

OpenMAX DL defines an API that is a collection of audio, video, and image processing functions. Silicon vendors can implement and optimize them on a new processor, and codec vendors can then use them to write a wider range of codec functionality. It includes audio signal-processing functions such as FFTs and filters, image-processing primitives such as color-space conversion, and video-processing primitives used to optimize codecs such as MPEG-4, H.264, MP3, AAC, and JPEG. OpenMAX supports acceleration through iDL and aDL: iDL uses the OpenMAX IL structure, and aDL adds an asynchronous interface to the OpenMAX DL API.

III. Media Framework Architecture

1. Camcorder Framework

2. Audio Software Overview

3. SurfaceFlinger

• The graphics system in Android uses a client/server architecture. The server (SurfaceFlinger) is written mainly in C++. The client-side code is divided into two parts: one part supplied for use by Java, the other the underlying C++ implementation.

• Java View and its subclasses (such as TextView and Button) are drawn on a Surface. Each Surface creates a Canvas object (whose properties change from time to time) to manage the View's drawing operations on the Surface, such as drawing points. Each Canvas object corresponds to a Bitmap that stores the content drawn on the Surface.

• Each Surface usually corresponds to two buffers: a front buffer and a back buffer. The back buffer is the Bitmap the Canvas draws into; drawing always happens on the back buffer, and when the screen needs updating, the back and front buffers are swapped (see the sketch after this list).

• SurfaceFlinger high-level overview

• Java Surface (frameworks/base/core/java/android/view/Surface.java). The application uses this object indirectly (via SurfaceView, ViewRoot, and so on): the application creates a Surface (and, at the same time, a Canvas), draws graphics onto it, and finally delivers the result to the screen.

• C++ Surface (frameworks/base/libs/ui/Surface.cpp). This object is called by the Java Surface via JNI to implement the functionality of the Java Surface.

• ISurface (and its derived class BnSurface). This object is the interface between the application and the server. The C++ Surface creates this ISurface (BnSurface) and sends it commands, such as updating Surface content to the screen; the server side accepts the commands and performs the corresponding actions.
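To make the back-buffer/front-buffer flow above concrete, here is a minimal Java sketch of a custom SurfaceView: lockCanvas() returns a Canvas backed by the back buffer, and unlockCanvasAndPost() posts the buffer so it can be swapped to the front. The class name and the drawn content are assumptions made purely for illustration, not part of the original article.

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

// Hypothetical view illustrating back-buffer drawing and the buffer swap.
public class DotSurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private final Paint paint = new Paint();

    public DotSurfaceView(Context context) {
        super(context);
        getHolder().addCallback(this);
        paint.setColor(Color.WHITE);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Canvas canvas = holder.lockCanvas();           // Canvas backed by the back buffer
        if (canvas != null) {
            canvas.drawColor(Color.BLACK);
            canvas.drawCircle(100f, 100f, 10f, paint); // e.g. draw a dot
            holder.unlockCanvasAndPost(canvas);        // post: back buffer becomes visible
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) { }
}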

IV. Media API

Android provides media APIs for developers to use, chiefly MediaPlayer and MediaRecorder.

1. MediaPlayer

The basic interfaces provided are as follows:

Public Methods

static MediaPlayer create(Context context, Uri uri)
Convenience method to create a MediaPlayer for a given Uri.

int getCurrentPosition()
Gets the current playback position.

int getDuration()
Gets the duration of the file.

int getVideoHeight()
Returns the height of the video.

int getVideoWidth()
Returns the width of the video.

boolean isPlaying()
Checks whether the MediaPlayer is playing.

void pause()
Pauses playback.

void prepare()
Prepares the player for playback, synchronously.

void prepareAsync()
Prepares the player for playback, asynchronously.

void release()
Releases resources associated with this MediaPlayer object.

void reset()
Resets the MediaPlayer to its uninitialized state.

void seekTo(int msec)
Seeks to the specified time position.

void setAudioStreamType(int streamtype)
Sets the audio stream type for this MediaPlayer.

void setDataSource(String path)
Sets the data source (file path or http/rtsp URL) to use.

void setDisplay(SurfaceHolder sh)
Sets the SurfaceHolder to use for displaying the video portion of the media.

void setVolume(float leftVolume, float rightVolume)
Sets the volume on this player.

void start()
Starts or resumes playback.

void stop()
Stops playback after playback has been started or paused.

As you can see, the MediaPlayer class provides the basic operations of a multimedia player: play, pause, stop, volume control, and so on.
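As a quick illustration of these calls, here is a minimal playback sketch using the public MediaPlayer API. The class name, the choice of STREAM_MUSIC, and the use of prepareAsync() for a network source are assumptions made for the example, not something prescribed by the original article.

import java.io.IOException;

import android.media.AudioManager;
import android.media.MediaPlayer;

// Hypothetical helper: streams audio from a file path or http/rtsp URL.
public final class SimplePlayback {
    public static MediaPlayer playStream(String url) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setAudioStreamType(AudioManager.STREAM_MUSIC); // pick the output stream
        player.setDataSource(url);                            // file path or http/rtsp URL
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();                                   // start once prepared
            }
        });
        player.prepareAsync();                                // non-blocking prepare
        return player;                                        // caller must call release()
    }
}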

2. MediaPlayer structure

MediaPlayer JNI
Code location: frameworks/base/media/jni

MediaPlayer (native)
Code location: frameworks/base/media/libmedia

MediaPlayerService (server)
Code location: frameworks/base/media/libmediaplayerservice

MediaPlayerService host process
Code location: frameworks/base/media/mediaserver/main_mediaserver.cpp

PVPlayer
Code location: external/opencore/android/

3. MediaPlayer Call Flow

4. MediaRecorder

The basic interfaces provided are as follows (a short usage sketch follows the list):

Public Methods

void prepare()
Prepares the recorder to begin capturing and encoding data.

void release()
Releases resources associated with this MediaRecorder object.

void reset()
Restarts the MediaRecorder to its idle state.

void setAudioEncoder(int audio_encoder)
Sets the audio encoder to be used for recording.

void setAudioSource(int audio_source)
Sets the audio source to be used for recording.

void setOutputFile(String path)
Sets the path of the output file to be produced.

void setOutputFormat(int output_format)
Sets the format of the output file produced during recording.

void setPreviewDisplay(Surface sv)
Sets a Surface to show a preview of recorded media (video).

void start()
Begins capturing and encoding data to the file specified with setOutputFile().

void stop()
Stops recording.
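A minimal recording sketch using these methods follows. The call order (source, output format, encoder, output file, then prepare() and start()) follows the MediaRecorder documentation; the class name, the 3GP/AMR-NB choices, and the output path are assumptions for illustration, and the RECORD_AUDIO permission is required.

import java.io.IOException;

import android.media.MediaRecorder;

// Hypothetical helper: records microphone audio to a 3GP/AMR-NB file.
public final class SimpleVoiceRecorder {
    private MediaRecorder recorder;

    public void start(String outputPath) throws IOException {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);         // capture from the mic
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); // 3GP container
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);    // AMR-NB speech codec
        recorder.setOutputFile(outputPath);
        recorder.prepare();   // must be called after the setters, before start()
        recorder.start();
    }

    public void stop() {
        recorder.stop();
        recorder.release();   // free the native recorder resources
        recorder = null;
    }
}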

V. Android Audio

In the audio subsystem of the Android framework, each audio stream corresponds to an instance of the AudioTrack class, and each AudioTrack is registered with AudioFlinger when it is created. All AudioTracks are mixed by AudioFlinger (the Mixer) and then sent to the AudioHardware layer for playback. As for how AudioTrack and AudioFlinger communicate: they are usually not in the same process, so they are connected through Android's Binder mechanism.
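The application side of this picture is an AudioTrack. A sketch of streaming raw PCM through an AudioTrack (and therefore through AudioFlinger's mixer) might look like the following; the 44.1 kHz mono 16-bit format and the helper class are assumptions made for illustration.

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// Hypothetical helper: plays raw 16-bit mono PCM at 44.1 kHz in streaming mode.
public final class PcmPlayer {
    public static void playPcm(byte[] pcmData) {
        int sampleRate = 44100;
        int minBuf = AudioTrack.getMinBufferSize(
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        AudioTrack track = new AudioTrack(
                AudioManager.STREAM_MUSIC,        // stream type, mixed by AudioFlinger
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBuf,
                AudioTrack.MODE_STREAM);          // data is written while playing

        track.play();
        track.write(pcmData, 0, pcmData.length);  // blocks until the data is queued
        track.stop();
        track.release();
    }
}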

1. Basic interfaces of Android audio processing

In Android development, depending on the scenario, developers need to use different interfaces to play audio resources so that conflicts between sounds are handled by the right policy.

AudioManager provides upper-layer applications with an interface for managing sound settings.

AudioService provides the services behind all audio-related settings. It defines an AudioSystemThread class that monitors audio-control-related signals and, when requested, carries out audio control by invoking the AudioSystem interface; the message handling there is asynchronous. In addition, AudioService abstracts a set of interfaces for passing audio-control signals, providing the support that AudioManager relies on.

AudioSystem provides the audio system's basic type definitions as well as interfaces for basic operations.

Tones can be played through ToneGenerator, which supports DTMF tones (ITU-T Q.23) as well as the call-supervision tones defined in 3GPP TS 22.001 and the proprietary tones defined in 3GPP TS 31.111. Depending on the call state and roaming state, the generated audio is routed into the downlink voice path or sent to the speaker or headset.
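A minimal ToneGenerator sketch, assuming we simply want to sound the DTMF digit "5" on the DTMF stream for 200 ms (the class name is hypothetical):

import android.media.AudioManager;
import android.media.ToneGenerator;

public final class DtmfDemo {
    // Plays the DTMF tone for digit "5" for 200 ms on the DTMF stream.
    public static void playDigitFive() {
        ToneGenerator tones =
                new ToneGenerator(AudioManager.STREAM_DTMF, ToneGenerator.MAX_VOLUME);
        tones.startTone(ToneGenerator.TONE_DTMF_5, 200); // tone type, duration in ms
        // tones.release() should be called once the tone is no longer needed,
        // e.g. after the duration has elapsed.
    }
}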

Notification sounds can be played through Ringtone; Ringtone and RingtoneManager provide quick playback and management interfaces for ringtones, notification sounds, alarms, and so on. In essence they are a thin wrapper around MediaPlayer.
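For instance, playing the device's default notification sound through RingtoneManager might look like this (the helper class is hypothetical):

import android.content.Context;
import android.media.Ringtone;
import android.media.RingtoneManager;
import android.net.Uri;

public final class NotificationSound {
    // Plays the device's default notification sound.
    public static void playDefault(Context context) {
        Uri uri = RingtoneManager.getDefaultUri(RingtoneManager.TYPE_NOTIFICATION);
        Ringtone ringtone = RingtoneManager.getRingtone(context, uri);
        if (ringtone != null) {
            ringtone.play();
        }
    }
}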

In-game audio resources can be played through SoundPool.

For recording features, audio is captured through MediaRecorder and AudioRecord.
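As an illustration of the lower-level capture path, here is a sketch that reads one buffer of raw PCM with AudioRecord. The 44.1 kHz mono 16-bit parameters and the helper class are assumptions, and the RECORD_AUDIO permission is required.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

// Hypothetical helper: captures a single buffer of 16-bit mono PCM from the microphone.
public final class PcmCapture {
    public static short[] captureOnce() {
        int sampleRate = 44100;
        int bufSize = AudioRecord.getMinBufferSize(
                sampleRate,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord record = new AudioRecord(
                MediaRecorder.AudioSource.MIC,
                sampleRate,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufSize);

        short[] buffer = new short[bufSize / 2];
        record.startRecording();
        record.read(buffer, 0, buffer.length);   // blocking read of one buffer of samples
        record.stop();
        record.release();
        return buffer;
    }
}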

In addition to these classes that handle audio directly, Android also provides related classes for volume adjustment, audio device management, and so on.

AudioManager, through the audio service, gives the upper layer interfaces for volume and ringer-mode control; ringer-mode control covers whether the speaker, headset, or Bluetooth output is enabled, whether the microphone is muted, and so on. AudioManager is used very often in multimedia application development, especially for the Android volume-control flow.
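A typical AudioManager volume call, sketched under the assumption that we want to set the music stream to half of its maximum and show the system volume panel (the helper class is hypothetical):

import android.content.Context;
import android.media.AudioManager;

public final class VolumeHelper {
    // Sets the music stream to half of its maximum volume.
    public static void setMusicVolumeToHalf(Context context) {
        AudioManager audioManager =
                (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        int max = audioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
        audioManager.setStreamVolume(AudioManager.STREAM_MUSIC,
                max / 2,
                AudioManager.FLAG_SHOW_UI); // show the system volume panel
    }
}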

SoundPool can play several audio streams in combination, which is obviously useful for games. SoundPool can load an audio resource into memory from a resource file inside the APK or from a file in the file system. Under the hood, SoundPool decodes the audio resource into a 16-bit mono or stereo PCM stream through the media playback service, which lets the application avoid decoding latency during playback. Besides low playback latency, SoundPool can play a certain number of audio streams at the same time; when the number of streams to play exceeds the maximum configured for the SoundPool, it stops a lower-priority stream that is already playing. Capping the maximum number of simultaneous streams avoids overloading the CPU and hurting the UI experience.
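A SoundPool sketch along these lines, using the classic constructor; the helper class, the limit of four simultaneous streams, and the caller-supplied resource id (e.g. a hypothetical R.raw.click bundled in the APK) are assumptions for the example.

import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

// Hypothetical helper: preloads a short sound effect and plays it with low latency.
public final class SoundEffects {
    private final SoundPool soundPool =
            new SoundPool(4, AudioManager.STREAM_MUSIC, 0); // at most 4 simultaneous streams
    private int clickId;

    public void load(Context context, int resId) {
        clickId = soundPool.load(context, resId, 1);        // decoded to PCM up front
    }

    public void playClick() {
        // left/right volume, priority, no loop, normal playback rate
        soundPool.play(clickId, 1.0f, 1.0f, 1, 0, 1.0f);
    }

    public void release() {
        soundPool.release();
    }
}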

The other common way to play audio on the Android platform is MediaPlayer. MediaPlayer can only play one item at a time and is suited to longer audio where latency requirements are not strict.

2. MediaServer Analysis

MediaServer is the core and soul of the media part of Android. Almost everything related to multimedia playback lives here, including audio/video codecs and display output.

1. First, AudioFlinger is initialized

Its initialization creates the unique AudioFlinger instance through AudioFlinger's parent class BinderService.

2. Then MediaPlayerService and CameraService are initialized

3. Finally, AudioPolicyService is initialized

3. AudioPolicyService and AudioPolicyManager Analysis

AudioPolicyService is one of the two main services of the Android audio system; the other is AudioFlinger. Both are loaded by mediaserver when the system starts.

AudioPolicyService mainly accomplishes the following tasks:

• The Java application layer accesses the services provided by AudioPolicyService through the IAudioPolicyService interface via JNI

• Input device connection status

• Switching of the system's audio strategy (strategy)

• Settings for volume and audio parameters

AudioPolicyManager

A large part of AudioPolicyService's management work is done in AudioPolicyManager, including volume management, audio strategy management, and input device management.

Through AudioPolicyService's AudioCommandThread, the settings are eventually applied to the corresponding track in AudioFlinger.

4. AudioFlinger Analysis

AudioFlinger is responsible for managing every AudioTrack and RecordTrack, master volume control, property settings for each audio stream, device control, and sound-effect control. AudioFlinger is an Android service that is loaded when Android starts.

The playback thread is actually an instance of MixerThread; it mixes the various tracks in that thread and, if necessary, performs resampling, converting everything to a uniform sample rate (44.1 kHz), and then outputs the audio data through the audio system's AudioHardware layer.

SRC (Sample Rate Converter) changes a signal's sample rate. Converting a low sample rate to a higher one is a resampling process: the object being resampled is no longer the original signal but the low-sample-rate signal, and because that signal does not carry enough samples, additional sample points have to be interpolated to reach the desired sample rate and sample size.
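To see why the interpolated samples are only estimates, here is an illustrative (not production-quality) linear-interpolation upsampler. Real SRC implementations use filter-based designs; this sketch exists only to make the resampling idea concrete, and the class name is an assumption.

// Illustrative only: naive linear-interpolation upsampling. The inserted
// samples are estimates between existing points; detail that was never
// captured at the lower sample rate cannot be recovered.
public final class NaiveResampler {
    public static short[] upsample(short[] input, int srcRate, int dstRate) {
        int outLen = (int) ((long) input.length * dstRate / srcRate);
        short[] output = new short[outLen];
        for (int i = 0; i < outLen; i++) {
            double srcPos = (double) i * srcRate / dstRate;  // position in the source signal
            int idx = (int) srcPos;
            double frac = srcPos - idx;
            short a = input[Math.min(idx, input.length - 1)];
            short b = input[Math.min(idx + 1, input.length - 1)];
            output[i] = (short) Math.round(a + (b - a) * frac);
        }
        return output;
    }
}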

VI. Comparison of Audio Architectures on Three Platforms

1. Android Platform

Android uses the ALSA (Advanced Linux Sound Architecture) framework, adapted for smart devices.

From the analysis of the Android audio architecture and flow above, because the driver library sits in the core layer, audio device manufacturers and users cannot install drivers to improve sound quality the way they can on the PC platform. In practice we can also feel that the audio quality of Android devices is generally mediocre. A major cause of this lies in SRC: converting a low sample rate to a higher one is a resampling process in which the object being resampled is no longer the original signal but the low-sample-rate signal, and because the sample rate is insufficient, more sample points have to be interpolated to reach the desired sample rate and sample size.

In theory SRC can be improved by changing the algorithm, but that is not very realistic: SRC needs a lot of floating-point computation, and even with a high-quality SRC algorithm the computation comes at the cost of device performance and power consumption, so its practicality is poor.

2. Windows Phone Platform

Universal Audio Architecture (UAA)

UAA is divided into exclusive mode and shared mode. It minimizes the impact of audio device drivers on system stability while increasing the transparency of the signal-processing chain and making the processing path controllable. In theory it can achieve better sound quality, noticeably improve system responsiveness, and greatly reduce latency.

In general, applications take the shared-mode path. As described above for shared mode, all sound signals are routed to the audio engine, where software mixing takes place and SRC may be applied, which can color the sound. When an application uses exclusive mode, the system cuts off the shared-mode path; the sound signal is sent directly to kernel mode and finally to the underlying audio device for output, and the audio device then processes it entirely in the audio format negotiated for exclusive mode.

3. iOS Platform

Looking at iOS: iOS is very closed, and we do not even know the exact structure of its audio architecture, but iOS devices do not come in a wide variety of hardware configurations, so it is easier to achieve good sound quality through targeted development and tuning. The reality is that, so far, no Android device has matched the sound quality of any iOS device, and we believe this comes not from the hardware but from the operating system.

