Android Multimedia System

The Android system can record and play local and streamed multimedia in a wide variety of formats. The Android multimedia system provides a solid platform for developing and running multimedia applications on Android devices.

I. Architecture of the Android Multimedia System

The Android multimedia framework spans the application layer, the Java framework layer, the C-language native framework layer, and the hardware abstraction (driver) layer. The following diagram shows the Android multimedia system framework hierarchy.

Figure 13-1 Android multimedia system framework hierarchy diagram


As the diagram shows, the Android multimedia system architecture is divided into four layers: the Java application component layer, the Java application framework layer, the system runtime C-language native layer, and the Linux kernel driver layer. They are described below from top to bottom.

1) Java Application Component Layer

The Android platform offers three multimedia features at this layer:

Camera: The Android framework includes support for the cameras and camera features available on Android devices, allowing applications to capture photos and videos (see the capture sketch after this list).
Media Recorder: Android's MediaRecorder provides audio and video recording capabilities.
Media Player: Android's MediaPlayer provides audio and video playback capabilities.
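To make the camera feature concrete, here is a minimal sketch that uses the classic android.hardware.Camera API together with a SurfaceHolder to start a preview and save a JPEG. The class name and the jpegPath parameter are made up for illustration; the sketch assumes the SurfaceHolder comes from a SurfaceView already in the layout and that the CAMERA permission has been granted.

```java
import android.hardware.Camera;
import android.view.SurfaceHolder;

import java.io.FileOutputStream;
import java.io.IOException;

// Minimal sketch: open the default camera, preview it on a Surface, and capture a JPEG.
// Assumes a SurfaceHolder from a SurfaceView in the layout and that the CAMERA
// permission has already been granted; error handling is kept to a minimum.
public class CameraCaptureSketch {

    private Camera camera;

    public void startPreview(SurfaceHolder holder) throws IOException {
        camera = Camera.open();            // open the first back-facing camera
        camera.setPreviewDisplay(holder);  // route preview frames to the Surface
        camera.startPreview();
    }

    public void capture(final String jpegPath) {
        // The third callback receives the compressed JPEG data.
        camera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera cam) {
                try (FileOutputStream out = new FileOutputStream(jpegPath)) {
                    out.write(data);       // jpegPath is a placeholder supplied by the caller
                } catch (IOException ignored) {
                    // a real application would surface this error
                }
                cam.startPreview();        // preview stops after takePicture(); restart it
            }
        });
    }

    public void release() {
        if (camera != null) {
            camera.stopPreview();
            camera.release();              // hand the camera hardware back to the system
            camera = null;
        }
    }
}
```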

2) Java Application Framework Layer

The Android platform provides four programming interfaces for Java components: Camera, MediaRecorder, MediaPlayer, and Surface. Together these four interfaces cover almost all of the multimedia functionality of the Android system.

android.hardware.Camera: This Java class provides the programming interface for operating the camera.
android.media.MediaRecorder: This Java class provides the programming interface for recording audio and video.
android.media.MediaPlayer: This Java class provides the programming interface for playing audio and video (a playback sketch using this class and Surface follows the list).
android.view.Surface: A handle to a raw graphics buffer managed by the screen compositor.
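As a rough illustration of how MediaPlayer and Surface fit together, the sketch below plays a local video file on a Surface. It is a minimal sketch, not the framework's canonical usage pattern: the class name and file path are made up for illustration, and preparation and error handling are simplified.

```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;

import java.io.IOException;

// Minimal sketch: play a local video file onto a Surface with android.media.MediaPlayer.
// The file path is a placeholder; a real application would also handle errors and
// release the player in the appropriate lifecycle callback.
public class VideoPlaybackSketch {

    private final MediaPlayer player = new MediaPlayer();

    public void play(SurfaceHolder holder, String path) throws IOException {
        player.setDataSource(path);        // e.g. "/sdcard/Movies/sample.mp4" (placeholder)
        player.setDisplay(holder);         // render decoded video frames to this Surface
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();                // begin playback once preparation completes
            }
        });
        player.prepareAsync();             // prepare without blocking the UI thread
    }

    public void stop() {
        player.stop();
        player.release();                  // free codec and other native resources
    }
}
```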

3) System Runtime Native Library Layer

The native library layer of the Android runtime mainly comprises the camera hardware library, the PacketVideo framework, the ALSA audio library, and the Surface library:

Camera Hardware Library: Provides a C-language library for operating the underlying camera hardware.
PacketVideo Framework Library: The multimedia library, based on PacketVideo OpenCORE; it supports recording and playback of a variety of common audio and video formats, with codec support including MPEG-4, MP3, H.264, AAC, and AMR.
ALSA Audio: ALSA (Advanced Linux Sound Architecture) is the sound-card driver component of the Linux kernel that replaced the original OSS (Open Sound System). Its goals include automatic sound-card configuration and graceful handling of multiple sound devices in one system, and these goals have largely been achieved. The JACK sound framework uses ALSA to provide low-latency, professional-grade audio editing and mixing.
Surface Library: When multiple applications are running, it manages their access to the display and composites 2D and 3D graphics for output to the screen.

4) Linux Kernel Driver Layer

This layer provides support for hardware drivers, including cameras, hardware codecs, audio/video drivers, and more.

II. Functions of the Android Multimedia System

Multimedia mainly involves two aspects: audio and video input/output, and the encoding and decoding (codec) stage.

Among these, the input and output stages are implemented by the corresponding hardware abstraction layers, while the intermediate processing stage is implemented mainly by PacketVideo, which can make use of hardware acceleration.

In general, the open Android multimedia platform is backed by a powerful library of functions that includes audio playback, video playback, camera capture, audio recording, and video recording.
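To make the recording path concrete, the following sketch records microphone audio to a 3GP file with android.media.MediaRecorder, using the AMR narrow-band encoder mentioned above. The class name and output path are illustrative only, and permission handling and error handling are assumed to happen elsewhere.

```java
import android.media.MediaRecorder;

import java.io.IOException;

// Minimal sketch: record microphone audio to a 3GP/AMR-NB file.
// The output path is a placeholder; RECORD_AUDIO permission handling is omitted.
public class AudioRecordingSketch {

    private final MediaRecorder recorder = new MediaRecorder();

    public void start(String outputPath) throws IOException {
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // AMR, as listed above
        recorder.setOutputFile(outputPath);   // e.g. "/sdcard/Music/note.3gp" (placeholder)
        recorder.prepare();                   // must be called before start()
        recorder.start();
    }

    public void stop() {
        recorder.stop();
        recorder.release();                   // free the underlying recorder resources
    }
}
```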

III. Introduction to OpenCORE

OpenCORE is the core of the Android multimedia system. Compared with other Android libraries, OpenCORE is a multimedia library written in C++: it defines a full-featured operating-system porting layer, encapsulates its basic functionality as classes, and makes extensive use of inheritance in the interfaces between its layers.

It mainly contains two components:

PVPlayer: Provides the media player functionality, i.e. playback of various audio and video streams;
PVAuthor: Provides media stream recording functionality, i.e. capture of various audio and video streams and still images;

PVPlayer and PVAuthor are provided to developers as an SDK, on top of which a variety of applications and services can be built.

IV. Introduction to OpenMAX

OpenMAX is a framework standard for multimedia applications. The OpenMAX IL (Integration Layer) specification defines the media component interface for fast integration of accelerated codecs in the streaming media framework of embedded devices.

OpenMAX is divided into three layers: from the bottom up they are OpenMAX DL (Development Layer), OpenMAX IL (Integration Layer), and OpenMAX AL (Application Layer). The three layers are described below.

First layer: OpenMAX DL (Development Layer)

OpenMAX DL defines an API that is a collection of audio, video, and imaging functions. Silicon vendors implement and optimize these functions for a new processor, and codec vendors then use them to build a wider range of codec functionality. It includes audio signal processing functions such as FFTs and filters, image raw processing such as color-space conversion, and video raw processing, enabling optimized implementations of codecs such as MPEG-4, H.264, MP3, AAC, and JPEG.

Second layer: OpenMAX IL (Integration Layer)

OpenMAX IL serves as the interface to audio, video, and image codecs, letting components (for example, sources and sinks) interact with them in a unified way. These codecs may be implemented in hardware, software, or a mix of the two, and the underlying interfaces used on embedded and mobile devices are transparent to the user: OpenMAX IL gives applications and media frameworks a transparent view of the codecs. Without such a layer, codec vendors have to write private or closed interfaces to integrate into mobile devices. The main purpose of the IL is to provide a system abstraction for codecs through a defined feature set, solving the problem of portability across many different media systems.
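Android does not expose OpenMAX IL directly to Java code, but the idea of a unified codec interface can be illustrated at the framework level with android.media.MediaCodec (available in later Android versions): a decoder is requested by MIME type and the platform picks a matching hardware or software component. MediaCodec, the H.264 MIME type, and the resolution below are not mentioned in the text above; this is only an illustrative sketch of the concept, not part of OpenMAX itself.

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;

// Loose illustration of the "unified codec interface" idea at the Android framework level:
// a decoder is requested by MIME type and the platform chooses a matching component
// (hardware or software) without the application needing to know which one it got.
// MediaCodec is an Android framework API, not part of the OpenMAX IL specification itself.
public class DecoderSelectionSketch {

    public static MediaCodec createAvcDecoder(Surface outputSurface) throws IOException {
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");   // H.264
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        decoder.configure(format, outputSurface, null, 0);  // render output to the Surface
        decoder.start();                                    // ready to accept input buffers
        return decoder;
    }
}
```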

Third layer: OpenMAX AL (Application Layer)

The OpenMAX AL API provides a standardized interface between applications and multimedia middleware, where the middleware supplies the services needed to realize the expected API functionality.
