http://blog.csdn.net/menguio/article/details/6323965
1 Gallery Application Behavior
Gallery itself only provides a rendering framework; it manages all the video and image files on the device, offering play, view, and delete functions. It automatically searches the pictures and videos stored on the local SD card and groups files of the same type for display and playback. Gallery's internal playback path is the same as MediaPlayer's, which covers both audio and video playback.
Adding the ability to select a playback file from a specified directory in Gallery:
Method: traverse the directories under /sdcard, select a directory, traverse its files, and click a file to play it (a sketch of this flow follows the notes below).
Note:
Two lists are defined:
VideoPathList: holds the directories found while traversing /sdcard.
VideoFileList: holds the files found in the selected directory.
Two activities are defined:
VideoList2Play: implements the directory and file traversal under /sdcard.
VideoPlayer: plays the selected video using the VideoView class.
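A minimal sketch of this selection flow is shown below. The class and field names (VideoList2Play, videoPathList, videoFileList, VideoPlayer) follow the naming above, but the code is only illustrative and is not the actual Gallery source:

import android.app.ListActivity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.ListView;
import java.io.File;
import java.util.ArrayList;

public class VideoList2Play extends ListActivity {
    private final ArrayList<String> videoPathList = new ArrayList<String>(); // directories under /sdcard
    private final ArrayList<String> videoFileList = new ArrayList<String>(); // files of the chosen directory

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        listDirectories(Environment.getExternalStorageDirectory()); // /sdcard
    }

    // First step: list the directories under /sdcard.
    private void listDirectories(File root) {
        videoPathList.clear();
        File[] children = root.listFiles();
        if (children != null) {
            for (File f : children) {
                if (f.isDirectory()) videoPathList.add(f.getAbsolutePath());
            }
        }
        setListAdapter(new ArrayAdapter<String>(this,
                android.R.layout.simple_list_item_1, videoPathList));
    }

    @Override
    protected void onListItemClick(ListView l, View v, int position, long id) {
        File selected = new File((String) l.getItemAtPosition(position));
        if (selected.isDirectory()) {
            // Second step: list the files of the chosen directory.
            videoFileList.clear();
            File[] files = selected.listFiles();
            if (files != null) {
                for (File f : files) {
                    if (f.isFile()) videoFileList.add(f.getAbsolutePath());
                }
            }
            setListAdapter(new ArrayAdapter<String>(this,
                    android.R.layout.simple_list_item_1, videoFileList));
        } else {
            // Third step: hand the chosen file to the playback activity.
            Intent intent = new Intent(this, VideoPlayer.class);
            intent.setData(Uri.fromFile(selected));
            startActivity(intent);
        }
    }
}

VideoPlayer is assumed to be a simple activity whose layout contains a VideoView; it would call setVideoURI(getIntent().getData()) followed by start().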
The work starts from Gallery's buttons and menu items, into which this function is grafted.
For the photo frame ("Photo" button) that Gallery shows on first launch, the class and method that implement it internally can be located and the function added there.
Adding a "select other videos" entry through the More menu involves more complicated internal classes and needs further study.
A "select other video" button is available in MovieView, so while a video is playing you can click it to pick another video to play.
Improvement: a pop-up selection dialog lets you freely choose directories and files under /sdcard for playback.
Data caching and processing flow in Gallery
The application has three threads: the main thread (created and destroyed with the activity lifecycle), the feed initialization thread (run only once at program start, used to load the initial album information), and the feed listening thread (which listens for album and photo changes). The main flow is summarized as follows:
1. On first entry into Gallery, onCreate is called and the initialization message is posted to the message queue. Gallery then calls onResume, which goes down to GridLayer's onResume; the MediaFeed object must be initialized before MediaFeed's onResume can be called.
2. The HANDLE_INTENT message in the queue is processed. To initialize the data source, Gallery calls GridLayer's setDataSource method, which triggers the underlying MediaFeed's start method; once that finishes, the feed listening thread is started and continues in MediaFeed's run method.
The start method does two things: it calls its own onResume, which registers a content-change listener for each of the two media sources (images and videos) and requests a refresh of both sources (adding them to the global refresh-request list; see the sketch after this list), and it starts the feed initialization thread mAlbumSourceThread.
3. The MediaFeed initialization thread calls MediaFeed's loadMediaSets to load the albums. This calls the lower-level LocalDataSource refresh method (which checks whether the database has album changes and adds or updates the corresponding MediaSet album names in the feed) and the loadMediaSets method (which calls the lower-level CacheService.loadMediaSets) to load all albums and all photos in them.
4. The MediaFeed listening thread, MediaFeed.run(), continuously updates the album and photo variables in MediaFeed according to the media-change messages delivered by the content-change listeners (add, delete, modify). The mechanism is: if the global refresh-request list is non-empty, LocalDataSource.refresh is called to update album information (refresh in turn calls CacheService's computeDirtySets); run() then traverses each album and calls DataSource.loadItemsForSet() to load the photo records for that album.
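A schematic sketch of the content-change listener mentioned in step 2 is shown below. The class and callback names are illustrative (the real logic lives in MediaFeed and LocalDataSource); what it demonstrates is registering a ContentObserver on the image and video content URIs and queuing a refresh request on every change:

import android.content.Context;
import android.database.ContentObserver;
import android.os.Handler;
import android.provider.MediaStore;

public class MediaChangeWatcher {
    private final Context context;
    private final Runnable requestRefresh; // e.g. adds the changed source to the global refresh-request list

    public MediaChangeWatcher(Context context, Runnable requestRefresh) {
        this.context = context;
        this.requestRefresh = requestRefresh;
    }

    public void start(Handler handler) {
        ContentObserver observer = new ContentObserver(handler) {
            @Override
            public void onChange(boolean selfChange) {
                // The MediaProvider reported a change: queue a refresh instead of
                // reloading synchronously, just as the listening thread does.
                requestRefresh.run();
            }
        };
        context.getContentResolver().registerContentObserver(
                MediaStore.Images.Media.EXTERNAL_CONTENT_URI, true, observer);
        context.getContentResolver().registerContentObserver(
                MediaStore.Video.Media.EXTERNAL_CONTENT_URI, true, observer);
    }
}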
2 Function Module Description
1) Hierarchy
There are three layers: the upper-layer Java application, the middle framework layer, and the lower-layer libraries. At runtime the whole MediaPlayer can be roughly divided into two parts, client and server, which run in two separate processes and use the Binder mechanism for IPC communication between them.
2) Description
2.1) Gallery.java passes messages through a Handler, involving the sendInitialMessage() method; checks the stored media files with the checkStorage() method; initializes the data source with the initializeDataSource() method; and, once the data type is determined, uses onActivityResult() to launch the corresponding activity. Rendering is organized in layers, with GridLayer as the primary layer, working together with GridDrawManager for media rendering management.
2.2) JNI (Java native calls) for MediaPlayer
The native calls of MediaPlayer are implemented in frameworks/base/media/jni/android_media_MediaPlayer.cpp. android.media.MediaPlayer has two parts: one faces the upper Java layer (for example VideoView), and the other is the set of native methods that go through JNI.
MediaPlayerService enables interaction and data communication between the client and MediaPlayer. The data source is reached through MediaProvider (the multimedia content provider); MediaScannerService (the multimedia scanning service) and MediaScannerReceiver determine the data type (these two classes are a Service and a BroadcastReceiver whose main methods are scan() and scanFile()), and files of the same type are catalogued into MediaStore (multimedia storage). MediaPlayer.java communicates with the native MediaPlayer in C++ through the JNI code in android_media_MediaPlayer.cpp.
Note:
• MediaStore is the multimedia database provided by the Android system; multimedia information in Android can be retrieved through this class. MediaStore covers all the information in the multimedia database, including audio, video, and images.
• MediaScannerReceiver is triggered whenever an ACTION_BOOT_COMPLETED, ACTION_MEDIA_MOUNTED, or ACTION_MEDIA_SCANNER_SCAN_FILE intent is sent. Because parsing the metadata of media files may take a long time, MediaScannerReceiver starts MediaScannerService to do the work.
• MediaScannerService calls the public class MediaScanner to perform the scanning. MediaScannerReceiver maintains two kinds of scan directories: the internal volume, pointing to $(ANDROID_ROOT)/media, and the external volume, pointing to $(EXTERNAL_STORAGE).
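For example, an application can ask MediaScannerReceiver to pick up one newly written file by broadcasting the third intent listed above. This is a small illustrative helper, not part of the framework code discussed here:

import android.content.Context;
import android.content.Intent;
import android.net.Uri;
import java.io.File;

public class ScanRequest {
    // Broadcasts ACTION_MEDIA_SCANNER_SCAN_FILE for a single file;
    // MediaScannerReceiver receives it and starts MediaScannerService.
    public static void scanSingleFile(Context context, File file) {
        Intent intent = new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE);
        intent.setData(Uri.fromFile(file));
        context.sendBroadcast(intent);
    }
}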
3) MediaPlayer
Android's MediaPlayer covers both audio and video playback. On the Android UI both the Music and Video applications call MediaPlayer; the upper layer also involves inter-process communication, the basis of which is the Binder mechanism in the core Android libraries. Android's media playback function is split into two parts: the media playback application and the media playback service. They run in different processes; the media playback application consists of Java code plus some C++ code, while the media playback service is C++ code. The two parts must call each other through the Binder mechanism. These calls include:
(1) The media playback application sends control commands to the media playback service;
(2) The media playback service sends event notifications (notify) to the media playback application.
• The media playback service exposes several interfaces, of which two are important: IMediaPlayerService and IMediaPlayer. IMediaPlayerService is used to create and manage playback instances, while IMediaPlayer is the playback interface used to control playback of a given media file.
• The media playback application provides an IMediaPlayerClient interface to the media playback service, used to receive notify() callbacks. These interfaces are called across processes, which involves the Binder mechanism (it is what connects the two parts). Each interface consists of two halves: the real implementation of the interface functions (the BnInterface), which runs in the process that provides the interface, and the interface proxy (the BpInterface), which runs in the process that calls the interface.
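Seen from the Java side, the two call directions look roughly like the sketch below: methods such as setDataSource()/prepare()/start() travel through IMediaPlayer down to the service, while the service's notify() events surface as listener callbacks. The file path and class name are only illustrative:

import android.media.MediaPlayer;
import java.io.IOException;

public class PlayerClientExample {
    public static MediaPlayer playFile(String path) throws IOException {
        MediaPlayer player = new MediaPlayer();

        // Events pushed back from the media playback service (via IMediaPlayerClient).
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();   // control command sent down to the service
            }
        });
        player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mp) {
                mp.release(); // free the native player instance
            }
        });

        // Control commands sent from the application to the service.
        player.setDataSource(path);
        player.prepareAsync();    // asynchronous; onPrepared() fires when ready
        return player;
    }
}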
3 Code Framework
1) Java program path:
packages/apps/Camera/
After compilation this produces Camera.apk, which corresponds to three applications: Camera, Gallery, and Camcorder.
packages/apps/Gallery/src/com/android/camera/gallery/
packages/apps/Gallery3D/src/com/cooliris/app/
packages/providers/MediaProvider/
This includes MediaProvider.java, MediaScannerService.java, and MediaScannerReceiver.java.
Compilation produces MediaProvider.apk. The media files (images, videos, and audio) on the device and on the SD card are scanned at startup, and two database files are generated in the /data/data/com.android.providers.media/databases directory: internal.db (for /system/media) and external-?.db (for /sdcard). From then on, all multimedia information is obtained from these two databases.
2) Java framework path:
frameworks/base/core/java/android/provider/MediaStore.java
All multimedia data can be extracted from the multimedia database it describes; database operations are performed by calling the related interfaces through a ContentResolver, as in the sketch below.
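The following lists the title and path of every video that MediaProvider has recorded in MediaStore (the log tag and class name are illustrative):

import android.content.Context;
import android.database.Cursor;
import android.provider.MediaStore;
import android.util.Log;

public class VideoQueryExample {
    public static void dumpVideos(Context context) {
        String[] projection = {
                MediaStore.Video.Media.TITLE,
                MediaStore.Video.Media.DATA   // absolute file path
        };
        Cursor cursor = context.getContentResolver().query(
                MediaStore.Video.Media.EXTERNAL_CONTENT_URI,
                projection, null, null,
                MediaStore.Video.Media.DATE_ADDED + " DESC");
        if (cursor == null) return;
        try {
            while (cursor.moveToNext()) {
                Log.d("VideoQueryExample", cursor.getString(0) + " -> " + cursor.getString(1));
            }
        } finally {
            cursor.close();
        }
    }
}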
frameworks/base/media/java/android/media/
Provides the operation interfaces for the multimedia application layer on Android. Note:
• MediaPlayer.java: provides playback control for video, audio, and data streams.
• MediaScanner*.java: provides the media scanning interfaces; scanned media are added to the database. This involves MediaScannerConnection.java and MediaScannerConnectionClient.java.
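A short sketch of the scanning interface named above, using the static convenience form MediaScannerConnection.scanFile() to scan one file and log the Uri the scanner assigns to it (the path argument is supplied by the caller):

import android.content.Context;
import android.media.MediaScannerConnection;
import android.net.Uri;
import android.util.Log;

public class ScannerConnectionExample {
    public static void scan(Context context, String path) {
        MediaScannerConnection.scanFile(
                context,
                new String[] { path },  // files to scan
                null,                   // let the scanner infer the MIME types
                new MediaScannerConnection.OnScanCompletedListener() {
                    @Override
                    public void onScanCompleted(String scannedPath, Uri uri) {
                        Log.d("ScannerConnectionExample", scannedPath + " -> " + uri);
                    }
                });
    }
}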
3) Java native calls (JNI):
frameworks/base/media/jni/
The JNI layer; the target generated after compilation is libmedia_jni.so.
• android_media_MediaPlayer.cpp: the JNI code for MediaPlayer. It defines a JNINativeMethod array gMethods that describes how the Java and native interfaces are associated, and defines JNIMediaPlayerListener, an implementation of the C++ MediaPlayerListener interface whose notify() method passes events from the native MediaPlayer (the C++ object that actually carries out playback control) back up to the Java layer.
• android_media_MediaScanner.cpp: the native-call implementation for media scanning; it handles processing of paths, files, and album content.
• soundpool/android_media_SoundPool.cpp: the native-call implementation for the sound-pool part of the audio system, including the callback method android_media_callback().
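The shape of the bridge between the Java class and libmedia_jni.so can be sketched as follows. This is a simplified schematic, not the platform source: the real MediaPlayer.java loads the library the same way, and its native methods are bound to the C++ functions registered through the JNINativeMethod table in android_media_MediaPlayer.cpp:

public class NativePlayerSkeleton {
    static {
        System.loadLibrary("media_jni"); // maps to libmedia_jni.so
    }

    // Bound on the C++ side through the JNINativeMethod table; the names here are illustrative.
    private native void nativeSetDataSource(String path);
    private native void nativePrepare();
    private native void nativeStart();

    public void play(String path) {
        nativeSetDataSource(path);
        nativePrepare();
        nativeStart();
    }
}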
4) Underlying multimedia library:
frameworks/base/include/media/, frameworks/base/media/libmedia/
The underlying multimedia library is compiled here into libmedia.so. This library is the core of the Android multimedia architecture and provides interfaces such as MediaPlayer and MediaScanner to the upper layer.
android.media.* is implemented by calling the libmedia.so interfaces through libmedia_jni.so.
A) The MediaPlayerInterface.h header file defines the underlying MediaPlayer interfaces and the following classes:
• MediaPlayerBase: the abstract base class of MediaPlayerInterface, containing the basic interfaces for audio output, video output, and playback control.
• MediaPlayerInterface and MediaPlayerHWInterface inherit from MediaPlayerBase and extend it for different output paths.
• MediaPlayerInterface provides a uniform playback interface; a new player implementation can be added by inheriting from MediaPlayerInterface.
B) IMediaPlayer.h defines the native playback class BnMediaPlayer; IMediaPlayer.cpp defines the BpMediaPlayer proxy class (which sends messages with the remote()->transact() method) and the concrete BnMediaPlayer::onTransact() method.
C) IMediaPlayerClient.h defines the client-side local class BnMediaPlayerClient; IMediaPlayerClient.cpp defines the BpMediaPlayerClient proxy class (whose notify() sends messages with the remote()->transact() method) and the BnMediaPlayerClient::onTransact() method.
D) IMediaPlayerService.h defines the local server class BnMediaPlayerService; IMediaPlayerService.cpp defines the BpMediaPlayerService proxy class (which sends messages with the remote()->transact() method) and the BnMediaPlayerService::onTransact() method.
E) mediaplayer.h defines the MediaPlayerListener class with its notify() method and the class MediaPlayer, which derives from BnMediaPlayerClient; mediaplayer.cpp mainly implements MediaPlayer's data setup for playback and the MediaPlayerListener notify() method.
5) Multimedia service:
frameworks/base/media/libmediaplayerservice/
The files are MediaPlayerService.h and MediaPlayerService.cpp.
This is the multimedia service part (it provides the server side on which MediaPlayer requests are executed, establishes connections with clients, sets data sources, and creates the appropriate player according to the media type); it compiles into libmediaplayerservice.so.
• MediaPlayerService.cpp registers a service named media.player through the instantiate() method, and MediaPlayer communicates with it through IPC;
• Different players are created according to the playerType;
• It implements notify() for notifying the client, the callbackThread() callback mechanism, and decode() for decoding.
frameworks/base/media/mediaserver/
The file main_mediaserver.cpp is the main program of the media server process; it starts AudioFlinger, AudioPolicyService, and MediaPlayerService.
6) MediaPlayer lifecycle:
4 Audio Concepts
Android's audio system is responsible for audio input/output and their management: playing PCM sound output, capturing PCM sound from outside, and managing sound devices and settings. It mainly involves AudioManager, AudioTrack, AudioService, and AudioRecord, and can be divided into the following layers:
(1) the native audio system interfaces provided by the media library;
(2) AudioFlinger, the middle layer of the audio system;
(3) the audio hardware abstraction layer, which provides the underlying support;
(4) the JNI and Java framework layers, which expose the audio interfaces to the upper layer.
Layer | Audio management | Audio output | Audio input
Java layer | android.media.AudioSystem | android.media.AudioTrack | android.media.AudioRecord
Native framework layer | AudioSystem | AudioTrack | AudioRecord
AudioFlinger layer | IAudioFlinger | IAudioTrack | IAudioRecord
Hardware abstraction layer | AudioHardwareInterface | AudioStreamOut | AudioStreamIn
AudioTrack.java, SoundPool.java: play audio resources of an Android application.
AudioRecord.java: provides the recording-settings interface (sample rate, channel, etc.) for Android applications.
AudioManager.java: provides audio volume control and playback mode settings (mute, vibrate, etc.).
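A small sketch of the PCM output path mentioned above: an AudioTrack in streaming mode playing a buffer of 16-bit mono samples (here a generated sine tone; the class name and parameters are illustrative):

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class ToneExample {
    public static void playTone(int sampleRate, double frequencyHz, int seconds) {
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);

        // Generate a sine tone as the PCM data to play.
        short[] samples = new short[sampleRate * seconds];
        for (int i = 0; i < samples.length; i++) {
            samples[i] = (short) (Math.sin(2 * Math.PI * frequencyHz * i / sampleRate)
                    * Short.MAX_VALUE);
        }

        track.play();                             // start consuming the stream
        track.write(samples, 0, samples.length);  // PCM data handed down toward AudioFlinger
        track.stop();
        track.release();
    }
}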
Note:
1) Audio driver (in the Linux kernel; differs from platform to platform)
2) Audio hardware abstraction layer: hardware/libhardware_legacy/include/hardware/
AudioHardwareInterface.h defines the audio hardware abstraction layer interface. Its three main classes are AudioStreamOut, AudioStreamIn, and AudioHardwareInterface, which implement audio output, input, and management respectively.
2.1) The key AudioStreamOut interface is write(const void* buffer, size_t bytes) and the key AudioStreamIn interface is read(void* buffer, size_t bytes); each takes a memory pointer and a length for the audio data being output or input.
2.2) AudioHardwareInterface uses the openOutputStream() and openInputStream() functions to obtain an AudioStreamOut and an AudioStreamIn.
2.3) The parameters used by AudioHardwareInterface are defined in AudioSystem.h; the setParameters and getParameters interfaces set and query them, and the system mode is set through setMode.
2.4) The policy manager AudioPolicyInterface is introduced into the audio system to separate the core audio path from the auxiliary functions.
3) Implementations under AudioFlinger
3.1) AudioHardwareGeneric: a generic implementation of the audio hardware abstraction layer on top of a specific driver.
3.2) AudioHardwareStub: a stub implementation of the audio hardware abstraction layer; its operations are no-ops, ensuring the system still runs when no audio device is present.
3.3) AudioDumpInterface: implements the audio hardware abstraction layer with files as input and output, simulating the input and output paths of the audio hardware streams with files.
Audio code distribution:
(1) Java part: frameworks/base/media/java/android/media/
The Java package related to audio is android.media, which mainly contains the AudioManager class and several audio system classes. This part provides the audio-related interfaces to upper-layer applications.
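A brief sketch of that upper-layer interface: AudioManager being used to read and change the music-stream volume and to switch the ringer mode (the helper class name is illustrative):

import android.content.Context;
import android.media.AudioManager;

public class VolumeExample {
    public static void raiseMusicVolume(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

        int max = am.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
        int current = am.getStreamVolume(AudioManager.STREAM_MUSIC);
        if (current < max) {
            am.adjustStreamVolume(AudioManager.STREAM_MUSIC,
                    AudioManager.ADJUST_RAISE, AudioManager.FLAG_SHOW_UI);
        }

        // Ringer mode covers the mute/vibrate settings mentioned for AudioManager.java.
        am.setRingerMode(AudioManager.RINGER_MODE_VIBRATE);
    }
}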
(2) JNI part: frameworks/base/core/jni/
The Android build generates libandroid_runtime.so, and the audio JNI code is part of it.
(3) Audio native framework
Header file path: frameworks/base/include/media/
Source path: frameworks/base/media/libmedia/
The native audio framework is part of the media library. It is compiled into libmedia.so, which provides the audio APIs (including the Binder-based IPC mechanism).
(4) AudioFlinger: frameworks/base/libs/audioflinger/
This part is compiled into libaudioflinger.so, which is the native service part of the audio system.