Stagefright Framework process explanation

1. Stagefright introduction
Android's multimedia engine was reworked in the Froyo release with the addition of the Stagefright framework. Android now chooses Stagefright by default, but OpenCORE is not abandoned completely: Stagefright mainly implements an OMX layer and still references the OMX component section of OpenCORE. Stagefright is added at the MediaPlayerService layer, in parallel with OpenCORE, and exists in Android as a shared library (libstagefright.so). Its AwesomePlayer module can be used to play video and audio, and AwesomePlayer provides many APIs that can be called by upper-layer applications (Java/JNI).
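For orientation, upper layers reach AwesomePlayer through the MediaPlayer facade: the Java android.media.MediaPlayer calls down through JNI to the C++ proxy in libmedia, and MediaPlayerService routes the calls to Stagefright. A minimal native sketch, assuming the Froyo-era libmedia API; the file path is a placeholder and the exact setDataSource signature varies across releases:

    #include <unistd.h>
    #include <media/mediaplayer.h>   // C++ MediaPlayer proxy (libmedia)

    using namespace android;

    int main() {
        // A real client must also initialize the binder thread pool so
        // callbacks from MediaPlayerService can be delivered.
        sp<MediaPlayer> player = new MediaPlayer();

        player->setDataSource("/sdcard/test.mp4", NULL);  // placeholder path, NULL headers
        player->prepare();   // builds DataSource -> MediaExtractor -> decoders
        player->start();     // kicks off AwesomePlayer's event loop

        sleep(10);           // crude stand-in for "wait while it plays"
        player->stop();
        return 0;
    }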

2. Stagefright data stream encapsulation
2.1 "mediaextractor is generated by the data source datasource. Mediaextractor: Create (datasource. The create method generates the corresponding mediaextractor (mediaextractor. cpp) in two steps ):
? Use datasource-> sniff to explore the struct Data Type
? Generate the corresponding extractor:
    if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MPEG4)
            || !strcasecmp(mime, "audio/mp4")) {
        return new MPEG4Extractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_MPEG)) {
        return new MP3Extractor(source, meta);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_AMR_NB)
            || !strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_AMR_WB)) {
        return new AMRExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_WAV)) {
        return new WAVExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_OGG)) {
        return new OggExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MATROSKA)) {
        return new MatroskaExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MPEG2TS)) {
        return new MPEG2TSExtractor(source);
    }
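The sniff step works by confidence scoring: every registered sniffer inspects the first bytes of the stream and proposes a MIME type with a confidence value, and the highest score selects the extractor. Below is a self-contained toy model of that dispatch, not the AOSP code; the sniffer names and scores are illustrative (though the 'ftyp' and 'OggS' magic bytes are real):

    #include <cstring>
    #include <string>
    #include <vector>

    struct SniffResult { std::string mime; float confidence; };
    using Sniffer = SniffResult (*)(const char *header, size_t size);

    static SniffResult sniffMP4(const char *h, size_t n) {
        // MP4/3GP files carry an 'ftyp' box whose tag sits at offset 4.
        if (n >= 8 && !memcmp(h + 4, "ftyp", 4)) return {"video/mp4", 0.4f};
        return {"", 0.0f};
    }

    static SniffResult sniffOgg(const char *h, size_t n) {
        if (n >= 4 && !memcmp(h, "OggS", 4)) return {"application/ogg", 0.2f};
        return {"", 0.0f};
    }

    // Returns the best-guess MIME type, or "" if no sniffer matched.
    std::string sniffBest(const char *header, size_t size) {
        const std::vector<Sniffer> sniffers = {sniffMP4, sniffOgg};
        SniffResult best{"", 0.0f};
        for (Sniffer s : sniffers) {
            SniffResult r = s(header, size);
            if (r.confidence > best.confidence) best = r;  // strongest match wins
        }
        return best.mime;
    }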

2.2 The audio and video tracks are separated, producing two MediaSource objects: mVideoTrack and mAudioTrack. The code is as follows (AwesomePlayer.cpp):
    if (!haveVideo && !strncasecmp(mime, "video/", 6)) {
        setVideoSource(extractor->getTrack(i));
        haveVideo = true;
    } else if (!haveAudio && !strncasecmp(mime, "audio/", 6)) {
        setAudioSource(extractor->getTrack(i));
        haveAudio = true;
    }
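For context, this if/else sits inside a loop over the extractor's tracks in AwesomePlayer's data-source setup. A sketch of that surrounding loop, assuming the Froyo-era MediaExtractor/MetaData API:

    for (size_t i = 0; i < extractor->countTracks(); ++i) {
        sp<MetaData> meta = extractor->getTrackMetaData(i);

        const char *mime;
        CHECK(meta->findCString(kKeyMIMEType, &mime));

        if (!haveVideo && !strncasecmp(mime, "video/", 6)) {
            setVideoSource(extractor->getTrack(i));
            haveVideo = true;
        } else if (!haveAudio && !strncasecmp(mime, "audio/", 6)) {
            setAudioSource(extractor->getTrack(i));
            haveAudio = true;
        }

        if (haveAudio && haveVideo) break;  // first audio + first video win
    }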

2.3 The MediaSource objects obtained so far only have the parser function; they cannot decode. They must be wrapped further to obtain two MediaSource objects that both parse and decode:
    mVideoSource = OMXCodec::Create(
            mClient.interface(), mVideoTrack->getFormat(),
            false, // createEncoder
            mVideoTrack,
            NULL, flags);

    mAudioSource = OMXCodec::Create(
            mClient.interface(), mAudioTrack->getFormat(),
            false, // createEncoder
            mAudioTrack);
After MediaSource::start() is called, the MediaSource fetches and parses data from the data source internally, stopping when its buffer is full. AwesomePlayer can then call the MediaSource's read() method to obtain decoded data.
- For mVideoSource, the data read via mVideoSource->read(&mVideoBuffer, &options) is handed to the display module for rendering with mVideoRenderer->render(mVideoBuffer), as the sketch after this list shows.
- For mAudioSource, an mAudioPlayer is used to wrap mAudioSource; mAudioPlayer is then responsible for reading data and controlling playback.
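Put together, the video side reduces to a read/render/release cycle. The following condensed sketch uses only the calls named above; the real code runs this from timed events and adds A/V synchronization and error handling:

    MediaBuffer *mVideoBuffer = NULL;
    MediaSource::ReadOptions options;

    for (;;) {
        status_t err = mVideoSource->read(&mVideoBuffer, &options);
        if (err != OK) break;                      // end of stream or error

        if (mVideoBuffer->range_length() > 0) {
            mVideoRenderer->render(mVideoBuffer);  // raw frame -> display
        }
        mVideoBuffer->release();                   // hand the buffer back
        mVideoBuffer = NULL;
    }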

3. Stagefright decoding
The two MediaSource objects obtained after the "data stream encapsulation" stage are actually two OMXCodec instances. Both AwesomePlayer and mAudioPlayer take data from a MediaSource for playback: AwesomePlayer ends up with raw video data to render, and mAudioPlayer ends up reading raw audio data to play. In other words, what is read from OMXCodec is already raw, decoded data.
How does OMXCodec turn the data source into raw data through the two steps, parse and decode? Start with the factory method OMXCodec::Create and its parameters (a usage sketch of the MetaData parameter follows this list):
- IOMX &omx refers to an OMXNodeInstance object.
- MetaData &meta is obtained from MediaSource::getFormat(). The main member of this object is a KeyedVector<uint32_t, typed_data> mItems, which stores name/value pairs describing the MediaSource's format information.
- bool createEncoder indicates whether this OMXCodec is an encoder or a decoder.
- MediaSource &source is here a track obtained from the MediaExtractor.
- char *matchComponentName, when set, specifies the codec to use for this OMXCodec.
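As mentioned for the meta parameter, MetaData is consumed through typed key lookups. A small usage sketch with keys that exist in stagefright's MetaData.h (kKeyMIMEType, kKeyWidth, kKeyHeight):

    sp<MetaData> meta = mVideoTrack->getFormat();

    const char *mime;
    CHECK(meta->findCString(kKeyMIMEType, &mime));  // e.g. "video/avc"

    int32_t width, height;
    CHECK(meta->findInt32(kKeyWidth, &width));      // coded frame width
    CHECK(meta->findInt32(kKeyHeight, &height));    // coded frame height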
First, findMatchingCodecs is used to find a suitable codec. Once found, a node is allocated for it on the IOMX and a notify-event observer is registered: omx->allocateNode(componentName, observer, &node). Finally, the IOMX is wrapped into an OMXCodec:
    sp<OMXCodec> codec = new OMXCodec(
            omx, node, quirks,
            createEncoder, mime, componentName,
            source);
In this way, the OMXCodec is obtained.
After AwesomePlayer obtains the OMXCodec, it calls mVideoSource->start() for initialization. OMXCodec initialization mainly does two things:
- Send the start command to OpenMAX: mOMX->sendCommand(mNode, OMX_CommandStateSet, OMX_StateIdle).
- Call allocateBuffers() to allocate two sets of buffers, stored in Vector<BufferInfo> mPortBuffers[2], for input and output respectively.
After AwesomePlayer starts playing, it reads data through mVideoSource->read(&mVideoBuffer, &options), which calls through to OMXCodec::read. OMXCodec::read implements the data reading in two steps (a toy model follows this list):
- Call drainInputBuffers() to fill mPortBuffers[kPortIndexInput]. This step completes the parse: the demuxed data read from the data source is placed into the input buffers as OpenMAX's input.
- Call fillOutputBuffers() to fill mPortBuffers[kPortIndexOutput]. This step completes the decode: OpenMAX decodes the data in the input buffers and writes displayable video data into the output buffers.
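The following self-contained toy model (illustrative names and types, not the AOSP implementation) shows the shape of that two-port pump: the input port receives demuxed access units, the component "decodes" them, and read() hands out one raw frame per call:

    #include <cstdint>
    #include <queue>
    #include <vector>

    struct Buffer { std::vector<uint8_t> data; };

    struct DecoderModel {
        std::queue<Buffer> inputPort;   // stands in for mPortBuffers[kPortIndexInput]
        std::queue<Buffer> outputPort;  // stands in for mPortBuffers[kPortIndexOutput]

        // "parse" step: feed demuxed access units to the component.
        void drainInputBuffers(std::queue<Buffer> &demuxed) {
            while (!demuxed.empty()) {
                inputPort.push(std::move(demuxed.front()));
                demuxed.pop();
            }
        }

        // "decode" step: turn each input unit into a raw frame.
        void fillOutputBuffers() {
            while (!inputPort.empty()) {
                Buffer raw = std::move(inputPort.front());  // a real codec decodes here
                inputPort.pop();
                outputPort.push(std::move(raw));
            }
        }

        // What the caller sees: one raw frame per read() call.
        bool read(Buffer *out, std::queue<Buffer> &demuxed) {
            if (outputPort.empty()) {
                drainInputBuffers(demuxed);
                fillOutputBuffers();
            }
            if (outputPort.empty()) return false;  // nothing left to decode
            *out = std::move(outputPort.front());
            outputPort.pop();
            return true;
        }
    };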
AwesomePlayer renders the parsed and decoded data through mVideoRenderer->render(mVideoBuffer). mVideoRenderer is actually an AwesomeRemoteRenderer wrapping an IOMXRenderer:
    mVideoRenderer = new AwesomeRemoteRenderer(
            mClient.interface()->createRenderer(
                    mISurface, component,
                    (OMX_COLOR_FORMATTYPE)format,
                    decodedWidth, decodedHeight,
                    mVideoWidth, mVideoHeight,
                    rotationDegrees));

4. Stagefright processing flow
AudioPlayer is a member of AwesomePlayer. AudioPlayer drives its data acquisition with audio callbacks, while AwesomePlayer drives its own with video events. The two share one abstraction: data acquisition is expressed as mSource->read(), and the parser and decoder are bound together inside that read(). In Stagefright's A/V synchronization, audio is entirely a callback-driven data stream; the video path obtains the audio timestamp in onVideoEvent and performs conventional timestamp-based A/V synchronization.
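The core of that synchronization is a signed lateness comparison against the audio clock. A minimal sketch with illustrative names and thresholds in the same spirit as AwesomePlayer's: frames far behind the audio clock are dropped, and early frames cause the event to be re-posted:

    #include <cstdint>

    enum class VideoAction { Render, WaitAndRetry, Drop };

    // frameTimeUs: the frame's media timestamp; audioClockUs: the current
    // audio position; both in microseconds.
    VideoAction scheduleFrame(int64_t frameTimeUs, int64_t audioClockUs) {
        const int64_t latenessUs = audioClockUs - frameTimeUs;

        if (latenessUs > 40000)  return VideoAction::Drop;          // far too late
        if (latenessUs < -10000) return VideoAction::WaitAndRetry;  // too early
        return VideoAction::Render;                                 // close enough
    }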
 
4.1 AwesomePlayer's video path mainly involves the following members:
- mVideoSource (decodes the video)
- mVideoTrack (reads video data from the multimedia file)
- mVideoRenderer (converts the decoded video format; Android displays RGB565)
- mISurface (the surface to redraw to)
- mQueue (the event queue)

4.2 A rough outline of the video path during Stagefright execution is as follows (a condensed sketch of the onVideoEvent cycle follows this list):
- Set the mUri path.
- Start mQueue, which creates a thread to run threadEntry (named TimedEventQueue; this thread is the event scheduler).
- Open the header of the file that mUri points to, and choose a matching extractor (for example, MPEG4Extractor) according to its type.
- Use MPEG4Extractor to separate the MP4 file's audio and video tracks, and return the video track (an MPEG4Source) to mVideoTrack.
- Choose a decoder according to the encoding type in mVideoTrack (AVCDecoder for the AVC encoding type) and return it to mVideoSource, whose mSource member is set to mVideoTrack.
- Insert onVideoEvent into the queue to start decoding and playback.
- Read a decoded video buffer through the mVideoSource object. If the buffer has not yet reached its A/V-sync timestamp, the operation is postponed to the next round.
- If mVideoRenderer is empty, initialize it (when OMX is not used, mVideoRenderer is set to AwesomeLocalRenderer).
- Use the mVideoRenderer object to convert the decoded video buffer to RGB565 and send it to the display module for rendering.
- Insert onVideoEvent into the event scheduler again, closing the loop.
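Condensed into code, one turn of that cycle looks roughly like the sketch below; reachedAvSyncTime and initRenderer_l are illustrative stand-ins for the real AwesomePlayer logic:

    void onVideoEventSketch() {
        MediaBuffer *buffer = NULL;
        if (mVideoSource->read(&buffer, NULL) != OK) {
            return;                          // end of stream or error
        }
        if (!reachedAvSyncTime(buffer)) {    // frame is early: try next round
            buffer->release();
            postVideoEvent_l(10000 /* us */);
            return;
        }
        if (mVideoRenderer == NULL) {
            initRenderer_l();                // AwesomeLocalRenderer when no OMX
        }
        mVideoRenderer->render(buffer);      // RGB565 conversion + display
        buffer->release();
        postVideoEvent_l();                  // loop: schedule the next frame
    }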

4.3 The data flow from the source to the final decoded output is as follows:

    URI, FD
       |
    DataSource
       |
    MediaExtractor
       |
    mVideoTrack / mAudioTrack        // audio and video data streams
       |
    mVideoSource / mAudioSource      // audio/video decoders
       |
    mVideoBuffer / mAudioPlayer
Notes:
- Set the DataSource. There are two kinds of data source: URI and FD. A URI can be http://, rtsp://, and so on; an FD is a descriptor of a local file, through which the file can be located.
- The MediaExtractor is generated from the DataSource via sp<MediaExtractor> extractor = MediaExtractor::Create(dataSource). MediaExtractor::Create(dataSource) creates different data-reading objects depending on the data content.
- Through setVideoSource and setAudioSource, the MediaExtractor's output is split into an audio data stream (mAudioTrack) and a video data stream (mVideoTrack).
- In onPrepareAsyncEvent(), if the DataSource is a URL, data is fetched from that address and buffered until mVideoTrack and mAudioTrack are obtained. initVideoDecoder() and initAudioDecoder() are then called on them to generate the decoders mVideoSource and mAudioSource, and postBufferingEvent_l() is called to submit the buffering event.
- The buffering event's handler is onBufferingUpdate(). Once the buffer holds enough data for playback, play_l() is called. The key step in play_l() is postVideoEvent_l(), which submits mVideoEvent; when that event runs, onVideoEvent() is called, and it calls mVideoSource->read(&mVideoBuffer, &options) to decode video. Audio decoding is driven through mAudioPlayer.
- After decoding, mVideoSource->read stores one frame of data in mVideoBuffer, and mVideoRenderer->render(mVideoBuffer) finally sends the video data to the display module. To pause or stop, call cancelPlayerEvents to submit an event that stops decoding, and choose whether to continue buffering data.

5. Tracing the code with log marks
The video playback flow described in Section 4 can be traced by inserting log marks to follow the video data acquisition and codec paths. Start with the methods in AwesomePlayer.cpp. The procedure is as follows (an example of such a mark follows the list):
- In the modified /mydroid/frameworks/base/media/libstagefright/ directory, compile with mm and debug until the corresponding .so file is generated. Note: to compile a single module, first source ../build/envsetup.sh under /mydroid.
- Run make in the /mydroid/ directory to generate the system.img file. Note: the advantage of compiling the single module first is a shorter debug/compile cycle.
- Copy the system.img file to /android-sdk-linux/platforms/android-8/. Note: back up the original system.img in advance.
- Start the emulator with an sdcard, run ./adb shell under /android-sdk-linux/tools/, and then run logcat.
- Open Gallery, select a video file to play, and watch the log output alongside.
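The "by Jay remarked" lines in the log below come from marks inserted by hand into AwesomePlayer.cpp. A sketch of such a mark, assuming the Froyo-era logging macro LOGE from utils/Log.h (renamed ALOGE in later releases):

    #define LOG_TAG "AwesomePlayer"
    #include <utils/Log.h>

    AwesomePlayer::AwesomePlayer() {  // initializer list trimmed for brevity
        LOGE("Beginning AwesomePlayer... by Jay remarked...");
        // ... original constructor body unchanged ...
        LOGE("Ending AwesomePlayer... by Jay remarked...");
    }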

The resulting log output is as follows:
    I/ActivityManager(61): Starting: Intent { act=android.intent.action.VIEW dat=content://media/external/video/media/5 typ=video/mp4 cmp=com.cooliris.media/.MovieView } from pid 327
    I/RenderView(327): onPause RenderView [email protected]
    E/AwesomePlayer(34): Beginning AwesomePlayer... by Jay remarked...
    E/AwesomePlayer(34): Returning AwesomeEvent... by Jay remarked...
    E/AwesomePlayer(34): Returning AwesomeEvent... by Jay remarked...
    E/AwesomePlayer(34): Returning AwesomeEvent... by Jay remarked...
    E/AwesomePlayer(34): Returning AwesomeEvent... by Jay remarked...
    E/AwesomePlayer(34): Ending AwesomePlayer... by Jay remarked...
    E/AwesomePlayer(34): setting video source now... by Jay remarked...
    E/AwesomePlayer(34): setting video type... by Jay remarked...
    E/AwesomePlayer(34): Returning AwesomeEvent... by Jay remarked...
    E/AwesomePlayer(34): Beginning initVideoDecoder by Jay remarked...
    D/MediaPlayer(327): getMetadata
    I/ActivityManager(61): Displayed com.cooliris.media/.MovieView: +1s761ms
    E/AwesomePlayer(34): Beginning AwesomeLocalRenderer init... by Jay remarked...
    E/AwesomePlayer(34): Returning open(libstagefrighthw.so) correctly by Jay remarked...
    E/MemoryHeapBase(34): error opening /dev/pmem_adsp: No such file or directory
    I/SoftwareRenderer(34): Creating physical memory heap failed, reverting to regular heap.
    E/AwesomePlayer(34): Ending AwesomeLocalRenderer init close... by Jay remarked...
    E/AwesomePlayer(34): Returning AwesomeLocalRenderer... by Jay remarked...
    I/CacheService(327): Starting CacheService
