Android Codec Integration and Video Overlay

The codec libraries are all shipped as prebuilt .so files in the prebuilt directory, so there is no source code for the multimedia-related modifications. Video overlay is implemented mainly on top of the FSL IPU low-level library: decoded data is sent directly to the hardware for compositing.
A. Codec Integration
1. Codec Integration Methods
First, to be clear: codec integration here means integrating a codec into the OpenCore framework. One method seen on the Internet is to load a codec library directly and call it through JNI; that approach is awkward, since you have to implement control, synchronization, and output yourself, so we will not discuss it. There are three ways to put a bare codec into the OpenCore framework:
A. Register an OpenMAX component with an existing Android OMX core, or provide your own OMX core.
B. Implement a PVMF-standard MIO (media input/output) that wraps the codec.
C. Implement a PVMF node that wraps the codec.
All three methods involve a pile of OpenCore terminology, so we have to digest these terms first; otherwise it is hard to build an intuitive picture. OpenCore is so huge that, within my limited energy, I have only studied the skeleton of its overall structure. My understanding is this: OpenCore really consists of two parts, a command flow and a data flow. The command flow is our player/author engine; the data flow lives in PVMF. The basic component mounted under PVMF is the node, a unit that implements one specific function, such as file parsing, codec, sink, and so on. The MIO mentioned above is actually a special node whose function is media input/output. The engine accepts commands from the upper layer and drives the nodes under PVMF; pvplayer/pvauthor are SDKs provided to Android on top of the engine. That is the working principle of OpenCore.
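This command-flow/data-flow split can be caricatured in a few lines of code. Everything below (the `Node`, `Engine`, `command`, `process` names) is invented purely for illustration and bears no relation to real OpenCore classes; it only shows that the engine issues commands while media data flows node to node.

```cpp
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Toy model: one unit in the PVMF chain (file parser, codec, sink, ...).
struct Node {
    std::string name;
    std::string state = "idle";
    explicit Node(std::string n) : name(std::move(n)) {}
    // Command path: the engine drives state transitions.
    void command(const std::string& cmd) { state = cmd; }
    // Data path: data flows between nodes, never through the engine.
    std::string process(const std::string& data) { return name + "(" + data + ")"; }
};

// Toy model of the player/author engine: it only issues commands.
struct Engine {
    std::vector<Node> nodes;  // e.g. parser -> codec -> sink
    void start() { for (auto& n : nodes) n.command("started"); }
    // In real OpenCore the data flow is PVMF's job, not the engine's;
    // it is folded in here only to keep the sketch short.
    std::string play(const std::string& bitstream) {
        std::string d = bitstream;
        for (auto& n : nodes) d = n.process(d);
        return d;
    }
};
```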
The first method is OMX encapsulation. FSL provides its HW codec library this way and ships its own OMX core. In other words, FSL implements the whole /external/opencore/codecs_v2 directory, although this company only provides a few .so binaries. To carry out a complete OMX-encapsulated codec port, prepare with the following material:
/external/opencore/doc/openmax_call_sequences.pdf
/external/opencore/doc/omx_core_integration_guide.pdf
http://omxil.sourceforge.net/docs/modules.html
Besides these specs and guides, the ready-made example is Android's own encapsulated OMX core, i.e. the contents of codecs_v2/omx. If you have a bare codec, wrapping it as an OMX component is, technically speaking, not hard to describe: the basic flow is to wrap it as OMX first and then wrap that as pv_omx. However, the OpenMAX IL spec is very complicated, so there may be a lot of work to do.
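The two-step wrapping (bare codec → OMX component → pv_omx adapter) can be sketched as three layers. Every type and function name below is invented for illustration; the real OpenMAX IL interface involves ports, buffer headers, a state machine, and callbacks, none of which this toy model attempts.

```cpp
#include <cassert>
#include <functional>
#include <string>

// Layer 0: a bare decoder, e.g. what a silicon vendor might hand over.
std::string bare_decode(const std::string& es) { return "pcm:" + es; }

// Layer 1: an "OMX-like" component hiding the bare codec behind a
// generic role + decode interface (vastly simplified versus real IL).
struct OmxLikeComponent {
    std::string role;  // e.g. "audio_decoder.mp3"
    std::function<std::string(const std::string&)> decode;
};

// Layer 2: a "pv_omx"-style adapter that an OpenCore decode node could
// drive; in this toy model it only forwards calls downward.
struct PvOmxAdapter {
    OmxLikeComponent* comp;
    std::string fill_output_buffer(const std::string& in) { return comp->decode(in); }
};
```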
A compiled OMX library can be dropped into the prebuilt directory the way FSL does it, together with a corresponding configuration file such as fslomx.cfg. Here is how such an encapsulated library gets loaded: all compiled libraries are stored in the /system/lib directory; Android reads every .cfg file under /etc and uses the UUID to decide whether a library is OMX-encapsulated. If the UUID matches, it loads the corresponding library from the lib directory. This involves an important file, /external/opencore/codecs_v2/omx/omx_mastercore/src/pv_omxmastercore.cpp, which handles the case of multiple OMX cores.
pv_omxmastercore.cpp also manages a priority question: when multiple OMX cores exist and each of them has, say, an MP3 decode component, which component should do the decoding? The master core makes the selection as follows:
A. Load the .cfg files in alphabetical order of file name; that is, fslomx.cfg is loaded before pvomx.cfg.
B. Use the UUID to determine whether a library is OMX-encapsulated; if it is, load the library and register all the components in its OMX core.
[In other words, a configuration file whose name sorts earlier is loaded earlier, and its components are registered earlier.]
C. For the role the application requires (such as MP3) and its required configuration, omxmastercore searches the registered components for one that meets the requirements; once found, that component is selected for decoding.
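Steps A–C boil down to: sort the .cfg names, register components in that order, then take the first registered component matching the requested role. A standalone simulation of that policy follows; the file names and role strings are made up, and the real logic lives in pv_omxmastercore.cpp.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Each entry stands for one (cfg file, supported role) pair contributed
// by an OMX core; a real core registers many components per .cfg.
using Registry = std::vector<std::pair<std::string, std::string>>;

// Pick the .cfg whose core wins the role, mirroring steps A-C above:
// alphabetical load order == registration order, and first match wins.
std::string select_core(Registry cores, const std::string& wanted_role) {
    std::sort(cores.begin(), cores.end());      // step A: alphabetical .cfg order
    for (const auto& c : cores)                 // steps B+C: first registered match
        if (c.second == wanted_role) return c.first;
    return "";                                  // no core offers this role
}
```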
Therefore, if you want your own codec to do the decoding, either make your configuration file name sort first, or delete the configuration files of the OMX cores you do not need. I experimented with removing the FSL codec and, separately, the Android codecs: with the Android video and audio codecs removed, video cannot be played, because FSL only provides hardware decoding of video, and when the application cannot find an audio decoder it reports an error outright. The FSL video codec itself is quite impressive; it calls the VPU interface under /external/fsl_imx_lib/vpu.
So in general, porting a codec is not difficult. A lazier route is also possible: implement only the component itself and register it with the existing Android OMX core; the registration lives in /external/opencore/codecs_v2/omx/omx_common/src/pv_omxregistry.cpp.
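The "lazy" route amounts to appending one entry to the component table that pv_omxregistry.cpp builds. The sketch below models that table as a role-to-factory map; all names are invented, and the real file registers PV's components with its own macros and types.

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>
#include <utility>

// Toy stand-in for the component table in pv_omxregistry.cpp:
// role string -> factory returning the component's library name.
using Factory = std::function<std::string()>;

std::map<std::string, Factory>& registry() {
    static std::map<std::string, Factory> r;
    return r;
}

// "Registering" your codec with the existing core = adding one entry.
void register_component(const std::string& role, Factory f) {
    registry().emplace(role, std::move(f));
}

// What the core does when an application asks for a role.
std::string create_for_role(const std::string& role) {
    auto it = registry().find(role);
    return it == registry().end() ? "" : it->second();
}
```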
Besides OMX encapsulation, I have not looked closely at the other two methods. PV provides no documentation for node-based integration, while MIO integration has a development document under doc/.
B. Video Overlay
Android video playback outputs through the ISurface interface, i.e. SurfaceFlinger composites the windows. Software compositing inevitably means low playback efficiency and high resource consumption. FSL instead implements hardware overlay for video playback, using the IPU for hardware compositing: after the VPU finishes decoding, the decoded data is sent directly to the IPU's overlay buffer.
Two low-level libraries are involved, libipu.so and libvpu.so: the VPU is responsible for decoding, the IPU for display. Two main changes are needed. First, obtain the data decoded by the VPU, which mainly involves the files under the following directory:
/external/opencore/nodes/pvomxbasedecnode/include/pvmf_omx_basedec_node.h
/external/opencore/nodes/pvomxbasedecnode/src/pvmf_omx_basedec_node.cpp
/external/opencore/nodes/pvomxvideodecnode/src/pvmf_omx_
Second, send the data to the overlay buffer. The modification is essentially an IPU call sequence, which can be referenced in:
/external/fsl_imx_lib/ipu/mxc_ipu_hl_lib.h
The specific modifications can be found in the following files:
/android/android_surface_output.cpp
/android/android_surface_output.h
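Conceptually, the modified surface-output sink stops blitting pixels in software and instead queues each decoded buffer straight to the IPU, which composites it in hardware while the VPU fills the next one. The toy model below shows only that buffer handoff; the frame and queue types are invented, and the real call sequence is the one declared in mxc_ipu_hl_lib.h.

```cpp
#include <cassert>
#include <deque>
#include <string>
#include <vector>

// Toy overlay path: buffers queued here belong to the "hardware" until
// a vsync consumes them, mimicking a double-buffered overlay.
struct IpuOverlay {
    std::deque<std::string> queue;       // buffers handed to the IPU
    std::vector<std::string> displayed;  // what actually hit the screen
    void queue_buffer(const std::string& buf) { queue.push_back(buf); }
    void vsync() {                       // IPU consumes one buffer per vsync
        if (!queue.empty()) {
            displayed.push_back(queue.front());
            queue.pop_front();
        }
    }
};

// The modified sink: no software compositing, just pass the decoded
// buffer straight through to the overlay.
void write_frame(IpuOverlay& ipu, const std::string& decoded) {
    ipu.queue_buffer(decoded);
}
```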