When reprinting, please include both the source link at the top and the QR code at the end. This article is from countercurrent fish (yuiop): http://blog.csdn.net/hejjunlin/article/details/53183718
Preface: The previous posts covered Camera2: rendering the Camera2 preview onto a SurfaceView, recording video, and continuously refreshing the current image area. Going back further, to the earliest post on playing video with MediaPlayer, all of these depend on one important component: MediaCodec. Today's topic is MediaCodec itself. Here is the agenda:
- What is MediaCodec?
- Data types the codec operates on
- Compressed buffers
- Raw audio buffers
- Raw video buffers
- MediaCodec state cycle diagram
- Codec-specific data
- Codec data processing flow
- MediaCodec example
What is MediaCodec?
The MediaCodec class provides access to the low-level media codec framework (Stagefright or OpenMAX), that is, the encoder/decoder components. It is part of Android's low-level multimedia support infrastructure (usually used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface, and AudioTrack).
Broadly speaking, a codec processes input data to produce output data. It processes data asynchronously and uses a set of input and output buffers. At a simplified level, you request (or receive) an empty input buffer, fill it with data, and send it to the codec for processing. The codec consumes the data and transforms it into one of its empty output buffers. Finally, you request (or receive) a filled output buffer, consume its contents, and release it back to the codec.
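To make this buffer round trip concrete, here is a toy model in plain Java. This is illustrative only, not the real android.media.MediaCodec API: all class and method names are made up, and the "codec" here just passes buffers straight through from the input queue to the output queue.

```java
import java.util.ArrayDeque;

// Toy model of the MediaCodec buffer round trip described above:
// borrow an empty input buffer, fill it, queue it to the "codec",
// then dequeue the resulting output buffer and release it back.
public class BufferCycle {
    private final ArrayDeque<byte[]> freeInputs = new ArrayDeque<>();
    private final ArrayDeque<byte[]> pendingOutputs = new ArrayDeque<>();

    BufferCycle(int nBuffers, int size) {
        for (int i = 0; i < nBuffers; i++) freeInputs.add(new byte[size]);
    }
    byte[] dequeueInputBuffer()        { return freeInputs.poll(); }     // borrow, or null if none free
    void queueInputBuffer(byte[] b)    { pendingOutputs.add(b); }        // "codec" processes it
    byte[] dequeueOutputBuffer()       { return pendingOutputs.poll(); } // consume result
    void releaseOutputBuffer(byte[] b) { freeInputs.add(b); }            // back to the codec's pool

    public static void main(String[] args) {
        BufferCycle c = new BufferCycle(2, 16);
        byte[] in = c.dequeueInputBuffer();   // borrow an empty input buffer
        c.queueInputBuffer(in);               // hand it to the "codec"
        byte[] out = c.dequeueOutputBuffer(); // receive the processed buffer
        c.releaseOutputBuffer(out);           // give it back to the pool
        System.out.println(out == in);        // true: same buffer, recycled
    }
}
```

The point of the model is the ownership hand-off: a buffer belongs either to the client or to the codec, never both, which is exactly the discipline the real API enforces.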
Data types the codec operates on
A codec operates on three kinds of data: compressed data, raw audio data, and raw video data. All three can be processed using ByteBuffers, but for raw video data you should use a Surface to improve performance. A Surface uses native video buffers without mapping or copying them into ByteBuffers, so it is more efficient. When using a Surface you normally cannot access the raw video data, but you can use the ImageReader class to access the uncompressed decoded (native) video frames. This can still be more efficient than using ByteBuffers, as some native buffers may be mapped directly into ByteBuffers. In ByteBuffer mode, you can access raw video frames using the Image class and the getInput/OutputImage(int) methods.
Compressed buffers
Input buffers (for decoders) and output buffers (for encoders) contain compressed data of the format's type. For video this is a single compressed video frame. For audio data this is usually a single access unit (an encoded audio segment, typically a few milliseconds, as dictated by the format type), but this requirement is slightly relaxed: a buffer may contain multiple encoded audio access units. In both cases, buffers do not start or end on arbitrary byte boundaries, only on (frame/access) unit boundaries.
Raw audio buffers
Raw audio buffers contain entire frames of PCM audio data, one sample for each channel in channel order. Each sample is a 16-bit signed integer in native byte order. Below is an example of extracting the samples of one channel:
```java
short[] getSamplesForChannel(MediaCodec codec, int bufferId, int channelIx) {
    ByteBuffer outputBuffer = codec.getOutputBuffer(bufferId);
    MediaFormat format = codec.getOutputFormat(bufferId);
    ShortBuffer samples = outputBuffer.order(ByteOrder.nativeOrder()).asShortBuffer();
    int numChannels = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
    if (channelIx < 0 || channelIx >= numChannels) {
        return null;
    }
    short[] res = new short[samples.remaining() / numChannels];
    for (int i = 0; i < res.length; ++i) {
        res[i] = samples.get(i * numChannels + channelIx);
    }
    return res;
}
```
Raw video buffers
In ByteBuffer mode, video buffers are laid out according to their color format. You can get the array of supported color formats from getCodecInfo().getCapabilitiesForType(...).colorFormats. Video codecs may support three kinds of color formats:
- Native raw video format: marked by COLOR_FormatSurface and used with an input or output Surface.
- Flexible YUV buffers (such as COLOR_FormatYUV420Flexible): these can be used with an input/output Surface, and also in ByteBuffer mode, via the getInput/OutputImage(int) methods.
- Other, specific formats: these are usually only supported in ByteBuffer mode. Some are vendor-specific; the others are defined in MediaCodecInfo.CodecCapabilities. For color formats that are equivalent to a flexible format, you can still use getInput/OutputImage(int).
From LOLLIPOP_MR1 onwards, all video codecs support flexible YUV 4:2:0 buffers.
MediaCodec states
Conceptually, a codec's life cycle has three states: Stopped, Executing, or Released. The Stopped state is actually an aggregation of three states: Uninitialized, Configured, and Error, while the Executing state conceptually progresses through three sub-states: Flushed, Running, and End-of-Stream.
When you create a codec using one of the factory methods, the codec is in the Uninitialized state. First, you configure it via configure(...), which brings it to the Configured state, then call the start() method to move it to the Executing state. In this state you can process data through the buffer queues described above.
The Executing state has three sub-states: Flushed, Running, and End-of-Stream. Immediately after the start() call, the codec is in the Flushed sub-state, where it holds all of the buffers. As soon as the first input buffer is dequeued, the codec moves to the Running sub-state, where it spends most of its life. When you queue an input buffer with the end-of-stream flag, the codec transitions to the End-of-Stream sub-state. In this state the codec no longer accepts further input buffers, but still generates output buffers until end-of-stream is reached on the output. While in the Executing state, you can move back to the Flushed sub-state at any time by calling the flush() method.
Calling the stop() method returns the codec to the Uninitialized state, from which it can be configured again. When you are done using a codec, you must release it by calling the release() method.
In rare cases, the codec may encounter an error and move to the Error state. This is communicated via an invalid return value from a queuing operation, or sometimes via an exception. Call the reset() method to make the codec usable again: you can call it from any state to move the codec back to the Uninitialized state. Otherwise, call the release() method to move to the terminal Released state.
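As a compact way to check one's reading of this state diagram, here is a toy Java model of the transitions just described. It is illustrative only, not the real android.media.MediaCodec class; the method names mirror the API calls but the logic is just a lookup over the states named above.

```java
// Toy model of the MediaCodec life cycle described above. Illustrative only.
public class CodecLifecycle {
    enum State { UNINITIALIZED, CONFIGURED, FLUSHED, RUNNING, END_OF_STREAM, ERROR, RELEASED }

    static boolean executing(State s) {
        return s == State.FLUSHED || s == State.RUNNING || s == State.END_OF_STREAM;
    }
    static State configure(State s) { return s == State.UNINITIALIZED ? State.CONFIGURED : State.ERROR; }
    static State start(State s)     { return s == State.CONFIGURED ? State.FLUSHED : State.ERROR; }
    // Dequeuing the first input buffer moves Flushed -> Running.
    static State firstDequeue(State s) { return s == State.FLUSHED ? State.RUNNING : s; }
    // Queuing an input buffer flagged end-of-stream.
    static State queueEos(State s)  { return s == State.RUNNING ? State.END_OF_STREAM : State.ERROR; }
    // flush() is allowed anywhere within the Executing state.
    static State flush(State s)     { return executing(s) ? State.FLUSHED : State.ERROR; }
    static State stop(State s)      { return executing(s) ? State.UNINITIALIZED : State.ERROR; }
    // reset() recovers from any non-released state, including Error.
    static State reset(State s)     { return s == State.RELEASED ? State.ERROR : State.UNINITIALIZED; }

    public static void main(String[] args) {
        State s = start(configure(State.UNINITIALIZED));
        System.out.println(s); // FLUSHED: the sub-state right after start()
    }
}
```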
Codec-specific data
Some formats, notably AAC audio and the MPEG4, H.264, and H.265 video formats, require the actual data to be prefixed by a number of buffers containing setup data, or codec-specific data. When processing such compressed formats, this data must be submitted to the codec after calling start() and before any frame data. Such data must be flagged with BUFFER_FLAG_CODEC_CONFIG in the call to queueInputBuffer.
Codec-specific data can also be included in the MediaFormat passed to configure, as ByteBuffer entries under the keys "csd-0", "csd-1", and so on. These keys are always included in the track MediaFormat obtained from MediaExtractor. Codec-specific data in the format is automatically submitted to the codec on start(); you must not submit this data explicitly. If the format does not contain codec-specific data, you can choose to submit it yourself using the required number of buffers, in the correct order, as dictated by the format. In the case of H.264 AVC, you can also concatenate all the codec-specific data and submit it as a single codec-config buffer.
Android uses the following codec-specific data buffers. These also need to be set in the track format for proper MediaMuxer track configuration. Each parameter set and the codec-specific-data sections marked with (*) must start with a start code of "\x00\x00\x00\x01".
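As a small illustration of the start-code requirement, the helper below prefixes a parameter set with the mandated "\x00\x00\x00\x01" bytes. This is plain Java for demonstration; the class name is made up, and the SPS payload bytes in main are fake, for illustration only.

```java
import java.nio.ByteBuffer;

public class CsdBuffer {
    // Prefix a parameter set with the mandated "\x00\x00\x00\x01" start code.
    static ByteBuffer withStartCode(byte[] parameterSet) {
        ByteBuffer csd = ByteBuffer.allocate(4 + parameterSet.length);
        csd.put(new byte[]{0x00, 0x00, 0x00, 0x01}); // start code
        csd.put(parameterSet);
        csd.flip(); // make the buffer readable from the beginning
        return csd;
    }

    public static void main(String[] args) {
        // Fake SPS bytes, for illustration only; real ones come from your encoder/extractor.
        ByteBuffer csd0 = CsdBuffer.withStartCode(new byte[]{0x67, 0x42, 0x00, 0x1f});
        System.out.println(csd0.remaining()); // 8: 4 start-code bytes + 4 payload bytes
    }
}
```

A buffer built like this could be placed in a MediaFormat under "csd-0", or queued directly with BUFFER_FLAG_CODEC_CONFIG as described above.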
Note: Be careful if you flush the codec immediately after start(), before any output buffer or output-format change has been returned, as the codec-specific data may be lost during the flush. To ensure proper codec operation, you must resubmit the data after such a flush, using buffers flagged with BUFFER_FLAG_CODEC_CONFIG.
Encoders (and codecs that generate compressed data) will create and return codec-specific data before any valid output buffer, in output buffers flagged with the codec-config flag. Buffers containing codec-specific data carry no meaningful timestamps.
Codec data processing flow
Each codec maintains a set of input and output buffers that are referred to by a buffer ID in API calls. After a successful call to start(), the client "owns" neither the input nor the output buffers. In synchronous mode, call dequeueInput/OutputBuffer(...) to obtain (get ownership of) an input or output buffer from the codec. In asynchronous mode, you automatically receive available buffers via the MediaCodec.Callback.onInput/OutputBufferAvailable(...) callback methods.
When you obtain an input buffer, fill it with data and submit it to the codec using queueInputBuffer, or queueSecureInputBuffer if decryption is used. Do not submit multiple input buffers with the same timestamp (unless they contain codec-specific data, marked as such).
The codec in turn returns a read-only output buffer via the onOutputBufferAvailable callback in asynchronous mode, or in response to a dequeueOutputBuffer call in synchronous mode. After the output buffer has been processed, call one of the releaseOutputBuffer methods to return the buffer to the codec.
While you are not required to resubmit or release buffers to the codec immediately, holding on to input and/or output buffers may stall the codec, and this behavior is device-dependent. Specifically, the codec may defer generating output buffers until all outstanding buffers have been released or resubmitted. Therefore, try to hold on to available buffers for as short a time as possible.
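Putting the synchronous-mode calls above together, a decode loop typically looks like the following sketch. This is Android framework code (API 21+) and will not run outside Android; setup of the MediaExtractor, the MIME type, the MediaFormat, and the Surface, as well as error handling and output-format changes, are omitted for brevity.

```java
// Sketch of a synchronous MediaCodec decode loop; abbreviated, assumes
// `mime`, `format`, `surface`, and a prepared `extractor` already exist.
MediaCodec codec = MediaCodec.createDecoderByType(mime);
codec.configure(format, surface, null, 0);
codec.start();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean inputDone = false, outputDone = false;
while (!outputDone) {
    if (!inputDone) {
        int inId = codec.dequeueInputBuffer(10000); // 10 ms timeout
        if (inId >= 0) {
            ByteBuffer inBuf = codec.getInputBuffer(inId);
            int size = extractor.readSampleData(inBuf, 0);
            if (size < 0) { // no more samples: signal end-of-stream
                codec.queueInputBuffer(inId, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                codec.queueInputBuffer(inId, 0, size, extractor.getSampleTime(), 0);
                extractor.advance();
            }
        }
    }
    int outId = codec.dequeueOutputBuffer(info, 10000);
    if (outId >= 0) {
        // true = render this buffer to the configured Surface, then return it.
        codec.releaseOutputBuffer(outId, true);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) outputDone = true;
    }
}
codec.stop();
codec.release();
```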
Depending on the API version, you can process data in three ways:
Asynchronous processing using buffers
Since Lollipop, the preferred method is to process data asynchronously, by setting a callback before calling configure. Asynchronous mode changes the state transitions slightly, because after a flush() you must call the start() method to transition the codec to the Running sub-state and start receiving input buffers. Similarly, upon an initial call to start(), the codec moves directly to the Running sub-state and begins passing available input buffers via the callback.
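A sketch of the asynchronous setup follows. As above, this is Android framework code (API 21+) that will not run outside Android; `codec`, `format`, `surface`, and a prepared `extractor` are assumed to exist already, and real code would add synchronization and error handling.

```java
// Sketch: set the callback BEFORE configure() to enable asynchronous mode.
codec.setCallback(new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec mc, int inId) {
        ByteBuffer inBuf = mc.getInputBuffer(inId);
        int size = extractor.readSampleData(inBuf, 0);
        if (size < 0) { // no more samples: signal end-of-stream
            mc.queueInputBuffer(inId, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        } else {
            mc.queueInputBuffer(inId, 0, size, extractor.getSampleTime(), 0);
            extractor.advance();
        }
    }
    @Override
    public void onOutputBufferAvailable(MediaCodec mc, int outId, MediaCodec.BufferInfo info) {
        mc.releaseOutputBuffer(outId, true); // render to the Surface, then return the buffer
    }
    @Override
    public void onOutputFormatChanged(MediaCodec mc, MediaFormat newFormat) { /* handle if needed */ }
    @Override
    public void onError(MediaCodec mc, MediaCodec.CodecException e) { /* handle errors */ }
});
codec.configure(format, surface, null, 0);
codec.start(); // in async mode, moves directly to the Running sub-state
```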
Example
Previously we used MediaPlayer to play network and local video. Today we skip MediaPlayer and instead use MediaCodec to decode a video onto a SurfaceView; see the code below.
The example project has been uploaded to GitHub: https://github.com/hejunlin2013/multimediasample. Other examples related to the multimedia framework are also in this repository.
Android Multimedia Framework Summary (20): MediaCodec state cycle and the codec input/output buffer flow (with example)