Android Platform YUV Processing and Video Encoding

Video-related development is probably one of the most prominent parts of the entire Android ecosystem, and also the area of the Android API with the most fragmentation and compatibility problems. Google has exercised very weak control over the camera and video-encoding APIs, so different vendors' implementations of them differ considerably; and the API design itself leaves very limited room for optimization, to the point that some consider them "among the hardest Android APIs to use".

For example, to record a 540p MP4 file, Android generally follows this process: the YUV frames output by the camera are preprocessed and fed into the encoder, which produces an encoded H.264 video stream.

That covers only the video stream. The audio stream has to be recorded separately, and finally the video and audio streams are muxed into the final video file.

This article analyzes two common problems in video-stream encoding:

1. How to choose a video encoder (hardware or software)?
2. How to quickly preprocess (mirror, scale, rotate) the YUV frames output by the camera?

Selection of Video Encoders

For video recording, many apps need to process each frame individually, so they rarely hand recording directly to MediaRecorder. Generally speaking, there are two choices:

- MediaCodec
- FFmpeg + x264/OpenH264

Let's analyze them one by one.

MediaCodec

MediaCodec is a set of low-level audio/video codec APIs introduced by Google in API 16; it lets you use hardware acceleration for encoding and decoding directly.
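At its core, the API discussed below is built around a pair of buffer queues. As a rough mental model only (a plain-Java toy, not the real android.media.MediaCodec API; ToyCodecModel and its fake "encoding" step are inventions for illustration):

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Toy model of MediaCodec's two-queue design (NOT the real android.media API):
// the client takes a free input buffer, fills it with raw YUV data, queues it,
// and drains "encoded" buffers from the output side.
class ToyCodecModel {
    private final Queue<byte[]> freeInputBuffers = new ArrayDeque<>();
    private final Queue<byte[]> outputQueue = new ArrayDeque<>();

    ToyCodecModel(int bufferCount, int bufferSize) {
        for (int i = 0; i < bufferCount; i++) {
            freeInputBuffers.add(new byte[bufferSize]);
        }
    }

    // Analogous to dequeueInputBuffer(): hand out a free buffer, or null if none.
    byte[] dequeueInputBuffer() {
        return freeInputBuffers.poll();
    }

    // Analogous to queueInputBuffer(): "encode" the frame (faked here as a copy)
    // and place the result on the output queue; the input buffer becomes free again.
    void queueInputBuffer(byte[] buf, int length) {
        byte[] encoded = new byte[length]; // stand-in for an encoded H.264 chunk
        System.arraycopy(buf, 0, encoded, 0, length);
        outputQueue.add(encoded);
        freeInputBuffers.add(buf);
    }

    // Analogous to dequeueOutputBuffer(): drain one encoded buffer, or null if empty.
    byte[] dequeueOutputBuffer() {
        return outputQueue.poll();
    }
}
```

The real API works with buffer indices and flags rather than the buffers themselves, but the circulation pattern (dequeue input, fill, queue, dequeue output, release) is the same.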
To use it as a video encoder, you initialize MediaCodec, and then you only need to feed raw YUV data into it to get the encoded H.264 stream out directly. The entire API is designed around a model with two queues, one for input and one for output: as an encoder, the input queue holds the raw YUV data and the output queue produces the encoded H.264 stream; as a decoder, it is the opposite. MediaCodec provides both synchronous and asynchronous invocation styles, but the asynchronous callback was only added in API 21. Taking synchronous invocation as an example, the typical flow (paraphrasing the official example) is: obtain the input buffers via getInputBuffers(); call dequeueInputBuffer() to get the index of a free input buffer; submit the raw YUV data to the encoder via queueInputBuffer(); on the output side, likewise obtain the encoded H.264 stream via getOutputBuffers() and dequeueOutputBuffer() (note that dequeueOutputBuffer() has several special return values representing codec state changes); and, after processing the output data, return the buffer to the system via releaseOutputBuffer(), which puts it back into the output queue.

For a more complex MediaCodec usage example, refer to the CTS test EncodeDecodeTest.java:

https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java

As this shows, it is a very primitive API. Because MediaCodec directly drives the hardware codec of the phone's SoC, it is very fast; but because Google's control over the Android hardware ecosystem is weak, the API has quite a few problems:

1. Color Format Issues

When MediaCodec is initialized, configure() takes a MediaFormat object. Used as an encoder, we generally specify basic information in the MediaFormat such as the video width and height, frame rate, bitrate, and I-frame interval. One more important piece of information is the color format of the YUV frames the encoder accepts. Depending on the sampling ratio and the ordering of the U and V components, YUV comes in many different color formats, and the YUV frames the Android camera delivers in onPreviewFrame are, if nothing extra is configured, basically NV21. Google's design and specification of the MediaCodec API, however, sits too close to the Android HAL layer, with the unfortunate result that not every device's MediaCodec implementation supports NV21 as an encoder input format!

Therefore, when initializing MediaCodec, we need to query which YUV formats the device's implementation supports via MediaCodecInfo's getCapabilitiesForType(). In general, on 4.4+ systems, most devices support at least two formats: YUV420P (planar) and NV21 (semi-planar). If the device only supports YUV420P, the NV21 frames output by the camera must first be converted to YUV420P before being fed into the encoder; otherwise the final video will come out garbled or with confused colors.

This is a small pitfall: basically everyone who uses MediaCodec for video encoding runs into it.
2. Limited Encoder Feature Support

When encoding an H.264 stream with MediaCodec, there are several H.264 settings related to compression ratio and bitrate control, typically the profile (Baseline, Main, High), the profile level, and the bitrate mode (CBR, CQ, VBR). Configuring these reasonably lets us achieve a higher compression ratio at the same bitrate and thereby improve video quality. Android also provides corresponding APIs for these options, which can be set on the MediaFormat.

The problem is that for profile, level, and bitrate mode, most phones simply do not support the settings: even when set, they ultimately do not take effect. For example, you set the profile to High, and the final video still comes out as Baseline...

This problem is almost always reproducible on devices below Android 7.0. One likely reason is that Android hardcoded the profile setting in the platform source, and the hardcode was not removed until Android 7.0.

It can be said that this indirectly causes MediaCodec-encoded video to be of low quality: at the same bitrate, hardware encoding struggles to match the video quality of software encoders, or of iOS.

3. 16-Pixel Alignment Requirements

As mentioned earlier, the MediaCodec API is designed too close to the HAL layer; on many SoC implementations, the buffer handed to MediaCodec is passed straight to the SoC without any preprocessing. When encoding an H.264 stream, because the H.264 macroblock size is generally 16x16, if you configure a width or height that is not a multiple of 16, such as 960x540, on some SoCs the final encoded video will simply come out garbled!

This is obviously because the vendor's implementation of the API lacks validation and preprocessing of the incoming data. At present the problem occurs most frequently on Huawei and Samsung SoCs, and some early SoCs from other vendors have it as well. In general, the solution is to round the configured video width and height up to multiples of 16.
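The fix above amounts to rounding the requested dimensions up to the next multiple of 16 before configuring the encoder. A minimal helper (alignTo16 is a hypothetical name; assumes non-negative sizes):

```java
// Round a video dimension up to the next multiple of 16, matching the
// 16x16 macroblock size used by H.264 encoders.
class Align {
    static int alignTo16(int size) {
        // Adding 15 then clearing the low 4 bits rounds up to a multiple of 16.
        return (size + 15) & ~15;
    }
}
```

For example, alignTo16(540) returns 544, while 960 is already a multiple of 16 and is left unchanged, so a 960x540 request becomes a safe 960x544.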
