Introduction: This article is the result of some research into how the Android live assistant's screen recording works. Before writing it I knew very little about Android multimedia beyond the basic concepts, so I looked up the relevant material and summarized it here. Since my audio/video knowledge started from zero, I have also included some background reading:
Learning audio/video codec technology from scratch
Getting started with raw audio/video data processing: RGB and YUV pixel data
Capture raw audio data ----> compression encoding ----> encapsulation (muxing)
Capture raw video data ----> compression encoding ----> encapsulation (muxing)
Audio and Video Encoding
Compression encoding compresses the data to save space, making it easier to store and transmit.
Video compression encoding compresses the raw pixel data of video frames (RGB or YUV) into a video stream; encoders generally take YUV input. Common video coding schemes include H.264, MPEG-2, and MPEG-4.
Audio compression encoding compresses sampled PCM audio data into an audio stream. Common audio coding schemes include AAC, WMA, and MP3.
Encoding is one of the most important technologies in audio/video work, and also one of the most difficult. Fortunately, Android provides MediaCodec for audio and video encoding and decoding. For a given encoder you can specify the input frame (color) format, but not an arbitrary one, because hardware support is required; the formats an encoder actually supports can be queried through the API.
The createEncoderByType method only needs a MIME type to create the corresponding encoder. The MIME type takes forms such as "video/avc" (H.264) or "audio/mp4a-latm" (AAC).
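As a minimal sketch (the resolution, bit rate, and frame rate below are illustrative values, not values from this article), creating and configuring an H.264 encoder looks roughly like this:

```java
// Create an H.264 (AVC) encoder and configure it for encoding.
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);   // 4 Mbps
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);   // one key frame per second
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();
```

The supported color formats for a codec can be checked beforehand through MediaCodecInfo.CodecCapabilities, which is the query API mentioned above.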
The usage logic of the MediaCodec class, in a nutshell, is that an encoder processes input data and produces encoded output. It owns a set of input and output buffers. The client requests an empty input buffer from the encoder, fills it with data, and queues it back for processing. When encoding is complete, the encoder places the encoded data into an output buffer; the client only needs to dequeue that buffer, consume the encoded data, and finally release the buffer back to the encoder.
There are three ways to use MediaCodec:
1. Synchronous processing using buffer arrays (deprecated since API 21)
2. Synchronous processing using buffers
3. Asynchronous processing
The first two approaches are basically similar, except that the per-buffer synchronous mode performs better. Taking the first approach as an example:
When you want to end the encoding, set the BUFFER_FLAG_END_OF_STREAM flag on the last buffer containing valid data (or send an additional empty buffer carrying the flag), then submit it to the encoder with queueInputBuffer.
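A sketch of the buffer-array synchronous loop described above (the `frame` array, timestamps, and timeouts are placeholders, not code from this article):

```java
// Pre-API-21 style: synchronous encoding with buffer arrays.
// `encoder` is a configured and started MediaCodec.
ByteBuffer[] inputBuffers = encoder.getInputBuffers();
ByteBuffer[] outputBuffers = encoder.getOutputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();

int inIndex = encoder.dequeueInputBuffer(10_000);  // wait up to 10 ms
if (inIndex >= 0) {
    ByteBuffer inBuf = inputBuffers[inIndex];
    inBuf.clear();
    inBuf.put(frame);                               // raw YUV frame data
    // On the last frame, pass MediaCodec.BUFFER_FLAG_END_OF_STREAM instead of 0.
    encoder.queueInputBuffer(inIndex, 0, frame.length, presentationTimeUs, 0);
}

int outIndex = encoder.dequeueOutputBuffer(info, 10_000);
while (outIndex >= 0) {
    ByteBuffer outBuf = outputBuffers[outIndex];
    // ... consume info.size bytes of encoded data from outBuf ...
    encoder.releaseOutputBuffer(outIndex, false);   // return buffer to the codec
    outIndex = encoder.dequeueOutputBuffer(info, 0);
}
```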
In addition to using ByteBuffer as the input and output of MediaCodec, you can also use a Surface as the data carrier, in two ways: as an input Surface or as an output Surface.
Taking the input Surface as an example, use the createInputSurface() method to create one. The documentation says: "Requests a Surface to use as the input to an encoder, in place of input buffers." That is, this Surface replaces the input buffers as the encoder's input.
The encoder automatically reads frame data from the input Surface and encodes it. The input buffers cannot be accessed in this mode; calling methods such as getInputBuffers throws an exception. At the end of the stream, call signalEndOfInputStream; after that call, the Surface stops providing data to the encoder.
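The input-Surface flow can be sketched as follows (assuming `format` is a video MediaFormat as configured earlier):

```java
// Use a Surface instead of input buffers as the encoder's input.
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// Must be called after configure() and before start().
Surface inputSurface = encoder.createInputSurface();
encoder.start();
// ... render frames into inputSurface (e.g. from a VirtualDisplay) ...
encoder.signalEndOfInputStream(); // the Surface stops feeding the encoder
```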
Obviously, this method is very convenient when you do not need access to the raw audio/video data.
Audio and Video Muxing
Encapsulation (muxing) generally refers to merging the compression-encoded audio stream and video stream. There are many container formats, such as MP4 and MKV; their role is to package the encoded video and audio together in a defined layout. For example, an H.264-encoded video stream and an AAC-encoded audio stream can be combined into an MP4 file.
Android also provides MediaMuxer to merge encoded audio and video streams into an MP4 file.
The key code for using MediaMuxer is as follows.
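A minimal sketch of that key code (the output path is a placeholder; `videoFormat`/`audioFormat` are the formats the encoders report via INFO_OUTPUT_FORMAT_CHANGED, and `info` is the BufferInfo of each encoded sample):

```java
// Mux encoded video and audio streams into an MP4 file.
MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int videoTrack = muxer.addTrack(videoFormat);
int audioTrack = muxer.addTrack(audioFormat);
muxer.start();
// For every encoded buffer dequeued from an encoder's output:
muxer.writeSampleData(videoTrack, encodedVideoBuffer, info);
// ... and likewise writeSampleData(audioTrack, ...) for audio ...
muxer.stop();
muxer.release();
```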
Android 5.0 Screen Recording Scheme
Record the screen with MediaProjectionManager. The key code is as follows.
In onActivityResult, check whether permission to record the screen was granted, then perform the following steps.
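A sketch of the permission request (REQUEST_MEDIA_PROJECTION is an arbitrary request code defined by the app, not a platform constant):

```java
// Request screen-capture permission (API 21+).
MediaProjectionManager projectionManager = (MediaProjectionManager)
        getSystemService(Context.MEDIA_PROJECTION_SERVICE);
startActivityForResult(projectionManager.createScreenCaptureIntent(),
        REQUEST_MEDIA_PROJECTION);

// Later, in the Activity:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_MEDIA_PROJECTION && resultCode == RESULT_OK) {
        MediaProjection projection =
                projectionManager.getMediaProjection(resultCode, data);
        // ... create a VirtualDisplay from `projection` ...
    }
}
```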
createVirtualDisplay: "Creates a VirtualDisplay to capture the contents of the screen."
Its surface parameter determines where the captured screen frames go, i.e. the destination of the frame data. This routing is very important.
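A sketch of the call (width, height, and dpi are example values; `surface` may come from a SurfaceView for preview, from MediaCodec.createInputSurface() for direct encoding, or from ImageReader.getSurface() for raw frame access):

```java
// Route the captured screen content into a Surface.
VirtualDisplay display = projection.createVirtualDisplay(
        "screen-capture",        // debug name
        1280, 720, 320,          // width, height, dpi
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        surface,                 // destination of the captured frames
        null, null);             // callback, handler
```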
Related classes
ImageReader: "The ImageReader class allows direct application access to image data rendered into a Surface."
That is, the ImageReader class lets the app directly access image data rendered onto a Surface. Screen content captured with MediaProjectionManager is rendered onto the Surface passed to createVirtualDisplay, but a plain Surface gives us no access to the rendered content. So ImageReader is the tool for the case where you use a Surface but still need the raw video data stream: to read the contents of each frame, use ImageReader.
The class has a getSurface function that returns a Surface; frames rendered into this Surface become ImageReader's image/video-frame data.
When you pass this Surface as the surface parameter of the screen capture, ImageReader receives every frame rendered onto it, and you can obtain an Image object for each frame via acquireLatestImage().
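A sketch of that flow (the size and PixelFormat.RGBA_8888 are assumptions matching what a VirtualDisplay typically renders):

```java
// Read captured frames through an ImageReader.
ImageReader imageReader = ImageReader.newInstance(1280, 720,
        PixelFormat.RGBA_8888, 2);          // up to 2 frames in flight
Surface surface = imageReader.getSurface(); // pass this to createVirtualDisplay

imageReader.setOnImageAvailableListener(reader -> {
    Image image = reader.acquireLatestImage();
    if (image != null) {
        Image.Plane plane = image.getPlanes()[0];
        ByteBuffer pixels = plane.getBuffer(); // RGBA pixels of this frame
        int rowStride = plane.getRowStride();  // may exceed width * 4 (padding)
        // ... convert to YUV and feed the encoder / streaming SDK ...
        image.close(); // always release the buffer back to the reader
    }
}, handler);
```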
Image: "A single complete image buffer to use with a media source such as a MediaCodec or a CameraDevice."
That is, the Image class represents one frame buffer for use with MediaCodec or CameraDevice. From an Image object you can obtain the pixel data of the frame; here it is RGB(A) data, which must be converted to YUV format before being encoded by MediaCodec.
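The per-pixel RGB-to-YUV step can be sketched with the standard BT.601 integer approximation (the class and method names here are mine, chosen for illustration):

```java
// Convert one 8-bit RGB pixel to YUV (BT.601 integer approximation,
// limited-range Y in [16, 235]) -- the building block for converting
// a whole RGBA frame before handing it to the encoder.
public final class RgbToYuv {
    /** Returns {y, u, v} for the given 8-bit r, g, b values. */
    public static int[] rgbToYuv(int r, int g, int b) {
        int y = (( 66 * r + 129 * g +  25 * b + 128) >> 8) +  16;
        int u = ((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128;
        int v = ((112 * r -  94 * g -  18 * b + 128) >> 8) + 128;
        return new int[] { clamp(y), clamp(u), clamp(v) };
    }

    private static int clamp(int x) {
        return Math.max(0, Math.min(255, x));
    }
}
```

For a full frame you would run this over every pixel (skipping the row-stride padding) and subsample U/V 2x2 to produce I420 or NV21, whichever layout the encoder's color format expects.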
Local screen-recording scheme
With the background knowledge above, the two main features of the live assistant, local recording and stream pushing, become much easier to follow.
The logic of local recording: the local recorder does not need to manipulate the raw video data, so it uses an input Surface as the encoder's input.
Video: the Surface passed into the VirtualDisplay created by MediaProjection's createVirtualDisplay is the one returned by MediaCodec's createInputSurface method. This means the encoder's input comes directly from the captured screen data, so we only need to fetch the encoded ByteBuffers from MediaCodec's output buffers.
Audio: the recording code captures the raw PCM audio data, passes it to MediaCodec for encoding, and then fetches the encoded ByteBuffers from MediaCodec's output buffers.
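The PCM capture side can be sketched with AudioRecord (44100 Hz mono 16-bit is an illustrative configuration, not necessarily what the app uses):

```java
// Capture raw PCM and hand it to the audio encoder or streaming SDK.
int sampleRate = 44100;
int minBuf = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
        sampleRate, AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
recorder.startRecording();

byte[] pcm = new byte[minBuf];
int read = recorder.read(pcm, 0, pcm.length);
// ... queue `pcm` into the AAC MediaCodec's input buffer,
//     or pass it straight to the streaming SDK ...
```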
Finally, the muxing module combines the encoded audio and video streams.
Stream-pushing logic: the streaming SDK provides an encoder, tvlivesdk_liveencoder, which accepts raw video in YUV420 format and raw audio as PCM. Therefore we need the original frame data of the video, which can be obtained through ImageReader.
Video: the Surface passed into the VirtualDisplay created by MediaProjection's createVirtualDisplay is the one returned by ImageReader's getSurface method. This means the captured screen frames are delivered to ImageReader, so every frame can be read through the ImageReader API. The data arrives in RGB(A) format and is converted to YUV before being handed to the streaming SDK.
Audio: because the streaming SDK requires raw PCM audio data, the SDK is called directly with the audio data as it is recorded.
In short, everything comes down to routing the screen-capture data: whatever provides the Surface determines where the recorded video frames go. If the Surface comes from a local SurfaceView, the screen content is displayed on that view; if it comes from MediaCodec, the capture serves as the encoder's input and you end up with encoded data; if it comes from ImageReader, ImageReader is the data sink and you get the raw video stream.
The concrete architecture of the Android live assistant's screen recording and stream pushing
...
View Original: http://qhyuang1992.com/index.php/2016/08/08/android5_0_lu_ping_fang_an/