A project requirement called for a screen-recording live feature like Bilibili's, so I set out to imitate the basics. A bit of digging showed that Bilibili implements it with a combination of MediaProjection and VirtualDisplay, which requires Android 5.0 Lollipop (API 21).
In fact, Google's official android-ScreenCapture sample already shows how these APIs are used, and there is also a demo that uses MediaRecorder to record the screen to a local file; these are good places to start understanding the APIs.
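As a quick illustration of that second approach (my own sketch, not code from the sample), recording to a local file with MediaRecorder boils down to pointing a VirtualDisplay at the recorder's input Surface; the sizes, path, and dpi value below are made up for the example:

```java
// Sketch only: record the screen to a local MP4 with MediaRecorder.
// Assumes mediaProjection was already obtained via MediaProjectionManager.
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(1280, 720);
recorder.setVideoFrameRate(30);
recorder.setOutputFile("/sdcard/record.mp4"); // made-up path
recorder.prepare();

// Render the screen into the recorder's input Surface (valid after prepare()).
mediaProjection.createVirtualDisplay("record", 1280, 720, 1,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, recorder.getSurface(), null, null);
recorder.start();
// ... later: recorder.stop(); recorder.release();
```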
For a live push stream, though, you have to set up MediaCodec yourself and pull the encoded frames straight from it, which eliminates the separate raw-frame capture step and saves a lot of work. The catch was that I had never looked closely at the structure of H.264 streams or at FLV encapsulation, so I fell into quite a few pits. I am writing them down here in the hope that they help someone who needs them.
My most important reference in this project was Yrom's GitHub project ScreenRecorder, a demo that records the screen and writes the video stream to a local MP4 file (cough, is Yrom a Bilibili employee? (゜-゜)つロ). In this post I will first give a rough analysis of that demo; how I built my own implementation will be explained later.
ScreenRecorder
The underlying principle is already spelled out in the demo's README:
- A Display can be "projected" onto a VirtualDisplay.
- The VirtualDisplay is created from the MediaProjection obtained through MediaProjectionManager.
- The VirtualDisplay renders the screen image into a Surface, and that Surface is created by MediaCodec:
```java
mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
...
mSurface = mEncoder.createInputSurface();
...
mVirtualDisplay = mMediaProjection.createVirtualDisplay(name, mWidth, mHeight, mDpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, mSurface, null, null);
```
- MediaMuxer muxes the encoded data obtained from MediaCodec and writes it out to an MP4 file:
```java
int index = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_US);
...
ByteBuffer encodedData = mEncoder.getOutputBuffer(index);
...
mMuxer.writeSampleData(mVideoTrackIndex, encodedData, mBufferInfo);
```
In fact, even on Android 4.4 the screen can be recorded by creating a VirtualDisplay directly through DisplayManager, but the permission restrictions mean it requires root (see DisplayManager.createVirtualDisplay()).
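For reference, a sketch of what that DisplayManager route looks like (this createVirtualDisplay() overload exists since API 19; VIRTUAL_DISPLAY_FLAG_PUBLIC needs the system-only CAPTURE_VIDEO_OUTPUT permission, hence the root requirement):

```java
// Sketch: creating a VirtualDisplay directly via DisplayManager (API 19+).
// Only works for system/root apps because of the CAPTURE_VIDEO_OUTPUT permission.
DisplayManager dm = (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
VirtualDisplay display = dm.createVirtualDisplay("screen-mirror",
        width, height, dpi, surface, DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC);
```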
The demo is simple: just two Java files.
- MainActivity.java
- ScreenRecorder.java
MainActivity
This class is simply the entry point. The most important method is onActivityResult(), because that is where the MediaProjection has to be obtained. Just don't forget to initialize the MediaProjectionManager first.
```java
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    MediaProjection mediaProjection = mMediaProjectionManager.getMediaProjection(resultCode, data);
    if (mediaProjection == null) {
        Log.e("@@", "media projection is null");
        return;
    }
    // video size
    final int width = 1280;
    final int height = 720;
    File file = new File(Environment.getExternalStorageDirectory(),
            "record-" + width + "x" + height + "-" + System.currentTimeMillis() + ".mp4");
    final int bitrate = 6000000;
    mRecorder = new ScreenRecorder(width, height, bitrate, 1, mediaProjection, file.getAbsolutePath());
    mRecorder.start();
    mButton.setText("Stop Recorder");
    Toast.makeText(this, "Screen recorder is running...", Toast.LENGTH_SHORT).show();
    moveTaskToBack(true);
}
```
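For completeness, here is roughly how the demo reaches onActivityResult() in the first place; a minimal sketch, with REQUEST_CODE standing in for whatever constant the demo actually uses:

```java
// In onCreate(): grab the manager from the system service.
mMediaProjectionManager = (MediaProjectionManager)
        getSystemService(Context.MEDIA_PROJECTION_SERVICE);

// In the button's click listener: ask the user for screen-capture permission.
// The system dialog's result is delivered to onActivityResult() above.
Intent captureIntent = mMediaProjectionManager.createScreenCaptureIntent();
startActivityForResult(captureIntent, REQUEST_CODE);
```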
ScreenRecorder
This class is a Thread with a very clear structure: the run() method performs the initialization of the MediaCodec, the creation of the VirtualDisplay, and the whole encoding loop.
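The quit mechanism is just an atomic flag that the loop polls; reconstructed here from how mQuit is used further below:

```java
// Reconstructed from usage: the encoding loop polls this flag to exit cleanly.
private final AtomicBoolean mQuit = new AtomicBoolean(false);

public final void quit() {
    mQuit.set(true);
}
```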
Thread Body
```java
@Override
public void run() {
    try {
        try {
            prepareEncoder();
            mMuxer = new MediaMuxer(mDstPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        mVirtualDisplay = mMediaProjection.createVirtualDisplay(TAG + "-display",
                mWidth, mHeight, mDpi, DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC,
                mSurface, null, null);
        Log.d(TAG, "created virtual display: " + mVirtualDisplay);
        recordVirtualDisplay();
    } finally {
        release();
    }
}
```
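release() is not shown in this post. Roughly, it has to tear everything down; a sketch of what I expect it to do (not copied from the demo):

```java
// Sketch of the teardown: stop and release the encoder, display, projection and muxer.
private void release() {
    if (mEncoder != null) {
        mEncoder.stop();
        mEncoder.release();
        mEncoder = null;
    }
    if (mVirtualDisplay != null) {
        mVirtualDisplay.release();
    }
    if (mMediaProjection != null) {
        mMediaProjection.stop();
    }
    if (mMuxerStarted && mMuxer != null) {
        mMuxer.stop();
        mMuxer.release();
        mMuxer = null;
    }
}
```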
Initializing the MediaCodec
The two key steps in this method are configuring and starting the encoder, and creating the input Surface:
```java
private void prepareEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
    // this parameter is mandatory for screen recording: the input comes from a Surface
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, mBitRate);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
    Log.d(TAG, "created video format: " + format);
    mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
    mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // must be created after configure() and before start(); the source comments say so clearly
    mSurface = mEncoder.createInputSurface();
    Log.d(TAG, "created input surface: " + mSurface);
    mEncoder.start();
}
```
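The constants referenced above are not included in this excerpt. Plausible values, close to what I remember the demo using (treat them as assumptions), would be:

```java
// Assumed values for the constants used above; the demo's actual values may differ.
private static final String MIME_TYPE = "video/avc"; // H.264
private static final int FRAME_RATE = 30;            // fps
private static final int IFRAME_INTERVAL = 10;       // one key frame every 10 seconds
private static final int TIMEOUT_US = 10000;         // dequeue timeout: 10ms
```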
The encoding loop
The following code is the encoding loop. Because the author captures the video with MediaMuxer, what resetOutputFormat() actually does is hand the encoded video's format parameters to the muxer and start it.
```java
private void recordVirtualDisplay() {
    while (!mQuit.get()) {
        int index = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_US);
        Log.i(TAG, "dequeue output buffer index=" + index);
        if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            resetOutputFormat();
        } else if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
            Log.d(TAG, "retrieving buffers timed out!");
            try {
                // wait 10ms
                Thread.sleep(10);
            } catch (InterruptedException e) {
            }
        } else if (index >= 0) {
            if (!mMuxerStarted) {
                throw new IllegalStateException("MediaMuxer does not call addTrack(format)");
            }
            encodeToVideoTrack(index);
            mEncoder.releaseOutputBuffer(index, false);
        }
    }
}
```
```java
private void resetOutputFormat() {
    // should happen before receiving buffers, and should only happen once
    if (mMuxerStarted) {
        throw new IllegalStateException("output format already changed!");
    }
    // the SPS and PPS can also be obtained from this format, see getSpsPpsByteBuffer()
    MediaFormat newFormat = mEncoder.getOutputFormat();
    Log.i(TAG, "output format changed.\n new format: " + newFormat.toString());
    mVideoTrackIndex = mMuxer.addTrack(newFormat);
    mMuxer.start();
    mMuxerStarted = true;
    Log.i(TAG, "started media muxer, videoIndex=" + mVideoTrackIndex);
}
```
Getting the ByteBuffers for the SPS and PPS; note that the buffers obtained here are in a read-only state:
```java
private void getSpsPpsByteBuffer(MediaFormat newFormat) {
    ByteBuffer rawSps = newFormat.getByteBuffer("csd-0");
    ByteBuffer rawPps = newFormat.getByteBuffer("csd-1");
}
```
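Because the returned buffers are read-only, they have to be copied out before use. My understanding is that for video/avc these csd buffers are in Annex-B form, i.e. prefixed with the 0x00000001 start code; a sketch of copying them into byte arrays so they can later be sent to a streaming server:

```java
// Sketch: copy the read-only csd buffers into byte arrays.
// For "video/avc", csd-0/csd-1 normally carry the SPS/PPS with an Annex-B
// start code (0x00 0x00 0x00 0x01) prefixed; strip it if your packager
// (e.g. the FLV AVCDecoderConfigurationRecord) needs raw NAL units.
byte[] sps = new byte[rawSps.remaining()];
rawSps.get(sps);
byte[] pps = new byte[rawPps.remaining()];
rawPps.get(pps);
```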
Encoding the recorded video frames
BufferInfo.flags describes the buffer that was just dequeued, as the framework source comments explain:
```java
/**
 * This indicates that the (encoded) buffer marked as such contains
 * the data for a key frame.
 */
public static final int BUFFER_FLAG_KEY_FRAME = 1; // key frame

/**
 * This indicates that the buffer marked as such contains codec
 * initialization / codec specific data instead of media data.
 */
public static final int BUFFER_FLAG_CODEC_CONFIG = 2; // codec config data; for AVC this is where the SPS and PPS can be obtained

/**
 * This signals the end of stream, i.e. no buffers will be available
 * after this, unless of course, {@link #flush} follows.
 */
public static final int BUFFER_FLAG_END_OF_STREAM = 4;
```
Here is the encoding implementation:
```java
private void encodeToVideoTrack(int index) {
    ByteBuffer encodedData = mEncoder.getOutputBuffer(index);
    if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
        // The codec config data was pulled out and fed to the muxer when we got
        // the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it.
        // In other words, the config data (SPS/PPS) was already handed to the muxer in
        // resetOutputFormat() above, so it is of no use here. In my own project, however,
        // this is a crucial step, because I need to assemble the SPS and PPS myself and
        // send them to the streaming media server in advance.
        Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
        mBufferInfo.size = 0;
    }
    if (mBufferInfo.size == 0) {
        Log.d(TAG, "info.size == 0, drop it.");
        encodedData = null;
    } else {
        Log.d(TAG, "got buffer, info: size=" + mBufferInfo.size
                + ", presentationTimeUs=" + mBufferInfo.presentationTimeUs
                + ", offset=" + mBufferInfo.offset);
    }
    if (encodedData != null) {
        encodedData.position(mBufferInfo.offset);
        encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
        // encodedData is the encoded video frame; note that the author does not distinguish
        // key frames from ordinary frames here and writes all the data to the muxer as-is
        mMuxer.writeSampleData(mVideoTrackIndex, encodedData, mBufferInfo);
        Log.i(TAG, "sent " + mBufferInfo.size + " bytes to muxer...");
    }
}
```
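For a live push stream this distinction does matter, because the FLV video tag header must mark whether a frame is a key frame. A sketch of the check that becomes necessary (sendVideoFrame() is a placeholder for one's own packaging code, not a real API):

```java
// Sketch: when pushing a live stream, key frames must be flagged
// (FLV video tag frame type: 1 = key frame, 2 = inter frame).
boolean isKeyFrame = (mBufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
byte[] frame = new byte[mBufferInfo.size];
encodedData.get(frame);
sendVideoFrame(frame, mBufferInfo.presentationTimeUs, isKeyFrame); // placeholder
```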
That is my rough analysis of the ScreenRecorder demo. Since this summary was written in limited time, there are many details I did not dig into deeply, so please read it with a skeptical eye; if anything is explained wrongly or insufficiently, I hope you will help point it out. Thank you!
References
While developing this feature I also drew on a lot of valuable material and articles:
- Android screen live-streaming solution
- Google's official EncodeVirtualDisplayTest
- FLV file format parsing
- Using librtmp for H.264 and AAC live streaming
- More to follow...