To play a raw H.264 stream, the work splits into three tasks:
1. Decode the raw H.264 stream to get YUV data
2. Convert the YUV data to RGB and fill a picture buffer
3. Display the resulting picture
For task 1 we can use the HiSilicon decoding library directly. Because it is a C++ dynamic library, calling it from C# takes some interop work; the article "HiSilicon h264 Decoding library" covers that in detail. That post, however, only shows how to decode a single frame and does not cover the streaming decoding that is actually needed in practice, so I wrote a C# version of a streaming decoding routine based on the decoding library's reference documentation:
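Before that routine will compile, the native library has to be exposed to C# through P/Invoke. Below is a rough sketch of what the H264Dec wrapper and the two structs might look like; the DLL name, calling convention, constant values, and struct layouts are all assumptions that must be checked against the actual HiSilicon header:

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch of the managed wrapper assumed by the decode loop. The DLL name,
// calling convention and exact struct layouts are assumptions -- verify
// them against your copy of the HiSilicon header before use.
[StructLayout(LayoutKind.Sequential)]
public struct hiH264_DEC_ATTR_S
{
    public uint uPictureFormat;  // 0: planar YUV 4:2:0 output
    public uint uStreamInType;   // 0: raw Annex-B byte stream input
    public uint uPicWidthInMB;   // max width, in 16x16 macroblocks
    public uint uPicHeightInMB;  // max height, in 16x16 macroblocks
    public uint uBufNum;         // reference-frame buffer count
    public uint uWorkMode;       // output-mode / deinterlace bits
    public IntPtr pUserData;
    public uint uReserved;
}

[StructLayout(LayoutKind.Sequential)]
public struct hiH264_DEC_FRAME_S
{
    public IntPtr pY;            // luma plane
    public IntPtr pU;            // chroma U plane
    public IntPtr pV;            // chroma V plane
    public uint uWidth;          // frame width in pixels
    public uint uHeight;         // frame height in pixels
    public uint uYStride;        // bytes per luma row
    public uint uUVStride;       // bytes per chroma row
    // Further fields (cropping, PTS, flags) are omitted in this sketch;
    // the full layout must match the native header.
}

public static class H264Dec
{
    // Return codes -- values assumed, verify against the header.
    public const int HI_H264DEC_OK = 0;
    public const int HI_H264DEC_NEED_MORE_BITS = -1;
    public const int HI_H264DEC_NO_PICTURE = -2;

    [DllImport("hi_h264dec.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern IntPtr Hi264DecCreate(ref hiH264_DEC_ATTR_S pDecAttr);

    [DllImport("hi_h264dec.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern int Hi264DecFrame(IntPtr hDec, IntPtr pStream, uint uStreamLen,
        ulong ullPTS, ref hiH264_DEC_FRAME_S pDecFrame, uint uFlags);

    [DllImport("hi_h264dec.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern void Hi264DecDestroy(IntPtr hDec);
}
```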
// Initialization
// Decoder output frame information
hiH264_DEC_FRAME_S _decodeFrame = new hiH264_DEC_FRAME_S();
// Decoder attributes
hiH264_DEC_ATTR_S decAttr = new hiH264_DEC_ATTR_S();
decAttr.uPictureFormat = 0;
decAttr.uStreamInType = 0;
/* Maximum decodable image size, in 16x16-pixel macroblocks */
decAttr.uPicWidthInMB = (uint)width / 16;
decAttr.uPicHeightInMB = (uint)height / 16;
/* Maximum number of reference frames */
decAttr.uBufNum = 16;
/* bit0 = 1: standard output mode; bit0 = 0: fast output mode */
/* bit4 = 1: enable internal deinterlacing; bit4 = 0: disable it */
decAttr.uWorkMode = 0x10;
// Create and initialize a decoder handle
IntPtr _decHandle = H264Dec.Hi264DecCreate(ref decAttr);
// End-of-stream flag
bool isEnd = false;
int bufferLen = 0x8000;
// Bitstream segment buffer
byte[] buf = new byte[bufferLen];
while (!isEnd)
{
    // Accumulate a chunk of stream data before feeding the decoder
    if (streamBuf.Count >= bufferLen || isStop == 1)
    {
        byte tempByte;
        int j = 0;
        for (int i = 0; i < bufferLen; i++)
        {
            if (streamBuf.TryDequeue(out tempByte))
                buf[j++] = tempByte;
            else
                break;
        }
        IntPtr pData = Marshal.AllocHGlobal(j);
        Marshal.Copy(buf, 0, pData, j);
        int result = H264Dec.Hi264DecFrame(_decHandle, pData, (uint)j, 0, ref _decodeFrame, (uint)isStop);
        while (HI_H264DEC_NEED_MORE_BITS != result)
        {
            if (HI_H264DEC_NO_PICTURE == result)
            {
                isEnd = true;
                break;
            }
            if (HI_H264DEC_OK == result) /* one frame decoded */
            {
                // Fetch the YUV planes
                uint tempWid = _decodeFrame.uWidth;
                uint tempHeig = _decodeFrame.uHeight;
                uint yStride = _decodeFrame.uYStride;
                uint uvStride = _decodeFrame.uUVStride;
                byte[] y = new byte[tempHeig * yStride];
                byte[] u = new byte[tempHeig * uvStride / 2];
                byte[] v = new byte[tempHeig * uvStride / 2];
                Marshal.Copy(_decodeFrame.pY, y, 0, y.Length);
                Marshal.Copy(_decodeFrame.pU, u, 0, u.Length);
                Marshal.Copy(_decodeFrame.pV, v, 0, v.Length);
                // Repack into YV12 layout if a consumer needs it:
                //byte[] yuvBytes = new byte[y.Length + u.Length + v.Length];
                //Array.Copy(y, 0, yuvBytes, 0, y.Length);
                //Array.Copy(v, 0, yuvBytes, y.Length, v.Length);
                //Array.Copy(u, 0, yuvBytes, y.Length + v.Length, u.Length);
                // Update the display
                this.d3dSource.Render(_decodeFrame.pY, _decodeFrame.pU, _decodeFrame.pV);
            }
            /* Continue decoding the remaining H.264 data already inside the decoder */
            result = H264Dec.Hi264DecFrame(_decHandle, IntPtr.Zero, 0, 0, ref _decodeFrame, (uint)isStop);
        }
        // Release the unmanaged stream buffer
        Marshal.FreeHGlobal(pData);
    }
    System.Threading.Thread.Sleep(5);
}
/* Destroy the decoder */
H264Dec.Hi264DecDestroy(_decHandle);
For task 2 there are several options: implement the YUV-to-RGB conversion yourself, use the FFmpeg library to convert, or use D3D. The third is the most efficient because the conversion runs on the graphics card; for more detail see "D3D solution for YUV playback under WPF".
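Whichever route you pick, the per-pixel math being performed is the standard BT.601 YCbCr-to-RGB transform; in the common full-range form it is:

```latex
\begin{aligned}
R &= Y + 1.402\,(V - 128) \\
G &= Y - 0.344\,(U - 128) - 0.714\,(V - 128) \\
B &= Y + 1.772\,(U - 128)
\end{aligned}
```

Because the decoder outputs 4:2:0 data, each U/V sample is shared by a 2x2 block of luma pixels and must be upsampled (or simply replicated) before the transform is applied, with each result clamped to the 0..255 range.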
Task 3 is straightforward: hand the converted picture to an Image control (or a comparable view) for display.
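The decode loop above hands the planes straight to a d3dSource that renders into a WPF surface. If you instead produced an RGB buffer in task 2, a plain WriteableBitmap behind an Image control works too. A minimal sketch, where imageView, width, height and bgraBuffer are illustrative names rather than part of the original code:

```csharp
using System.Windows;
using System.Windows.Media;
using System.Windows.Media.Imaging;

// One-time setup: back the Image control with a writable bitmap.
var bitmap = new WriteableBitmap(width, height, 96, 96, PixelFormats.Bgra32, null);
imageView.Source = bitmap;

// Per decoded frame, on the UI thread: copy the converted BGRA pixels in.
// The stride is width * 4 because Bgra32 uses four bytes per pixel.
bitmap.WritePixels(new Int32Rect(0, 0, width, height), bgraBuffer, width * 4, 0);
```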
C#: playing a raw H.264 stream