This article shows how to use the standard Android API (MediaCodec) for hardware video encoding and decoding. The example captures video from the camera, encodes it to H.264, decodes it, and displays it. I will try to be brief and clear and leave out irrelevant code. That said, I do not really recommend reading this article, or developing this kind of application at all; you would be better off writing some fish-poking, bird-flinging, feel-good game instead. Well, what follows is for the stubborn. After reading it you may well agree with me: Android is just a toy, and it is hard to build a reliable application on it.
1. Capturing video from the camera
Video data can be obtained through the callback of the camera preview.
First create the camera and set the parameters:
cam = Camera.open();
cam.setPreviewDisplay(holder);
Camera.Parameters parameters = cam.getParameters();
parameters.setFlashMode("off"); // no flash
parameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
parameters.setSceneMode(Camera.Parameters.SCENE_MODE_AUTO);
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
parameters.setPreviewFormat(ImageFormat.YV12);
parameters.setPictureSize(camWidth, camHeight);
parameters.setPreviewSize(camWidth, camHeight);
// these two sizes will cause an error if they differ from what the phone actually supports
cam.setParameters(parameters);
The width and height must be sizes the camera supports, or you will get an error. To enumerate all supported sizes, use getSupportedPreviewSizes(); I won't go into that here. Some say every parameter must be set and that missing even one causes an error, but in my tests setting only a few properties worked fine. Then start the preview:
buf = new byte[camWidth * camHeight * 3 / 2];
cam.addCallbackBuffer(buf);
cam.setPreviewCallbackWithBuffer(this);
cam.startPreview();
setPreviewCallbackWithBuffer is necessary; otherwise the system reallocates a new buffer for every callback, which is inefficient.
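Since the preview size must be one the camera actually reports, here is a minimal pure-Java sketch of picking the supported size closest to a requested one. The chooseSize helper and the hard-coded size list are my own illustration; on a real device the list would come from getSupportedPreviewSizes().

```java
public class SizePicker {
    // Pick the supported size whose pixel area is closest to the requested one.
    static int[] chooseSize(int[][] supported, int wantW, int wantH) {
        int[] best = supported[0];
        long bestDiff = Long.MAX_VALUE;
        for (int[] s : supported) {
            long diff = Math.abs((long) s[0] * s[1] - (long) wantW * wantH);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = s;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Stand-in for getSupportedPreviewSizes(): common camera preview sizes
        int[][] supported = {{176, 144}, {352, 288}, {640, 480}};
        int[] size = chooseSize(supported, 360, 280);
        System.out.println(size[0] + "x" + size[1]); // prints "352x288"
    }
}
```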
The raw frames arrive in onPreviewFrame (your class must, of course, implement PreviewCallback). Here we pass them to the encoder:
public void onPreviewFrame(byte[] data, Camera camera) {
    if (frameListener != null) {
        frameListener.onFrame(data, 0, data.length, 0);
    }
    cam.addCallbackBuffer(buf);
}
2. Encoding
First, initialize the encoder:
mediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat(type, width, height);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
Then feed it data as it arrives from the camera:
public void onFrame(byte[] buf, int offset, int length, int flag) {
    ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
    ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
    int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(buf, offset, length);
        mediaCodec.queueInputBuffer(inputBufferIndex, 0, length, 0, 0);
    }
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    while (outputBufferIndex >= 0) {
        ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
        if (frameListener != null)
            frameListener.onFrame(outputBuffer, 0, bufferInfo.size, flag);
        mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    }
}
The raw camera frames go in one end; the compressed frames that come out the other end are then fed to the decoder.
3. Decoding and display
First, initialize the decoder:
mediaCodec = MediaCodec.createDecoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat(mime, width, height);
mediaCodec.configure(mediaFormat, surface, null, 0);
mediaCodec.start();
By giving the decoder a Surface here, it can render the picture directly to the screen.
Then the data is processed:
public void onFrame(byte[] buf, int offset, int length, int flag) {
    ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
    int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(buf, offset, length);
        mediaCodec.queueInputBuffer(inputBufferIndex, 0, length, mCount * 1000000 / FRAME_RATE, 0);
        mCount++;
    }
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    while (outputBufferIndex >= 0) {
        mediaCodec.releaseOutputBuffer(outputBufferIndex, true);
        outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    }
}
The fourth parameter of queueInputBuffer is the presentation timestamp, in microseconds. It doesn't really matter what you pass, as long as it increases linearly over time, so here we just derive one from a frame counter. The second half of the code drains and releases the output buffers. Since we let the decoder render directly, we don't need the decoded data ourselves, but we must still release the buffers this way, or the decoder will keep holding them for you until memory runs out.
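The "linearly increasing" timestamp can be factored into a one-line helper. This is just the mCount * 1000000 / FRAME_RATE expression from the code above, kept as a pure function; the 15 fps used in main is an assumed frame rate, not one the article specifies.

```java
public class Timestamps {
    // presentationTimeUs for the n-th frame at a fixed frame rate:
    // frame n is shown n/frameRate seconds in, i.e. n * 1000000 / frameRate microseconds.
    static long presentationTimeUs(long frameIndex, int frameRate) {
        return frameIndex * 1000000L / frameRate;
    }

    public static void main(String[] args) {
        // At 15 fps, frame 15 lands exactly at the 1-second mark
        System.out.println(presentationTimeUs(15, 15)); // prints "1000000"
    }
}
```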
Well, at this point it is almost ready. If you are lucky, you can already see video, as on my Samsung phone. However, I have tried several other devices, and most of them fail with all sorts of problems. If you want to build a device-independent application, there are many more issues to solve. Let me describe some of the things I ran into:
1. Video size
Sizes like 176x144 and 352x288 are generally supported, but larger ones often are not: 640x480 fails on plenty of devices, and I don't know why. Of course, this size must match the camera preview size, and the preview sizes can be enumerated.
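One guess (my assumption, not something the article or the SDK confirms) is that H.264's 16x16 macroblocks are involved: some hardware encoders only accept dimensions that are multiples of 16, and the sizes that do work above all are. A trivial check:

```java
public class Alignment {
    // H.264 encodes in 16x16 macroblocks; some hardware encoders reject
    // dimensions that are not multiples of 16.
    static boolean isMacroblockAligned(int width, int height) {
        return width % 16 == 0 && height % 16 == 0;
    }

    public static void main(String[] args) {
        System.out.println(isMacroblockAligned(176, 144)); // prints "true"
        System.out.println(isMacroblockAligned(640, 360)); // prints "false"
    }
}
```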
2. Color space
According to the Android SDK documentation, the color formats guaranteed on all hardware platforms are YV12 for the camera preview output and COLOR_FormatYUV420Planar for the encoder input, which is what the code above sets. But documentation is only documentation; otherwise Android wouldn't be Android.
On some platforms the two color formats really are the same, and the camera's output can be fed straight into the encoder. On other platforms they differ: the former is YV12 while the latter is actually I420, so you have to swap the U and V planes. The following code does this; it is inefficient, but useful as a reference.
byte[] i420Bytes = null;

private byte[] swapYV12toI420(byte[] yv12Bytes, int width, int height) {
    if (i420Bytes == null)
        i420Bytes = new byte[yv12Bytes.length];
    // Y plane: identical in both formats
    for (int i = 0; i < width * height; i++)
        i420Bytes[i] = yv12Bytes[i];
    // U plane: in YV12 it sits after the V plane
    for (int i = width * height; i < width * height + (width / 2 * height / 2); i++)
        i420Bytes[i] = yv12Bytes[i + (width / 2 * height / 2)];
    // V plane: in YV12 it comes right after Y
    for (int i = width * height + (width / 2 * height / 2); i < width * height + 2 * (width / 2 * height / 2); i++)
        i420Bytes[i] = yv12Bytes[i - (width / 2 * height / 2)];
    return i420Bytes;
}
The hard part is that I don't know how to detect whether this conversion is needed. Android 4.3 reportedly avoids the problem by feeding the camera preview to the encoder through a Surface. There is an example at http://bigflake.com/mediacodec/CameraToMpegTest.java.txt; I haven't read it through, but it looks solid, so it's probably fine.
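One thing you can at least query is which input formats the encoder claims to accept, via MediaCodecInfo.CodecCapabilities.colorFormats. The sketch below keeps the check as plain Java: the constants 19 and 21 are the real COLOR_FormatYUV420Planar and COLOR_FormatYUV420SemiPlanar values from the SDK, and the hard-coded array in main is a stand-in for codecInfo.getCapabilitiesForType("video/avc").colorFormats on a device. Note this only tells you what the codec accepts, not how the camera's YV12 maps onto it, so it doesn't fully solve the detection problem.

```java
public class ColorFormats {
    // Values from MediaCodecInfo.CodecCapabilities in the Android SDK
    static final int COLOR_FormatYUV420Planar = 19;      // I420: Y plane, then U, then V
    static final int COLOR_FormatYUV420SemiPlanar = 21;  // NV12: Y plane, then interleaved UV

    // Does this codec accept planar I420 input?
    static boolean supportsI420(int[] colorFormats) {
        for (int f : colorFormats)
            if (f == COLOR_FormatYUV420Planar)
                return true;
        return false;
    }

    public static void main(String[] args) {
        // Stand-in for codecInfo.getCapabilitiesForType("video/avc").colorFormats
        int[] reported = {COLOR_FormatYUV420SemiPlanar, COLOR_FormatYUV420Planar};
        System.out.println(supportsI420(reported)); // prints "true"
    }
}
```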
3. Format of input and output buffers
The SDK does not specify a format for these buffers, but the encoder's output is basically H.264 Annex B. "Basically", because on some devices it comes without the start code (the 0x000001 prefix), so the stream the encoder produces can't even be decoded by the decoder on the same device. Fortunately, we can add it ourselves:
ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size + 3];
outputBuffer.get(outData, 3, bufferInfo.size);
if (frameListener != null) {
    if ((outData[3] == 0 && outData[4] == 0 && outData[5] == 1)
            || (outData[3] == 0 && outData[4] == 0 && outData[5] == 0 && outData[6] == 1)) {
        // the stream already has a start code; skip the 3 bytes we reserved
        frameListener.onFrame(outData, 3, outData.length - 3, bufferInfo.flags);
    } else {
        // no start code: prepend 0x000001 in the 3 reserved bytes
        outData[0] = 0;
        outData[1] = 0;
        outData[2] = 1;
        frameListener.onFrame(outData, 0, outData.length, bufferInfo.flags);
    }
}
4. Sometimes it hangs in dequeueInputBuffer(-1)
According to the SDK documentation, the parameter of dequeueInputBuffer is the time to wait, in microseconds: -1 means wait indefinitely, 0 means return immediately. Common sense says -1 should be fine, but in practice it hangs on a lot of devices. There's no way around it: pass 0 instead; dropping frames is better than hanging. Of course, you can also pass a specific number of microseconds, but it doesn't make much difference.
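The drop-frame compromise can be sketched as follows. The Codec interface here is a stand-in for the one MediaCodec method the pattern needs, so the logic can be shown (and run) without Android classes; on a device you would call mediaCodec.dequeueInputBuffer(0) directly.

```java
public class FeedOrDrop {
    // Stand-in for MediaCodec.dequeueInputBuffer
    interface Codec {
        int dequeueInputBuffer(long timeoutUs);
    }

    // Returns true if the frame was fed, false if it had to be dropped.
    static boolean feedOrDrop(Codec codec, Runnable feed) {
        int index = codec.dequeueInputBuffer(0); // 0 = return immediately, never block
        if (index < 0)
            return false; // no input buffer free: drop this frame instead of hanging
        feed.run();
        return true;
    }

    public static void main(String[] args) {
        Codec busy = timeoutUs -> -1; // codec with no free input buffer
        Codec free = timeoutUs -> 0;  // codec with input buffer 0 free
        System.out.println(feedOrDrop(busy, () -> {})); // prints "false"
        System.out.println(feedOrDrop(free, () -> {})); // prints "true"
    }
}
```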
After running into all the problems above, let me repeat my feeling: Android is a toy.