Using FFmpeg on Android to push a live video stream to the server

Background

In 2015, the clear new favorite of live-video platforms was outdoor broadcasting. With the spread of 4G networks and their growing coverage, a host can now broadcast from outdoors with nothing more than a mobile phone, and audiences are willing to pay for services that let them see the world this way. Against this background, this article shows how to capture video on an Android device and push the stream to a server.

Overview

Capture and streaming on Android mainly involve two classes. The Camera class from the Android API captures preview frames from the device camera, and the FFmpegFrameRecorder class from JavaCV then encodes each captured frame and pushes the stream to the server.
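The code excerpts that follow read and write a number of fields of the enclosing recording activity (LOG_TAG, imageWidth, recorder, and so on) without showing their declarations. The sketch below is an assumption about what those declarations might look like, not the original project's code: names follow the excerpts in this article, the initial values are placeholders, and the package names are those of recent JavaCV releases.

    import java.nio.ShortBuffer;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;

    import android.app.Activity;
    import android.hardware.Camera;
    import android.media.AudioRecord;

    // Sketch only: assumed skeleton of the recording activity that the later snippets belong to.
    public class RecordActivity extends Activity {

        private static final String LOG_TAG = "RecordActivity";

        // Placeholder RTMP address of the streaming server; replace with your own.
        private String ffmpeg_link = "rtmp://your.server/live/stream";

        // 0 means live streaming; a value > 0 means "buffer the last N seconds for file recording".
        private static final int RECORD_LENGTH = 0;

        private int sampleAudioRateInHz = 44100;
        private int imageWidth = 320;
        private int imageHeight = 240;
        private int frameRate = 30;

        private Camera cameraDevice;
        private CameraView cameraView;
        private boolean isPreviewOn = false;

        private FFmpegFrameRecorder recorder;
        private long startTime;
        private boolean recording = false;

        // Video ring buffer, used only when RECORD_LENGTH > 0
        private Frame yuvImage = null;
        private Frame[] images;
        private long[] timestamps;
        private int imagesIndex;

        // Audio capture
        private AudioRecord audioRecord;
        private AudioRecordRunnable audioRecordRunnable;
        private Thread audioThread;
        private volatile boolean runAudioThread = true; // volatile so the audio thread sees the stop flag
        private ShortBuffer[] samples;
        private int samplesIndex;

        // CameraView and AudioRecordRunnable (shown below) are inner classes of this activity,
        // which is how they access the fields declared above.
    }

With these fields in place, the snippets that follow can all be read as members of a single activity.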

Key steps and Code

Following the flow described above, the key steps of video capture are given below. The first is the initialization of the Camera class.

    // Initialize the camera device
    cameraDevice = Camera.open();
    Log.i(LOG_TAG, "camera open");
    cameraView = new CameraView(this, cameraDevice);
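The excerpt does not show how the CameraView is attached to the user interface. A common approach (an assumption here, not something taken from the original project) is to add it to a container view in the activity layout right after it is created; R.id.preview_container below is a hypothetical id:

    // Hypothetical wiring: show the camera preview by adding the CameraView to an
    // existing FrameLayout in the activity's layout (e.g. in onCreate()).
    final FrameLayout previewContainer = (FrameLayout) findViewById(R.id.preview_container); // assumed id
    previewContainer.addView(cameraView, new FrameLayout.LayoutParams(
            FrameLayout.LayoutParams.MATCH_PARENT,
            FrameLayout.LayoutParams.MATCH_PARENT));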

The CameraView class referenced above is our own implementation; it previews the camera capture and writes each captured frame to FFmpegFrameRecorder. The code is as follows:

class CameraView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback {

    private SurfaceHolder mHolder;
    private Camera mCamera;

    public CameraView(Context context, Camera camera) {
        super(context);
        Log.w("camera", "camera view");
        mCamera = camera;
        mHolder = getHolder();
        // Set the callback on the SurfaceView's SurfaceHolder
        mHolder.addCallback(CameraView.this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        // Set the callback for camera preview frames
        mCamera.setPreviewCallback(CameraView.this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            stopPreview();
            mCamera.setPreviewDisplay(holder);
        } catch (IOException exception) {
            mCamera.release();
            mCamera = null;
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        stopPreview();

        Camera.Parameters camParams = mCamera.getParameters();
        List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
        // Sort the list in ascending order
        Collections.sort(sizes, new Comparator<Camera.Size>() {
            public int compare(final Camera.Size a, final Camera.Size b) {
                return a.width * a.height - b.width * b.height;
            }
        });

        // Pick the first preview size that is equal or bigger, or pick the last (biggest)
        // option if we cannot reach the initial settings of imageWidth/imageHeight.
        for (int i = 0; i < sizes.size(); i++) {
            if ((sizes.get(i).width >= imageWidth && sizes.get(i).height >= imageHeight)
                    || i == sizes.size() - 1) {
                imageWidth = sizes.get(i).width;
                imageHeight = sizes.get(i).height;
                Log.v(LOG_TAG, "Changed to supported resolution: " + imageWidth + "x" + imageHeight);
                break;
            }
        }
        camParams.setPreviewSize(imageWidth, imageHeight);

        Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight
                + " frameRate: " + frameRate);

        camParams.setPreviewFrameRate(frameRate);
        Log.v(LOG_TAG, "Preview frameRate: " + camParams.getPreviewFrameRate());

        mCamera.setParameters(camParams);

        // Set the holder (which might have changed) again
        try {
            mCamera.setPreviewDisplay(holder);
            mCamera.setPreviewCallback(CameraView.this);
            startPreview();
        } catch (Exception e) {
            Log.e(LOG_TAG, "Could not set preview display in surfaceChanged");
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        try {
            mHolder.addCallback(null);
            mCamera.setPreviewCallback(null);
        } catch (RuntimeException e) {
            // The camera has probably just been released, ignore.
        }
    }

    public void startPreview() {
        if (!isPreviewOn && mCamera != null) {
            isPreviewOn = true;
            mCamera.startPreview();
        }
    }

    public void stopPreview() {
        if (isPreviewOn && mCamera != null) {
            isPreviewOn = false;
            mCamera.stopPreview();
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
            startTime = System.currentTimeMillis();
            return;
        }
        // When recording to a file, the frame is first stored in the in-memory ring buffer
        if (RECORD_LENGTH > 0) {
            int i = imagesIndex++ % images.length;
            yuvImage = images[i];
            timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
        }
        if (yuvImage != null && recording) {
            ((ByteBuffer) yuvImage.image[0].position(0)).put(data);

            if (RECORD_LENGTH <= 0) try {
                Log.v(LOG_TAG, "Writing Frame");
                long t = 1000 * (System.currentTimeMillis() - startTime);
                if (t > recorder.getTimestamp()) {
                    recorder.setTimestamp(t);
                }
                recorder.record(yuvImage);
            } catch (FFmpegFrameRecorder.Exception e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }
    }
}
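onPreviewFrame() writes the preview bytes into yuvImage and, when RECORD_LENGTH > 0, into the images/timestamps ring buffer, but the excerpt never shows where those buffers are allocated. A plausible allocation, assuming the Frame class from recent JavaCV versions (older versions used IplImage instead) and typically executed inside initRecorder() before the recorder is created, looks like this:

    // Sketch of the frame-buffer allocation assumed by onPreviewFrame().
    // Frame is org.bytedeco.javacv.Frame; DEPTH_UBYTE means 8 bits per sample, and the
    // 2 "channels" leave room for the NV21 preview data delivered by the camera.
    if (RECORD_LENGTH > 0) {
        imagesIndex = 0;
        images = new Frame[RECORD_LENGTH * frameRate];
        timestamps = new long[images.length];
        for (int i = 0; i < images.length; i++) {
            images[i] = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
            timestamps[i] = -1;
        }
    } else if (yuvImage == null) {
        // Live streaming only needs a single reusable frame.
        yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
    }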

Initializing the FFmpegFrameRecorder class

    // Create the recorder that encodes the frames and pushes the stream
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    // Set the video codec; 28 is the FFmpeg codec id of H.264
    recorder.setVideoCodec(28);
    recorder.setFormat("flv");
    // Set the audio sample rate
    recorder.setSampleRate(sampleAudioRateInHz);
    // Set the frame rate, i.e. the number of images per second
    recorder.setFrameRate(frameRate);
    // Audio capture thread
    audioRecordRunnable = new AudioRecordRunnable();
    audioThread = new Thread(audioRecordRunnable);
    runAudioThread = true;
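Two details of this initialization are worth spelling out. The literal 28 happens to be FFmpeg's codec id for H.264, but it is less fragile to use the named constant from the FFmpeg presets that ship with JavaCV, and for an FLV/RTMP stream the audio track is usually configured explicitly as well. The sketch below shows what that might look like; the avcodec class lives in org.bytedeco.ffmpeg.global in recent JavaCV releases (org.bytedeco.javacpp in older ones), so treat the exact import as an assumption to check against your version:

    // Sketch: the same recorder initialization with named constants instead of magic numbers.
    // ffmpeg_link is a placeholder RTMP address, e.g. "rtmp://your.server/live/streamName".
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    recorder.setFormat("flv");
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264); // instead of the literal 28
    recorder.setFrameRate(frameRate);
    recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);  // FLV over RTMP normally carries AAC audio
    recorder.setAudioChannels(1);                     // mono, matching the AudioRecord setup below
    recorder.setSampleRate(sampleAudioRateInHz);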

AudioRecordRunnable is our own implementation of the audio capture thread; the code is as follows:

class AudioRecordRunnable implements Runnable {

    @Override
    public void run() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

        // Audio
        int bufferSize;
        ShortBuffer audioData = null; // assigned below, depending on RECORD_LENGTH
        int bufferReadResult;

        bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

        // When recording to a file, a ring buffer covering RECORD_LENGTH seconds is needed
        if (RECORD_LENGTH > 0) {
            samplesIndex = 0;
            samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
            for (int i = 0; i < samples.length; i++) {
                samples[i] = ShortBuffer.allocate(bufferSize);
            }
        } else {
            // Live streaming only needs a buffer equivalent to one frame of audio data
            audioData = ShortBuffer.allocate(bufferSize);
        }

        Log.d(LOG_TAG, "audioRecord.startRecording()");
        audioRecord.startRecording();

        /* ffmpeg audio encoding loop */
        while (runAudioThread) {
            if (RECORD_LENGTH > 0) {
                audioData = samples[samplesIndex++ % samples.length];
                audioData.position(0).limit(0);
            }
            Log.v(LOG_TAG, "recording? " + recording);
            bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
            audioData.limit(bufferReadResult);
            if (bufferReadResult > 0) {
                Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
                // If "recording" isn't true when this thread starts, it never gets set
                // according to this if statement... Why? Good question...
                if (recording) {
                    // When live streaming, call recordSamples directly to write the audio
                    if (RECORD_LENGTH <= 0) try {
                        recorder.recordSamples(audioData);
                        //Log.v(LOG_TAG, "recording " + 1024*i + " to " + 1024*i+1024);
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG, e.getMessage());
                        e.printStackTrace();
                    }
                }
            }
        }
        Log.v(LOG_TAG, "AudioThread finished, releasing audioRecord");

        /* encoding finished, release the recorder */
        if (audioRecord != null) {
            audioRecord.stop();
            audioRecord.release();
            audioRecord = null;
            Log.v(LOG_TAG, "audioRecord released");
        }
    }
}

Next are the methods for starting and stopping the live stream.

// Start the live stream
public void startRecording() {
    initRecorder();
    try {
        recorder.start();
        startTime = System.currentTimeMillis();
        recording = true;
        audioThread.start();
    } catch (FFmpegFrameRecorder.Exception e) {
        e.printStackTrace();
    }
}

// Stop the live stream
public void stopRecording() {
    // Stop the audio thread
    runAudioThread = false;
    try {
        audioThread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    audioRecordRunnable = null;
    audioThread = null;

    if (recorder != null && recording) {
        // When recording to a file, the buffered frames are written out with their timestamps
        if (RECORD_LENGTH > 0) {
            Log.v(LOG_TAG, "Writing frames");
            try {
                int firstIndex = imagesIndex % samples.length;
                int lastIndex = (imagesIndex - 1) % images.length;
                if (imagesIndex <= images.length) {
                    firstIndex = 0;
                    lastIndex = imagesIndex - 1;
                }
                if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                    startTime = 0;
                }
                if (lastIndex < firstIndex) {
                    lastIndex += images.length;
                }
                for (int i = firstIndex; i <= lastIndex; i++) {
                    long t = timestamps[i % timestamps.length] - startTime;
                    if (t >= 0) {
                        if (t > recorder.getTimestamp()) {
                            recorder.setTimestamp(t);
                        }
                        recorder.record(images[i % images.length]);
                    }
                }

                firstIndex = samplesIndex % samples.length;
                lastIndex = (samplesIndex - 1) % samples.length;
                if (samplesIndex <= samples.length) {
                    firstIndex = 0;
                    lastIndex = samplesIndex - 1;
                }
                if (lastIndex < firstIndex) {
                    lastIndex += samples.length;
                }
                for (int i = firstIndex; i <= lastIndex; i++) {
                    recorder.recordSamples(samples[i % samples.length]);
                }
            } catch (FFmpegFrameRecorder.Exception e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        recording = false;
        Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
        try {
            recorder.stop();
            recorder.release();
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
        recorder = null;
    }
}
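In the activity, these two methods would typically be driven from the UI, for example by a single toggle button. The snippet below is only an illustration; the button id and its labels are assumptions, not part of the original project:

    // Hypothetical usage: toggle the live stream from a button click (e.g. in onCreate()).
    final Button btnRecord = (Button) findViewById(R.id.btn_record); // assumed view id
    btnRecord.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            if (!recording) {
                startRecording();
                btnRecord.setText("Stop");
            } else {
                stopRecording();
                btnRecord.setText("Start");
            }
        }
    });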

The above covers the key steps and code; the complete project is available as RtmpRecorder.

