The simplest FFmpeg-based streamer (using RTMP push as an example)

This document records a simple FFmpeg-based streamer (simplest ffmpeg streamer). The streamer pushes local video data to a streaming media server. It can push local media files in MOV/AVI/MKV/MP4/FLV formats out as a live stream over a variety of streaming protocols (such as RTMP, HTTP, UDP, TCP, and RTP). Because there are so many streaming protocols, they are not all covered one by one; this article records how to push a local file to an RTMP streaming media server (such as Flash Media Server, Red5, or Wowza) as an RTMP live stream.

Based on this streamer, you can modify it in multiple ways to implement various kinds of streamers. For example:
* Change the input file to a network stream URL to implement a stream relay (forwarding) tool.
* Change the input file to a read callback (reading from memory) to push video data that is already in memory; a sketch of this follows the list.
* Change the input file to a system device (through libavdevice) and add encoding to implement a real-time streamer (live broadcast).
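
As a minimal sketch of the memory-input idea above (not part of the original program): FFmpeg can demux from memory by attaching a custom AVIOContext whose read callback supplies the data. The struct name mem_input, the callback read_mem, the variables file_data/file_size, and the 4096-byte I/O buffer are illustrative assumptions; the fragment assumes the same headers as the program below, plus <string.h>.

/* Sketch: demux from a memory buffer via a custom read callback. */
struct mem_input { const uint8_t *data; size_t size, pos; };

static int read_mem(void *opaque, uint8_t *buf, int buf_size)
{
	struct mem_input *in = (struct mem_input *)opaque;
	size_t left = in->size - in->pos;
	if (left == 0)
		return AVERROR_EOF;	//No more data: signal end of stream to the demuxer
	if ((size_t)buf_size > left)
		buf_size = (int)left;
	memcpy(buf, in->data + in->pos, buf_size);
	in->pos += buf_size;
	return buf_size;
}

//Attach the callback before avformat_open_input();
//file_data/file_size are assumed to hold the media bytes already in memory.
struct mem_input mem = { file_data, file_size, 0 };
unsigned char *avio_buf = (unsigned char *)av_malloc(4096);
AVIOContext *avio_ctx = avio_alloc_context(avio_buf, 4096, 0, &mem, read_mem, NULL, NULL);
ifmt_ctx = avformat_alloc_context();
ifmt_ctx->pb = avio_ctx;	//The demuxer now reads from memory instead of a file
ret = avformat_open_input(&ifmt_ctx, NULL, NULL, NULL);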

PS: this program does not include video transcoding; the input stream is remuxed and sent as-is.

Introduction

The role of the RTMP streamer in a streaming media system can be summarized as follows: the streamer first sends video data to a streaming media server (such as FMS, Red5, or Wowza) over RTMP; clients (usually Flash Player) then access the streaming media server to watch the stream in real time.


Before running this program, you must first start an RTMP streaming media server and create the corresponding application on it. Operations on the streaming media server are beyond the scope of this article and are not detailed here. After the program runs, you can use an RTMP client (such as Flash Player or ffplay) to watch the pushed live stream.
Note that RTMP uses FLV as its encapsulation (container) format. Therefore, when you specify the output stream, you must specify the format as "flv". Similarly, other streaming protocols also need an appropriate container format; for example, when pushing over UDP, you can specify "mpegts".
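
For reference, a minimal sketch of how the container is chosen when the output context is created (using the URLs that appear in the full program later in this article):

AVFormatContext *ofmt_ctx = NULL;
//RTMP: the container must be FLV
avformat_alloc_output_context2(&ofmt_ctx, NULL, "flv", "rtmp://localhost/publishlive/livestream");
//UDP: the container can be MPEG-TS
//avformat_alloc_output_context2(&ofmt_ctx, NULL, "mpegts", "rtp://233.233.233.233:6666");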

Delay. When streaming, data must be sent at a controlled pace. Otherwise, FFmpeg processes the file as fast as it can and would push all the data immediately, which a streaming media server cannot accept. Data must therefore be sent according to the actual frame rate of the video. The streamer recorded in this article calls av_usleep() between video frames to delay transmission, so that data is sent at the video's frame rate. The reference code is as follows.
//...
int64_t start_time = av_gettime();
while (1) {
	//...
	//Important: Delay
	if (pkt.stream_index == videoindex) {
		AVRational time_base = ifmt_ctx->streams[videoindex]->time_base;
		AVRational time_base_q = { 1, AV_TIME_BASE };
		int64_t pts_time = av_rescale_q(pkt.dts, time_base, time_base_q);
		int64_t now_time = av_gettime() - start_time;
		if (pts_time > now_time)
			av_usleep(pts_time - now_time);
	}
	//...
}
//...

PTS/DTS problem. Raw streams with no container (for example, a raw H.264 elementary stream) do not contain parameters such as PTS and DTS. When sending such data, the PTS, DTS, and duration of each AVPacket must be calculated and written by hand. I have not studied this in depth; I simply wrote some code, shown below.
//FIX: No PTS (Example: Raw H.264)
//Simple Write PTS
if (pkt.pts == AV_NOPTS_VALUE) {
	//Write PTS
	AVRational time_base1 = ifmt_ctx->streams[videoindex]->time_base;
	//Duration between 2 frames (us)
	int64_t calc_duration = (double)AV_TIME_BASE / av_q2d(ifmt_ctx->streams[videoindex]->r_frame_rate);
	//Parameters
	pkt.pts = (double)(frame_index * calc_duration) / (double)(av_q2d(time_base1) * AV_TIME_BASE);
	pkt.dts = pkt.pts;
	pkt.duration = (double)calc_duration / (double)(av_q2d(time_base1) * AV_TIME_BASE);
}
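
As a worked example with hypothetical numbers: for a 25 fps input whose time_base is 1/1000, calc_duration = 1000000 / 25 = 40000 microseconds, and each frame's pts then advances by 40000 / (0.001 × 1000000) = 40 time_base units, i.e. 40 ms per frame, which matches the frame interval.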

The flowchart of the program (figure omitted here) shows that it is similar to the converter in "The simplest FFmpeg-based encapsulation format converter (no encoding/decoding)". The obvious differences between them are:
1. The streamer's output is a URL rather than a file.
2. The streamer includes a delay (pacing) step.


The code is as follows.

/**
 * The simplest FFmpeg-based streamer (push RTMP)
 * simplest ffmpeg streamer (send rtmp)
 *
 * Lei Xiaohua
 * [email protected]
 * Communication University of China / Digital TV Technology
 * http://blog.csdn.net/leixiaohua1020
 *
 * This example streams a local media file to a streaming media
 * server (using RTMP as an example).
 * It is the simplest FFmpeg-based streamer.
 */

#include <stdio.h>
extern "C"
{
#include "libavformat/avformat.h"
#include "libavutil/mathematics.h"
#include "libavutil/time.h"
};

int main(int argc, char* argv[])
{
	AVOutputFormat *ofmt = NULL;
	//Input AVFormatContext and output AVFormatContext
	AVFormatContext *ifmt_ctx = NULL, *ofmt_ctx = NULL;
	AVPacket pkt;
	const char *in_filename, *out_filename;
	int ret, i;

	//in_filename  = "cuc_ieschool.mov";
	//in_filename  = "cuc_ieschool.mkv";
	//in_filename  = "cuc_ieschool.ts";
	//in_filename  = "cuc_ieschool.mp4";
	//in_filename  = "cuc_ieschool.h264";
	in_filename  = "cuc_ieschool.flv";//Input file URL
	out_filename = "rtmp://localhost/publishlive/livestream";//Output URL [RTMP]
	//out_filename = "rtp://233.233.233.233:6666";//Output URL [UDP]

	av_register_all();
	//Network
	avformat_network_init();
	//Input
	if ((ret = avformat_open_input(&ifmt_ctx, in_filename, 0, 0)) < 0) {
		printf("Could not open input file.");
		goto end;
	}
	if ((ret = avformat_find_stream_info(ifmt_ctx, 0)) < 0) {
		printf("Failed to retrieve input stream information");
		goto end;
	}

	int videoindex = -1;
	for (i = 0; i < ifmt_ctx->nb_streams; i++)
		if (ifmt_ctx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
			videoindex = i;
			break;
		}

	av_dump_format(ifmt_ctx, 0, in_filename, 0);

	//Output
	avformat_alloc_output_context2(&ofmt_ctx, NULL, "flv", out_filename);//RTMP
	//avformat_alloc_output_context2(&ofmt_ctx, NULL, "mpegts", out_filename);//UDP

	if (!ofmt_ctx) {
		printf("Could not create output context\n");
		ret = AVERROR_UNKNOWN;
		goto end;
	}
	ofmt = ofmt_ctx->oformat;
	for (i = 0; i < ifmt_ctx->nb_streams; i++) {
		//Create output AVStream according to input AVStream
		AVStream *in_stream = ifmt_ctx->streams[i];
		AVStream *out_stream = avformat_new_stream(ofmt_ctx, in_stream->codec->codec);
		if (!out_stream) {
			printf("Failed allocating output stream\n");
			ret = AVERROR_UNKNOWN;
			goto end;
		}
		//Copy the settings of AVCodecContext
		ret = avcodec_copy_context(out_stream->codec, in_stream->codec);
		if (ret < 0) {
			printf("Failed to copy context from input to output stream codec context\n");
			goto end;
		}
		out_stream->codec->codec_tag = 0;
		if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
			out_stream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
	}
	//Dump format------------------
	av_dump_format(ofmt_ctx, 0, out_filename, 1);
	//Open output URL
	if (!(ofmt->flags & AVFMT_NOFILE)) {
		ret = avio_open(&ofmt_ctx->pb, out_filename, AVIO_FLAG_WRITE);
		if (ret < 0) {
			printf("Could not open output URL '%s'", out_filename);
			goto end;
		}
	}
	//Write file header
	ret = avformat_write_header(ofmt_ctx, NULL);
	if (ret < 0) {
		printf("Error occurred when opening output URL\n");
		goto end;
	}

	int frame_index = 0;
	int64_t start_time = av_gettime();
	while (1) {
		AVStream *in_stream, *out_stream;
		//Get an AVPacket
		ret = av_read_frame(ifmt_ctx, &pkt);
		if (ret < 0)
			break;
		//FIX: No PTS (Example: Raw H.264)
		//Simple Write PTS
		if (pkt.pts == AV_NOPTS_VALUE) {
			//Write PTS
			AVRational time_base1 = ifmt_ctx->streams[videoindex]->time_base;
			//Duration between 2 frames (us)
			int64_t calc_duration = (double)AV_TIME_BASE / av_q2d(ifmt_ctx->streams[videoindex]->r_frame_rate);
			//Parameters
			pkt.pts = (double)(frame_index * calc_duration) / (double)(av_q2d(time_base1) * AV_TIME_BASE);
			pkt.dts = pkt.pts;
			pkt.duration = (double)calc_duration / (double)(av_q2d(time_base1) * AV_TIME_BASE);
		}
		//Important: Delay
		if (pkt.stream_index == videoindex) {
			AVRational time_base = ifmt_ctx->streams[videoindex]->time_base;
			AVRational time_base_q = { 1, AV_TIME_BASE };
			int64_t pts_time = av_rescale_q(pkt.dts, time_base, time_base_q);
			int64_t now_time = av_gettime() - start_time;
			if (pts_time > now_time)
				av_usleep(pts_time - now_time);
		}

		in_stream = ifmt_ctx->streams[pkt.stream_index];
		out_stream = ofmt_ctx->streams[pkt.stream_index];
		/* copy packet */
		//Convert PTS/DTS
		pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base, (AVRounding)(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
		pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base, (AVRounding)(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
		pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, out_stream->time_base);
		pkt.pos = -1;
		//Print to screen
		if (pkt.stream_index == videoindex) {
			printf("Send %8d video frames to output URL\n", frame_index);
			frame_index++;
		}
		//ret = av_write_frame(ofmt_ctx, &pkt);
		ret = av_interleaved_write_frame(ofmt_ctx, &pkt);
		if (ret < 0) {
			printf("Error muxing packet\n");
			break;
		}
		av_free_packet(&pkt);
	}
	//Write file trailer
	av_write_trailer(ofmt_ctx);
end:
	avformat_close_input(&ifmt_ctx);
	/* close output */
	if (ofmt_ctx && !(ofmt->flags & AVFMT_NOFILE))
		avio_close(ofmt_ctx->pb);
	avformat_free_context(ofmt_ctx);
	if (ret < 0 && ret != AVERROR_EOF) {
		printf("Error occurred.\n");
		return -1;
	}
	return 0;
}


Result

After the program starts running, it begins pushing the stream and prints the index of each video frame it sends. (Screenshot omitted.)


You can use a web player to play the pushed live streams.

For example, you can use the VideoPlayer sample in the samples folder of Flash Media Server to play the live stream. (Live stream address: rtmp://localhost/publishlive/livestream)


In addition, you can use a client such as ffplay to play a live stream.
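
For example, assuming the stream address used above, an ffplay invocation like the following should work:

ffplay "rtmp://localhost/publishlive/livestream"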

Download

SourceForge project homepage:
https://sourceforge.net/projects/simplestffmpegstreamer/

CSDN project:
http://download.csdn.net/detail/leixiaohua1020/8005311

