The Simplest FFmpeg-based Streamer (Taking RTMP Push as an Example)
This document records a simple FFmpeg-based streamer (simplest ffmpeg streamer). The streamer pushes local video data to a streaming media server. It can use a streaming media protocol (such as RTMP, HTTP, UDP, TCP, or RTP) to push a local media file in MOV/AVI/MKV/MP4/FLV or another format out as a live stream. Because there is a wide variety of streaming media protocols, they are not covered one by one. This article records how to push a local file to an RTMP streaming media server (such as Flash Media Server, Red5, or Wowza) as an RTMP live stream.
Based on this streamer, you can modify it in multiple ways to implement various types of streamers. For example:
* You can change the input file to a network stream URL to implement a tool that forwards network streams.
* You can change the input file to a callback function (reading from memory) to push video data held in memory; see the sketch after this list.
* You can change the input file to a system device (through libavdevice) and add an encoding step to implement a real-time streamer (for live broadcast).
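As an illustration of the second modification, here is a minimal sketch of feeding input from memory through a custom AVIOContext. This sketch is not part of the streamer's source; the fill_buffer callback and the buffer size are hypothetical placeholders.

```cpp
//A minimal sketch of reading input from memory (assumption: fill_buffer
//and the 32 KB buffer size are hypothetical placeholders)
int fill_buffer(void *opaque, uint8_t *buf, int buf_size) {
	//Copy up to buf_size bytes of in-memory video data into buf here;
	//return the number of bytes written, or AVERROR_EOF when the data ends.
	return AVERROR_EOF;
}

//...inside main(), before avformat_open_input():
unsigned char *iobuf = (unsigned char *)av_malloc(32768);
AVIOContext *avio_in = avio_alloc_context(iobuf, 32768, 0, NULL, fill_buffer, NULL, NULL);
ifmt_ctx = avformat_alloc_context();
ifmt_ctx->pb = avio_in;
//avformat_open_input(&ifmt_ctx, NULL, 0, 0) then reads from memory.
```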
PS: This program does not include video transcoding.
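Incidentally, the ffmpeg command-line tool can perform an equivalent stream-copy push. Assuming the input file and output URL used later in this article, a rough equivalent is `ffmpeg -re -i cuc_ieschool.flv -c copy -f flv rtmp://localhost/publishlive/livestream`, where -re reads the input at its native frame rate (the delay issue discussed below), -c copy avoids transcoding, and -f flv forces the FLV muxer required by RTMP.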
Introduction
The role of the RTMP streamer in a streaming media system is as follows: the streamer first sends video data in the form of RTMP to a streaming media server (such as FMS, Red5, or Wowza); the client (usually Flash Player) can then watch the real-time stream by accessing the streaming media server.
Before running this program, you must first start an RTMP streaming media server and create the corresponding Application on it. Operations on the streaming media server are beyond the scope of this article and are not detailed here. After the program runs, you can use an RTMP client (such as Flash Player or FFplay) to watch the pushed live stream.
Note that the encapsulation format of RTMP is FLV. Therefore, when specifying the output stream, you must specify its encapsulation format as "flv". Similarly, other streaming media protocols also need a suitable encapsulation format. For example, when pushing streaming media over UDP, you can specify the encapsulation format as "mpegts".
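In code, the muxer is chosen when the output AVFormatContext is allocated; the corresponding lines from the source code below are:

```cpp
//RTMP requires the FLV muxer:
avformat_alloc_output_context2(&ofmt_ctx, NULL, "flv", out_filename);
//For UDP streaming, MPEG-TS could be used instead:
//avformat_alloc_output_context2(&ofmt_ctx, NULL, "mpegts", out_filename);
```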
A delay is needed when sending streaming media data. Otherwise, FFmpeg processes the file as fast as it can and sends all the data at once, which the streaming media server cannot accept. Therefore, the data must be sent at the actual frame rate of the video. The streamer recorded in this article calls the av_usleep() sleep function between video frames to delay transmission, so that the data is sent according to the video's frame rate. The reference code is as follows.
```cpp
//…
int64_t start_time = av_gettime();
while (1) {
	//…
	//Important: Delay
	if (pkt.stream_index == videoindex) {
		AVRational time_base = ifmt_ctx->streams[videoindex]->time_base;
		AVRational time_base_q = {1, AV_TIME_BASE};
		int64_t pts_time = av_rescale_q(pkt.dts, time_base, time_base_q);
		int64_t now_time = av_gettime() - start_time;
		if (pts_time > now_time)
			av_usleep(pts_time - now_time);
	}
	//…
}
//…
```
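In this snippet, av_rescale_q() converts pkt.dts from the stream's time_base into microseconds (the {1, AV_TIME_BASE} time base) so that it can be compared directly with the wall-clock time elapsed since start_time; av_usleep() then sleeps away the difference.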
PTS/DTS problem: a raw stream without an encapsulation format (for example, a raw H.264 stream) does not contain parameters such as PTS and DTS. When sending such data, the PTS, DTS, and duration of each AVPacket must be calculated and written manually. I have not studied this in depth and have only written some simple code, shown below.
```cpp
//FIX: No PTS (Example: Raw H.264)
//Simple Write PTS
if (pkt.pts == AV_NOPTS_VALUE) {
	//Write PTS
	AVRational time_base1 = ifmt_ctx->streams[videoindex]->time_base;
	//Duration between 2 frames (us)
	int64_t calc_duration = (double)AV_TIME_BASE / av_q2d(ifmt_ctx->streams[videoindex]->r_frame_rate);
	//Parameters
	pkt.pts = (double)(frame_index * calc_duration) / (double)(av_q2d(time_base1) * AV_TIME_BASE);
	pkt.dts = pkt.pts;
	pkt.duration = (double)calc_duration / (double)(av_q2d(time_base1) * AV_TIME_BASE);
}
```
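Note that this simple calculation assumes a constant frame rate (taken from r_frame_rate) and sets DTS equal to PTS, which only holds when the stream contains no B-frames; a stream with B-frames would need a proper timestamp calculation.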
The flowchart of the program is similar to that of the converter in "The Simplest FFmpeg-based Encapsulation Format Converter (without encoding/decoding)". The obvious differences between them are:
1. The streamer's output is a URL.
2. The streamer includes the delay logic.
The code is as follows.
```cpp
/**
 * The Simplest FFmpeg-based Streamer (Send RTMP)
 *
 * Lei Xiaohua
 * leixiaohua1020@126.com
 * Communication University of China / Digital TV Technology
 * http://blog.csdn.net/leixiaohua1020
 *
 * This example streams local media files to a streaming media
 * server (using RTMP as an example).
 * It's the simplest FFmpeg streamer.
 */

#include <stdio.h>

extern "C"
{
#include "libavformat/avformat.h"
#include "libavutil/mathematics.h"
#include "libavutil/time.h"
};

int main(int argc, char* argv[])
{
	AVOutputFormat *ofmt = NULL;
	//The input corresponds to one AVFormatContext, the output to another
	//(Input AVFormatContext and Output AVFormatContext)
	AVFormatContext *ifmt_ctx = NULL, *ofmt_ctx = NULL;
	AVPacket pkt;
	const char *in_filename, *out_filename;
	int ret, i;

	//in_filename  = "cuc_ieschool.mov";
	//in_filename  = "cuc_ieschool.mkv";
	//in_filename  = "cuc_ieschool.ts";
	//in_filename  = "cuc_ieschool.mp4";
	//in_filename  = "cuc_ieschool.h264";
	in_filename  = "cuc_ieschool.flv";//Input file URL
	out_filename = "rtmp://localhost/publishlive/livestream";//Output URL [RTMP]
	//out_filename = "rtp://233.233.233.233:6666";//Output URL [UDP]

	av_register_all();
	//Network
	avformat_network_init();
	//Input
	if ((ret = avformat_open_input(&ifmt_ctx, in_filename, 0, 0)) < 0) {
		printf("Could not open input file.");
		goto end;
	}
	if ((ret = avformat_find_stream_info(ifmt_ctx, 0)) < 0) {
		printf("Failed to retrieve input stream information");
		goto end;
	}

	int videoindex = -1;
	for (i = 0; i < ifmt_ctx->nb_streams; i++)
		if (ifmt_ctx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
			videoindex = i;
			break;
		}

	av_dump_format(ifmt_ctx, 0, in_filename, 0);

	//Output
	avformat_alloc_output_context2(&ofmt_ctx, NULL, "flv", out_filename); //RTMP
	//avformat_alloc_output_context2(&ofmt_ctx, NULL, "mpegts", out_filename);//UDP
	if (!ofmt_ctx) {
		printf("Could not create output context\n");
		ret = AVERROR_UNKNOWN;
		goto end;
	}
	ofmt = ofmt_ctx->oformat;
	for (i = 0; i < ifmt_ctx->nb_streams; i++) {
		//Create output AVStream according to input AVStream
		AVStream *in_stream = ifmt_ctx->streams[i];
		AVStream *out_stream = avformat_new_stream(ofmt_ctx, in_stream->codec->codec);
		if (!out_stream) {
			printf("Failed allocating output stream\n");
			ret = AVERROR_UNKNOWN;
			goto end;
		}
		//Copy the settings of AVCodecContext
		ret = avcodec_copy_context(out_stream->codec, in_stream->codec);
		if (ret < 0) {
			printf("Failed to copy context from input to output stream codec context\n");
			goto end;
		}
		out_stream->codec->codec_tag = 0;
		if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
			out_stream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
	}
	//Dump Format------------------
	av_dump_format(ofmt_ctx, 0, out_filename, 1);
	//Open output URL
	if (!(ofmt->flags & AVFMT_NOFILE)) {
		ret = avio_open(&ofmt_ctx->pb, out_filename, AVIO_FLAG_WRITE);
		if (ret < 0) {
			printf("Could not open output URL '%s'", out_filename);
			goto end;
		}
	}
	//Write file header
	ret = avformat_write_header(ofmt_ctx, NULL);
	if (ret < 0) {
		printf("Error occurred when opening output URL\n");
		goto end;
	}

	int frame_index = 0;
	int64_t start_time = av_gettime();
	while (1) {
		AVStream *in_stream, *out_stream;
		//Get an AVPacket
		ret = av_read_frame(ifmt_ctx, &pkt);
		if (ret < 0)
			break;
		//FIX: No PTS (Example: Raw H.264)
		//Simple Write PTS
		if (pkt.pts == AV_NOPTS_VALUE) {
			//Write PTS
			AVRational time_base1 = ifmt_ctx->streams[videoindex]->time_base;
			//Duration between 2 frames (us)
			int64_t calc_duration = (double)AV_TIME_BASE / av_q2d(ifmt_ctx->streams[videoindex]->r_frame_rate);
			//Parameters
			pkt.pts = (double)(frame_index * calc_duration) / (double)(av_q2d(time_base1) * AV_TIME_BASE);
			pkt.dts = pkt.pts;
			pkt.duration = (double)calc_duration / (double)(av_q2d(time_base1) * AV_TIME_BASE);
		}
		//Important: Delay
		if (pkt.stream_index == videoindex) {
			AVRational time_base = ifmt_ctx->streams[videoindex]->time_base;
			AVRational time_base_q = {1, AV_TIME_BASE};
			int64_t pts_time = av_rescale_q(pkt.dts, time_base, time_base_q);
			int64_t now_time = av_gettime() - start_time;
			if (pts_time > now_time)
				av_usleep(pts_time - now_time);
		}

		in_stream  = ifmt_ctx->streams[pkt.stream_index];
		out_stream = ofmt_ctx->streams[pkt.stream_index];
		/* copy packet */
		//Convert PTS/DTS
		pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base, (AVRounding)(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
		pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base, (AVRounding)(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
		pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, out_stream->time_base);
		pkt.pos = -1;
		//Print to Screen
		if (pkt.stream_index == videoindex) {
			printf("Send %8d video frames to output URL\n", frame_index);
			frame_index++;
		}
		//ret = av_write_frame(ofmt_ctx, &pkt);
		ret = av_interleaved_write_frame(ofmt_ctx, &pkt);
		if (ret < 0) {
			printf("Error muxing packet\n");
			break;
		}
		av_free_packet(&pkt);
	}
	//Write file trailer
	av_write_trailer(ofmt_ctx);
end:
	avformat_close_input(&ifmt_ctx);
	/* close output */
	if (ofmt_ctx && !(ofmt->flags & AVFMT_NOFILE))
		avio_close(ofmt_ctx->pb);
	avformat_free_context(ofmt_ctx);
	if (ret < 0 && ret != AVERROR_EOF) {
		printf("Error occurred.\n");
		return -1;
	}
	return 0;
}
```
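A side note on API versions: the code above targets the FFmpeg releases of its time. In FFmpeg 3.1 and later, avcodec_copy_context() on AVStream->codec and av_free_packet() are deprecated; a minimal sketch of the modern stream-copy setup (my adaptation, not part of the original project) is:

```cpp
//Modern stream-copy setup (FFmpeg 3.1+), assuming ifmt_ctx/ofmt_ctx as above:
AVStream *in_stream  = ifmt_ctx->streams[i];
AVStream *out_stream = avformat_new_stream(ofmt_ctx, NULL);
ret = avcodec_parameters_copy(out_stream->codecpar, in_stream->codecpar);
out_stream->codecpar->codec_tag = 0;
//...and av_packet_unref(&pkt) replaces av_free_packet(&pkt).
```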
Result
After the program starts running, it prints the index of each video frame as it is sent to the output URL.
You can use a web player to play the pushed live stream. For example, you can use the videoPlayer in the Samples folder of Flash Media Server to play the live stream (live stream address: rtmp://localhost/publishlive/livestream).
In addition, you can use a client such as FFplay to play the live stream.
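For example, assuming the output URL above, `ffplay rtmp://localhost/publishlive/livestream` will play the pushed stream.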
Download
SourceForge project homepage:
https://sourceforge.net/projects/simplestffmpegstreamer/
CSDN project:
http://download.csdn.net/detail/leixiaohua1020/8005311