The Simplest Video Player Based on FFmpeg + SDL: Split Decoder and Player
============================================================================
List of articles in the Simplest FFmpeg-Based Video Player series:
Simplest Video Player Based on FFmpeg + SDL in 100 Lines of Code (using SDL1.x)
Simplest Video Player Ver2 Based on FFmpeg + SDL (using SDL2.0)
Simplest FFmpeg-Based Decoder, Pure Version (without libavformat)
Simplest Video Player Based on FFmpeg + SDL: Split Decoder and Player
Simplest Hello World Program Based on FFmpeg
============================================================================
This article records two examples split out of the simplest FFmpeg + SDL video player: an FFmpeg video decoder and an SDL pixel-data player. The FFmpeg video decoder decodes compressed video data into YUV, and the SDL pixel-data player displays YUV data. In short, the original FFmpeg + SDL video player implements:
Video data -> YUV -> display
The FFmpeg video decoder implements:
Video data -> YUV
The SDL pixel-data player implements:
YUV -> display
Source code of the FFmpeg video decoder
```cpp
/**
 * Simplest FFmpeg-Based Video Decoder
 * Simplest FFmpeg Decoder
 *
 * Lei Xiaohua
 * leixiaohua1020@126.com
 * Communication University of China / Digital TV Technology
 * http://blog.csdn.net/leixiaohua1020
 *
 * This program decodes a video file into YUV data. It uses libavcodec
 * and libavformat. It is the simplest tutorial on FFmpeg video decoding.
 * By studying this example, you can understand the FFmpeg decoding process.
 */
#include <stdio.h>

#define __STDC_CONSTANT_MACROS

#ifdef _WIN32
// Windows
extern "C"
{
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
};
#else
// Linux...
#ifdef __cplusplus
extern "C"
{
#endif
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#ifdef __cplusplus
};
#endif
#endif

int main(int argc, char *argv[])
{
	AVFormatContext *pFormatCtx;
	int i, videoindex;
	AVCodecContext *pCodecCtx;
	AVCodec *pCodec;
	AVFrame *pFrame, *pFrameYUV;
	uint8_t *out_buffer;
	AVPacket *packet;
	int y_size;
	int ret, got_picture;
	struct SwsContext *img_convert_ctx;
	// Note: the input file name was lost in the original text;
	// substitute any video file of your own here.
	char filepath[] = "input.mp4";
	FILE *fp_yuv = fopen("output.yuv", "wb+");

	av_register_all();
	avformat_network_init();
	pFormatCtx = avformat_alloc_context();

	if (avformat_open_input(&pFormatCtx, filepath, NULL, NULL) != 0) {
		printf("Couldn't open input stream.\n");
		return -1;
	}
	if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
		printf("Couldn't find stream information.\n");
		return -1;
	}
	videoindex = -1;
	for (i = 0; i < pFormatCtx->nb_streams; i++)
		if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
			videoindex = i;
			break;
		}
	if (videoindex == -1) {
		printf("Didn't find a video stream.\n");
		return -1;
	}

	pCodecCtx = pFormatCtx->streams[videoindex]->codec;
	pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
	if (pCodec == NULL) {
		printf("Codec not found.\n");
		return -1;
	}
	if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
		printf("Could not open codec.\n");
		return -1;
	}

	pFrame = av_frame_alloc();
	pFrameYUV = av_frame_alloc();
	out_buffer = (uint8_t *)av_malloc(avpicture_get_size(PIX_FMT_YUV420P,
		pCodecCtx->width, pCodecCtx->height));
	avpicture_fill((AVPicture *)pFrameYUV, out_buffer, PIX_FMT_YUV420P,
		pCodecCtx->width, pCodecCtx->height);
	packet = (AVPacket *)av_malloc(sizeof(AVPacket));

	// Output Info-----------------------------
	printf("--------------- File Information ----------------\n");
	av_dump_format(pFormatCtx, 0, filepath, 0);
	printf("-------------------------------------------------\n");

	img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt,
		pCodecCtx->width, pCodecCtx->height, PIX_FMT_YUV420P, SWS_BICUBIC, NULL, NULL, NULL);

	while (av_read_frame(pFormatCtx, packet) >= 0) {
		if (packet->stream_index == videoindex) {
			ret = avcodec_decode_video2(pCodecCtx, pFrame, &got_picture, packet);
			if (ret < 0) {
				printf("Decode Error.\n");
				return -1;
			}
			if (got_picture) {
				sws_scale(img_convert_ctx, (const uint8_t *const *)pFrame->data,
					pFrame->linesize, 0, pCodecCtx->height,
					pFrameYUV->data, pFrameYUV->linesize);

				y_size = pCodecCtx->width * pCodecCtx->height;
				fwrite(pFrameYUV->data[0], 1, y_size, fp_yuv);     // Y
				fwrite(pFrameYUV->data[1], 1, y_size / 4, fp_yuv); // U
				fwrite(pFrameYUV->data[2], 1, y_size / 4, fp_yuv); // V
				printf("Succeed to decode 1 frame!\n");
			}
		}
		av_free_packet(packet);
	}

	// flush decoder
	// FIX: Flush frames remaining in the codec
	while (1) {
		ret = avcodec_decode_video2(pCodecCtx, pFrame, &got_picture, packet);
		if (ret < 0)
			break;
		if (!got_picture)
			break;
		sws_scale(img_convert_ctx, (const uint8_t *const *)pFrame->data,
			pFrame->linesize, 0, pCodecCtx->height,
			pFrameYUV->data, pFrameYUV->linesize);
		int y_size = pCodecCtx->width * pCodecCtx->height;
		fwrite(pFrameYUV->data[0], 1, y_size, fp_yuv);     // Y
		fwrite(pFrameYUV->data[1], 1, y_size / 4, fp_yuv); // U
		fwrite(pFrameYUV->data[2], 1, y_size / 4, fp_yuv); // V
		printf("Flush Decoder: Succeed to decode 1 frame!\n");
	}

	sws_freeContext(img_convert_ctx);

	fclose(fp_yuv);

	av_frame_free(&pFrameYUV);
	av_frame_free(&pFrame);
	avcodec_close(pCodecCtx);
	avformat_close_input(&pFormatCtx);

	return 0;
}
```
Running result: after the program runs, it decodes the input video file.
The decoded YUV420P data is saved to a file. After setting the width and height in a YUV player, you can view the YUV content.
Source code of the SDL pixel data player
```cpp
/**
 * Simplest example of SDL2 video playback (SDL2 plays RGB/YUV)
 * Simplest Video Play SDL2 (SDL2 play RGB/YUV)
 *
 * Lei Xiaohua
 * leixiaohua1020@126.com
 * Communication University of China / Digital TV Technology
 * http://blog.csdn.net/leixiaohua1020
 *
 * This program uses SDL2 to play RGB/YUV raw video pixel data. SDL is a
 * wrapper around low-level graphics APIs (Direct3D, OpenGL), so it is much
 * easier to use than calling those APIs directly.
 *
 * The process is as follows:
 *
 * [Init]
 * SDL_Init(): Init SDL.
 * SDL_CreateWindow(): Create a Window.
 * SDL_CreateRenderer(): Create a Renderer based on the Window.
 * SDL_CreateTexture(): Create a Texture.
 *
 * [Loop to render data]
 * SDL_UpdateTexture(): Set the Texture's data.
 * SDL_RenderCopy(): Copy the Texture to the Renderer.
 * SDL_RenderPresent(): Show.
 */
#include <stdio.h>

extern "C"
{
#include "sdl/SDL.h"
};

const int bpp = 12;

int screen_w = 500, screen_h = 500;
const int pixel_w = 320, pixel_h = 180;

unsigned char buffer[pixel_w * pixel_h * bpp / 8];

// Refresh Event
#define REFRESH_EVENT  (SDL_USEREVENT + 1)
#define BREAK_EVENT    (SDL_USEREVENT + 2)

int thread_exit = 0;

int refresh_video(void *opaque)
{
	thread_exit = 0;
	while (!thread_exit) {
		SDL_Event event;
		event.type = REFRESH_EVENT;
		SDL_PushEvent(&event);
		SDL_Delay(40);
	}
	thread_exit = 0;
	// Break
	SDL_Event event;
	event.type = BREAK_EVENT;
	SDL_PushEvent(&event);
	return 0;
}

int main(int argc, char *argv[])
{
	if (SDL_Init(SDL_INIT_VIDEO)) {
		printf("Could not initialize SDL - %s\n", SDL_GetError());
		return -1;
	}

	SDL_Window *screen;
	// SDL 2.0 supports multiple windows
	screen = SDL_CreateWindow("Simplest Video Play SDL2",
		SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
		screen_w, screen_h, SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE);
	if (!screen) {
		printf("SDL: could not create window - exiting: %s\n", SDL_GetError());
		return -1;
	}

	SDL_Renderer *sdlRenderer = SDL_CreateRenderer(screen, -1, 0);

	Uint32 pixformat = 0;
	// IYUV: Y + U + V (3 planes)
	// YV12: Y + V + U (3 planes)
	pixformat = SDL_PIXELFORMAT_IYUV;

	SDL_Texture *sdlTexture = SDL_CreateTexture(sdlRenderer, pixformat,
		SDL_TEXTUREACCESS_STREAMING, pixel_w, pixel_h);

	FILE *fp = NULL;
	fp = fopen("test_yuv420p_320x180.yuv", "rb+");
	if (fp == NULL) {
		printf("cannot open this file\n");
		return -1;
	}

	SDL_Rect sdlRect;

	SDL_Thread *refresh_thread = SDL_CreateThread(refresh_video, NULL, NULL);
	SDL_Event event;
	while (1) {
		// Wait
		SDL_WaitEvent(&event);
		if (event.type == REFRESH_EVENT) {
			if (fread(buffer, 1, pixel_w * pixel_h * bpp / 8, fp) != pixel_w * pixel_h * bpp / 8) {
				// Loop back to the start of the file
				fseek(fp, 0, SEEK_SET);
				fread(buffer, 1, pixel_w * pixel_h * bpp / 8, fp);
			}

			SDL_UpdateTexture(sdlTexture, NULL, buffer, pixel_w);

			// FIX: If the window is resized
			sdlRect.x = 0;
			sdlRect.y = 0;
			sdlRect.w = screen_w;
			sdlRect.h = screen_h;

			SDL_RenderClear(sdlRenderer);
			SDL_RenderCopy(sdlRenderer, sdlTexture, NULL, &sdlRect);
			SDL_RenderPresent(sdlRenderer);
		} else if (event.type == SDL_WINDOWEVENT) {
			// If Resize
			SDL_GetWindowSize(screen, &screen_w, &screen_h);
		} else if (event.type == SDL_QUIT) {
			thread_exit = 1;
		} else if (event.type == BREAK_EVENT) {
			break;
		}
	}
	SDL_Quit();
	return 0;
}
```
Running result: after the program runs, it reads a YUV420P file from the program folder.
The YUV content is drawn in the pop-up window.
Download
Simplest FFmpeg Player
Project Homepage
SourceForge: https://sourceforge.net/projects/simplestffmpegplayer/
Github: https://github.com/leixiaohua1020/simplest_ffmpeg_player
Open source China: http://git.oschina.net/leixiaohua1020/simplest_ffmpeg_player
This program implements video file decoding and display (supporting HEVC, H.264, MPEG2, etc.).
It is the simplest tutorial on FFmpeg video decoding.
By studying this example, we can understand the decoding process of FFmpeg.
The solution contains six projects:
Simplest_ffmpeg_player: Standard Edition, the start of FFmpeg learning.
Simplest_ffmpeg_player_su: SU (SDL Update) version, added a simple SDL Event.
Simplest_ffmpeg_decoder: a decoder that also handles the container (encapsulation) format. Uses libavcodec and libavformat.
Simplest_ffmpeg_decoder_pure: a pure decoder. Uses only libavcodec (libavformat is not used).
Simplest_video_play_sdl2: An Example of YUV playback using SDL2.
Simplest_ffmpeg_helloworld: output the information of the FFmpeg class library.
Copyright Disclaimer: This article is an original article by the blogger and cannot be reproduced without the permission of the blogger.