The source for this project is at the address below; please give it a Star:
https://github.com/979451341/Audio-and-video-learning-materials/tree/master/FFmpeg%E6%92%AD%E6%94%BE%E8%A7%86%E9%A2%91
FFmpeg is written in C, so we need NDK development, and with the NDK I use CMake. First, let me explain how to import the FFmpeg project; with my method, importing FFmpeg takes less than a minute.
This requires you to first download the project code from the address above.
I won't describe how to generate FFmpeg's Android .so files here; I used files built by others, so you can simply use the ones in my project.
1. A brief description of FFmpeg
The multimedia processing tool FFmpeg has very powerful features, including video capture, video format conversion, extracting images from video, watermarking video, and so on.
Its functionality is split across seven libraries:
libavcodec: provides implementations of a wide range of codecs.
libavformat: implements streaming protocols, container formats, and basic I/O access.
libavutil: includes hashing, decompression, and various utility functions.
libavfilter: provides the means to alter decoded audio and video through a chain of filters.
libavdevice: provides abstract access to capture and playback devices.
libswresample: implements audio mixing and resampling routines.
libswscale: implements color conversion and scaling routines.
2. Environment configuration
Copy the jniLibs and cpp folders from the downloaded project into the main folder of the project you created.
We also need to add some code to the app module's build.gradle: add the NDK ABI filter in defaultConfig, pass arguments to CMake, and point externalNativeBuild at the CMakeLists.txt file. Example code:
android {
    compileSdkVersion 26
    defaultConfig {
        applicationId "jonesx.videoplayer"
        minSdkVersion 19
        targetSdkVersion 26
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
        ndk {
            abiFilters 'armeabi'
        }
        externalNativeBuild {
            cmake {
                arguments '-DANDROID_TOOLCHAIN=clang', '-DANDROID_STL=gnustl_static'
            }
        }
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
    externalNativeBuild {
        cmake {
            path "src/main/cpp/CMakeLists.txt"
        }
    }
}
3. Code description
First we use the VideoPlayer code under the cpp folder; for it we need to create a matching VideoPlayer Java class:
public class VideoPlayer {
    static {
        System.loadLibrary("VideoPlayer");
    }

    public static native int play(Object surface);
}
Call this play function on a new thread directly in SurfaceView's surfaceCreated callback:
@Override
public void surfaceCreated(SurfaceHolder holder) {
    new Thread(new Runnable() {
        @Override
        public void run() {
            VideoPlayer.play(surfaceHolder.getSurface());
        }
    }).start();
}
Now for the key part: what does VideoPlayer actually do with FFmpeg?
Allocate the format context and open the MP4 file:
AVFormatContext *pFormatCtx = avformat_alloc_context();
if (avformat_open_input(&pFormatCtx, file_name, NULL, NULL) != 0) {
    LOGD("Couldn't open file: %s\n", file_name);
    return -1; // Couldn't open file
}
Check whether stream information is available, and if so, whether there is a video stream:
if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
    LOGD("Couldn't find stream information.");
    return -1;
}
int videoStream = -1, i;
for (i = 0; i < pFormatCtx->nb_streams; i++) {
    if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO
            && videoStream < 0) {
        videoStream = i;
    }
}
if (videoStream == -1) {
    LOGD("Didn't find a video stream.");
    return -1; // Didn't find a video stream
}
Get the video decoder context and check that the decoder can be opened:
AVCodecContext *pCodecCtx = pFormatCtx->streams[videoStream]->codec;
// Find the decoder for the video stream
AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
if (pCodec == NULL) {
    LOGD("Codec not found.");
    return -1; // Codec not found
}
if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
    LOGD("Could not open codec.");
    return -1; // Could not open codec
}
Get a native window from the Java Surface and size its buffer to the video:
// Get the native window from the Java Surface
ANativeWindow *nativeWindow = ANativeWindow_fromSurface(env, surface);
// Get the video width and height
int videoWidth = pCodecCtx->width;
int videoHeight = pCodecCtx->height;
// Set the native window's buffer size; scaling is handled automatically
ANativeWindow_setBuffersGeometry(nativeWindow, videoWidth, videoHeight,
                                 WINDOW_FORMAT_RGBA_8888);
ANativeWindow_Buffer windowBuffer;
Set up a conversion context to turn decoded frames into RGBA:
struct SwsContext *sws_ctx = sws_getContext(pCodecCtx->width,
                                            pCodecCtx->height,
                                            pCodecCtx->pix_fmt,
                                            pCodecCtx->width,
                                            pCodecCtx->height,
                                            AV_PIX_FMT_RGBA,
                                            SWS_BILINEAR,
                                            NULL,
                                            NULL,
                                            NULL);
Decoding happens inside a loop: each packet read is decoded into a frame, but if a frame is too large to fit in one packet, decoding of that frame continues in the next loop iteration:
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
Finally, free the resources:
av_free(buffer);
av_free(pFrameRGBA);
// Free the YUV frame
av_free(pFrame);
// Close the codec
avcodec_close(pCodecCtx);
// Close the video file
avformat_close_input(&pFormatCtx);
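Note that the cleanup above does not release two resources created earlier. Assuming the names from the previous snippets, you may also want to add:

```c
// Release the scaling context and the native window as well.
sws_freeContext(sws_ctx);
ANativeWindow_release(nativeWindow);
```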
That's it. This article is finished, but it is only a beginning; my own study of FFmpeg is also just starting, and I may share more FFmpeg experience from time to time.
Android Audio and Video In Depth, Part 6: Playing Video with FFmpeg (source download included)