If you do not know FFmpeg yet, skim its common basic commands and try one or two to see whether it could make your app more fun and powerful. It can merge audio and video, play audio and video in all kinds of encodings, cut out a segment, merge multiple images into one video, convert formats, and so on.
The dubbing feature in the Dubbing Show app is fun, and we wanted to build a dubbing module of our own, so I found FFmpeg and tried to port it to Android. I spent three or four days fighting with it, testing version after version; many articles on the Internet simply did not work. I finally succeeded by following the article "Using ffmpeg and calling its interfaces from Android". Of all the versions I tried, only ffmpeg 1.2 with NDK r9 ported successfully.
Ideas
FFmpeg is a program written in C, and ffmpeg.c contains its main function. We first compile it with the Android NDK into a libffmpeg.so file, then use that library to compile the ffmpeg.c file, renaming its main function to video_merge, i.e. video_merge(int argc, char **argv). We then merge video and audio like this:
void jstringToCstr(JNIEnv *env, jstring _jstring, jbyte **_cstr) {
    jboolean isCopy = 0;
    (*_cstr) = (jbyte *)(*env)->GetStringUTFChars(env, _jstring, &isCopy);
}

JNIEXPORT jstring JNICALL Java_com_lzw_iword_video_Myffmpeg_stringFromJNI
  (JNIEnv *env, jclass clz, jstring videoPath, jstring audioPath, jstring avPath) {
    char const *str1;
    int n = 0;
    char *argv[20];
    jbyte *str[3];
    jstringToCstr(env, videoPath, &str[0]);
    jstringToCstr(env, audioPath, &str[1]);
    jstringToCstr(env, avPath, &str[2]);
    argv[n++] = "ffmpeg";
    argv[n++] = "-i";
    argv[n++] = (char *)str[0];
    argv[n++] = "-i";
    argv[n++] = (char *)str[1];
    argv[n++] = "-y";
    argv[n++] = "-strict";
    argv[n++] = "-2";
    argv[n++] = (char *)str[2];
    int ret = video_merge(n, argv);  /* the renamed main of ffmpeg.c */
    /* Note: the strings obtained with GetStringUTFChars should be released
       with ReleaseStringUTFChars once video_merge returns. */
    str1 = "Using FFMPEG doing your job";
    return (*env)->NewStringUTF(env, str1);
}
It is just like running ffmpeg -i src1 -i src2 -y -strict -2 output on the command line.
Environment
Ubuntu 12.04
Follow that tutorial first; I will not repeat its steps here, and will instead cover the problems you may run into with it.
Changing the interface call
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
PATH_TO_FFMPEG_SOURCE := $(LOCAL_PATH)/ffmpeg
LOCAL_C_INCLUDES += $(PATH_TO_FFMPEG_SOURCE)
# These are the libraries to link against; libffmpeg.so was copied into
# android-ndk-r8d/platforms/android-14/arch-arm/usr/lib for exactly this
LOCAL_LDLIBS := -lffmpeg -llog -ljnigraphics -lz -ldl -lgcc
LOCAL_MODULE := ffmpeg-jni
# These files must be compiled as well, otherwise many symbols are undefined
LOCAL_SRC_FILES := ffmpeg-jni.c ffmpeg/cmdutils.h ffmpeg/cmdutils.c ffmpeg/ffmpeg.h ffmpeg/ffmpeg_opt.c ffmpeg/ffmpeg_filter.c
include $(BUILD_SHARED_LIBRARY)
This was the Android.mk I used at first when changing the interface call. The problem is this line:

LOCAL_LDLIBS := -lffmpeg

I never got it to work.
That flag tells the linker which libraries to search when building the current module: -l means link, ffmpeg is the library name, and the lib prefix and .so suffix are added automatically, so the linker looks for a libffmpeg.so file in its library directories.
So I used this:
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_SRC_FILES := libffmpeg.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
PATH_TO_FFMPEG_SOURCE := $(LOCAL_PATH)/ffmpeg
LOCAL_C_INCLUDES += $(PATH_TO_FFMPEG_SOURCE)
LOCAL_MODULE := ffmpeg-jni
LOCAL_LDLIBS := -llog -ljnigraphics -lz -ldl -lgcc -lm
LOCAL_SRC_FILES := myffmpeg.c ffmpeg/cmdutils.h ffmpeg/cmdutils.c ffmpeg/ffmpeg.h ffmpeg/ffmpeg_opt.c ffmpeg/ffmpeg_filter.c
LOCAL_SHARED_LIBRARIES := myffmpeg
include $(BUILD_SHARED_LIBRARY)
Note that this first wraps the prebuilt shared library libffmpeg.so as a module named myffmpeg, and then

LOCAL_SHARED_LIBRARIES := myffmpeg

links it in when compiling the final ffmpeg-jni module.
Put the libffmpeg.so file in the jni directory, as shown in the following figure:
Debugging ffmpeg
Once FFmpeg is ported and its functions are called through JNI, plenty of things can still go wrong. How do we debug it?
It would be nice to get the same debug output we would see on the command line, that is, the log. We only need to find where ffmpeg prints its messages and add a LOGD call next to the printf statement.
In Eclipse, you can hold the Ctrl key and click a call such as av_log, starting from the original main entry, to jump to the function definition and track it step by step. The printing ends up in the av_log_default_callback function in ffmpeg/libavutil/log.c.
Have you noticed LOGD? It is a variadic macro that forwards its arguments to __android_log_print.
The ... in its definition is a variable-length argument list. The simplest example of varargs is printf: printf("%d", a) takes two arguments, while printf("%d %d", a, b) takes three. LOGD passes its arguments along using the same mechanism.
__android_log_print is the function Android provides to print to Logcat.
In this way you can see ffmpeg's output in the log. If something fails, for example mp3 or amr is not supported, or the video synthesis itself goes wrong, this log tells you why.
Sometimes the app just crashes. How do I find where the error happened? Run this in the command line:

adb shell logcat | ndk-stack -sym obj/local/armeabi

and the native stack trace is symbolized for you.
You will also find that the app crashes and exits right after the merge finishes, because the original main function of ffmpeg ends with an exit() call. Just comment it out and return instead.
After one merge, you will find that calling it again fails with an "invalid heap address in dlfree" error. The reason is that ffmpeg leaks memory: some dynamically allocated space is never released. A simple workaround is to run the ffmpeg synthesis inside a Service: merge once, then kill the service so the leaked memory is reclaimed.
AndroidManifest.xml:
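A minimal sketch of such a declaration; FFmpegService is a placeholder class name, not from the original project. Putting the service in its own process via android:process means killing it frees everything ffmpeg leaked:

```xml
<!-- FFmpegService is a hypothetical name for your merging service -->
<service
    android:name=".FFmpegService"
    android:process=":ffmpeg" />
```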
Then register a handler in your program to receive the result when the service finishes.
With this setup you can call ffmpeg once to generate a silent video file, play it and let the user dub over it, then call ffmpeg a second time to merge the user's voice with the video.
What hurt me here: if the recording is encoded as AAC it can be merged into the video, but the media player could not play the AAC file (is that just the Xiaomi 2S?). If AMR-NB is used instead, it cannot be merged at all without regenerating libffmpeg.so with AMR-NB enabled in configure. My attempts at that failed: running ./config.sh with AMR-NB enabled complained that the files it includes could not be found. Enabling libmp3lame failed the same way.
The reason for playing the silent video first is speed: merging a 10 s 1280*720 video with a 10-second recording takes about a minute, so it is better to let the user preview the dubbing effect first and only then decide whether or not to merge.
If you look inside the Dubbing Show app's dubbing folder, you will find that it records the user's voice to tmp.amr and tmp.pcm, uploads them to the server for merging, and then downloads the result.
When dubbing, the app has already downloaded the clip's video, subtitles, and audio as separate files. That way it can keep playing the video the whole time, and while the user is recording, the original audio track is simply muted.
Use ndk-build in Eclipse
In fact you do not need to run ndk-build on the command line: select the project, right-click, choose Android Tools, then Add Native Support. After that, ndk-build runs automatically every time you run the project.
Generating the JNI function header with one click
Writing these function headers by hand every time is tedious; in fact they can be generated with one click. Configure an External Tool in Eclipse that runs the javah command, for example javah -d jni -classpath bin/classes com.lzw.iword.video.Myffmpeg (the classpath and package name here are from my project; adjust them to yours).
Then click it in the Eclipse toolbar, and a file like com_lzw_iword_video_Myffmpeg.h appears in the jni directory, as shown in the following figure:
Include this header file in your .c file and copy-paste the function signatures from it.
If you are porting ffmpeg to Android, or run into any ffmpeg problems, feel free to comment and ask questions, and let's exchange ideas ~