This article follows the earlier posts in this series, Compiling ffmpeg 0.8.1 for Android NDK r6 on 32-bit Ubuntu 11.04 (Using the FFmpeg media library in Android, part 1) and Calling the compiled FFmpeg library through JNI in Android (Using the FFmpeg media library in Android, part 2). Here we use an open-source Churn Labs project on GitHub to illustrate how to use the FFmpeg library for multimedia development.
The code in this article comes from https://github.com/churnlabs/android-ffmpeg-sample; see that project for the full source. I have added some comments to the code. Thanks to the authors for providing such a good example for us to learn from.
In Android, most development against system-layer functionality is done through JNI calls. JNI wrappers are also useful for CPU-intensive code or complicated processing logic, where native code can improve execution efficiency.
This article involves the following aspects:
1. Push the vid.3gp file to the emulator's sdcard.
2. Write the JNI code, call the FFMPEG library method internally, and compile the JNI library.
3. Load the generated library with System.loadLibrary() and write the corresponding Java code.
4. Execute the program and view the final running result.
The display effect of the final program is as follows:
1. Use the DDMS tool in Eclipse to push vid.3gp to the sdcard.
2. Write the corresponding JNI file, native.c.
/*
 * Copyright 2011 - Churn Labs, LLC
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

/*
 * This is mostly based off of the FFMPEG tutorial:
 * http://dranger.com/ffmpeg/
 * with a few updates to support Android output mechanisms and to update
 * places where the APIs have shifted.
 */

#include <jni.h>
#include <string.h>
#include <stdio.h>
#include <android/log.h>
#include <android/bitmap.h>

// FFmpeg library headers; these files live directly in the jni directory
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>

#define LOG_TAG "FFMPEGSample"
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__)

/* Cheat to keep things simple and just use some globals. */
// Global objects
AVFormatContext *pFormatCtx;
AVCodecContext *pCodecCtx;
AVFrame *pFrame;
AVFrame *pFrameRGB;
int videoStream;

/*
 * Write a frame worth of video (in pFrame) into the Android bitmap
 * described by info using the raw pixel buffer. It's a very inefficient
 * draw routine, but it's easy to read. Relies on the format of the
 * bitmap being 8 bits per color component plus an 8-bit alpha channel.
 */
// Static helper: draw a frame into the Android bitmap
static void fill_bitmap(AndroidBitmapInfo *info, void *pixels, AVFrame *pFrame)
{
    uint8_t *frameLine;

    int yy;
    for (yy = 0; yy < info->height; yy++) {
        uint8_t *line = (uint8_t *)pixels;
        frameLine = (uint8_t *)pFrame->data[0] + (yy * pFrame->linesize[0]);

        int xx;
        for (xx = 0; xx < info->width; xx++) {
            int out_offset = xx * 4;
            int in_offset = xx * 3;

            line[out_offset] = frameLine[in_offset];
            line[out_offset + 1] = frameLine[in_offset + 1];
            line[out_offset + 2] = frameLine[in_offset + 2];
            line[out_offset + 3] = 0;
        }
        pixels = (char *)pixels + info->stride;
    }
}

// JNI entry point corresponding to the openFile method of
// com.churnlabs.ffmpegsample.MainActivity
void Java_com_churnlabs_ffmpegsample_MainActivity_openFile(JNIEnv *env, jobject this)
{
    int err;
    int i;
    AVCodec *pCodec;
    uint8_t *buffer;
    int numBytes;

    // Register all formats and codecs
    av_register_all();
    LOGE("Registered formats");

    // Open the vid.3gp file
    err = av_open_input_file(&pFormatCtx, "file:/sdcard/vid.3gp", NULL, 0, NULL);
    LOGE("Called open file");
    if (err != 0) {
        LOGE("Couldn't open file");
        return;
    }
    LOGE("Opened file");

    if (av_find_stream_info(pFormatCtx) < 0) {
        LOGE("Unable to get stream info");
        return;
    }

    // Find the video stream
    videoStream = -1;
    for (i = 0; i < pFormatCtx->nb_streams; i++) {
        if (pFormatCtx->streams[i]->codec->codec_type == CODEC_TYPE_VIDEO) {
            videoStream = i;
            break;
        }
    }
    if (videoStream == -1) {
        LOGE("Unable to find video stream");
        return;
    }
    LOGI("Video stream is [%d]", videoStream);

    // Get the codec context for the video stream
    pCodecCtx = pFormatCtx->streams[videoStream]->codec;

    // Find the decoder for the video stream
    pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
    if (pCodec == NULL) {
        LOGE("Unsupported codec");
        return;
    }

    // Open the codec with the specific decoder
    if (avcodec_open(pCodecCtx, pCodec) < 0) {
        LOGE("Unable to open codec");
        return;
    }

    // Allocate the frame and the RGB frame
    pFrame = avcodec_alloc_frame();
    pFrameRGB = avcodec_alloc_frame();

    LOGI("Video size is [%d x %d]", pCodecCtx->width, pCodecCtx->height);

    // Determine the required buffer size and allocate it
    numBytes = avpicture_get_size(PIX_FMT_RGB24, pCodecCtx->width, pCodecCtx->height);
    buffer = (uint8_t *)av_malloc(numBytes * sizeof(uint8_t));

    avpicture_fill((AVPicture *)pFrameRGB, buffer, PIX_FMT_RGB24,
                   pCodecCtx->width, pCodecCtx->height);
}

// JNI entry point corresponding to the drawFrame method of
// com.churnlabs.ffmpegsample.MainActivity
void Java_com_churnlabs_ffmpegsample_MainActivity_drawFrame(JNIEnv *env, jobject this, jstring bitmap)
{
    AndroidBitmapInfo info;
    void *pixels;
    int ret;
    int i;
    int frameFinished = 0;
    AVPacket packet;
    static struct SwsContext *img_convert_ctx;

    if ((ret = AndroidBitmap_getInfo(env, bitmap, &info)) < 0) {
        LOGE("AndroidBitmap_getInfo() failed! error=%d", ret);
        return;
    }
    LOGE("Checked on the bitmap");

    if ((ret = AndroidBitmap_lockPixels(env, bitmap, &pixels)) < 0) {
        LOGE("AndroidBitmap_lockPixels() failed! error=%d", ret);
    }
    LOGE("Grabbed the pixels");

    i = 0;
    while ((i == 0) && (av_read_frame(pFormatCtx, &packet) >= 0)) {
        if (packet.stream_index == videoStream) {
            avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);

            if (frameFinished) {
                LOGE("packet pts %llu", packet.pts);
                // This is much different than the tutorial: sws_scale
                // replaces img_convert, but it's not a complete drop-in.
                // This version keeps the image the same size but swaps to
                // RGB24 format, which works perfectly for PPM output.
                int target_width = 320;
                int target_height = 240;
                img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height,
                                                 pCodecCtx->pix_fmt,
                                                 target_width, target_height,
                                                 PIX_FMT_RGB24, SWS_BICUBIC,
                                                 NULL, NULL, NULL);
                if (img_convert_ctx == NULL) {
                    LOGE("could not initialize conversion context\n");
                    return;
                }
                sws_scale(img_convert_ctx,
                          (const uint8_t * const *)pFrame->data, pFrame->linesize,
                          0, pCodecCtx->height,
                          pFrameRGB->data, pFrameRGB->linesize);

                // save_frame(pFrameRGB, target_width, target_height, i);
                fill_bitmap(&info, pixels, pFrameRGB);
                i = 1;
            }
        }
        av_free_packet(&packet);
    }

    AndroidBitmap_unlockPixels(env, bitmap);
}

// Internal helper, not exposed to Java: seek to the frame at tsms milliseconds
int seek_frame(int tsms)
{
    int64_t frame;

    frame = av_rescale(tsms,
                       pFormatCtx->streams[videoStream]->time_base.den,
                       pFormatCtx->streams[videoStream]->time_base.num);
    frame /= 1000;

    if (avformat_seek_file(pFormatCtx, videoStream, 0, frame, frame, AVSEEK_FLAG_FRAME) < 0) {
        return 0;
    }

    avcodec_flush_buffers(pCodecCtx);
    return 1;
}

// JNI entry point corresponding to the drawFrameAt method of
// com.churnlabs.ffmpegsample.MainActivity
void Java_com_churnlabs_ffmpegsample_MainActivity_drawFrameAt(JNIEnv *env, jobject this, jstring bitmap, jint secs)
{
    AndroidBitmapInfo info;
    void *pixels;
    int ret;
    int i;
    int frameFinished = 0;
    AVPacket packet;
    static struct SwsContext *img_convert_ctx;

    if ((ret = AndroidBitmap_getInfo(env, bitmap, &info)) < 0) {
        LOGE("AndroidBitmap_getInfo() failed! error=%d", ret);
        return;
    }
    LOGE("Checked on the bitmap");

    if ((ret = AndroidBitmap_lockPixels(env, bitmap, &pixels)) < 0) {
        LOGE("AndroidBitmap_lockPixels() failed! error=%d", ret);
    }
    LOGE("Grabbed the pixels");

    seek_frame(secs * 1000);

    i = 0;
    while ((i == 0) && (av_read_frame(pFormatCtx, &packet) >= 0)) {
        if (packet.stream_index == videoStream) {
            avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);

            if (frameFinished) {
                // This is much different than the tutorial: sws_scale
                // replaces img_convert, but it's not a complete drop-in.
                // This version keeps the image the same size but swaps to
                // RGB24 format, which works perfectly for PPM output.
                int target_width = 320;
                int target_height = 240;
                img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height,
                                                 pCodecCtx->pix_fmt,
                                                 target_width, target_height,
                                                 PIX_FMT_RGB24, SWS_BICUBIC,
                                                 NULL, NULL, NULL);
                if (img_convert_ctx == NULL) {
                    LOGE("could not initialize conversion context\n");
                    return;
                }
                sws_scale(img_convert_ctx,
                          (const uint8_t * const *)pFrame->data, pFrame->linesize,
                          0, pCodecCtx->height,
                          pFrameRGB->data, pFrameRGB->linesize);

                // save_frame(pFrameRGB, target_width, target_height, i);
                fill_bitmap(&info, pixels, pFrameRGB);
                i = 1;
            }
        }
        av_free_packet(&packet);
    }

    AndroidBitmap_unlockPixels(env, bitmap);
}
3. Write the corresponding Android.mk file.
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE := ffmpegutils
LOCAL_SRC_FILES := native.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/include
LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib -L$(LOCAL_PATH) -lavformat -lavcodec -lavdevice -lavfilter -lavcore -lavutil -lswscale -llog -ljnigraphics -lz -ldl -lgcc

include $(BUILD_SHARED_LIBRARY)
Pay attention to the directory of the file.
In Android.mk, the path to the FFmpeg header files is specified by the LOCAL_C_INCLUDES := $(LOCAL_PATH)/include line. Therefore, having the code contain
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
is sufficient.
4. Run ndk-build to generate the libffmpegutils.so file, then copy it to the /root/develop/android-ndk-r6/platforms/android-8/arch-arm/usr/lib directory so that the Android 2.2 AVD used below can load the .so file.
5. Write the corresponding Eclipse project code. The JNI function names in the C file fix the package name, class name, and method names, so our project uses the package com.churnlabs.ffmpegsample and a MainActivity.java file.
package com.churnlabs.ffmpegsample;

import android.app.Activity;
import android.graphics.Bitmap;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.ImageView;

public class MainActivity extends Activity {
    private static native void openFile();
    private static native void drawFrame(Bitmap bitmap);
    private static native void drawFrameAt(Bitmap bitmap, int secs);

    private Bitmap mBitmap;
    private int mSecs = 0;

    static {
        System.loadLibrary("ffmpegutils");
    }

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        //setContentView(new VideoView(this));
        setContentView(R.layout.main);

        mBitmap = Bitmap.createBitmap(320, 240, Bitmap.Config.ARGB_8888);
        openFile();

        Button btn = (Button)findViewById(R.id.frame_adv);
        btn.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                drawFrame(mBitmap);
                ImageView i = (ImageView)findViewById(R.id.frame);
                i.setImageBitmap(mBitmap);
            }
        });

        Button btn_fwd = (Button)findViewById(R.id.frame_fwd);
        btn_fwd.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                mSecs += 5;
                drawFrameAt(mBitmap, mSecs);
                ImageView i = (ImageView)findViewById(R.id.frame);
                i.setImageBitmap(mBitmap);
            }
        });

        Button btn_back = (Button)findViewById(R.id.frame_back);
        btn_back.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                mSecs -= 5;
                drawFrameAt(mBitmap, mSecs);
                ImageView i = (ImageView)findViewById(R.id.frame);
                i.setImageBitmap(mBitmap);
            }
        });
    }
}
6. Compile and run the program.
7. Download the project code:
https://github.com/churnlabs/android-ffmpeg-sample/zipball/master
References:
1 https://github.com/churnlabs/android-ffmpeg-sample
2 http://www.360doc.com/content/10/1216/17/474846_78726683.shtml
3 https://github.com/prajnashi
This article is also published at:
http://doandroid.info/?p=497
Thanks to the original author for sharing. Reproduced from: http://www.cnblogs.com/doandroid/archive/2011/11/09/2242558.html